Motivation Sometimes the matrix (the samples) to be correlated is too large to process in one batch, so you need to compute the correlation as the data pours in, i.e. compute the correlation online.
May 22, 2020
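A minimal sketch of the streaming idea, assuming a single pair of scalar streams (the class and names are illustrative, not the post's code): keep running sums of $x$, $y$, $x^2$, $y^2$, $xy$, and recover the Pearson correlation at any point with O(1) memory.

```python
import numpy as np

class OnlineCorrelation:
    """Streaming Pearson correlation from running sums (illustrative sketch)."""

    def __init__(self):
        self.n = 0
        self.sx = self.sy = self.sxx = self.syy = self.sxy = 0.0

    def update(self, x, y):
        # Only sufficient statistics are stored, never the raw data.
        self.n += 1
        self.sx += x
        self.sy += y
        self.sxx += x * x
        self.syy += y * y
        self.sxy += x * y

    def corr(self):
        cov = self.sxy - self.sx * self.sy / self.n
        vx = self.sxx - self.sx ** 2 / self.n
        vy = self.syy - self.sy ** 2 / self.n
        return cov / np.sqrt(vx * vy)
```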
Note on Laplacian-Beltrami (Diffusion) Operator Motivation The Laplacian on graphs and on discrete geometry (meshes) is a very useful tool. One core intuition is that, just like the Laplacian on $\mathbb{R}^n$, it is related to diffusion and the heat equation. Recall the diffusion equation is $\partial_t u = \Delta u$.
May 8, 2020
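A one-line sketch of the connection, in standard notation (not necessarily the post's exact conventions): with the graph Laplacian $L = D - W$, the discrete analogue of the heat equation and its solution are

$$ \frac{du}{dt} = -Lu, \qquad u(t) = e^{-tL}\, u(0), $$

mirroring $\partial_t u = \Delta u$ on $\mathbb{R}^n$; the sign flips because $L$ is positive semidefinite.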
Spectral Graph Theory and Segmentation Motivation Spectral graph theory is a powerful tool as it sits at the center of multiple representations: it connects graphs and manifolds to linear algebra. It is related to dynamics on graphs, such as Markov chains and random walks (diffusion). It can be applied to any point cloud; images and meshes are well suited. It can be used to perform clustering, segmentation, etc. Linear Algebra Review There are several ways to see an eigenvalue problem
Apr 22, 2020
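The excerpt cuts off before the enumeration; for context, three standard ways to view a symmetric eigenvalue problem (not necessarily the post's exact list) are

$$ Av = \lambda v, \qquad \lambda_{\min} = \min_{v\neq 0} \frac{v^\top A v}{v^\top v}, \qquad \min_{v^\top v = 1}\ v^\top A v, $$

i.e. an algebraic equation, a Rayleigh-quotient variational problem, and a constrained optimization.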
Note on Hyperbolic Geometry Reference: 2018 lecture note; 2015 lecture note, Ch. 5-3 Measurement in Hyperbolic Geometry; [Cheatsheet / Note](http://home.iiserb.ac.in/~kashyap/MTH 520/lp.pdf) Motivation Hyperbolic geometry is a great source of inspiration for math art. Besides, it is used to model some hierarchical data structures. Here I collected a few models
Apr 10, 2020
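For context (the excerpt cuts off before the models; these are two standard ones, not necessarily the post's exact list): the Poincaré disk and the upper half-plane carry the metrics

$$ ds^2 = \frac{4\,(dx^2 + dy^2)}{(1 - x^2 - y^2)^2} \ \ (x^2 + y^2 < 1), \qquad ds^2 = \frac{dx^2 + dy^2}{y^2} \ \ (y > 0). $$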
Motivation Major Reference Zeroth-order optimization (ZOO), or derivative-free optimization, is also known as the oracle problem. It is nothing new to the optimization community. Interest in ZOO algorithms has resurged partly because they can be used for black-box adversarial attacks when the softmax probabilities are given, for optimizing experimental outputs, and for many design problems where the result has no analytical relationship with the parameters.
Jan 21, 2020
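A minimal sketch of the basic ZOO primitive under these assumptions (random-direction two-point estimator; the function name, step size, and sample count are illustrative, not from the post): estimate the gradient using only oracle evaluations of $f$.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, n_samples=20):
    """Two-point zeroth-order gradient estimate of f at x.

    Uses only function evaluations (the 'oracle'); no derivatives needed.
    """
    d = x.size
    g = np.zeros(d)
    for _ in range(n_samples):
        u = np.random.randn(d)  # random probing direction
        # Central difference along u, accumulated into the estimate.
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n_samples

# Usage: plug the estimate into plain gradient descent on a black-box objective.
f = lambda x: np.sum((x - 1.0) ** 2)
x = np.zeros(5)
for _ in range(200):
    x -= 0.05 * zo_gradient(f, x)
```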
Krylov Subspace, Lanczos Iteration, QR and Conjugate Gradient Motivation In practice, many numerical algorithms iteratively multiply by a matrix, like the power method and the QR algorithm. All these algorithms have their core connected to a single construct, the Krylov subspace, and an operation, the Lanczos iteration. This note sets out to understand this core.
Jan 1, 2020
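The construct itself is compact; in standard notation, the power-method iterates $b, Ab, A^2b, \dots$ all live in the Krylov subspace

$$ \mathcal{K}_m(A, b) = \operatorname{span}\{\, b,\ Ab,\ A^2 b,\ \dots,\ A^{m-1} b \,\}, $$

and the Lanczos iteration builds an orthonormal basis for it when $A$ is symmetric.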
Deep Learning Environment Currently we find that multiple versions of CUDA can be installed on Windows, and different frameworks can use different CUDA versions nicely side by side. PyTorch Tensorflow Co-environment Currently, we can have
Dec 19, 2019
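A quick way to check which CUDA build each framework actually sees (a sketch using only standard attributes; exact API details vary across framework versions):

```python
import torch
import tensorflow as tf

# PyTorch reports the CUDA version it was compiled against, which can
# differ from the system-wide CUDA toolkit installation.
print("PyTorch CUDA build:", torch.version.cuda,
      "| GPU available:", torch.cuda.is_available())
print("TensorFlow built with CUDA:", tf.test.is_built_with_cuda())
```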
Objective Here I want to compare several common deep learning frameworks and make sense of their workflows. Core Logic Tensorflow General Comments: TF is more like a library, in which many low-level operations are defined and programs are long. In contrast, Keras, which can use TensorFlow as a backend, has a similar level of abstraction to PyTorch, a higher-level deep learning package. TFLearn may also be a higher-level wrapper.
Dec 18, 2019
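A sketch of the same tiny model at the two abstraction levels discussed above (layer sizes are arbitrary, chosen only for illustration):

```python
import torch.nn as nn
from tensorflow import keras

# Keras: declarative model definition with a compile-and-fit workflow.
k_model = keras.Sequential([
    keras.layers.Dense(32, activation="relu", input_shape=(10,)),
    keras.layers.Dense(1),
])
k_model.compile(optimizer="adam", loss="mse")

# PyTorch: similar module-level abstraction, but the training loop is explicit.
t_model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
```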
Note on Online Regression Algorithm Least Square Problem Classical least-squares linear regression is $$ \hat \beta_{ls}=\arg\min_\beta\|y-X\beta\|^2_2 $$ With regularization it becomes a ridge or lasso regression problem
Dec 15, 2019
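The post's own algorithm is not shown in this excerpt; one standard online solver for the ridge objective is recursive least squares, sketched here (names are illustrative): it maintains $P = (X^\top X + \lambda I)^{-1}$ and updates it one sample at a time via the Sherman-Morrison identity, so no per-step matrix inversion is needed.

```python
import numpy as np

class RecursiveLeastSquares:
    """Online ridge regression, one (x, y) sample at a time (sketch)."""

    def __init__(self, dim, lam=1.0):
        self.beta = np.zeros(dim)
        self.P = np.eye(dim) / lam   # (X^T X + lam*I)^{-1} at all times

    def update(self, x, y):
        Px = self.P @ x
        k = Px / (1.0 + x @ Px)              # gain vector
        self.beta += k * (y - x @ self.beta)  # correct by the prediction error
        self.P -= np.outer(k, Px)             # rank-one downdate of the inverse
```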
Motivation Sometimes we want to examine the Hessian or Jacobian of a function w.r.t. some variables. For that purpose, automatic differentiation (autograd) can help us. Autograd mechanism In essence, autograd requires a computational graph (a directed acyclic graph). For each computational node (e.g. $z=f(x,y)$) we define a forward computation $(x,y)\mapsto z,\ z=f(x,y)$, mapping bottom to top, and a backward computation mapping the partial derivative w.r.t. the top to the partial derivatives w.r.t. the bottom: $\partial_z\mapsto (\partial_x,\partial_y),\ (g_x,g_y)=g(g_z;x,y)$.
Dec 9, 2019
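A minimal sketch of the Hessian/Jacobian use case in PyTorch (assuming a version recent enough to ship `torch.autograd.functional`; the function $f$ is a toy example):

```python
import torch
from torch.autograd.functional import jacobian, hessian

def f(x):
    # Scalar-valued toy function: f(x) = sum_i x_i^3
    return (x ** 3).sum()

x = torch.tensor([1.0, 2.0])
print(jacobian(f, x))  # 3 * x_i^2        -> tensor([ 3., 12.])
print(hessian(f, x))   # diag(6 * x_i)    -> [[6., 0.], [0., 12.]]
```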