Linear Algebra

Note on HiPPO: Recurrent Memory with Optimal Polynomial Projection

Motivation: The hidden state of an RNN is a form of memory of the past. For a sequence, a natural way to represent the past is to project it onto an orthonormal basis set. Depending on which parts of the past we want to emphasize, we can define different measures on the time axis and construct the basis set with respect to that measure. We can then update the projection coefficients on this basis as new data points are observed.
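A minimal sketch of the basis-projection idea, assuming NumPy: represent a sampled signal on $[0,1]$ by its coefficients on the first $N$ orthonormal (shifted) Legendre polynomials under the uniform measure. This only illustrates the projection step, not the HiPPO recurrence itself; $N$ and the test signal are illustrative choices.

```python
import numpy as np

def trapezoid(y, t):
    # simple trapezoid-rule quadrature on the grid t
    return float(np.sum((y[:-1] + y[1:]) * 0.5 * np.diff(t)))

def basis(n, t):
    # shifted Legendre P_n(2t - 1), scaled so its L2 norm on [0, 1] is 1
    return np.sqrt(2 * n + 1) * np.polynomial.legendre.Legendre.basis(n)(2 * t - 1)

def project(f_vals, t, N=8):
    # projection coefficients <f, basis_n> under the uniform measure
    return np.array([trapezoid(f_vals * basis(n, t), t) for n in range(N)])

def reconstruct(coeffs, t):
    return sum(c * basis(n, t) for n, c in enumerate(coeffs))

t = np.linspace(0.0, 1.0, 1000)
f = np.sin(2 * np.pi * t)
coeffs = project(f, t, N=8)
f_hat = reconstruct(coeffs, t)   # close to f: smooth signals compress well
```

Eight coefficients already reconstruct this smooth signal to within about $10^{-3}$, which is the sense in which a short coefficient vector can serve as "memory" of the whole history.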

Jul 25, 2022

Note on S4-Efficiently Modeling Long Sequences with Structured State Spaces

Motivation: The S4 sequence model is on the rise in the sequence-modelling field. It dominates long-sequence modelling over RNNs, LSTMs, and Transformers. It is both mathematically elegant and practically useful, and it is trending, so why not write about it.

Jul 17, 2022

Some Famous Matrix Determinant and Inversion Identities

Motivation: How do we compute the determinant or inverse of a matrix with a low-rank modification? This is a very interesting and important technique in statistical methods, since people frequently model a covariance matrix or connectivity matrix this way: a base matrix plus a low-rank modification.
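Two classic identities of this kind are the Sherman–Morrison formula and the matrix determinant lemma for a rank-one update. A hedged numerical check, assuming NumPy; the matrix and vectors are arbitrary illustrative data:

```python
import numpy as np

# Sherman–Morrison:        (A + u v^T)^{-1} = A^{-1} - (A^{-1} u)(v^T A^{-1}) / (1 + v^T A^{-1} u)
# Matrix determinant lemma: det(A + u v^T)  = (1 + v^T A^{-1} u) det(A)

rng = np.random.default_rng(0)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)   # diagonal shift keeps A well-conditioned
u = rng.normal(size=n)
v = rng.normal(size=n)

A_inv = np.linalg.inv(A)
denom = 1.0 + v @ A_inv @ u

sm = A_inv - np.outer(A_inv @ u, v @ A_inv) / denom   # Sherman–Morrison inverse
direct = np.linalg.inv(A + np.outer(u, v))            # inverse computed directly

det_lemma = denom * np.linalg.det(A)                  # determinant via the lemma
det_direct = np.linalg.det(A + np.outer(u, v))
```

The payoff in practice: if $A^{-1}$ is already known, the update costs $O(n^2)$ instead of the $O(n^3)$ of a fresh inversion.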

Apr 16, 2022

Note on Kernel PCA

Motivation: Simply put, the “kernel trick” is the observation that some algorithms depend on the data only through inner products. Because of this, we can substitute the inner product with a fancier kernel function, i.e. an inner product in some other space. This post is about one such usage of the kernel trick; another usage is kernel (ridge) regression.
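Since only inner products appear, kernel PCA can be sketched by swapping the Gram matrix of dot products for a kernel matrix, double-centering it, and eigendecomposing. A hedged sketch assuming NumPy; the RBF bandwidth `gamma` and the synthetic data are illustrative choices:

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    # kernel matrix plays the role of the matrix of "inner products"
    d2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)
    K = np.exp(-gamma * d2)
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                       # center the data in feature space
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    # component scores: eigenvectors scaled by sqrt(eigenvalue)
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0.0))

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 100)
ring = np.c_[np.cos(theta), np.sin(theta)] * 2 + rng.normal(0, 0.05, (100, 2))
blob = rng.normal(0, 0.2, (100, 2))
X = np.vstack([ring, blob])              # concentric data, not linearly separable
Z = kernel_pca(X, n_components=2, gamma=2.0)
```

The point of the substitution: linear PCA on these concentric clusters sees almost no structure, while the kernelized Gram matrix can.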

Mar 21, 2022

Analytical Techniques for Finding Spectrum of Tridiagonal Toeplitz Matrix

Motivation: As mentioned in our high-dimensional PCA note, understanding the spectrum of Toeplitz matrices is important. The subject itself is a bit technical, but the analytical techniques involved are splendid and general. So here I take notes from this paper and present a way to calculate the spectrum on paper (or with Mathematica).
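For the tridiagonal Toeplitz case the analysis yields a classic closed form: an $n \times n$ matrix with diagonal $a$, subdiagonal $b$, and superdiagonal $c$ has eigenvalues $\lambda_k = a + 2\sqrt{bc}\,\cos\!\big(\tfrac{k\pi}{n+1}\big)$, $k = 1, \dots, n$. A hedged NumPy check with illustrative values of $a, b, c$ (taking $bc > 0$ so the spectrum is real):

```python
import numpy as np

n, a, b, c = 8, 2.0, 1.0, 0.5

# tridiagonal Toeplitz matrix: a on the diagonal, b below, c above
T = a * np.eye(n) + b * np.eye(n, k=-1) + c * np.eye(n, k=1)

numeric = np.sort(np.linalg.eigvals(T).real)

k = np.arange(1, n + 1)
closed_form = np.sort(a + 2 * np.sqrt(b * c) * np.cos(k * np.pi / (n + 1)))
```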

Feb 4, 2022

PCA of High Dimensional Trajectory

Motivation: When you think about random walks, what shape comes to mind? Is it like this? Or this? These are good examples of random walks in two or three dimensions. But what about random walks in higher dimensions?

Jan 22, 2022

Note on Kernel Ridge Regression

Motivation: Understand the use of kernels in regression problems. For their usage in unsupervised learning / dimension reduction, see the note on Kernel PCA. Kernels in classification: the kernel is usually introduced through SVM classification. The rationale is that a linearly non-separable dataset can become separable in a high-dimensional feature space under the mapping $\phi:\mathcal X\to\mathcal F$.
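The regression counterpart can be sketched in a few lines: kernel ridge regression depends on the data only through the matrix of pairwise kernel values, so swapping the plain inner product for an RBF kernel is a one-line change. A hedged sketch assuming NumPy; the kernel width `gamma`, ridge strength `lam`, and toy target are illustrative choices:

```python
import numpy as np

def rbf_kernel(A, B, gamma=50.0):
    # pairwise kernel values stand in for the inner products
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X = np.linspace(0.0, 1.0, 40)[:, None]
y = np.sin(4 * np.pi * X[:, 0])           # a nonlinear target

lam = 1e-3
K = rbf_kernel(X, X)
# dual weights: solve (K + lam I) alpha = y
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

y_hat = rbf_kernel(X, X) @ alpha          # predictions at the training points
```

A plain linear ridge regression on $X$ cannot fit this sine at all; the kernelized version fits it closely with the same algebra.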

Dec 17, 2021

Note on Laplacian Operator (Diffusion) in Geometry Processing

Note on the Laplace–Beltrami (Diffusion) Operator. Motivation: The Laplacian on graphs and on discrete geometry (meshes) is a very useful tool. One core intuition: just like the Laplacian on $\mathbb{R}^n$, it is related to diffusion and the heat equation. Recall that the diffusion equation is $\partial_t u = \Delta u$.
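A minimal sketch of that intuition on a graph, assuming NumPy: heat flow $\dot u = -Lu$ with the graph Laplacian $L$, integrated by explicit Euler steps. The path graph, step size, and initial spike are illustrative choices.

```python
import numpy as np

n = 50
W = np.eye(n, k=1) + np.eye(n, k=-1)   # path graph adjacency
L = np.diag(W.sum(1)) - W              # graph Laplacian L = D - W

u = np.zeros(n)
u[n // 2] = 1.0                        # unit heat spike in the middle

dt = 0.1                               # stable: dt < 2 / lambda_max(L) <= 2/4
for _ in range(500):
    u = u - dt * (L @ u)               # explicit Euler step of du/dt = -L u
```

Because $L\mathbf{1} = 0$, each step conserves total heat, and the spike spreads out toward the uniform state, exactly as on $\mathbb{R}^n$.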

May 8, 2020

Spectral Graph Theory and Segmentation

Motivation: Spectral graph theory is a powerful tool, as it sits at the intersection of multiple representations: it connects graphs, manifolds, and linear algebra. It is related to dynamics on graphs, to Markov chains, and to random walks (diffusion). It can be applied to any point cloud; images and meshes are well suited. It can be used to perform clustering, segmentation, etc. Linear algebra review: there are several ways to see an eigenvalue problem.
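A hedged sketch of the segmentation use case, assuming NumPy: bipartition a point cloud by the sign of the graph Laplacian's second eigenvector (the Fiedler vector). The Gaussian affinity and its width are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
                 rng.normal(2.0, 0.1, (20, 2))])   # two well-separated blobs

d2 = ((pts[:, None] - pts[None, :]) ** 2).sum(-1)
W = np.exp(-d2 / (2 * 0.5 ** 2))        # Gaussian affinity, sigma = 0.5
L = np.diag(W.sum(1)) - W               # unnormalized graph Laplacian

vals, vecs = np.linalg.eigh(L)          # eigenvalues in ascending order
fiedler = vecs[:, 1]                    # eigenvector of the 2nd-smallest eigenvalue
labels = (fiedler > 0).astype(int)      # sign pattern gives the two segments
```

The second-smallest eigenvalue measures how well connected the graph is, and its eigenvector's sign pattern approximates the minimum cut, which is exactly the clustering/segmentation use mentioned above.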

Apr 22, 2020

Krylov Subspace, Conjugate Gradient and Lanczos Iteration

Krylov Subspace, Lanczos Iteration, QR and Conjugate Gradient. Motivation: In practice, many numerical algorithms involve iteratively multiplying by a matrix, such as the power method and the QR algorithm. All of these algorithms are connected at their core to a single construct, the Krylov subspace, and a single operation, the Lanczos iteration. This note aims to understand that core.
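The simplest member of this family can be sketched directly: power iteration multiplies by $A$ at every step, so after $k$ steps the iterate lies in the Krylov subspace $\mathrm{span}\{b, Ab, \dots, A^k b\}$. A hedged sketch assuming NumPy; the matrix and starting vector are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6))
A = A @ A.T                      # symmetric PSD, so the spectrum is real

b = rng.normal(size=6)
x = b / np.linalg.norm(b)
for _ in range(200):
    x = A @ x                    # each step extends the Krylov subspace
    x = x / np.linalg.norm(x)    # renormalize to avoid overflow

rayleigh = x @ A @ x             # Rayleigh quotient: estimate of the top eigenvalue
```

Lanczos improves on this by keeping an orthonormal basis of the whole Krylov subspace instead of only its last vector, which is where the note goes next.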

Jan 1, 2020