Motivation When you think about random walks, what shape comes to mind? Is it something like this? Or this? These are good examples of random walks in two or three dimensions. But what about random walks in higher dimensions?
TOC {:toc} Motivation Understand the use of kernels in regression problems. For their use in unsupervised learning / dimension reduction, see the notes on Kernel PCA. Kernel in Classification Kernels are usually introduced in SVM classification problems. The rationale is that a linearly non-separable dataset can become separable in a high-dimensional feature space via the mapping $\phi:\mathcal X\to\mathcal F$.
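As a minimal sketch of this idea (assuming scikit-learn; the dataset and parameters are illustrative, not from the note): concentric circles are not linearly separable, but an implicit feature map through the RBF kernel makes them separable.

```python
# A linearly non-separable dataset (concentric circles) becomes separable
# once the RBF kernel's implicit feature map phi is used.
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

linear_svm = SVC(kernel="linear").fit(X_tr, y_tr)
rbf_svm = SVC(kernel="rbf", gamma=2.0).fit(X_tr, y_tr)

print("linear kernel accuracy:", linear_svm.score(X_te, y_te))  # near chance
print("RBF kernel accuracy:   ", rbf_svm.score(X_te, y_te))     # near 1.0
```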
Motivation There has been a resurgence of interest in investigating and developing Hopfield networks in recent years. This development is quite exciting in that it connects classic models from physics and machine learning to modern techniques like transformers.
Rationale A Hopfield Network can be viewed as an energy-based model: all of its properties can be derived from the energy function. A general RNN exhibits many complex behaviors, but imposing symmetric connections rules them out: with a symmetric weight matrix and asynchronous updates the energy never increases, so no oscillation is possible.
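A minimal sketch of the classical binary Hopfield network, assuming Hebbian storage and asynchronous updates (the sizes and patterns are illustrative):

```python
# Classical (binary) Hopfield network: symmetric Hebbian weights, asynchronous
# updates, energy that never increases along the dynamics.
import numpy as np

rng = np.random.default_rng(0)
n = 64
patterns = rng.choice([-1, 1], size=(3, n))          # patterns to store

# Hebbian rule gives a symmetric weight matrix with zero diagonal.
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0.0)

def energy(s):
    return -0.5 * s @ W @ s

# Start from a corrupted version of the first pattern and update asynchronously.
s = patterns[0] * rng.choice([1, -1], size=n, p=[0.8, 0.2])
for _ in range(5):
    for i in rng.permutation(n):
        s[i] = 1 if W[i] @ s >= 0 else -1            # energy is non-increasing
    print("energy:", energy(s))

print("recovered pattern 0:", np.array_equal(s, patterns[0]))
```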
Stability Theory When a system has a control input, the natural question is whether we can make it stable under that control. Problem Setup A control affine system takes the form $$ \dot x = f(x) + \sum_i g_i(x)\,\bar u_i = f(x) + g(x)\bar u $$ Interpretation:
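A minimal sketch of this structure (a pendulum with torque input; the dynamics and feedback gain are assumed for illustration, not taken from the note), where $f$ is the drift and $g$ the control vector field:

```python
# Control affine system dx/dt = f(x) + g(x) * u, simulated with forward Euler.
import numpy as np

def f(x):                       # drift term (uncontrolled dynamics)
    theta, omega = x
    return np.array([omega, -np.sin(theta)])

def g(x):                       # control vector field (single input)
    return np.array([0.0, 1.0])

def u(x):                       # a simple stabilizing state feedback (assumed)
    theta, omega = x
    return -2.0 * theta - 2.0 * omega

x = np.array([1.0, 0.0])        # initial state
dt = 0.01
for _ in range(2000):
    x = x + dt * (f(x) + g(x) * u(x))

print("state after 20s:", x)    # should be close to the origin
```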
Basic Notions Def Topological Equivalence: two dynamical systems are topologically equivalent when there is a homeomorphism mapping the orbits of one onto the orbits of the other. Def Conjugate: two maps $f$ and $g$ are conjugate when they are related by a homeomorphism $h$ via $$g=h^{-1}\circ f \circ h.$$
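A standard textbook instance of conjugacy (assumed here as an illustration, not from the note) is the tent map and the logistic map with $h(x)=\sin^2(\pi x/2)$; a quick numerical check:

```python
# Verify numerically that g = h^{-1} o f o h, i.e. h o g = f o h, for
# f = logistic map, g = tent map, h(x) = sin^2(pi x / 2).
import numpy as np

logistic = lambda x: 4 * x * (1 - x)                 # f
tent     = lambda x: 1 - np.abs(1 - 2 * x)           # g
h        = lambda x: np.sin(np.pi * x / 2) ** 2      # conjugating homeomorphism

x = np.linspace(0, 1, 1001)
# Checking h o g = f o h avoids inverting h explicitly.
print(np.allclose(h(tent(x)), logistic(h(x))))       # True
```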
Bifurcation Normal Form Invariance and Stable Manifold Lyapunov Stability Theory Feedback Stabilization
Stability Theory Motivation For a dynamical system (in this note, mostly autonomous systems), we want to know: when is it stable, what does stability mean, and if it is stable, how do we prove it? We will mainly use Lyapunov functions and the spectral properties of the linearized system for the proofs.
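A minimal sketch of the spectral approach (the damped-pendulum example is assumed for illustration): linearize at an equilibrium and check that all eigenvalues have negative real part.

```python
# Local stability from the spectrum of the linearization at an equilibrium.
import numpy as np

def dynamics(x):                         # dx/dt = f(x), autonomous
    theta, omega = x
    return np.array([omega, -np.sin(theta) - 0.5 * omega])

def jacobian(f, x0, eps=1e-6):           # numerical linearization at x0
    n = len(x0)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n); e[j] = eps
        J[:, j] = (f(x0 + e) - f(x0 - e)) / (2 * eps)
    return J

A = jacobian(dynamics, np.array([0.0, 0.0]))
eigvals = np.linalg.eigvals(A)
print("eigenvalues:", eigvals)
# All eigenvalues in the open left half-plane -> locally asymptotically stable.
print("stable:", np.all(eigvals.real < 0))
```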
Invariance Properties of an Invariant Set Stable and Unstable Manifold Theorem
Motivations Many CNN models have become the bread and butter of modern deep learning pipelines. Here I'm summarizing some famous CNN architectures and their key innovations as I use them.