ML

Note on Modern Hopfield Network and Transformers

Motivation There has been a resurgence of interest in investigating and developing Hopfield networks in recent years. This development is exciting in that it connects classic models in physics and machine learning to modern techniques like transformers.

Nov 15, 2021

Note on Hopfield Network

Rationale A Hopfield network can be viewed as an energy-based model: all of its properties can be derived from the energy function. A general RNN can exhibit many complex behaviors, but imposing symmetric connections rules them out: with a symmetric weight matrix, no oscillation is possible.
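A minimal sketch of why symmetry rules out oscillation, assuming the standard binary Hopfield formulation with asynchronous updates (the note's own derivation may differ):

```latex
% Energy of the binary Hopfield network (assumes w_{ij} = w_{ji}, w_{ii} = 0)
E(\mathbf{s}) = -\tfrac{1}{2}\sum_{i \neq j} w_{ij} s_i s_j - \sum_i b_i s_i,
\qquad s_i \in \{-1, +1\}.

% Asynchronous update of a single unit i:
s_i \leftarrow \operatorname{sign}\!\Big(\sum_j w_{ij} s_j + b_i\Big)

% The resulting change in energy is never positive, so the dynamics
% descend the (bounded) energy and settle at a fixed point:
\Delta E = -\,\Delta s_i \Big(\sum_j w_{ij} s_j + b_i\Big) \le 0.
```

Symmetry is what lets the $w_{ij}$ and $w_{ji}$ cross terms combine into a single term in $\Delta E$, which is why this energy argument, and hence the no-oscillation guarantee, does not carry over to general asymmetric RNNs.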

Nov 15, 2021

Note on Word2Vec

Motivation Word2Vec is a very famous method that I have heard of since my freshman year in college (yes, it came out in 2013). Recently, a reviewer reminded us of the similarity between the “analogies” learned by vector representations of words and the vector analogies in the image latent space of GANs or VAEs.

Nov 27, 2020

Note on Non-Parametric Regression

Problem Statement Given a cloud of noisy data points, you want a smooth curve running through the cloud. Since the points are noisy, the curve need not pass through every point.
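A minimal sketch of one such smoother, Nadaraya–Watson kernel regression with a Gaussian kernel (the note itself may cover other estimators; the function name and bandwidth are illustrative):

```python
import numpy as np

def kernel_smooth(x_train, y_train, x_query, bandwidth=0.3):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Each prediction is a weighted average of the noisy y values,
    with weights decaying smoothly with distance in x.
    """
    d2 = (x_query[:, None] - x_train[None, :]) ** 2   # pairwise squared distances
    w = np.exp(-0.5 * d2 / bandwidth ** 2)            # Gaussian weights
    w /= w.sum(axis=1, keepdims=True)                 # normalize per query point
    return w @ y_train                                # smooth curve, not interpolation

# Noisy data: the fitted curve follows the trend without hitting every point
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 2 * np.pi, 100))
y = np.sin(x) + 0.3 * rng.normal(size=x.shape)
x_grid = np.linspace(0, 2 * np.pi, 200)
y_smooth = kernel_smooth(x, y, x_grid)
```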

Oct 13, 2020

Note on Gaussian Process

A Gaussian process can be thought of as a Gaussian distribution over function space (or over infinite-dimensional vectors). One of its major uses is nonlinear regression, where it provides both a mean estimate and an error bar around it.
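A minimal sketch of GP regression with an RBF kernel, returning the posterior mean and error bar (helper names and hyperparameters are illustrative, not the note's exact formulation):

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0, variance=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior(x_train, y_train, x_test, noise=0.1):
    """Posterior mean and standard deviation of GP regression at x_test."""
    K = rbf_kernel(x_train, x_train) + noise ** 2 * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)        # solve instead of explicit inverse
    mean = K_s.T @ alpha
    v = np.linalg.solve(K, K_s)
    cov = K_ss - K_s.T @ v
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Mean estimate and error bar at new inputs
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 20)
y = np.sin(x) + 0.1 * rng.normal(size=x.shape)
x_new = np.linspace(-3, 3, 100)
mu, sigma = gp_posterior(x, y, x_new)   # predictive mean and +/- error bar
```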

Jul 1, 2020

Note on Bayesian Optimization

Related to the Gaussian process model. Philosophy Bayesian optimization applies to black-box functions and employs an active-learning philosophy. Use Case and Limitation BO is preferred in such cases
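A minimal self-contained sketch of the BO loop with a GP surrogate and an expected-improvement acquisition (the objective, kernel, and hyperparameters are stand-ins, not the note's own example):

```python
import numpy as np
from scipy.stats import norm

def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_fit_predict(x, y, x_cand, noise=1e-4):
    """GP surrogate: posterior mean/std at candidate points."""
    K = rbf(x, x) + noise * np.eye(len(x))
    K_s = rbf(x, x_cand)
    mu = K_s.T @ np.linalg.solve(K, y)
    var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    """EI acquisition for maximization."""
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

def objective(x):
    # Expensive black-box function (stands in for a real experiment)
    return -np.sin(3 * x) - x ** 2 + 0.7 * x

rng = np.random.default_rng(0)
x_obs = rng.uniform(-1.0, 2.0, 3)
y_obs = objective(x_obs)
x_cand = np.linspace(-1.0, 2.0, 200)

for _ in range(10):
    mu, sigma = gp_fit_predict(x_obs, y_obs, x_cand)
    x_next = x_cand[np.argmax(expected_improvement(mu, sigma, y_obs.max()))]
    x_obs = np.append(x_obs, x_next)              # query the black box where EI peaks
    y_obs = np.append(y_obs, objective(x_next))   # active learning: new label each round

print("best x, f(x):", x_obs[y_obs.argmax()], y_obs.max())
```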

Jul 1, 2020