Note on Online Regression Algorithm: Least Squares Problem. Classical least squares linear regression solves $$ \hat\beta_{\mathrm{ls}}=\arg\min_\beta\|y-X\beta\|_2^2. $$ With regularization, this becomes a ridge or lasso regression problem.
Dec 15, 2019
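Below is a minimal NumPy sketch (not taken from the post) of the two estimators the excerpt mentions: the ordinary least squares solution and its ridge-regularized variant via the closed form $(X^\top X+\lambda I)^{-1}X^\top y$. The synthetic data, variable names, and the value of $\lambda$ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed for illustration): y = X @ beta_true + noise.
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = rng.normal(size=p)
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Ordinary least squares: beta_ls = argmin_beta ||y - X beta||_2^2.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Ridge regression adds an L2 penalty lam * ||beta||_2^2,
# giving the closed form (X^T X + lam I)^{-1} X^T y.
lam = 1.0  # assumed regularization strength
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

print("OLS:  ", beta_ls)
print("Ridge:", beta_ridge)
```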
Problem Setting. The original non-negative matrix factorization problem is simple: if the dissimilarity $D(A\|HW)$ between the original matrix and its reconstruction is the L2 (Frobenius) distance, then $$ \arg\min_{H,W} \|A-HW\|_F^2, \quad \text{s.t. } W\succeq0,\ H\succeq0. $$ The non-negativity constraint applies element-wise.
Jun 25, 2019
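As a companion to the excerpt, here is a minimal sketch (not from the post) of one standard way to attack this objective: the Lee-Seung multiplicative update rules for the Frobenius loss, which keep $H$ and $W$ element-wise non-negative by construction. The function name, rank `k`, iteration count, and test matrix are assumptions for illustration.

```python
import numpy as np

def nmf_multiplicative(A, k, n_iter=500, eps=1e-10, seed=0):
    """Frobenius-norm NMF A ~= H @ W via Lee-Seung multiplicative updates.

    A: (m, n) non-negative matrix; H: (m, k); W: (k, n).
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    # Non-negative random initialization.
    H = rng.random((m, k)) + eps
    W = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates: element-wise scaling preserves non-negativity.
        W *= (H.T @ A) / (H.T @ H @ W + eps)
        H *= (A @ W.T) / (H @ W @ W.T + eps)
    return H, W

# Illustrative usage on a random non-negative matrix.
A = np.abs(np.random.default_rng(1).normal(size=(20, 15)))
H, W = nmf_multiplicative(A, k=4)
print("reconstruction error:", np.linalg.norm(A - H @ W, "fro"))
```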