8 items tagged
Motivation Here we summarize a few common probabilistic neural population models. Adapted from reading notes and class presentations from Neuro QC316, taught by Jan Drugowitsch. LNP, GLM These are the simplest models of neurons.
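To anchor the LNP (linear-nonlinear-Poisson) cascade mentioned above, here is a minimal simulation sketch in numpy. The filter shape, exponential nonlinearity, and array sizes are illustrative assumptions, not values from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: each row of `stimulus` is the recent stimulus
# history in one time bin; `k` is an assumed exponential-decay linear filter.
T, D = 1000, 20                       # time bins, filter length
stimulus = rng.normal(size=(T, D))
k = np.exp(-np.arange(D) / 5.0)
k /= np.linalg.norm(k)

drive = stimulus @ k                  # L: linear filtering of the stimulus
rate = np.exp(drive - 1.0)            # N: pointwise nonlinearity (exp here)
spikes = rng.poisson(rate)            # P: Poisson spike counts per bin
```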
Problem Statement Given a cloud of noisy data points, you want a smooth curve through them. Since the points are noisy, the curve need not pass through each one.
Note on Neural Tuning and Information Given a stimulus with $D$ intrinsic dimensions, we consider how one neuron or a population of neurons is informative about this stimulus space. Specific Information (Mutual Information) The setup for computing specific information is simple: given a particular response $r$, compute the reduction in entropy of the stimulus $s$.
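For reference, writing out the entropy-reduction description above as an equation (these are the standard definitions, not taken verbatim from the post): the specific information of a particular response $r$ is

$$ I_{sp}(r) = H(S) - H(S \mid r) = -\sum_s p(s)\log_2 p(s) + \sum_s p(s \mid r)\log_2 p(s \mid r), $$

and averaging it over responses recovers the mutual information, $I(S;R) = \sum_r p(r)\, I_{sp}(r)$.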
Note on Gaussian Process A Gaussian Process can be thought of as a Gaussian distribution over function space (or over infinite-dimensional vectors). One of its major uses is to tackle nonlinear regression problems, providing a mean estimate and an error bar around it.
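To make the "mean estimate plus error bar" concrete, here is a minimal GP regression sketch in numpy. The RBF kernel, its hyperparameters, and the toy sine data are assumptions for illustration, not the post's actual example:

```python
import numpy as np

def rbf(x1, x2, length=1.0, var=1.0):
    # Squared-exponential (RBF) kernel between two sets of 1D points.
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=20)              # training inputs (toy data)
y = np.sin(X) + 0.1 * rng.normal(size=20)    # noisy observations
Xs = np.linspace(-3, 3, 100)                 # test points

noise = 0.1 ** 2
K = rbf(X, X) + noise * np.eye(len(X))       # train covariance + obs. noise
Ks = rbf(Xs, X)                              # test-train covariance
Kss = rbf(Xs, Xs)                            # test covariance

alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                            # posterior mean estimate
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)    # posterior covariance
std = np.sqrt(np.clip(np.diag(cov), 0, None))  # the "error bar" per point
```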
Note on Bayesian Optimization Related to the Gaussian Process model Philosophy Bayesian Optimization applies to black-box functions and employs the active learning philosophy. Use Case and Limitation BO is preferred in such cases
Motivation Sometimes the matrix of samples to be correlated is too large, so you need to compute the correlation as the data pours in, i.e. compute the correlation online.
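A minimal sketch of the online idea, assuming the standard running-sums approach; this class and the toy stream are illustrative, not the post's actual implementation:

```python
import numpy as np

class OnlineCorrelation:
    """Streaming Pearson correlation between two scalar streams,
    keeping only running sums so no samples need to be stored."""
    def __init__(self):
        self.n = 0
        self.sx = self.sy = 0.0      # running sums of x and y
        self.sxx = self.syy = 0.0    # running sums of squares
        self.sxy = 0.0               # running sum of products

    def update(self, x, y):
        self.n += 1
        self.sx += x; self.sy += y
        self.sxx += x * x; self.syy += y * y
        self.sxy += x * y

    def corr(self):
        n = self.n
        cov = self.sxy - self.sx * self.sy / n
        vx = self.sxx - self.sx ** 2 / n
        vy = self.syy - self.sy ** 2 / n
        return cov / np.sqrt(vx * vy)

# Feed samples one at a time, as if the data were streaming in.
rng = np.random.default_rng(2)
oc = OnlineCorrelation()
for _ in range(1000):
    x = rng.normal()
    oc.update(x, 0.8 * x + rng.normal())
print(oc.corr())  # close to the true correlation of the stream
```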
Some Computation on Sphere (Updating) Motivation Recently, in research, we have encountered quite a few statistical problems on the sphere: for example, head direction tuning, the 3d direction of an object, the 3d direction of body parts, and other 3d tuning. There are many standard statistical operations in Euclidean space, like taking the mean and standard deviation, generating a uniform distribution, fitting a model, etc. We can perform these operations without thinking.
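As one example of a Euclidean operation that needs rethinking on the sphere, here is a sketch of the mean direction (normalize the sum of unit vectors, rather than taking the naive coordinate-wise mean, which lands inside the ball). The sample data and concentration measure are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy sample: unit vectors scattered around a preferred direction.
mu = np.array([0.0, 0.0, 1.0])
pts = mu + 0.3 * rng.normal(size=(200, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)  # project back onto sphere

# Mean direction: sum the unit vectors, then renormalize.
s = pts.sum(axis=0)
mean_dir = s / np.linalg.norm(s)

# Resultant length |s|/n in [0, 1]: a spherical analogue of concentration,
# playing roughly the role standard deviation plays in Euclidean space.
R = np.linalg.norm(s) / len(pts)
```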