Current Research Interests

After establishing myself at the Redwood Center for Theoretical Neuroscience, I have come to enjoy a number of general research areas and techniques. Here is an outline of the concepts that currently inform my research:

Statistics and Machine Learning

Bayesian Models, Graphical Models, Hidden Markov Models, Generalized Linear Models (GLMs), Sampling Algorithms (e.g., Particle Filtering, MCMC, HMC, Gibbs Sampling), Information Theory, Rate-Distortion Theory, Maximum Entropy Models, Optimization Methods (e.g., improvements on gradient descent)
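
To give a flavor of the sampling methods above, here is a minimal Gibbs sampler for a zero-mean bivariate Gaussian, where both conditionals are themselves Gaussian. It is only a sketch; the correlation, sample count, and function name are arbitrary choices for illustration.

    import numpy as np

    def gibbs_bivariate_gaussian(rho=0.8, n_samples=5000, seed=0):
        """Gibbs sampler for a zero-mean, unit-variance bivariate Gaussian with
        correlation rho: alternate exact draws from p(x1|x2) and p(x2|x1)."""
        rng = np.random.default_rng(seed)
        cond_std = np.sqrt(1.0 - rho ** 2)  # std of each Gaussian conditional
        x1, x2 = 0.0, 0.0
        samples = np.empty((n_samples, 2))
        for t in range(n_samples):
            x1 = rng.normal(rho * x2, cond_std)  # draw x1 given x2
            x2 = rng.normal(rho * x1, cond_std)  # draw x2 given x1
            samples[t] = (x1, x2)
        return samples

    samples = gibbs_bivariate_gaussian()
    print(np.corrcoef(samples.T))  # empirical correlation should approach rho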

Deep Learning

Backpropagation (including Automatic Differentiation), Convolutional Networks, Recurrent Networks, LSTMs, Energy-Based Models (including Contrastive Divergence, Minimum Probability Flow, Ising Models, RBMs), Neural Network Art, Deep Reinforcement Learning, Diffusion Probabilistic Models
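
Backpropagation is the chain rule applied layer by layer; a bare-bones sketch for a single tanh hidden layer trained with squared error (all sizes, data, and the learning rate below are arbitrary toy choices) might look like:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression problem
    X = rng.normal(size=(64, 3))
    y = np.sin(X.sum(axis=1, keepdims=True))

    # One hidden layer of tanh units, linear readout
    W1 = rng.normal(scale=0.5, size=(3, 16))
    W2 = rng.normal(scale=0.5, size=(16, 1))
    lr = 0.05

    for step in range(500):
        # Forward pass
        h = np.tanh(X @ W1)
        y_hat = h @ W2
        err = y_hat - y                       # residual for squared-error loss

        # Backward pass: chain rule through each layer
        grad_W2 = h.T @ err / len(X)
        grad_h = err @ W2.T * (1 - h ** 2)    # backprop through tanh
        grad_W1 = X.T @ grad_h / len(X)

        # Plain gradient descent update
        W1 -= lr * grad_W1
        W2 -= lr * grad_W2

    print("final MSE:", float(np.mean(err ** 2)))

Automatic differentiation tools (Theano and its successors) construct this backward pass mechanically from the forward computation graph.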

Vision

Retina, LGN, Visual Cortex, Fixational Eye Movements, Scanning Laser Ophthalmoscopy, Models of Feedback

Computational Neuroscience

Redundancy Reduction, Efficient Coding, Sparse Coding (and variants such as the Locally Competitive Algorithm and SAILnet), Natural Scene Statistics, Generalized Linear Models of Early Visual Processing, Hierarchical Models, Models of Feedback, Hopfield Networks
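
As an example of the sparse coding variants mentioned above, here is a rough sketch of inference in the spirit of the Locally Competitive Algorithm with a soft-threshold nonlinearity. The dictionary is random rather than learned, the step size, threshold, and dimensions are arbitrary, and exact recovery of the generating elements is not guaranteed.

    import numpy as np

    def lca_inference(x, Phi, lam=0.1, tau=10.0, n_steps=200, dt=1.0):
        """Sparse coefficients for signal x under dictionary Phi, via
        LCA-style dynamics with a soft-threshold nonlinearity (sketch only)."""
        b = Phi.T @ x                                # feedforward drive
        G = Phi.T @ Phi - np.eye(Phi.shape[1])       # lateral competition
        u = np.zeros(Phi.shape[1])                   # internal state
        soft = lambda v: np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)
        for _ in range(n_steps):
            a = soft(u)                              # thresholded code
            u += (dt / tau) * (b - u - G @ a)
        return soft(u)

    # Toy example: signal built from two columns of a random unit-norm dictionary
    rng = np.random.default_rng(0)
    Phi = rng.normal(size=(32, 64))
    Phi /= np.linalg.norm(Phi, axis=0)
    x = 2.0 * Phi[:, 3] - 1.5 * Phi[:, 17]
    a = lca_inference(x, Phi)
    print("nonzero coefficients:", np.flatnonzero(np.abs(a) > 1e-3))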

Computer Vision

Image Pyramids, Fourier Methods for Images, Optical Flow, Image Segmentation
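
A Gaussian image pyramid, one of the simplest of the image-pyramid constructions listed above, can be sketched as repeated blur-and-downsample steps (the blur width and number of levels here are arbitrary):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def gaussian_pyramid(image, n_levels=4, sigma=1.0):
        """Gaussian pyramid: low-pass filter, then downsample by 2 at each level."""
        levels = [image]
        for _ in range(n_levels - 1):
            blurred = gaussian_filter(levels[-1], sigma=sigma)  # blur to limit aliasing
            levels.append(blurred[::2, ::2])                    # keep every other pixel
        return levels

    pyramid = gaussian_pyramid(np.random.rand(64, 64))
    print([level.shape for level in pyramid])  # (64, 64), (32, 32), (16, 16), (8, 8)

A Laplacian pyramid then stores the differences between successive levels.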

Programming

Object-Oriented Programming, Python, C++, Emacs, Vim, Sublime Text 3, Theano, Blocks, Caffe, Git

Past Interests

High School Math Competitions (with a focus on Euclidean Geometry and Algebra), Abstract Algebra, Complex Analysis, Behavior of Dynamical Systems, Perturbation Theory, Asymptotics, Non-Diffusive Random Walks