Scalable nonparametric models for large-scale neural datasets

This talk covers statistical methods for characterizing neural population responses and extracting structure from high-dimensional neural data.

Topics covered in this lesson

Talk abstract: Statistical models for binary spike responses provide a powerful tool for understanding the statistical dependencies in large-scale neural recordings. Maximum entropy (or "maxent") models, which seek to explain dependencies in terms of low-order interactions between neurons, have enjoyed remarkable success in modeling such patterns, particularly for small groups of neurons. However, these models are computationally intractable for large populations, and low-order maxent models do not accurately describe many datasets. To overcome these limitations, I will describe a family of "universal" models for binary spike patterns, where universality refers to the ability to model arbitrary distributions over all 2^M possible binary patterns. The basic approach, which exploits ideas from Bayesian nonparametrics, combines a Dirichlet process with a well-behaved parametric "base" model, yielding a distribution that pairs the flexibility of a histogram with the parsimony of a parametric model. I will explore computational and statistical issues in scaling these models to large-scale recordings and show applications to neural data from primate V1.
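To make the "histogram plus parametric base model" idea concrete, here is a minimal sketch of the posterior-predictive distribution under a Dirichlet process prior over binary spike words. The independent-Bernoulli base measure, the firing rates, and the concentration parameter `alpha` are illustrative assumptions, not the specific model from the talk: the predictive probability of a word interpolates between its empirical count (histogram) and the base model's probability.

```python
import itertools
import numpy as np

def base_prob(pattern, rates):
    """Independent-Bernoulli base measure G0 (a hypothetical, simple
    choice of parametric base model): neuron i fires with rate rates[i]."""
    p = np.where(np.array(pattern) == 1, rates, 1.0 - rates)
    return float(np.prod(p))

def dp_predictive(pattern, counts, n, alpha, rates):
    """Posterior-predictive probability of a binary word under a
    Dirichlet process with concentration alpha and base measure G0:
    (count of word + alpha * G0(word)) / (n + alpha)."""
    return (counts.get(pattern, 0) + alpha * base_prob(pattern, rates)) / (n + alpha)

# Toy data: M = 3 neurons, a handful of observed spike words.
data = [(1, 0, 0), (1, 0, 0), (0, 1, 1), (0, 0, 0), (1, 0, 0)]
counts = {}
for w in data:
    counts[w] = counts.get(w, 0) + 1
n = len(data)

rates = np.array([0.4, 0.2, 0.2])  # base-model firing rates (illustrative)
alpha = 2.0                        # DP concentration (illustrative)

# The predictive is a proper distribution over all 2^M binary words:
# observed words borrow strength from their counts, unobserved words
# fall back on the parametric base model.
probs = {w: dp_predictive(w, counts, n, alpha, rates)
         for w in itertools.product([0, 1], repeat=3)}
print(round(sum(probs.values()), 6))  # -> 1.0
```

Note the scaling appeal: the base model supplies probabilities for all 2^M words without enumerating them, so only the observed (typically sparse) counts need to be stored.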