Introduction to neurons, synaptic transmission, and ion channels.

Difficulty level: Beginner
Duration: 46:07

Introduction to the types of glial cells; their role in homeostasis (influence on cerebral blood flow and on neurons); the insulation and protection of axons (the myelin sheath and nodes of Ranvier); and microglia and the reactions of the CNS to injury.

Difficulty level: Beginner
Duration: 40:32

This lecture covers: integrating information within a network, modulating and controlling networks, functions and dysfunctions of hippocampal networks, and the integrative network controlling sleep and arousal.

Difficulty level: Beginner
Duration: 47:05

This lecture focuses on the comprehension of nociception and pain sensation. It highlights how the somatosensory system and different molecular partners are involved in nociception, how nociception and pain sensation are studied in rodents and humans, and how pain therapies are developed.

Difficulty level: Beginner
Duration: 28:09
Speaker: Serena Quarta

Introduction to simple spiking neuron models.

Difficulty level: Beginner
Duration: 48 Slides
Speaker: Zubin Bhuyan
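To give a flavor of what "simple spiking neuron models" means in practice, here is a minimal sketch (not taken from the slides) of a leaky integrate-and-fire neuron, the simplest widely used spiking model; all parameter values are illustrative:

```python
def simulate_lif(I=1.5, tau=10.0, v_rest=0.0, v_th=1.0, v_reset=0.0,
                 dt=0.1, t_max=100.0):
    """Simulate a leaky integrate-and-fire neuron with constant input current.

    Forward-Euler integration of: tau * dV/dt = -(V - v_rest) + I.
    Whenever V crosses v_th, a spike is recorded and V is reset.
    Returns the list of (approximate) spike times in ms.
    """
    v = v_rest
    spikes = []
    t = 0.0
    while t < t_max:
        v += (dt / tau) * (-(v - v_rest) + I)
        if v >= v_th:
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes

spikes = simulate_lif()
print(f"{len(spikes)} spikes; first near t = {spikes[0]:.1f} ms")
```

With a subthreshold current (here, any `I` below `v_th`), the voltage settles below threshold and the model never fires, which is the defining behavior of this model class.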

Introductory presentation on how data science can help with scientific reproducibility.

Difficulty level: Beginner
Duration:
Speaker: Michel Dumontier

FAIR principles and methods currently in development for assessing FAIRness.

Difficulty level: Beginner
Duration:
Speaker: Michel Dumontier

Audio-slides presentation to accompany the paper "An automated pipeline for constructing personalized virtual brains from multimodal neuroimaging data" by M. Schirner, S. Rothmeier, V. Jirsa, A.R. McIntosh, and P. Ritter.

Difficulty level: Beginner
Duration: 4:56
Speaker:
Course:

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences spanning from hands-on modeling experience to meta-science interpretation skills across just about everything that could reasonably be included in the label "computational neuroscience". 

This lecture on model types introduces the advantages of modeling, provides examples of different model types, and explains what modeling is all about. It contains links to 3 tutorials, lecture/tutorial slides, a suggested reading list, and 3 recorded question-and-answer sessions.

Difficulty level: Beginner
Duration: 27:48
Speaker: Gunnar Blohm

This lecture focuses on how to get from a scientific question to a model, using concrete examples. We present a 10-step practical guide on how to succeed in modeling. This lecture contains links to 2 tutorials, lecture/tutorial slides, a suggested reading list, and 3 recorded question-and-answer sessions.

Difficulty level: Beginner
Duration: 29:52
Speaker: Megan Peters

This lecture formalizes modeling as a decision process that is constrained by a precise problem statement and specific model goals. We provide real-life examples showing that model building is usually less linear than presented in Modeling Practice I.

Difficulty level: Beginner
Duration: 22:51
Speaker: Gunnar Blohm

This lecture focuses on the purpose of model fitting, approaches to model fitting, model fitting for linear models, and how to assess the quality and compare model fits. We will present a 10-step practical guide on how to succeed in modeling. 

Difficulty level: Beginner
Duration: 26:46
Speaker: Jan Drugowitsch
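To illustrate the kind of linear-model fitting this lecture covers, here is a minimal sketch (not course material) of an ordinary least-squares line fit, with R² as one common way to assess fit quality; the data are synthetic:

```python
import random

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = intercept + slope * x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

def r_squared(xs, ys, intercept, slope):
    """Fraction of variance explained by the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# synthetic data: true intercept 2.0, true slope 1.5, Gaussian noise
random.seed(0)
xs = [i * 0.5 for i in range(40)]
ys = [2.0 + 1.5 * x + random.gauss(0.0, 0.5) for x in xs]
a, b = fit_line(xs, ys)
print(f"intercept = {a:.2f}, slope = {b:.2f}, R^2 = {r_squared(xs, ys, a, b):.3f}")
```

On perfectly linear data the recovered parameters are exact and R² equals 1; noise pulls R² below 1 while the estimates stay close to the true values.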

This lecture summarizes the concepts introduced in Model Fitting I and adds two further topics: (1) MLE as a frequentist way of looking at the data and the model, with its own limitations, and (2) a side-by-side comparison of bootstrapping and cross-validation.

Difficulty level: Beginner
Duration: 38:17
Speaker: Kunlin Wei
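As a concrete illustration of one of the resampling ideas mentioned above, here is a minimal sketch (not course material) of a percentile-bootstrap confidence interval; the sample values are made up:

```python
import random
import statistics

def bootstrap_ci(data, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a statistic.

    Resamples the data with replacement n_boot times, recomputes the
    statistic each time, and returns the empirical (alpha/2, 1 - alpha/2)
    percentiles of those resampled estimates.
    """
    rng = random.Random(seed)
    n = len(data)
    estimates = sorted(
        stat([rng.choice(data) for _ in range(n)]) for _ in range(n_boot)
    )
    lo = estimates[int((alpha / 2) * n_boot)]
    hi = estimates[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

sample = [4.8, 5.1, 5.3, 4.9, 5.6, 5.0, 5.2, 4.7, 5.4, 5.1]
lo, hi = bootstrap_ci(sample)
print(f"mean = {statistics.mean(sample):.2f}, 95% CI = [{lo:.2f}, {hi:.2f}]")
```

The contrast with cross-validation is that bootstrapping resamples to quantify the uncertainty of an estimate, while cross-validation holds data out to estimate predictive performance.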

This lecture provides an overview of generalized linear models (GLMs) and contains links to 2 tutorials, lecture/tutorial slides, a suggested reading list, and 3 recorded question-and-answer sessions.

Difficulty level: Beginner
Duration: 33:58
Speaker: Cristina Savin

This lecture further develops the concepts introduced in Machine Learning I.

Difficulty level: Beginner
Duration: 29:30
Speaker: I. Memming Park

This lecture introduces the core concepts of dimensionality reduction.

Difficulty level: Beginner
Duration: 31:43
Speaker: Byron Yu
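To make the core idea concrete, here is a minimal sketch (not course material) that finds the leading principal component, the one-dimensional subspace a PCA projection would keep, via power iteration on a 2×2 sample covariance matrix; the point cloud is synthetic:

```python
import math
import random

def leading_pc(data, n_iter=200):
    """First principal component of 2-D points via power iteration.

    data: list of (x, y) points. Returns a unit vector along the
    direction of maximal variance.
    """
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    # entries of the 2x2 sample covariance matrix
    cxx = sum((p[0] - mx) ** 2 for p in data) / n
    cyy = sum((p[1] - my) ** 2 for p in data) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in data) / n
    # power iteration: repeatedly apply the covariance matrix and normalize
    vx, vy = 1.0, 0.0
    for _ in range(n_iter):
        wx = cxx * vx + cxy * vy
        wy = cxy * vx + cyy * vy
        norm = math.hypot(wx, wy)
        vx, vy = wx / norm, wy / norm
    return vx, vy

# synthetic cloud elongated along y = x: the leading PC should be ~(0.71, 0.71)
random.seed(1)
pts = [(t + random.gauss(0, 0.1), t + random.gauss(0, 0.1))
       for t in [random.uniform(-1, 1) for _ in range(200)]]
vx, vy = leading_pc(pts)
print(f"leading PC ~ ({vx:.2f}, {vy:.2f})")
```

Projecting each point onto this vector compresses the 2-D cloud to one coordinate while keeping most of its variance, which is the essence of dimensionality reduction.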

This lecture applies dimensionality reduction to multi-dimensional neural recordings, using brain-computer interfaces with simultaneous spike recordings.

Difficulty level: Beginner
Duration: 30:15
Speaker: Byron Yu

This is part 1 of a 2-part series about Generalized Linear Models (GLMs), which are a fundamental framework for supervised learning. In this tutorial, the objective is to model a retinal ganglion cell spike train by fitting a temporal receptive field: first with a Linear-Gaussian GLM (also known as the ordinary least-squares regression model) and then with a Poisson GLM (a.k.a. the "Linear-Nonlinear-Poisson" model). In the next tutorial, we'll extend to a special case of GLMs, logistic regression, and learn how to ensure good model performance. This tutorial is designed to run with retinal ganglion cell spike train data from Uzzell & Chichilnisky 2004.

Difficulty level: Beginner
Duration: 8:09
Speaker: Anqi Wu
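The Linear-Nonlinear-Poisson idea mentioned above can be sketched in a few lines. This illustrative example (not the tutorial's code, and using synthetic data rather than the Uzzell & Chichilnisky recordings) fits a one-filter Poisson GLM with an exponential nonlinearity by gradient ascent on the log-likelihood:

```python
import math
import random

def sample_poisson(lam, rng):
    """Draw one Poisson(lam) sample (Knuth's multiplication method)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= threshold:
            return k
        k += 1

def fit_poisson_glm(stim, counts, lr=0.05, n_iter=500):
    """Fit a one-filter Linear-Nonlinear-Poisson model by gradient ascent.

    Rate model: lambda_t = exp(b + w * stim_t). The Poisson log-likelihood
    sum_t (counts_t * (b + w * stim_t) - lambda_t) is concave in (b, w),
    so plain gradient ascent on it converges.
    """
    b = w = 0.0
    n = len(stim)
    for _ in range(n_iter):
        gb = gw = 0.0
        for s, c in zip(stim, counts):
            lam = math.exp(b + w * s)
            gb += c - lam        # d(log-likelihood)/db
            gw += (c - lam) * s  # d(log-likelihood)/dw
        b += lr * gb / n
        w += lr * gw / n
    return b, w

# generate synthetic spike counts from a known LNP model, then recover it
rng = random.Random(0)
true_b, true_w = 0.2, 1.0
stim = [rng.gauss(0.0, 1.0) for _ in range(500)]
counts = [sample_poisson(math.exp(true_b + true_w * s), rng) for s in stim]
b, w = fit_poisson_glm(stim, counts)
print(f"recovered b = {b:.2f} (true 0.2), w = {w:.2f} (true 1.0)")
```

In the tutorial itself the filter is a multi-dimensional temporal receptive field rather than a single weight, but the likelihood and the concavity argument are the same.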

This is the continuation of part 1 of a 2-part series about Generalized Linear Models (GLMs), which are a fundamental framework for supervised learning. In this tutorial, the objective is to model a retinal ganglion cell spike train by fitting a temporal receptive field: first with a Linear-Gaussian GLM (also known as the ordinary least-squares regression model) and then with a Poisson GLM (a.k.a. the "Linear-Nonlinear-Poisson" model). In the next tutorial, we'll extend to a special case of GLMs, logistic regression, and learn how to ensure good model performance. This tutorial is designed to run with retinal ganglion cell spike train data from Uzzell & Chichilnisky 2004.

Difficulty level: Beginner
Duration: 8:02
Speaker: Anqi Wu