Course:

Neuromatch Academy aims to introduce traditional and emerging tools of computational neuroscience to trainees. It is appropriate for student populations ranging from undergraduates to faculty in academic settings, as well as industry professionals. In addition to teaching the technical details of computational methods, Neuromatch Academy also provides a curriculum centered on modern neuroscience concepts taught by leading professors, along with explicit instruction on how and why to apply models.

This lecture summarizes the Linear Dynamical Systems concepts covered in Linear Systems I (Intro Lecture) and Tutorials 1-4, and introduces motor neuroscience/neuroengineering, brain-machine interfaces, and applications of dynamical systems.

Difficulty level: Beginner

Duration: 33:03

Speaker: Krishna Shenoy

Course:

This course introduces the "hidden states" of neurons and networks: variables that affect their function but are hidden from us as experimenters. Today, you'll be working towards understanding how to use graphical models with hidden states to learn about dynamics in the world that we can only access through noisy measurements.

Difficulty level: Beginner

Duration: 31:30

Speaker: Sean Escola

Course:

This tutorial introduces the *Sequential Probability Ratio Test* between two hypotheses 𝐻𝐿 and 𝐻𝑅 by running simulations of a *Drift Diffusion Model (DDM)*. As independent and identically distributed (*i.i.d.*) samples from the true data-generating distribution come in, we accumulate our evidence linearly until a criterion is met, then decide which hypothesis to accept. Two types of stopping rule will be implemented: deciding after seeing a fixed amount of data, and deciding after the likelihood ratio passes a pre-defined threshold. Due to the noisy nature of the observations, the accumulated evidence has a *drift* term governed by the expected mean output and a *diffusion* term governed by the observation noise.

Overview of this tutorial:

- Simulate Drift-Diffusion Model with different stopping rules
- Observe the relation between accuracy and reaction time, get an intuition about the speed/accuracy tradeoff
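The threshold stopping rule above can be sketched in a few lines of Python. This is a minimal illustration, not the tutorial's code; the drift, noise, and threshold values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ddm(drift=0.1, noise=1.0, threshold=2.0, max_steps=10_000):
    """Accumulate log-likelihood-ratio evidence until a threshold is crossed.

    Returns (decision, reaction_time): decision is +1 for H_R, -1 for H_L,
    0 if no threshold was crossed within max_steps.
    """
    evidence = 0.0
    for t in range(1, max_steps + 1):
        # Each i.i.d. sample adds a drift term plus diffusion (observation) noise.
        evidence += drift + noise * rng.standard_normal()
        if abs(evidence) >= threshold:
            return (1 if evidence > 0 else -1), t
    return 0, max_steps

# Speed/accuracy tradeoff: higher thresholds are slower but more accurate.
results = [simulate_ddm(threshold=th) for th in (0.5, 2.0)]
```

Running many such trials at different thresholds exposes the speed/accuracy tradeoff directly: accuracy and mean reaction time both grow with the threshold.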

Difficulty level: Beginner

Duration: 4:46

Speaker: Yicheng Fei

Course:

This tutorial covers how to simulate a Hidden Markov Model (HMM) and observe how changing the transition probability and observation noise impacts what the samples look like. We'll then look at how uncertainty increases as we make predictions about the future without evidence (from observations), and how to gain information from the observations.

Overview of this tutorial:

- Build an HMM in Python and generate sample data
- Calculate how predictive probabilities propagate in a Markov chain with no evidence
- Combine new evidence and prediction from past evidence to estimate latent states
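A minimal sketch of the first two steps, assuming a symmetric two-state HMM with Gaussian emissions (all parameter values here are illustrative, not the tutorial's):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two-state HMM: a switching probability and Gaussian observation noise.
p_switch = 0.1
A = np.array([[1 - p_switch, p_switch],
              [p_switch, 1 - p_switch]])   # transition matrix
means, sigma = np.array([-1.0, 1.0]), 0.5  # emission mean per state, noise std

def sample_hmm(T=100):
    """Generate latent states and noisy observations from the HMM."""
    states = np.zeros(T, dtype=int)
    for t in range(1, T):
        states[t] = rng.choice(2, p=A[states[t - 1]])
    obs = means[states] + sigma * rng.standard_normal(T)
    return states, obs

def predict(p0, n_steps):
    """Propagate state probabilities forward with no evidence.

    With no observations, the prediction relaxes toward the stationary
    distribution (uniform here), i.e. uncertainty grows."""
    p = np.array(p0, dtype=float)
    for _ in range(n_steps):
        p = p @ A
    return p
```

Starting from certainty, e.g. `predict([1.0, 0.0], n)`, the predictive distribution decays toward 50/50 as `n` grows, which is exactly the loss of information the tutorial visualizes.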

Difficulty level: Beginner

Duration: 4:48

Speaker: Yicheng Fei

Course:

This tutorial covers how to infer a latent model when our states are continuous. Particular attention is paid to the Kalman filter and its mathematical foundation.

Overview of this tutorial:

- Review linear dynamical systems
- Learn about and implement the Kalman filter
- Explore how the Kalman filter can be used to smooth data from an eye-tracking experiment
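A minimal one-dimensional Kalman filter sketch for a linear dynamical system (the tutorial's actual model and parameters may differ):

```python
import numpy as np

def kalman_filter(obs, A=1.0, C=1.0, Q=0.1, R=1.0, mu0=0.0, V0=1.0):
    """1-D Kalman filter for x_t = A x_{t-1} + w,  y_t = C x_t + v.

    Q and R are the process and observation noise variances.
    Returns the filtered posterior means and variances."""
    mu, V = mu0, V0
    mus, Vs = [], []
    for y in obs:
        # Predict: push the previous estimate through the dynamics.
        mu_pred = A * mu
        V_pred = A * V * A + Q
        # Update: correct the prediction with the new observation.
        K = V_pred * C / (C * V_pred * C + R)   # Kalman gain
        mu = mu_pred + K * (y - C * mu_pred)
        V = (1 - K * C) * V_pred
        mus.append(mu)
        Vs.append(V)
    return np.array(mus), np.array(Vs)
```

The gain `K` sets how much each new observation moves the estimate: noisy observations (large `R`) shrink `K`, so the filter leans on its prediction, which is what makes it useful for smoothing noisy eye-tracking data.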

Difficulty level: Beginner

Duration: 2:38

Speaker: Caroline Haimerl and Byron Galbraith

Course:

This lecture covers multiple topics on dynamical neural modeling and inference and their application to basic neuroscience and neurotechnology design: (1) how to develop multiscale dynamical models and filters; (2) how to study neural dynamics across spatiotemporal scales; (3) how to dissociate and model behaviorally relevant neural dynamics; (4) how to model neural dynamics in response to electrical stimulation input; and (5) how to apply these techniques in developing brain-machine interfaces (BMIs) to restore lost motor or emotional function.

Difficulty level: Beginner

Duration: 30:40

Speaker: Maryam M. Shanechi

Course:

This lecture provides an introduction to optimal control, describes open-loop and closed-loop control, and covers applications to motor control.

Difficulty level: Beginner

Duration: 36:23

Speaker: Maurice Smith

Course:

In this tutorial, you will perform a *Sequential Probability Ratio Test* between two hypotheses 𝐻𝐿 and 𝐻𝑅 by running simulations of a *Drift Diffusion Model (DDM)*.

Difficulty level: Beginner

Duration: 4:46

Speaker: Zhengwei Wu

Course:

In this tutorial, you will implement a continuous control task: you will design control inputs for a linear dynamical system to reach a target state.
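One simple way to sketch this idea, assuming a one-dimensional integrator system and a hypothetical proportional-feedback controller (the tutorial may use a different design, e.g. an optimal-control formulation):

```python
# Drive a 1-D linear system x_{t+1} = a*x_t + b*u_t toward a target state.
# All parameter values are illustrative.
a, b = 1.0, 0.5          # open-loop dynamics (an integrator here)
goal, k = 2.0, 0.8       # target state and feedback gain

x, xs = 0.0, []
for t in range(50):
    u = k * (goal - x)   # control input proportional to the remaining error
    x = a * x + b * u    # closed-loop update
    xs.append(x)
```

Each step shrinks the error by a factor of `1 - b*k` (0.6 here), so the state converges geometrically to the target; gains with `|1 - b*k| >= 1` would instead oscillate or diverge.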

Difficulty level: Beginner

Duration: 10:02

Speaker: Zhengwei Wu

Course:

This lecture covers the utility of action: vigor and neuroeconomics of movement and applications to foraging and the marginal value theorem.

Difficulty level: Beginner

Duration: 28:48

Speaker: Reza Shadmehr

Course:

This lecture provides an introduction to a variety of topics in Reinforcement Learning.

Difficulty level: Beginner

Duration: 39:12

Speaker: Doina Precup

Course:

This tutorial shows how to estimate state-value functions in a classical conditioning paradigm using Temporal Difference (TD) learning, and examines TD errors at the presentation of the conditioned and unconditioned stimulus (CS and US) under different CS-US contingencies. These exercises will provide you with an understanding of both how reward prediction errors (RPEs) behave in classical conditioning and what we should expect to see if dopamine represents a "canonical" model-free RPE.
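A minimal TD(0) sketch of such a conditioning trial, treating time steps within the trial as states; the trial length, US time, learning rate, and reward value are illustrative:

```python
import numpy as np

# States are time steps within a trial; the US (reward) arrives at t_us.
T, t_us, reward = 20, 15, 1.0
alpha, gamma = 0.1, 1.0
V = np.zeros(T + 1)          # state-value estimates, one per time step

for trial in range(500):
    for t in range(T):
        r = reward if t == t_us else 0.0
        delta = r + gamma * V[t + 1] - V[t]   # TD error (the RPE)
        V[t] += alpha * delta
```

Over trials, value propagates backward from the US toward earlier time steps, so the RPE at the US shrinks toward zero while value at the CS grows: the signature pattern the tutorial compares against dopamine recordings.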

Difficulty level: Beginner

Duration: 6:57

Speaker: Eric DeWitt

Course:

In this tutorial, you will use 'bandits' to understand the fundamentals of how a policy interacts with the learning algorithm in reinforcement learning.
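A minimal epsilon-greedy sketch on a hypothetical three-armed Bernoulli bandit; the arm probabilities and parameters are illustrative, and the tutorial may compare several other policies:

```python
import numpy as np

rng = np.random.default_rng(2)

p_reward = np.array([0.2, 0.5, 0.8])   # true reward probability per arm
q = np.zeros(3)                        # estimated value of each arm
n = np.zeros(3)                        # pull counts
epsilon = 0.1

for trial in range(2000):
    if rng.random() < epsilon:
        a = int(rng.integers(3))       # explore: random arm
    else:
        a = int(np.argmax(q))          # exploit: current best arm
    r = float(rng.random() < p_reward[a])
    n[a] += 1
    q[a] += (r - q[a]) / n[a]          # incremental sample-average update
```

The `epsilon` parameter is the policy's knob on the explore/exploit tradeoff: the learning rule is identical either way, but which data it gets to learn from depends on the policy.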

Difficulty level: Beginner

Duration: 6:55

Speaker: Eric DeWitt

Course:

In this tutorial, you will learn how to act in the more realistic setting of sequential decisions, formalized by Markov Decision Processes (MDPs). In a sequential decision problem, the actions executed in one state not only may lead to immediate rewards (as in a bandit problem), but may also affect the states experienced next (unlike a bandit problem). Each individual action may therefore affect all future rewards. Thus, making decisions in this setting requires considering each action in terms of its expected **cumulative** future reward.
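A tiny illustration of this point, using value iteration on a hypothetical two-state MDP in which the action with the best immediate reward is not the best action overall:

```python
import numpy as np

# In state 0, action 0 pays 1 now but stays in state 0; action 1 pays 0
# now but moves to state 1, where action 0 then pays 10 on every step.
R = np.array([[1.0, 0.0],    # R[s, a]: immediate reward
              [10.0, 0.0]])
S_next = np.array([[0, 1],   # S_next[s, a]: deterministic next state
                   [1, 1]])
gamma = 0.9

# Value iteration: back up expected cumulative discounted reward.
V = np.zeros(2)
for _ in range(200):
    V = np.max(R + gamma * V[S_next], axis=1)

policy = np.argmax(R + gamma * V[S_next], axis=1)
```

The optimal policy forgoes the immediate reward in state 0 (`policy[0]` is action 1), because the cumulative discounted return of moving to state 1 dominates: exactly the distinction between bandit and MDP reasoning.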

Difficulty level: Beginner

Duration: 11:16

Speaker: Marcelo Mattar

Course:

In this tutorial, you will implement one of the simplest model-based Reinforcement Learning algorithms, Dyna-Q. You will understand what a world model is, how it can improve the agent's policy, and the situations in which model-based algorithms are more advantageous than their model-free counterparts.
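A minimal Dyna-Q sketch on a hypothetical five-state chain task; the environment, parameters, and tie-breaking scheme are illustrative, not the tutorial's:

```python
import numpy as np

rng = np.random.default_rng(3)

# Chain environment: reward 1 only for moving right from the last state.
n_states, goal_reward = 5, 1.0
alpha, gamma, epsilon, n_planning = 0.5, 0.9, 0.1, 10

Q = np.zeros((n_states, 2))   # action 0 = left, action 1 = right
model = {}                    # (s, a) -> (r, s_next), learned from experience

def step(s, a):
    """Deterministic chain: rightmost right-move pays off and restarts."""
    if a == 1 and s == n_states - 1:
        return goal_reward, 0
    s_next = max(s - 1, 0) if a == 0 else min(s + 1, n_states - 1)
    return 0.0, s_next

s = 0
for t in range(1000):
    if rng.random() < epsilon:
        a = int(rng.integers(2))
    else:  # greedy, breaking ties randomly
        a = int(rng.choice(np.flatnonzero(Q[s] == Q[s].max())))
    r, s_next = step(s, a)
    # Direct RL update from real experience.
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
    model[(s, a)] = (r, s_next)   # update the learned world model
    # Planning: replay n_planning simulated transitions from the model.
    for _ in range(n_planning):
        (ps, pa), (pr, ps_next) = list(model.items())[rng.integers(len(model))]
        Q[ps, pa] += alpha * (pr + gamma * Q[ps_next].max() - Q[ps, pa])
    s = s_next
```

The planning loop is what makes this model-based: each real step is amplified by simulated updates drawn from the learned model, so value propagates along the chain far faster than with model-free Q-learning alone.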

Difficulty level: Beginner

Duration: 9:10

Speaker: Marcelo Mattar

Course:

This lecture highlights up-and-coming issues in the neuroscience of reinforcement learning.

Difficulty level: Beginner

Duration: 33:25

Speaker: Tim Behrens

Course:

Since their introduction in 2016, the FAIR data principles have gained increasing recognition and adoption in global neuroscience. FAIR defines a set of high-level principles and practices for making digital objects, including data, software, and workflows, Findable, Accessible, Interoperable, and Reusable. But FAIR is not a specification; it leaves many of the specifics up to individual scientific disciplines to define. INCF has been leading the way in promoting, defining, and implementing FAIR data practices for neuroscience. We have been bringing together researchers, infrastructure providers, industry, and publishers through our programs and networks. In this session, we will hear some perspectives on FAIR neuroscience from some of these stakeholders who have been working to develop and use FAIR tools for neuroscience. We will engage in a discussion on questions such as: how is neuroscience doing with respect to FAIR? What have been the successes? What is currently very difficult? Where does neuroscience need to go?

This lecture covers FAIR atlases: their background, their construction, and how they can be created in line with the FAIR principles.

Difficulty level: Beginner

Duration: 14:24

Speaker: Heidi Kleven

Course:

This lecture covers the biomedical researcher's perspective on FAIR data sharing and the importance of finding better ways to manage large datasets.

Difficulty level: Beginner

Duration: 10:51

Speaker: Adam Ferguson

Course:

This lecture covers multiple aspects of FAIR neuroscience data: what makes it unique, the challenges to making it FAIR, the importance of overcoming these challenges, and how data governance comes into play.

Difficulty level: Beginner

Duration: 14:56

Speaker: Damian Eke

As models in neuroscience have become increasingly complex, it has become more difficult to share all aspects of models and model analysis, hindering model accessibility and reproducibility. In this session, we will discuss existing resources for promoting FAIR data and models in computational neuroscience, their impact on the field, and the remaining barriers. This lecture introduces the FAIR principles, how they relate to the field of computational neuroscience, and the resources available.

Difficulty level: Beginner

Duration: 8:47

Speaker: Sharon Crook
