This lesson provides an introduction to biologically detailed computational modelling of neural dynamics, including neuron membrane potential simulation and F-I curves. 

Difficulty level: Intermediate
Duration: 8:21
Speaker: Mike X. Cohen

In this lesson, users learn how to use MATLAB to build an adaptive exponential integrate-and-fire (AdEx) neuron model. A minimal sketch of the model's update equations follows below.

Difficulty level: Intermediate
Duration: 22:01
Speaker: Mike X. Cohen
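For orientation, here is a minimal Python sketch of the AdEx update equations (the lesson itself uses MATLAB; the parameter values below are common illustrative choices, not necessarily those used in the lesson):

```python
import numpy as np

# Minimal AdEx (adaptive exponential integrate-and-fire) sketch.
# Illustrative parameters only; the lesson builds this model in MATLAB.
C, gL, EL = 281.0, 30.0, -70.6          # capacitance (pF), leak (nS), rest (mV)
VT, DeltaT = -50.4, 2.0                 # threshold slope parameters (mV)
a, b, tau_w = 4.0, 80.5, 144.0          # adaptation coupling (nS), jump (pA), time constant (ms)
V_peak, V_reset = 20.0, -70.6           # spike cutoff and reset (mV)

dt, T, I = 0.1, 500.0, 800.0            # step (ms), duration (ms), input current (pA)
V, w, spikes = EL, 0.0, []

for step in range(int(T / dt)):
    dV = (-gL * (V - EL) + gL * DeltaT * np.exp((V - VT) / DeltaT) - w + I) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= V_peak:                     # spike: reset voltage, increment adaptation
        spikes.append(step * dt)
        V = V_reset
        w += b

print(f"{len(spikes)} spikes in {T} ms")
```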

In this lesson, users learn about the practical differences between MATLAB scripts and functions, as well as how to embed their neuronal simulation into a callable function. A brief illustration follows below.

Difficulty level: Intermediate
Duration: 11:20
Speaker: Mike X. Cohen
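As a rough illustration of the script-versus-function idea (shown in Python rather than MATLAB; the simulation itself is a placeholder, not the lesson's model):

```python
# The "script" version would hard-code I_amp and run at the top level;
# wrapping the simulation in a function lets other code call it with different inputs.
def simulate_neuron(I_amp, T=500.0, dt=0.1, tau=10.0, V_th=-50.0,
                    V_reset=-65.0, E_L=-65.0, R=10.0):
    """Run a simple leaky integrate-and-fire simulation and return spike times (ms)."""
    V, spikes = E_L, []
    for step in range(int(T / dt)):
        V += dt * (-(V - E_L) + R * I_amp) / tau
        if V >= V_th:
            spikes.append(step * dt)
            V = V_reset
    return spikes

print(len(simulate_neuron(I_amp=2.0)))   # call the same code with any input current
```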

This lesson teaches users how to generate a frequency-current (F-I) curve, which describes the function that relates the net synaptic current (I) flowing into a neuron to its firing rate (F). A worked example follows below.

Difficulty level: Intermediate
Duration: 20:39
Speaker: Mike X. Cohen
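The shape of an F-I curve can also be seen from the closed-form expression for a simple leaky integrate-and-fire neuron (a simpler model than the one built in these lessons); parameter values below are illustrative:

```python
import numpy as np
import matplotlib.pyplot as plt

# Closed-form F-I curve for a leaky integrate-and-fire neuron.
# Voltages are measured relative to rest; all values are illustrative.
tau_m = 10e-3      # membrane time constant (s)
R     = 100e6      # membrane resistance (ohm)
V_th  = 15e-3      # spike threshold above rest (V)
t_ref = 2e-3       # absolute refractory period (s)

I = np.linspace(0.0, 1e-9, 500)                 # input current (A)
f = np.zeros_like(I)
supra = R * I > V_th                            # only suprathreshold currents produce spikes
f[supra] = 1.0 / (t_ref + tau_m * np.log(R * I[supra] / (R * I[supra] - V_th)))

plt.plot(I * 1e12, f)
plt.xlabel("Input current (pA)")
plt.ylabel("Firing rate (Hz)")
plt.title("F-I curve of a LIF neuron")
plt.show()
```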

This lightning talk describes an automated pipeline for positron emission tomography (PET) data. 

Difficulty level: Intermediate
Duration: 7:27

This lecture provides a detailed description of how to process workflows in the virtual research environment (VRE), including approaches to standardization, metadata, containerization, and constructing and maintaining scientific pipelines. 

Difficulty level: Intermediate
Duration: 1:03:55
Speaker: Patrik Bey

This lesson is the first of three hands-on tutorials in the workshop Research Workflows for Collaborative Neuroscience. This tutorial goes over how to visualize data with Scanpy, a scalable toolkit for analyzing single-cell gene expression. A short example follows below.

Difficulty level: Intermediate
Duration: 25:26
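A minimal Scanpy sketch of this kind of visualization; the dataset and marker gene are examples and may differ from those used in the tutorial:

```python
import scanpy as sc

# Load a small public single-cell dataset bundled with Scanpy (example choice).
adata = sc.datasets.pbmc3k()

# Standard preprocessing before embedding.
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
sc.pp.pca(adata)
sc.pp.neighbors(adata)
sc.tl.umap(adata)

# Color the UMAP embedding by an example marker gene.
sc.pl.umap(adata, color="CST3")
```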

In this third and final hands-on tutorial from the Research Workflows for Collaborative Neuroscience workshop, you will learn about workflow orchestration using open-source tools like DataJoint and Flyte. A minimal Flyte sketch follows below.

Difficulty level: Intermediate
Duration: 22:36
Speaker: Daniel Xenes
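A minimal sketch of Flyte-style orchestration using the flytekit Python SDK; the tasks here are toy placeholders, not the workshop's actual pipeline:

```python
from typing import List
from flytekit import task, workflow

@task
def load_spike_counts(session_id: int) -> List[int]:
    # In a real pipeline this would pull data from storage or a database table.
    return [12, 7, 31, 5]

@task
def mean_rate(counts: List[int], duration_s: float) -> float:
    return sum(counts) / (len(counts) * duration_s)

@workflow
def session_summary(session_id: int, duration_s: float = 10.0) -> float:
    counts = load_spike_counts(session_id=session_id)
    return mean_rate(counts=counts, duration_s=duration_s)

if __name__ == "__main__":
    # Flyte workflows are plain callables when run locally.
    print(session_summary(session_id=1))
```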

This lesson provides an overview of the current status in the field of neuroscientific ontologies, presenting examples of data organization and standards, particularly from neuroimaging and electrophysiology. 

Difficulty level: Intermediate
Duration: 33:41

This lesson continues from part one of the lecture Ontologies, Databases, and Standards, diving deeper into a description of ontologies and knowledge graphs. 

Difficulty level: Intermediate
Duration: 50:18
Speaker: Jeff Grethe

This lecture describes how to build research workflows, including a demonstration using DataJoint Elements to build data pipelines. A minimal DataJoint sketch follows below.

Difficulty level: Intermediate
Duration: 47:00
Speaker: Dimitri Yatsenko
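A minimal DataJoint sketch of the pipeline idea; it assumes a configured database connection, and the table names and fields are illustrative rather than those from the DataJoint Elements demo:

```python
import datajoint as dj

# Assumes dj.config has valid database credentials (host, user, password).
schema = dj.schema('tutorial_pipeline')

@schema
class Session(dj.Manual):
    definition = """
    session_id   : int
    ---
    session_date : date
    """

@schema
class SpikeCount(dj.Computed):
    definition = """
    -> Session
    ---
    n_spikes : int
    """
    def make(self, key):
        # A real pipeline would load and analyze data here; this is a placeholder.
        self.insert1(dict(key, n_spikes=42))

# Session.insert1({'session_id': 1, 'session_date': '2024-01-01'})
# SpikeCount.populate()   # computes entries for all sessions not yet processed
```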

This lesson delves into the structure of one of the brain's most elemental computational units, the neuron, and how this structure influences computational neural network models. 

Difficulty level: Intermediate
Duration: 6:33
Speaker: Marcus Ghosh

In this lesson, you will learn how machine learners and neuroscientists construct abstract computational models based on various neurophysiological signalling properties. 

Difficulty level: Intermediate
Duration: 10:52
Speaker: Dan Goodman

In this lesson, you will learn about some typical neuronal models employed by machine learners and computational neuroscientists, which are meant to imitate the biophysical properties of real neurons. A brief illustrative contrast follows below.

Difficulty level: Intermediate
Duration: 3:12
Speaker: Dan Goodman
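As a rough contrast between the model families discussed here, a stateless artificial unit next to a stateful spiking (leaky integrate-and-fire) unit; all values are arbitrary:

```python
import numpy as np

def artificial_unit(x, w):
    """ReLU unit: the output depends only on the current input."""
    return max(0.0, float(np.dot(w, x)))

class SpikingUnit:
    """Leaky integrate-and-fire unit: carries membrane state across time steps."""
    def __init__(self, tau=10.0, v_th=1.0):
        self.tau, self.v_th, self.v = tau, v_th, 0.0

    def step(self, x, w, dt=1.0):
        self.v += dt * (-self.v + float(np.dot(w, x))) / self.tau
        if self.v >= self.v_th:
            self.v = 0.0
            return 1          # emit a spike and reset
        return 0

w = np.array([0.5, 0.8])
x = np.array([1.0, 1.0])
unit = SpikingUnit()
print(artificial_unit(x, w), [unit.step(x, w) for _ in range(20)])
```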

This lesson contains practical exercises which accompany the first few lessons of the Neuroscience for Machine Learners (Neuro4ML) course. 

Difficulty level: Intermediate
Duration: 5:58
Speaker: Dan Goodman

In this lesson, you will learn how machine learners and computational neuroscientists design and build models of neuronal synapses. A minimal example follows below.

Difficulty level: Intermediate
Duration: 8:59
Speaker: Dan Goodman
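A minimal sketch of one common synapse model, a current-based exponential synapse, with illustrative values (not necessarily the formulation used in the lesson):

```python
import numpy as np

# Each presynaptic spike adds a fixed weight to the synaptic current,
# which then decays exponentially toward zero.
dt, T = 0.1, 100.0          # time step and duration (ms)
tau_syn, w = 5.0, 1.0       # decay time constant (ms), synaptic weight
spike_times = [10.0, 12.0, 40.0]

steps = int(T / dt)
I_syn = np.zeros(steps)
for step in range(1, steps):
    I_syn[step] = I_syn[step - 1] * (1.0 - dt / tau_syn)   # exponential decay
    if any(abs(step * dt - t) < dt / 2 for t in spike_times):
        I_syn[step] += w                                    # presynaptic spike arrives

print(I_syn.max())
```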

This lesson introduces some practical exercises which accompany the Synapses and Networks portion of this Neuroscience for Machine Learners course. 

Difficulty level: Intermediate
Duration: 3:51
Speaker: Dan Goodman

This lesson describes spike timing-dependent plasticity (STDP), a biological process that adjusts the strength of connections between neurons in the brain, and how one can implement or mimic this process in a computational model (a minimal sketch follows below). You will also find links for practical exercises at the bottom of this page. 

Difficulty level: Intermediate
Duration: 12:50
Speaker: Dan Goodman
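A minimal sketch of a pair-based STDP rule with exponential traces, using illustrative parameters (not necessarily the exact rule presented in the lesson):

```python
import numpy as np

# Pre-before-post spike pairs potentiate the weight; post-before-pre pairs depress it.
dt, T = 1.0, 200.0                  # time step and duration (ms)
tau_pre, tau_post = 20.0, 20.0      # trace time constants (ms)
A_plus, A_minus = 0.01, 0.012       # learning rates (slightly depression-biased)

pre_spikes  = {20, 60, 100, 140}    # spike times (ms)
post_spikes = {25, 55, 105, 150}

w, x_pre, x_post = 0.5, 0.0, 0.0
for t in np.arange(0.0, T, dt):
    x_pre  *= np.exp(-dt / tau_pre)     # decay the presynaptic trace
    x_post *= np.exp(-dt / tau_post)    # decay the postsynaptic trace
    if t in pre_spikes:
        x_pre += 1.0
        w -= A_minus * x_post           # depression: recent post before this pre
    if t in post_spikes:
        x_post += 1.0
        w += A_plus * x_pre             # potentiation: recent pre before this post

print(f"final weight: {w:.3f}")
```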

In this lesson, you will learn more about some of the issues inherent in modeling neural spikes, approaches to ameliorate these problems, and the pros and cons of these approaches. 

Difficulty level: Intermediate
Duration: 5:31
Speaker: Dan Goodman

In this lesson, you will learn about some of the many methods for training spiking neural networks (SNNs) that either avoid gradients entirely or use them only in a limited or constrained way. 

Difficulty level: Intermediate
Duration: 5:14
Speaker: Dan Goodman