
This talk describes the NIH-funded SPARC Data Structure and how the project navigates ontology development while keeping the FAIR principles in mind.

Difficulty level: Beginner
Duration: 25:44
Speaker: Fahim Imam

This lesson provides an overview of the current status in the field of neuroscientific ontologies, presenting examples of data organization and standards, particularly from neuroimaging and electrophysiology. 

Difficulty level: Intermediate
Duration: 33:41

This lesson continues from part one of the lecture Ontologies, Databases, and Standards, diving deeper into ontologies and knowledge graphs.

Difficulty level: Intermediate
Duration: 50:18
Speaker: Jeff Grethe

This lecture covers structured data, databases, federating neuroscience-relevant databases, and ontologies. 

Difficulty level: Beginner
Duration: 1:30:45
Speaker: Maryann Martone

This lecture covers FAIR atlases, including their background and construction, as well as how they can be created in line with the FAIR principles.

Difficulty level: Beginner
Duration: 14:24
Speaker: Heidi Kleven

This lecture focuses on ontologies for clinical neurosciences.

Difficulty level: Intermediate
Duration: 21:54

This lesson provides an introduction to biologically detailed computational modelling of neural dynamics, including neuron membrane potential simulation and F-I curves. 

Difficulty level: Intermediate
Duration: 8:21
Speaker: Mike X. Cohen

In this lesson, users learn how to use MATLAB to build an adaptive exponential integrate-and-fire (AdEx) neuron model; a brief code sketch follows this entry.

Difficulty level: Intermediate
Duration: 22:01
Speaker: Mike X. Cohen
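
For readers who want to see what such a model looks like in code, below is a minimal Python sketch of the AdEx update (the lesson itself builds the model in MATLAB). Parameter values are commonly used defaults, not taken from the lesson: the membrane potential follows a leaky term plus an exponential spike-initiation term, a slow adaptation current w opposes firing, and each spike resets the voltage and increments w.

```python
import numpy as np

# Minimal adaptive exponential integrate-and-fire (AdEx) neuron,
# integrated with forward Euler. Parameter values are illustrative
# defaults, not taken from the lesson.
C, gL, EL   = 281e-12, 30e-9, -70.6e-3   # capacitance (F), leak (S), rest (V)
VT, DT      = -50.4e-3, 2e-3             # exponential threshold and slope (V)
a, b, tau_w = 4e-9, 80.5e-12, 144e-3     # adaptation conductance, jump, time constant
V_peak, V_reset = 20e-3, EL              # spike cutoff and reset (V)

dt, T = 1e-4, 0.5                        # time step and duration (s)
I_in  = 0.8e-9                           # constant input current (A)

V, w, spikes = EL, 0.0, []
for k in range(int(T / dt)):
    # exponential term; argument clipped to avoid numerical overflow at spike time
    exp_term = gL * DT * np.exp(min((V - VT) / DT, 20.0))
    dV = (-gL * (V - EL) + exp_term - w + I_in) / C
    dw = (a * (V - EL) - w) / tau_w
    V += dt * dV
    w += dt * dw
    if V >= V_peak:                      # spike: reset membrane, increment adaptation
        V = V_reset
        w += b
        spikes.append(k * dt)

print(f"{len(spikes)} spikes in {T:.1f} s")
```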

In this lesson, users learn about the practical differences between MATLAB scripts and functions, as well as how to embed their neuronal simulation into a callable function.  

Difficulty level: Intermediate
Duration: 11:20
Speaker: Mike X. Cohen

This lesson teaches users how to generate a frequency-current (F-I) curve, which describes the function relating the net synaptic current (I) flowing into a neuron to its firing rate (F); a brief code sketch follows this entry.

Difficulty level: Intermediate
Duration: 20:39
Speaker: Mike X. Cohen
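
As a companion to the F-I lesson above, here is a minimal Python sketch (the lesson itself works in MATLAB) of the basic recipe: simulate a simple leaky integrate-and-fire neuron at a range of constant input currents and record the firing rate at each one. The neuron model and all parameter values are illustrative assumptions, not the lesson's own code.

```python
import numpy as np

def lif_firing_rate(I_in, T=1.0, dt=1e-4,
                    C=200e-12, gL=10e-9, EL=-70e-3,
                    V_th=-50e-3, V_reset=-70e-3):
    """Firing rate (Hz) of a leaky integrate-and-fire neuron driven by a
    constant current I_in (A). Illustrative parameters, forward Euler."""
    V, n_spikes = EL, 0
    for _ in range(int(T / dt)):
        V += dt * (-gL * (V - EL) + I_in) / C
        if V >= V_th:            # threshold crossing: count a spike and reset
            V = V_reset
            n_spikes += 1
    return n_spikes / T

# Sweep input currents to trace out the F-I curve
currents = np.linspace(0, 500e-12, 11)       # 0 to 500 pA
for I in currents:
    print(f"I = {I*1e12:5.0f} pA -> {lif_firing_rate(I):5.1f} Hz")
```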

This lecture covers post-war developments in the science of the mind, beginning with the cognitive revolution and concluding with living machines.

Difficulty level: Beginner
Duration: 2:24:35

This brief talk describes work at The Alan Turing Institute to solve real-world challenges and democratize computer vision methods in support of interdisciplinary and international researchers.

Difficulty level: Beginner
Duration: 7:10

This lesson aims to define computational neuroscience in general terms, while providing specific examples of highly successful computational neuroscience projects. 

Difficulty level: Beginner
Duration: 59:21
Speaker: Alla Borisyuk

This lesson delves into the structure of one of the brain's most elemental computational units, the neuron, and how that structure influences computational neural network models.

Difficulty level: Intermediate
Duration: 6:33
Speaker: Marcus Ghosh

In this lesson, you will learn how machine learners and neuroscientists construct abstract computational models based on various neurophysiological signalling properties.

Difficulty level: Intermediate
Duration: 10:52
Speaker: Dan Goodman

In this lesson, you will learn about some typical neuronal models employed by machine learners and computational neuroscientists, meant to imitate the biophysical properties of real neurons. 

Difficulty level: Intermediate
Duration: 3:12
Speaker: Dan Goodman

This lesson contains practical exercises which accompany the first few lessons of the Neuroscience for Machine Learners (Neuro4ML) course.

Difficulty level: Intermediate
Duration: 5:58
Speaker: Dan Goodman

In this lesson, you will learn how machine learners and computational neuroscientists design and build models of neuronal synapses; a brief code sketch follows this entry.

Difficulty level: Intermediate
Duration: 8:59
Speaker: Dan Goodman
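
To make the idea of a synapse model concrete, below is a minimal, illustrative Python sketch of one of the simplest choices, an exponential synapse: each presynaptic spike increments a synaptic variable, which then decays back to zero with a fixed time constant. The spike times and parameters are arbitrary examples, not taken from the course.

```python
import numpy as np

# Minimal exponential synapse model: each presynaptic spike increments the
# synaptic variable g, which decays with time constant tau_syn.
# All values are illustrative and not taken from the course materials.
dt, T = 1e-4, 0.2                     # time step and duration (s)
tau_syn, w = 5e-3, 1.0                # synaptic time constant (s) and weight

pre_spike_times = [0.02, 0.05, 0.055, 0.12]          # presynaptic spikes (s)
spike_steps = {int(round(s / dt)) for s in pre_spike_times}

t = np.arange(0, T, dt)
g = 0.0
g_trace = np.empty_like(t)
for k in range(len(t)):
    if k in spike_steps:
        g += w                        # jump on each presynaptic spike
    g -= dt * g / tau_syn             # exponential decay between spikes
    g_trace[k] = g

print(f"peak synaptic variable: {g_trace.max():.2f}")
```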

This lesson introduces some practical exercises which accompany the Synapses and Networks portion of this Neuroscience for Machine Learners course. 

Difficulty level: Intermediate
Duration: 3:51
Speaker: Dan Goodman

This lesson describes spike timing-dependent plasticity (STDP), a biological process that adjusts the strength of connections between neurons in the brain, and how one can implement or mimic this process in a computational model; a brief code sketch follows this entry. You will also find links for practical exercises at the bottom of this page.

Difficulty level: Intermediate
Duration: 12:50
Speaker: Dan Goodman
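
To give a concrete flavour of what "implementing STDP" can mean, below is a minimal, illustrative Python sketch of a pair-based STDP rule with the standard exponential windows: a presynaptic spike shortly before a postsynaptic spike strengthens the synapse, while the reverse ordering weakens it. Amplitudes and time constants are generic choices, not taken from the lesson.

```python
import numpy as np

# Pair-based STDP: the weight change depends on the time difference between
# a presynaptic and a postsynaptic spike. Pre-before-post potentiates the
# synapse; post-before-pre depresses it. Parameters are illustrative.
A_plus, A_minus = 0.01, 0.012        # maximum weight changes
tau_plus, tau_minus = 20e-3, 20e-3   # time constants of the STDP windows (s)

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair (times in seconds)."""
    delta = t_post - t_pre
    if delta > 0:                     # pre before post -> potentiation
        return A_plus * np.exp(-delta / tau_plus)
    else:                             # post before (or with) pre -> depression
        return -A_minus * np.exp(delta / tau_minus)

# Apply the rule to a few spike pairs and accumulate the weight
w = 0.5
for t_pre, t_post in [(0.010, 0.015), (0.040, 0.038), (0.100, 0.102)]:
    w += stdp_dw(t_pre, t_post)
    print(f"pre {t_pre*1e3:5.1f} ms, post {t_post*1e3:5.1f} ms -> w = {w:.4f}")
```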