This lesson briefly outlines the Neuroscience for Machine Learners course.

Difficulty level: Intermediate
Duration: 3:05
Speaker: Dan Goodman

This tutorial covers the fundamentals of collaborating with Git and GitHub.

Difficulty level: Intermediate
Duration: 2:15:50
Speaker: Elizabeth DuPre

This talk presents state-of-the-art methods for ensuring data privacy with a particular focus on medical data sharing across multiple organizations.

Difficulty level: Intermediate
Duration: 22:49

This lecture talks about the usage of knowledge graphs in hospitals and related challenges of semantic interoperability.

Difficulty level: Intermediate
Duration: 24:32

This lecture provides an introduction to the Brain Imaging Data Structure (BIDS), a standard for organizing human neuroimaging datasets.

Difficulty level: Intermediate
Duration: 56:49

This lesson outlines Neurodata Without Borders (NWB), a data standard that gives neuroscientists a common format to share, archive, use, and build analysis tools for neurophysiology data.

Difficulty level: Intermediate
Duration: 29:53
Speaker: Oliver Ruebel

This lecture covers the rationale for developing the DAQCORD, a framework for the design, documentation, and reporting of data curation methods in order to advance the scientific rigour, reproducibility, and analysis of data.

Difficulty level: Intermediate
Duration: 17:08
Speaker: Ari Ercole

This tutorial demonstrates how to use PyNN, a simulator-independent language for building neuronal network models, in conjunction with the neuromorphic hardware system SpiNNaker. 

Difficulty level: Intermediate
Duration: 25:49
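
As a rough orientation only (not taken from the tutorial), a minimal PyNN script might look like the sketch below. It assumes the NEST backend is installed; on SpiNNaker hardware one would import pyNN.spiNNaker as sim instead. Population sizes, connector, and parameter values are illustrative.

import pyNN.nest as sim   # on SpiNNaker hardware: import pyNN.spiNNaker as sim

sim.setup(timestep=1.0)   # ms

# A Poisson spike source driving a small population of IF neurons.
source = sim.Population(10, sim.SpikeSourcePoisson(rate=20.0))
neurons = sim.Population(10, sim.IF_curr_exp())
sim.Projection(source, neurons, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0))

neurons.record("spikes")
sim.run(1000.0)           # ms
spiketrains = neurons.get_data().segments[0].spiketrains
sim.end()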

This lesson describes spike-timing-dependent plasticity (STDP), a biological process that adjusts the strength of connections between neurons in the brain, and how one can implement or mimic this process in a computational model. You will also find links to practical exercises at the bottom of this page.

Difficulty level: Intermediate
Duration: 12:50
Speaker: Dan Goodman

This lesson provides a brief introduction to the Computational Modeling of Neuronal Plasticity.

Difficulty level: Intermediate
Duration: 0:40

In this lesson, you will be introduced to a type of neuronal model known as the leaky integrate-and-fire (LIF) model.

Difficulty level: Intermediate
Duration: 1:23
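
For a quick preview (a sketch, not the lesson's own code), a bare-bones LIF neuron driven by a constant current can be simulated in a few lines of Python; all parameter values below are illustrative.

# Leaky integrate-and-fire: v relaxes toward rest, integrates input current,
# and is reset whenever it crosses threshold. All values are illustrative.
tau_m, R = 20.0, 10.0                             # time constant (ms), resistance (MOhm)
v_rest, v_reset, v_thresh = -70.0, -75.0, -50.0   # mV
I, dt, T = 2.5, 0.1, 200.0                        # input current (nA), step (ms), duration (ms)

v, spike_times = v_rest, []
for step in range(int(T / dt)):
    v += dt * (-(v - v_rest) + R * I) / tau_m     # forward-Euler membrane update
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = v_reset
print(f"{len(spike_times)} spikes in {T} ms")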

This lesson goes over the various potential inputs to neuronal synapses, the loci of neural communication.

Difficulty level: Intermediate
Duration: 1:20

This lesson explains how and why integration time steps are implemented as part of a neuronal model.

Difficulty level: Intermediate
Duration: 1:08
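
As a small illustration of why the choice of time step matters (an assumption-laden sketch, not material from the lesson), the following compares forward-Euler integration of a simple membrane-like decay against its exact solution for several step sizes.

import numpy as np

# Forward-Euler integration of dv/dt = -v / tau compared with the exact
# solution v(t) = v0 * exp(-t / tau), for several step sizes (illustrative).
tau, v0, T = 20.0, 1.0, 100.0   # ms
for dt in (0.1, 1.0, 10.0):
    v = v0
    for _ in range(int(T / dt)):
        v += dt * (-v / tau)
    print(f"dt={dt:4.1f} ms  euler={v:.5f}  exact={v0 * np.exp(-T / tau):.5f}")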

In this lesson, you will learn about neural spike trains, which can be characterized as having a Poisson distribution.

Difficulty level: Intermediate
Duration: 1:18
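
A minimal sketch of how such a spike train can be generated (illustrative parameters, not the lesson's code): treat each small time bin as an independent Bernoulli trial with probability rate × dt.

import numpy as np

# Poisson spike train: in each bin of width dt, a spike occurs with
# probability rate * dt (rate in Hz, dt in seconds). Values are illustrative.
rng = np.random.default_rng(0)
rate, dt, T = 20.0, 0.001, 10.0
spikes = rng.random(int(T / dt)) < rate * dt
spike_times = np.nonzero(spikes)[0] * dt

print("empirical rate:", spikes.sum() / T, "Hz")                       # ~20 Hz
print("mean inter-spike interval:", np.diff(spike_times).mean(), "s")  # ~1/rate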

This lesson covers spike-rate adaptation, the process by which a neuron's firing pattern decays to a low, steady-state frequency during the sustained encoding of a stimulus.

Difficulty level: Intermediate
Duration: 1:26
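
A hedged sketch of one common way to model this (not the lesson's own formulation): add an adaptation current that is incremented at every spike and decays between spikes, so the firing rate falls from its initial value toward a lower steady state.

# LIF neuron with an adaptation current w: every spike increments w, which
# then decays slowly, so the inter-spike interval lengthens over time.
# All parameter values are illustrative.
tau_m, tau_w = 20.0, 200.0                        # ms
v_rest, v_reset, v_thresh = -70.0, -75.0, -50.0   # mV
R, I, b = 10.0, 2.5, 0.5                          # MOhm, nA, increment per spike (nA)
dt, T = 0.1, 1000.0                               # ms

v, w, spike_times = v_rest, 0.0, []
for step in range(int(T / dt)):
    v += dt * (-(v - v_rest) + R * (I - w)) / tau_m
    w += dt * (-w / tau_w)
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = v_reset
        w += b                                    # adaptation builds up with each spike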

This lesson provides a brief explanation of how to implement a neuron's refractory period in a computational model.

Difficulty level: Intermediate
Duration: 0:42
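
One simple way to do this (a sketch with illustrative values, not necessarily the lesson's approach) is to clamp the membrane potential at its reset value and ignore input for a fixed time after each spike.

# After each spike the neuron is held at the reset potential and ignores
# its input for t_ref milliseconds. Parameter values are illustrative.
tau_m, R, I, t_ref = 20.0, 10.0, 2.5, 5.0         # ms, MOhm, nA, ms
v_rest, v_reset, v_thresh = -70.0, -75.0, -50.0   # mV
dt, T = 0.1, 200.0                                # ms

v, last_spike, spike_times = v_rest, -1e9, []
for step in range(int(T / dt)):
    t = step * dt
    if t - last_spike < t_ref:
        v = v_reset                               # refractory: clamp at reset
        continue
    v += dt * (-(v - v_rest) + R * I) / tau_m
    if v >= v_thresh:
        spike_times.append(t)
        last_spike = t
        v = v_reset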

In this lesson, you will learn a computational description of spike-timing-dependent plasticity (STDP), the process that tunes the strength of connections between neurons.

Difficulty level: Intermediate
Duration: 2:40
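
A minimal pair-based STDP sketch using exponentially decaying spike traces is given below; the learning rates, time constants, and input statistics are illustrative assumptions, not values from the lesson.

import numpy as np

# Pair-based STDP with exponential traces: a pre-then-post pairing
# potentiates the weight, a post-then-pre pairing depresses it.
tau_plus, tau_minus = 20.0, 20.0     # trace time constants (ms)
A_plus, A_minus = 0.01, 0.012        # potentiation / depression amplitudes
dt, T = 1.0, 1000.0                  # ms
rng = np.random.default_rng(1)
pre = rng.random(int(T / dt)) < 0.02     # ~20 Hz presynaptic spikes
post = rng.random(int(T / dt)) < 0.02    # ~20 Hz postsynaptic spikes

w, x_pre, x_post = 0.5, 0.0, 0.0
for i in range(int(T / dt)):
    x_pre += dt * (-x_pre / tau_plus)    # traces decay exponentially
    x_post += dt * (-x_post / tau_minus)
    if pre[i]:
        x_pre += 1.0
        w -= A_minus * x_post            # pre after post -> depression
    if post[i]:
        x_post += 1.0
        w += A_plus * x_pre              # post after pre -> potentiation
    w = min(max(w, 0.0), 1.0)            # keep the weight within hard bounds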

This lesson reviews theoretical and mathematical descriptions of correlated spike trains.

Difficulty level: Intermediate
Duration: 2:54
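
As a small, assumption-laden illustration of one standard construction, two Poisson trains can be given a tunable pairwise correlation by letting them share a common source of spikes:

import numpy as np

# Two Poisson trains that share a common spike source: roughly a fraction c
# of each train's spikes are shared, giving a bin-wise correlation near c.
# All values are illustrative.
rng = np.random.default_rng(2)
rate, c, dt, T = 20.0, 0.3, 0.001, 100.0   # Hz, target correlation, s, s
n = int(T / dt)
shared = rng.random(n) < c * rate * dt
train1 = shared | (rng.random(n) < (1 - c) * rate * dt)
train2 = shared | (rng.random(n) < (1 - c) * rate * dt)

print("rates:", train1.sum() / T, train2.sum() / T, "Hz")
print("bin-wise correlation:", np.corrcoef(train1, train2)[0, 1])   # ~0.3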

This lesson investigates the effect of correlated spike trains on spike-timing-dependent plasticity (STDP).

Difficulty level: Intermediate
Duration: 1:43

This lesson goes over synaptic normalisation, the homeostatic process by which a neuron scales its synaptic weights up or down so that their total input strength stays roughly constant.

Difficulty level: Intermediate
Duration: 2:58
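
A brief sketch of multiplicative synaptic normalisation (illustrative only, not the lesson's code): after each plasticity update, rescale a neuron's incoming weights so that their sum is restored to a fixed target.

import numpy as np

# Multiplicative normalisation: after a plasticity update, rescale the
# incoming weight vector so its sum returns to a fixed target value.
rng = np.random.default_rng(3)
w = rng.random(100)                      # incoming weights of one neuron
w_target = w.sum()                       # total to be preserved

dw = 0.01 * rng.standard_normal(100)     # stand-in for a Hebbian/STDP update
w = np.clip(w + dw, 0.0, None)           # apply update, keep weights non-negative
w *= w_target / w.sum()                  # rescale so the total is restored

print("total after normalisation:", w.sum())   # equals w_target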