
This lesson describes spike-timing-dependent plasticity (STDP), a biological process that adjusts the strength of connections between neurons in the brain, and shows how this process can be implemented or mimicked in a computational model. You will also find links to practical exercises at the bottom of this page.

Difficulty level: Intermediate
Duration: 12:50
Speaker: Dan Goodman
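
A minimal sketch of the exponential pair-based STDP window that models like this often use (the amplitudes and time constants below are illustrative assumptions, not values from the lesson):

```python
import numpy as np

def stdp_window(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change as a function of delta_t = t_post - t_pre (ms).
    Pre-before-post (delta_t > 0) potentiates; post-before-pre depresses."""
    return np.where(delta_t > 0,
                    a_plus * np.exp(-delta_t / tau_plus),
                    -a_minus * np.exp(delta_t / tau_minus))

# Example: a post-spike 10 ms after a pre-spike vs. 10 ms before it.
print(stdp_window(np.array([10.0, -10.0])))
```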

This lesson provides a brief introduction to the Computational Modeling of Neuronal Plasticity.

Difficulty level: Intermediate
Duration: 0:40

In this lesson, you will be introduced to a type of neuronal model known as the leaky integrate-and-fire (LIF) model.

Difficulty level: Intermediate
Duration: 1:23
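
As a purely illustrative sketch of the LIF idea (the parameter values are assumed, not taken from the lesson), a forward-Euler simulation with threshold-and-reset spiking might look like this:

```python
# Illustrative parameters; not necessarily those used in the lesson.
tau_m, v_rest, v_reset, v_thresh = 20.0, -70.0, -75.0, -54.0   # ms, mV, mV, mV
r_m, i_ext, dt = 10.0, 2.0, 0.1                                # MOhm, nA, ms

v = v_rest
spike_times = []
for step in range(int(500 / dt)):            # simulate 500 ms
    # Forward-Euler step of: tau_m * dV/dt = -(V - v_rest) + R_m * I_ext
    v += dt / tau_m * (-(v - v_rest) + r_m * i_ext)
    if v >= v_thresh:                        # threshold crossing -> emit a spike
        spike_times.append(step * dt)
        v = v_reset                          # and reset the membrane potential
print(f"{len(spike_times)} spikes in 500 ms")
```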

This lesson goes over various potential inputs to neuronal synapses, the loci of neural communication.

Difficulty level: Intermediate
Duration: 1:20
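
For illustration only (the weight and time constant below are assumed values), one common way to model a synaptic input is an exponentially decaying current that jumps at each presynaptic spike:

```python
tau_syn, dt, w = 5.0, 0.1, 0.6                 # decay (ms), step (ms), synaptic weight
pre_spike_times = {10.0, 12.0, 30.0}           # presynaptic input spike times (ms)

i_syn, trace = 0.0, []
for step in range(int(50 / dt)):
    t = round(step * dt, 1)
    i_syn += dt * (-i_syn / tau_syn)           # synaptic current decays exponentially
    if t in pre_spike_times:
        i_syn += w                             # each presynaptic spike adds a weighted jump
    trace.append(i_syn)
print(f"peak synaptic current: {max(trace):.2f}")
```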

This lesson describes how and why integration time steps are implemented as part of a neuronal model.

Difficulty level: Intermediate
Duration: 1:08
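
A small illustrative comparison (not from the lesson) of how the choice of integration time step affects accuracy, using forward Euler on a simple exponential decay:

```python
import numpy as np

def euler_decay(tau, dt, t_end, v0=1.0):
    """Integrate dV/dt = -V / tau with forward Euler and return V(t_end)."""
    v = v0
    for _ in range(int(t_end / dt)):
        v += dt * (-v / tau)
    return v

tau, t_end = 20.0, 100.0                       # ms
exact = np.exp(-t_end / tau)                   # analytic solution at t_end
for dt in (5.0, 1.0, 0.1):                     # a coarser step gives a larger error
    print(f"dt={dt:>4}: Euler={euler_decay(tau, dt, t_end):.5f}  exact={exact:.5f}")
```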

In this lesson, you will learn about neural spike trains, which can be characterized as having a Poisson distribution.

Difficulty level: Intermediate
Duration: 1:18
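
A minimal illustrative way to generate such a spike train (the rate and duration are assumed values): in each small time bin, emit a spike independently with probability rate × dt.

```python
import numpy as np

rng = np.random.default_rng(0)
rate, dt, t_end = 20.0, 0.001, 10.0            # Hz, s, s (assumed values)

# In each small time bin, a spike occurs independently with probability rate * dt.
spikes = rng.random(int(t_end / dt)) < rate * dt
spike_times = np.nonzero(spikes)[0] * dt
isis = np.diff(spike_times)                    # inter-spike intervals

print(f"empirical rate: {spikes.sum() / t_end:.1f} Hz")
print(f"mean ISI: {isis.mean() * 1000:.1f} ms (expected {1000 / rate:.1f} ms)")
```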

This lesson covers spike-rate adaptation, the process by which a neuron's firing pattern decays to a low, steady-state frequency during the sustained encoding of a stimulus.

Difficulty level: Intermediate
Duration: 1:26
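
A toy sketch of the mechanism (all parameters are illustrative assumptions): an adaptation variable grows with each spike and is subtracted from the input drive, so inter-spike intervals lengthen under constant input.

```python
import numpy as np

tau_m, tau_a, dt = 20.0, 100.0, 0.1      # membrane and adaptation time constants (ms), step (ms)
v_thresh, delta_a, i_ext = 1.0, 0.3, 1.5 # dimensionless threshold, increment, and drive
v, a = 0.0, 0.0                          # membrane potential and adaptation variable

spike_times = []
for step in range(int(1000 / dt)):
    # The adaptation variable is subtracted from the drive and decays between spikes,
    # so the firing rate falls toward a lower steady state under constant input.
    v += dt / tau_m * (-v + i_ext - a)
    a += dt / tau_a * (-a)
    if v >= v_thresh:
        spike_times.append(step * dt)
        v = 0.0
        a += delta_a                     # each spike increments adaptation
print("first inter-spike intervals (ms):", np.round(np.diff(spike_times[:6]), 1))
```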

This lesson provides a brief explanation of how to implement a neuron's refractory period in a computational model.

Difficulty level: Intermediate
Duration: 0:42
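
One common way to implement this, sketched below with assumed parameters, is a countdown timer during which the membrane is clamped and cannot spike:

```python
tau_m, dt, t_ref = 20.0, 0.1, 5.0        # ms; t_ref is the refractory period
v, v_thresh, i_ext = 0.0, 1.0, 1.5       # dimensionless membrane state, threshold, drive
refrac_left = 0.0                        # time remaining in the refractory state

spikes = []
for step in range(int(200 / dt)):
    if refrac_left > 0:
        refrac_left -= dt                # clamp the neuron: no integration, no spikes
        continue
    v += dt / tau_m * (-v + i_ext)
    if v >= v_thresh:
        spikes.append(step * dt)
        v = 0.0
        refrac_left = t_ref              # start the refractory countdown
print("spike times (ms):", [round(t, 1) for t in spikes])
```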

In this lesson, you will learn a computational description of spike-timing-dependent plasticity (STDP), the process that tunes the strength of neuronal connections.

Difficulty level: Intermediate
Duration: 2:40
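
An illustrative online (trace-based) form of the STDP update, with assumed parameters and toy random spike trains standing in for real inputs:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, t_end = 1.0, 1000.0                  # ms
tau_plus = tau_minus = 20.0              # trace time constants (ms)
a_plus, a_minus = 0.01, 0.012            # potentiation and depression amplitudes
w, x_pre, x_post = 0.5, 0.0, 0.0         # synaptic weight and pre/post spike traces

for _ in range(int(t_end / dt)):
    pre = rng.random() < 0.02            # toy ~20 Hz pre- and postsynaptic spike trains
    post = rng.random() < 0.02
    x_pre += dt * (-x_pre / tau_plus) + pre
    x_post += dt * (-x_post / tau_minus) + post
    # Potentiate on post-spikes (scaled by the pre trace), depress on pre-spikes (by the post trace).
    w += a_plus * x_pre * post - a_minus * x_post * pre
    w = min(max(w, 0.0), 1.0)            # keep the weight within hard bounds
print(f"final weight: {w:.3f}")
```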

This lesson reviews theoretical and mathematical descriptions of correlated spike trains.

Difficulty level: Intermediate
Duration: 2:54
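
One standard construction for generating correlated Poisson-like trains, shown as a rough sketch (the rate and correlation value are assumptions): two trains thinned from a common "mother" train share spikes and are therefore correlated.

```python
import numpy as np

rng = np.random.default_rng(2)
rate, c, dt, t_end = 20.0, 0.3, 0.001, 50.0        # Hz, target correlation, s, s (assumed)
n_bins = int(t_end / dt)

# Thinning construction: both trains copy spikes from a common "mother" train of
# rate rate/c, each keeping a spike with probability c, giving two trains of the
# desired rate with a pairwise count correlation of roughly c.
mother = rng.random(n_bins) < (rate / c) * dt
spikes_1 = mother & (rng.random(n_bins) < c)
spikes_2 = mother & (rng.random(n_bins) < c)

print(f"rates: {spikes_1.sum() / t_end:.1f} Hz, {spikes_2.sum() / t_end:.1f} Hz")
print(f"count correlation per bin: {np.corrcoef(spikes_1, spikes_2)[0, 1]:.3f}")
```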

This lesson investigates the effect of correlated spike trains on spike-timing-dependent plasticity (STDP).

Difficulty level: Intermediate
Duration: 1:43

This lesson goes over synaptic normalisation, the homeostatic process by which groups of weighted inputs collectively scale their strengths up or down.

Difficulty level: Intermediate
Duration: 2:58
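
A minimal sketch of one form of this, multiplicative normalisation (the target total and the stand-in plasticity update are illustrative assumptions): after each plasticity step, rescale the weights so their sum returns to a fixed value.

```python
import numpy as np

rng = np.random.default_rng(3)
w = rng.random(100)                      # incoming synaptic weights onto one neuron
w_total = 10.0                           # target total weight (illustrative)

w += 0.01 * rng.standard_normal(100)     # stand-in for a Hebbian/STDP update
w = np.clip(w, 0.0, None)                # weights stay non-negative

# Multiplicative normalisation: rescale all weights so their sum is restored to
# the target value; relative differences between synapses are preserved.
w *= w_total / w.sum()
print(f"sum after normalisation: {w.sum():.3f}")
```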

In this lesson, you will learn about the intrinsic plasticity of single neurons.

Difficulty level: Intermediate
Duration: 2:08

This lesson covers short-term facilitation, a process whereby a neuron's synaptic transmission is enhanced for a short (sub-second) period.

Difficulty level: Intermediate
Duration: 1:58
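
A toy sketch of facilitation (parameters assumed): a release-probability variable is bumped up by every presynaptic spike and relaxes back between spikes, so later spikes in a burst transmit more.

```python
# Illustrative parameters for a facilitating synapse.
u, u0, u_inc, tau_f, dt = 0.2, 0.2, 0.2, 200.0, 1.0   # release prob., baseline, increment, recovery (ms), step (ms)
pre_spikes = {0, 20, 40, 60, 80}                      # a presynaptic burst (spike times in ms)

release = []
for t in range(0, 200):
    u += dt * (u0 - u) / tau_f           # u relaxes back toward its baseline between spikes
    if t in pre_spikes:
        u += u_inc * (1 - u)             # each spike transiently boosts release probability
        release.append(round(u, 3))
print("release probability per spike:", release)   # grows across the burst -> facilitation
```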

This lesson describes short-term depression, a reduction of synaptic information transfer between neurons.

Difficulty level: Intermediate
Duration: 1:40
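
A toy sketch of depression (parameters assumed): a resource variable is consumed by each presynaptic spike and recovers slowly, so later spikes in a burst transmit less.

```python
# Illustrative parameters for a depressing synapse.
x, tau_d, u_rel, dt = 1.0, 300.0, 0.4, 1.0      # available resource, recovery (ms), release fraction, step (ms)
pre_spikes = {0, 20, 40, 60, 80}                # a presynaptic burst (spike times in ms)

transmitted = []
for t in range(0, 200):
    x += dt * (1 - x) / tau_d            # the resource (e.g. vesicle pool) recovers toward 1
    if t in pre_spikes:
        transmitted.append(round(u_rel * x, 3))  # amount released by this spike
        x -= u_rel * x                   # each spike consumes a fraction of the resource
print("transmission per spike:", transmitted)    # shrinks across the burst -> depression
```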

This lesson briefly wraps up the course on Computational Modeling of Neuronal Plasticity.

Difficulty level: Intermediate
Duration: 0:37

This lesson breaks down the principles of Bayesian inference and how they relate to cognitive processes and functions such as learning and perception. It then explains how cognitive models can be built with Bayesian statistics to investigate how our brains interface with their environment.

This lesson corresponds to slides 1-64 in the PDF below. 

Difficulty level: Intermediate
Duration: 1:28:14
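
As a minimal worked example of Bayesian updating (a toy beta-Bernoulli model, not material from the lecture): the posterior belief about a binary outcome is updated by adding observed counts to prior pseudo-counts.

```python
# Beta-Bernoulli updating: the posterior over the probability of a binary outcome
# is the prior pseudo-counts plus the observed counts.
alpha, beta = 1.0, 1.0                   # uniform prior over p(outcome = 1)
observations = [1, 1, 0, 1, 1, 0, 1]     # toy data stream

for y in observations:
    alpha += y                           # count of "1" outcomes
    beta += 1 - y                        # count of "0" outcomes
    print(f"posterior mean belief: {alpha / (alpha + beta):.3f}")
```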

This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).


This lesson corresponds to slides 65-90 of the PDF below. 

Difficulty level: Intermediate
Duration: 1:15:04
Speaker: Daniel Hauke
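
The sketch below is not an HGF; it is only a much-simplified, fixed-learning-rate belief update on toy data, meant to illustrate the kind of belief trajectory such models track.

```python
import numpy as np

rng = np.random.default_rng(4)
true_p = np.concatenate([np.full(60, 0.8), np.full(60, 0.2)])   # hidden probability with one reversal
outcomes = (rng.random(120) < true_p).astype(float)

mu, learning_rate = 0.5, 0.1             # belief that the next outcome is 1, and a fixed update weight
trajectory = []
for y in outcomes:
    mu += learning_rate * (y - mu)       # prediction-error update with a fixed (not precision-weighted) rate
    trajectory.append(mu)
print(f"belief just before the reversal: {trajectory[59]:.2f}, at the end: {trajectory[-1]:.2f}")
```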

This lecture covers major post-war developments in the science of the mind, focusing first on the cognitive revolution and concluding with living machines.

Difficulty level: Beginner
Duration: 2:24:35

This lecture provides an overview of depression, including its epidemiology and course, clinical presentation, somatic comorbidity, and treatment options.

Difficulty level: Beginner
Duration: 37:51