This lesson provides an introduction to biologically detailed computational modelling of neural dynamics, including neuron membrane potential simulation and F-I curves.

Difficulty level: Intermediate

Duration: 8:21

Speaker: Mike X. Cohen

In this lesson, users learn how to use MATLAB to build an adaptive exponential integrate-and-fire (AdEx) neuron model.

Difficulty level: Intermediate

Duration: 22:01

Speaker: Mike X. Cohen
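The lesson builds the AdEx model in MATLAB; the same two-equation scheme (an exponential spike-initiation term plus an adaptation current, with a reset rule on each spike) can be sketched in Python using forward-Euler integration. The parameter values below are the standard illustrative set from Brette & Gerstner (2005), not necessarily the ones used in the lesson.

```python
import math

def simulate_adex(I_pA, T_ms=500.0, dt=0.1):
    """Forward-Euler simulation of an AdEx neuron driven by a constant
    current I_pA (pA). Returns the list of spike times in ms."""
    # Illustrative parameters (Brette & Gerstner 2005); units: pF, nS, mV, ms, pA
    C, gL, EL = 281.0, 30.0, -70.6
    VT, DeltaT = -50.4, 2.0
    tau_w, a, b = 144.0, 4.0, 80.5
    V_reset, V_peak = -70.6, 20.0

    V, w = EL, 0.0
    spikes = []
    for step in range(int(T_ms / dt)):
        # Membrane equation: leak + exponential spike initiation - adaptation + input
        dV = (-gL * (V - EL) + gL * DeltaT * math.exp((V - VT) / DeltaT)
              - w + I_pA) / C
        # Adaptation current relaxes toward a*(V - EL) with time constant tau_w
        dw = (a * (V - EL) - w) / tau_w
        V += dt * dV
        w += dt * dw
        if V >= V_peak:          # spike: reset voltage, increment adaptation
            spikes.append(step * dt)
            V = V_reset
            w += b
    return spikes

spike_times = simulate_adex(700.0)   # supra-rheobase current -> repetitive firing
```

A sub-rheobase current (e.g. `simulate_adex(0.0)`) returns an empty list, which is one quick sanity check on the implementation.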

In this lesson, users learn about the practical differences between MATLAB scripts and functions, as well as how to embed their neuronal simulation into a callable function.

Difficulty level: Intermediate

Duration: 11:20

Speaker: Mike X. Cohen

This lesson teaches users how to generate a frequency-current (F-I) curve, which describes the function that relates the net synaptic current (I) flowing into a neuron to its firing rate (F).

Difficulty level: Intermediate

Duration: 20:39

Speaker: Mike X. Cohen
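The procedure the lesson teaches — simulate the neuron at each injected current, count spikes, and report spikes per second — can be illustrated with a simpler leaky integrate-and-fire neuron (the lesson itself works in MATLAB with the AdEx model built earlier). All parameter values here are illustrative.

```python
def lif_firing_rate(I_pA, T_ms=1000.0, dt=0.1):
    """Firing rate (Hz) of a leaky integrate-and-fire neuron for a
    constant input current I_pA. Illustrative parameters."""
    C, gL, EL = 200.0, 10.0, -70.0   # pF, nS, mV
    V_th, V_reset = -50.0, -70.0     # mV
    V, n_spikes = EL, 0
    for _ in range(int(T_ms / dt)):
        V += dt * (-gL * (V - EL) + I_pA) / C   # forward-Euler membrane update
        if V >= V_th:                           # threshold crossing -> spike + reset
            V = V_reset
            n_spikes += 1
    return n_spikes / (T_ms / 1000.0)           # spikes per second

# Sweep input currents to trace out the F-I curve
currents = [0, 100, 200, 300, 400, 500]                  # pA
fi_curve = [(I, lif_firing_rate(I)) for I in currents]
```

Below the rheobase current the rate is zero; above it, the rate grows monotonically with the input, which is the characteristic shape of the F-I curve.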

This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).

This lesson corresponds to slides 65-90 of the PDF below.

Difficulty level: Intermediate

Duration: 1:15:04

Speaker: Daniel Hauke
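The HGF itself stacks several coupled levels of belief, but its core operation is a precision-weighted Gaussian update: the posterior mean moves toward each observation by an amount scaled by the relative precision of the observation. The single-level sketch below is a deliberate simplification for intuition, not the HGF equations from the slides.

```python
def update_belief(mu, pi, x, pi_obs):
    """One Gaussian belief update.
    mu, pi: prior mean and precision; x: observation; pi_obs: observation precision.
    Returns the posterior (mean, precision)."""
    pi_post = pi + pi_obs
    # The mean moves toward x by a precision-weighted prediction error
    mu_post = mu + (pi_obs / pi_post) * (x - mu)
    return mu_post, pi_post

# Track a belief trajectory over a short sequence of noisy observations
mu, pi = 0.0, 1.0          # initial belief: mean 0, low confidence
trajectory = [mu]
for x in [0.8, 1.1, 0.9, 1.0]:
    mu, pi = update_belief(mu, pi, x, pi_obs=4.0)
    trajectory.append(mu)
```

With each observation the precision accumulates, so later observations shift the belief less — the basic dynamic that the hierarchical model extends with volatility levels.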

This tutorial provides instruction on how to simulate brain tumors with TVB (reproducing the publication Marinazzo et al., 2020, NeuroImage). The tutorial comprises a didactic video, Jupyter notebooks, and a full data set for the construction of virtual brains from patients and healthy controls.

Difficulty level: Intermediate

Duration: 10:01

This tutorial covers the fundamentals of collaborating with Git and GitHub.

Difficulty level: Intermediate

Duration: 2:15:50

Speaker: Elizabeth DuPre

This is the introductory module of the Deep Learning Course at NYU's Center for Data Science (CDS), a course covering the latest techniques in deep learning and representation learning: supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate

Duration: 50:17

Speaker: Yann LeCun and Alfredo Canziani

This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:51:03

Speaker: Yann LeCun
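The two ideas this module covers fit in a toy example: gradient descent repeatedly steps a parameter against its gradient, and backpropagation obtains that gradient by chaining local derivatives from the loss back to the weight. The one-unit "network" below is a minimal sketch, not the course's notation.

```python
import math

def gradient_descent(grad, w, lr=0.1, steps=200):
    """Minimize a function by stepping against its gradient."""
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def loss_grad(w, x=1.0, t=1.0):
    """Backprop through a one-unit 'network': loss = (sigmoid(w*x) - t)^2.
    The chain rule multiplies local derivatives from back to front."""
    a = 1.0 / (1.0 + math.exp(-w * x))   # forward pass (sigmoid activation)
    dL_da = 2.0 * (a - t)                # d loss / d activation
    da_dz = a * (1.0 - a)                # d sigmoid / d pre-activation
    return dL_da * da_dz * x             # d loss / d weight

# Train the weight so the unit's output approaches the target t = 1
w_trained = gradient_descent(loss_grad, w=0.0, lr=1.0, steps=500)
```

After training, the unit's output `1 / (1 + exp(-w_trained))` sits close to the target — the same loop, scaled up to millions of parameters and chained through many layers, is what the lecture formalizes.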

This lecture covers the concept of parameter sharing: recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:59:47

Speaker: Yann LeCun and Alfredo Canziani

This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 51:40

Speaker: Yann LeCun

This lecture discusses the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:09:12

Speaker: Alfredo Canziani

This lecture covers the concept of recurrent neural networks: vanilla and gated (LSTM) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:05:36

Speaker: Alfredo Canziani

This is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:51:30

Speaker: Yann LeCun

This lecture covers the concept of inference in latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:01:04

Speaker: Alfredo Canziani

This is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:48:53

Speaker: Yann LeCun

This tutorial covers the concept of training latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:04:48

Speaker: Alfredo Canziani
