This tutorial covers how to import appropriate data into The Virtual Brain, as well as how to begin constructing detailed brain models.

Difficulty level: Intermediate

Duration: 23:03

Speaker: Patrik Bey

In this tutorial, you will learn how to run a typical TVB simulation.

Difficulty level: Intermediate

Duration: 1:29:13

Speaker: Paul Triebkorn

This tutorial introduces The Virtual Mouse Brain (TVMB), walking users through the necessary steps for performing simulation operations on animal brain data.

Difficulty level: Intermediate

Duration: 42:43

Speaker: Patrik Bey

In this tutorial, you will learn the necessary steps in modeling the brain of the macaque, one of the most commonly studied non-human primates.

Difficulty level: Intermediate

Duration: 1:00:08

Speaker: Julie Courtiol

This lecture provides an introduction to entropy in general, and multi-scale entropy (MSE) in particular, highlighting the potential clinical applications of the latter.

Difficulty level: Intermediate

Duration: 39:05

Speaker: Jil Meier

In this lecture, you will learn about various neuroinformatics resources that allow for 3D reconstruction of brain models.

Difficulty level: Intermediate

Duration: 1:36:57

Speaker: Michael Schirner

In this lesson, you will learn about the Bayesian Virtual Epileptic Patient (BVEP), a research use case using TVB supported on the EBRAINS infrastructure.

Difficulty level: Intermediate

Duration: 15:39

Speaker: Meysam Hashemi

This is the introductory module to the Deep Learning Course at CDS, a course covering the latest techniques in deep learning and representation learning. Topics include supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate

Duration: 50:17

Speaker: Yann LeCun and Alfredo Canziani

This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:51:03

Speaker: Yann LeCun

This lecture covers concepts associated with neural nets, including rotation and squashing, and is a part of the Deep Learning Course at New York University's Center for Data Science (CDS).

Difficulty level: Intermediate

Duration: 1:01:53

Speaker: Alfredo Canziani

This lecture covers the concepts of neural net training (tools, classification with neural nets, and PyTorch implementation) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:05:47

Speaker: Alfredo Canziani

This lecture covers the concept of parameter sharing in recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:59:47

Speaker: Yann LeCun and Alfredo Canziani

This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 51:40

Speaker: Yann LeCun

This lecture discusses the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:09:12

Speaker: Alfredo Canziani

This lecture covers recurrent neural networks, both vanilla and gated (LSTM), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:05:36

Speaker: Alfredo Canziani

This lecture is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:51:30

Speaker: Yann LeCun

This lecture covers the concept of inference in latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:01:04

Speaker: Alfredo Canziani

This lecture is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:48:53

Speaker: Yann LeCun

This tutorial covers the concept of training latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:04:48

Speaker: Alfredo Canziani


This lecture covers the rationale for developing the DAQCORD, a framework for the design, documentation, and reporting of data curation methods in order to advance the scientific rigour, reproducibility, and analysis of data.

Difficulty level: Intermediate

Duration: 17:08

Speaker: Ari Ercole
