
This is an in-depth guide to EEG signals and their interactions within brain microcircuits. Participants are also shown techniques and software for simulating, analyzing, and visualizing these signals.

Difficulty level: Intermediate
Duration: 1:30:41
Speaker: Frank Mazza

In this tutorial on simulating whole-brain activity using Python, participants can follow along with the corresponding code and repositories, learning the basics of neural oscillatory dynamics, evoked responses, and EEG signals, and ultimately designing a network model based on whole-brain anatomical connectivity.

Difficulty level: Intermediate
Duration: 1:16:10
Speaker: John Griffiths
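
The kind of model described in the entry above can be sketched in a few lines of Python. The snippet below couples phase oscillators through a stand-in connectivity matrix and is purely illustrative: the region count, coupling scheme, and parameter values are assumptions, not the tutorial's actual code.

import numpy as np

rng = np.random.default_rng(0)
n_regions = 68                                   # assumed: one node per cortical parcel
W = rng.random((n_regions, n_regions))           # stand-in for an anatomical connectivity matrix
np.fill_diagonal(W, 0.0)

omega = 2 * np.pi * rng.normal(10.0, 1.0, n_regions)    # natural frequencies around 10 Hz
theta = rng.uniform(0.0, 2 * np.pi, n_regions)          # initial phases
k, dt, n_steps = 0.5, 1e-3, 5000                        # coupling strength, time step (s), iterations

activity = np.empty((n_steps, n_regions))
for t in range(n_steps):
    # each region is pulled toward the phases of the regions it is connected to
    coupling = (W * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta = theta + dt * (omega + k * coupling)
    activity[t] = np.sin(theta)                  # crude proxy for regional oscillatory activity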

This lesson contains practical exercises that accompany the first few lessons of the Neuroscience for Machine Learners (Neuro4ML) course.

Difficulty level: Intermediate
Duration: 5:58
Speaker: Dan Goodman

This lesson introduces some practical exercises which accompany the Synapses and Networks portion of this Neuroscience for Machine Learners course. 

Difficulty level: Intermediate
Duration: 3:51
Speaker: Dan Goodman

In this lesson, you will learn how to train spiking neural networks (SNNs) with a surrogate gradient method. 

Difficulty level: Intermediate
Duration: 11:23
Speaker: Dan Goodman
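
To make the idea in the entry above concrete, here is a minimal sketch of a surrogate-gradient spike function in PyTorch: a hard threshold in the forward pass and a smooth fast-sigmoid derivative in the backward pass. The class name and scale value are assumptions for illustration, not code from the lesson's repository.

import torch

class SurrGradSpike(torch.autograd.Function):
    # Heaviside spike in the forward pass, smooth surrogate gradient in the backward pass.
    scale = 10.0  # assumed steepness of the surrogate

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()           # spike where the potential crosses threshold

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # fast-sigmoid surrogate derivative in place of the true, almost-everywhere-zero gradient
        surrogate = 1.0 / (SurrGradSpike.scale * membrane_potential.abs() + 1.0) ** 2
        return grad_output * surrogate

spike_fn = SurrGradSpike.apply

v = torch.randn(4, requires_grad=True)                    # example membrane potentials
spike_fn(v).sum().backward()                              # gradients now flow through the spikes
print(v.grad)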

In this tutorial, you will learn how to use the TVB-NEST toolbox on your local computer.

Difficulty level: Beginner
Duration: 2:16

This tutorial provides instruction on how to perform multi-scale simulation of Alzheimer's disease on The Virtual Brain Simulation Platform.

Difficulty level: Beginner
Duration: 29:08

This tutorial provides instruction on how to simulate brain tumors with TVB (reproducing the publication Marinazzo et al. 2020, NeuroImage). It comprises a didactic video, Jupyter notebooks, and a full data set for constructing virtual brains from patients and healthy controls.

Difficulty level: Intermediate
Duration: 10:01

The tutorial on modelling strokes in TVB includes a didactic video and Jupyter notebooks (reproducing the publication Falcon et al. 2016, eNeuro).

Difficulty level: Intermediate
Duration: 7:43

This lesson provides a brief overview of the Python programming language, with an emphasis on tools relevant to data scientists.

Difficulty level: Beginner
Duration: 1:16:36
Speaker: Tal Yarkoni

This lecture covers concepts associated with neural nets, including rotation and squashing, and is a part of the Deep Learning Course at New York University's Center for Data Science (CDS).

Difficulty level: Intermediate
Duration: 1:01:53
Speaker: Alfredo Canziani

This lecture covers neural net training (tools, classification with neural nets, and PyTorch implementation) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:05:47
Speaker: Alfredo Canziani

This lecture discusses the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:09:12
Speaker: Alfredo Canziani

This lecture covers the concept of recurrent neural networks: vanilla and gated (LSTM) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:05:36
Speaker: Alfredo Canziani

This tutorial covers the path from LV-EBM to target prop to (vanilla, denoising, contractive, variational) autoencoders, and is a part of the Advanced Energy-Based Models module of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this course include: Energy-Based Models I, Energy-Based Models II, Energy-Based Models III, Energy-Based Models IV, and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 1:00:34
Speaker: Alfredo Canziani

This tutorial covers the concepts of autoencoders, denoising encoders, and variational autoencoders (VAE) with PyTorch, as well as generative adversarial networks and code. It is a part of the Advanced Energy-Based Models module of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this course include: Energy-Based Models I, Energy-Based Models II, Energy-Based Models III, Energy-Based Models IV, Energy-Based Models V, and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 1:07:50
Speaker: Alfredo Canziani
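
As a rough illustration of what the tutorial above covers, the sketch below shows a tiny VAE with the reparameterization trick and the standard reconstruction-plus-KL loss in PyTorch. The architecture, layer sizes, and names are assumptions, not the course notebook.

import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, 256), nn.ReLU())
        self.to_mu = nn.Linear(256, z_dim)
        self.to_logvar = nn.Linear(256, z_dim)
        self.decoder = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                     nn.Linear(256, x_dim), nn.Sigmoid())

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)    # reparameterization trick
        return self.decoder(z), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    recon = nn.functional.binary_cross_entropy(x_hat, x, reduction="sum")    # reconstruction term
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())             # KL to the unit Gaussian prior
    return recon + kl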

This tutorial covers advanced concepts of energy-based models. The lecture is a part of the Associative Memories module of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Advanced
Duration: 1:12:00
Speaker: Alfredo Canziani

This tutorial covers the concept of graph convolutional networks and is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1-5 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 57:33
Speaker: Alfredo Canziani
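
For orientation, a single graph-convolution layer in the Kipf-and-Welling style can be written from scratch in a few lines of PyTorch; the sketch below is an illustrative assumption, not the course's implementation.

import torch
import torch.nn as nn

class GraphConv(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x, adj):
        # adj: (N, N) adjacency matrix; add self-loops and symmetrically normalize
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
        return torch.relu(a_norm @ self.linear(x))        # aggregate neighbours, then transform

x = torch.randn(5, 3)                                     # 5 nodes, 3 features each
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.T) > 0).float()                         # make the random graph undirected
out = GraphConv(3, 8)(x, adj)                             # -> (5, 8) node embeddings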

This lecture covers the concepts of emulation of kinematics from observations and training a policy. It is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1-6 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 1:01:21
Speaker: Alfredo Canziani

As a part of NeuroHackademy 2021, Noah Benson gives an introduction to PyTorch, one of the two most common software packages for deep learning applications in the neurosciences.

Difficulty level: Beginner
Duration: 50:40
Speaker: Noah Benson
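
As a taste of what such an introduction typically begins with, the sketch below fits a noisy line using raw tensors and autograd; it is illustrative only and not material from the lecture.

import torch

x = torch.linspace(-1, 1, steps=100).unsqueeze(1)          # 100 x 1 inputs
y = 3 * x + 0.5 + 0.1 * torch.randn_like(x)                # noisy linear targets (assumed toy data)

w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

for step in range(200):
    loss = ((x * w + b - y) ** 2).mean()                   # mean squared error
    loss.backward()                                        # autograd fills w.grad and b.grad
    with torch.no_grad():
        w -= 0.1 * w.grad
        b -= 0.1 * b.grad
        w.grad.zero_()
        b.grad.zero_()

print(w.item(), b.item())                                  # should approach 3 and 0.5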