This lesson explains the fundamental principles of neuronal communication, such as neuronal spiking, membrane potentials, and cellular excitability, and how these electrophysiological features of the brain may be modelled and simulated digitally. 

Difficulty level: Intermediate
Duration: 1:20:42
Speaker: Etay Hay
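The membrane-potential dynamics and spiking covered in this lesson can be illustrated with a minimal leaky integrate-and-fire (LIF) simulation. This is an illustrative sketch, not code from the lesson; all parameter values are arbitrary textbook-style choices.

```python
import numpy as np

def simulate_lif(i_input, dt=0.1, tau=10.0, v_rest=-65.0,
                 v_reset=-65.0, v_thresh=-50.0, r_m=10.0):
    """Leaky integrate-and-fire neuron: dV/dt = (-(V - V_rest) + R*I) / tau.

    i_input: input current (nA) per time step of length dt (ms).
    Returns the voltage trace (mV) and the indices of spike times.
    """
    v = np.full(len(i_input), v_rest)
    spikes = []
    for t in range(1, len(i_input)):
        dv = (-(v[t - 1] - v_rest) + r_m * i_input[t - 1]) * (dt / tau)
        v[t] = v[t - 1] + dv
        if v[t] >= v_thresh:       # threshold crossing: emit spike, reset
            spikes.append(t)
            v[t] = v_reset
    return v, spikes

# A constant 2 nA current step drives the neuron to spike repeatedly.
v_trace, spike_times = simulate_lif(np.full(5000, 2.0))
```

The same integrate-threshold-reset loop underlies the more detailed microcircuit models discussed in the later tutorials.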

This is a tutorial on how to simulate neuronal spiking in brain microcircuit models, as well as how to analyze, plot, and visualize the corresponding data. 

Difficulty level: Intermediate
Duration: 1:39:50
Speaker: Frank Mazza

This is an in-depth guide on EEG signals and their interaction within brain microcircuits. Participants are also shown techniques and software for simulating, analyzing, and visualizing these signals.

Difficulty level: Intermediate
Duration: 1:30:41
Speaker: Frank Mazza

In this tutorial on simulating whole-brain activity using Python, participants can follow along using corresponding code and repositories, learning the basics of neural oscillatory dynamics, evoked responses and EEG signals, ultimately leading to the design of a network model of whole-brain anatomical connectivity. 

Difficulty level: Intermediate
Duration: 1:16:10
Speaker: John Griffiths
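Whole-brain network models of the kind built in this tutorial typically couple regional oscillators through an anatomical connectivity matrix. A minimal Kuramoto-style sketch of that idea (illustrative only, not the tutorial's code; region count, frequencies, and coupling strength are arbitrary choices):

```python
import numpy as np

def kuramoto_step(theta, omega, coupling, k=1.0, dt=0.01):
    """One Euler step of N coupled phase oscillators:
    dtheta_i/dt = omega_i + k * sum_j C_ij * sin(theta_j - theta_i).
    `coupling` plays the role of a normalized anatomical connectivity matrix.
    """
    phase_diff = theta[None, :] - theta[:, None]        # theta_j - theta_i
    interaction = (coupling * np.sin(phase_diff)).sum(axis=1)
    return theta + dt * (omega + k * interaction)

rng = np.random.default_rng(0)
n = 8                                    # number of brain regions (toy size)
theta = rng.uniform(0, 2 * np.pi, n)     # initial phases
omega = rng.normal(10.0, 0.5, n)         # natural frequencies (rad/s)
coupling = np.ones((n, n)) / n           # all-to-all toy "connectivity"

for _ in range(20000):                   # strong coupling -> synchronization
    theta = kuramoto_step(theta, omega, coupling, k=5.0)

order = abs(np.exp(1j * theta).mean())   # Kuramoto order parameter in [0, 1]
```

With coupling well above the spread of natural frequencies, the order parameter approaches 1, a toy analogue of the synchronized oscillatory dynamics the tutorial relates to EEG signals.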

This tutorial walks participants through the application of dynamic causal modelling (DCM) to fMRI data using MATLAB. Participants are also shown various forms of DCM, how to generate and specify different models, and how to fit them to simulated neural and BOLD data.

This lesson corresponds to slides 158-187 of the PDF below. 

Difficulty level: Advanced
Duration: 1:22:10

This lecture focuses on the structured validation process within computational neuroscience, including the tools, services, and methods involved in simulation and analysis.

Difficulty level: Beginner
Duration: 14:19
Speaker: Michael Denker

This session includes presentations of infrastructures developed by members of the INCF Community that embrace the FAIR principles.

This lecture provides an overview of The Virtual Brain Simulation Platform.

Difficulty level: Beginner
Duration: 9:36
Speaker: Petra Ritter

This tutorial demonstrates how to use PyNN, a simulator-independent language for building neuronal network models, in conjunction with the neuromorphic hardware system SpiNNaker. 

Difficulty level: Intermediate
Duration: 25:49

This session includes presentations of infrastructures developed by members of the INCF Community that embrace the FAIR principles. This lecture provides an overview and demo of the Canadian Open Neuroscience Platform (CONP).

Difficulty level: Beginner
Duration: 14:02

This is the introductory module of the Deep Learning Course at NYU's Center for Data Science (CDS), a course covering the latest techniques in deep learning and representation learning, with a focus on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate
Duration: 50:17

This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:51:03
Speaker: Yann LeCun
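The two ideas this module pairs, gradient descent and backpropagation, can be shown together in a few lines: a one-hidden-layer network trained by hand-written chain-rule gradients. This is an illustrative sketch, not the course's code; the network size, learning rate, and toy regression task are arbitrary choices.

```python
import numpy as np

# Gradient descent with hand-written backpropagation on a tiny
# one-hidden-layer tanh network, fitting y = sin(pi * x) on 64 points.
rng = np.random.default_rng(42)
x = rng.uniform(-1, 1, (64, 1))
y = np.sin(np.pi * x)

w1 = rng.normal(0.0, 0.5, (1, 16)); b1 = np.zeros(16)
w2 = rng.normal(0.0, 0.5, (16, 1)); b2 = np.zeros(1)
lr = 0.2

for step in range(5000):
    # forward pass
    h = np.tanh(x @ w1 + b1)                # hidden activations
    y_hat = h @ w2 + b2                     # network output
    loss = ((y_hat - y) ** 2).mean()        # mean squared error

    # backward pass: apply the chain rule layer by layer
    d_yhat = 2.0 * (y_hat - y) / len(x)     # dLoss/dy_hat
    d_w2 = h.T @ d_yhat
    d_b2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ w2.T
    d_pre = d_h * (1.0 - h ** 2)            # tanh'(z) = 1 - tanh(z)^2
    d_w1 = x.T @ d_pre
    d_b1 = d_pre.sum(axis=0)

    # gradient descent update
    w1 -= lr * d_w1; b1 -= lr * d_b1
    w2 -= lr * d_w2; b2 -= lr * d_b2
```

Frameworks such as PyTorch automate exactly this backward pass; writing it once by hand is the standard way to see what autograd is doing.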

This lecture covers concepts associated with neural nets, including rotation and squashing, and is a part of the Deep Learning Course at New York University's Center for Data Science (CDS).

Difficulty level: Intermediate
Duration: 1:01:53
Speaker: Alfredo Canziani

This lesson provides a detailed description of some of the modules and architectures involved in the development of neural networks. 

Difficulty level: Intermediate
Duration: 1:42:26

This lecture covers the concept of neural nets training (tools, classification with neural nets, and PyTorch implementation) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:05:47
Speaker: Alfredo Canziani

This lecture covers the concept of parameter sharing in recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:59:47

This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 51:40
Speaker: Yann LeCun

This lecture discusses the properties of natural signals and the use of convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:09:12
Speaker: Alfredo Canziani

This lecture covers the concept of recurrent neural networks: vanilla and gated (LSTM) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:05:36
Speaker: Alfredo Canziani
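The "vanilla" recurrent net this lecture starts from applies one shared set of weights at every time step. A minimal forward pass makes the recurrence concrete (illustrative sketch, not the course's code; dimensions and weight scales are arbitrary):

```python
import numpy as np

def rnn_forward(x_seq, h0, w_xh, w_hh, b_h):
    """Vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h).
    The same W_xh, W_hh, b_h are reused at every time step."""
    h = h0
    states = []
    for x_t in x_seq:
        h = np.tanh(x_t @ w_xh + h @ w_hh + b_h)
        states.append(h)
    return np.stack(states)

rng = np.random.default_rng(0)
d_in, d_h, T = 4, 8, 10                  # input size, hidden size, seq length
states = rnn_forward(
    rng.normal(size=(T, d_in)),          # toy input sequence
    np.zeros(d_h),                       # initial hidden state
    rng.normal(scale=0.5, size=(d_in, d_h)),
    rng.normal(scale=0.5, size=(d_h, d_h)),
    np.zeros(d_h),
)
```

Gated variants such as the LSTM keep this outer structure but add learned gates that control how much of h_{t-1} is kept, which is what mitigates vanishing gradients over long sequences.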

This lecture lays the foundation for the concept of energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:51:30
Speaker: Yann LeCun

This lecture covers the concept of inference in latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:01:04
Speaker: Alfredo Canziani