
This lecture covers advanced concepts of energy-based models. The lecture is a part of the Associative Memories module of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this course include: Energy-Based Models I, Energy-Based Models II, Energy-Based Models III, Energy-Based Models IV, Energy-Based Models V, and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 2:00:28
Speaker: Yann LeCun

This tutorial covers advanced concepts of energy-based models. It is a part of the Associative Memories module of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Advanced
Duration: 1:12:00
Speaker: Alfredo Canziani

This lecture provides an introduction to the problem of speech recognition using neural models, emphasizing the CTC loss for training and inference when input and output sequences are of different lengths. It also covers the concept of beam search for use during inference, and how that procedure may be modeled at training time using a Graph Transformer Network. It is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1 - 5 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.
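Although the lecture itself is a video, a rough sketch can make the CTC setup concrete: the loss takes per-frame log-probabilities over an output alphabet plus a blank symbol, and marginalizes over every alignment between the long input sequence and the shorter target transcript. The sketch below assumes PyTorch's nn.CTCLoss; the shapes, alphabet size, and random tensors are placeholders, not values from the lecture.

# Minimal CTC sketch (illustrative only; sizes and tensors are placeholders).
import torch
import torch.nn as nn

T, N, C = 50, 4, 28        # input frames, batch size, output classes (index 0 = blank)
S = 12                     # maximum target transcript length

log_probs = torch.randn(T, N, C, requires_grad=True).log_softmax(dim=2)  # stand-in for an acoustic model's output
targets = torch.randint(1, C, (N, S), dtype=torch.long)                  # label indices (blank excluded)
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.randint(5, S + 1, (N,), dtype=torch.long)

ctc = nn.CTCLoss(blank=0)  # sums over all alignments of the targets to the T frames
loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()            # in a real system, gradients flow back into the acoustic model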

Difficulty level: Advanced
Duration: 1:55:03
Speaker: Awni Hannun

This lecture covers the architecture and convolution operation of traditional convolutional neural networks, the characteristics of graphs and graph convolution, and spectral graph convolutional neural networks, including how to perform spectral convolution. It then surveys the full spectrum of Graph Convolutional Networks (GCNs), starting with the implementation of spectral convolution through Spectral Networks, and shows how the other definition of convolution, template matching, applies to graphs, leading to spatial networks. This lecture is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1 - 5 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.
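As a loose companion to the spectral and spatial viewpoints described above, here is a minimal sketch of a widely used graph-convolution layer of the form H' = act(D^{-1/2}(A + I)D^{-1/2}HW), written in PyTorch; it is an illustrative spatial-style layer, not necessarily the exact formulation derived in the lecture.

# Minimal graph-convolution layer (illustrative sketch, not the lecture's code).
import torch
import torch.nn as nn

class GraphConv(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, A, H):
        # A: (n, n) adjacency matrix, H: (n, in_dim) node features
        A_hat = A + torch.eye(A.size(0))          # add self-loops
        deg = A_hat.sum(dim=1)                    # node degrees
        D_inv_sqrt = torch.diag(deg.pow(-0.5))    # D^{-1/2}
        A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt  # symmetric normalization
        return torch.relu(A_norm @ self.lin(H))   # aggregate neighbours, then transform

# Toy usage: 5 nodes arranged in a ring, 3-dim features mapped to 8-dim outputs.
A = torch.zeros(5, 5)
for i in range(5):
    j = (i + 1) % 5
    A[i, j] = 1.0
    A[j, i] = 1.0
layer = GraphConv(3, 8)
out = layer(A, torch.randn(5, 3))  # shape (5, 8)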

Difficulty level: Advanced
Duration: 2:00:22
Speaker: Xavier Bresson

This tutorial covers the concept of graph convolutional networks and is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1 - 5 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 57:33
Speaker: Alfredo Canziani

This lecture covers the concept of model predictive control and is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1-6 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 1:10:22
Speaker: Alfredo Canziani

This lecture covers the concepts of emulating kinematics from observations and training a policy. It is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1-6 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 1:01:21
Speaker: Alfredo Canziani

This lecture covers the concept of predictive policy learning under uncertainty and is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1-6 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 1:14:44
Speaker: Alfredo Canziani

This lecture covers the concepts of gradient descent, stochastic gradient descent, and momentum. It is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1-7 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.
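To make the update rules concrete, here is a minimal plain-PyTorch sketch on a toy regression problem; the data, step size, and momentum coefficient are arbitrary illustrative choices, not values from the lecture.

# Sketch of SGD with momentum on a toy least-squares problem (illustrative only).
import torch

torch.manual_seed(0)
X, y = torch.randn(256, 3), torch.randn(256)  # toy regression data
w = torch.zeros(3, requires_grad=True)
lr, mu = 0.1, 0.9
v = torch.zeros_like(w)                       # momentum buffer

for step in range(100):
    idx = torch.randint(0, 256, (32,))        # "stochastic": sample a random mini-batch
    loss = ((X[idx] @ w - y[idx]) ** 2).mean()
    loss.backward()
    with torch.no_grad():
        v = mu * v - lr * w.grad              # momentum: v <- mu*v - lr*grad
        w += v                                # plain GD/SGD would instead do w -= lr*w.grad
        w.grad.zero_()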

Difficulty level: Advanced
Duration: 1:29:05
Speaker: Aaron DeFazio

This lecture continues the discussion of descent methods from the previous lesson, Optimization I. This lesson is a part of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this module include: Modules 1-7 of this course and an Introduction to Data Science or a Graduate Level Machine Learning course.

Difficulty level: Advanced
Duration: 1:51:32
Speaker: Alfredo Canziani

This lesson gives an introduction to deep learning from the perspective of inductive biases, with an emphasis on matching deep learning methods to the right research questions.

Difficulty level: Beginner
Duration: 01:35:12
Speaker: Blake Richards

As a part of NeuroHackademy 2021, Noah Benson gives an introduction to PyTorch, one of the two most common software packages for deep learning applications in the neurosciences.

Difficulty level: Beginner
Duration: 00:50:40
Speaker: Noah Benson

Learn how to use TensorFlow 2.0 in this full tutorial for beginners. This course is designed for Python programmers looking to enhance their knowledge and skills in machine learning and artificial intelligence.

Throughout the 8 modules in this course, you will learn about fundamental concepts and methods in ML & AI, such as core learning algorithms, deep learning with neural networks, computer vision with convolutional neural networks, natural language processing with recurrent neural networks, and reinforcement learning.

Difficulty level: Beginner
Duration: 06:52:07
Speaker:

In this hands-on tutorial, Dr. Robert Guangyu Yang works through a number of coding exercises to show how RNNs can easily be used to study cognitive neuroscience questions, with a quick demonstration of how to train and analyze RNNs on various cognitive neuroscience tasks. Familiarity with Python and basic knowledge of PyTorch are assumed.
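In the spirit of the tutorial, here is a minimal sketch of training a small RNN on a toy perceptual-decision task (report the sign of a noisy input's mean); the task design, network sizes, and hyperparameters are invented for illustration and are not the tutorial's own code.

# Train a small RNN on a toy evidence-integration task (illustrative sketch).
import torch
import torch.nn as nn

class DecisionRNN(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
        self.readout = nn.Linear(hidden, 2)   # two choices: mean is positive or negative

    def forward(self, x):
        h, _ = self.rnn(x)                    # hidden states, shape (batch, time, hidden)
        return self.readout(h[:, -1])         # decide from the final hidden state

def make_batch(batch=64, T=30):
    coherence = 0.5 * torch.randn(batch, 1, 1)      # signed evidence strength per trial
    x = coherence + torch.randn(batch, T, 1)        # noisy evidence stream
    y = (coherence.squeeze() > 0).long()            # correct choice
    return x, y

model = DecisionRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(500):
    x, y = make_batch()
    loss = nn.functional.cross_entropy(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
# After training, the hidden states in model.rnn can be analyzed (e.g. PCA across trials).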

Difficulty level: Beginner
Duration: 00:26:38
Speaker: Robert Guangyu Yang

This lesson contains practical exercises which accompany the first few lessons of the Neuroscience for Machine Learners (Neuro4ML) course.

Difficulty level: Intermediate
Duration: 5:58
Speaker: Dan Goodman

This video briefly goes over the exercises accompanying Week 6 of the Neuroscience for Machine Learners (Neuro4ML) course, Understanding Neural Networks.

Difficulty level: Intermediate
Duration: 2:43
Speaker: Marcus Ghosh

This lecture covers the description and characterization of an input-output relationship in an information-theoretic context.

Difficulty level: Beginner
Duration: 1:35:33

This lesson is part 1 of 2 of a tutorial on statistical models for neural data.

Difficulty level: Beginner
Duration: 1:45:48
Speaker: Jonathan Pillow

This lesson is part 2 of 2 of a tutorial on statistical models for neural data.

Difficulty level: Beginner
Duration: 1:50:31
Speaker: Jonathan Pillow

This lesson provides an introduction to modeling single neurons, as well as stability analysis of neural models.
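As a minimal illustration of the kind of single-neuron modeling introduced here, the sketch below integrates a leaky integrate-and-fire neuron with forward Euler; the parameter values are placeholders and are not taken from the lecture.

# Leaky integrate-and-fire neuron, forward-Euler integration (illustrative sketch).
import numpy as np

tau_m    = 10.0    # membrane time constant (ms)
R        = 10.0    # membrane resistance (arbitrary units)
v_rest   = -65.0   # resting potential (mV)
v_thresh = -50.0   # spike threshold (mV)
v_reset  = -65.0   # reset potential (mV)
I_ext    = 2.0     # constant input current (arbitrary units)
dt, T    = 0.1, 500.0  # time step and total duration (ms)

steps = int(T / dt)
v = np.full(steps, v_rest)
spike_times = []
for t in range(1, steps):
    dv = (-(v[t - 1] - v_rest) + R * I_ext) / tau_m  # dv/dt = (-(v - v_rest) + R*I) / tau_m
    v[t] = v[t - 1] + dt * dv
    if v[t] >= v_thresh:
        spike_times.append(t * dt)
        v[t] = v_reset

print(f"{len(spike_times)} spikes in {T:.0f} ms")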

Difficulty level: Intermediate
Duration: 1:26:06
Speaker: Bard Ermentrout