This lecture covers concepts associated with neural nets, including rotation and squashing, and is a part of the Deep Learning Course at New York University's Center for Data Science (CDS).

Difficulty level: Intermediate
Duration: 1:01:53
Speaker: Alfredo Canziani

This lecture covers neural net training (tools, classification with neural nets, and PyTorch implementation) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:05:47
Speaker: Alfredo Canziani
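
As a rough illustration of the kind of PyTorch implementation this lecture refers to, the sketch below trains a small classifier on synthetic data; the data, network shape, and hyperparameters are placeholders chosen for this example, not the lecture's own.

```python
import torch
from torch import nn

# Synthetic 2-D points with an XOR-like labelling, a stand-in for the lecture's toy data.
X = torch.randn(200, 2)
y = (X[:, 0] * X[:, 1] > 0).long()

# A small multilayer perceptron: linear map + non-linearity, then a classification head.
model = nn.Sequential(
    nn.Linear(2, 16),
    nn.ReLU(),
    nn.Linear(16, 2),
)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(100):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = criterion(model(X), y)  # forward pass + loss
    loss.backward()                # backpropagate
    optimizer.step()               # update parameters
```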

This lecture discusses the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:09:12
Speaker: Alfredo Canziani

This lecture covers recurrent neural networks, both vanilla and gated (LSTM), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:05:36
Speaker: Alfredo Canziani

This lecture covers the concept of inference in latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:01:04
Speaker: Alfredo Canziani

This tutorial covers the concept of training latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:04:48
Speaker: Alfredo Canziani

This lecture covers advanced concepts of energy-based models. The lecture is a part of the Advanced Energy-Based Models module of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this course include Energy-Based Models I, Energy-Based Models II, Energy-Based Models III, and an Introduction to Data Science or a graduate-level machine learning course.

Difficulty level: Beginner
Duration: 56:41
Speaker: Alfredo Canziani

As part of NeuroHackademy 2021, Noah Benson gives an introduction to PyTorch, one of the two most common software packages for deep learning applications to the neurosciences.

Difficulty level: Beginner
Duration: 00:50:40
Speaker: Noah Benson
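
For readers who want a preview before watching, the snippet below shows the two PyTorch basics such an introduction typically starts from, tensors and automatic differentiation; the exact examples in the lecture may differ.

```python
import torch

# Tensors behave much like NumPy arrays but can track gradients and run on GPUs.
x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# A simple scalar function of x.
y = (x ** 2).sum()

# Autograd computes dy/dx = 2x for us.
y.backward()
print(x.grad)   # tensor([2., 4., 6.])
```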

In this hands-on tutorial, Dr. Robert Guangyu Yang works through a number of coding exercises showing how RNNs can be used to study cognitive neuroscience questions, with a quick demonstration of how to train and analyze RNNs on various cognitive neuroscience tasks. Familiarity with Python and basic knowledge of PyTorch are assumed.

Difficulty level: Beginner
Duration: 00:26:38
Speaker: Robert Guangyu Yang
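
A minimal sketch of the kind of model such a tutorial builds, a vanilla `nn.RNN` with a linear readout mapping an input sequence to a decision output; the task dimensions and names here are made up for illustration, not taken from the tutorial.

```python
import torch
from torch import nn

class SimpleRNNModel(nn.Module):
    """Vanilla RNN followed by a linear readout, a common starting point
    for cognitive-task models (dimensions here are illustrative only)."""

    def __init__(self, input_size=3, hidden_size=64, output_size=2):
        super().__init__()
        self.rnn = nn.RNN(input_size, hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        hidden, _ = self.rnn(x)         # hidden: (batch, time, hidden_size)
        return self.readout(hidden)     # decision output at every time step

model = SimpleRNNModel()
trial = torch.randn(8, 50, 3)           # batch of 8 trials, 50 time steps, 3 inputs
output = model(trial)                   # shape (8, 50, 2)
```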

This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs). This lesson corresponds to slides 65-90 of the PDF below.

Difficulty level: Intermediate
Duration: 1:15:04
Speaker: Daniel Hauke
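
The HGF update equations themselves are covered in the slides; as a toy reference point only, the snippet below implements the single-level Gaussian belief update (a precision-weighted prediction error) that hierarchical Gaussian filters generalize across levels. The function and variable names are illustrative, not taken from the tutorial.

```python
def gaussian_belief_update(mu_prior, pi_prior, observation, pi_obs):
    """Update a Gaussian belief (mean mu, precision pi) after one noisy observation.

    The posterior mean moves toward the observation by a precision-weighted
    prediction error; this is the basic step that hierarchical Gaussian
    filters stack across levels.
    """
    pi_post = pi_prior + pi_obs                        # precisions add
    prediction_error = observation - mu_prior
    mu_post = mu_prior + (pi_obs / pi_post) * prediction_error
    return mu_post, pi_post

# Example: prior belief N(0, 1) (precision 1), observation 2.0 with precision 4.
mu, pi = gaussian_belief_update(0.0, 1.0, 2.0, 4.0)    # mu -> 1.6, pi -> 5.0
```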

This tutorial introduces pipelines and methods to compute brain connectomes from fMRI data. With corresponding code and repositories, participants can follow along and learn how to programmatically preprocess, curate, and analyze functional and structural brain data to produce connectivity matrices. 

Difficulty level: Intermediate
Duration: 1:39:04
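
To make the end product concrete, the sketch below computes a functional connectivity matrix as pairwise Pearson correlations between regional time series; real pipelines, such as those in the tutorial's repositories, add preprocessing, parcellation, and denoising steps that are omitted here, and the data below is random placeholder input.

```python
import numpy as np

# Placeholder for parcellated fMRI data: 100 regions x 200 time points.
# In practice these time series come from preprocessed data and an atlas.
rng = np.random.default_rng(0)
timeseries = rng.standard_normal((100, 200))

# Functional connectome: region-by-region Pearson correlation matrix.
connectome = np.corrcoef(timeseries)      # shape (100, 100), symmetric

# Zero the diagonal so self-connections don't dominate plots or statistics.
np.fill_diagonal(connectome, 0.0)
```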

EyeWire is a game to map the brain. Players are challenged to map branches of a neuron from one side of a cube to the other in a 3D puzzle. Players scroll through the cube and reconstruct neurons with the help of an artificial intelligence algorithm developed at the Seung Lab at Princeton University. EyeWire gameplay advances neuroscience by helping researchers discover how neurons connect to process visual information.

Difficulty level: Beginner
Duration: 03:56
Speaker: EyeWire

Mozak is a scientific discovery game about neuroscience for citizen scientists and neuroscientists alike. Players help neuroscientists build models of brain cells and learn more about the brain through their efforts.

Difficulty level: Beginner
Duration: 00:43
Speaker: Mozak

This module explains how neurons come together to create the networks that give rise to our thoughts. The totality of our neurons and their connections is called our connectome. Learn how this connectome changes as we learn and how it computes information.

Difficulty level: Beginner
Duration: 7:13
Speaker: Harrison Canning

This lecture provides an introduction to the study of eye-tracking in humans. 

Difficulty level: Beginner
Duration: 34:05
Speaker: Ulrich Ettinger

This demonstration walks through how to import your data into MATLAB.

Difficulty level: Beginner
Duration: 6:10
Speaker: MATLAB®

This lesson provides instruction on the various factors one must consider when preprocessing data and preparing it for statistical exploration and analysis.

Difficulty level: Beginner
Duration: 15:10
Speaker: MATLAB®

This tutorial outlines, step by step, how to perform analysis by group and how to do change-point detection.

Difficulty level: Beginner
Duration: 2:49
Speaker: MATLAB®

This tutorial walks through several common methods for visualizing your data in different ways depending on your data type.

Difficulty level: Beginner
Duration: 6:10
Speaker: MATLAB®

This tutorial illustrates several ways to approach predictive modeling and machine learning with MATLAB.

Difficulty level: Beginner
Duration: 6:27
Speaker: MATLAB®