This lecture provides an introductory overview of some of the most important concepts in software engineering.

Difficulty level: Beginner
Duration: 32:59
Speaker: Jeff Muller

In this lesson, you will learn in more detail about neuromorphic computing, that is, non-standard computational architectures that mimic some aspect of the way the brain works. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 10:08
Speaker: Dan Goodman
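
To make the idea concrete, the sketch below simulates a leaky integrate-and-fire neuron, one of the simplest brain-inspired units that neuromorphic hardware typically implements. It is a minimal illustration in plain Python, not material from the lecture, and the parameter values are arbitrary choices.

```python
# A minimal, illustrative leaky integrate-and-fire neuron (not from the lecture);
# all parameter values below are arbitrary choices for the sketch.
dt = 1e-3        # time step (s)
tau = 20e-3      # membrane time constant (s)
v_th = 1.0       # spike threshold
v_reset = 0.0    # reset value after a spike
i_in = 1.2       # constant input drive (arbitrary units)

v = 0.0
spike_times = []
for step in range(1000):                 # simulate 1 second
    v += dt / tau * (i_in - v)           # leak toward the input drive
    if v >= v_th:                        # threshold crossing -> emit a spike
        spike_times.append(step * dt)
        v = v_reset                      # reset the membrane potential

print(f"{len(spike_times)} spikes in 1 s of simulated time")
```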

This video provides a very quick introduction to neuromorphic sensing devices and the unique, low-power applications they enable.

Difficulty level: Intermediate
Duration: 2:37
Speaker: Dan Goodman

This talk enumerates the challenges regarding data accessibility and reusability inherent in the current scientific publication system, and discusses novel approaches to these challenges, such as the EBRAINS Live Papers platform. 

Difficulty level: Beginner
Duration: 18:08
Speaker: Andrew Davison

This brief video gives an introduction to the eighth session of INCF's Neuroinformatics Assembly 2023, focusing on FAIR data and the role of academic journals. 

Difficulty level: Beginner
Duration: 5:57
Speaker: Jan G. Bjaalie

This talk gives an overview of the perspectives and FAIR-aligned policies of the Public Library of Science (PLOS), a nonprofit, open-access publisher that empowers researchers to accelerate progress in science.

Difficulty level: Beginner
Duration: 11:53

This talk highlights a set of platform technologies, software, and data collections that close and shorten the feedback cycle in research. 

Difficulty level: Beginner
Duration: 57:52
Speaker: Satrajit Ghosh

This is the introductory module of the Deep Learning Course at NYU's Center for Data Science (CDS), a course covering the latest techniques in deep learning and representation learning: supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate
Duration: 50:17

This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:51:03
Speaker: Yann LeCun
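
As a quick illustration of the two ideas covered in this module, the sketch below runs one forward pass, one backpropagation pass, and one manual gradient-descent update in PyTorch. The toy model, random data, and learning rate are placeholder choices, not the course's own example.

```python
# A minimal sketch of one gradient-descent step with backpropagation in PyTorch.
import torch

model = torch.nn.Linear(3, 1)                  # a single linear layer
x = torch.randn(8, 3)                          # a toy batch of 8 inputs
y = torch.randn(8, 1)                          # toy targets
lr = 0.1                                       # illustrative learning rate

y_hat = model(x)                               # forward pass
loss = torch.nn.functional.mse_loss(y_hat, y)  # scalar loss
loss.backward()                                # backpropagation: fills p.grad for each parameter

with torch.no_grad():                          # gradient-descent update: p <- p - lr * dL/dp
    for p in model.parameters():
        p -= lr * p.grad
        p.grad.zero_()

print(f"loss after forward pass: {loss.item():.4f}")
```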

This lecture covers concepts associated with neural nets, including rotation and squashing, and is a part of the Deep Learning Course at New York University's Center for Data Science (CDS). A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:01:53
Speaker: Alfredo Canziani
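
A minimal sketch of the "rotate, then squash" picture of a neural-net layer: a linear map (here a pure 2-D rotation) followed by a pointwise tanh nonlinearity. The rotation angle and input points are arbitrary illustrative choices, not an example from the lecture.

```python
# Illustrative only: a 2-D rotation (the linear part of a layer) followed by
# tanh (the squashing nonlinearity).
import math
import torch

theta = math.pi / 4                                    # 45-degree rotation
R = torch.tensor([[math.cos(theta), -math.sin(theta)],
                  [math.sin(theta),  math.cos(theta)]])

points = torch.tensor([[1.0, 0.0],
                       [0.0, 2.0],
                       [-1.0, -1.0]])

rotated = points @ R.T            # linear transformation: rotate the inputs
squashed = torch.tanh(rotated)    # nonlinearity: squash each coordinate into (-1, 1)
print(squashed)
```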

This lesson provides a detailed description of some of the modules and architectures involved in the development of neural networks. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:42:26
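
As a hint of what "modules" look like in practice, the sketch below composes a few standard PyTorch building blocks into a tiny feed-forward architecture. The layer sizes and structure are illustrative placeholders, not the architectures discussed in the lesson.

```python
# Illustrative only: composing standard modules into a small architecture.
import torch
from torch import nn

class SmallNet(nn.Module):
    """A tiny feed-forward architecture built from reusable modules."""
    def __init__(self, in_dim=16, hidden=32, out_dim=10):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, x):
        return self.body(x)

net = SmallNet()
print(net(torch.randn(4, 16)).shape)   # torch.Size([4, 10])
```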

This lecture covers neural net training (tools, classification with neural nets, and PyTorch implementation) and is a part of the Deep Learning Course at NYU's Center for Data Science. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:05:47
Speaker: Alfredo Canziani
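
The sketch below is a minimal PyTorch classification training loop in the spirit of this lecture; the two-layer model, random data, and hyperparameters are stand-ins, not the course notebook.

```python
# Illustrative only: a minimal classification training loop in PyTorch.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))  # 3-class classifier
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

x = torch.randn(128, 20)                 # toy inputs
y = torch.randint(0, 3, (128,))          # toy integer class labels

for epoch in range(10):
    optimizer.zero_grad()
    logits = model(x)                    # forward pass
    loss = criterion(logits, y)          # cross-entropy on raw logits
    loss.backward()                      # backpropagate
    optimizer.step()                     # update parameters

accuracy = (model(x).argmax(dim=1) == y).float().mean()
print(f"final loss {loss.item():.3f}, train accuracy {accuracy:.2f}")
```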

This lecture covers parameter sharing in recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:59:47
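
To illustrate what parameter sharing means here, the sketch below builds a 1-D convolution (the same kernel reused at every position) and a vanilla RNN (the same weights reused at every time step) and shows that their parameter counts do not depend on the input length. All sizes are arbitrary illustrative choices.

```python
# Illustrative only: parameter sharing across positions (conv) and time steps (RNN).
import torch
from torch import nn

conv = nn.Conv1d(in_channels=1, out_channels=4, kernel_size=3)
rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)

signal = torch.randn(2, 1, 100)      # batch of 2 signals, 100 samples each
sequence = torch.randn(2, 50, 8)     # batch of 2 sequences, 50 steps each

conv_out = conv(signal)              # the same 4x1x3 kernel slid over all 98 output positions
rnn_out, h_n = rnn(sequence)         # the same recurrent weights applied at all 50 steps

# Parameter counts do not grow with signal length or sequence length:
print(sum(p.numel() for p in conv.parameters()))   # 4*1*3 + 4 = 16
print(sum(p.numel() for p in rnn.parameters()))    # 8*16 + 16*16 + 16 + 16 = 416
```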

This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 51:40
Speaker: Yann LeCun

This lecture discusses the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:09:12
Speaker: Alfredo Canziani
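
As a small illustration of how convolutional nets exploit the locality and stationarity of natural signals, the sketch below applies a 2-D convolution with a 3x3 kernel to an image-shaped tensor; the sizes are arbitrary and not taken from the lecture.

```python
# Illustrative only: a small kernel exploits locality, and sliding the same
# kernel everywhere exploits the stationarity of natural signals.
import torch
from torch import nn

conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3, padding=1)
image_batch = torch.randn(4, 3, 32, 32)           # 4 RGB "images", 32x32 pixels

features = conv(image_batch)                      # the same 8 kernels applied at every location
print(features.shape)                             # torch.Size([4, 8, 32, 32])
print(sum(p.numel() for p in conv.parameters()))  # 8*3*3*3 + 8 = 224, independent of image size
```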

This lecture covers recurrent neural networks, both vanilla and gated (LSTM), and is a part of the Deep Learning Course at NYU's Center for Data Science. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:05:36
Speaker: Alfredo Canziani
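
The sketch below contrasts a vanilla RNN with a gated LSTM on the same toy sequence, showing that the LSTM additionally carries a gated cell state; all dimensions are illustrative placeholders.

```python
# Illustrative only: vanilla RNN vs. gated LSTM on the same toy batch.
import torch
from torch import nn

x = torch.randn(4, 25, 10)                         # batch of 4 sequences, 25 steps, 10 features

rnn = nn.RNN(input_size=10, hidden_size=32, batch_first=True)
lstm = nn.LSTM(input_size=10, hidden_size=32, batch_first=True)

rnn_out, h_rnn = rnn(x)                            # vanilla RNN: only a hidden state
lstm_out, (h_lstm, c_lstm) = lstm(x)               # LSTM: hidden state plus a gated cell state

print(rnn_out.shape, lstm_out.shape)               # both: torch.Size([4, 25, 32])
print(h_rnn.shape, h_lstm.shape, c_lstm.shape)     # each: torch.Size([1, 4, 32])
```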

This is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:51:30
Speaker: Yann LeCun
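
As a toy picture of the joint embedding idea, the sketch below defines an energy as the squared distance between the embeddings of x and y produced by two small encoders, so that compatible pairs get low energy. The encoders and dimensions are illustrative assumptions, not the models from the lecture.

```python
# Illustrative only: a joint-embedding energy; low energy = compatible (x, y) pair.
import torch
from torch import nn

encode_x = nn.Linear(10, 4)        # embeds inputs x (toy stand-in)
encode_y = nn.Linear(6, 4)         # embeds candidate outputs y (toy stand-in)

def energy(x, y):
    # Squared distance between the two embeddings.
    return ((encode_x(x) - encode_y(y)) ** 2).sum(dim=-1)

x = torch.randn(5, 10)             # batch of 5 inputs
y = torch.randn(5, 6)              # batch of 5 candidate outputs
print(energy(x, y))                # one energy value per (x, y) pair
```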

This lecture covers inference in latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science. A brief illustrative sketch follows this entry.

Difficulty level: Intermediate
Duration: 1:01:04
Speaker: Alfredo Canziani
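
The sketch below illustrates what inference means in a latent-variable EBM: with the model parameters fixed, gradient descent searches over the latent z to minimize the energy E(x, y, z) for a given input/output pair. The quadratic energy and the small decoder are illustrative stand-ins, not the lecture's model.

```python
# Illustrative only: inference in a latent-variable energy-based model as a
# search over the latent z (the network weights stay fixed).
import torch

decoder = torch.nn.Linear(6, 2)              # maps [x, z] to a predicted y (toy stand-in)
x = torch.randn(2)                           # observed input
y = torch.randn(2)                           # candidate output

def energy(x, y, z):
    y_hat = decoder(torch.cat([x, z]))       # how well does z (given x) explain y?
    return ((y_hat - y) ** 2).sum() + 0.1 * (z ** 2).sum()   # plus a small penalty on z

z = torch.zeros(4, requires_grad=True)       # start the latent search at the origin
opt = torch.optim.SGD([z], lr=0.1)

for _ in range(100):                         # inference = gradient descent in z, not in the weights
    opt.zero_grad()
    e = energy(x, y, z)
    e.backward()
    opt.step()

print(f"energy after inference: {energy(x, y, z).item():.4f}")
```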

This panel discussion covers how energy-based models are used and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 10:42

This is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:48:53
Speaker: Yann LeCun