Initialization, Optimization, and Regularization - Day 12 lecture of the Foundations of Machine Learning in Python course.

High-Performance Computing and Analytics Lab, University of Bonn

Difficulty level: Advanced
Duration: 42:07
Speaker: Moritz Wolter

U-Nets for Medical Image Segmentation - Day 13 lecture of the Foundations of Machine Learning in Python course.

High-Performance Computing and Analytics Lab, University of Bonn

Difficulty level: Advanced
Duration: 16:45
Speaker: Moritz Wolter

Sequence Processing - Day 15 lecture of the Foundations of Machine Learning in Python course.

High-Performance Computing and Analytics Lab, University of Bonn

Difficulty level: Advanced
Duration: 47:45
Speaker: Moritz Wolter

This lesson gives a brief introduction to the course Neuroscience for Machine Learners (Neuro4ML). 

Difficulty level: Beginner
Duration: 1:25
Speaker: Dan Goodman

This lesson covers the history of neuroscience and machine learning, and the story of how these two seemingly disparate fields are increasingly merging. 

Difficulty level: Beginner
Duration: 12:25
Speaker: Dan Goodman

In this lesson, you will learn about the current challenges facing the integration of machine learning and neuroscience. 

Difficulty level: Beginner
Duration: 5:42
Speaker: Dan Goodman

This lesson delves into the structure of one of the brain's most elemental computational units, the neuron, and how that structure influences computational neural network models.

Difficulty level: Intermediate
Duration: 6:33
Speaker: Marcus Ghosh

In this lesson, you will learn how machine learners and neuroscientists construct abstract computational models based on various neurophysiological signalling properties.

Difficulty level: Intermediate
Duration: 10:52
Speaker: Dan Goodman

This lesson characterizes different types of learning in a neuroscientific and cellular context, and various models employed by researchers to investigate the mechanisms involved. 

Difficulty level: Intermediate
Duration: 3:54
Speaker: Dan Goodman

In this lesson, you will learn about different approaches to modeling learning in neural networks, particularly focusing on how system parameters such as firing rates and synaptic weights impact a network.

Difficulty level: Intermediate
Duration: 9:40
Speaker: Dan Goodman

In this lesson, you will learn about some of the many methods for training spiking neural networks (SNNs) that either avoid gradients entirely or use them only in a limited or constrained way.

Difficulty level: Intermediate
Duration: 5:14
Speaker: Dan Goodman

In this lesson, you will learn how to train spiking neural networks (SNNs) with a surrogate gradient method. 

Difficulty level: Intermediate
Duration: 11:23
Speaker: Dan Goodman

This lesson provides an overview of self-supervision as it relates to neural data tasks and the Mine Your Own vieW (MYOW) approach.

Difficulty level: Beginner
Duration: 25:50
Speaker: Eva Dyer

This lesson provides a conceptual overview of the rudiments of machine learning, including its bases in traditional statistics and the types of questions it might be applied to. The lesson was presented in the context of the BrainHack School 2020.

Difficulty level: Beginner
Duration: 01:22:18
Speaker: Estefany Suárez

This lesson presents advanced machine learning algorithms for neuroimaging, while addressing some real-world considerations related to data size and type.

Difficulty level: Beginner
Duration: 01:17:14
Speaker: Gael Varoquaux

This lecture covers FAIR atlases, including their background and construction, as well as how they can be created in line with the FAIR principles.

Difficulty level: Beginner
Duration: 14:24
Speaker: Heidi Kleven

This is the introductory module of the Deep Learning Course at NYU's Center for Data Science (CDS), which covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate
Duration: 50:17

This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:51:03
Speaker: Yann LeCun

This lecture covers the concept of parameter sharing: recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:59:47

This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 51:40
Speaker: Yann LeCun