This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).

This lesson corresponds to slides 65-90 of the PDF below.

Difficulty level: Intermediate

Duration: 1:15:04

Speaker: Daniel Hauke
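The tutorial above works with the Hierarchical Gaussian Filter itself; as a much simpler, self-contained illustration of the kind of trial-by-trial belief updating it covers, the Python sketch below tracks a belief about a binary outcome probability with a beta-Bernoulli update and a forgetting factor. This is not the HGF, and all names and parameter values are illustrative assumptions rather than anything taken from the tutorial.

```python
import numpy as np

# Illustrative only: a beta-Bernoulli filter tracking the probability of a
# binary outcome trial by trial. This is much simpler than the HGF, which
# additionally tracks higher-level (volatility) beliefs.
rng = np.random.default_rng(0)

# Simulate a task whose true reward probability switches halfway through.
true_p = np.concatenate([np.full(100, 0.8), np.full(100, 0.2)])
outcomes = rng.random(true_p.size) < true_p

a, b = 1.0, 1.0          # Beta(1, 1) prior over the outcome probability
decay = 0.95             # forgetting factor so old evidence fades (assumed value)
belief_trajectory = []

for y in outcomes:
    # Exponential forgetting keeps the belief able to track changes.
    a = 1.0 + decay * (a - 1.0)
    b = 1.0 + decay * (b - 1.0)
    # Conjugate Bayesian update for a Bernoulli observation.
    a += y
    b += 1 - y
    belief_trajectory.append(a / (a + b))  # posterior mean estimate of p

print(belief_trajectory[99], belief_trajectory[-1])
```

Plotting `belief_trajectory` shows the estimate rising toward 0.8 and then relaxing toward 0.2 after the switch, i.e. a belief trajectory of the sort the tutorial maps with the HGF.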

This lesson briefly goes over the outline of the Neuroscience for Machine Learners course.

Difficulty level: Intermediate

Duration: 3:05

Speaker: Dan Goodman

This lesson goes over the basic mechanisms of neural synapses, the junctions between neurons across which signals are transmitted.

Difficulty level: Intermediate

Duration: 7:03

Speaker: Marcus Ghosh

While the previous lesson in the Neuro4ML course dealt with the mechanisms of individual synapses, this lesson discusses how synapses, and the firing patterns of the neurons they connect, may change over time.

Difficulty level: Intermediate

Duration: 4:48

Speaker: Marcus Ghosh

This lesson introduces some practical exercises which accompany the Synapses and Networks portion of this Neuroscience for Machine Learners course.

Difficulty level: Intermediate

Duration: 3:51

Speaker: Dan Goodman

This lesson delves into the human nervous system and the immense cellular, connectomic, and functional sophistication therein.

Difficulty level: Intermediate

Duration: 8:41

Speaker: Marcus Ghosh

This lesson characterizes different types of learning in a neuroscientific and cellular context, and various models employed by researchers to investigate the mechanisms involved.

Difficulty level: Intermediate

Duration: 3:54

Speaker: Dan Goodman

This lesson describes spike timing-dependent plasticity (STDP), a biological process that adjusts the strength of connections between neurons in the brain, and how one can implement or mimic this process in a computational model. You will also find links for practical exercises at the bottom of this page.

Difficulty level: Intermediate

Duration: 12:50

Speaker: Dan Goodman
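As a rough illustration of the pair-based STDP rule described in the lesson above, the sketch below applies the classic exponential STDP window: a pre-before-post spike pair strengthens the synapse, a post-before-pre pair weakens it. The amplitudes, time constants, and spike times are illustrative assumptions, not values from the lesson.

```python
import numpy as np

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a single pre/post spike pair.

    delta_t = t_post - t_pre (ms). Pre-before-post (delta_t > 0) potentiates,
    post-before-pre (delta_t < 0) depresses. Parameter values are assumed.
    """
    if delta_t >= 0:
        return a_plus * np.exp(-delta_t / tau_plus)
    return -a_minus * np.exp(delta_t / tau_minus)

# Apply the rule to every pre/post spike pair and accumulate the weight change.
pre_spikes = np.array([10.0, 50.0, 90.0])    # ms, made-up spike times
post_spikes = np.array([12.0, 45.0, 95.0])

w = 0.5                                       # initial synaptic weight
for t_pre in pre_spikes:
    for t_post in post_spikes:
        w += stdp_dw(t_post - t_pre)
w = np.clip(w, 0.0, 1.0)                      # keep the weight in bounds
print(f"updated weight: {w:.3f}")
```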

While the previous lesson of this course described how researchers acquire neural data, this lesson discusses how to go about interpreting and analyzing that data.

Difficulty level: Intermediate

Duration: 9:24

Speaker: Marcus Ghosh

In this lesson you will learn about the motivation behind manipulating neural activity, and what forms that may take in various experimental designs.

Difficulty level: Intermediate

Duration: 8:42

Speaker: Marcus Ghosh

In this lesson, you will learn about one particular aspect of decision making: reaction times. In other words, how long does it take to make a decision based on a stream of information arriving continuously over time?

Difficulty level: Intermediate

Duration: 6:01

Speaker: Dan Goodman
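The lesson above asks how long a decision takes when evidence arrives continuously; a standard way to model this (whether or not it is the exact model used in the lesson) is an evidence-accumulation or drift-diffusion process, in which noisy evidence is integrated until it hits a decision bound. The sketch below simulates reaction times under assumed parameter values.

```python
import numpy as np

def drift_diffusion_rt(drift=0.3, noise=1.0, threshold=1.0, dt=0.001,
                       max_t=5.0, rng=None):
    """Simulate one decision by accumulating noisy evidence to a bound.

    Returns (reaction_time_in_s, choice). Parameter values are illustrative
    assumptions, not values from the lesson.
    """
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0.0
    while abs(x) < threshold and t < max_t:
        # Euler step of a diffusion process: constant drift plus Gaussian noise.
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return t, (1 if x >= threshold else -1)

rng = np.random.default_rng(1)
rts = [drift_diffusion_rt(rng=rng)[0] for _ in range(1000)]
print(f"mean simulated reaction time: {np.mean(rts):.3f} s")
```

Increasing `drift` (stronger evidence) or lowering `threshold` shortens the simulated reaction times, which is the basic trade-off behind the lesson's question.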

In this lesson, you will hear about some of the open issues in the field of neuroscience, as well as a discussion of whether neuroscience works, and how we can know.

Difficulty level: Intermediate

Duration: 6:54

Speaker: Marcus Ghosh

This lesson discusses a gripping neuroscientific question: why have neurons developed the discrete action potential, or spike, as a principal method of communication?

Difficulty level: Intermediate

Duration: 9:34

Speaker: Dan Goodman

This tutorial covers the fundamentals of collaborating with Git and GitHub.

Difficulty level: Intermediate

Duration: 2:15:50

Speaker: Elizabeth DuPre

This is the Introductory Module of the Deep Learning Course at NYU's Center for Data Science (CDS), a course covering the latest techniques in deep learning and representation learning: supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate

Duration: 50:17

Speaker: Yann LeCun and Alfredo Canziani

This module covers the concepts of gradient descent and the backpropagation algorithm, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:51:03

Speaker: Yann LeCun
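As a compact illustration of the two ideas named in the module above, the sketch below trains a one-hidden-layer network by gradient descent, with the gradients computed by hand via the chain rule (backpropagation). It is not taken from the course materials; the problem, layer sizes, and learning rate are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny regression problem: learn y = sin(x) on random inputs.
X = rng.uniform(-3, 3, size=(256, 1))
Y = np.sin(X)

# One hidden layer with tanh units; sizes and learning rate are assumed.
W1 = rng.normal(0, 0.5, size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, size=(32, 1)); b2 = np.zeros(1)
lr = 0.05

for step in range(2000):
    # Forward pass.
    H = np.tanh(X @ W1 + b1)            # hidden activations
    Y_hat = H @ W2 + b2                 # predictions
    loss = np.mean((Y_hat - Y) ** 2)    # mean squared error

    # Backward pass: chain rule applied layer by layer (backpropagation).
    dY_hat = 2 * (Y_hat - Y) / len(X)
    dW2 = H.T @ dY_hat; db2 = dY_hat.sum(axis=0)
    dH = dY_hat @ W2.T
    dZ1 = dH * (1 - H ** 2)             # derivative of tanh
    dW1 = X.T @ dZ1; db1 = dZ1.sum(axis=0)

    # Gradient descent step.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final MSE: {loss:.4f}")
```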

This lecture covers the concept of parameter sharing: recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:59:47

Speaker: Yann LeCun and Alfredo Canziani

This lecture covers convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 51:40

Speaker: Yann LeCun

This lecture discusses the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:09:12

Speaker: Alfredo Canziani

This lecture covers the concept of recurrent neural networks: vanilla and gated (LSTM) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:05:36

Speaker: Alfredo Canziani
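As a minimal illustration of the gated recurrent networks the lecture above covers, the sketch below pushes a batch of random sequences through PyTorch's built-in LSTM and prints the resulting tensor shapes. The sizes are arbitrary assumptions and the example is not drawn from the course materials.

```python
import torch
import torch.nn as nn

# Illustrative sizes only (assumed, not from the lecture).
batch, seq_len, input_size, hidden_size = 4, 10, 8, 16

# A single-layer LSTM; batch_first=True means inputs are (batch, seq, feature).
lstm = nn.LSTM(input_size=input_size, hidden_size=hidden_size, batch_first=True)

x = torch.randn(batch, seq_len, input_size)   # a batch of random sequences
output, (h_n, c_n) = lstm(x)

print(output.shape)  # (4, 10, 16): hidden state at every time step
print(h_n.shape)     # (1, 4, 16): final hidden state per layer
print(c_n.shape)     # (1, 4, 16): final cell state per layer
```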
