This lesson breaks down the principles of Bayesian inference and how they relate to cognitive processes and functions like learning and perception. It then explains how cognitive models can be built using Bayesian statistics in order to investigate how our brains interface with their environment.

This lesson corresponds to slides 1-64 in the PDF below.

Difficulty level: Intermediate

Duration: 1:28:14

Speaker: Andreea Diaconescu

This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).

This lesson corresponds to slides 65-90 of the PDF below.

Difficulty level: Intermediate

Duration: 1:15:04

Speaker: Daniel Hauke

Similarity Network Fusion (SNF) is a computational method for data integration across various kinds of measurements, aimed at taking advantage of the common as well as complementary information in different data types. This workshop walks participants through running SNF on EEG and genomic data using RStudio.

Difficulty level: Intermediate

Duration: 1:21:38

Speaker: Dan Felsky

This lesson continues from part one of the lecture *Ontologies, Databases, and Standards*, diving deeper into a description of ontologies and knowledge graphs.

Difficulty level: Intermediate

Duration: 50:18

Speaker: Jeff Grethe

This lecture describes how to build research workflows, including a demonstration of using DataJoint Elements to build data pipelines.

Difficulty level: Intermediate

Duration: 47:00

Speaker: Dimitri Yatsenko

This lesson characterizes different types of learning in a neuroscientific and cellular context, as well as the various models researchers employ to investigate the mechanisms involved.

Difficulty level: Intermediate

Duration: 3:54

Speaker: Dan Goodman

In this lesson, you will learn about different approaches to modeling learning in neural networks, particularly focusing on how system parameters such as firing rates and synaptic weights impact a network.

Difficulty level: Intermediate

Duration: 9:40

Speaker: Dan Goodman

In this lesson, you will learn about one particular aspect of decision making: reaction times. In other words, how long does it take to make a decision based on a stream of information arriving continuously over time?

Difficulty level: Intermediate

Duration: 6:01

Speaker: Dan Goodman

In this lesson, you will hear about some of the open issues in the field of neuroscience, as well as a discussion of whether neuroscience works, and how we can know.

Difficulty level: Intermediate

Duration: 6:54

Speaker: Marcus Ghosh

This lecture and tutorial focus on measuring human functional brain networks, as well as how to account for inherent variability within those networks.

Difficulty level: Intermediate

Duration: 50:44

Speaker: Caterina Gratton

This lecture gives an overview of how to prepare and preprocess neuroimaging (EEG/MEG) data for use in TVB.

Difficulty level: Intermediate

Duration: 1:40:52

Speaker: Paul Triebkorn

This is the Introductory Module to the Deep Learning Course at CDS, a course covering the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate

Duration: 50:17

Speaker: Yann LeCun and Alfredo Canziani

This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:51:03

Speaker: Yann LeCun

This lesson provides a detailed description of some of the modules and architectures involved in the development of neural networks.

Difficulty level: Intermediate

Duration: 1:42:26

Speaker: Yann LeCun and Alfredo Canziani

This lecture covers the concept of parameter sharing in recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:59:47

Speaker: Yann LeCun and Alfredo Canziani

This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 51:40

Speaker: Yann LeCun

This lecture is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:51:30

Speaker: Yann LeCun

This lecture is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate

Duration: 1:48:53

Speaker: Yann LeCun

This lecture discusses differential privacy and synthetic data in the context of medical data sharing in clinical neurosciences.

Difficulty level: Intermediate

Duration: 20:26

Speaker: Minos Garofalakis

This lecture focuses on ontologies for clinical neurosciences.

Difficulty level: Intermediate

Duration: 21:54

Speaker: Martin Hofmann-Apitius
