
This lesson demonstrates how to use MATLAB to implement a multivariate dimension reduction method, PCA, on time series data.

Difficulty level: Intermediate

Duration: 17:19

Speaker: Mike X. Cohen
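The lesson above implements PCA in MATLAB; as a rough illustration of the same idea, here is a minimal Python/NumPy sketch (simulated data and parameters are mine, not from the lesson) that extracts components from multichannel time series by eigendecomposing the channel covariance matrix.

```python
import numpy as np

# Minimal PCA sketch on simulated multichannel time series.
# (The lesson itself uses MATLAB; this is an illustrative Python analogue.)
rng = np.random.default_rng(0)
n_channels, n_time = 8, 1000

# Simulate correlated channels: one shared latent signal plus noise.
latent = np.sin(np.linspace(0, 20 * np.pi, n_time))
data = np.outer(rng.normal(size=n_channels), latent) \
       + 0.3 * rng.normal(size=(n_channels, n_time))

# Mean-center each channel, then eigendecompose the channel covariance.
data -= data.mean(axis=1, keepdims=True)
cov = data @ data.T / (n_time - 1)
evals, evecs = np.linalg.eigh(cov)       # eigh returns ascending eigenvalues
order = np.argsort(evals)[::-1]          # reorder: largest variance first
evals, evecs = evals[order], evecs[:, order]

# Component time series = projections of the data onto the eigenvectors.
components = evecs.T @ data
explained = evals / evals.sum()
print(f"variance explained by PC1: {explained[0]:.2f}")
```

Because all channels share one latent oscillation, the first component captures most of the variance.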

This is the first of two workshops on reproducibility in science, during which participants are introduced to the concepts of FAIR and open science. After discussing the definition of and need for FAIR science, participants are walked through tutorials on installing and using GitHub and Docker, two powerful open-source tools for versioning and publishing code and software, respectively.

Difficulty level: Intermediate

Duration: 1:20:58

Speaker: Erin Dickie and Sejal Patel

This is a hands-on tutorial on PLINK, the open-source whole-genome association analysis toolset. The aims of this tutorial are to teach users how to perform basic quality control on genetic datasets, as well as to identify and understand GWAS summary statistics.

Difficulty level: Intermediate

Duration: 1:27:18

Speaker: Dan Felsky

This is a tutorial on using the open-source software PRSice to calculate a set of polygenic risk scores (PRS) for a study sample. Users will also learn how to read PRS into R, visualize distributions, and perform basic association analyses.

Difficulty level: Intermediate

Duration: 1:53:34

Speaker: Dan Felsky
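The lesson above reads PRS into R and runs basic association analyses; the core step can be sketched in Python as a simple regression of a phenotype on a polygenic risk score. Everything below (simulated scores, effect size, sample size) is illustrative and not from the tutorial.

```python
import numpy as np

# Toy association test between a polygenic risk score and a phenotype.
# (The lesson does this in R on PRSice output; this Python analogue
# uses simulated data purely for illustration.)
rng = np.random.default_rng(1)
n = 500
prs = rng.normal(size=n)                     # standardized PRS
phenotype = 0.3 * prs + rng.normal(size=n)   # simulated true effect = 0.3

# Ordinary least squares: phenotype ~ intercept + PRS.
X = np.column_stack([np.ones(n), prs])
beta = np.linalg.lstsq(X, phenotype, rcond=None)[0]
r = np.corrcoef(prs, phenotype)[0, 1]
print(f"estimated PRS effect: {beta[1]:.2f}, correlation: {r:.2f}")
```

The fitted slope recovers the simulated effect up to sampling noise, which is the basic logic of a PRS association analysis.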


This lesson describes the principles underlying functional magnetic resonance imaging (fMRI), diffusion-weighted imaging (DWI), tractography, and parcellation. These tools and concepts are explained in a broader context of neural connectivity and mental health.

Difficulty level: Intermediate

Duration: 1:47:22

Speaker: Erin Dickie and John Griffiths

This lesson breaks down the principles of Bayesian inference and how it relates to cognitive processes and functions like learning and perception. It then explains how cognitive models can be built using Bayesian statistics to investigate how our brains interface with their environment.

This lesson corresponds to slides 1-64 in the PDF below.

Difficulty level: Intermediate

Duration: 1:28:14

Speaker: Andreea Diaconescu
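The core Bayesian move the lesson describes, combining a prior belief with noisy sensory evidence, can be shown with a toy conjugate-Gaussian update (the numbers below are mine, not from the slides): the posterior mean is a precision-weighted average of prior and observation.

```python
# Toy Bayesian perception sketch (illustrative; not from the lesson slides).
# A Gaussian prior belief about a stimulus and a noisy Gaussian measurement
# combine into a precision-weighted posterior.

prior_mean, prior_var = 0.0, 4.0   # what the observer expects
obs, obs_var = 2.0, 1.0            # what the senses report

prior_prec = 1.0 / prior_var
obs_prec = 1.0 / obs_var

post_prec = prior_prec + obs_prec  # precisions add
post_mean = (prior_prec * prior_mean + obs_prec * obs) / post_prec

print(post_mean)        # pulled toward the more precise observation
print(1.0 / post_prec)  # posterior variance is below both inputs
```

Here the posterior mean lands at 1.6, closer to the observation than the prior because the observation is four times more precise, and the posterior variance (0.8) is smaller than either input, which is the hallmark of Bayesian cue combination.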

This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).

This lesson corresponds to slides 65-90 of the PDF below.

Difficulty level: Intermediate

Duration: 1:15:04

Speaker: Daniel Hauke

This lesson briefly goes over the outline of the Neuroscience for Machine Learners course.

Difficulty level: Intermediate

Duration: 3:05

Speaker: Dan Goodman

This lesson delves into the structure of one of the brain's most elemental computational units, the neuron, and how that structure influences computational neural network models.

Difficulty level: Intermediate

Duration: 6:33

Speaker: Marcus Ghosh

In this lesson you will learn how machine learners and neuroscientists construct abstract computational models based on various neurophysiological signalling properties.

Difficulty level: Intermediate

Duration: 10:52

Speaker: Dan Goodman
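A standard example of the kind of abstraction the lesson discusses is the leaky integrate-and-fire (LIF) neuron. The sketch below (parameter values and integration details are mine, chosen as typical textbook numbers) integrates the membrane equation with forward Euler and resets on threshold crossing.

```python
# Leaky integrate-and-fire neuron, forward-Euler integration.
# A common abstraction of neurophysiological signalling; the specific
# parameter values here are illustrative textbook defaults.
dt, T = 0.1, 100.0             # time step and duration (ms)
tau, v_rest = 10.0, -70.0      # membrane time constant (ms), resting potential (mV)
v_thresh, v_reset = -55.0, -75.0
R, I = 10.0, 2.0               # membrane resistance (MOhm), input current (nA)

v = v_rest
spikes = []
for step in range(int(T / dt)):
    # dv/dt = (-(v - v_rest) + R*I) / tau : leak toward rest, driven by input
    v += dt * (-(v - v_rest) + R * I) / tau
    if v >= v_thresh:          # threshold crossing: emit a spike, then reset
        spikes.append(step * dt)
        v = v_reset

print(f"{len(spikes)} spikes in {T:.0f} ms")
```

With this constant drive the steady-state voltage (-50 mV) sits above threshold, so the neuron fires regularly, roughly every 16 ms after the first spike.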

This lesson contains practical exercises which accompany the first few lessons of the Neuroscience for Machine Learners (Neuro4ML) course.

Difficulty level: Intermediate

Duration: 5:58

Speaker: Dan Goodman

This lesson goes over the basic mechanisms of neural synapses, the junctions between neurons across which signals are transmitted.

Difficulty level: Intermediate

Duration: 7:03

Speaker: Marcus Ghosh

While the previous lesson in the Neuro4ML course dealt with the mechanisms involved in individual synapses, this lesson discusses how synapses and their neurons' firing patterns may change over time.

Difficulty level: Intermediate

Duration: 4:48

Speaker: Marcus Ghosh
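One widely used model of how synapses change over time is pair-based spike-timing-dependent plasticity (STDP). The sketch below (amplitudes and time constant are illustrative choices, not values from the lesson) implements the classic exponential learning window: pre-before-post spiking potentiates the synapse, post-before-pre depresses it.

```python
import math

# Pair-based STDP learning window (illustrative parameters).
# dt_ms = t_post - t_pre: positive means the presynaptic spike came first.
A_plus, A_minus, tau = 0.01, 0.012, 20.0  # amplitudes and time constant (ms)

def stdp_dw(dt_ms):
    """Weight change for one pre/post spike pair."""
    if dt_ms > 0:    # pre fired before post -> potentiation
        return A_plus * math.exp(-dt_ms / tau)
    elif dt_ms < 0:  # post fired before pre -> depression
        return -A_minus * math.exp(dt_ms / tau)
    return 0.0

print(stdp_dw(10.0))   # positive, shrinking as |dt| grows
print(stdp_dw(-10.0))  # negative
```

The exponential decay means only near-coincident spike pairs change the weight appreciably, which is what ties this rule to the timing-based view of plasticity.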

This lesson goes over some examples of how machine learners and computational neuroscientists go about designing and building neural network models inspired by biological brain systems.

Difficulty level: Intermediate

Duration: 12:52

Speaker: Dan Goodman

This lesson introduces some practical exercises which accompany the Synapses and Networks portion of this Neuroscience for Machine Learners course.

Difficulty level: Intermediate

Duration: 3:51

Speaker: Dan Goodman

This lesson characterizes different types of learning in a neuroscientific and cellular context, and various models employed by researchers to investigate the mechanisms involved.

Difficulty level: Intermediate

Duration: 3:54

Speaker: Dan Goodman

In this lesson, you will learn about different approaches to modeling learning in neural networks, particularly focusing on how system parameters such as firing rates and synaptic weights impact a network.

Difficulty level: Intermediate

Duration: 9:40

Speaker: Dan Goodman

In this lesson, you will learn about some of the many methods to train spiking neural networks (SNNs) either without using gradients at all, or by using gradients only in a limited or constrained way.

Difficulty level: Intermediate

Duration: 5:14

Speaker: Dan Goodman

In this lesson, you will learn how to train spiking neural networks (SNNs) with a surrogate gradient method.

Difficulty level: Intermediate

Duration: 11:23

Speaker: Dan Goodman
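The surrogate gradient idea in the lesson above can be shown in miniature without a deep learning framework: the spike nonlinearity is a hard threshold whose true derivative is zero almost everywhere, so the backward pass substitutes a smooth surrogate. The particular surrogate below (a "fast sigmoid" shape) and its sharpness parameter are common choices, used here only as an illustration.

```python
import numpy as np

# Surrogate-gradient sketch: forward pass uses a hard threshold,
# backward pass would use a smooth stand-in derivative instead.
beta = 10.0  # surrogate sharpness (illustrative)

def spike_forward(v):
    # Heaviside step: spike (1.0) if membrane potential exceeds threshold 0.
    return (v > 0).astype(float)

def spike_surrogate_grad(v):
    # Fast-sigmoid-style surrogate derivative: 1 / (beta*|v| + 1)^2.
    # Peaks at the threshold and decays away from it, unlike the true
    # derivative of the step, which is zero everywhere except at 0.
    return 1.0 / (beta * np.abs(v) + 1.0) ** 2

v = np.array([-0.5, -0.01, 0.0, 0.01, 0.5])
print(spike_forward(v))         # hard 0/1 spikes
print(spike_surrogate_grad(v))  # largest near threshold, small far away
```

Training then backpropagates through `spike_surrogate_grad` while the network's actual dynamics still use the binary `spike_forward`, which is what lets gradient descent work on discrete spikes.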

This lesson explores how researchers try to understand neural networks, particularly by observing neural activity.

Difficulty level: Intermediate

Duration: 8:20

Speaker: Marcus Ghosh
