Course:

In this tutorial, users will learn how to identify and remove background noise, or "blur", an important step in isolating cell bodies from image data.

Difficulty level: Intermediate

Duration: 17:08

Speaker: Mike X. Cohen

This lesson teaches users how MATLAB can be used to apply image processing techniques to identify cell bodies based on contiguity.

Difficulty level: Intermediate

Duration: 11:23

Speaker: Mike X. Cohen
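The contiguity-based identification described above can be illustrated with a small sketch. The course itself works in MATLAB (where connected-component labeling does this job); as a rough Python analogue, assuming a pre-thresholded binary mask, `label_components` below is a hypothetical flood-fill implementation, not the lesson's actual code:

```python
import numpy as np

def label_components(mask):
    """Label 4-connected components of a binary mask
    (a rough analogue of MATLAB's bwlabel)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                current += 1                 # start a new component
                stack = [(i, j)]
                while stack:                 # flood-fill its pixels
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y, x] and labels[y, x] == 0:
                        labels[y, x] = current
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, current

# Toy mask: 1 = pixel above threshold, 0 = background.
mask = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 0, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
])
labels, n_cells = label_components(mask)   # two contiguous "cell bodies"
```

Each contiguous cluster of foreground pixels receives its own integer ID, which is what lets later steps treat each cell body separately.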

This tutorial demonstrates how to extract the time course of calcium activity from each cluster of neuron somata and store the data in a MATLAB matrix.

Difficulty level: Intermediate

Duration: 22:41

Speaker: Mike X. Cohen
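As a rough illustration of this kind of extraction (the course works in MATLAB; this is a hypothetical Python sketch with made-up data), the mean signal of each labeled cluster can be collected into a ROIs-by-frames matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, h, w = 100, 4, 5
movie = rng.random((n_frames, h, w))   # toy imaging movie: frames x height x width

# Hypothetical ROI labels from a prior segmentation step:
# 0 = background, 1..n_rois = contiguous clusters of somata.
labels = np.zeros((h, w), dtype=int)
labels[0:2, 0:2] = 1
labels[2:4, 3:5] = 2
n_rois = labels.max()

# One row per ROI, one column per frame:
# the mean calcium signal over each ROI's pixels.
traces = np.zeros((n_rois, n_frames))
for roi in range(1, n_rois + 1):
    traces[roi - 1] = movie[:, labels == roi].mean(axis=1)
```

The resulting `traces` matrix has one time course per cluster, ready for downstream analysis.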

This lesson demonstrates how to use MATLAB to implement a multivariate dimension reduction method, PCA, on time series data.

Difficulty level: Intermediate

Duration: 17:19

Speaker: Mike X. Cohen
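A minimal sketch of PCA on a channels-by-time matrix, computed via SVD of the mean-centered data. The toy signals and mixing weights are made up for illustration; the lesson's actual MATLAB workflow may differ:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy multichannel time series: 50 channels driven by 2 latent signals.
t = np.linspace(0, 10, 1000)
sources = np.vstack([np.sin(2 * np.pi * t), np.cos(3 * np.pi * t)])
mixing = rng.standard_normal((50, 2))          # hypothetical channel weights
data = mixing @ sources + 0.1 * rng.standard_normal((50, 1000))

# PCA via SVD of the mean-centered data matrix (channels x time).
centered = data - data.mean(axis=1, keepdims=True)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

explained = S**2 / np.sum(S**2)   # fraction of variance per component
scores = Vt[:2]                   # time courses of the top two components
```

Because the toy data are rank-2 plus small noise, the first two components should capture nearly all the variance, which is exactly the dimension-reduction idea the lesson covers.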

This lesson describes the principles underlying functional magnetic resonance imaging (fMRI), diffusion-weighted imaging (DWI), tractography, and parcellation. These tools and concepts are explained in a broader context of neural connectivity and mental health.

Difficulty level: Intermediate

Duration: 1:47:22

Speaker: Erin Dickie and John Griffiths

This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).

This lesson corresponds to slides 65-90 of the PDF below.

Difficulty level: Intermediate

Duration: 1:15:04

Speaker: Daniel Hauke

Similarity Network Fusion (SNF) is a computational method for data integration across various kinds of measurements, aimed at taking advantage of the common as well as complementary information in different data types. This workshop walks participants through running SNF on EEG and genomic data using RStudio.

Difficulty level: Intermediate

Duration: 1:21:38

Speaker: Dan Felsky

This lesson briefly goes over the outline of the Neuroscience for Machine Learners course.

Difficulty level: Intermediate

Duration: 3:05

Speaker: Dan Goodman

This lesson delves into the structure of one of the brain's most elemental computational units, the neuron, and how that structure influences computational neural network models.

Difficulty level: Intermediate

Duration: 6:33

Speaker: Marcus Ghosh

In this lesson you will learn how machine learners and neuroscientists construct abstract computational models based on various neurophysiological signalling properties.

Difficulty level: Intermediate

Duration: 10:52

Speaker: Dan Goodman
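A classic example of such an abstraction is the leaky integrate-and-fire neuron, which reduces a neuron's signalling to leaky integration plus a threshold. The sketch below is illustrative (parameters and model choice are assumptions, not necessarily what the lesson uses):

```python
import numpy as np

def lif(input_current, tau=20.0, v_reset=0.0, v_thresh=1.0, dt=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential leaks
    toward the input current; when it crosses threshold, the neuron
    emits a spike and the potential resets."""
    v, spikes = 0.0, []
    for i, current in enumerate(input_current):
        v += dt * (current - v) / tau     # leaky integration step
        if v >= v_thresh:
            spikes.append(i)              # record spike time (step index)
            v = v_reset
    return spikes

# Constant supra-threshold drive produces regular spiking.
spikes = lif(np.full(200, 1.5))
```

With a constant drive above threshold, the model fires at a regular rate, illustrating how a simple equation captures one core neurophysiological property.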

This lesson goes over the basic mechanisms of neural synapses, the junctions between neurons across which signals are transmitted.

Difficulty level: Intermediate

Duration: 7:03

Speaker: Marcus Ghosh

While the previous lesson in the Neuro4ML course dealt with the mechanisms involved in individual synapses, this lesson discusses how synapses and their neurons' firing patterns may change over time.

Difficulty level: Intermediate

Duration: 4:48

Speaker: Marcus Ghosh

This lesson introduces some practical exercises which accompany the Synapses and Networks portion of this Neuroscience for Machine Learners course.

Difficulty level: Intermediate

Duration: 3:51

Speaker: Dan Goodman

This lesson describes spike timing-dependent plasticity (STDP), a biological process that adjusts the strength of connections between neurons in the brain, and how one can implement or mimic this process in a computational model. You will also find links for practical exercises at the bottom of this page.

Difficulty level: Intermediate

Duration: 12:50

Speaker: Dan Goodman
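A common pair-based formulation of STDP weighs each pre/post spike pair by an exponential of their timing difference: pre-before-post strengthens the synapse, post-before-pre weakens it. The parameter values below are illustrative, not necessarily those from the lesson:

```python
import numpy as np

def stdp(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for a spike-time difference
    delta_t = t_post - t_pre (in ms).

    delta_t > 0 (pre fires before post): potentiation, decaying
    exponentially with the delay; delta_t < 0: depression."""
    if delta_t > 0:
        return a_plus * np.exp(-delta_t / tau)
    else:
        return -a_minus * np.exp(delta_t / tau)
```

For example, `stdp(5.0)` is positive (the presynaptic spike plausibly helped cause the postsynaptic one), while `stdp(-5.0)` is negative; both effects shrink as the spikes move further apart in time.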

In this lesson, you will learn about some of the many methods for training spiking neural networks (SNNs) that either avoid gradients entirely or use them only in a limited, constrained way.

Difficulty level: Intermediate

Duration: 5:14

Speaker: Dan Goodman

In this lesson, you will learn how to train spiking neural networks (SNNs) with a surrogate gradient method.

Difficulty level: Intermediate

Duration: 11:23

Speaker: Dan Goodman
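The core trick behind surrogate gradients is to keep the hard, non-differentiable spike threshold in the forward pass but substitute a smooth derivative in the backward pass. The fast-sigmoid surrogate and the `beta` value below are illustrative choices, not necessarily the ones used in the lesson:

```python
import numpy as np

def spike_forward(v, threshold=1.0):
    """Forward pass: hard threshold (Heaviside step), whose true
    gradient is zero almost everywhere and undefined at threshold."""
    return (np.asarray(v) >= threshold).astype(float)

def spike_surrogate_grad(v, threshold=1.0, beta=10.0):
    """Backward-pass stand-in: derivative of a fast sigmoid, a smooth
    bump centered on the threshold, used in place of the Heaviside's
    gradient so that errors can propagate through spiking units."""
    return beta / (2 * (1 + beta * np.abs(np.asarray(v) - threshold)) ** 2)
```

The surrogate is largest exactly at threshold, so membrane potentials close to firing receive the strongest learning signal, which is what makes gradient-based training of SNNs workable.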

This lesson explores how researchers try to understand neural networks, particularly by observing neural activity.

Difficulty level: Intermediate

Duration: 8:20

Speaker: Marcus Ghosh

The previous lesson of this course described how researchers acquire neural data; this lesson discusses how to interpret and analyse that data.

Difficulty level: Intermediate

Duration: 9:24

Speaker: Marcus Ghosh

In this lesson you will learn about the motivation behind manipulating neural activity, and what forms that may take in various experimental designs.

Difficulty level: Intermediate

Duration: 8:42

Speaker: Marcus Ghosh

In this lesson, you will learn about one particular aspect of decision making: reaction times. In other words, how long does it take to make a decision based on a stream of information arriving continuously over time?

Difficulty level: Intermediate

Duration: 6:01

Speaker: Dan Goodman
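Evidence-accumulation models such as drift diffusion are a standard way to formalize this question: noisy evidence is integrated over time until it crosses a decision bound, and the crossing time is the reaction time. Whether the lesson uses this exact model is an assumption, and the parameters below are illustrative:

```python
import numpy as np

def ddm_trial(drift=0.1, noise=1.0, bound=10.0, dt=1.0, rng=None):
    """Simulate one drift-diffusion trial: accumulate noisy evidence
    until it crosses +bound (choice A) or -bound (choice B).
    Returns (choice, reaction_time_in_steps)."""
    if rng is None:
        rng = np.random.default_rng()
    x, t = 0.0, 0
    while abs(x) < bound:
        # Drift pulls toward the correct bound; noise produces
        # variable reaction times and occasional errors.
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += 1
    return (1 if x > 0 else -1), t

rng = np.random.default_rng(0)
trials = [ddm_trial(rng=rng) for _ in range(200)]
choices, rts = zip(*trials)
```

With a positive drift, most trials end at the upper bound, and the spread of `rts` across trials reproduces the variable reaction times the lesson is concerned with.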
