This lecture provides an introduction to the course "Cognitive Science & Psychology: Mind, Brain, and Behavior".

Difficulty level: Beginner
Duration: 1:06:49

This lesson covers the history of neuroscience and machine learning, and the story of how these two seemingly disparate fields are increasingly merging. 

Difficulty level: Beginner
Duration: 12:25
Speaker: Dan Goodman

In this lesson you will learn how machine learners and neuroscientists construct abstract computational models based on various neurophysiological signalling properties. 

Difficulty level: Intermediate
Duration: 10:52
Speaker: Dan Goodman

In this lesson, you will learn about some typical neuronal models employed by machine learners and computational neuroscientists, meant to imitate the biophysical properties of real neurons; a minimal sketch of one such model follows this entry.

Difficulty level: Intermediate
Duration: 3:12
Speaker: Dan Goodman
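
As a brief illustration of the kind of simplified neuron model the lesson above refers to, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron in Python. This is not the lesson's own code; the function name and parameter values are illustrative assumptions.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=0.02, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron driven by a time series of input current."""
    v = v_rest
    spike_times = []
    voltages = np.empty(len(input_current))
    for t, i_t in enumerate(input_current):
        v += dt * (v_rest - v + i_t) / tau   # leaky integration (forward Euler step)
        if v >= v_threshold:                 # threshold crossing emits a spike...
            spike_times.append(t * dt)
            v = v_reset                      # ...and resets the membrane potential
        voltages[t] = v
    return np.array(spike_times), voltages

# Constant suprathreshold drive produces regular spiking.
spikes, trace = simulate_lif(np.full(1000, 1.5))
print(f"{len(spikes)} spikes in 1 s of simulated time")
```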

Whereas the previous two lessons described the biophysical and signalling properties of individual neurons, this lesson describes properties of those units when part of larger networks. 

Difficulty level: Intermediate
Duration: 6:00
Speaker: Marcus Ghosh

This lesson goes over some examples of how machine learners and computational neuroscientists go about designing and building neural network models inspired by biological brain systems. 

Difficulty level: Intermediate
Duration: 12:52
Speaker: Dan Goodman

In this lesson, you will learn about different approaches to modeling learning in neural networks, focusing in particular on how system parameters such as firing rates and synaptic weights impact a network; a toy illustration follows this entry.

Difficulty level: Intermediate
Duration: 9:40
Speaker: Dan Goodman
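
To make the framing of the lesson above concrete, here is a toy, hedged sketch of a Hebbian rate-based weight update in Python, showing how presynaptic firing rates and synaptic weights jointly shape what a small circuit learns. The rule, sizes, and learning rate are illustrative assumptions, not material from the lesson.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy rate-based circuit: 5 presynaptic neurons driving 1 postsynaptic neuron.
weights = rng.normal(scale=0.1, size=5)
learning_rate = 0.01

for _ in range(100):
    pre_rates = rng.random(5)                          # presynaptic firing rates in [0, 1)
    post_rate = weights @ pre_rates                    # linear rate model of the postsynaptic response
    weights += learning_rate * post_rate * pre_rates   # Hebbian update: co-active units strengthen their synapse
    weights /= np.linalg.norm(weights)                 # normalisation keeps the weights bounded

print("final weights:", np.round(weights, 3))
```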

In this lesson, you will learn more about some of the issues inherent in modeling neural spikes, approaches to ameliorate these problems, and the pros and cons of these approaches. 

Difficulty level: Intermediate
Duration: 5:31
Speaker: Dan Goodman

In this lesson, you will learn about some of the many methods for training spiking neural networks (SNNs) that either make no use of gradients or use gradients only in a limited or constrained way.

Difficulty level: Intermediate
Duration: 5:14
Speaker: Dan Goodman

In this lesson, you will learn how to train spiking neural networks (SNNs) with a surrogate gradient method; a brief sketch of the idea follows this entry.

Difficulty level: Intermediate
Duration: 11:23
Speaker: Dan Goodman
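
As a hedged sketch of the surrogate gradient idea mentioned above (assuming PyTorch and a fast-sigmoid surrogate; the lesson's own implementation may differ), the spike non-linearity can use a hard threshold in the forward pass and a smooth stand-in derivative in the backward pass:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike on the forward pass, smooth surrogate derivative on the backward pass."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()      # hard threshold: spike if the potential is above zero

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Replace the step function's zero-almost-everywhere derivative with
        # the derivative of a fast sigmoid (one common surrogate choice).
        surrogate = 1.0 / (1.0 + 10.0 * membrane_potential.abs()) ** 2
        return grad_output * surrogate

v = torch.randn(8, requires_grad=True)   # toy membrane potentials
spikes = SurrogateSpike.apply(v)
spikes.sum().backward()                  # gradients now flow through the spiking non-linearity
print(v.grad)
```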

This lesson explores how researchers try to understand neural networks, particularly by observing neural activity.

Difficulty level: Intermediate
Duration: 8:20
Speaker: Marcus Ghosh

In this lesson you will learn about the motivation behind manipulating neural activity, and what forms that may take in various experimental designs. 

Difficulty level: Intermediate
Duration: 8:42
Speaker: Marcus Ghosh

This video briefly goes over the exercises accompanying Week 6 of the Neuroscience for Machine Learners (Neuro4ML) course, Understanding Neural Networks.

Difficulty level: Intermediate
Duration: 2:43
Speaker: Marcus Ghosh

This lecture focuses on the structured validation process within computational neuroscience, including the tools, services, and methods involved in simulation and analysis.

Difficulty level: Beginner
Duration: 14:19
Speaker: Michael Denker

This module explains how neurons come together to create the networks that give rise to our thoughts. The totality of our neurons and their connections is called our connectome. Learn how this connectome changes as we learn and how it computes information.

Difficulty level: Beginner
Duration: 7:13
Speaker: Harrison Canning

In this lesson, while learning about the need for increased large-scale, transparent collaborative science, users are also given a tutorial on using Synapse to facilitate reusable and reproducible research.

Difficulty level: Beginner
Duration: 1:15:12
Speaker: Abhi Pratap

This lecture discusses what defines an integrative approach to research and methods, including various study designs and models that are appropriate choices when attempting to bridge data domains, a necessity for whole-person modelling.

Difficulty level: Beginner
Duration: 1:28:14
Speaker: Dan Felsky

Similarity Network Fusion (SNF) is a computational method for data integration across various kinds of measurements, aimed at taking advantage of the common as well as complementary information in different data types. This workshop walks participants through running SNF on EEG and genomic data using RStudio; a simplified sketch of the fusion idea follows this entry.

Difficulty level: Intermediate
Duration: 1:21:38
Speaker: Dan Felsky
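
The workshop above runs SNF with RStudio; as a heavily simplified, hedged sketch of the underlying cross-diffusion idea (in Python/NumPy, with toy data, and omitting the k-nearest-neighbour sparsification of the published algorithm), each modality's similarity network is repeatedly diffused through the others and the results are averaged into one fused network:

```python
import numpy as np

def rbf_affinity(X):
    """Row-normalised RBF (Gaussian) affinity matrix for one data modality."""
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    sigma2 = np.median(sq_dists)                    # simple bandwidth heuristic
    W = np.exp(-sq_dists / (2 * sigma2))
    return W / W.sum(axis=1, keepdims=True)

def fuse(affinities, iterations=20):
    """Simplified cross-diffusion: each network is updated through the average of the others."""
    P = [A.copy() for A in affinities]
    for _ in range(iterations):
        updated = []
        for i, A in enumerate(affinities):
            others = np.mean([P[j] for j in range(len(P)) if j != i], axis=0)
            updated.append(A @ others @ A.T)
        P = [Q / Q.sum(axis=1, keepdims=True) for Q in updated]
    return np.mean(P, axis=0)                       # fused subject-by-subject similarity network

rng = np.random.default_rng(0)
eeg_like = rng.normal(size=(30, 12))                # toy stand-in for EEG features (30 subjects)
genomic_like = rng.normal(size=(30, 50))            # toy stand-in for genomic features (same subjects)
fused = fuse([rbf_affinity(eeg_like), rbf_affinity(genomic_like)])
print(fused.shape)                                  # (30, 30)
```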

This lesson provides an introduction to the International Neuroinformatics Coordinating Facility (INCF), its mission towards FAIR neuroscience, and future directions.

Difficulty level: Beginner
Duration: 20:29
Speaker: Maryann Martone

This brief video provides an introduction to the third session of INCF's Neuroinformatics Assembly 2023, focusing on how to streamline cross-platform data integration in a neuroscientific context.

Difficulty level: Beginner
Duration: 5:55
Speaker: Bing-Xing Huo