
This lecture provides an overview of successful open-access projects aimed at describing complex neuroscientific models, and makes a case for expanded use of resources in support of reproducibility and validation of models against experimental data.

Difficulty level: Beginner
Duration: 1:00:39
Speaker: Sharon Crook

This lecture provides an introduction to the Brain Imaging Data Structure (BIDS), a standard for organizing human neuroimaging datasets (a minimal example of the expected directory layout is sketched below).

Difficulty level: Intermediate
Duration: 56:49
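
As a rough companion to the BIDS lecture above, here is a hypothetical sketch, in plain Python, of the directory skeleton BIDS prescribes. File names such as sub-01_T1w.nii.gz follow the BIDS naming pattern, but the dataset itself is invented for illustration.

```python
import json
from pathlib import Path

# Create a toy BIDS-style skeleton: one subject with anatomical and functional folders.
root = Path("my_bids_dataset")
for sub_dir in ["sub-01/anat", "sub-01/func"]:
    (root / sub_dir).mkdir(parents=True, exist_ok=True)

# Every BIDS dataset needs a dataset_description.json at the top level.
(root / "dataset_description.json").write_text(
    json.dumps({"Name": "Toy dataset", "BIDSVersion": "1.8.0"}, indent=2)
)

# Imaging files follow the sub-<label>_<suffix> naming pattern, for example:
#   sub-01/anat/sub-01_T1w.nii.gz
#   sub-01/func/sub-01_task-rest_bold.nii.gz
```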

This lesson provides an overview of Neurodata Without Borders (NWB), an ecosystem for neurophysiology data standardization. The lecture also introduces some NWB-enabled tools (a minimal file-writing sketch appears below).

Difficulty level: Beginner
Duration: 29:53
Speaker: Oliver Ruebel
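
As a companion to the NWB overview above, here is a minimal sketch, assuming the pynwb package is installed, of creating an NWB file that holds a single acquisition time series. The file name and signal are invented for illustration.

```python
from datetime import datetime, timezone

import numpy as np
from pynwb import NWBHDF5IO, NWBFile, TimeSeries

# Minimal metadata required for an NWB file.
nwbfile = NWBFile(
    session_description="toy recording session",
    identifier="example-session-001",
    session_start_time=datetime.now(timezone.utc),
)

# Attach a simulated 1 kHz signal as an acquisition time series.
signal = TimeSeries(
    name="membrane_potential",
    data=np.random.randn(1000),
    unit="volts",
    rate=1000.0,
)
nwbfile.add_acquisition(signal)

# Write the file to disk in the NWB (HDF5) format.
with NWBHDF5IO("toy_session.nwb", mode="w") as io:
    io.write(nwbfile)
```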

This lesson outlines Neurodata Without Borders (NWB), a data standard for neurophysiology which provides neuroscientists with a common standard to share, archive, use, and build analysis tools for neurophysiology data.

Difficulty level: Intermediate
Duration: 29:53
Speaker: Oliver Ruebel

This lecture covers the rationale for developing the DAQCORD, a framework for the design, documentation, and reporting of data curation methods in order to advance the scientific rigour, reproducibility, and analysis of data.

Difficulty level: Intermediate
Duration: 17:08
Speaker: Ari Ercole

This tutorial demonstrates how to use PyNN, a simulator-independent language for building neuronal network models, in conjunction with the neuromorphic hardware system SpiNNaker (a minimal PyNN sketch appears below).

Difficulty level: Intermediate
Duration: 25:49
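
For orientation, here is a minimal PyNN sketch of the kind of model such a tutorial builds, assuming the sPyNNaker backend is installed; swapping the import for pyNN.nest or pyNN.brian2 runs the same script on a software simulator. Population sizes and parameters are arbitrary.

```python
# Assumes the sPyNNaker backend; use pyNN.nest or pyNN.brian2 for a software simulator.
import pyNN.spiNNaker as sim

sim.setup(timestep=1.0)  # ms

# A Poisson spike source driving a population of integrate-and-fire neurons.
stim = sim.Population(10, sim.SpikeSourcePoisson(rate=20.0), label="input")
neurons = sim.Population(100, sim.IF_curr_exp(), label="excitatory")

sim.Projection(
    stim, neurons,
    sim.AllToAllConnector(),
    synapse_type=sim.StaticSynapse(weight=0.5, delay=1.0),
)

neurons.record("spikes")
sim.run(1000.0)  # ms

spikes = neurons.get_data("spikes")
sim.end()
```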

In this lesson, you will learn in more detail about neuromorphic computing, that is, non-standard computational architectures that mimic some aspect of the way the brain works. 

Difficulty level: Intermediate
Duration: 10:08
Speaker: Dan Goodman

This video provides a very quick introduction to some neuromorphic sensing devices and how they enable unique, low-power applications.

Difficulty level: Intermediate
Duration: 2:37
Speaker: Dan Goodman

This lesson is a general overview of overarching concepts in neuroinformatics research, with a particular focus on clinical approaches to defining, measuring, studying, diagnosing, and treating various brain disorders. Also described are the complex, multi-level nature of brain disorders and the data associated with them, from genes and individual cells up to cortical microcircuits and whole-brain network dynamics. Given the heterogeneity of brain disorders and their underlying mechanisms, this lesson lays out a case for multiscale neuroscience data integration.

Difficulty level: Intermediate
Duration: 1:09:33
Speaker: Sean Hill

This lesson breaks down the principles of Bayesian inference and how it relates to cognitive processes and functions like learning and perception. It then explains how cognitive models can be built using Bayesian statistics to investigate how our brains interface with their environment (a toy worked example of Bayes' rule appears below).

This lesson corresponds to slides 1-64 in the PDF below. 

Difficulty level: Intermediate
Duration: 1:28:14
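
To make the core idea concrete, here is a toy example of Bayes' rule over two discrete hypotheses; the prior and likelihood values are invented and are not taken from the lecture.

```python
import numpy as np

# Two hypotheses about the cause of a sensory observation.
prior = np.array([0.7, 0.3])        # P(H): beliefs before seeing the data
likelihood = np.array([0.2, 0.9])   # P(data | H): how well each hypothesis explains the data

# Bayes' rule: the posterior is proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()        # normalize so the probabilities sum to 1

print(posterior)                    # approximately [0.34, 0.66]
```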

Whereas the previous two lessons described the biophysical and signalling properties of individual neurons, this lesson describes the properties of those units when they are part of larger networks.

Difficulty level: Intermediate
Duration: 6:00
Speaker: Marcus Ghosh

This lesson goes over examples of how machine learning researchers and computational neuroscientists design and build neural network models inspired by biological brain systems.

Difficulty level: Intermediate
Duration: 12:52
Speaker: Dan Goodman

This lecture and tutorial focus on measuring human functional brain networks, as well as how to account for the inherent variability within those networks (a minimal connectivity-matrix sketch appears below).

Difficulty level: Intermediate
Duration: 50:44
Speaker: Caterina Gratton
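
As a minimal illustration of the kind of object discussed in the lecture, the sketch below computes a functional connectivity matrix as the Pearson correlation between simulated regional time series; the data are random and merely stand in for parcellated fMRI signals.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: 200 time points for each of 10 brain regions.
timeseries = rng.standard_normal((200, 10))

# Functional connectivity as the region-by-region Pearson correlation matrix.
connectivity = np.corrcoef(timeseries, rowvar=False)  # shape (10, 10)
print(connectivity.shape)
```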

This lecture presents an overview of functional brain parcellations, as well as a set of tutorials on bootstrap aggregation of stable clusters (BASC) for fMRI brain parcellation (a schematic clustering sketch appears below).

Difficulty level: Advanced
Duration: 50:28
Speaker: Pierre Bellec
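
The sketch below is a schematic illustration, not the BASC implementation used in the tutorials: it bootstraps a toy dataset, clusters "voxels" with scikit-learn's KMeans, and accumulates a stability matrix recording how often pairs of voxels fall in the same cluster.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
voxels, timepoints = 300, 100
X = rng.standard_normal((voxels, timepoints))  # toy voxel-by-time data

n_boot, n_clusters = 20, 5
stability = np.zeros((voxels, voxels))

for _ in range(n_boot):
    # Resample time points with replacement, then cluster voxels.
    idx = rng.integers(0, timepoints, timepoints)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X[:, idx])
    # Count voxel pairs assigned to the same cluster on this resample.
    stability += labels[:, None] == labels[None, :]

stability /= n_boot  # fraction of resamples in which each pair co-clustered
```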

This is the first of two workshops on reproducibility in science, during which participants are introduced to the concepts of FAIR and open science. After discussing the definition of and need for FAIR science, participants are walked through tutorials on installing and using GitHub and Docker, powerful open-source tools for versioning and publishing code and software, respectively.

Difficulty level: Intermediate
Duration: 1:20:58

In this lesson, while learning about the need for increased large-scale collaborative science that is transparent in nature, users are also given a tutorial on using Synapse to facilitate reusable and reproducible research.

Difficulty level: Beginner
Duration: 1:15:12
Speaker: Abhi Pratap

This lesson contains the first part of the lecture Data Science and Reproducibility. You will learn about the development of data science and what the term currently encompasses, as well as how neuroscience and data science intersect. 

Difficulty level: Beginner
Duration: 32:18
Speaker: Ariel Rokem

The lecture provides an overview of the core skills and practical solutions required to practice reproducible research.

Difficulty level: Beginner
Duration: 1:25:17
Speaker: Fernando Perez

This lecture provides an introduction to reproducibility issues within the fields of neuroimaging and fMRI, as well as an overview of tools and resources being developed to alleviate the problem.

Difficulty level: Beginner
Duration: 1:03:07
Speaker: Russell Poldrack

This lecture provides a historical perspective on reproducibility in science, as well as the current limitations of neuroimaging studies. It also lays out a case for the use of meta-analyses, outlining available resources for conducting such analyses.

Difficulty level: Beginner
Duration: 55:39
Speaker: Angela Laird