
This lesson is a general overview of overarching concepts in neuroinformatics research, with a particular focus on clinical approaches to defining, measuring, studying, diagnosing, and treating various brain disorders. Also described are the complex, multi-level nature of brain disorders and the data associated with them, from genes and individual cells up to cortical microcircuits and whole-brain network dynamics. Given the heterogeneity of brain disorders and their underlying mechanisms, this lesson lays out a case for multiscale neuroscience data integration.

Difficulty level: Intermediate
Duration: 1:09:33
Speaker: Sean Hill

This lesson breaks down the principles of Bayesian inference and how they relate to cognitive processes and functions like learning and perception. It then explains how cognitive models can be built using Bayesian statistics in order to investigate how our brains interface with their environment.
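As a toy illustration of the kind of Bayesian updating the lesson covers, the sketch below combines a prior belief with a likelihood to obtain a posterior over two perceptual hypotheses. It is a hypothetical example, not taken from the lecture slides; the hypothesis names and probabilities are arbitrary.

```python
# Toy Bayesian updating over two perceptual hypotheses (hypothetical example).
# posterior is proportional to likelihood * prior, normalized to sum to 1.

prior = {"object_present": 0.2, "object_absent": 0.8}       # belief before the observation
likelihood = {"object_present": 0.9, "object_absent": 0.3}  # P(noisy sensory cue | hypothesis)

unnormalized = {h: likelihood[h] * prior[h] for h in prior}
evidence = sum(unnormalized.values())                        # P(sensory cue)
posterior = {h: p / evidence for h, p in unnormalized.items()}

print(posterior)  # {'object_present': ~0.43, 'object_absent': ~0.57}
```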

This lesson corresponds to slides 1-64 in the PDF below. 

Difficulty level: Intermediate
Duration: 1:28:14

This lecture and tutorial focus on measuring human functional brain networks, as well as how to account for inherent variability within those networks.

Difficulty level: Intermediate
Duration: 50:44
Speaker: Caterina Gratton

Neuronify is an educational tool meant to build intuition for how neurons and neural networks behave. You can use it to combine neurons with different connections, just like the ones we have in our brain, and explore how changes to single cells lead to behavioral changes in important networks. Neuronify is based on an integrate-and-fire model of neurons, one of the simplest neuron models that exist. It focuses on the spike timing of a neuron and ignores the details of the action potential dynamics. These neurons are modeled as simple RC circuits: when the membrane potential rises above a certain threshold, a spike is generated and the voltage is reset to its resting potential. This spike then signals other neurons through its synapses.
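The description above can be summarized in a few lines of code. Below is a minimal leaky integrate-and-fire sketch, a generic illustration rather than Neuronify's actual implementation, with arbitrary but physiologically plausible parameter values: the membrane potential of an RC-circuit neuron is integrated over time and reset to rest whenever it crosses a firing threshold.

```python
import numpy as np

# Minimal leaky integrate-and-fire neuron (generic illustration, not Neuronify's code).
dt = 0.1e-3          # time step (s)
tau_m = 20e-3        # membrane time constant R*C (s)
v_rest = -70e-3      # resting potential (V)
v_threshold = -55e-3 # spike threshold (V)
R = 100e6            # membrane resistance (ohm)
I = 0.2e-9           # constant input current (A)

v = v_rest
spike_times = []
for step in range(int(0.5 / dt)):          # simulate 500 ms
    dv = (-(v - v_rest) + R * I) / tau_m   # RC-circuit membrane equation
    v += dv * dt
    if v >= v_threshold:                   # threshold crossing -> emit a spike
        spike_times.append(step * dt)
        v = v_rest                         # reset to resting potential

print(f"{len(spike_times)} spikes in 500 ms")
```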

Neuronify aims to provide a low entry point to simulation-based neuroscience.

Difficulty level: Beginner
Duration: 01:25
Speaker: Neuronify

This lecture gives a detailed description of how to process workflows in the virtual research environment (VRE), including approaches to standardization, metadata, and containerization, as well as the construction and maintenance of scientific pipelines.

Difficulty level: Intermediate
Duration: 1:03:55
Speaker: Patrik Bey

This lesson provides an overview of how to conceptualize, design, implement, and maintain neuroscientific pipelines via the cloud-based computational reproducibility platform Code Ocean.

Difficulty level: Beginner
Duration: 17:01
Speaker: David Feng

In this workshop talk, you will receive a tour of the Code Ocean ScienceOps Platform, a centralized cloud workspace for all teams. 

Difficulty level: Beginner
Duration: 10:24
Speaker: Frank Zappulla

This lecture covers a wide range of aspects regarding neuroinformatics and data governance, describing both their historical developments and current trajectories. Particular tools, platforms, and standards to make your research more FAIR are also discussed.

Difficulty level: Beginner
Duration: 54:58
Speaker: Franco Pestilli

This lecture introduces you to the basics of the Amazon Web Services public cloud. It covers the fundamentals of cloud computing and goes through both the motivations and processes involved in moving your research computing to the cloud.

Difficulty level: Intermediate
Duration: 3:09:12

This lecture discusses how FAIR practices affect personalized data models, including the associated workflows and challenges, and how these practices can be improved.

Difficulty level: Beginner
Duration: 13:16
Speaker: Kelly Shen

In this talk, you will learn how brainlife.io works, and how it can be applied to neuroscience data.

Difficulty level: Beginner
Duration: 10:14
Speaker: Franco Pestilli

As a part of NeuroHackademy 2020, this lecture delves into cloud computing, focusing on Amazon Web Services. 

Difficulty level: Beginner
Duration: 01:43:59

This talk presents an overview of CBRAIN, a web-based platform that allows neuroscientists to perform computationally intensive data analyses by connecting them to high-performance computing facilities across Canada and around the world.

Difficulty level: Beginner
Duration: 56:07
Speaker: Shawn Brown

This lesson continues from part one of the lecture Ontologies, Databases, and Standards, diving deeper into ontologies and knowledge graphs.

Difficulty level: Intermediate
Duration: 50:18
Speaker: Jeff Grethe

This lecture covers FAIR atlases, including their background and construction, as well as how they can be created in line with the FAIR principles.

Difficulty level: Beginner
Duration: 14:24
Speaker: Heidi Kleven

This lecture focuses on ontologies for clinical neurosciences.

Difficulty level: Intermediate
Duration: 21:54

Learn how to create a standard extracellular electrophysiology dataset in NWB using Python.
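For orientation, a minimal sketch of the workflow this tutorial walks through is shown below. It assumes a recent release of the PyNWB library; the device and group names, identifier, fake data, and output path are illustrative placeholders, not values from the tutorial.

```python
from datetime import datetime, timezone
import numpy as np
from pynwb import NWBFile, NWBHDF5IO
from pynwb.ecephys import ElectricalSeries

# Minimal extracellular-ephys NWB file (sketch only; assumes a recent PyNWB release).
nwbfile = NWBFile(
    session_description="example extracellular recording",
    identifier="example-ecephys-001",            # placeholder identifier
    session_start_time=datetime.now(timezone.utc),
)

device = nwbfile.create_device(name="array")     # placeholder device name
group = nwbfile.create_electrode_group(
    name="shank0", description="example shank", location="CA1", device=device
)
for _ in range(4):                               # four electrodes on the shank
    nwbfile.add_electrode(group=group, location="CA1")

electrodes = nwbfile.create_electrode_table_region(
    region=list(range(4)), description="all electrodes"
)
nwbfile.add_acquisition(
    ElectricalSeries(
        name="ElectricalSeries",
        data=np.random.randn(1000, 4),           # fake data: 1000 samples x 4 channels
        electrodes=electrodes,
        starting_time=0.0,
        rate=30000.0,                            # sampling rate in Hz
    )
)

with NWBHDF5IO("ecephys_example.nwb", "w") as io:
    io.write(nwbfile)
```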

Difficulty level: Intermediate
Duration: 23:10
Speaker: Ryan Ly

Learn how to create a standard calcium imaging dataset in NWB using Python.
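A minimal sketch of the corresponding calcium imaging workflow is shown below, again assuming a recent PyNWB release; the microscope, wavelengths, indicator, fake movie data, and output path are illustrative placeholders.

```python
from datetime import datetime, timezone
import numpy as np
from pynwb import NWBFile, NWBHDF5IO
from pynwb.ophys import OpticalChannel, TwoPhotonSeries

# Minimal calcium-imaging NWB file (sketch only; assumes a recent PyNWB release).
nwbfile = NWBFile(
    session_description="example two-photon session",
    identifier="example-ophys-001",              # placeholder identifier
    session_start_time=datetime.now(timezone.utc),
)

device = nwbfile.create_device(name="Microscope")    # placeholder device
channel = OpticalChannel(
    name="OpticalChannel", description="green channel", emission_lambda=510.0
)
imaging_plane = nwbfile.create_imaging_plane(
    name="ImagingPlane",
    optical_channel=channel,
    description="example imaging plane",
    device=device,
    excitation_lambda=920.0,
    indicator="GCaMP6f",                             # placeholder indicator
    location="V1",
    imaging_rate=30.0,
)

nwbfile.add_acquisition(
    TwoPhotonSeries(
        name="TwoPhotonSeries",
        data=np.random.rand(100, 64, 64),            # fake movie: 100 frames of 64x64 pixels
        imaging_plane=imaging_plane,
        rate=30.0,                                   # frame rate in Hz
        unit="normalized amplitude",
    )
)

with NWBHDF5IO("ophys_example.nwb", "w") as io:
    io.write(nwbfile)
```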

Difficulty level: Intermediate
Duration: 31:04
Speaker: Ryan Ly

In this tutorial, you will learn how to create a standard intracellular electrophysiology dataset in NWB using Python.
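A minimal sketch of an intracellular recording in NWB is shown below, assuming the modern icephys tables API in a recent PyNWB release; the amplifier name, placeholder waveforms, gains, and output path are illustrative, not values from the tutorial.

```python
from datetime import datetime, timezone
import numpy as np
from pynwb import NWBFile, NWBHDF5IO
from pynwb.icephys import VoltageClampStimulusSeries, VoltageClampSeries

# Minimal intracellular-ephys NWB file (sketch only; assumes a recent PyNWB release
# with the modern icephys tables API).
nwbfile = NWBFile(
    session_description="example patch-clamp session",
    identifier="example-icephys-001",            # placeholder identifier
    session_start_time=datetime.now(timezone.utc),
)

device = nwbfile.create_device(name="Amplifier") # placeholder device
electrode = nwbfile.create_icephys_electrode(
    name="elec0", description="example patch electrode", device=device
)

stimulus = VoltageClampStimulusSeries(
    name="stimulus",
    data=np.zeros(100),                          # placeholder command waveform
    starting_time=0.0,
    rate=10000.0,
    electrode=electrode,
    gain=0.02,
)
response = VoltageClampSeries(
    name="response",
    data=np.zeros(100),                          # placeholder recorded current
    starting_time=0.0,
    rate=10000.0,
    electrode=electrode,
    gain=0.02,
)

# One stimulus/response pair becomes one row of the intracellular recordings table.
nwbfile.add_intracellular_recording(
    electrode=electrode, stimulus=stimulus, response=response
)

with NWBHDF5IO("icephys_example.nwb", "w") as io:
    io.write(nwbfile)
```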

Difficulty level: Intermediate
Duration: 20:23
Speaker: Pamela Baker

In this tutorial, you will learn how to use the icephys-metadata extension to enter metadata detailing your experimental paradigm.

Difficulty level: Intermediate
Duration: 27:18
Speaker: Oliver Ruebel