
This lesson is a general overview of overarching concepts in neuroinformatics research, with a particular focus on clinical approaches to defining, measuring, studying, diagnosing, and treating various brain disorders. Also described are the complex, multi-level nature of brain disorders and the data associated with them, from genes and individual cells up to cortical microcircuits and whole-brain network dynamics. Given the heterogeneity of brain disorders and their underlying mechanisms, this lesson lays out a case for multiscale neuroscience data integration.

Difficulty level: Intermediate
Duration: 1:09:33
Speaker: Sean Hill

This lesson describes the fundamentals of genomics, from the central dogma to the design and implementation of genome-wide association studies (GWAS), to the computation, analysis, and interpretation of polygenic risk scores.

Difficulty level: Intermediate
Duration: 1:28:16
Speaker: Dan Felsky
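
To make the polygenic risk score (PRS) idea concrete, here is a minimal, self-contained Python (NumPy) sketch: a PRS is typically computed as the sum of an individual's risk-allele dosages weighted by GWAS effect sizes. All variable names and values below are illustrative and simulated, not taken from the lesson.

    import numpy as np

    rng = np.random.default_rng(0)
    n_individuals, n_snps = 5, 100

    # Simulated allele dosages (0, 1, or 2 risk-allele copies) and GWAS
    # summary-statistic effect sizes; real analyses use quality-controlled
    # genotypes and published summary statistics.
    dosages = rng.integers(0, 3, size=(n_individuals, n_snps))
    betas = rng.normal(0.0, 0.05, size=n_snps)

    # PRS: weighted sum of dosages per individual, then standardized
    prs = dosages @ betas
    prs_z = (prs - prs.mean()) / prs.std()
    print(prs_z)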

This lesson contains the slides (pptx) of a lecture discussing the concepts and tools needed to account for population stratification and admixture in the context of genome-wide association studies (GWAS). The free-access software Tractor and its advantages for GWAS are also discussed.

Difficulty level: Intermediate
Duration:
Speaker: Dan Felsky
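
As a rough illustration of the stratification adjustment discussed in the slides, the sketch below regresses a phenotype on a single SNP's dosage while including top genotype principal components (PCs) as covariates, a standard way of controlling for population structure in GWAS. It uses statsmodels and simulated data; all names and effect sizes are invented, and it does not reproduce the Tractor approach.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500
    genotype = rng.integers(0, 3, size=n).astype(float)    # dosage at one SNP
    pcs = rng.normal(size=(n, 4))                           # top 4 ancestry PCs
    phenotype = (0.2 * genotype                             # true SNP effect
                 + pcs @ np.array([0.5, -0.3, 0.1, 0.0])    # confounding by ancestry
                 + rng.normal(size=n))                      # noise

    # Regress the phenotype on the SNP with PCs as covariates
    X = sm.add_constant(np.column_stack([genotype, pcs]))
    fit = sm.OLS(phenotype, X).fit()
    print(fit.params[1], fit.pvalues[1])   # PC-adjusted SNP effect and p-value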

This lesson is an overview of transcriptomics, from fundamental concepts of the central dogma and RNA sequencing at the single-cell level, to how gene expression underlies diversity in cell phenotypes.

Difficulty level: Intermediate
Duration: 1:29:08

This is a continuation of the talk on the cellular mechanisms of neuronal communication, this time at the level of brain microcircuits and associated global signals like those measurable by electroencephalography (EEG). This lecture also discusses EEG biomarkers in mental health disorders, and how those cortical signatures may be simulated digitally.

Difficulty level: Intermediate
Duration: 1:11:04
Speaker: Etay Hay

This lesson breaks down the principles of Bayesian inference and how it relates to cognitive processes and functions like learning and perception. It then explains how cognitive models can be built using Bayesian statistics to investigate how our brains interface with their environment.

This lesson corresponds to slides 1-64 in the PDF below. 

Difficulty level: Intermediate
Duration: 1:28:14
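
For a concrete anchor to the Bayesian material, the short Python sketch below applies Bayes' rule to a toy perceptual inference problem: a prior over two hypotheses is combined with the likelihood of a noisy observation to yield a posterior. The hypotheses, prior, and likelihood values are invented for illustration.

    import numpy as np

    hypotheses = ["stimulus present", "stimulus absent"]
    prior = np.array([0.2, 0.8])        # prior beliefs
    likelihood = np.array([0.9, 0.3])   # P(observation | hypothesis)

    # Bayes' rule: posterior is proportional to prior times likelihood
    unnormalized = prior * likelihood
    posterior = unnormalized / unnormalized.sum()

    for h, p in zip(hypotheses, posterior):
        print(f"P({h} | observation) = {p:.3f}")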

This lecture covers the history of behaviorism and the ultimate challenge to it.

Difficulty level: Beginner
Duration: 1:19:08

This lecture covers various learning theories.

Difficulty level: Beginner
Duration: 1:00:42

This lecture covers how genetics can contribute to our understanding of psychiatric phenotypes.

Difficulty level: Beginner
Duration: 55:15
Speaker: Sven Cichon

The tutorial is intended primarily for beginners, but it will also be beneficial to experimentalists who understand electroencephalography and event-related techniques but need additional knowledge in the annotation, standardization, long-term storage, and publication of data.

Difficulty level: Beginner
Duration: 35:30

This lecture on functional brain parcellations and accompanying set of tutorials on bootstrap aggregation of stable clusters (BASC) for fMRI brain parcellation were part of the 2019 Neurohackademy, a two-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.

Difficulty level: Advanced
Duration: 50:28
Speaker: Pierre Bellec
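
The following sketch illustrates, in generic scikit-learn terms, the intuition behind bootstrap aggregation of stable clusters: cluster many bootstrap resamples of the data, accumulate how often pairs of regions co-cluster, and derive a consensus parcellation from that stability matrix. It is a schematic stand-in using simulated data, not the BASC implementation used in the tutorials.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    n_regions, n_timepoints, k, n_boot = 50, 200, 5, 20
    data = rng.normal(size=(n_regions, n_timepoints))   # stand-in for fMRI time series

    stability = np.zeros((n_regions, n_regions))
    for _ in range(n_boot):
        sample = rng.choice(n_timepoints, size=n_timepoints, replace=True)  # resample time points
        labels = KMeans(n_clusters=k, n_init=10).fit_predict(data[:, sample])
        stability += labels[:, None] == labels[None, :]
    stability /= n_boot   # pairwise co-clustering frequency across bootstraps

    # Consensus parcellation: cluster the stability matrix itself
    consensus = KMeans(n_clusters=k, n_init=10).fit_predict(stability)
    print(consensus)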

Since their introduction in 2016, the FAIR data principles have gained increasing recognition and adoption in global neuroscience. FAIR defines a set of high-level principles and practices for making digital objects, including data, software, and workflows, Findable, Accessible, Interoperable, and Reusable. But FAIR is not a specification; it leaves many of the specifics up to individual scientific disciplines to define. INCF has been leading the way in promoting, defining, and implementing FAIR data practices for neuroscience, bringing together researchers, infrastructure providers, industry, and publishers through our programs and networks. In this session, we will hear perspectives on FAIR neuroscience from some of these stakeholders who have been working to develop and use FAIR tools for neuroscience. We will engage in a discussion on questions such as: How is neuroscience doing with respect to FAIR? What have been the successes? What is currently very difficult? Where does neuroscience need to go?

This lecture covers FAIR atlases, including their background, their construction, and how they can be created in line with the FAIR principles.

Difficulty level: Beginner
Duration: 14:24
Speaker: Heidi Kleven

This lecture contains an overview of the Australian Electrophysiology Data Analytics Platform (AEDAPT), how it works, how to scale it, and how it fits into the FAIR ecosystem.

Difficulty level: Beginner
Duration: 18:56
Speaker: Tom Johnstone

Félix-Antoine Fortin from Calcul Québec gives an introduction to high-performance computing with the Compute Canada network, first providing an overview of use cases for HPC and then offering a hands-on tutorial. Though some examples might seem specific to Calcul Québec, all computing clusters in the Compute Canada network share the same software modules and environments.

The lesson was given in the context of the BrainHack School 2020.

Difficulty level: Beginner
Duration: 02:49:34
Speaker: Félix-Antoine Fortin