This lecture presents the graphical (GUI) and command-line (CLI) user interfaces of TVB. Alongside the speakers, explore and interact with the tools needed to generate, manipulate, and visualize connectivity and network dynamics; a minimal scripting sketch follows this entry. Speakers: Paula Popa & Mihai Andrei

Difficulty level: Beginner
Duration: 1:02:16
Speakers: Paula Popa & Mihai Andrei
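As a companion to the GUI/CLI lecture above, here is a minimal sketch of the kind of scripting workflow TVB's command-line interface supports. It assumes the tvb-library package and its bundled demo connectivity are installed; the model and parameter choices are purely illustrative, not those used in the lecture.

```python
# Minimal sketch of a TVB scripting workflow (illustrative; assumes the
# tvb-library package and its bundled demo connectivity are installed).
from tvb.simulator.lab import *

# Load the default connectivity shipped with TVB and assemble a region-level simulator.
conn = connectivity.Connectivity.from_file()
sim = simulator.Simulator(
    model=models.Generic2dOscillator(),          # local neural-mass model
    connectivity=conn,                           # structural connectome
    coupling=coupling.Linear(),                  # long-range coupling function
    integrator=integrators.HeunDeterministic(dt=0.1),
    monitors=(monitors.TemporalAverage(period=1.0),),
)
sim.configure()

# Run a short simulation; each monitor returns a (time, data) pair.
(time, data), = sim.run(simulation_length=250.0)
print(data.shape)  # (time points, state variables, regions, modes)
```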

This lecture briefly introduces The Virtual Brain (TVB), a multi-scale, multi-modal neuroinformatics platform for full brain network simulations using biologically realistic connectivity, as well as its potential neuroscience applications: for example with epilepsy.

Difficulty level: Beginner
Duration: 8:53
Speaker: Petra Ritter

This lecture introduces the theoretical background and foundations that led to the development of TVB, as well as the architecture and features of its major software components.

Difficulty level: Beginner
Duration:
Speaker: Randy McIntosh

Audio-slides presentation to accompany the paper "An automated pipeline for constructing personalized virtual brains from multimodal neuroimaging data." Authors: M. Schirner, S. Rothmeier, V. Jirsa, A.R. McIntosh, P. Ritter.

Difficulty level: Beginner
Duration: 4:56
Speaker:

Computational models provide a framework for integrating data across spatial scales and for exploring hypotheses about the biological mechanisms underlying neuronal and network dynamics. However, as models increase in complexity, additional barriers emerge to the creation, exchange, and re-use of models. Successful projects have created standards for describing complex models in neuroscience and provide open-source tools to address these issues. This lecture provides an overview of these projects and makes a case for expanded use of these resources in support of reproducibility and validation of models against experimental data.

Difficulty level: Beginner
Duration: 1:00:39
Speaker: Sharon Crook

A brief overview of the Python programming language, with an emphasis on tools relevant to data scientists. This lecture was part of the 2018 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.

Difficulty level: Beginner
Duration: 1:16:36
Speaker: Tal Yarkoni

Lecture on functional brain parcellations and a set of tutorials on bootstrap aggregation of stable clusters (BASC) for fMRI brain parcellation, which were part of the 2019 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.

Difficulty level: Advanced
Duration: 50:28
Speaker: Pierre Bellec

NWB: An ecosystem for neurophysiology data standardization

Difficulty level: Beginner
Duration: 29:53
Speaker: Oliver Ruebel
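To make the NWB entry above concrete, here is a minimal sketch of writing a tiny NWB file with the pynwb reference API; the file name, identifier, and synthetic signal are illustrative assumptions, not material from the lecture.

```python
# Minimal sketch of creating and saving an NWB file with pynwb
# (illustrative names and values; assumes pynwb and numpy are installed).
from datetime import datetime, timezone
import numpy as np
from pynwb import NWBFile, NWBHDF5IO, TimeSeries

nwbfile = NWBFile(
    session_description="toy session",            # free-text description
    identifier="demo-0001",                        # unique identifier for this file
    session_start_time=datetime.now(timezone.utc),
)

# Store a small synthetic signal as an acquisition time series.
signal = TimeSeries(
    name="membrane_potential",
    data=np.random.randn(1000),
    unit="volts",
    rate=1000.0,                                   # sampling rate in Hz
)
nwbfile.add_acquisition(signal)

# Write the file to disk in the HDF5-backed NWB format.
with NWBHDF5IO("demo.nwb", mode="w") as io:
    io.write(nwbfile)
```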

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture on model types introduces the advantages of modeling, provides examples of different model types, and explains what modeling is all about. This lecture contains links to 3 tutorials, lecture/tutorial slides, a suggested reading list, and 3 recorded question-and-answer sessions.

Difficulty level: Beginner
Duration: 27:48
Speaker: Gunnar Blohm

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture summarizes the concepts introduced in Model Types I and further explains how models can be used to answer different scientific questions.

Difficulty level: Beginner
Duration: 32:30
Speaker: Megan Peters

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture focuses on how to get from a scientific question to a model, using concrete examples. We present a 10-step practical guide on how to succeed in modeling. This lecture contains links to 2 tutorials, lecture/tutorial slides, a suggested reading list, and 3 recorded question-and-answer sessions.

Difficulty level: Beginner
Duration: 29:52
Speaker: Megan Peters

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture formalizes modeling as a decision process that is constrained by a precise problem statement and specific model goals. We provide real-life examples of how model building is usually less linear than presented in Modeling Practice I.

Difficulty level: Beginner
Duration: 22:51
Speaker: Gunnar Blohm

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture focuses on the purpose of model fitting, approaches to model fitting, fitting of linear models, and how to assess the quality of a fit and compare model fits; a minimal least-squares sketch follows this entry.

Difficulty level: Beginner
Duration: 26:46
Speaker: Jan Drugowitsch
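To ground the linear-model fitting discussed in the Model Fitting I entry above, here is a minimal least-squares sketch on synthetic data; the data-generating parameters are illustrative assumptions and the code uses only numpy, not the lecture's own notebooks.

```python
# Minimal sketch of least-squares fitting of a linear model to synthetic data
# (illustrative; uses only numpy).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)
y = 2.5 * x + 1.0 + rng.normal(0, 2.0, size=100)   # ground truth: slope 2.5, intercept 1.0

# Design matrix with an intercept column; solve min ||X @ theta - y||^2.
X = np.column_stack([x, np.ones_like(x)])
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Assess the quality of the fit with R^2 (fraction of variance explained).
residuals = y - X @ theta
r_squared = 1 - residuals.var() / y.var()
print(f"slope={theta[0]:.2f}, intercept={theta[1]:.2f}, R^2={r_squared:.2f}")
```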

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture summarizes the concepts introduced in Model Fitting I and covers two additional topics: 1) maximum likelihood estimation (MLE) as a frequentist way of looking at the data and the model, with its own limitations, and 2) side-by-side comparisons of bootstrapping and cross-validation; a sketch contrasting the two follows this entry.

Difficulty level: Beginner
Duration: 38:17
Speaker: Kunlin Wei
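The Model Fitting II entry above contrasts bootstrapping with cross-validation; the sketch below places the two side by side on the same synthetic regression problem. The data, fold count, and no-intercept model are illustrative assumptions, not taken from the lecture.

```python
# Minimal sketch contrasting bootstrapping (uncertainty about a fitted parameter)
# with cross-validation (out-of-sample prediction error). Illustrative, numpy only.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = 1.8 * x + rng.normal(0, 2.0, 200)

def fit_slope(xs, ys):
    # Least-squares slope of a no-intercept linear model.
    return np.sum(xs * ys) / np.sum(xs * xs)

# Bootstrap: resample with replacement and refit to get a slope distribution.
boot_slopes = [fit_slope(x[idx], y[idx])
               for idx in (rng.integers(0, len(x), len(x)) for _ in range(1000))]
ci = np.percentile(boot_slopes, [2.5, 97.5])

# Cross-validation: hold out each fold in turn, fit on the rest, score prediction error.
folds = np.array_split(rng.permutation(len(x)), 5)
mse = []
for test in folds:
    train = np.setdiff1d(np.arange(len(x)), test)
    slope = fit_slope(x[train], y[train])
    mse.append(np.mean((y[test] - slope * x[test]) ** 2))

print(f"bootstrap 95% CI for slope: {ci.round(2)}, cross-validated MSE: {np.mean(mse):.2f}")
```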

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture provides an overview of generalized linear models (GLMs) and contains links to 2 tutorials, lecture/tutorial slides, a suggested reading list, and 3 recorded question-and-answer sessions; a minimal GLM sketch follows this entry.

Difficulty level: Beginner
Duration: 33:58
Speaker: Cristina Savin
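Complementing the GLM lecture above, here is a minimal sketch of a Poisson GLM for spike counts fit by maximizing the log-likelihood; the synthetic stimulus, weights, and use of scipy.optimize are illustrative assumptions, not the lecture's material.

```python
# Minimal sketch of a Poisson GLM for spike counts, fit by minimizing the negative
# log-likelihood with scipy (illustrative synthetic stimulus and weights).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 3))                # stimulus features per time bin
w_true = np.array([0.8, -0.5, 0.3])
counts = rng.poisson(np.exp(X @ w_true))     # spike counts generated with a log link

def neg_log_likelihood(w):
    # Poisson log-likelihood up to a constant: sum(y * log(rate) - rate), rate = exp(Xw).
    linear = X @ w
    return -(counts @ linear - np.exp(linear).sum())

w_hat = minimize(neg_log_likelihood, x0=np.zeros(3)).x
print("true weights:", w_true, "estimated:", np.round(w_hat, 2))
```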

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture further develops the concepts introduced in Machine Learning I.

Difficulty level: Beginner
Duration: 29:30
Speaker: I. Memming Park

An overview of the process of developing the TVB-NEST co-simulation on the EBRAINS infrastructure, and its use cases.

Difficulty level: Beginner
Duration: 25:14
Speaker: Denis Perdikis

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture introduces the core concepts of dimensionality reduction.

Difficulty level: Beginner
Duration: 31:43
Speaker: Byron Yu

This lecture is part of the Neuromatch Academy (NMA), a massive, interactive online summer school held in 2020 that provided participants with experiences ranging from hands-on modeling to meta-science interpretation skills, across just about everything that could reasonably be included under the label "computational neuroscience".

 

This lecture presents an application of dimensionality reduction to multi-dimensional neural recordings, using brain-computer interfaces with simultaneous spike recordings; a minimal PCA sketch follows this entry.

Difficulty level: Beginner
Duration: 30:15
Speaker: Byron Yu
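The two dimensionality-reduction lectures above center on projecting high-dimensional population activity onto a few latent dimensions; the sketch below does this with plain-numpy PCA on synthetic data. The latent dimensionality and noise level are illustrative assumptions, not recordings from the lectures.

```python
# Minimal sketch of dimensionality reduction with PCA on synthetic population
# activity (illustrative latent dimensionality and noise level; numpy only).
import numpy as np

rng = np.random.default_rng(3)
# 1000 time points of 50-neuron activity driven by 3 shared latent signals.
latents = rng.normal(size=(1000, 3))
mixing = rng.normal(size=(3, 50))
activity = latents @ mixing + 0.5 * rng.normal(size=(1000, 50))

# PCA via eigendecomposition of the covariance of the centered data.
centered = activity - activity.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
order = np.argsort(eigvals)[::-1]
explained = eigvals[order] / eigvals.sum()

# Project onto the top 3 principal components to recover a low-dimensional trajectory.
trajectory = centered @ eigvecs[:, order[:3]]
print("trajectory shape:", trajectory.shape)
print("variance explained by top 3 PCs:", round(float(explained[:3].sum()), 2))
```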

This is Tutorial 1 of a series on fitting models to data. We start with simple linear regression, using least squares optimization (Tutorial 1) and Maximum Likelihood Estimation (Tutorial 2). We will use bootstrapping to build confidence intervals around the inferred linear model parameters (Tutorial 3). We'll finish our exploration of regression models by generalizing to multiple linear regression and polynomial regression (Tutorial 4). We end by learning how to choose between these various models. We discuss the bias-variance trade-off (Tutorial 5) and cross-validation for model selection (Tutorial 6); a short model-selection sketch follows this entry.

Difficulty level: Beginner
Duration: 6:18
Speaker: Anqi Wu
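Echoing the tutorial series above (least squares, polynomial regression, and model selection by cross-validation), here is a short model-selection sketch; the cubic ground truth, noise level, and candidate degrees are illustrative assumptions, not the tutorials' own datasets.

```python
# Minimal sketch of selecting a polynomial degree by k-fold cross-validation
# (illustrative cubic ground truth and noise level; numpy only).
import numpy as np

rng = np.random.default_rng(4)
x = rng.uniform(-3, 3, 120)
y = 0.5 * x**3 - x + rng.normal(0, 2.0, 120)

def cv_mse(degree, k=5):
    # k-fold cross-validated mean squared error of a polynomial fit of the given degree.
    folds = np.array_split(rng.permutation(len(x)), k)
    errors = []
    for test in folds:
        train = np.setdiff1d(np.arange(len(x)), test)
        coeffs = np.polyfit(x[train], y[train], degree)
        errors.append(np.mean((y[test] - np.polyval(coeffs, x[test])) ** 2))
    return np.mean(errors)

scores = {degree: cv_mse(degree) for degree in range(1, 8)}
best = min(scores, key=scores.get)
print("cross-validated MSE by degree:", {d: round(s, 2) for d, s in scores.items()})
print("selected degree:", best)
```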