The Virtual Brain is an open-source, multi-scale, multi-modal brain simulation platform. In this lesson, you will be introduced to brain simulation in general and to The Virtual Brain in particular. Prof. Ritter will present the newest approaches for clinical applications of The Virtual Brain - that is, for stroke, epilepsy, brain tumors, and Alzheimer's disease - and show how brain simulation can improve the diagnosis, therapy, and understanding of neurological disease.
The concept of neural masses, an application of mean field theory, is introduced as a possible surrogate for electrophysiological signals in brain simulation. The mathematics of neural mass models and their integration to a coupled network are explained. Bifurcation analysis is presented as an important technique in the understanding of non-linear systems and as a fundamental method in the design of brain simulations. Finally, the application of the described mathematics is demonstrated in the exploration of brain stimulation regimes.
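The coupling of neural masses into a network, as described above, can be illustrated with a minimal sketch. This is not The Virtual Brain's actual API; it assumes a generic Wilson-Cowan-style excitatory population per region, an illustrative connectivity matrix, and arbitrary parameter values, integrated with a simple Euler scheme.

```python
import numpy as np

# Illustrative sketch only (not TVB code): each region i is a neural mass
# whose mean activity E_i relaxes toward a sigmoidal response to the
# coupled input it receives from all other regions.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def simulate(W, n_steps=1000, dt=0.1, tau=1.0, g=0.5, seed=0):
    """Euler-integrate dE_i/dt = (-E_i + S(g * sum_j W_ij E_j + I)) / tau."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    E = rng.random(n)            # random initial activity per node
    trace = np.empty((n_steps, n))
    for t in range(n_steps):
        coupling = g * W @ E     # network input from the other regions
        dE = (-E + sigmoid(coupling + 0.5)) / tau
        E = E + dt * dE          # forward Euler step
        trace[t] = E
    return trace

# Two mutually coupled regions (hypothetical connectivity)
W = np.array([[0.0, 1.0],
              [1.0, 0.0]])
activity = simulate(W)
print(activity.shape)  # (1000, 2)
```

Sweeping the global coupling strength `g` and watching the network activity change qualitatively is, in miniature, what a bifurcation analysis of such a system does.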
The simulation of the virtual epileptic patient is presented as an example of advanced brain simulation as a translational approach to deliver improved results in the clinic. The fundamentals of epilepsy are explained, and on this basis the concept of epilepsy simulation is developed. Using an IPython notebook, the detailed process of this approach is explained step by step. In the end, you will be able to perform simple epilepsy simulations on your own.
A brief overview of the Python programming language, with an emphasis on tools relevant to data scientists. This lecture was part of the 2018 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.
Lecture on functional brain parcellations and a set of tutorials on bootstrap aggregation of stable clusters (BASC) for fMRI brain parcellation, which were part of the 2019 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.
Overview of the content for Day 1 of this course.
Best practices: tips and tricks for getting your Miniscope working and your experiments off the ground.
"Balancing size & function in compact miniscopes" was presented by Tycho Hoogland at the 2021 Virtual Miniscope Workshop as part of a series of talks by leading Miniscope users and developers.
"Computational imaging for miniature miniscopes" was presented by Laura Waller at the 2021 Virtual Miniscope Workshop as part of a series of talks by leading Miniscope users and developers.
"Online 1-photon vs 2-photon calcium imaging data analysis: Current developments and future plans" was presented by Andrea Giovannucci at the 2021 Virtual Miniscope Workshop as part of a series of talks by leading Miniscope users and developers.
"Ensemble fluidity supports memory flexibility during spatial reversal" was presented by William Mau at the 2021 Virtual Miniscope Workshop as part of a series of talks by leading Miniscope users and developers.
How to start processing the raw imaging data generated with a Miniscope, including developing a usable pipeline and a demo of the Minion pipeline.
Future directions for miniature microscopes, covering work by both MetaCell and other groups.
Overview of the content for Day 2 of this course.
Summary and closing remarks for this three-day course.
This lecture covers infrared LED oblique illumination for studying neuronal circuits in in vitro block-preparations of the spinal cord and brain stem.
This lecture covers the application of diffusion MRI for clinical and preclinical studies.
This lecture was part of the 2019 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.
Since their introduction in 2016, the FAIR data principles have gained increasing recognition and adoption in global neuroscience. FAIR defines a set of high-level principles and practices for making digital objects, including data, software, and workflows, Findable, Accessible, Interoperable, and Reusable. But FAIR is not a specification; it leaves many of the specifics up to individual scientific disciplines to define. INCF has been leading the way in promoting, defining, and implementing FAIR data practices for neuroscience, bringing together researchers, infrastructure providers, industry, and publishers through its programs and networks. In this session, we will hear perspectives on FAIR neuroscience from stakeholders who have been working to develop and use FAIR tools for neuroscience, and engage in a discussion of questions such as: How is neuroscience doing with respect to FAIR? What have been the successes? What is currently very difficult? Where does neuroscience need to go? This lecture covers the needs and challenges involved in creating a FAIR ecosystem for neuroimaging research.