This lecture by Paul Triebkorn on generating TVB-ready imaging data is part of the TVB Node 10 series, a 4-day workshop dedicated to learning about The Virtual Brain (TVB), a full brain simulation platform: brain imaging, brain simulation, personalised brain models, TVB use cases, and more.
This lecture is part of a course introducing the field of electrophysiology standards, infrastructure, and initiatives. It provides an overview of the Australian Electrophysiology Data Analytics Platform (AEDAPT): how it works, how it scales, and how it fits into the FAIR ecosystem.
As researchers develop new non-invasive direct-to-consumer technologies that read and stimulate the brain, society must consider the appropriate uses of such devices. Will these brain technologies eventually allow enhancement of abilities beyond human capabilities? In what settings are people using these devices outside the purview of researchers or clinicians? Should consumers be allowed to ‘hack’ their own brain in order to improve performance?
To explore these challenges and the ethical issues raised by advances in do-it-yourself (DIY) neurotechnology, the Emerging Issues Task Force of the International Neuroethics Society organized a virtual panel discussion. The panel discussed neurotechnologies such as transcranial direct current stimulation (tDCS) and electroencephalogram (EEG) headsets and their ability to change the way we understand and alter our brains. Particular attention was given to the use of neurotechnology by everyday people and the implications this has for regulatory oversight and citizen neuroscience.
Panelists included:
This module covers many types of non-invasive neurotech and neuroimaging devices, including Electroencephalography (EEG), Electromyography (EMG), Electroneurography (ENG), Magnetoencephalography (MEG), functional Near-Infrared Spectroscopy (fNIRS), Magnetic Resonance Imaging (MRI), Positron Emission Tomography (PET), and Computed Tomography (CT).
An introduction to data management, manipulation, visualization, and analysis for neuroscience. Students will learn scientific programming in Python and use it to work with example data from areas such as cognitive-behavioral research, single-cell recording, EEG, and structural and functional MRI. Basic signal processing techniques, including filtering, are covered. The course includes a Jupyter Notebook and video tutorials.
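As an illustration of the kind of basic signal processing covered in such a course, here is a minimal sketch (not taken from the course materials; the sampling rate, band edges, and synthetic signal are assumptions) of band-pass filtering a noisy signal with SciPy:

```python
import numpy as np
from scipy import signal

fs = 250.0                                   # sampling rate in Hz (assumed value)
t = np.arange(0, 10, 1 / fs)                 # 10 seconds of samples
# Synthetic signal: 10 Hz oscillation plus 50 Hz line noise and white noise
x = (np.sin(2 * np.pi * 10 * t)
     + 0.5 * np.sin(2 * np.pi * 50 * t)
     + 0.3 * np.random.randn(t.size))

# 4th-order Butterworth band-pass (8-12 Hz), applied forward and backward
# with sosfiltfilt to avoid phase distortion
sos = signal.butter(4, [8, 12], btype="bandpass", fs=fs, output="sos")
x_filtered = signal.sosfiltfilt(sos, x)
```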
Hierarchical Event Descriptors (HED) fill a major gap in the neuroinformatics standards toolkit, namely specifying the nature of events and time-limited conditions recorded during time series recordings (EEG, MEG, iEEG, fMRI, etc.). We, the HED Working Group, propose a half-day online INCF workshop on the need for, structure of, tools for, and use of HED annotation to prepare neuroimaging time series data for storing, sharing, and advanced analysis.
This lecture and tutorial focus on measuring human functional brain networks. They were part of the 2019 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.
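A common way to measure such networks is to correlate regional time series and treat the thresholded correlation matrix as a graph. The sketch below illustrates the idea with placeholder data; the region count, threshold, and random time series are assumptions, not the tutorial's actual dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
n_regions, n_timepoints = 90, 200
ts = rng.standard_normal((n_regions, n_timepoints))   # placeholder regional time series

fc = np.corrcoef(ts)            # region-by-region Pearson correlation matrix
np.fill_diagonal(fc, 0)         # ignore self-connections

threshold = 0.3                 # arbitrary example threshold
adjacency = (np.abs(fc) > threshold).astype(int)
degree = adjacency.sum(axis=1)  # a simple graph measure: node degree
```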
Lecture on functional brain parcellations and a set of tutorials on bootstrap aggregation of stable clusters (BASC) for fMRI brain parcellation, which were part of the 2019 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.
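For intuition, here is a minimal sketch of the bootstrap-aggregation idea behind BASC, using placeholder random data and plain k-means rather than the tutorial's actual pipeline:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_voxels, n_timepoints = 300, 120
n_clusters, n_bootstraps = 5, 50
data = rng.standard_normal((n_voxels, n_timepoints))   # placeholder voxel time series

stability = np.zeros((n_voxels, n_voxels))
for _ in range(n_bootstraps):
    # Resample time points with replacement (BASC uses a dedicated time-series bootstrap)
    idx = rng.integers(0, n_timepoints, n_timepoints)
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(data[:, idx])
    # Accumulate how often each pair of voxels lands in the same cluster
    stability += labels[:, None] == labels[None, :]
stability /= n_bootstraps

# Final parcellation: cluster the stability matrix itself
parcels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(stability)
```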
Neuronify is an educational tool meant to create intuition for how neurons and neural networks behave. You can use it to combine neurons with different connections, just like the ones we have in our brain, and explore how changes to single cells lead to behavioral changes in important networks. Neuronify is based on an integrate-and-fire model of neurons, one of the simplest neuron models that exist. It focuses on the spike timing of a neuron and ignores the details of the action potential dynamics. These neurons are modeled as simple RC circuits: when the membrane potential exceeds a certain threshold, a spike is generated and the voltage is reset to its resting potential. This spike then signals other neurons through its synapses.
Neuronify aims to provide a low entry point to simulation-based neuroscience.
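For intuition, here is a minimal sketch of the leaky integrate-and-fire model described above. It is a generic implementation with assumed parameter values, not Neuronify's actual source code:

```python
dt, t_max = 0.1, 200.0                       # time step and duration (ms)
tau_m, r_m = 10.0, 10.0                      # membrane time constant (ms), resistance (MOhm)
v_rest, v_thresh = -70.0, -55.0              # resting and threshold potentials (mV)
i_ext = 2.0                                  # constant input current (nA), assumed value

v = v_rest
spike_times = []
for step in range(int(t_max / dt)):
    # RC-circuit membrane dynamics: dv/dt = (-(v - v_rest) + R * I) / tau_m
    v += dt * (-(v - v_rest) + r_m * i_ext) / tau_m
    if v >= v_thresh:                        # threshold crossing: emit a spike...
        spike_times.append(step * dt)
        v = v_rest                           # ...and reset to the resting potential

print(f"{len(spike_times)} spikes in {t_max:.0f} ms")
```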
This lecture covers structured data, databases, the federation of neuroscience-relevant databases, and ontologies.
Since their introduction in 2016, the FAIR data principles have gained increasing recognition and adoption in global neuroscience. FAIR defines a set of high-level principles and practices for making digital objects, including data, software, and workflows, Findable, Accessible, Interoperable, and Reusable. But FAIR is not a specification; it leaves many of the specifics up to individual scientific disciplines to define. INCF has been leading the way in promoting, defining, and implementing FAIR data practices for neuroscience. We have been bringing together researchers, infrastructure providers, industry, and publishers through our programs and networks. In this session, we will hear some perspectives on FAIR neuroscience from some of these stakeholders who have been working to develop and use FAIR tools for neuroscience. We will engage in a discussion on questions such as: how is neuroscience doing with respect to FAIR? What have been the successes? What is currently very difficult? Where does neuroscience need to go?
This lecture covers FAIR atlases: their background, their construction, and how they can be created in line with the FAIR principles.
This lecture focuses on ontologies for clinical neurosciences.
This lecture covers describing and characterizing an input-output relationship.
Part 1 of 2 of a tutorial on statistical models for neural data.
Part 2 of 2 of a tutorial on statistical models for neural data.
Introduction to stability analysis of neural models
Introduction to stability analysis of neural models
Oscillations and bursting
Oscillations and bursting
Weakly coupled oscillators