In this lecture, you will learn about various neuroinformatics resources that allow for 3D reconstruction of brain models.
This lecture provides a general introduction to epilepsy, as well as why and how TVB can prove useful in building and testing models of epilepsy.
This lesson provides an overview of The Virtual Brain integrated workflows on EBRAINS.
This lesson walks users through the Image Processing Pipeline, an integral part of the TVB on EBRAINS integrated workflows.
This lesson gives an overview of The Virtual Brain simulator and its integration into the Human Brain Project Cloud and EBRAINS infrastructure.
In this lesson, users will get an overview of the EBRAINS-integrated Fast TVB, a C implementation of TVB that is orders of magnitude faster than the original Python TVB and capable of performing parallelizable simulations in the cloud.
This lesson gives a brief overview of the multi-scale co-simulation between TVB-NEST and Elephant on the EBRAINS infrastructure.
In this lesson, you will learn about the process of constructing models for TVB automatically on the EBRAINS infrastructure.
The Medical Informatics Platform (MIP) is a platform providing federated analytics for diagnosis and research in clinical neuroscience. Federated analytics is made possible by a distributed engine that executes computations and transfers information between the members of the federation (hospital nodes). In this talk the speaker describes the process of designing and implementing new analytical tools, i.e. statistical and machine learning algorithms. Mr. Sakellariou further describes the environment in which these federated algorithms run, the challenges and the available tools, the principles that guide the design, and the general methodology followed for each new algorithm. One of the most important challenges is to design these tools in a way that does not compromise the privacy of the clinical data involved. The speaker shows how to address the main questions that arise when designing such algorithms: how to decompose and distribute the computations, and what kind of information to exchange between nodes, in order to comply with the privacy constraint mentioned above. Finally, the speaker briefly touches on the validation of these federated algorithms.
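To make the decomposition idea concrete, the sketch below shows how a simple statistic (mean and variance) can be split into per-node aggregates so that only counts and sums, never patient-level records, leave a hospital node. This is a minimal illustration under assumed names, not the MIP engine's actual interface.

```python
# Illustrative sketch only (hypothetical names, not the MIP engine's API):
# a global mean and variance computed from per-node aggregates, so that
# patient-level values never leave a hospital node.
from dataclasses import dataclass


@dataclass
class LocalAggregate:
    n: int     # number of local records
    s: float   # sum of the local values
    ss: float  # sum of squared local values


def compute_local_aggregate(values):
    """Runs inside a hospital node; only this aggregate is shared."""
    return LocalAggregate(
        n=len(values),
        s=sum(values),
        ss=sum(v * v for v in values),
    )


def combine(aggregates):
    """Runs on the central engine; sees only per-node aggregates."""
    n = sum(a.n for a in aggregates)
    s = sum(a.s for a in aggregates)
    ss = sum(a.ss for a in aggregates)
    mean = s / n
    variance = ss / n - mean ** 2
    return mean, variance


# Example: three hypothetical nodes report aggregates of local measurements.
node_data = [[72.0, 80.5, 91.0], [68.0, 77.5], [85.0, 79.0, 88.5, 90.0]]
aggregates = [compute_local_aggregate(d) for d in node_data]
print(combine(aggregates))
```

The same pattern generalizes to other decomposable statistics: each node computes sufficient statistics locally, and only those summaries are exchanged and combined centrally.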