This course consists of 12 lectures on the visual system and neural coding produced by the Allen Institute for Brain Science. The lectures cover broad neurophysiological concepts such as information theory and the mammalian visual system, as well as more specific topics such as cell types and their functions in the mammalian retina.
“Computational Thinking” refers to a mindset and set of tools that computational or ICT specialists use to describe their work. This course is intended for people outside the ICT field: it introduces students to the way computer specialists analyse problems and to the basic terminology of the field.
Neuroscience is one of the most interdisciplinary scientific fields. It is constantly expanding and unites researchers from a wide variety of backgrounds, including chemistry, biology, physics, medicine, and psychology. By examining the principles that govern the development and function of the human nervous system, it advances our understanding of the fundamental mechanisms of human behaviour, emotions, and thoughts, and of what happens when they fail.
This course, offered by EPFL and also available as a MOOC on edX, aims to provide a mechanistic description of mammalian brain function at the level of individual nerve cells and their synaptic interactions.
A series of short explanations of the basic equations underlying computational neuroscience.
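One representative example of the kind of equation meant here (offered purely as an illustration, not necessarily drawn from the series itself) is the leaky integrate-and-fire membrane equation,

\tau_m \frac{dV}{dt} = -\big(V(t) - V_{\mathrm{rest}}\big) + R_m\, I(t),

where the membrane potential V(t) decays towards its resting value V_rest with time constant \tau_m, is driven by an input current I(t) through the membrane resistance R_m, and is reset whenever it crosses a firing threshold.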
This course includes two tutorials on R, a programming language and environment for statistical computing and graphics. R provides a wide variety of statistical (linear and nonlinear modelling, classical statistical tests, time-series analysis, classification, clustering, etc.) and graphical techniques, and is highly extensible.
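As a minimal, self-contained sketch of the kind of analysis R supports (illustrative only, and not taken from the tutorials themselves), the following snippet fits a linear model to a built-in dataset and runs a simple k-means clustering:

# Illustrative only: classical linear modelling on a built-in dataset
data(cars)                               # speed vs. stopping distance
fit <- lm(dist ~ speed, data = cars)     # fit a simple linear model
summary(fit)                             # coefficients, R-squared, significance tests
plot(cars$speed, cars$dist, xlab = "Speed (mph)", ylab = "Stopping distance (ft)")
abline(fit, col = "red")                 # overlay the fitted regression line

# Illustrative only: k-means clustering on the iris measurements
set.seed(1)                              # make the cluster assignment reproducible
km <- kmeans(iris[, 1:4], centers = 3)
table(km$cluster, iris$Species)          # compare clusters with the known species

Both examples rely only on base R (the stats, graphics, and datasets packages that are loaded by default).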
This course features tutorials on how to use Allen atlases and digital brain atlasing tools, including operational and user features of the Allen Mouse Brain Atlas, as well as the Allen Institute's 3D viewing tool, Brain Explorer®.
Neuroscience has traditionally been a discipline in which isolated labs have produced their own experimental data and created their own models to interpret their findings. However, it is becoming clear that no single lab can create cell and network models rich enough to address all the relevant biological questions, or generate and analyse all the data required to inform, constrain, and test these models.
Future computing systems will capitalize on our growing understanding of the brain by adopting similar architectures and computational principles. During this workshop, we bring together recent advances in the rapidly evolving field of neuromorphic computing systems and discuss the challenges ahead.
Neuroanatomy provides one of the unifying frameworks for neuroscience, so it is not surprising that it underpins many neuroinformatics tools and approaches. Whether one is working at the subcellular, cellular, or gross anatomical level, or modelling circuitry, molecular pathways, or function, at some point this work will require an anatomical reference.
Probing the organization of interactions within and across neuronal populations is a promising approach to understanding the principles of brain processing. Rapidly advancing technical capabilities for recording from hundreds of neurons in parallel open up new possibilities for disentangling the correlation structure within neuronal networks. However, the complexity of these massive data streams calls for novel, tractable analysis tools that exploit the parallel nature of the data.