This lesson breaks down the principles of Bayesian inference and how they relate to cognitive processes such as learning and perception. It then explains how cognitive models built on Bayesian statistics can be used to investigate how our brains interface with their environment.
This lesson corresponds to slides 1-64 in the PDF below.
This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).
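As a taste of what "mapping belief trajectories" means, here is a minimal sketch of Bayesian belief updating over a sequence of binary observations using a Beta-Bernoulli model. This is much simpler than the Hierarchical Gaussian Filter covered in the tutorial (the function name and parameters are illustrative assumptions, not the tutorial's own code), but it shows the core idea: a belief revised observation by observation.

```python
# Toy belief trajectory: track the probability of a binary outcome
# with a Beta-Bernoulli model. Illustrative sketch only; the tutorial
# itself uses Hierarchical Gaussian Filters (HGFs).
def belief_trajectory(observations, a=1.0, b=1.0):
    """Posterior mean of P(outcome = 1) after each observation,
    starting from a Beta(a, b) prior."""
    means = []
    for x in observations:
        a += x          # accumulate count of 1s
        b += 1 - x      # accumulate count of 0s
        means.append(a / (a + b))
    return means
```

Each element of the returned list is the agent's updated estimate after seeing one more observation; plotting it against trial number gives the belief trajectory.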
This lesson corresponds to slides 65-90 of the PDF below.
This lecture surveys post-war developments in the science of the mind, beginning with the cognitive revolution and concluding with living machines.
This lecture provides an overview of depression (epidemiology and course of the disorder), clinical presentation, somatic co-morbidity, and treatment options.
This lesson is part 1 of 2 of a tutorial on statistical models for neural data.
What is the difference between attention and consciousness? This lecture describes the scientific meaning of consciousness, journeys on the search for neural correlates of visual consciousness, and explores the possibility of consciousness in other beings and even non-biological structures.
This lecture covers techniques for visualizing extracellular neurotransmitter dynamics.
This lesson goes over the basic mechanisms of neural synapses, the junctions between neurons across which signals are transmitted.
This lesson describes spike timing-dependent plasticity (STDP), a biological process that adjusts the strength of connections between neurons in the brain, and how one can implement or mimic this process in a computational model. You will also find links for practical exercises at the bottom of this page.
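To make the computational side concrete, below is a minimal sketch of a pair-based STDP weight-update rule with exponentially decaying windows. This is a common textbook formulation, not the lesson's own implementation, and the parameter values are arbitrary assumptions.

```python
import numpy as np

# Pair-based STDP: a synapse is potentiated when the presynaptic spike
# precedes the postsynaptic spike, and depressed otherwise, with
# exponentially decaying windows. Parameter values are assumptions.
A_PLUS, A_MINUS = 0.01, 0.012      # learning-rate amplitudes (assumed)
TAU_PLUS, TAU_MINUS = 20.0, 20.0   # decay time constants in ms (assumed)

def stdp_dw(delta_t):
    """Synaptic weight change for a spike-time difference
    delta_t = t_post - t_pre, in milliseconds."""
    if delta_t > 0:   # pre before post: potentiation
        return A_PLUS * np.exp(-delta_t / TAU_PLUS)
    else:             # post before (or with) pre: depression
        return -A_MINUS * np.exp(delta_t / TAU_MINUS)
```

The sign of the weight change depends only on spike order, and its magnitude shrinks as the two spikes move further apart in time.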
This lesson discusses a gripping neuroscientific question: why have neurons developed the discrete action potential, or spike, as a principal method of communication?
This lecture consists of the second half of the introduction to signal transduction, here focusing on cell receptors and signalling cascades.
In this lesson, you will learn about GABAergic interneurons and local inhibition on the circuit level.
This lesson provides an overview of how to construct computational pipelines for neurophysiological data using DataJoint.
This lesson delves into the structure of one of the brain's most elemental computational units, the neuron, and how that structure informs computational neural network models.
Following the previous lesson on neuronal structure, this lesson discusses neuronal function, particularly focusing on spike triggering and propagation.
While the previous lesson in the Neuro4ML course dealt with the mechanisms involved in individual synapses, this lesson discusses how synapses and their neurons' firing patterns may change over time.
Whereas the previous two lessons described the biophysical and signalling properties of individual neurons, this lesson describes properties of those units when part of larger networks.
This lesson covers the ionic basis of the action potential, including the Hodgkin-Huxley model.
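For readers who want to experiment before the lesson, here is a compact forward-Euler sketch of the Hodgkin-Huxley equations using the standard squid-axon parameters. It is an illustrative simulation, not the lesson's own code, and the integration scheme and constants are the usual textbook choices.

```python
import numpy as np

# Minimal forward-Euler integration of the classic Hodgkin-Huxley
# model with standard squid-axon parameters (illustrative sketch).
C_M = 1.0                             # membrane capacitance, uF/cm^2
G_NA, G_K, G_L = 120.0, 36.0, 0.3     # max conductances, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.4   # reversal potentials, mV

# Voltage-dependent gating rate functions (standard forms)
def alpha_n(V): return 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
def beta_n(V):  return 0.125 * np.exp(-(V + 65) / 80)
def alpha_m(V): return 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
def beta_m(V):  return 4.0 * np.exp(-(V + 65) / 18)
def alpha_h(V): return 0.07 * np.exp(-(V + 65) / 20)
def beta_h(V):  return 1.0 / (1 + np.exp(-(V + 35) / 10))

def simulate(I_ext=10.0, T=50.0, dt=0.01):
    """Membrane potential trace (mV) under constant injected current
    I_ext (uA/cm^2), simulated for T ms with step dt ms."""
    V, n, m, h = -65.0, 0.317, 0.053, 0.596   # resting-state values
    trace = []
    for _ in range(int(T / dt)):
        i_na = G_NA * m**3 * h * (V - E_NA)   # sodium current
        i_k = G_K * n**4 * (V - E_K)          # potassium current
        i_l = G_L * (V - E_L)                 # leak current
        V += dt * (I_ext - i_na - i_k - i_l) / C_M
        n += dt * (alpha_n(V) * (1 - n) - beta_n(V) * n)
        m += dt * (alpha_m(V) * (1 - m) - beta_m(V) * m)
        h += dt * (alpha_h(V) * (1 - h) - beta_h(V) * h)
        trace.append(V)
    return np.array(trace)
```

With a sustained 10 uA/cm^2 input the model fires repetitively; plotting the returned trace shows the action potentials whose ionic basis the lesson explains.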
This lesson provides an introduction to the myriad cellular mechanisms which underpin healthy brain function and communication.
In this lesson, you will learn about the ionic basis of the action potential, including the Hodgkin-Huxley model.