This lesson breaks down the principles of Bayesian inference and its relation to cognitive processes and functions such as learning and perception. It then explains how cognitive models can be built using Bayesian statistics to investigate how our brains interface with the environment.
This lesson corresponds to slides 1-64 in the PDF below.
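As a minimal illustration of the kind of Bayesian belief updating discussed here (a standard textbook Gaussian example, not taken from the slides), combining a Gaussian prior with a Gaussian observation yields a precision-weighted posterior:

```python
def update_belief(prior_mean, prior_var, obs, obs_var):
    """Bayes' rule for a Gaussian prior and Gaussian likelihood:
    the posterior mean is a precision-weighted average of prior and observation."""
    prior_prec = 1.0 / prior_var
    obs_prec = 1.0 / obs_var
    post_var = 1.0 / (prior_prec + obs_prec)
    post_mean = post_var * (prior_prec * prior_mean + obs_prec * obs)
    return post_mean, post_var

# Start with a vague prior and sequentially observe noisy data.
mean, var = 0.0, 1.0
for obs in [0.8, 1.1, 0.9]:
    mean, var = update_belief(mean, var, obs, obs_var=0.5)
# The belief sharpens (variance shrinks) with each observation.
```

Each update shifts the mean toward the more precise source of information, which is one way "perception as inference" is often formalized.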
This is a tutorial on designing a Bayesian inference model to map belief trajectories, with an emphasis on gaining familiarity with the Hierarchical Gaussian Filter (HGF).
This lesson corresponds to slides 65-90 of the PDF below.
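To make the idea of a belief trajectory concrete, here is a deliberately simplified sketch: a fixed-learning-rate prediction-error update over binary outcomes. An HGF generalizes this by making the learning rate adaptive and precision-weighted across hierarchical levels; this toy version is an assumption for illustration, not the lesson's actual model.

```python
def belief_trajectory(outcomes, alpha=0.3, prior=0.5):
    """Track the belief p(outcome = 1) after each binary observation,
    using a constant-learning-rate prediction-error update."""
    beliefs = [prior]
    b = prior
    for o in outcomes:
        b = b + alpha * (o - b)  # move belief toward the observed outcome
        beliefs.append(b)
    return beliefs

# A run of mostly-1 outcomes followed by 0s produces a rising-then-falling belief.
traj = belief_trajectory([1, 1, 0, 1, 1, 1, 0, 0])
```

Plotting `traj` against trial number gives the belief trajectory; in a full HGF the step size itself would change trial by trial as estimated environmental volatility changes.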
This lecture describes in detail how to build and run workflows in the virtual research environment (VRE), including approaches to standardization, metadata, containerization, and the construction and maintenance of scientific pipelines.
In this lesson you will learn about fundamental neural phenomena such as oscillations and bursting, and the effects these have on cortical networks.
In this lecture, you will learn about the rules governing coupled oscillators, neural synchrony in networks, and the theoretical assumptions underlying our current understanding of both.
In this lesson, you will learn about phenomena of neural populations such as synchrony, oscillations, and bursting.
This lesson provides further context on the theory of weakly coupled oscillators.
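A standard minimal example of weak coupling (assumed here for illustration; the lesson may use a different formulation) is a pair of Kuramoto phase oscillators, which phase-lock when the coupling strength exceeds half their frequency detuning:

```python
import math

def phase_difference(omega1, omega2, K, dt=0.001, steps=20000):
    """Euler-integrate two phase oscillators coupled through the sine of
    their phase difference, and return the final difference mod 2*pi."""
    th1, th2 = 0.0, 1.0
    for _ in range(steps):
        d1 = omega1 + K * math.sin(th2 - th1)
        d2 = omega2 + K * math.sin(th1 - th2)
        th1 += dt * d1
        th2 += dt * d2
    return (th2 - th1) % (2 * math.pi)

# Detuning 0.2 with coupling K = 0.5 satisfies |d_omega| < 2K, so the pair
# locks at the phase difference asin(0.2 / (2 * 0.5)) ~= 0.2014 rad.
diff = phase_difference(omega1=1.0, omega2=1.2, K=0.5)
```

The locked phase difference follows from setting the derivative of the difference to zero: `d_omega - 2K*sin(phi) = 0`, i.e. `phi = asin(d_omega / (2K))`.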
In this lesson, you will learn how patterns of neural activity are generated in visual-system hallucinations.
This lecture covers computational principles that growth cones employ to detect and respond to environmental chemotactic gradients, focusing particularly on growth-cone shape dynamics.