In this module, you will work with human EEG data recorded during a steady-state visual evoked potential study (SSVEP, aka flicker). You will learn about spectral analysis, alpha activity, and topographical mapping. The MATLAB code introduces functions, sorting, and correlation analysis.
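The module's exercises are in MATLAB, but the spectral-analysis idea it teaches can be sketched in a few lines of Python. The snippet below is illustrative only: the signal is synthetic, the sampling rate and variable names (`eeg`, `fs`) are placeholders, and it is not the module's own code.

```python
# Illustrative sketch of spectral analysis: estimate the power spectrum of a
# synthetic "EEG" trace and read off alpha-band (8-12 Hz) power.
import numpy as np
from scipy.signal import welch

fs = 250                                   # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)               # 10 s of synthetic data
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)  # 10 Hz "alpha" + noise

freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)   # power spectral density
alpha = (freqs >= 8) & (freqs <= 12)
print("Peak frequency: %.1f Hz" % freqs[np.argmax(psd)])
print("Mean alpha power: %.3f" % psd[alpha].mean())
```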
This is an in-depth guide on EEG signals and their interaction within brain microcircuits. Participants are also shown techniques and software for simulating, analyzing, and visualizing these signals.
In this tutorial on simulating whole-brain activity using Python, participants can follow along using corresponding code and repositories, learning the basics of neural oscillatory dynamics, evoked responses, and EEG signals, ultimately leading to the design of a network model of whole-brain anatomical connectivity.
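As a rough illustration of what a network model of oscillatory dynamics looks like in code (not the tutorial's actual model), the sketch below couples a handful of Kuramoto-style oscillators through a random connectivity matrix standing in for anatomical connectivity.

```python
# Minimal coupled-oscillator (Kuramoto-style) sketch; the connectivity matrix W is
# random here, whereas the tutorial derives it from whole-brain anatomy.
import numpy as np

rng = np.random.default_rng(0)
n, dt, steps = 10, 0.001, 5000
W = rng.random((n, n))                    # stand-in for an anatomical connectivity matrix
omega = 2 * np.pi * rng.normal(10, 1, n)  # natural frequencies around 10 Hz
theta = rng.uniform(0, 2 * np.pi, n)      # initial phases
k = 5.0                                   # global coupling strength

for _ in range(steps):
    coupling = (W * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
    theta = theta + dt * (omega + k / n * coupling)

print("Phase coherence:", np.abs(np.exp(1j * theta).mean()))
```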
This tutorial introduces pipelines and methods to compute brain connectomes from fMRI data. With corresponding code and repositories, participants can follow along and learn how to programmatically preprocess, curate, and analyze functional and structural brain data to produce connectivity matrices.
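The final step of such a pipeline, turning parcellated time series into a connectivity matrix, can be summarized in a few lines. The sketch below uses synthetic data and assumes preprocessing and parcellation have already been done; it is not the tutorial's pipeline code.

```python
# Functional connectivity as a region-by-region correlation matrix.
import numpy as np

n_timepoints, n_regions = 200, 90
ts = np.random.randn(n_timepoints, n_regions)   # stand-in for region-averaged BOLD signals
fc = np.corrcoef(ts.T)                          # n_regions x n_regions correlation matrix
np.fill_diagonal(fc, 0)                         # optionally zero the self-correlations
print(fc.shape)                                 # (90, 90)
```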
This lecture and tutorial focuses on measuring human functional brain networks. The lecture and tutorial were part of the 2019 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.
Introduction to the central concepts of machine learning and how they can be applied in Python using the scikit-learn package. This lecture was part of the 2018 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.
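The flavor of the workflow covered in the lecture can be seen in a minimal scikit-learn example: fit an estimator on a training split and score it on held-out data.

```python
# Minimal scikit-learn workflow: split, fit, evaluate.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Held-out accuracy:", clf.score(X_test, y_test))
```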
This lecture covers the concept of neural nets (rotation and squashing) and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include: Introduction to Data Science or a graduate-level machine learning course.
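"Rotation and squashing" can be made concrete in two lines of PyTorch: an affine (linear) transformation followed by a pointwise nonlinearity. Sizes below are arbitrary and chosen only for illustration.

```python
# One neural-net layer: a linear map ("rotation") followed by tanh ("squashing").
import torch
import torch.nn as nn

x = torch.randn(4, 2)                 # 4 points in 2-D
layer = nn.Linear(2, 2)               # the affine "rotation"
h = torch.tanh(layer(x))              # the pointwise "squashing"
print(h.shape)                        # torch.Size([4, 2])
```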
This lecture covers the concept of neural net training (tools, classification with neural nets, and PyTorch implementation) and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include: Introduction to Data Science or a graduate-level machine learning course.
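The ingredients the lecture walks through (model, loss, optimizer, backward pass) fit in a bare-bones PyTorch training loop. The data below are random stand-ins; this is a sketch, not the lecture's code.

```python
# Minimal PyTorch classification training loop.
import torch
import torch.nn as nn

X = torch.randn(256, 10)
y = torch.randint(0, 3, (256,))

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(20):
    optimizer.zero_grad()
    loss = criterion(model(X), y)   # forward pass and loss
    loss.backward()                 # backpropagation
    optimizer.step()                # parameter update

print("Final training loss:", loss.item())
```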
This lecture covers the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include: Introduction to Deep Learning and Introduction to Data Science or a graduate-level machine learning course.
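A single convolutional layer applied to an image-shaped tensor illustrates the locality and weight sharing that make convolutions a good fit for natural signals; shapes below are arbitrary.

```python
# A convolutional layer over a batch of images.
import torch
import torch.nn as nn

images = torch.randn(8, 3, 32, 32)             # batch of 8 RGB 32x32 images
conv = nn.Conv2d(in_channels=3, out_channels=16, kernel_size=3, padding=1)
features = conv(images)
print(features.shape)                          # torch.Size([8, 16, 32, 32])
```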
This lecture covers the concept of recurrent neural networks: vanilla and gated (LSTM) and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include: Introduction to Deep Learning and Introduction to Data Science or a graduate-level machine learning course.
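Vanilla and gated recurrence can be compared directly in PyTorch; the sketch below runs both over a batch of random sequences, with sizes chosen arbitrarily.

```python
# Vanilla RNN vs. LSTM on the same batch of sequences.
import torch
import torch.nn as nn

x = torch.randn(8, 50, 16)                     # 8 sequences, 50 time steps, 16 features
rnn = nn.RNN(input_size=16, hidden_size=32, batch_first=True)
lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)

out_rnn, h_n = rnn(x)                          # (8, 50, 32) outputs plus final hidden state
out_lstm, (h_n, c_n) = lstm(x)                 # the LSTM also carries a cell state
print(out_rnn.shape, out_lstm.shape)
```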
This lecture covers the concept of inference in latent variable energy based models (LV-EBMs) and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include: Introduction to Deep Learning, Parameter sharing, and Introduction to Data Science or a graduate-level machine learning course.
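Inference in an LV-EBM amounts to finding the latent configuration that minimizes the energy for a given observation. The toy sketch below uses a quadratic energy as a stand-in, not the lecture's model, and minimizes it over the latent by gradient descent.

```python
# Toy LV-EBM inference: minimize E(y, z) over z for a fixed observation y.
import torch
import torch.nn as nn

decoder = nn.Linear(2, 5)                      # maps latent z to a prediction of y
for p in decoder.parameters():
    p.requires_grad_(False)                    # inference only: keep the model fixed

y = torch.randn(5)                             # an observation
z = torch.zeros(2, requires_grad=True)         # latent variable to infer
optimizer = torch.optim.SGD([z], lr=0.1)

for _ in range(100):
    optimizer.zero_grad()
    energy = ((decoder(z) - y) ** 2).sum() + 0.1 * (z ** 2).sum()  # reconstruction + regularizer
    energy.backward()
    optimizer.step()

print("Inferred latent:", z.detach(), "final energy:", energy.item())
```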
This tutorial covers the concept of training latent variable energy based models (LV-EBMs) and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include: Introduction to Deep Learning, Parameter sharing, and Introduction to Data Science or a graduate-level machine learning course.
This is a tutorial on designing a Bayesian inference model to map belief trajectories, with emphasis on gaining familiarity with Hierarchical Gaussian Filters (HGFs).
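The HGF itself is beyond a one-screen example, but the core ingredient it builds on, a precision-weighted Gaussian belief update applied to a stream of observations, can be sketched directly. The numbers below are made up for illustration, and this is not the tutorial's HGF implementation.

```python
# Precision-weighted belief updating over a synthetic input stream; the sequence of
# posterior means traces out a simple belief trajectory.
import numpy as np

observations = np.array([1.0, 1.2, 0.8, 2.5, 2.4, 2.6])   # synthetic input stream
mu, pi = 0.0, 1.0            # prior mean and precision of the belief
pi_obs = 4.0                 # assumed observation precision

trajectory = []
for x in observations:
    prediction_error = x - mu
    mu = mu + (pi_obs / (pi + pi_obs)) * prediction_error   # precision-weighted update
    pi = pi + pi_obs                                        # belief gets more precise
    trajectory.append(mu)

print(np.round(trajectory, 3))
```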
This lesson corresponds to slides 65-90 of the PDF below.
This is the first of two workshops on reproducibility in science, during which participants are introduced to concepts of FAIR and open science. After discussing the definition of and need for FAIR science, participants are walked through tutorials on installing and using GitHub and Docker, the powerful, open-source tools for versioning and publishing code and software, respectively.
This is a hands-on tutorial on PLINK, the open source whole genome association analysis toolset. The aims of this tutorial are to teach users how to perform basic quality control on genetic datasets, as well as to identify and understand GWAS summary statistics.