
This lecture introduces you to the basics of the Amazon Web Services public cloud. It covers the fundamentals of cloud computing and goes through both the motivation for and the process of moving your research computing to the cloud. This lecture was part of the 2018 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.

Difficulty level: Intermediate
Duration: 3:09:12
Speaker: Amanda Tan

This lecture introduces neuroscience concepts and methods such as fMRI, visual responses in BOLD data, and the eccentricity of visual receptive fields.

Difficulty level: Intermediate
Duration: 7:15
Speaker: Mike X. Cohen

This tutorial walks users through the creation and visualization of activation flat maps from fMRI datasets. 

Difficulty level: Intermediate
Duration: 12:15
Speaker: Mike X. Cohen

This tutorial demonstrates the conventional preprocessing steps used when working with BOLD signal datasets from fMRI.

Difficulty level: Intermediate
Duration: 12:05
Speaker: Mike X. Cohen

In this tutorial, users will learn how to create a trial-averaged BOLD response and store it in a matrix in MATLAB. 

Difficulty level: Intermediate
Duration: 20:12
Speaker: Mike X. Cohen
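
The tutorial above is taught in MATLAB. Purely as a rough illustration of the idea, here is a minimal Python sketch of trial averaging; the variable names (bold, onsets, tr, window) are assumptions for the example, not the tutorial's own code.

    import numpy as np

    def trial_average(bold, onsets, tr=2.0, window=20.0):
        """Cut peri-stimulus segments from a 1-D BOLD time series and average them."""
        n_samples = int(window / tr)               # samples per trial window
        trials = []
        for onset in onsets:
            start = int(round(onset / tr))         # onset time converted to samples
            segment = bold[start:start + n_samples]
            if len(segment) == n_samples:          # skip trials truncated at the end
                trials.append(segment - segment[0])    # crude baseline shift to trial onset
        trials = np.vstack(trials)                 # trials-by-time matrix
        return trials, trials.mean(axis=0)         # the matrix and its trial average

    # Example with synthetic data:
    bold = np.random.randn(300)                    # placeholder BOLD signal
    trials, avg = trial_average(bold, onsets=[20, 60, 100, 140])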

This tutorial teaches users how to create animations of BOLD responses over time, to allow researchers and clinicians to visualize time-course activity patterns.

Difficulty level: Intermediate
Duration: 12:52
Speaker: Mike X. Cohen
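
The tutorial itself uses MATLAB; as a hedged illustration of the same idea in Python, the sketch below animates one slice of a placeholder 4-D BOLD array (x, y, z, time) with matplotlib. The array and slice index are made up for the example.

    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation

    bold = np.random.rand(64, 64, 30, 100)     # placeholder data: x, y, z, time
    z = 15                                     # axial slice to display

    fig, ax = plt.subplots()
    im = ax.imshow(bold[:, :, z, 0], cmap="hot", vmin=0, vmax=1)

    def update(t):
        im.set_data(bold[:, :, z, t])          # swap in the image at time point t
        ax.set_title(f"time point {t}")
        return (im,)

    anim = FuncAnimation(fig, update, frames=bold.shape[3], interval=100)
    plt.show()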

This tutorial demonstrates how to use MATLAB to create event-related BOLD time courses from fMRI datasets. 

Difficulty level: Intermediate
Duration: 13:39
Speaker: Mike X. Cohen

In this tutorial, users learn how to compute and visualize a t-test on experimental condition differences.

Difficulty level: Intermediate
Duration: 17:54
Speaker: Mike X. Cohen
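
The tutorial works in MATLAB; the following Python sketch only illustrates the general shape of such an analysis, using made-up per-trial response amplitudes for two conditions.

    import numpy as np
    from scipy import stats
    import matplotlib.pyplot as plt

    # Placeholder per-trial response amplitudes for two experimental conditions.
    cond_a = np.random.normal(loc=1.0, scale=0.5, size=40)
    cond_b = np.random.normal(loc=0.7, scale=0.5, size=40)

    t_val, p_val = stats.ttest_ind(cond_a, cond_b)    # independent-samples t-test

    plt.hist(cond_a, alpha=0.6, label="condition A")
    plt.hist(cond_b, alpha=0.6, label="condition B")
    plt.legend()
    plt.title(f"t = {t_val:.2f}, p = {p_val:.3f}")
    plt.show()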

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 5:02
Speaker: Mike X. Cohen
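
This series works in MATLAB with the Image Processing Toolbox. Purely as a hedged outline of the pipeline it describes, here is a compact Python sketch; the placeholder movie array and the specific thresholds and filter widths are assumptions for illustration.

    import numpy as np
    from scipy import ndimage
    from sklearn.decomposition import PCA

    movie = np.random.rand(500, 128, 128)             # placeholder movie: time x rows x cols

    # 1) Estimate slowly varying background with a wide Gaussian blur and remove it.
    mean_img = movie.mean(axis=0)
    background = ndimage.gaussian_filter(mean_img, sigma=20)
    corrected = mean_img - background

    # 2) Identify cells by thresholded spatial contiguity: threshold, then label
    #    connected groups of suprathreshold pixels.
    mask = corrected > corrected.mean() + 2 * corrected.std()
    labels, n_cells = ndimage.label(mask)

    # 3) Extract one time series per labeled cell (mean over its pixels in each frame).
    traces = np.array([movie[:, labels == i].mean(axis=1) for i in range(1, n_cells + 1)])

    # 4) Light temporal smoothing, then PCA across the cell traces.
    traces = ndimage.uniform_filter1d(traces, size=5, axis=1)
    pca = PCA(n_components=min(5, n_cells))
    components = pca.fit_transform(traces.T)          # time x components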

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 15:01
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 5:15
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 17:08
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 11:23
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 22:41
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 17:19
Speaker: Mike X. Cohen

This lesson describes the principles underlying functional magnetic resonance imaging (fMRI), diffusion-weighted imaging (DWI), tractography, and parcellation. These tools and concepts are explained in a broader context of neural connectivity and mental health. 

Difficulty level: Intermediate
Duration: 1:47:22

This lecture introduces the Brain Imaging Data Structure (BIDS), a standard for organizing human neuroimaging datasets. It was part of the 2018 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.

Difficulty level: Intermediate
Duration: 56:49
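
For orientation, a minimal example of the kind of layout BIDS prescribes (filenames abbreviated; consult the BIDS specification for the full rules):

    dataset_description.json
    participants.tsv
    sub-01/
        anat/
            sub-01_T1w.nii.gz
            sub-01_T1w.json
        func/
            sub-01_task-rest_bold.nii.gz
            sub-01_task-rest_bold.json
    sub-02/
        ...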

This lecture was part of the 2018 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.

Difficulty level: Intermediate
Duration: 1:25:05
Speaker: Satrajit Ghosh

This lecture by Paul Triebkorn covers generating TVB-ready imaging data. It is part of the TVB Node 10 series, a 4-day workshop dedicated to learning about The Virtual Brain (TVB), a full brain simulation platform, covering brain imaging, brain simulation, personalised brain models, TVB use cases, and more.

Difficulty level: Intermediate
Duration: 1:40:52
Speaker: Paul Triebkorn

This book was written to introduce researchers and students in a variety of research fields to the intersection of data science and neuroimaging. It reflects our own experience of doing research at this intersection, and it is based on our experience working with students and collaborators who come from a variety of backgrounds and have a variety of reasons for wanting to use data science approaches in their work. The tools and ideas we chose to write about are all ones we have used in some way in our own research; many of them are tools we use on a daily basis. This was important to us for a few reasons. First, we want to teach people things that we ourselves find useful. Second, it allowed us to write the book with a focus on solving specific analysis tasks: in many of the chapters you will see that we walk you through ideas while implementing them in code and with data. We believe this is a good way to learn about data analysis, because it provides a connecting thread from scientific questions, through the data and its representation, to implementing specific answers to those questions. Finally, we find these ideas compelling and fruitful; that is why we were drawn to them in the first place. We hope that our enthusiasm for the ideas and tools described in this book will be infectious enough to convince readers of their value.

 

Difficulty level: Intermediate