This module covers fMRI data, including creating and interpreting flatmaps, exploring variability and average responses, and visual eccentricity. You will learn about processing BOLD signals, trial-averaging, and t-tests. The MATLAB code introduces data animations, multicolor visualizations, and linear indexing.

Difficulty level: Intermediate
Duration: 12:52
Speaker: Mike X. Cohen
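
As a rough illustration of the trial-averaging, t-tests, and linear indexing mentioned above, here is a minimal MATLAB sketch on simulated data; the variable names (e.g. bold) and the "stimulus" window are assumptions for illustration, not the module's actual code, and ttest requires the Statistics and Machine Learning Toolbox:

    % Minimal sketch with simulated data; 'bold' is a hypothetical trials-by-timepoints matrix.
    nTrials = 30; nTime = 120;
    bold = randn(nTrials, nTime);                    % stand-in for single-voxel BOLD responses
    bold(:, 41:80) = bold(:, 41:80) + 1;             % add a response in an assumed stimulus window
    trialAvg = mean(bold, 1);                        % trial-averaged time course
    [h, p] = ttest(mean(bold(:, 41:80), 2));         % t-test of per-trial window means against zero
    peakIdx = find(trialAvg == max(trialAvg), 1);    % linear index of the peak sample
    plot(trialAvg), hold on, plot(peakIdx, trialAvg(peakIdx), 'ro')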

This module covers fMRI data, including creating and interpreting flatmaps, exploring variability and average responses, and visual eccentricity. You will learn about processing BOLD signals, trial-averaging, and t-tests. The MATLAB code introduces data animations, multicolor visualizations, and linear indexing.

Difficulty level: Intermediate
Duration: 13:39
Speaker: Mike X. Cohen

This module covers fMRI data, including creating and interpreting flatmaps, exploring variability and average responses, and visual eccentricity. You will learn about processing BOLD signals, trial-averaging, and t-tests. The MATLAB code introduces data animations, multicolor visualizations, and linear indexing.

Difficulty level: Intermediate
Duration: 17:54
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 5:02
Speaker: Mike X. Cohen
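
As a rough sketch of the pipeline named above (removing background "blur," finding cells by thresholded spatial contiguity, and PCA), the MATLAB lines below run on simulated data; the variable names, parameter values, and the use of imgaussfilt, bwconncomp, and pca (Image Processing and Statistics Toolboxes) are assumptions for illustration, not the course's actual code:

    % Minimal sketch with simulated data; 'frames' is a hypothetical height-by-width-by-time movie.
    frames = rand(64, 64, 500);
    frames(20:26, 20:26, :) = frames(20:26, 20:26, :) + 2;   % embed one bright "cell"
    meanImg = mean(frames, 3);                                % time-averaged image
    background = imgaussfilt(meanImg, 10);                    % estimate the smooth background "blur"
    clean = meanImg - background;                             % remove the background
    mask = clean > 2*std(clean(:));                           % threshold bright pixels
    cells = bwconncomp(mask);                                 % spatially contiguous clusters = candidate cells
    ts = reshape(frames, [], size(frames, 3));                % pixels x time
    roiTs = zeros(cells.NumObjects, size(ts, 2));
    for k = 1:cells.NumObjects
        roiTs(k, :) = mean(ts(cells.PixelIdxList{k}, :), 1);  % mean time series per candidate cell
    end
    [coeff, score] = pca(roiTs');                             % PCA across the candidate-cell time series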

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 15:01
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 5:15
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 17:08
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 11:23
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 22:41
Speaker: Mike X. Cohen

You will learn about working with calcium imaging data, including image processing to remove background "blur," identifying cells based on thresholded spatial contiguity, time series filtering, and principal components analysis (PCA). The MATLAB code shows data animations, capabilities of the image processing toolbox, and PCA.

Difficulty level: Intermediate
Duration: 17:19
Speaker: Mike X. Cohen

This tutorial was part of the 2018 Neurohackademy, a 2-week hands-on summer institute in neuroimaging and data science held at the University of Washington eScience Institute.

Difficulty level: Intermediate
Duration: 1:26:02
Speaker: Ariel Rokem

This lecture on multi-scale entropy by Jil Meier is part of the TVB Node 10 series, a 4-day workshop dedicated to learning about The Virtual Brain (TVB), a full brain simulation platform, covering brain imaging, brain simulation, personalised brain models, and TVB use cases.

Difficulty level: Intermediate
Duration: 39:05
Speaker: Jil Meier

In this panel discussion, leading scientists, engineers, and philosophers discuss what brain-computer interfaces are and the unique scientific and ethical challenges they pose. The discussion is hosted by Lynne Malcolm from ABC Radio National's All in the Mind program and features:

  • Dr Hannah Maslen, Deputy Director, Oxford Uehiro Centre for Practical Ethics, University of Oxford
  • Prof Eric Racine, Director, Pragmatic Health Ethics Research Unit, Montreal Institute of Clinical Research
  • Prof Jeffrey Rosenfeld, Director, Monash Institute of Medical Engineering, Monash University
  • Dr Isabell Kiral-Kornek, AI and Life Sciences Researcher, IBM Research
  • A/Prof Adrian Carter, Neuroethics Program Coordinator, ARC Centre of Excellence for Integrative Brain Function

Difficulty level: Intermediate
Duration: 1:14:34

A panel of experts discusses the virtues and risks of our digital health data being captured and used by others in the age of Facebook, metadata retention laws, Cambridge Analytica, and rapidly evolving neuroscience. The discussion was moderated by Jon Faine, ABC Radio presenter. The panelists were:

  • Mr Sven Bluemmel, Victorian Information Commissioner
  • Prof Judy Illes, Neuroethics Canada, University of British Columbia, Order of Canada
  • Prof Mark Andrejevic, Professor of Media Studies, Monash University
  • Ms Vrinda Edan, Chief Operating Officer, Victorian Mental Illness Awareness Council

Difficulty level: Intermediate
Duration: 1:10:30

DAQCORD is a framework for the design, documentation, and reporting of data curation methods, intended to advance the scientific rigour, reproducibility, and analysis of data. This lecture covers the rationale for developing the framework and the process by which it was developed, and ends with a presentation of the framework itself. While the driving use case for DAQCORD was clinical traumatic brain injury research, the framework is applicable to clinical studies in other domains of clinical neuroscience research.

Difficulty level: Intermediate
Duration: 17:08
Speaker: Ari Ercole