
The workshop was designed to introduce all aspects of using Miniscopes, including basic principles of Miniscope design and imaging, how to build and attach a Miniscope, how to implant a GRIN lens for imaging deep structures, and how to analyze imaging data. It also covered the most recent developments in Miniscope technology and highlighted some of the best advances in this exciting and growing field. The event was organized by Daniel Aharoni, Denise Cai, and Tristan Shuman, and it was hosted at MetaCell's Workspace for Calcium Imaging Analysis.

Difficulty level: Beginner
Duration: 16:23

This lesson is an overview of the Miniscope project. It explains why Miniscopes were developed, how they have evolved, why they may be useful for researchers, and the differences between previous and current versions. While most directly applicable to the UCLA Miniscope project, this information applies to most mainstream miniature microscopes, including both open-source and commercially available models.

Difficulty level: Beginner
Duration: 42:15
Speaker: Daniel Aharoni

This lesson will go through the theory and practical techniques for implanting a GRIN lens for imaging in mice.

Difficulty level: Beginner
Duration: 1:00:40

Learn how to build a Miniscope and stream data, including an overview of the software involved.

Difficulty level: Beginner
Duration: 1:04:28

An introduction to data management, manipulation, visualization, and analysis for neuroscience. Students will learn scientific programming in Python, and use this to work with example data from areas such as cognitive-behavioral research, single-cell recording, EEG, and structural and functional MRI. Basic signal processing techniques including filtering are covered. The course includes a Jupyter Notebook and video tutorials.


Difficulty level: Beginner
Duration: 1:09:16
Speaker: Aaron J. Newman
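For a flavor of the signal-processing content covered here, the sketch below band-pass filters a synthetic signal with SciPy; the sampling rate, cutoff frequencies, and signal are illustrative choices, not taken from the course materials.

```python
import numpy as np
from scipy import signal

fs = 1000.0                      # sampling rate in Hz (illustrative)
t = np.arange(0, 2.0, 1.0 / fs)  # two seconds of samples
# synthetic signal: a 10 Hz oscillation plus 60 Hz line noise
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)

# 4th-order Butterworth band-pass between 1 and 30 Hz, applied with zero phase shift
sos = signal.butter(4, [1, 30], btype="bandpass", fs=fs, output="sos")
x_filt = signal.sosfiltfilt(sos, x)
```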

Since their introduction in 2016, the FAIR data principles have gained increasing recognition and adoption in global neuroscience. FAIR defines a set of high-level principles and practices for making digital objects, including data, software, and workflows, Findable, Accessible, Interoperable, and Reusable. But FAIR is not a specification; it leaves many of the specifics up to individual scientific disciplines to define. INCF has been leading the way in promoting, defining, and implementing FAIR data practices for neuroscience. We have been bringing together researchers, infrastructure providers, industry, and publishers through our programs and networks. In this session, we will hear perspectives on FAIR neuroscience from some of these stakeholders who have been working to develop and use FAIR tools for neuroscience. We will engage in a discussion of questions such as: How is neuroscience doing with respect to FAIR? What have been the successes? What is currently very difficult? Where does neuroscience need to go?


This lecture covers FAIR atlases: their background, their construction, and how they can be created in line with the FAIR principles.

Difficulty level: Beginner
Duration: 14:24
Speaker: Heidi Kleven

Learn how to build and share extensions in NWB.

Difficulty level: Advanced
Duration: 20:29
Speaker: Ryan Ly
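As a rough orientation to what building an extension involves, here is a minimal sketch using PyNWB's spec API; the extension name (ndx-example), the new type, its attribute, and the author details are all hypothetical.

```python
from pynwb.spec import NWBAttributeSpec, NWBGroupSpec, NWBNamespaceBuilder

# describe the new namespace (name, version, and contact details are hypothetical)
ns_builder = NWBNamespaceBuilder(
    doc="Example extension for surgery metadata",
    name="ndx-example",
    version="0.1.0",
    author="Jane Doe",
    contact="jane.doe@example.org",
)
ns_builder.include_type("LabMetaData", namespace="core")

# define a new neurodata type that extends the core LabMetaData type
surgery_meta = NWBGroupSpec(
    neurodata_type_def="SurgeryMetaData",
    neurodata_type_inc="LabMetaData",
    doc="Metadata about a surgical procedure",
    attributes=[
        NWBAttributeSpec(name="surgery_date", doc="Date of the surgery", dtype="text"),
    ],
)

# write the extension and namespace YAML files that can be shared with others
ns_builder.add_spec("ndx-example.extensions.yaml", surgery_meta)
ns_builder.export("ndx-example.namespace.yaml")
```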

Learn how to build custom APIs for extensions.

Difficulty level: Advanced
Duration: 25:40
Speaker: Andrew Tritt
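To illustrate what a custom API for an extension means in practice, the sketch below loads the hypothetical namespace from the previous example and lets PyNWB auto-generate a container class for the new type; a hand-written class registered with @register_class gives finer control, which is the focus of this lesson.

```python
from pynwb import get_class, load_namespaces

# load the extension namespace file (hypothetical name from the previous sketch)
load_namespaces("ndx-example.namespace.yaml")

# auto-generate a PyNWB container class from the spec; its constructor arguments
# are derived from the attributes defined in the extension
SurgeryMetaData = get_class("SurgeryMetaData", "ndx-example")
meta = SurgeryMetaData(name="surgery_metadata", surgery_date="2021-05-01")
```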

Learn how to handle writing very large data in PyNWB.

Difficulty level: Advanced
Duration: 26:50
Speaker: Andrew Tritt
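One common pattern this lesson covers is streaming data into the file rather than holding it all in memory. The sketch below combines DataChunkIterator (iterative write) with H5DataIO (chunked, gzip-compressed storage); the generator, shapes, rates, and file names are made up for the example.

```python
from datetime import datetime, timezone

import numpy as np
from hdmf.backends.hdf5.h5_utils import H5DataIO
from hdmf.data_utils import DataChunkIterator
from pynwb import NWBFile, NWBHDF5IO, TimeSeries

def iter_frames(n_frames=100_000, n_channels=64):
    # hypothetical generator yielding one 64-channel sample at a time,
    # so the full recording never has to sit in memory
    for _ in range(n_frames):
        yield np.random.randn(n_channels).astype("float32")

# DataChunkIterator stacks the yielded samples along the time axis as they arrive;
# H5DataIO enables chunked, gzip-compressed storage in the HDF5 file
wrapped = H5DataIO(
    DataChunkIterator(data=iter_frames(), buffer_size=10_000),
    compression="gzip",
    chunks=True,
)

nwbfile = NWBFile(
    session_description="large-data demo",
    identifier="demo-001",
    session_start_time=datetime.now(timezone.utc),
)
nwbfile.add_acquisition(
    TimeSeries(name="raw", data=wrapped, unit="volts", rate=30000.0, starting_time=0.0)
)

with NWBHDF5IO("large_data.nwb", mode="w") as io:
    io.write(nwbfile)
```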

Learn how to handle writing very large data in MatNWB.

Difficulty level: Advanced
Duration: 16:18
Speaker: Ben Dichter

This video explains what metadata is, why it is important, and how you can organise your metadata to increase the FAIRness of your data on EBRAINS.

Difficulty level: Beginner
Duration: 17:23
Speaker: Ulrike Schlegel

Elizabeth DuPre reviews standards for project management and organization, including their motivation in view of the FAIR principles and improved reproducibility.

Difficulty level: Beginner
Duration: 1:08:34
Speaker: Elizabeth DuPre

This lecture covers post-war developments in the science of the mind, focusing first on the cognitive revolution and concluding with living machines.

Difficulty level: Beginner
Duration: 2:24:35

This lecture provides an overview of depression (epidemiology and course of the disorder), clinical presentation, somatic co-morbidity, and treatment options.

Difficulty level: Beginner
Duration: 37:51

This lecture presents NWB, an ecosystem for neurophysiology data standardization.

Difficulty level: Beginner
Duration: 29:53
Speaker: Oliver Ruebel

This lecture discusses the FAIR principles as they apply to electrophysiology data and metadata, the building blocks for community tools and standards, platforms and grassroots initiatives, and the challenges therein.

Difficulty level: Beginner
Duration: 8:11
Speaker: Thomas Wachtler

This lecture provides an overview of electrophysiology data reuse within the EBRAINS ecosystem.

Difficulty level: Beginner
Duration: 15:57
Speaker: Andrew Davison

This lecture provides an overview of the Distributed Archives for Neurophysiology Data Integration (DANDI) archive, its ties to FAIR and open source, integrations with other programs, and upcoming features.

Difficulty level: Beginner
Duration: 13:34
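As one concrete example of the kind of programmatic access DANDI supports, the sketch below lists the assets of a public dandiset with the DANDI Python client; the dandiset ID used here is arbitrary.

```python
from dandi.dandiapi import DandiAPIClient

# connect anonymously to the public DANDI API and list one dandiset's files
# (the ID "000006" is just an example)
with DandiAPIClient() as client:
    dandiset = client.get_dandiset("000006", "draft")
    for asset in dandiset.get_assets():
        print(asset.path, asset.size)
```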

This lecture provides an overview of the Australian Electrophysiology Data Analytics Platform (AEDAPT), how it works, how to scale it, and how it fits into the FAIR ecosystem.

Difficulty level: Beginner
Duration: 18:56
Speaker: Tom Johnstone

This lecture discusses how to standardize electrophysiology data organization to move towards being more FAIR.

Difficulty level: Beginner
Duration: 15:51