This lecture and tutorial focus on measuring human functional brain networks and on how to account for the inherent variability within those networks.
This lecture presents an overview of functional brain parcellations, as well as a set of tutorials on bootstrap aggregation of stable clusters (BASC) for fMRI brain parcellation.
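For orientation, the sketch below illustrates the bootstrap-aggregation idea behind BASC: cluster bootstrap replicates of the data, accumulate a region-by-region stability matrix, and cluster that matrix to obtain a consensus parcellation. It is not the actual BASC implementation; it assumes toy random data, uses scikit-learn's KMeans in place of BASC's clustering, and resamples time points uniformly rather than with the circular block bootstrap used in practice.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_regions, n_timepoints, n_clusters, n_bootstraps = 50, 200, 5, 100

# Toy region-by-time matrix standing in for preprocessed fMRI data
data = rng.standard_normal((n_regions, n_timepoints))

# Count how often each pair of regions falls in the same cluster
stability = np.zeros((n_regions, n_regions))
for _ in range(n_bootstraps):
    # Resample time points with replacement (real BASC uses a circular block bootstrap)
    sample = data[:, rng.integers(0, n_timepoints, n_timepoints)]
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(sample)
    stability += labels[:, None] == labels[None, :]
stability /= n_bootstraps

# Consensus parcellation: cluster the stability matrix itself
consensus = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(stability)
print(consensus)
```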
Neuronify is an educational tool meant to build intuition for how neurons and neural networks behave. You can use it to combine neurons with different connections, just like the ones we have in our brain, and explore how changes in single cells lead to behavioral changes in important networks. Neuronify is based on an integrate-and-fire model of neurons, one of the simplest neuron models in existence. It focuses on the spike timing of a neuron and ignores the details of the action potential dynamics. These neurons are modeled as simple RC circuits: when the membrane potential rises above a certain threshold, a spike is generated and the voltage is reset to its resting potential. This spike then signals other neurons through its synapses.
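As an illustration of that mechanism, here is a minimal leaky integrate-and-fire sketch in Python. Neuronify itself is a separate graphical application, and the parameter values below are arbitrary assumptions chosen for the example.

```python
# Minimal leaky integrate-and-fire neuron: RC-circuit membrane dynamics,
# a threshold crossing generates a spike, and the voltage resets to rest.
# All parameter values are illustrative assumptions.
dt = 0.1e-3           # time step (s)
tau = 20e-3           # membrane time constant R*C (s)
v_rest = -70e-3       # resting potential (V)
v_threshold = -50e-3  # spike threshold (V)
r_m = 100e6           # membrane resistance (ohm)
i_input = 0.25e-9     # constant input current (A)

v = v_rest
spike_times = []
for step in range(int(0.5 / dt)):          # simulate roughly 500 ms
    # RC dynamics: dV/dt = (-(V - V_rest) + R*I) / tau
    v += dt * (-(v - v_rest) + r_m * i_input) / tau
    if v >= v_threshold:
        spike_times.append(step * dt)      # record the spike time
        v = v_rest                         # reset to resting potential

print(f"{len(spike_times)} spikes; first at {spike_times[0] * 1e3:.1f} ms")
```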
Neuronify aims to provide a low entry point to simulation-based neuroscience.
Maximize Your Research With Cloud Workspaces is a panel discussion aimed at researchers looking for innovative ways to set up and execute their life science data analyses in a collaborative, extensible, open-source cloud environment. It is brought to you by MetaCell and scientists from leading universities, who share their experiences of advanced analysis and collaborative learning in the cloud.
In this lecture, you will learn about virtual research environments (VREs), i.e., a computing platform and the software stack behind it, their technical limitations, and the security measures that should be considered during implementation.
This lecture provides a detailed description of how to process workflows in the virtual research environment (VRE), including approaches to standardization, metadata, containerization, and constructing and maintaining scientific pipelines.
This lesson provides an overview of how to conceptualize, design, implement, and maintain neuroscientific pipelines via the cloud-based computational reproducibility platform Code Ocean.
In this workshop talk, you will receive a tour of the Code Ocean ScienceOps Platform, a centralized cloud workspace for all teams.
This lecture covers a wide range of aspects regarding neuroinformatics and data governance, describing both their historical developments and current trajectories. Particular tools, platforms, and standards to make your research more FAIR are also discussed.
This lecture introduces you to the basics of the Amazon Web Services public cloud. It covers the fundamentals of cloud computing and goes through both the motivations and processes involved in moving your research computing to the cloud.
This lecture discusses how FAIR practices affect personalized data models, including workflows, challenges, and how to improve these practices.
In this talk, you will learn how brainlife.io works, and how it can be applied to neuroscience data.
As a part of NeuroHackademy 2020, this lecture delves into cloud computing, focusing on Amazon Web Services.
Overview of the content for Day 1 of this course.
Overview of Day 2 of this course.
This talk covers best practices: tips and tricks for getting your Miniscope to work and getting your experiments off the ground.
This talk compares various sensors and resolutions for in vivo neural recordings.
This talk delves into challenges and opportunities of Miniscope design, seeking the optimal balance between scale and function.
Attendees of this talk will learn about computational imaging systems and associated pipelines, as well as open-source software solutions supporting miniscope use.
This lecture introduces neuroscience concepts and methods such as fMRI, visual responses in BOLD data, and the eccentricity of visual receptive fields.