This lecture provides an overview of the China-Cuba-Canada neuroinformatics ecosystem for Quantitative Tomographic EEG Analysis (qEEGt).
In this lesson, users learn about the need for more large-scale, transparent collaborative science and follow a tutorial on using Synapse to facilitate reusable and reproducible research.
This lecture discusses what defines an integrative approach to research and methods, including the study designs and models best suited to bridging data domains, a necessity for whole-person modelling.
Similarity Network Fusion (SNF) is a computational method for data integration across various kinds of measurements, aimed at taking advantage of the common as well as complementary information in different data types. This workshop walks participants through running SNF on EEG and genomic data using RStudio.
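As a rough illustration of what the method does (the workshop itself uses the SNFtool package in RStudio), the following minimal numpy sketch implements a simplified version of the SNF fusion update on toy stand-ins for per-subject EEG and genomic feature matrices. The variable names, kernel parameters, and data shapes are illustrative assumptions, not the workshop's actual code.

```python
import numpy as np

def affinity(X, mu=0.5, K=20):
    """Scaled exponential similarity kernel from pairwise Euclidean distances."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    d = np.sqrt(d2)
    # local scale: mean distance to the K nearest neighbours (self excluded)
    mu_knn = np.sort(d, axis=1)[:, 1:K + 1].mean(1, keepdims=True)
    eps = (mu_knn + mu_knn.T + d) / 3
    return np.exp(-d2 / (2 * (mu * eps) ** 2))

def knn_kernel(W, K=20):
    """Sparse local kernel: keep each row's K strongest ties, then row-normalise.
    (Self-affinity is kept here for simplicity.)"""
    S = np.zeros_like(W)
    idx = np.argsort(-W, axis=1)[:, :K]
    rows = np.arange(W.shape[0])[:, None]
    S[rows, idx] = W[rows, idx]
    return S / S.sum(1, keepdims=True)

def snf(Ws, K=20, t=20):
    """Iteratively diffuse each view's full kernel through the other views."""
    Ps = [W / W.sum(1, keepdims=True) for W in Ws]
    Ss = [knn_kernel(W, K) for W in Ws]
    for _ in range(t):
        Ps = [S @ (sum(Ps[u] for u in range(len(Ps)) if u != v) / (len(Ps) - 1)) @ S.T
              for v, S in enumerate(Ss)]
        Ps = [(P + P.T) / 2 for P in Ps]   # keep the kernels symmetric
    return sum(Ps) / len(Ps)

# toy stand-ins for per-subject EEG features and genomic features
rng = np.random.default_rng(0)
eeg, genes = rng.normal(size=(50, 40)), rng.normal(size=(50, 200))
fused = snf([affinity(eeg), affinity(genes)])  # 50 x 50 fused patient network
```

The fused network can then be fed to a clustering step (the SNFtool workflow uses spectral clustering) to find patient subgroups supported by both data types.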
This lesson provides an introduction to the International Neuroinformatics Coordinating Facility (INCF), its mission towards FAIR neuroscience, and future directions.
This brief video provides an introduction to the third session of INCF's Neuroinformatics Assembly 2023, focusing on how to streamline cross-platform data integration in a neuroscientific context.
This final lesson of the course consists of the panel discussion of the Streamlining Cross-Platform Data Integration session, held during the first day of INCF's Neuroinformatics Assembly 2023.
This lightning talk describes the heterogeneity of the MR field regarding types of scanners, data formats, protocols, and software/hardware versions, as well as the challenges and opportunities for unifying these datasets in a common interface, MRdataset.
This session covers the framework of the International Brain Lab (IBL) and the data architecture used for this project.
This is the introductory module of the Deep Learning Course at NYU's Center for Data Science (CDS), a course covering the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.
This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.
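As a minimal illustration of the two concepts, the sketch below computes a squared-error loss for a single affine neuron, backpropagates with PyTorch's autograd, checks the result against the hand-derived gradients, and takes one gradient-descent step. The values and learning rate are arbitrary, not taken from the module.

```python
import torch

# a single affine neuron: y_hat = w * x + b, with squared-error loss
x = torch.tensor(2.0)
y = torch.tensor(3.0)
w = torch.tensor(0.5, requires_grad=True)
b = torch.tensor(0.0, requires_grad=True)

loss = (w * x + b - y) ** 2
loss.backward()  # backpropagation: reverse-mode autodiff through the graph

# analytic gradients: dL/dw = 2 (w x + b - y) x,  dL/db = 2 (w x + b - y)
print(w.grad.item(), 2 * (0.5 * 2.0 + 0.0 - 3.0) * 2.0)  # -8.0 -8.0
print(b.grad.item(), 2 * (0.5 * 2.0 + 0.0 - 3.0))        # -4.0 -4.0

# one gradient-descent step: theta <- theta - lr * dL/dtheta
with torch.no_grad():
    lr = 0.1
    w -= lr * w.grad
    b -= lr * b.grad
```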
This lecture covers concepts associated with neural nets, including rotation and squashing, and is a part of the Deep Learning Course at New York University's Center for Data Science (CDS).
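A brief sketch of the "rotation and squashing" picture, under the usual interpretation that the affine layer rotates, scales, and shifts the input space while the pointwise nonlinearity squashes it into a bounded range; the layer sizes here are arbitrary.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(5, 2)     # five 2-D points

linear = nn.Linear(2, 2)  # affine map W x + b: rotates, scales, shears, shifts
h = linear(x)
y = torch.tanh(h)         # pointwise squashing into (-1, 1)

print(h.abs().max())      # unbounded in general
print(y.abs().max())      # always < 1 after squashing
```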
This lesson provides a detailed description of some of the modules and architectures involved in the development of neural networks.
This lecture covers neural net training (tools, classification with neural nets, and PyTorch implementation) and is a part of the Deep Learning Course at NYU's Center for Data Science.
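A minimal sketch of the kind of PyTorch training loop the lecture implements, using random stand-in data; the architecture, sizes, and hyperparameters are illustrative assumptions only.

```python
import torch
import torch.nn as nn

# a small classifier trained on random stand-in data
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 3))
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

X = torch.randn(128, 20)          # stand-in features
y = torch.randint(0, 3, (128,))   # stand-in class labels

for epoch in range(10):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = criterion(model(X), y)  # forward pass + loss
    loss.backward()                # backpropagate
    optimizer.step()               # update the parameters
```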
This lecture covers parameter sharing in recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.
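A short sketch of the shared idea: a convolution reuses one kernel at every position, and a recurrent net reuses one weight matrix at every time step, so parameter count is independent of input length. All sizes below are arbitrary.

```python
import torch
import torch.nn as nn

# convolution: one 3-tap kernel is reused at every position of the signal
conv = nn.Conv1d(in_channels=1, out_channels=4, kernel_size=3)
print(sum(p.numel() for p in conv.parameters()))  # 16 params (4*1*3 + 4), regardless of input length

# recurrence: one weight matrix is reused at every time step
rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
short = torch.randn(2, 5, 8)   # 5 time steps
long = torch.randn(2, 50, 8)   # 50 time steps, same parameters
out_s, _ = rnn(short)
out_l, _ = rnn(long)           # no new parameters needed for longer sequences
```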
This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.
This lecture discusses the properties of natural signals and convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.
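A minimal sketch connecting the properties of natural signals (locality, stationarity, compositionality) to a practical convolutional net: small kernels for locality, shared weights for stationarity, stacked conv/pool stages for hierarchy. The architecture and 28x28 input size are illustrative assumptions.

```python
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),  # local 3x3 receptive fields
    nn.ReLU(),
    nn.MaxPool2d(2),                             # spatial downsampling
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),                   # classifier head for 28x28 inputs
)
x = torch.randn(8, 1, 28, 28)  # a batch of stand-in 28x28 grayscale images
print(net(x).shape)            # torch.Size([8, 10])
```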
This lecture covers the concept of recurrent neural networks: vanilla and gated (LSTM) and is a part of the Deep Learning Course at NYU's Center for Data Science.
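A brief sketch contrasting the two variants in PyTorch, with arbitrary sizes: a vanilla RNN applies one tanh recurrence per step, while the LSTM adds gates and a cell state that help with vanishing gradients over long sequences.

```python
import torch
import torch.nn as nn

x = torch.randn(4, 30, 8)  # batch of 4 sequences, 30 steps, 8 features

# vanilla RNN: h_t = tanh(W_ih x_t + W_hh h_{t-1} + biases)
rnn = nn.RNN(input_size=8, hidden_size=16, batch_first=True)
out, h_n = rnn(x)          # out: (4, 30, 16); h_n: final hidden state

# gated (LSTM): input/forget/output gates plus a cell state
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)
out, (h_n, c_n) = lstm(x)  # the LSTM also carries a cell state c_n
```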
This is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.
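A minimal sketch of the joint-embedding idea: two encoders map x and y into a shared space, and the energy is the distance between the embeddings, low for compatible pairs. The encoders, sizes, and squared-distance energy here are illustrative assumptions, not the lecture's specific model.

```python
import torch
import torch.nn as nn

# joint-embedding energy: E(x, y) = || f(x) - g(y) ||^2
f = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 8))  # embeds x
g = nn.Sequential(nn.Linear(6, 32), nn.ReLU(), nn.Linear(32, 8))   # embeds y

def energy(x, y):
    return ((f(x) - g(y)) ** 2).sum(dim=-1)  # one scalar energy per pair

x = torch.randn(5, 10)
y = torch.randn(5, 6)
print(energy(x, y))  # five energies; training pushes these down for true pairs
```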
This lecture covers the concept of inference in latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.
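A hedged sketch of what inference in an LV-EBM looks like: given an observation y, find the latent z that minimizes the energy E(y, z) by gradient descent in z, with the minimum approximating the free energy F(y). The decoder, reconstruction energy, and optimization settings below are illustrative assumptions.

```python
import torch
import torch.nn as nn

# latent-variable EBM: E(y, z) = || y - D(z) ||^2 with a decoder D;
# inference computes F(y) = min_z E(y, z) by descending the energy in z
decoder = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 10))
for p in decoder.parameters():
    p.requires_grad_(False)  # inference optimizes z only, not the model

def infer(y, steps=100, lr=0.1):
    z = torch.zeros(y.shape[0], 4, requires_grad=True)  # initial latent
    opt = torch.optim.SGD([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        energy = ((y - decoder(z)) ** 2).sum()  # E(y, z)
        energy.backward()                       # gradient w.r.t. z only
        opt.step()
    return z.detach(), energy.item()            # z* and F(y) ~ E(y, z*)

y = torch.randn(3, 10)
z_star, free_energy = infer(y)
```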