
This lesson discusses FAIR principles and methods currently in development for assessing FAIRness.

Difficulty level: Beginner
Duration:
Speaker: Michel Dumontier

This opening lecture from INCF's Short Course in Neuroinformatics provides an overview of the field of neuroinformatics itself and lays out an argument for the necessity of developing more sophisticated approaches to FAIR data management in neuroscience.

Difficulty level: Beginner
Duration: 1:19:14
Speaker: Maryann Martone

This lesson continues from part one of the lecture Ontologies, Databases, and Standards, diving deeper into a description of ontologies and knowledge graphs.

Difficulty level: Intermediate
Duration: 50:18
Speaker: Jeff Grethe

This lesson aims to define computational neuroscience in general terms, while providing specific examples of highly successful computational neuroscience projects. 

Difficulty level: Beginner
Duration: 59:21
Speaker: Alla Borisyuk

This lecture covers a wide range of aspects regarding neuroinformatics and data governance, describing both their historical developments and current trajectories. Particular tools, platforms, and standards to make your research more FAIR are also discussed.

Difficulty level: Beginner
Duration: 54:58
Speaker: Franco Pestilli

Introduction to the Foundations of Machine Learning in Python course - Day 01.

High-Performance Computing and Analytics Lab, University of Bonn

Difficulty level: Beginner
Duration: 35:24
Speaker: Elena Trunz

Presented by the OHBM OpenScienceSIG, this lesson covers how containers can be useful for running the same software on different platforms and sharing analysis pipelines with other researchers.

Difficulty level: Beginner
Duration: 1:21:59

This lecture describes how to build research workflows, including a demonstration of using DataJoint Elements to build data pipelines.

Difficulty level: Intermediate
Duration: 47:00
Speaker: Dimitri Yatsenko

This lesson gives an introductory presentation on how data science can help with scientific reproducibility.

Difficulty level: Beginner
Duration:
Speaker: Michel Dumontier

This lecture discusses how FAIR practices affect personalized data models, including workflows, challenges, and how to improve these practices.

Difficulty level: Beginner
Duration: 13:16
Speaker: Kelly Shen

This lecture covers how to make modeling workflows FAIR by working through a practical example, dissecting the steps within the workflow, and detailing the tools and resources used at each step.

Difficulty level: Beginner
Duration: 15:14

This is the introductory module to the Deep Learning Course at NYU's Center for Data Science (CDS), a course covering the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.

Difficulty level: Intermediate
Duration: 50:17

This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:51:03
Speaker: Yann LeCun

This lesson provides a detailed description of some of the modules and architectures involved in the development of neural networks. 

Difficulty level: Intermediate
Duration: 1:42:26

This lecture covers the concept of parameter sharing: recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:59:47

This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 51:40
Speaker: Yann LeCun

This lecture discusses the properties of natural signals and convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:09:12
Speaker: Alfredo Canziani

This lecture covers the concept of recurrent neural networks: vanilla and gated (LSTM) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:05:36
Speaker: Alfredo Canziani

This lecture is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent variable energy-based models (LV-EBMs), and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:51:30
Speaker: Yann LeCun

This lecture covers the concept of inference in latent variable energy-based models (LV-EBMs) and is a part of the Deep Learning Course at NYU's Center for Data Science.

Difficulty level: Intermediate
Duration: 1:01:04
Speaker: Alfredo Canziani