This tutorial demonstrates how to get the coordinates and sequences of exons using the UCSC Genome Browser.
This tutorial will demonstrate how to locate amino acid numbers for coding genes using the UCSC Genome Browser.
This tutorial will demonstrate how to find the tables in the UCSC database that are associated with the data tracks in the Genome Browser graphical viewer.
This tutorial shows how to navigate between exons of a gene using the UCSC Genome Browser.
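The UCSC tutorials above work through the Genome Browser's graphical interface. For readers who prefer to script the same lookups, here is a minimal sketch against the public UCSC REST API at https://api.genome.ucsc.edu; the assembly (hg38), track (knownGene), and BRCA1-like region are illustrative assumptions rather than values taken from the tutorials, and the response fields follow the genePred-style layout documented for gene tracks.

```python
# Minimal sketch (not part of the tutorials) of querying the public UCSC REST API.
# Assembly, track, and coordinates below are illustrative assumptions.
import requests

API = "https://api.genome.ucsc.edu"

# List the data tracks (and hence the underlying database tables) for an assembly.
tracks = requests.get(f"{API}/list/tracks?genome=hg38").json()
print(list(tracks.keys())[:5])

# Fetch gene models over a region; exon coordinates appear in comma-separated,
# 0-based exonStarts/exonEnds fields (genePred-style; layout may vary by track).
region = requests.get(
    f"{API}/getData/track?genome=hg38;track=knownGene;chrom=chr17;start=43044295;end=43125483"
).json()
for tx in region.get("knownGene", []):
    print(tx.get("name"), tx.get("exonStarts"), tx.get("exonEnds"))

# Retrieve the DNA sequence for a sub-region (e.g. a single exon of interest).
seq = requests.get(
    f"{API}/getData/sequence?genome=hg38;chrom=chr17;start=43044295;end=43045802"
).json()
print(seq.get("dna", "")[:60])
```

For heavier queries, the same tables are also reachable through the Table Browser or UCSC's public MySQL server.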
EyeWire is a game to map the brain. Players are challenged to map branches of a neuron from one side of a cube to the other in a 3D puzzle. Players scroll through the cube and reconstruct neurons with the help of an artificial intelligence algorithm developed in the Seung Lab at Princeton University. EyeWire gameplay advances neuroscience by helping researchers discover how neurons connect to process visual information.
This session will include presentations of infrastructure, developed by members of the INCF Community, that embraces the FAIR principles. This lecture provides an overview and demo of the Canadian Open Neuroscience Platform (CONP).
This module explains how neurons come together to create the networks that give rise to our thoughts. The totality of our neurons and their connections is called our connectome. Learn how this connectome changes as we learn and how it computes information. We will also learn about physiological phenomena of the brain, such as the synchrony that gives rise to brain waves.
This video gives a short introduction to the EBRAINS data sharing platform, why it was developed, and how it contributes to open data sharing.
This video introduces the key principles for data organisation and explains how you could make your data FAIR for data sharing on EBRAINS.
This video introduces the importance of writing a Data Descriptor to accompany your dataset on EBRAINS. It gives concrete examples on what information to include and highlights how this makes your data more FAIR.
This video demonstrates how to find, access, and download data on EBRAINS.
This lecture covers advanced concepts of energy-based models. It is a part of the Advanced energy based models module of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this module include: Energy based models I, Energy based models II, Energy based models III, and Introduction to Data Science or a graduate-level machine learning course.
This tutorial covers the progression from latent-variable energy-based models (LV-EBMs) through target propagation to (vanilla, denoising, contractive, variational) autoencoders, and is a part of the Advanced energy based models module of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this module include: Energy based models I, Energy based models II, Energy based models III, Energy based models IV, and Introduction to Data Science or a graduate-level machine learning course.
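As a companion to the tutorial described above, here is a minimal sketch of latent-variable EBM inference, the starting point of the LV-EBM-to-autoencoder progression: the energy couples an observation y with a latent z, and inference minimizes the energy over z. The decoder architecture, latent size, and optimizer settings are illustrative assumptions, not the course's reference code.

```python
# Minimal latent-variable EBM sketch: E(y, z) is minimised over z to infer the
# latent and to read off the free energy F(y) = min_z E(y, z).
import torch
import torch.nn as nn

decoder = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 8))  # illustrative

def energy(y, z):
    # Squared reconstruction error as the energy; other energies are possible.
    return ((decoder(z) - y) ** 2).sum(dim=-1)

def infer_z(y, n_steps=50, lr=0.1):
    # Gradient descent in latent space: z* ~ argmin_z E(y, z).
    z = torch.zeros(y.shape[0], 2, requires_grad=True)
    opt = torch.optim.SGD([z], lr=lr)
    for _ in range(n_steps):
        opt.zero_grad()
        energy(y, z).sum().backward()
        opt.step()
    return z.detach()

y = torch.randn(4, 8)            # toy observations
z_star = infer_z(y)
print(energy(y, z_star))         # per-sample free energy F(y)
```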
This tutorial covers the concepts of autoencoders, denoising autoencoders, and variational autoencoders (VAEs) with PyTorch, as well as generative adversarial networks, with accompanying code. It is a part of the Advanced energy based models module of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this module include: Energy based models I, Energy based models II, Energy based models III, Energy based models IV, Energy based models V, and Introduction to Data Science or a graduate-level machine learning course.
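Since the tutorial above pairs VAEs with PyTorch code, a minimal VAE sketch may help orient readers before they watch it. The layer sizes, loss weighting, and toy batch below are illustrative assumptions rather than the course's implementation.

```python
# Minimal VAE sketch in PyTorch: encoder -> (mu, logvar), reparameterised sample,
# decoder, and the reconstruction + KL loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    def __init__(self, x_dim=784, h_dim=256, z_dim=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterisation trick
        return self.dec(z), mu, logvar

def vae_loss(x, x_hat, mu, logvar):
    # Reconstruction term plus KL divergence to the standard-normal prior.
    recon = F.mse_loss(x_hat, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl

model = VAE()
x = torch.rand(32, 784)                 # toy batch standing in for real images
x_hat, mu, logvar = model(x)
vae_loss(x, x_hat, mu, logvar).backward()
```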
This tutorial covers advanced concepts of energy-based models. It is a part of the Associative memories module of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.
This tutorial covers the concept of graph convolutional networks (GCNs) and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this module include: Modules 1-5 of this course and Introduction to Data Science or a graduate-level machine learning course.
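A single graph convolutional layer of the kind the tutorial's topic suggests (symmetric normalisation with self-loops, then a linear map and nonlinearity) fits in a few lines of PyTorch. The toy graph and feature dimensions below are illustrative assumptions, not material from the course.

```python
# Minimal GCN layer sketch: H' = relu(D^-1/2 (A + I) D^-1/2 H W).
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, adj):
        # Add self-loops and symmetrically normalise the adjacency matrix.
        a_hat = adj + torch.eye(adj.size(0))
        d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
        a_norm = d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)
        return torch.relu(a_norm @ self.lin(x))

# Toy graph: 4 nodes with 3-dimensional node features.
adj = torch.tensor([[0., 1., 0., 0.],
                    [1., 0., 1., 1.],
                    [0., 1., 0., 0.],
                    [0., 1., 0., 0.]])
x = torch.randn(4, 3)
print(GCNLayer(3, 8)(x, adj).shape)  # torch.Size([4, 8])
```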
This lecture covers the concept of model predictive control and is a part of the Deep Learning Course at CDS, a course that covered the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this module include: Modules 1-6 of this course and Introduction to Data Science or a graduate-level machine learning course.
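To make the idea of model predictive control concrete, here is a minimal receding-horizon sketch on a toy linear system using random shooting: at each step, candidate action sequences are rolled out through a known model, only the first action of the cheapest sequence is applied, and planning repeats. The dynamics, horizon, and cost are illustrative assumptions and are not drawn from the lecture.

```python
# Minimal random-shooting MPC sketch on a toy double-integrator system.
import torch

A = torch.tensor([[1.0, 0.1], [0.0, 1.0]])   # toy linear dynamics (assumed known)
B = torch.tensor([[0.0], [0.1]])

def rollout_cost(x0, actions):
    # actions: (n_candidates, horizon, 1); quadratic cost on state and action.
    x = x0.expand(actions.shape[0], -1)
    cost = torch.zeros(actions.shape[0])
    for t in range(actions.shape[1]):
        x = x @ A.T + actions[:, t] @ B.T
        cost += (x ** 2).sum(dim=1) + 0.01 * (actions[:, t] ** 2).sum(dim=1)
    return cost

def mpc_step(x0, horizon=15, n_candidates=256):
    # Sample candidate action sequences and keep only the first action of the best one.
    candidates = torch.randn(n_candidates, horizon, 1)
    best = rollout_cost(x0, candidates).argmin()
    return candidates[best, 0]

x = torch.tensor([1.0, 0.0])
for _ in range(50):
    u = mpc_step(x)          # plan over the horizon...
    x = A @ x + B @ u        # ...but apply only the first action, then re-plan
print(x)                     # the state should be driven roughly towards the origin
```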