This lesson continues from part one of the lecture Ontologies, Databases, and Standards, diving deeper into a description of ontologies and knowledge graphs.
In this lecture, the speaker demonstrates Neurokernel's module interfacing feature by using it to integrate independently developed models of olfactory and vision LPUs based upon experimentally obtained connectivity information.
This lecture highlights the importance of correct anatomical annotation and location assignment, and of up-to-date atlas resources, in avoiding errors in navigation and data interpretation.
We are at the exciting technological stage where it has become feasible to represent the anatomy of an entire human brain at the cellular level. This lecture discusses how neuroanatomy in the 21st century has become an effort towards the virtualization and standardization of brain tissue.
This lecture covers essential features of digital brain models for neuroinformatics, particularly NeuroMaps.
This presentation covers the neuroinformatics tools and techniques behind the Allen Institute's atlases (the adult mouse, developing mouse, and mouse connectivity atlases) and how they relate to neuroanatomy.
This tutorial covers the fundamentals of collaborating with Git and GitHub.
This lecture and tutorial focus on measuring human functional brain networks, as well as how to account for inherent variability within those networks.
This lesson provides an overview of Jupyter notebooks, JupyterLab, and Binder, as well as their applications within the field of neuroimaging, particularly during the writing phase of your research.
This lecture gives an overview of how to prepare and preprocess neuroimaging (EEG/MEG) data for use in TVB.
In this panel discussion, leading scientists, engineers, and philosophers discuss what brain-computer interfaces are and the unique scientific and ethical challenges they pose. The discussion is hosted by Lynne Malcolm from ABC Radio National's All in the Mind program and features:
A panel of experts discusses the virtues and risks of our digital health data being captured and used by others in the age of Facebook, metadata retention laws, Cambridge Analytica, and a rapidly evolving neuroscience. The discussion was moderated by Jon Faine, ABC Radio presenter. The panelists were:
This is the introductory module of the Deep Learning Course at NYU's Center for Data Science (CDS). The course covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition.
This module covers the concepts of gradient descent and the backpropagation algorithm and is a part of the Deep Learning Course at NYU's Center for Data Science.
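The two ideas named above can be sketched in a few lines of NumPy. This is a minimal illustration, not material from the lecture itself: plain gradient descent on a one-parameter quadratic, and one backpropagation step through a tiny one-hidden-layer network (the data, shapes, and learning rates are all assumptions chosen for the demo).

```python
import numpy as np

def gradient_descent(lr=0.1, steps=100):
    """Minimize f(w) = (w - 3)^2 by repeatedly stepping downhill."""
    w = 0.0
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)  # analytic gradient f'(w) = 2(w - 3)
        w -= lr * grad          # gradient-descent update
    return w

def backprop_step(x, y, W1, W2, lr=0.1):
    """One forward/backward pass through h = tanh(W1 x), yhat = W2 h,
    with squared-error loss, applying the chain rule by hand."""
    h = np.tanh(W1 @ x)
    yhat = W2 @ h
    loss = 0.5 * float(np.sum((yhat - y) ** 2))
    # Backward pass (chain rule), layer by layer:
    dyhat = yhat - y                        # dL/dyhat
    dW2 = np.outer(dyhat, h)                # dL/dW2
    dh = W2.T @ dyhat                       # dL/dh
    dW1 = np.outer(dh * (1 - h ** 2), x)    # tanh'(z) = 1 - tanh(z)^2
    return loss, W1 - lr * dW1, W2 - lr * dW2

w_star = gradient_descent()  # converges toward the minimizer w = 3

# A few backprop steps on one (hypothetical) training example:
rng = np.random.default_rng(0)
W1 = 0.5 * rng.normal(size=(4, 3))
W2 = 0.5 * rng.normal(size=(1, 4))
x, y = np.array([1.0, -0.5, 2.0]), np.array([0.7])
losses = []
for _ in range(50):
    loss, W1, W2 = backprop_step(x, y, W1, W2)
    losses.append(loss)
```

The loop mirrors what an autograd framework automates: each `d…` variable is the derivative of the loss with respect to that quantity, computed from the layer above.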
This lecture covers concepts associated with neural nets, including rotation and squashing, and is a part of the Deep Learning Course at New York University's Center for Data Science (CDS).
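The "rotation and squashing" picture of a layer can be made concrete with a small sketch (my illustration, not the lecture's code): the linear part re-orients the input space, here literally a 2-D rotation, and a pointwise nonlinearity such as tanh squashes every coordinate into (-1, 1).

```python
import numpy as np

# "Rotation": a linear map, here a pure 2-D rotation by 45 degrees.
theta = np.pi / 4
W = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def layer(x):
    # A neural-net layer as rotate-then-squash: the linear part
    # re-orients space, tanh squashes each coordinate into (-1, 1).
    return np.tanh(W @ x)

x = np.array([3.0, 0.0])
rotated = W @ x      # the point is turned, its length preserved
squashed = layer(x)  # every coordinate now lies strictly inside (-1, 1)
```

With a general weight matrix the linear part also scales and shears, but the rotate-then-squash intuition carries over.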
This lesson provides a detailed description of some of the modules and architectures involved in the development of neural networks.
This lecture covers training neural nets in practice (tooling, classification with neural nets, and implementation in PyTorch) and is a part of the Deep Learning Course at NYU's Center for Data Science.
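The lecture works in PyTorch; as a framework-free sketch of the same loop structure (forward pass, loss, gradient, update), here is logistic-regression training on toy data in NumPy. The data, learning rate, and epoch count are all assumptions for the demo, not values from the course.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, size=(50, 2)),
               rng.normal(+2, 1, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

w, b, lr = np.zeros(2), 0.0, 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(200):
    p = sigmoid(X @ w + b)            # forward pass: P(class = 1)
    grad_w = X.T @ (p - y) / len(y)   # gradient of mean cross-entropy
    grad_b = np.mean(p - y)
    w -= lr * grad_w                  # the update an optimizer.step()
    b -= lr * grad_b                  # would perform in PyTorch

accuracy = np.mean((sigmoid(X @ w + b) > 0.5) == y)
```

In PyTorch the gradient lines are replaced by `loss.backward()` and the update by an optimizer, but the loop has exactly this shape.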
This lecture covers the concept of parameter sharing in recurrent and convolutional nets and is a part of the Deep Learning Course at NYU's Center for Data Science.
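Parameter sharing in a convolutional layer can be shown in a few lines (an illustrative sketch, not the course's code): one small kernel is reused at every position of the input, so three weights cover a signal of any length, where a dense layer would need a separate weight per input position.

```python
import numpy as np

def conv1d(signal, kernel):
    """Valid 1-D sliding-window correlation: the SAME kernel weights
    are reused (shared) at every position of the input."""
    k = len(kernel)
    return np.array([signal[i:i + k] @ kernel
                     for i in range(len(signal) - k + 1)])

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
w = np.array([1.0, 0.0, -1.0])  # one 3-weight kernel, shared everywhere

out = conv1d(x, w)
# NumPy's correlate computes the same thing (conv-net "convolutions"
# are actually cross-correlations):
ref = np.correlate(x, w, mode="valid")
```

Recurrent nets share parameters the same way, but across time steps rather than spatial positions.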
This lecture covers the concept of convolutional nets in practice and is a part of the Deep Learning Course at NYU's Center for Data Science.
This lecture discusses the properties of natural signals and how they motivate convolutional nets in practice, and is a part of the Deep Learning Course at NYU's Center for Data Science.