Energy-Based Models II

Difficulty level: Intermediate
Duration: 1:48:53

This lecture is a foundational lecture on energy-based models, with a particular focus on the joint embedding method and latent-variable energy-based models (LV-EBMs). It is part of the Deep Learning course at NYU's Center for Data Science.

Topics covered in this lesson

Chapters:
00:00:00 – Welcome to class
00:00:17 – Training of an EBM
00:04:27 – Contrastive vs. regularised / architectural methods
00:05:21 – General margin loss
00:09:34 – List of loss functions
00:13:45 – Generalised additive margin loss
00:17:53 – Joint embedding architectures
00:21:29 – Wav2Vec 2.0
00:27:14 – XLSR: multilingual speech recognition
00:29:16 – Generative adversarial networks (GANs)
00:37:24 – Mode collapse
00:41:45 – Non-contrastive methods
00:43:19 – BYOL: bootstrap your own latent
00:44:27 – SwAV
00:46:45 – Barlow twins
00:51:29 – SEER
00:54:29 – Latent variable models in practice
00:57:34 – DETR
01:01:21 – Structured prediction
01:04:53 – Factor graph
01:12:47 – Viterbi algorithm whiteboard time
01:30:24 – Graph transformer networks
01:46:48 – Graph composition, transducers
01:48:38 – Final remarks
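
The "General margin loss" chapter above concerns contrastive training of EBMs, where the energy of a correct (positive) sample is pushed below that of incorrect (negative) samples by at least a margin. As a rough orientation before watching, a hinge-style margin loss can be sketched as follows; this is a simplified illustration, not the exact loss presented in the lecture:

```python
def margin_loss(energies_pos, energies_neg, margin=1.0):
    """Hinge-style margin loss for an EBM (illustrative sketch).

    For each (positive, negative) pair, the loss is zero once the
    positive energy sits at least `margin` below the negative energy;
    otherwise it grows linearly with the violation.
    """
    losses = [max(0.0, e_pos - e_neg + margin)
              for e_pos, e_neg in zip(energies_pos, energies_neg)]
    return sum(losses) / len(losses)

# Toy example: the first pair already satisfies the margin (0.2 vs 1.5),
# the second does not (0.5 vs 0.3), so only it contributes to the loss.
loss = margin_loss([0.2, 0.5], [1.5, 0.3])  # (0.0 + 1.2) / 2 = 0.6
```

Minimizing this loss pulls positive energies down and pushes negative energies up only while pairs violate the margin, which is what distinguishes margin losses from losses that push energies apart without bound.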