
Energy-Based Models I

Difficulty level: Intermediate
Speaker:
Duration: 1:51:30

This lecture lays the foundations of energy-based models (EBMs), with a particular focus on the joint embedding method and latent-variable energy-based models (LV-EBMs). It is part of the Deep Learning course at NYU's Center for Data Science.
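
As a taste of the machinery the lecture develops, here is a minimal sketch of a latent-variable EBM; the quadratic energy and the grid-search minimization are illustrative choices made for this summary, not code from the lecture. It shows the two core operations: computing the free energy F(x, y) = min_z E(x, y, z), and performing inference as ŷ = argmin_y F(x, y).

    import torch

    def E(x, y, z):
        # Toy joint energy (illustrative only): prefers y close to z * x
        # and small latents z. Any scalar energy function would do.
        return (y - z * x) ** 2 + 0.1 * z ** 2

    def F(x, y, z_grid):
        # Free energy F(x, y) = min_z E(x, y, z), here via grid search over z.
        return torch.min(E(x, y, z_grid))

    def infer(x, y_grid, z_grid):
        # Inference: return the candidate y with the lowest free energy for this x.
        free_energies = torch.stack([F(x, y, z_grid) for y in y_grid])
        return y_grid[torch.argmin(free_energies)]

    x = torch.tensor(2.0)
    y_grid = torch.linspace(-4.0, 4.0, 81)  # candidate predictions
    z_grid = torch.linspace(-2.0, 2.0, 41)  # candidate latents
    print(infer(x, y_grid, z_grid))  # prints y* ≈ 0, this toy energy's minimizer

In the lecture the minimizations over y and z are done with gradient-based optimizers on learned networks rather than grid search, but the structure of the two nested minimizations is the same.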

Topics covered in this lesson

Chapters:

00:00:00 – Welcome to class
00:00:39 – Predictive models
00:02:25 – Multi-output system
00:06:36 – Notation (factor graph)
00:07:41 – The energy function F(x, y)
00:08:53 – Inference
00:11:59 – Implicit function
00:15:53 – Conditional EBM
00:16:24 – Unconditional EBM
00:19:18 – EBM vs. probabilistic models
00:21:33 – Do we need a y at inference?
00:23:29 – When inference is hard
00:25:02 – Joint embeddings
00:28:29 – Latent variables
00:33:54 – Inference with latent variables
00:37:58 – Energies E and F
00:42:35 – Preview on the EBM practicum
00:44:30 – From energy to probabilities (see the worked equation after this list)
00:50:37 – Examples: K-means and sparse coding
00:53:56 – Limiting the information capacity of the latent variable
00:57:24 – Training EBMs
01:04:02 – Maximum likelihood
01:13:58 – How to pick β?
01:17:28 – Problems with maximum likelihood
01:20:20 – Other types of loss functions
01:26:32 – Generalised margin loss
01:27:22 – General group loss
01:28:26 – Contrastive joint embeddings
01:34:51 – Denoising or masked autoencoder
01:46:14 – Summary and final remarks
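
For reference, the chapters "From energy to probabilities" and "Maximum likelihood" build on the Gibbs–Boltzmann construction; the standard formulation is given below, and the lecture's notation may differ in minor details:

    P(y \mid x) = \frac{e^{-\beta F(x, y)}}{\int e^{-\beta F(x, y')} \, dy'}

    -\log P(y \mid x) = \beta F(x, y) + \log \int e^{-\beta F(x, y')} \, dy'

The first equation turns the free energy F(x, y) into a conditional distribution; the second is the per-sample negative log-likelihood minimized during maximum-likelihood training. As the inverse temperature β grows, P(y | x) concentrates on the minimizer of F, which is the trade-off behind the "How to pick β?" chapter.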