
Inference for latent variable energy-based models (LV-EBMs)

Difficulty level: Intermediate
Duration: 1:01:04

This lecture covers inference in latent variable energy-based models (LV-EBMs) and is part of the Deep Learning Course at CDS, which covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include Introduction to Deep Learning, Parameter sharing, and Introduction to Data Science or a graduate-level machine learning course.
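
As a quick reference before the chapter list, the objects the lecture works with can be written compactly. This is a sketch in standard LV-EBM notation (energy E(y, z) over an observation y and a latent z, inverse temperature β); the exact symbols are an assumption and may differ from the slides:

\[
F_\beta(y) = -\frac{1}{\beta}\,\log \int_{\mathcal{Z}} \exp\bigl(-\beta\, E(y,z)\bigr)\,\mathrm{d}z,
\qquad
F_\infty(y) = \min_{z \in \mathcal{Z}} E(y,z),
\qquad
\check{y} = \operatorname*{arg\,min}_{y \in \mathcal{Y}} F_\infty(y).
\]

The free energy F replaces the energy's dependence on the latent with a soft minimum over z; in the zero-temperature limit it becomes the exact minimum, and inference picks the y with the lowest free energy.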

Topics covered in this lesson

Chapters:
00:00 – Affine transformation in 2 and 3D by @LeiosOS (James Schloss)
01:21 – Thanks for sending me a Wacom graphic tablet
01:50 – *Inference* for LV EBM (we're given a model)
04:32 – Training samples: one to many mapping
13:10 – Let's simplify stuff: the unconditional case
15:56 – Untrained model manifold generation
21:15 – The Energy Function
24:51 – Indexing energy function by picking individual training samples
31:41 – The 23rd energy (U shaped)
39:27 – The 10th energy (~ shaped)
46:07 – The Free Energy (definition and the 10th example)
51:59 – The 23rd free energy
53:07 – Computing the free energy for the entire 𝒴 space
1:00:01 – That was it :)
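
The final chapters (46:07 through 53:07) define the free energy and evaluate it over the whole 𝒴 space. Below is a minimal sketch of that kind of computation, assuming a toy decoder g(z) = (z, sin z) and a squared-error energy E(y, z) = ‖y − g(z)‖²; the function names, grid sizes, and the energy itself are hypothetical illustrations, not the lecture's code.

import numpy as np

def decoder(z):
    # Toy decoder g(z): maps a 1-D latent to a point on a curve in the 2-D 𝒴 plane.
    z = np.asarray(z, dtype=float)
    return np.stack([z, np.sin(z)], axis=-1)

def energy(y, z):
    # E(y, z): squared distance between the observation y and the decoded latent.
    # y has shape (2,); z has shape (K,); returns K energies.
    return np.sum((y - decoder(z)) ** 2, axis=-1)

def free_energy(y, z_grid):
    # Zero-temperature free energy F(y) = min_z E(y, z),
    # approximated by minimizing over a dense grid of candidate latents.
    return energy(y, z_grid).min()

# Evaluate the free energy at every point of a grid covering the 𝒴 plane.
z_grid = np.linspace(-4.0, 4.0, 401)
y1 = np.linspace(-4.0, 4.0, 101)
y2 = np.linspace(-2.0, 2.0, 101)
F = np.array([[free_energy(np.array([a, b]), z_grid) for a in y1] for b in y2])

print(F.shape)  # (101, 101): one free-energy value per point of the 𝒴 grid
print(F.min())  # points on the model manifold have (near-)zero free energy

Rendering F as a heat map over (y1, y2) gives the kind of free-energy landscape the lecture builds up around the 53:07 chapter: low values trace out the model manifold, and inference returns the points of 𝒴 where F is smallest.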