Training latent variable energy-based models (LV-EBMs)

This tutorial covers training latent variable energy-based models (LV-EBMs) and is part of the Deep Learning Course at CDS, a course covering the latest techniques in deep learning and representation learning: supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites for this course include Introduction to Deep Learning, Parameter Sharing, and Introduction to Data Science or a graduate-level machine learning course.

Topics covered in this lesson


00:00 – Welcome to class
01:11 – Zero Temperature Limit (ZTL) free energy recap
05:29 – (warmer) Free energy
15:42 – Infinite Temperature Limit
20:20 – Free energy, example y = Y[23]
25:22 – Free energy, example y = Y[10]
26:41 – Free energy, example y = (0, 0)
28:30 – Nomenclature and PyTorch
34:13 – Training!
35:44 – Loss functionals
46:09 – ZTL vs. warmer temperature training
49:50 – Conditional case
51:47 – Untrained model manifold
54:11 – Energy function E(x, y, z)
56:48 – Trained model manifold
59:54 – Learning the Decoder
1:01:15 – Choosing the latent size
1:03:02 – And that was it