
Unsupervised Learning: Autoencoding the Targets

Difficulty level: Beginner
Duration: 56:41

This lecture covers advanced concepts of energy-based models. It is part of the Advanced Energy-Based Models module of the Deep Learning Course at NYU's Center for Data Science. Prerequisites for this course include: Energy-Based Models I, Energy-Based Models II, Energy-Based Models III, and an Introduction to Data Science or a graduate-level machine learning course.

Topics covered in this lesson

Chapters: 

00:00 – 2021 edition disclaimer
02:28 – Unsupervised learning and generative models
05:42 – Input space interpolation
08:24 – Latent space interpolation
10:54 – Conditional generative networks
13:37 – Style transfer
16:21 – Super resolution
21:37 – Inpainting
23:19 – Caption to image (DALL·E)
28:24 – Definitions: x, y, z
31:27 – Recap: conditional latent variable EBM
32:12 – Recap: energy function
32:39 – Softmin training recap → autoencoder via amortised inference
38:40 – Reconstruction energies
39:24 – Loss functional
43:59 – Under- and over-complete hidden layers
48:58 – Denoising autoencoder
54:28 – Nearest-neighbour denoising autoencoder
55:00 – Sparse autoencoder
55:20 – Final remarks
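
To give a concrete flavour of the chapters on reconstruction energies, under-/over-complete hidden layers, and the denoising autoencoder (48:58), here is a minimal sketch in PyTorch. It is not the lecture's code: the input dimension (784, e.g. flattened MNIST), the under-complete hidden size of 30, and the Gaussian corruption level are illustrative assumptions.

```python
import torch
from torch import nn

class DenoisingAutoencoder(nn.Module):
    """Minimal denoising autoencoder: encode a corrupted input, decode a
    reconstruction, and train against the clean target."""

    def __init__(self, input_dim: int = 784, hidden_dim: int = 30):
        super().__init__()
        # Under-complete encoder: hidden_dim < input_dim forces compression
        self.encoder = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.Tanh())
        # Decoder maps the latent code back to input space
        self.decoder = nn.Linear(hidden_dim, input_dim)

    def forward(self, x_noisy: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x_noisy))

model = DenoisingAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.MSELoss()  # squared-error reconstruction energy

# Stand-in batch: corrupt clean inputs with additive Gaussian noise
x_clean = torch.rand(64, 784)
x_noisy = x_clean + 0.3 * torch.randn_like(x_clean)

x_hat = model(x_noisy)
loss = criterion(x_hat, x_clean)  # reconstruct the *clean* input, not the noisy one
loss.backward()
optimizer.step()
```

The key design choice the lecture highlights is what the model is asked to reconstruct: training the network to map a corrupted input back to its clean version prevents it from learning the identity function even when the hidden layer is large.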