Gradient descent and the backpropagation algorithm

This module covers gradient descent and the backpropagation algorithm. It is part of the Deep Learning Course at CDS, which covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites: Introduction to Data Science or a graduate-level machine learning course.

Topics covered in this lesson


00:00:00 – Supervised learning
00:03:43 – Parametrised models
00:07:23 – Block diagram
00:08:55 – Loss function, average loss
00:12:23 – Gradient descent
00:30:47 – Traditional neural nets
00:35:07 – Backprop through a non-linear function
00:40:41 – Backprop through a weighted sum
00:50:55 – PyTorch implementation
00:57:18 – Backprop through a functional module
01:05:08 – Backprop through a functional module
01:12:15 – Backprop in practice
01:33:15 – Learning representations
01:42:14 – Shallow networks are universal approximators!
01:47:25 – Multilayer architectures == compositional structure of data
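The core loop covered in the lesson (a parametrised model, an average loss, gradient descent, and backprop through the network) can be sketched in a few lines of PyTorch. This is an illustrative example, not code from the lecture: the toy data, learning rate, and step count are assumptions chosen so the example converges.

```python
import torch

torch.manual_seed(0)

# Toy supervised-learning data: targets follow y = 3x + 1 plus noise.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 3 * x + 1 + 0.1 * torch.randn_like(x)

# Parametrised model: a single linear (weighted-sum) layer.
model = torch.nn.Linear(1, 1)

# Average loss over the batch, and plain gradient descent on the parameters.
loss_fn = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for step in range(200):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(x), y)    # forward pass through the block diagram
    loss.backward()                # backprop: autograd computes dloss/dparams
    optimizer.step()               # gradient descent update

print(f"final loss: {loss.item():.4f}")
```

The same loop scales unchanged to the multilayer architectures discussed later in the lesson: only the `model` definition changes, while `backward()` propagates gradients through every non-linear function and weighted sum automatically.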