Neural nets: inference
This lecture covers the concept of neural nets as rotation and squashing, and is part of the Deep Learning course at CDS. The course covers the latest techniques in deep learning and representation learning, focusing on supervised and unsupervised deep learning, embedding methods, metric learning, and convolutional and recurrent nets, with applications to computer vision, natural language understanding, and speech recognition. Prerequisites: Introduction to Data Science or a graduate-level machine learning course.
Chapters
00:00 – Welcome!
02:30 – Affine transformations and non-linearities
03:42 – Affine transformation: intuition
13:47 – Summary slide
14:06 – Jupyter and PyTorch
18:47 – Input data
24:39 – Coding a 2×2 linear transformation & Gilbert Strang
30:15 – Coding a 2×2 linear transformation w/ PyTorch
33:03 – Hyperbolic tangent
47:02 – Rotation + squashing + rotation: ooooh, a neural net
49:28 – Rectified linear unit (ReLU)
51:10 – Shoutout to @vcubingx and his animation
52:09 – Spiky transformation: what happened here?
54:23 – A *very deep* neural net
56:30 – A deep net with tanh
56:43 – Summary of today's lesson
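
The sketches below illustrate a few of the chapters above. First, the 24:39 and 30:15 chapters code a 2×2 linear transformation by hand and then with PyTorch. This is a minimal sketch of that idea, not the lecture's actual notebook; the matrix (a rotation composed with a per-axis scaling) and all names are illustrative.

```python
import math
import torch

torch.manual_seed(0)
X = torch.randn(1000, 2)                      # cloud of 2-D points, one per row

theta = math.pi / 6                           # 30° rotation angle
R = torch.tensor([[math.cos(theta), -math.sin(theta)],
                  [math.sin(theta),  math.cos(theta)]])
S = torch.diag(torch.tensor([2.0, 0.5]))      # stretch x, squash y
A = S @ R                                     # composed 2×2 linear map

Y = X @ A.T                                   # apply A to every row at once
print(Y.shape)                                # torch.Size([1000, 2])
```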
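The 33:03 and 49:28 chapters contrast the two point-wise "squashing" non-linearities, tanh and ReLU. A quick comparison on a handful of values, again just a sketch:

```python
import torch

z = torch.linspace(-3, 3, 7)
print(torch.tanh(z))   # smooth saturation into (-1, 1): the "squashing"
print(torch.relu(z))   # zero for negatives, identity for positives: the "spiky" one
```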
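Finally, the 47:02, 54:23, and 56:30 chapters show that stacking "affine + squashing" blocks is exactly a neural net. A minimal sketch of both the one-hidden-layer case and a *very deep* variant, assuming a 2-D input and purely illustrative layer sizes:

```python
import torch
from torch import nn

torch.manual_seed(0)

# Rotation + squashing + rotation: affine map, point-wise tanh, affine map
net = nn.Sequential(
    nn.Linear(2, 100),
    nn.Tanh(),
    nn.Linear(100, 2),
)

# A *very deep* net: many narrow affine + tanh blocks stacked
blocks = []
for _ in range(10):
    blocks += [nn.Linear(2, 2), nn.Tanh()]
deep_net = nn.Sequential(*blocks)

X = torch.randn(1000, 2)
print(net(X).shape, deep_net(X).shape)   # both torch.Size([1000, 2])
```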