

Recurrent Neural Networks, Vanilla and Gated - LSTM

Alfredo Canziani via YouTube

Overview

This course covers the fundamentals of recurrent neural networks, including both vanilla and gated (LSTM) models. By the end of the course, learners will understand vector-to-sequence (vec2seq), sequence-to-vector (seq2vec), and sequence-to-sequence (seq2seq) architectures, as well as how to train a recurrent network with backpropagation through time (BPTT). They will also learn about vanishing and exploding gradients, the gating mechanism, and how the LSTM works. The course includes a practical demonstration in a Jupyter Notebook, using PyTorch for sequence classification. It is suitable for anyone interested in deep learning and neural networks.
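For readers who want a preview of the hands-on portion, below is a minimal sketch of an LSTM-based sequence classifier in PyTorch, in the spirit of the course's demonstration. The model, layer sizes, and data shapes are illustrative assumptions, not the course's actual notebook code.

import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    # Hypothetical seq2vec model: an LSTM reads the whole sequence and a
    # linear layer maps its final hidden state to class logits.
    def __init__(self, input_size, hidden_size, num_classes):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x has shape (batch, seq_len, input_size)
        _, (h_n, _) = self.lstm(x)
        # h_n[-1] is the last layer's final hidden state: (batch, hidden_size)
        return self.classifier(h_n[-1])

model = SequenceClassifier(input_size=8, hidden_size=16, num_classes=2)
x = torch.randn(4, 10, 8)   # 4 sequences, 10 time steps, 8 features each
logits = model(x)           # shape: (4, 2)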

Syllabus

– Good morning
– How to summarise papers as @y0b1byte with Notion
– Why do we need to go to a higher hidden dimension?
– Today's class: recurrent neural nets
– Vector to sequence (vec2seq)
– Sequence to vector (seq2vec)
– Sequence to vector to sequence (seq2vec2seq)
– Sequence to sequence (seq2seq)
– Training a recurrent network: backpropagation through time
– Training example: language model
– Vanishing & exploding gradients and the gating mechanism (see the sketch after this syllabus)
– The Long Short-Term Memory (LSTM)
– Jupyter Notebook and PyTorch in action: sequence classification
– Inspecting the activation values
– Closing remarks
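The vanishing-gradient phenomenon mentioned in the syllabus can be reproduced in a few lines: repeatedly applying a small-weight recurrent update shrinks the gradient that flows back to the initial state, which is what motivates the gating mechanism. This is a minimal sketch using a toy untrained recurrence, not anything taken from the lecture.

import torch

hidden_size = 16
W = torch.randn(hidden_size, hidden_size) * 0.1  # small recurrent weights
h0 = torch.randn(hidden_size, requires_grad=True)

h = h0
for _ in range(50):          # 50 time steps of h_t = tanh(W h_{t-1})
    h = torch.tanh(W @ h)

h.sum().backward()
print(h0.grad.norm())        # close to zero: the gradient has vanished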

Taught by

Alfredo Canziani
