MIT: Recurrent Neural Networks

Via YouTube: https://www.youtube.com/@AAmini/videos

Class Central Classrooms: YouTube playlists curated by Class Central.

Classroom Contents

  1. Intro
  2. Sequences in the wild
  3. A sequence modeling problem: predict the next word
  4. use a fixed window
  5. can't model long-term dependencies
  6. use entire sequence as set of counts
  7. counts don't preserve order
  8. use a really big fixed window
  9. no parameter sharing
  10. Sequence modeling: design criteria
  11. Standard feed-forward neural network
  12. Recurrent neural networks: sequence modeling
  13. A standard "vanilla" neural network
  14. A recurrent neural network (RNN)
  15. RNN state update and output (a minimal sketch follows this list)
  16. RNNs: computational graph across time
  17. Recall: backpropagation in feed-forward models
  18. RNNs: backpropagation through time
  19. Standard RNN gradient flow: exploding gradients
  20. Standard RNN gradient flow: vanishing gradients
  21. The problem of long-term dependencies
  22. Trick #1: activation functions
  23. Trick #2: parameter initialization
  24. Standard RNN: repeating modules contain a simple computation node
  25. Long Short-Term Memory (LSTMs) (see the gate sketch after this list)
  26. LSTMs: forget irrelevant information
  27. LSTMs: output filtered version of cell state
  28. LSTM gradient flow
  29. Example task: music generation
  30. Example task: sentiment classification
  31. Example task: machine translation
  32. Attention mechanisms
  33. Recurrent neural networks (RNNs)
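
Chapter 15 covers the vanilla RNN state update. Here is a minimal NumPy sketch of that update; the names and dimensions (W_xh, W_hh, W_hy, hidden_dim, and so on) are our own illustration, not the lecture's notation:

```python
import numpy as np

# Vanilla RNN: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h), y_t = W_hy h_t.
rng = np.random.default_rng(0)
input_dim, hidden_dim, output_dim = 8, 16, 4

W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))   # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))  # hidden-to-hidden
W_hy = rng.normal(scale=0.1, size=(output_dim, hidden_dim))  # hidden-to-output
b_h = np.zeros(hidden_dim)

def rnn_step(x_t, h_prev):
    """One state update: mix the current input with the previous hidden state."""
    h_t = np.tanh(W_xh @ x_t + W_hh @ h_prev + b_h)
    y_t = W_hy @ h_t
    return h_t, y_t

# Unroll over a toy sequence; the same weights are reused at every time step,
# which is the parameter sharing that fixed-window models (chapter 9) lack.
h = np.zeros(hidden_dim)
for x_t in rng.normal(size=(5, input_dim)):
    h, y = rnn_step(x_t, h)
```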
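
Chapters 25 through 28 walk through the LSTM gates. A sketch of one LSTM step, again with illustrative names (W_f, W_i, W_g, W_o) rather than the lecture's notation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, p):
    """One LSTM step: forget (f), input (i), and output (o) gates plus a candidate (g)."""
    z = np.concatenate([h_prev, x_t])      # previous hidden state stacked with current input
    f = sigmoid(p["W_f"] @ z + p["b_f"])   # forget irrelevant parts of the cell state
    i = sigmoid(p["W_i"] @ z + p["b_i"])   # choose what new information to store
    g = np.tanh(p["W_g"] @ z + p["b_g"])   # candidate cell-state update
    o = sigmoid(p["W_o"] @ z + p["b_o"])   # choose what to expose as output
    c_t = f * c_prev + i * g               # mostly-additive cell-state update
    h_t = o * np.tanh(c_t)                 # output a filtered version of the cell state
    return h_t, c_t

rng = np.random.default_rng(0)
input_dim, hidden_dim = 8, 16
p = {}
for gate in ("f", "i", "g", "o"):
    p[f"W_{gate}"] = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim + input_dim))
    p[f"b_{gate}"] = np.zeros(hidden_dim)

h = c = np.zeros(hidden_dim)
h, c = lstm_step(rng.normal(size=input_dim), h, c, p)
```

The additive form of the cell-state update (c_t = f * c_prev + i * g) is what chapter 28's gradient-flow discussion refers to: gradients can pass through the cell state without repeated multiplication by a recurrent weight matrix, which mitigates the vanishing-gradient problem of chapters 19 and 20.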
