MIT: Introduction to Deep Learning

Source: https://www.youtube.com/@AAmini/videos on YouTube

Class Central Classrooms (beta): YouTube playlists curated by Class Central.

Classroom Contents

  1. Intro
  2. The Rise of Deep Learning
  3. What is Deep Learning?
  4. Lecture Schedule
  5. Final Class Project
  6. Class Support
  7. Course Staff
  8. Why Deep Learning
  9. The Perceptron: Forward Propagation (see the first sketch after this list)
  10. Common Activation Functions
  11. Importance of Activation Functions
  12. The Perceptron: Example
  13. The Perceptron: Simplified
  14. Multi Output Perceptron
  15. Single Layer Neural Network
  16. Deep Neural Network
  17. Quantifying Loss
  18. Empirical Loss
  19. Binary Cross Entropy Loss
  20. Mean Squared Error Loss (both losses are sketched after this list)
  21. Loss Optimization
  22. Computing Gradients: Backpropagation
  23. Training Neural Networks is Difficult
  24. Setting the Learning Rate
  25. Adaptive Learning Rates
  26. Adaptive Learning Rate Algorithms
  27. Stochastic Gradient Descent
  28. Mini-batches while training (see the training-loop sketch after this list)
  29. The Problem of Overfitting
  30. Regularization 1: Dropout (see the dropout sketch after this list)
  31. Regularization 2: Early Stopping
  32. Core Foundation Review
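
Items 9–13 describe the perceptron's forward pass: a weighted sum of the inputs plus a bias, passed through a nonlinear activation. Below is a minimal NumPy sketch of that idea; the specific weights, bias, and input values are made-up illustrative numbers, not values from the lectures.

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real value into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    """ReLU activation: passes positive values, zeroes out negatives."""
    return np.maximum(0.0, z)

def perceptron_forward(x, w, b, activation=sigmoid):
    """Single perceptron: weighted sum of inputs plus bias, then a nonlinearity."""
    z = np.dot(w, x) + b      # linear combination
    return activation(z)      # nonlinear activation

# Illustrative example: two inputs, one output.
x = np.array([1.0, 2.0])      # inputs
w = np.array([0.5, -0.3])     # weights (made-up values)
b = 0.1                       # bias
print(perceptron_forward(x, w, b))  # z = 0.0, so sigmoid gives 0.5
```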
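
Items 19 and 20 name the two loss functions covered in the lecture: binary cross-entropy for classification and mean squared error for regression. Here is a small sketch of both, with made-up labels and predictions.

```python
import numpy as np

def binary_cross_entropy(y_true, y_pred, eps=1e-12):
    """Binary cross-entropy: penalizes confident but wrong probability predictions."""
    y_pred = np.clip(y_pred, eps, 1 - eps)  # avoid log(0)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

def mean_squared_error(y_true, y_pred):
    """Mean squared error: average squared difference, typically used for regression."""
    return np.mean((y_true - y_pred) ** 2)

# Illustrative labels and predicted probabilities (made up).
y_true = np.array([1.0, 0.0, 1.0])
y_pred = np.array([0.9, 0.2, 0.6])
print(binary_cross_entropy(y_true, y_pred))
print(mean_squared_error(y_true, y_pred))
```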
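
Items 21–28 cover loss optimization with gradient descent, the learning rate, and training on mini-batches. The sketch below applies mini-batch stochastic gradient descent to a toy linear model with an MSE loss; the data, learning rate, and batch size are arbitrary illustrative choices, not values from the course.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.01, epochs=100, batch_size=2, seed=0):
    """Mini-batch stochastic gradient descent on a linear model with MSE loss.
    lr is the learning rate discussed in items 24-26 of the list above."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        idx = rng.permutation(n)                        # shuffle each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]       # one mini-batch
            pred = X[batch] @ w + b
            err = pred - y[batch]
            grad_w = 2 * X[batch].T @ err / len(batch)  # dMSE/dw
            grad_b = 2 * err.mean()                     # dMSE/db
            w -= lr * grad_w                            # gradient step
            b -= lr * grad_b
    return w, b

# Toy data (made up): roughly y = 3x + 1 with a little noise.
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.1, 3.9, 7.2, 9.8])
print(sgd_linear_regression(X, y, lr=0.05, epochs=500))
```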
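
Item 30 introduces dropout as a regularization technique: during training, randomly zero out a fraction of a layer's activations so the network cannot rely too heavily on any single unit. Below is a minimal sketch of inverted dropout, which rescales the surviving activations; the scaling convention and example values are assumptions for illustration, not details taken from the outline.

```python
import numpy as np

def dropout(activations, drop_prob=0.5, training=True, rng=None):
    """Inverted dropout: randomly zero units during training and rescale
    the survivors so the expected activation stays the same."""
    if not training or drop_prob == 0.0:
        return activations
    rng = rng or np.random.default_rng()
    mask = rng.random(activations.shape) >= drop_prob  # keep each unit with prob 1 - drop_prob
    return activations * mask / (1.0 - drop_prob)

# Illustrative hidden-layer activations (made up).
h = np.array([0.2, 1.5, -0.7, 0.9])
print(dropout(h, drop_prob=0.5, rng=np.random.default_rng(0)))
```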
