Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization
deeplearning.ai and Stanford University via Coursera
Overview
After 3 weeks, you will:
- Understand industry best practices for building deep learning applications.
- Be able to effectively use common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, momentum, RMSprop, and Adam, and check them for convergence (a rough sketch of one such update rule follows this list).
- Understand new best practices for the deep learning era: how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.
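As a rough illustration of the kind of optimization algorithm covered in the course (this snippet is not course material; the function name `adam_update`, its defaults, and the toy loss are my own assumptions), a single Adam step for one parameter array can be sketched like this:

```python
import numpy as np

def adam_update(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam step for a single parameter array (illustrative sketch only).
    m = beta1 * m + (1 - beta1) * grad           # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # biased second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias-correct the first moment
    v_hat = v / (1 - beta2 ** t)                 # bias-correct the second moment
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # parameter update
    return w, m, v

# Example usage on a toy quadratic loss L(w) = 0.5 * ||w||^2, whose gradient is w.
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
for t in range(1, 101):
    grad = w                                     # gradient of the toy loss
    w, m, v = adam_update(w, grad, m, v, t)
print(w)                                         # w moves toward the minimum at 0
```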
This is the second course of the Deep Learning Specialization.
Syllabus
- Practical aspects of Deep Learning
- Optimization algorithms
- Hyperparameter tuning, Batch Normalization and Programming Frameworks
Taught by
Andrew Ng
Charts
- #2 in Subjects / Machine Learning / TensorFlow
Related Courses
- Neural Networks and Deep Learning (deeplearning.ai, Stanford University) - 4.8 rating
- Convolutional Neural Networks (deeplearning.ai, Stanford University) - 4.9 rating
- Deep Learning - IITKGP (Indian Institute of Technology, Kharagpur, NPTEL)
- Art and Science of Machine Learning (Google Cloud, Google)
- Deep Learning (deeplearning.ai)
- Introduction to Deep Learning & Neural Networks with Keras (IBM)
Reviews
5.0 rating, based on 3 reviews
Nattapon Sub-Anake completed this course, spending 6 hours a week on it and found the course difficulty to be medium.
I finished this second deep learning course and liked it very much. I am looking forward to more courses in this deep learning series. Andrew is doing a great job here.
Silveira Homero completed this course, spending 9 hours a week on it and found the course difficulty to be medium.
This is a follow-up course to Neural Networks and Deep Learning, so you should start with that one. The practical side of the teaching was very interesting.
Raivis Joksts completed this course, spending 6 hours a week on it and found the course difficulty to be easy.
A good look at the most important parameters that impact DL models. Again, it is math-heavy, but it's OK to just understand the basic logic behind it. It has some practical TensorFlow tasks, which are easy to complete, but you may wish to read up on TF on your own to understand how it works in general, especially if you have used another DL framework before, such as PyTorch.