

Intro to Deep Learning

via Kaggle

Overview

Use TensorFlow and Keras to build and train neural networks for structured data (a brief code sketch follows the list below).
  • Learn about linear units, the building blocks of deep learning.
  • Add hidden layers to your network to uncover complex relationships.
  • Use Keras and TensorFlow to train your first neural network.
  • Improve performance with extra capacity or early stopping.
  • Add dropout and batch normalization layers to prevent overfitting and stabilize training.
  • Apply deep learning to binary classification, another common task.
  • Get started with Tensor Processing Units (TPUs)!
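
A minimal sketch of how these pieces fit together in Keras (the dataset here is synthetic and purely illustrative; the course uses its own data and exercises):

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    # Synthetic stand-in for a structured (tabular) dataset.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 11)).astype("float32")
    y = (X[:, 0] + X[:, 1] > 0).astype("float32")

    # Stacked linear units (Dense layers) with ReLU activations;
    # dropout and batch normalization help prevent overfitting
    # and stabilize training.
    model = keras.Sequential([
        keras.Input(shape=(11,)),
        layers.Dense(128, activation="relu"),
        layers.BatchNormalization(),
        layers.Dropout(0.3),
        layers.Dense(64, activation="relu"),
        layers.BatchNormalization(),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),  # binary classification output
    ])

    model.compile(
        optimizer="adam",
        loss="binary_crossentropy",
        metrics=["binary_accuracy"],
    )

    # Early stopping halts training when validation loss stops improving.
    early_stopping = keras.callbacks.EarlyStopping(
        patience=5, min_delta=0.001, restore_best_weights=True
    )

    model.fit(
        X, y,
        validation_split=0.2,
        batch_size=64,
        epochs=50,
        callbacks=[early_stopping],
        verbose=0,
    )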

Syllabus

  • A Single Neuron
  • Deep Neural Networks
  • Stochastic Gradient Descent
  • Overfitting and Underfitting
  • Dropout and Batch Normalization
  • Binary Classification
  • Detecting the Higgs Boson With TPUs

Taught by

Ryan Holbrook

Reviews

5.0 rating, based on 2 Class Central reviews


  • TANUSHREE MAHATA
    The "Introduction to Deep Learning" course on Kaggle offers a well-structured curriculum, blending theory with hands-on exercises. Instructors provide clear explanations, making complex topics easily understandable. Real-world examples and case studies enrich the learning experience, showcasing practical applications across domains. The interactive Kaggle Kernels platform allows collaborative learning and experimentation with datasets and pre-built kernels. Overall, the course exceeds expectations, providing a solid foundation in deep learning fundamentals and practical skills for real-world problem-solving. Whether a beginner or with some experience, I highly recommend this course for anyone venturing into deep learning.
  • SWAVAGYASHREE
    This course provided a fantastic foundation in neural networks, including:
      • Understanding neurons (single & multi-layer)
      • Coding with TensorFlow & Keras
      • Activation functions (ReLU, ELU, etc.) & layers
      • Exploring linear regression & training processes
      • Recognizing & preventing overfitting/underfitting

    I dove deep into:
      • Loss functions (MAE, MSE, Huber) & optimizers (SGD, Adam)
      • Regularization techniques (dropout, batch normalization)
      • Binary classification & metrics (accuracy, cross-entropy)

    This practical course even included step-by-step coding using TensorFlow and Keras, making the concepts stick.
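
As a rough, course-independent sketch of the losses and optimizers mentioned in the review above, the same small Keras regression model can be compiled with MSE, MAE, or Huber loss and trained with SGD or Adam (the data below is synthetic and purely illustrative):

    import numpy as np
    from tensorflow import keras
    from tensorflow.keras import layers

    # Synthetic regression data, purely for illustration.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 4)).astype("float32")
    y = (X @ np.array([1.5, -2.0, 0.5, 3.0], dtype="float32")).reshape(-1, 1)

    def make_model():
        return keras.Sequential([
            keras.Input(shape=(4,)),
            layers.Dense(32, activation="relu"),
            layers.Dense(1),  # linear output for regression
        ])

    # Same architecture, different loss/optimizer pairings.
    configs = {
        "sgd_mse": (keras.optimizers.SGD(learning_rate=0.01), "mse"),
        "adam_mae": (keras.optimizers.Adam(), "mae"),
        "adam_huber": (keras.optimizers.Adam(), keras.losses.Huber()),
    }

    for name, (optimizer, loss) in configs.items():
        model = make_model()
        model.compile(optimizer=optimizer, loss=loss)
        history = model.fit(X, y, epochs=5, batch_size=32, verbose=0)
        print(name, "final training loss:", history.history["loss"][-1])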
