
Online Course

Using GPUs to Scale and Speed-up Deep Learning

IBM via edX

  • Provider edX
  • Cost Free Online Course (Audit)
  • Session Upcoming
  • Language English
  • Certificate $99 Certificate Available
  • Effort 2-4 hours a week
  • Duration 5 weeks long


Overview

Training a complex deep learning model on a very large dataset can take hours, days, or occasionally weeks. So, what is the solution? Accelerated hardware.

You can use accelerated hardware such as Google's Tensor Processing Unit (TPU) or an Nvidia GPU to speed up your convolutional neural network computations in the cloud. These chips are specifically designed to support the training of neural networks, as well as the use of trained networks (inference). Accelerated hardware has been shown to significantly reduce training time.
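As a concrete illustration (not part of the course materials), here is a minimal sketch, assuming TensorFlow 2.x, that checks whether a GPU is visible and pins a matrix multiplication to it, falling back to the CPU when no accelerator is present:

```python
# Sketch: detect a GPU and place an op on it (TensorFlow 2.x assumed).
import tensorflow as tf

# List the GPUs TensorFlow can see; empty list means CPU-only.
gpus = tf.config.list_physical_devices('GPU')
device = '/GPU:0' if gpus else '/CPU:0'

# Run a large matrix multiplication on the chosen device.
with tf.device(device):
    a = tf.random.normal((1024, 1024))
    b = tf.random.normal((1024, 1024))
    c = tf.linalg.matmul(a, b)

print(device, c.shape)
```

On a machine with a CUDA-capable GPU and a GPU build of TensorFlow, the matmul runs on the accelerator; otherwise the same code runs unchanged on the CPU, which is why explicit device placement like this is a common pattern.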

But your data might be sensitive, and you may not feel comfortable uploading it to a public cloud, preferring instead to analyze it on-premises. In this case, you need an in-house system with GPU support. One solution is IBM's Power Systems with Nvidia GPUs and PowerAI. The PowerAI platform supports popular machine learning libraries and dependencies, including TensorFlow, Caffe, Torch, and Theano.

In this course, you'll learn what GPU-based accelerated hardware is and how it can benefit your deep learning scaling needs. You'll also deploy deep learning networks on GPU-accelerated hardware for several problems, including the classification of images and videos.
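To give a sense of what running an image-classification network on accelerated hardware looks like, here is a hedged sketch (assuming TensorFlow 2.x with Keras, which places operations on a visible GPU automatically) of a tiny convolutional classifier doing a forward pass; the layer sizes are illustrative, not taken from the course:

```python
# Sketch: a minimal CNN for image classification (TensorFlow 2.x / Keras assumed).
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),       # 32x32 RGB images
    tf.keras.layers.Conv2D(16, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation='softmax'),  # 10 classes
])

# One forward pass on a random batch of four images.
preds = model(tf.random.normal((4, 32, 32, 3)))
print(preds.shape)  # prints (4, 10)
```

Keras dispatches the convolution and dense kernels to a GPU whenever TensorFlow can see one, so the same script benefits from accelerated hardware without code changes.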

Syllabus

Module 1 – Quick review of Deep Learning
Intro to Deep Learning
Deep Learning Pipeline

Module 2 – Hardware Accelerated Deep Learning
How to accelerate a deep learning model?
Running TensorFlow operations on CPUs vs. GPUs
Convolutional Neural Networks on GPUs
Recurrent Neural Networks on GPUs

Module 3 – Deep Learning in the Cloud
Deep Learning in the Cloud
How does one use a GPU?

Module 4 – Distributed Deep Learning
Distributed Deep Learning

Module 5 – PowerAI vision
Computer Vision
Image Classification
Object Recognition in Videos

Taught by

Saeed Aghabozorgi


Review for edX's Using GPUs to Scale and Speed-up Deep Learning (based on 1 review)

  • 5 star 0%
  • 4 star 0%
  • 3 star 0%
  • 2 star 0%
  • 1 star 100%

Anonymous completed this course.
No more than a promotion for IBM's PowerAI platform. And at the end, we find that the promised free trial is no longer offered.
