Sequence Models

deeplearning.ai and Stanford University via Coursera

Overview

In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications, such as speech recognition, music synthesis, chatbots, machine translation, natural language understanding, and more, all made possible by the evolution of sequence algorithms driven by deep learning.

By the end, you will be able to build and train Recurrent Neural Networks and commonly-used variants such as GRUs and LSTMs; apply RNNs to Character-level Language Modeling; gain experience with natural language processing and Word Embeddings; and use HuggingFace tokenizers and transformer models to solve different NLP tasks such as NER and Question Answering.
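
As an illustration only (not part of the course materials), here is a minimal sketch of how pre-trained Hugging Face models can be applied to NER and question answering through the transformers pipeline API; the default checkpoints it downloads are assumptions, not the ones used in the assignments.

    # Minimal sketch, assuming the Hugging Face `transformers` library is installed.
    # The default pipeline checkpoints are placeholders; the course assignments
    # may use different models and tokenizers.
    from transformers import pipeline

    # Named Entity Recognition: group sub-word tokens into whole entities
    ner = pipeline("ner", aggregation_strategy="simple")
    print(ner("Andrew Ng teaches the Deep Learning Specialization at DeepLearning.AI."))

    # Extractive Question Answering over a short context passage
    qa = pipeline("question-answering")
    print(qa(question="Who teaches the course?",
             context="The Sequence Models course is taught by Andrew Ng."))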

DeepLearning.AI is proud to partner with NVIDIA Deep Learning Institute (DLI) to provide a programming assignment on Machine Translation with Deep Learning. Get an opportunity to build a deep learning project with leading-edge techniques using industry-relevant use cases.

The Deep Learning Specialization is our foundational program that will help you understand the capabilities, challenges, and consequences of deep learning and prepare you to participate in the development of leading-edge AI technology. It provides a pathway for you to gain the knowledge and skills to apply machine learning to your work, level up your technical career, and take the definitive step in the world of AI.

Syllabus

  • Recurrent Neural Networks
    • Learn about recurrent neural networks. This type of model has been proven to perform extremely well on temporal data. It has several variants, including LSTMs, GRUs, and Bidirectional RNNs, which you are going to learn about in this section (see the code sketch after this syllabus).
  • Natural Language Processing & Word Embeddings
    • Natural language processing with deep learning is an important combination. Using word vector representations and embedding layers, you can train recurrent neural networks that achieve outstanding performance across a wide variety of industries. Example applications include sentiment analysis, named entity recognition, and machine translation.
  • Sequence models & Attention mechanism
    • Sequence models can be augmented using an attention mechanism. This algorithm will help your model understand where it should focus its attention given a sequence of inputs. This week, you will also learn about speech recognition and how to deal with audio data.

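A minimal Keras sketch (not taken from the course assignments) of the kind of model the first two modules build toward: an embedding layer feeding a bidirectional LSTM for binary sentiment classification. The vocabulary size, sequence length, and layer dimensions are arbitrary placeholders.

    # Illustrative sketch only: Embedding + Bidirectional LSTM in TensorFlow/Keras.
    # All sizes below are assumed placeholder values.
    import tensorflow as tf
    from tensorflow.keras import layers

    vocab_size = 10000  # assumed vocabulary size
    embed_dim = 64      # assumed embedding dimension

    model = tf.keras.Sequential([
        layers.Embedding(vocab_size, embed_dim),   # word embeddings
        layers.Bidirectional(layers.LSTM(64)),     # swap in layers.GRU(64) for a GRU variant
        layers.Dense(32, activation="relu"),
        layers.Dense(1, activation="sigmoid"),     # e.g. sentiment: positive vs. negative
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()

Replacing the LSTM with a GRU layer, or stacking recurrent layers with return_sequences=True, gives the commonly used variants mentioned above.
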
Taught by

Andrew Ng

Reviews

4.8 rating, based on 4 reviews

  • Ronny De Winter completed this course, spending 5 hours a week on it and found the course difficulty to be medium.

    Great final course of a world-class specialization on Deep Learning. Andrew Ng understands how to make difficult concepts understandable for a broad audience. The difficulty of the exercises builds up course after course, so make sure you have built up your TensorFlow/Keras skills in the earlier courses or by other means.
    If you get stuck on one of the exercises, do not hesitate to go to the discussion forum; most probably somebody else has had similar problems before, and you can find valuable advice and save precious time.
  • Raivis Joksts completed this course, spending 6 hours a week on it and found the course difficulty to be medium.

    This is the hardest course in the specialisation, and it may take some extra effort. For the practical assignments I recommend getting familiar with Keras syntax and workflow, as there is little hand-holding here; the focus is on the actual model architecture and algorithms.
  • Anonymous

    Anonymous completed this course.

    Best RNN course out there. Great explanations, amazing practical examples, and interesting quizzes. Well prepared. It helps to take the earlier courses in the specialization first.
  • Wichaiditsornpon@gmail.com completed this course, spending 7 hours a week on it and found the course difficulty to be medium.

    Prof. Andrew does great work as usual. I have never seen a better explanation of RNNs; he breaks down such complex theory into small, easy-to-follow, understandable pieces.
