
Sequence Models for Time Series and Natural Language Processing

Google Cloud and Google via Coursera

1 Review · 49 students interested
  • Provider: Coursera
  • Cost: $49
  • Session: Upcoming
  • Language: English
  • Certificate: Paid certificate available
  • Start Date:
  • Duration: 2 weeks


Overview

This course is an introduction to sequence models and their applications, including an overview of sequence model architectures and how to handle inputs of variable length.

• Predict future values of a time series
• Classify free-form text
• Address time-series and text problems with recurrent neural networks
• Choose between RNNs/LSTMs and simpler models
• Train and reuse word embeddings in text problems

In the labs we'll work on together, you will get hands-on practice building and optimizing your own text classification and sequence models on a variety of public datasets.
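The course page itself doesn't include any code, but a minimal sketch of the kind of Keras text classifier these labs build might look like the following. The vocabulary size, sequence length, and layer sizes are illustrative placeholders, not values from the course.

```python
import tensorflow as tf

# Illustrative sketch only: a small Keras text classifier over padded,
# integer-encoded word sequences. All sizes below are assumptions.
VOCAB_SIZE = 10_000   # assumed vocabulary size
MAX_LEN = 100         # assumed padded sequence length

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(MAX_LEN,), dtype="int32"),
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),       # learn word embeddings
    tf.keras.layers.GlobalAveragePooling1D(),        # average over the sequence
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary label
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```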

Prerequisites: Basic SQL, familiarity with Python and TensorFlow

Syllabus

Working with Sequences
-In this module, you’ll learn what a sequence is, see how to prepare sequence data for modeling, get an introduction to some classical approaches to sequence modeling, and practice applying them.

Recurrent Neural Networks
-In this module, we introduce recurrent neural networks, explain how they address the variable-length sequence problem, show how the usual optimization procedure applies to RNNs, and review the limits of what RNNs can and can’t represent.

Dealing with Longer Sequences
-In this module, we dive deeper into RNNs. We’ll talk about LSTMs, deep RNNs, working with real-world data, and more.
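As a rough illustration of what an LSTM-based time-series model can look like (not code from the course; the window size, layer width, and random data are assumptions made for the sketch):

```python
import numpy as np
import tensorflow as tf

# Illustrative sketch only: an LSTM that predicts the next value of a
# univariate time series from a sliding window of the previous 30 steps.
WINDOW = 30  # assumed window length

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),  # (timesteps, features)
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),                  # next-step prediction
])
model.compile(optimizer="adam", loss="mse")

# Random data just to show the expected shapes: (samples, timesteps, features).
x = np.random.rand(256, WINDOW, 1).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)
```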

Text Classification
-In this module we look at different ways of working with text and how to create your own text classification models.

Reusable Embeddings
-Labeled data for our classification models is expensive and precious. Here we address how to reuse pre-trained embeddings in our models with TensorFlow Hub.
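A minimal sketch of reusing a pre-trained text embedding from TensorFlow Hub as the first layer of a classifier; the Hub module handle and layer sizes here are illustrative choices, not taken from the course materials:

```python
import tensorflow as tf
import tensorflow_hub as hub

# Illustrative sketch only: a pre-trained sentence embedding from TensorFlow Hub
# is reused as a frozen first layer, so the classifier needs less labeled data.
embed = hub.KerasLayer(
    "https://tfhub.dev/google/nnlm-en-dim50/2",  # assumed module choice
    input_shape=[], dtype=tf.string, trainable=False)

model = tf.keras.Sequential([
    embed,                                           # raw strings in, 50-d vectors out
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary label
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```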

Encoder-Decoder Models
-In this module, we focus on a sequence-to-sequence architecture called the encoder-decoder network and use it to solve tasks such as machine translation, text summarization, and question answering.

Summary
-In this final module, we review what you have learned so far about sequence modeling for time-series and natural language data.

Taught by

Google Cloud Training


Review for Coursera's Sequence Models for Time Series and Natural Language Processing
1.0 Based on 1 review

  • 5 star 0%
  • 4 star 0%
  • 3 star 0%
  • 2 star 0%
  • 1 star 100%

Did you take this course? Share your experience with other students.

Write a review
Anonymous
1.0 4 months ago
Anonymous is taking this course right now.
This course has a FULL PAYWALL,

meaning you can't audit the course and can't see anything without paying.

In the MOOC era, this is a shame, especially since Google is behind it. They could spend a few dollars on spreading education that leads to using their products (I'm a Google Cloud business user).

There's NO chance I'm going to take this course, even if I decide to take a paid track later.
