Sequence Models for Time Series and Natural Language Processing
Google Cloud and Google via Coursera
Overview
• Predict future values of a time-series
• Classify free-form text
• Address time-series and text problems with recurrent neural networks
• Choose between RNNs/LSTMs and simpler models
• Train and reuse word embeddings in text problems
You will get hands-on practice building and optimizing your own text classification and sequence models on a variety of public datasets in the labs we’ll work on together.
Prerequisites: Basic SQL, familiarity with Python and TensorFlow
Syllabus
- Working with Sequences
- In this module, you’ll learn what a sequence is, see how to prepare sequence data for modeling, get introduced to some classical approaches to sequence modeling, and practice applying them; a minimal data-preparation sketch appears after this syllabus.
- Recurrent Neural Networks
- In this module, we introduce recurrent neural nets, explain how they address the variable-length sequence problem, explain how our traditional optimization procedure applies to RNNs, and review the limits of what RNNs can and can’t represent.
- Dealing with Longer Sequences
- In this module we dive deeper into RNNs. We’ll talk about LSTMs, deep RNNs, working with real-world data, and more.
- Text Classification
- In this module we look at different ways of working with text and how to create your own text classification models.
- Reusable Embeddings
- Labeled data for our classification models is expensive and precious, so in this module we address how to reuse pre-trained embeddings in our models with TensorFlow Hub; a minimal classifier sketch appears after this syllabus.
- Encoder-Decoder Models
- In this module, we focus on a sequence-to-sequence model called the encoder-decoder network and use it to solve tasks such as machine translation, text summarization, and question answering.
- Summary
- In this final module, we review what you have learned so far about sequence modeling for time-series and natural language data.
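The syllabus above is descriptive only, but two of the modules map directly onto short pieces of TensorFlow code. The sketches below are not taken from the course; the toy data, window length, layer sizes, and the nnlm-en-dim50 TensorFlow Hub module are assumptions chosen for illustration, and TensorFlow 2.x is assumed.

A minimal sketch of preparing sequence data (the "Working with Sequences" module): each training example is a fixed-length window of a time series, and the target is the value that follows the window.

```python
import numpy as np
import tensorflow as tf

series = np.arange(100, dtype="float32")   # toy univariate series (assumption)
window = 10

# Inputs are windows series[i : i + window]; the target for window i is
# series[i + window], i.e. the next value after the window.
dataset = tf.keras.utils.timeseries_dataset_from_array(
    data=series[:-1],
    targets=series[window:],
    sequence_length=window,
    batch_size=32)

for windows, targets in dataset.take(1):
    print(windows.shape, targets.shape)    # (32, 10) (32,)
```

A minimal sketch of reusing a pre-trained embedding (the "Text Classification" and "Reusable Embeddings" modules): a frozen TensorFlow Hub sentence embedding feeding a small Keras classifier. The tensorflow_hub package and the Hub module URL below are assumptions.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Pre-trained 50-dimensional English sentence embedding, kept frozen here.
embedding = hub.KerasLayer(
    "https://tfhub.dev/google/nnlm-en-dim50/2",
    input_shape=[], dtype=tf.string, trainable=False)

model = tf.keras.Sequential([
    embedding,                                       # string -> 50-d vector
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary classification
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Hypothetical toy data, just to show the expected input format.
sentences = tf.constant(["great movie", "terrible plot"])
labels = tf.constant([1.0, 0.0])
model.fit(sentences, labels, epochs=1)
```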
Taught by
Google Cloud Training
Related Courses
- Natural Language Processing, Higher School of Economics (5.0)
- Natural Language Processing with Sequence Models, deeplearning.ai
- Natural Language Processing with Probabilistic Models, deeplearning.ai
- Natural Language Processing in TensorFlow, deeplearning.ai (4.0)
- Sequence Models, deeplearning.ai, Stanford University (4.8)
- Natural Language Processing with Attention Models, deeplearning.ai
Reviews
1.0 rating, based on 1 review
- Anonymous is taking this course right now.
This course has a FULL PAYWALL, meaning you can't audit it and can't see anything without paying.
In the MOOC era this is a shame, especially since Google is behind it; they could spend a few dollars on spreading education that leads to using their products (I'm a Google Cloud business user).
There is no chance I'm going to take this course, even if I decide to take a paid track later.