
YouTube

Neural Nets for NLP 2019 - Attention

Graham Neubig via YouTube

Overview

This course covers attention in neural networks for natural language processing (NLP). Learning outcomes include understanding different types of attention mechanisms, improvements to attention, and specialized attention varieties. Students will learn about encoder-decoder models, calculating attention, attention score functions, and incorporating Markov properties. The teaching method combines lectures with a case study of "Attention Is All You Need." The course is intended for students interested in neural networks, NLP, and machine learning.
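As a taste of the core computation the lectures cover, attention over a set of encoder states can be sketched as follows. This is a minimal NumPy illustration of dot-product attention, not code from the course; the function and variable names are our own:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a vector of scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_product_attention(query, keys, values):
    """Dot-product attention: score each key against the query,
    normalize the scores with a softmax, and return the
    attention-weighted sum of the values (the context vector)."""
    scores = keys @ query               # one score per input position
    weights = softmax(scores)           # attention distribution (sums to 1)
    context = weights @ values          # weighted combination of values
    return context, weights

# Toy example: 3 encoder states of dimension 4, used as both keys and values.
rng = np.random.default_rng(0)
keys = values = rng.normal(size=(3, 4))
query = rng.normal(size=4)
context, weights = dot_product_attention(query, keys, values)
```

Other score functions covered in the lecture (bilinear, multi-layer perceptron) differ only in how `scores` is computed; the softmax-and-weighted-sum pattern stays the same.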

Syllabus

Intro
Encoder-decoder Models
Sentence Representations
Basic Idea (Bahdanau et al. 2015)
Calculating Attention (2)
A Graphical Example
Attention Score Functions (1)
Input Sentence
Hierarchical Structures (Yang et al. 2016)
Multiple Sources
Coverage (Problem: Neural Models Tend to Drop or Repeat Output)
Incorporating Markov Properties (Cohn et al. 2015)
Bidirectional Training
Hard Attention
Summary of the Transformer (Vaswani et al. 2017)
Attention Tricks
Training Tricks
Masking for Training
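The last syllabus item, masking for training, concerns batching variable-length sequences: padded positions must receive zero attention weight. A minimal sketch of the idea (our own NumPy illustration, not the lecture's code) sets padded scores to negative infinity before the softmax:

```python
import numpy as np

def masked_softmax(scores, mask):
    """Softmax over attention scores that ignores padded positions.
    Entries where mask is False are set to -inf before the softmax,
    so they receive exactly zero attention weight."""
    scores = np.where(mask, scores, -np.inf)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Batch of 2 sequences padded to length 4; the second has true length 2.
scores = np.array([[0.5, 1.2, -0.3, 0.8],
                   [2.0, 0.1, 0.0, 0.0]])
mask = np.array([[1, 1, 1, 1],
                 [1, 1, 0, 0]], dtype=bool)
w = masked_softmax(scores, mask)
```

Masking lets the whole batch be processed in one matrix operation rather than one sequence at a time, which is the efficiency point the slide title alludes to.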

Taught by

Graham Neubig

Reviews
