

Neural Nets for NLP - Multi-task, Multi-lingual Learning

Graham Neubig via YouTube

Overview

This course on neural networks for NLP focuses on multi-task and multi-lingual learning. Learning outcomes include understanding different types of learning, such as multitasking to increase data and standard multi-task learning; pre-training encoders and regularization for pre-training; selective parameter adaptation and soft parameter tying; supervised domain adaptation through feature augmentation; unsupervised learning through feature matching; and multilingual structured prediction. The course also covers multilingual sequence-to-sequence models, multiple annotation standards, and using different layers for different tasks. Teaching is through lectures and examples, and the course is intended for students interested in neural networks for natural language processing, particularly those looking to go deeper into multi-task and multi-lingual learning.
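For a concrete picture of the "standard multi-task learning" setup mentioned above, here is a minimal PyTorch-style sketch (not code from the course; the model, task names, and dimensions are illustrative assumptions): one shared encoder feeds task-specific output heads, and training alternates mini-batches between tasks so the shared parameters benefit from both datasets.

```python
# Minimal sketch of standard multi-task learning: a shared encoder with
# task-specific output heads. All names and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    def __init__(self, vocab_size, num_sentiment_labels, num_topic_labels,
                 emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden_dim, batch_first=True)  # shared across tasks
        self.heads = nn.ModuleDict({                                   # one head per task
            "sentiment": nn.Linear(hidden_dim, num_sentiment_labels),
            "topic": nn.Linear(hidden_dim, num_topic_labels),
        })

    def forward(self, token_ids, task):
        # token_ids: (batch, seq_len); use the final hidden state as a sentence vector
        _, (h_n, _) = self.encoder(self.embed(token_ids))
        return self.heads[task](h_n[-1])  # logits for the selected task

model = MultiTaskModel(vocab_size=10000, num_sentiment_labels=2, num_topic_labels=5)
batch = torch.randint(0, 10000, (4, 12))   # 4 toy sentences of 12 token ids each
loss = nn.functional.cross_entropy(model(batch, "sentiment"),
                                   torch.zeros(4, dtype=torch.long))
# In practice, alternate batches drawn from each task's dataset
# ("multitask to increase data").
```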

Syllabus

Intro
Remember, Neural Nets are Feature Extractors!
Types of Learning
Plethora of Tasks in NLP
Rule of Thumb 1: Multitask to Increase Data
Rule of Thumb 2
Standard Multi-task Learning
Examples of Pre-training Encoders
Regularization for Pre-training (e.g. Barone et al. 2017)
Selective Parameter Adaptation
Soft Parameter Tying (see the sketch after this syllabus)
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multilingual Structured Prediction / Multilingual Outputs
Multi-lingual Sequence-to-sequence Models
Types of Multi-tasking
Multiple Annotation Standards
Different Layers for Different Tasks
Summary of design dimensions
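
As a rough illustration of the "soft parameter tying" topic listed above (a sketch of the general technique, not code from the lecture): instead of sharing parameters exactly across tasks, each task keeps its own copy and an L2 penalty pulls corresponding parameters toward each other.

```python
# Sketch of soft parameter tying: two task-specific parameter sets are kept
# close by an L2 penalty rather than being shared outright. Illustrative only;
# the penalty weighting scheme is an assumption.
import torch

def soft_tying_penalty(params_a, params_b, strength=1e-3):
    """Sum of squared differences between corresponding parameters, scaled by `strength`."""
    return strength * sum(((pa - pb) ** 2).sum() for pa, pb in zip(params_a, params_b))

# Usage (hypothetical models with identical architectures):
# loss = task_a_loss + soft_tying_penalty(model_a.parameters(), model_b.parameters())
```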

Taught by

Graham Neubig

