CMU Multilingual NLP 2020 - Multilingual Training and Cross-Lingual Transfer
Graham Neubig via YouTube
Syllabus
Many languages are left behind
Roadmap
Cross-lingual transfer
Supporting multiple languages could be tedious
Combining the two methods
Use case: COVID-19 response
Rapid adaptation of massive multilingual models
Meta-learning for multilingual training
Multilingual NMT
Improve zero-shot NMT
Align multilingual representation
Zero-shot transfer for pretrained representations
Massively multilingual training
Training data is highly imbalanced
Heuristic sampling of data (see the first sketch after this list)
Learning to balance data
Problem: sometimes underperforms bilingual model
Multilingual knowledge distillation (see the second sketch after this list)
Adding language-specific layers
Problem: one-to-many transfer
Problem: multilingual evaluation
Discussion question
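
The heuristic sampling item above typically refers to temperature-based sampling: a language with n_l training examples is drawn with probability proportional to (n_l / N)^(1/T), so larger temperatures flatten the distribution and up-sample low-resource languages. A minimal sketch under that assumption; the function name `sampling_probs` and the corpus sizes are illustrative, not taken from the lecture:

```python
import numpy as np

def sampling_probs(sizes, temperature=5.0):
    """Temperature-based sampling probabilities over languages.

    temperature=1.0 samples in proportion to corpus size; larger
    values flatten the distribution, up-sampling low-resource
    languages. T around 5 is a common choice in massively
    multilingual NMT, but treat it as a tunable knob.
    """
    sizes = np.asarray(sizes, dtype=np.float64)
    probs = sizes / sizes.sum()           # proportional probabilities
    probs = probs ** (1.0 / temperature)  # flatten with exponent 1/T
    return probs / probs.sum()            # renormalize to sum to 1

# Illustrative corpus sizes: high-, mid-, and low-resource languages.
sizes = {"en": 1_000_000, "hi": 50_000, "gn": 2_000}
for lang, p in zip(sizes, sampling_probs(list(sizes.values()))):
    print(f"{lang}: {p:.3f}")
```

With T = 1 this reduces to sampling proportional to data size; as T grows, the distribution approaches uniform over languages.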
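For the knowledge distillation item, one common formulation (e.g., in work on multilingual NMT with knowledge distillation) trains a single multilingual student to match the output distributions of stronger bilingual teachers, mixing the usual cross-entropy against gold targets with a KL term against the teacher. A hedged PyTorch sketch; `distillation_loss`, the alpha/temperature defaults, and the tensor shapes are assumptions for illustration, not the lecture's exact recipe:

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=0.5, temperature=1.0):
    """Mix hard cross-entropy with KL against a teacher's soft targets.

    student_logits, teacher_logits: (batch, vocab) scores over target
    words; labels: (batch,) gold word ids. alpha and temperature are
    tunable hyperparameters.
    """
    # Hard loss against the gold translation.
    ce = F.cross_entropy(student_logits, labels)
    # Soft loss: match the (bilingual) teacher's output distribution.
    t = temperature
    kl = F.kl_div(
        F.log_softmax(student_logits / t, dim=-1),
        F.softmax(teacher_logits / t, dim=-1),
        reduction="batchmean",
    ) * (t * t)  # standard scaling to keep gradient magnitudes comparable
    return alpha * ce + (1.0 - alpha) * kl
```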