CMU Multilingual NLP 2020 - Multilingual Training and Cross-Lingual Transfer

Graham Neubig via YouTube

Classroom Contents

  1. Many languages are left behind
  2. Roadmap
  3. Cross-lingual transfer
  4. Supporting multiple languages could be tedious
  5. Combining the two methods
  6. Use case: COVID-19 response
  7. Rapid adaptation of massive multilingual models
  8. Meta-learning for multilingual training
  9. Multilingual NMT
  10. Improve zero-shot NMT
  11. Align multilingual representation
  12. Zero-shot transfer for pretrained representations
  13. Massively multilingual training
  14. Training data highly imbalanced
  15. Heuristic Sampling of Data (see the sketch after this list)
  16. Learning to balance data
  17. Problem: sometimes underperforms bilingual model
  18. Multilingual Knowledge Distillation
  19. Adding Language-specific layers
  20. Problem: one-to-many transfer
  21. Problem: multilingual evaluation
  22. Discussion question
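
Chapters 14-16 concern the highly imbalanced training data that massively multilingual models face. A common heuristic of the kind chapter 15 names is temperature-based sampling: a language with n_i of the N total examples is sampled with probability proportional to (n_i / N)^(1/T). Below is a minimal Python sketch of that heuristic, not the lecture's own code; the language codes and corpus sizes are invented for illustration.

    def temperature_sample_probs(sizes, temperature=5.0):
        # p_i is proportional to (n_i / N) ** (1 / T): T = 1 reproduces the
        # raw data distribution, while larger T flattens it toward uniform,
        # upsampling low-resource languages.
        total = sum(sizes.values())
        scaled = {lang: (n / total) ** (1.0 / temperature)
                  for lang, n in sizes.items()}
        z = sum(scaled.values())
        return {lang: s / z for lang, s in scaled.items()}

    # Hypothetical, highly imbalanced corpus sizes (sentence counts).
    sizes = {"fr": 40_000_000, "hi": 1_500_000, "gu": 150_000}
    for T in (1.0, 5.0, 100.0):
        probs = temperature_sample_probs(sizes, T)
        print(T, {lang: round(p, 3) for lang, p in probs.items()})

Raising T interpolates between sampling proportional to corpus size (T = 1) and sampling languages uniformly (T approaching infinity); this is the knob that trades high-resource performance against low-resource coverage.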
