CMU Multilingual NLP - Unsupervised Translation

Graham Neubig via YouTube


Classroom Contents


  1. Intro
  2. Conditional Text Generation
  3. Modeling: Conditional Language Models
  4. What if we don't have parallel data?
  5. Can't we just collect/generate the data?
  6. Outline
  7. Initialization: Unsupervised Word Translation (see the alignment sketch after this list)
  8. Unsupervised Word Translation: Adversarial Training
  9. Back-translation (see the back-translation sketch after this list)
  10. One-slide primer on phrase-based statistical MT
  11. Unsupervised Statistical MT
  12. Bidirectional Modeling (the same encoder-decoder is used for both languages, initialized with cross-lingual word embeddings)
  13. Unsupervised MT: Training Objective 1
  14. How does it work?
  15. Unsupervised NMT: Training Objective 3
  16. In summary
  17. When Does Unsupervised Machine Translation Work?
  18. Reasons for this poor performance
  19. Open Problems
  20. Better Initialization: Cross-Lingual Language Models
  21. Better Initialization: Multilingual BART
  22. Better Initialization: Masked Sequence-to-Sequence Model (MASS), an encoder-decoder formulation of masked language modelling (see the masking sketch after this list)
  23. Multilingual Unsupervised MT
  24. Multilingual UNMT
  25. How practical is the strict unsupervised scenario?
  26. Related Area: Style Transfer
  27. Discussion Question
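
The unsupervised word translation step (items 7-8) learns a linear map between two monolingual embedding spaces without a dictionary; adversarial training produces an initial alignment, which is then refined with an orthogonal Procrustes step over hypothesized translation pairs. Below is a minimal sketch of that refinement only, with random vectors standing in for real embeddings and a hidden rotation standing in for the true cross-lingual map; it is illustrative, not the lecture's exact procedure.

```python
import numpy as np

def procrustes(X, Y):
    # Orthogonal W minimizing ||X @ W - Y||_F (Schönemann, 1966):
    # with the SVD X^T Y = U S V^T, the solution is W = U @ V^T.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

# Toy usage: rows of `src`/`tgt` are embeddings of hypothesized
# translation pairs (in practice induced by adversarial alignment).
rng = np.random.default_rng(0)
src = rng.normal(size=(1000, 50))                      # source-language embeddings
hidden_map, _ = np.linalg.qr(rng.normal(size=(50, 50)))  # unknown "true" rotation
tgt = src @ hidden_map                                 # target-language embeddings
W = procrustes(src, tgt)
print(np.allclose(src @ W, tgt, atol=1e-6))            # True: rotation recovered
```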
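Back-translation (item 9) manufactures synthetic parallel data from monolingual text: translate target-language sentences into the source language with the current model, then train on the synthetic pair with the clean monolingual text as the supervision signal, in both directions. The sketch below is hypothetical scaffolding: `Seq2Seq`, `translate`, and `train_step` are stand-in names for a real shared encoder-decoder, not an actual library API.

```python
class Seq2Seq:
    """Hypothetical stand-in for a shared encoder-decoder over both languages."""

    def translate(self, sentences, to_lang):
        # Placeholder decode step; gradients are not taken through this.
        return [f"<{to_lang}> {s}" for s in sentences]

    def train_step(self, src, tgt):
        # Placeholder supervised step: cross-entropy of tgt given src.
        return 0.0

def backtranslation_round(model, mono_l1, mono_l2):
    # L2 monolingual text -> synthetic L1, then train the L1->L2 direction
    # (the real L2 side provides a clean target).
    synth_l1 = model.translate(mono_l2, to_lang="l1")
    loss = model.train_step(src=synth_l1, tgt=mono_l2)
    # Symmetric direction: L1 -> synthetic L2, then train L2->L1.
    synth_l2 = model.translate(mono_l1, to_lang="l2")
    loss += model.train_step(src=synth_l2, tgt=mono_l1)
    return loss
```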
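The MASS objective (item 22) masks one contiguous span on the encoder side and trains the decoder to reconstruct exactly that span. A simplified sketch of how one such training example could be built, assuming plain token lists and a `<mask>` symbol (real implementations add further details, such as masking the decoder's non-span inputs):

```python
import random

MASK = "<mask>"

def mass_example(tokens, mask_ratio=0.5):
    # Pick a contiguous span and mask it in the encoder input;
    # the decoder predicts the span with teacher forcing (shifted right).
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = random.randrange(len(tokens) - span_len + 1)
    enc_input = tokens[:start] + [MASK] * span_len + tokens[start + span_len:]
    target = tokens[start:start + span_len]
    dec_input = [MASK] + target[:-1]
    return enc_input, dec_input, target

enc, dec, tgt = mass_example("the quick brown fox jumps over the dog".split())
```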
