Neural Nets for NLP - Multi-task, Multi-lingual Learning

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Remember, Neural Nets are Feature Extractors!
  3. Types of Learning
  4. Plethora of Tasks in NLP
  5. Rule of Thumb 1: Multitask to Increase Data
  6. Rule of Thumb 2
  7. Standard Multi-task Learning
  8. Examples of Pre-training Encoders
  9. Regularization for Pre-training (e.g. Barone et al. 2017)
  10. Selective Parameter Adaptation
  11. Soft Parameter Tying
  12. Supervised Domain Adaptation through Feature Augmentation
  13. Unsupervised Learning through Feature Matching
  14. Multilingual Structured Prediction / Multilingual Outputs
  15. Multi-lingual Sequence-to-sequence Models
  16. Types of Multi-tasking
  17. Multiple Annotation Standards
  18. Different Layers for Different
  19. Summary of design dimensions
