Overview
This course on Neural Networks for Natural Language Processing (NLP) covers topics such as Encoder-Decoder Models, Conditional Generation, Ensembling, Evaluation, and Types of Data to Condition On. Students will learn about language models and conditioned language models, search algorithms for generation (ancestral sampling, greedy search, and beam search), and evaluation methods such as human evaluation and perplexity. The teaching method includes lectures and discussions. This course is intended for students interested in neural networks and NLP, particularly those with a background in computer science or linguistics.
Syllabus
Language Models • Generative models of text
Conditioned Language Models
Conditional Language Models • Models of the probability P(Y|X) of an output Y given an input X
One Type of Conditional Language Model (Sutskever et al. 2014; see the encoder-decoder sketch below)
How to Pass Hidden State?
The Generation Problem
Ancestral Sampling
Greedy Search
Beam Search (sampling, greedy, and beam search are sketched below)
Log-linear Interpolation • Weighted combination of log probabilities, then normalize (see the ensembling sketch below)
Linear or Log-linear?
Stacking
Basic Evaluation Paradigm
Human Evaluation
Perplexity (see the sketch below)
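The sketches below illustrate several of the syllabus topics. First, a minimal PyTorch sketch of a Sutskever et al. (2014) style encoder-decoder, where one answer to "How to Pass Hidden State?" is to initialize the decoder with the encoder's final state. The class name, dimensions, and vocabulary sizes are illustrative assumptions, not the course's reference code.

```python
# A minimal encoder-decoder: the encoder's final hidden state
# initializes the decoder, carrying the conditioning input X into P(Y|X).
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=64, hid_dim=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        # Encode the source; keep only the final (h, c) state.
        _, state = self.encoder(self.src_emb(src_ids))
        # Pass the hidden state: the decoder starts from the encoder's
        # final state rather than from zeros.
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_out)  # logits over the target vocabulary

model = Seq2Seq(src_vocab=1000, tgt_vocab=1000)
src = torch.randint(0, 1000, (2, 7))  # batch of 2 source sequences
tgt = torch.randint(0, 1000, (2, 5))  # batch of 2 target prefixes
print(model(src, tgt).shape)          # torch.Size([2, 5, 1000])
```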
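Next, the generation problem: given a trained model, how do we produce an output? Below is a sketch of ancestral sampling (draw each token from the model's distribution) versus greedy search (always take the single most probable token). The toy next_token_probs function and vocabulary stand in for a real model and are purely assumptions.

```python
# Ancestral sampling vs. greedy search over a toy next-token distribution.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["<eos>", "a", "b", "c"]

def next_token_probs(prefix):
    # Toy model: longer prefixes make <eos> more likely.
    logits = np.array([len(prefix), 2.0, 1.0, 0.5])
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def ancestral_sample(max_len=10):
    # Draw each token from P(y_t | y_<t): exact samples from the model.
    prefix = []
    while len(prefix) < max_len:
        tok = rng.choice(len(VOCAB), p=next_token_probs(prefix))
        if VOCAB[tok] == "<eos>":
            break
        prefix.append(VOCAB[tok])
    return prefix

def greedy_search(max_len=10):
    # Take the most probable token at every step: deterministic,
    # but not guaranteed to find the highest-probability sequence.
    prefix = []
    while len(prefix) < max_len:
        tok = int(np.argmax(next_token_probs(prefix)))
        if VOCAB[tok] == "<eos>":
            break
        prefix.append(VOCAB[tok])
    return prefix

print(ancestral_sample())  # random, varies with the seed
print(greedy_search())     # always the same output
```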
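Beam search generalizes greedy search by keeping the k highest-scoring partial hypotheses (scored by summed log probability) at each step instead of just one. A small sketch under the same toy-model assumption:

```python
# Beam search: maintain the k best partial hypotheses per step.
import numpy as np

VOCAB = ["<eos>", "a", "b", "c"]

def next_token_probs(prefix):
    logits = np.array([len(prefix), 2.0, 1.0, 0.5])
    exp = np.exp(logits - logits.max())
    return exp / exp.sum()

def beam_search(k=2, max_len=10):
    # Each hypothesis is (log probability, token list, finished?).
    beams = [(0.0, [], False)]
    for _ in range(max_len):
        candidates = []
        for score, prefix, done in beams:
            if done:
                candidates.append((score, prefix, True))
                continue
            for tok, p in enumerate(next_token_probs(prefix)):
                if VOCAB[tok] == "<eos>":
                    candidates.append((score + np.log(p), prefix, True))
                else:
                    candidates.append((score + np.log(p), prefix + [VOCAB[tok]], False))
        # Prune to the k highest-scoring hypotheses.
        beams = sorted(candidates, key=lambda c: c[0], reverse=True)[:k]
        if all(done for _, _, done in beams):
            break
    return beams

for score, tokens, _ in beam_search():
    print(f"{score:.3f} {tokens}")
```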
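For ensembling, the syllabus contrasts linear interpolation (a weighted average of probabilities) with log-linear interpolation (a weighted combination of log probabilities, followed by normalization). A sketch with two made-up distributions:

```python
# Linear vs. log-linear interpolation of two models' distributions.
import numpy as np

p1 = np.array([0.7, 0.2, 0.1])  # model 1's distribution over 3 tokens
p2 = np.array([0.1, 0.2, 0.7])  # model 2's distribution
lam = 0.5                       # interpolation weight

# Linear: a mixture of probabilities; already sums to 1.
linear = lam * p1 + (1 - lam) * p2

# Log-linear: weighted sum of log probabilities, then renormalize.
log_combo = lam * np.log(p1) + (1 - lam) * np.log(p2)
log_linear = np.exp(log_combo) / np.exp(log_combo).sum()

print(linear)      # [0.4 0.2 0.4]
print(log_linear)  # ~[0.363 0.274 0.363]
```

Linear interpolation behaves like an OR (one model's support is enough to give a token probability), while log-linear interpolation behaves more like an AND (a token scores well only if every model assigns it reasonable probability): note how the token both models agree on gains relative weight under the log-linear combination.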
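Finally, perplexity, a standard intrinsic evaluation for language models: the exponentiated average negative log likelihood per token, where lower is better. A minimal example with invented per-token probabilities:

```python
# Perplexity from per-token probabilities P(w_t | w_<t).
import math

token_probs = [0.25, 0.5, 0.125, 0.5]  # invented model probabilities
nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
print(math.exp(nll))  # perplexity ~= 3.36; lower is better
```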
Taught by
Graham Neubig