CMU Neural Nets for NLP 2018 - Conditioned Generation

Graham Neubig via YouTube

Overview

This course covers the fundamentals of neural networks for natural language processing (NLP), with a focus on conditioned generation. By the end, students will be able to understand and implement language models and conditioned language models, apply ensembling techniques such as parameter averaging, ensemble distillation, and stacking, and use basic evaluation paradigms for NLP tasks, including human evaluation and perplexity. The course also teaches ancestral sampling and the trade-off between linear and log-linear model combination. The teaching method combines theoretical explanations, practical examples, and a discussion of evaluating unconditioned generation. This course is intended for individuals interested in NLP, neural networks, and machine learning.
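
The ancestral sampling skill mentioned above is simple enough to sketch in a few lines: generate a sequence left to right, drawing each token from the model's distribution over next tokens given the history. The bigram table, vocabulary, and function names below are illustrative toys, not material from the course itself.

```python
import random

# Toy next-token distributions standing in for a trained language model.
# (Hypothetical example; the real course uses neural LMs.)
BIGRAM_TABLE = {
    "<s>": {"the": 0.6, "a": 0.4},
    "the": {"cat": 0.5, "dog": 0.3, "</s>": 0.2},
    "a":   {"cat": 0.4, "dog": 0.4, "</s>": 0.2},
    "cat": {"</s>": 1.0},
    "dog": {"</s>": 1.0},
}

def ancestral_sample(max_len=20):
    """Draw each token from p(w_t | w_{<t}), left to right, until </s>."""
    tokens = ["<s>"]
    while len(tokens) < max_len:
        probs = BIGRAM_TABLE[tokens[-1]]
        words, weights = zip(*probs.items())
        tok = random.choices(words, weights=weights)[0]
        if tok == "</s>":
            break
        tokens.append(tok)
    return tokens[1:]  # drop the start symbol

print(ancestral_sample())  # e.g. ['the', 'cat']
```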

Syllabus

Intro
Language Models (generative models of text)
Conditioned Language Models
Ancestral Sampling
Ensembling
Linear or Log Linear? (see the sketch after the syllabus)
Parameter Averaging
Ensemble Distillation (e.g. Kim et al. 2016)
Stacking
Basic Evaluation Paradigm
Human Evaluation
Perplexity
A Contrastive Note: Evaluating Unconditioned Generation
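
To make the syllabus's "Linear or Log Linear?" question and the perplexity metric concrete, here is a minimal sketch. It assumes toy dictionaries mapping next tokens to probabilities from two hypothetical models; the function names are illustrative, not from the lecture. Linear interpolation averages the probabilities themselves, while log-linear interpolation averages log-probabilities (equivalently, multiplies the distributions) and renormalizes, so a token must score well under both models.

```python
import math

def ensemble_linear(p1, p2, w=0.5):
    """Linear interpolation: a weighted average of the probabilities."""
    vocab = set(p1) | set(p2)
    return {t: w * p1.get(t, 0.0) + (1 - w) * p2.get(t, 0.0) for t in vocab}

def ensemble_log_linear(p1, p2, w=0.5):
    """Log-linear interpolation: average log-probs, then renormalize."""
    vocab = set(p1) & set(p2)  # zero under either model => zero overall
    scores = {t: w * math.log(p1[t]) + (1 - w) * math.log(p2[t]) for t in vocab}
    z = sum(math.exp(s) for s in scores.values())
    return {t: math.exp(s) / z for t, s in scores.items()}

def perplexity(token_log_probs):
    """Perplexity: exp of the average negative log-likelihood per token."""
    return math.exp(-sum(token_log_probs) / len(token_log_probs))

p1 = {"cat": 0.7, "dog": 0.2, "bird": 0.1}
p2 = {"cat": 0.4, "dog": 0.6}
print(ensemble_linear(p1, p2))      # mixes mass from both models
print(ensemble_log_linear(p1, p2))  # "bird" drops out: p2 gives it zero
print(perplexity([math.log(0.5), math.log(0.25)]))  # ~2.83
```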

Taught by

Graham Neubig
