
IBM

Mastering Generative AI: Language Models with Transformers

IBM via edX

Overview

The demand for transformer-based language models is skyrocketing. AI engineers skilled in using transformer-based models for NLP are essential for developing successful gen AI applications. This course builds the job-ready skills employers need.

During the course, you’ll explore the concepts of transformer-based models for natural language processing (NLP). You’ll look at how to apply transformer-based models for text classification, focusing on the encoder component. Plus, you’ll learn about positional encoding, word embedding, and attention mechanisms in language transformers, and their role in capturing contextual information and dependencies.
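To give a sense of what these building blocks look like in practice, here is a minimal, illustrative PyTorch sketch of sinusoidal positional encoding and scaled dot-product self-attention. It is not course material; the function names and toy dimensions are chosen purely for illustration.

```python
import math
import torch

def sinusoidal_positional_encoding(seq_len, d_model):
    """Classic sine/cosine positional encoding added to token embeddings."""
    position = torch.arange(seq_len).unsqueeze(1)                 # (seq_len, 1)
    div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)                  # even dimensions
    pe[:, 1::2] = torch.cos(position * div_term)                  # odd dimensions
    return pe

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)             # (..., seq, seq)
    weights = torch.softmax(scores, dim=-1)
    return weights @ v

# Toy usage: 2 sequences of length 8 with model dimension 16, attending to themselves.
x = torch.randn(2, 8, 16) + sinusoidal_positional_encoding(8, 16)
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # torch.Size([2, 8, 16])
```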

You’ll learn about multi-head attention and decoder-based language modeling with generative pre-trained transformers (GPT) for language translation. You’ll learn how to train and implement these models using PyTorch. You’ll explore encoder-based models with bidirectional encoder representations from transformers (BERT) and train them using masked language modeling (MLM) and next sentence prediction (NSP). Plus, you’ll learn to apply the full transformer architecture to language translation and implement it in PyTorch. The sketch below illustrates two of these pretraining ingredients.
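As an illustration of the objectives mentioned above, the following sketch shows the kind of token masking used for masked language modeling and the causal mask used by decoder-only (GPT-style) models. It is a simplified, assumed example rather than the course's implementation; the 15% masking rate follows the original BERT recipe, and the [MASK] token id is hypothetical.

```python
import torch

def mlm_mask(input_ids, mask_token_id, mask_prob=0.15):
    """Replace ~15% of tokens with [MASK]; labels are -100 elsewhere so a loss
    such as nn.CrossEntropyLoss(ignore_index=-100) only scores masked positions."""
    labels = input_ids.clone()
    masked = torch.rand(input_ids.shape) < mask_prob
    labels[~masked] = -100                     # ignore unmasked positions in the loss
    corrupted = input_ids.clone()
    corrupted[masked] = mask_token_id          # simplest variant: always substitute [MASK]
    return corrupted, labels

def causal_mask(seq_len):
    """Boolean upper-triangular mask so position i attends only to positions <= i."""
    return torch.triu(torch.ones(seq_len, seq_len, dtype=torch.bool), diagonal=1)

# Toy usage with made-up token ids and a hypothetical [MASK] id of 103.
ids = torch.randint(0, 1000, (2, 10))
corrupted, labels = mlm_mask(ids, mask_token_id=103)
print(causal_mask(5))
```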

Throughout, you’ll apply your new skills practically in hands-on activities and you’ll complete a final project tackling a real-world scenario.

If you’re looking to build the job-ready skills for gen AI applications that employers are looking for, ENROLL TODAY and enhance your resume in just 2 weeks!

Prerequisites: To enroll in this course, you need a working knowledge of Python, PyTorch, and machine learning.

Syllabus

Module 1: Fundamental Concepts of Transformer Architecture

  • Video: Course Introduction
  • Reading: Professional Certificate Overview
  • Reading: General Information
  • Reading: Learning Objectives and Syllabus
  • Reading: Grading Scheme
  • Reading: Module Introduction and Learning Objectives
  • Video: Positional Encoding
  • Video: Attention Mechanism
  • Video: Self-attention Mechanism
  • Video: From Attention to Transformers
  • Lab: Attention Mechanism and Positional Encoding
  • Video: Transformers for Classification: Encoder
  • Lab: Applying Transformers for Classification
  • Reading: Summary and Highlights: Fundamental Concepts of Transformer Architecture
  • Practice Quiz: Fundamental Concepts of Transformer Architecture
  • Graded Quiz: Fundamental Concepts of Transformer Architecture

Module 2: Advanced Concepts of Transformer Architecture

  • Reading: Module Introduction and Learning Objectives
  • Video: Language Modeling with Decoders and GPT-like Models
  • Video: Training Decoder Models
  • Video: Decoder Models: PyTorch Implementation: Causal LM
  • Video: Decoder Models: PyTorch Implementation: Training and Inference
  • Lab: Decoder GPT-like Models
  • Reading: Summary and Highlights
  • Practice Quiz: Decoder Models
  • Video: Encoder Models with BERT: Pretraining Using MLM
  • Video: Encoder Models with BERT: Pretraining Using NSP
  • Video: Data Preparation for BERT with PyTorch
  • Video: Pretraining BERT Models with PyTorch
  • Lab: Pretraining BERT Models
  • Lab: Data Preparation for BERT
  • Reading: Summary and Highlights: Encoder Models
  • Practice Quiz: Encoder Models
  • Video: Transformer Architecture for Language Translation
  • Video: Transformer Architecture for Translation: PyTorch Implementation
  • Lab: Transformers for Translation
  • Reading: Summary and Highlights: Advanced Concepts of Transformer Architecture
  • Practice Quiz: Advanced Concepts of Transformer Architecture
  • Graded Quiz: Advanced Concepts of Transformer Architecture

Module 3: Course Cheat Sheet, Glossary and Wrap-up

  • Reading: Cheat Sheet: Language Modeling with Transformers
  • Reading: Course Glossary: Language Modeling with Transformers

Course Wrap-Up

  • Course Conclusion
  • Reading: Team and Acknowledgements
  • Reading: Congratulations and Next Steps
  • Reading: Copyrights and Trademarks
  • Course Rating and Feedback
  • Badge
  • Reading: Frequently Asked Questions
  • Reading: Claim your badge here

Taught by

Joseph Santarcangelo

