

Large Language Models: Foundation Models from the Ground Up

Databricks via edX

Overview

This course dives into the details of LLM foundation models. You will learn about the innovations that led to the proliferation of transformer-based architectures, from encoder models (BERT) to decoder models (GPT) to encoder-decoder models (T5). You will also learn about the recent breakthroughs behind applications like ChatGPT. You will gain an understanding of the latest advances that continue to improve LLM functionality, including Flash Attention, LoRA, ALiBi, and parameter-efficient fine-tuning (PEFT) methods. The course concludes with an overview of multi-modal LLM developments for NLP problems that combine text, audio, and visual components.
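For readers unfamiliar with the PEFT methods mentioned above, the sketch below illustrates the general idea behind LoRA using the Hugging Face peft library. It is a minimal illustration, not course material; the base model name and hyperparameters are assumptions chosen for demonstration.

# Illustrative sketch (not from the course): wrapping a causal LM with LoRA
# adapters via the Hugging Face `peft` library. Model choice and hyperparameters
# below are assumptions for demonstration only.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = AutoModelForCausalLM.from_pretrained("gpt2")  # any causal LM works

lora_cfg = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    target_modules=["c_attn"],  # attention projection to adapt (module name is model-specific)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # only the small adapter matrices are trainable

Because only the low-rank adapter matrices are updated, the number of trainable parameters is a small fraction of the full model, which is what makes fine-tuning "efficient" in the PEFT sense.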

Syllabus

  • Module 1 - Transformer Architecture: Attention & Transformer Fundamentals

  • Module 2 - Efficient Fine-Tuning

  • Module 3 - Deployment and Hardware Considerations

  • Module 4 - Beyond Text-Based LLMs: Multi-Modality

Taught by

Sam Raymond, Chengyin Eng, Joseph Bradley and Matei Zaharia

