This talk by Sebastian Wind from NHR@FAU explores the fundamentals of Large Language Models (LLMs) and their relationship to high-performance computing. Discover how LLMs work, how they are architected, and how they can be trained efficiently on HPC systems. Learn about scaling strategies and the principles derived from empirical scaling laws that guide effective LLM training and deployment. Gain insights into the challenges and opportunities these AI systems present, with a focus on the open-source community's role in advancing LLM accessibility and innovation. Accompanying slides are available for download, and additional materials from past HPC Café events can be accessed through the provided link.
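As a concrete illustration of the scaling-law principles the talk covers, here is a minimal sketch based on the widely cited Chinchilla rule of thumb (compute-optimal training uses roughly 20 tokens per parameter, with training compute C ≈ 6·N·D FLOPs). The function name and the 20-tokens-per-parameter constant are illustrative assumptions, not material from the talk itself.

```python
def chinchilla_estimate(n_params: float, tokens_per_param: float = 20.0):
    """Estimate a compute-optimal token budget and training FLOPs for a
    model with n_params parameters, using the Chinchilla rule of thumb
    (~20 training tokens per parameter) and the approximation
    C ≈ 6 * N * D FLOPs (N = parameters, D = tokens)."""
    n_tokens = tokens_per_param * n_params          # compute-optimal token count
    flops = 6.0 * n_params * n_tokens               # rough training compute in FLOPs
    return n_tokens, flops

# Example: a 7B-parameter model
tokens, flops = chinchilla_estimate(7e9)
print(f"tokens ≈ {tokens:.2e}, training compute ≈ {flops:.2e} FLOPs")
```

Back-of-the-envelope estimates like this are how scaling laws inform HPC planning: given a fixed compute budget, they suggest how to trade off model size against training-data size.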
HPC Café: Large Language Models for Dummies
Taught by NHR@FAU