
Overview

This conference talk by Deepak Das at DevConf.IN 2025 explores best practices and optimization techniques for building accurate custom Large Language Models (LLMs) using RHEL AI, a Red Hat product based on the upstream InstructLab project. Learn how to customize LLMs with private data while maintaining performance across the key optimization stages: Data Seed, Synthetic Data Generation, Training, Evaluation and Re-Training, and Prompt Engineering. Discover practical techniques applicable not only to RHEL AI but also to upstream InstructLab and to third-party LLMs in general. The 49-minute presentation provides valuable insights for anyone looking to improve LLM accuracy and performance in their AI implementations.
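
To make the last of those stages concrete, here is a minimal sketch (not taken from the talk) of the prompt-engineering step: querying a customized model through an OpenAI-compatible endpoint, such as one a local inference server for an InstructLab- or RHEL AI-trained model might expose. The base URL, model name, and prompts below are placeholder assumptions, not values from the presentation.

```python
# Illustrative sketch only: prompt engineering against a locally served,
# OpenAI-compatible endpoint (e.g. a custom model served for inference).
# The base_url, model name, and prompts are placeholder assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local inference endpoint
    api_key="none",                       # many local servers ignore the key
)

# A system prompt that scopes the model to the customized domain, plus a
# short few-shot example, is a common prompt-engineering pattern.
messages = [
    {"role": "system",
     "content": "You are an assistant trained on our internal product docs. "
                "Answer only from that knowledge; otherwise reply 'unknown'."},
    {"role": "user", "content": "Example: What does service X do?"},
    {"role": "assistant", "content": "Service X handles billing exports."},
    {"role": "user", "content": "How do I rotate the API keys for service X?"},
]

response = client.chat.completions.create(
    model="custom-granite-model",  # placeholder name for the fine-tuned model
    messages=messages,
    temperature=0.2,               # low temperature for more consistent answers
)
print(response.choices[0].message.content)
```

In practice the same loop is also where evaluation and re-training feed back in: answers that miss or hallucinate point to gaps in the seed data or synthetic data for the next training round.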
Syllabus
RHEL AI: Best Practices And Optimization Techniques To Achieve Accurate Custom LLM - DevConf.IN 2025
Taught by
DevConf