YouTube

Distilling Foundation Models via Energy Hessians for Fast, Specialized Machine Learning Force Fields

Valence Labs via YouTube

Overview

This conference talk presents a method for transferring knowledge from large foundation models to smaller, faster Machine Learning Force Fields (MLFFs) specialized for specific chemical applications. Researchers Ishan Amin and Sanjeev Raja describe a knowledge distillation procedure that matches the Hessians of the energy predictions between teacher and student models, yielding specialized MLFFs that run up to 20 times faster while matching or exceeding the teacher's performance. The talk covers their approach across multiple foundation models, large-scale datasets, chemical subsets, and downstream tasks, and shows that the distilled models preserve energy conservation during molecular dynamics simulations while benefiting from the representations learned by large-scale teachers. The result is a new paradigm for MLFF development that combines the scalability of foundation models with the efficiency of specialized simulation engines for common chemical subsets.
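To make the idea concrete, here is a minimal numpy sketch of a Hessian-matching distillation loss. This is an illustration only, not the authors' implementation: the toy quadratic energy functions, the finite-difference Hessian, and the plain Frobenius-norm loss are all assumptions for demonstration (in practice one would use learned MLFF energies, automatic differentiation, and efficient sampling of Hessian rows rather than the full matrix).

```python
import numpy as np

def hessian_fd(energy_fn, x, eps=1e-4):
    """Finite-difference Hessian of a scalar energy w.r.t. coordinates x.

    Central second differences; O(n^2) energy evaluations, fine for a toy.
    """
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            xpp = x.copy(); xpp[i] += eps; xpp[j] += eps
            xpm = x.copy(); xpm[i] += eps; xpm[j] -= eps
            xmp = x.copy(); xmp[i] -= eps; xmp[j] += eps
            xmm = x.copy(); xmm[i] -= eps; xmm[j] -= eps
            H[i, j] = (energy_fn(xpp) - energy_fn(xpm)
                       - energy_fn(xmp) + energy_fn(xmm)) / (4 * eps**2)
    return H

def hessian_distill_loss(teacher_E, student_E, x):
    """Squared Frobenius mismatch between teacher and student energy Hessians."""
    Ht = hessian_fd(teacher_E, x)
    Hs = hessian_fd(student_E, x)
    return np.sum((Ht - Hs) ** 2)

# Toy quadratic energies: a student that reproduces the teacher's Hessian
# drives this loss to zero at any configuration x.
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
teacher = lambda x: 0.5 * x @ A @ x
student = lambda x: 0.5 * x @ A @ x
x0 = np.array([0.3, -0.7])
loss = hessian_distill_loss(teacher, student, x0)
```

Because the Hessian encodes second-order curvature of the energy surface, matching it supervises the student with richer local information than energies or forces alone, which is the intuition behind the approach described in the talk.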

Syllabus

Distilling Foundation Models via Energy Hessians | Ishan Amin & Sanjeev Raja

Taught by

Valence Labs

