
Improving the Speed and Accuracy of Neural Network Interatomic Potentials Across Chemical Domains

Valence Labs via YouTube

Overview

This conference talk explores the Efficiently Scaled Attention Interatomic Potential (EScAIP), a novel approach to Neural Network Interatomic Potentials (NNIPs) that prioritizes scalability over complex physical domain constraints. Learn how attention mechanisms within graph neural networks can dramatically improve both efficiency and performance, yielding at least 10x faster inference and 5x lower memory usage compared to existing models. Discover how this architecture achieves state-of-the-art results across diverse chemical domains, including catalysts (OC20 and OC22), molecules (SPICE), and materials (MPTrj). The presentation emphasizes a philosophical shift toward general-purpose NNIPs that gain expressivity through scaling rather than through increasingly complex physical constraints, potentially avoiding long-term performance plateaus. The talk is based on a paper published on arXiv.
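To make the core idea concrete, here is a minimal sketch of attention restricted to a graph's neighbor lists, which is the general mechanism the talk builds on. This is only an illustration of attention within a graph neural network under simplified assumptions (random toy features, dense projection matrices, a hand-written neighbor list); it is not the EScAIP architecture, whose details are in the arXiv paper.

```python
import numpy as np

def graph_attention(features, neighbors, Wq, Wk, Wv):
    """Scaled dot-product attention where each node (atom) attends
    only over its graph neighbors, rather than all nodes."""
    q = features @ Wq  # queries
    k = features @ Wk  # keys
    v = features @ Wv  # values
    d = q.shape[1]
    out = np.zeros_like(v)
    for i, nbrs in enumerate(neighbors):
        # Attention scores restricted to atom i's neighbor list
        scores = q[i] @ k[nbrs].T / np.sqrt(d)
        w = np.exp(scores - scores.max())  # numerically stable softmax
        w /= w.sum()
        out[i] = w @ v[nbrs]  # neighbor-weighted message
    return out

# Toy system: 4 atoms with 8-dimensional features.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
nbrs = [[1, 2], [0, 2, 3], [0, 1], [1]]
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = graph_attention(x, nbrs, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Restricting attention to neighbor lists keeps cost proportional to the number of edges rather than quadratic in the number of atoms, which is one reason attention-based NNIPs can scale to large systems.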

Syllabus

Improving the Speed and Accuracy of Neural Network Interatomic Potentials Across Chemical Domains

Taught by

Valence Labs

