Improving the Speed and Accuracy of Neural Network Interatomic Potentials Across Chemical Domains
Valence Labs via YouTube
Overview
This conference talk explores the development of the Efficiently Scaled Attention Interatomic Potential (EScAIP), a novel approach to Neural Network Interatomic Potentials (NNIPs) that prioritizes scalability over hand-crafted physical domain constraints. Learn how attention mechanisms within graph neural networks can dramatically improve both efficiency and performance, yielding at least 10x faster inference and 5x lower memory usage compared to existing models. Discover how this architecture achieves state-of-the-art results across diverse chemical domains, including catalysts (OC20 and OC22), molecules (SPICE), and materials (MPTrj). The presentation emphasizes a philosophical shift toward general-purpose NNIPs that gain expressivity through scaling rather than through increasingly complex physical constraints, potentially avoiding performance plateaus in the long term. The talk is based on a research paper available on arXiv.
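For context on the core mechanism the talk discusses, the PyTorch snippet below is a minimal sketch of attention over an atom's neighbors in a graph neural network: each atom attends only to atoms within a cutoff radius, with attention logits biased by interatomic distance. Everything here (the NeighborAttention class, the distance-bias term, the feature dimensions) is invented for illustration and does not reproduce the EScAIP architecture.

```python
import torch
import torch.nn as nn

class NeighborAttention(nn.Module):
    """Toy single-head attention over an atom's neighbors.

    Illustrative sketch only -- this is NOT the EScAIP architecture.
    Each atom attends to atoms within a cutoff radius, with the
    attention logits biased by a learned function of the distance.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)      # query projection
        self.k = nn.Linear(dim, dim)      # key projection
        self.v = nn.Linear(dim, dim)      # value projection
        self.dist_bias = nn.Linear(1, 1)  # scalar bias from pairwise distance
        self.scale = dim ** -0.5

    def forward(self, h: torch.Tensor, pos: torch.Tensor, cutoff: float) -> torch.Tensor:
        # h: (N, dim) per-atom features; pos: (N, 3) Cartesian coordinates.
        r = torch.cdist(pos, pos)                           # (N, N) pairwise distances
        neighbors = (r < cutoff) & ~torch.eye(len(h), dtype=torch.bool)
        logits = (self.q(h) @ self.k(h).T) * self.scale     # (N, N) attention logits
        logits = logits + self.dist_bias(r.unsqueeze(-1)).squeeze(-1)
        logits = logits.masked_fill(~neighbors, float("-inf"))
        attn = torch.nan_to_num(torch.softmax(logits, dim=-1))  # NaN -> 0 for isolated atoms
        return h + attn @ self.v(h)                         # residual feature update

# A tiny usage example: 5 atoms with random features and positions.
h = torch.randn(5, 32)
pos = torch.rand(5, 3) * 4.0
out = NeighborAttention(32)(h, pos, cutoff=3.0)
print(out.shape)  # torch.Size([5, 32])
```

In a full NNIP, stacks of such layers would feed a readout that predicts per-atom energies, with forces obtained either as negative energy gradients or from a direct force head; the speed and memory figures cited in the talk come from how EScAIP scales its attention blocks, not from this toy.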
Taught by
Valence Labs