Hyper-Optimized Tensor Network Contraction - Simplifications, Applications and Approximations

Institute for Pure & Applied Mathematics (IPAM) via YouTube

Overview

Explore tensor network contraction optimization techniques in this 33-minute conference talk from the Tensor Methods and Emerging Applications to the Physical and Data Sciences 2021 workshop. Delve into hyper-optimized methods based on hypergraph partitioning for building efficient contraction trees. Discover a set of powerful tensor network simplifications designed to facilitate easier contraction. Examine applications in quantum circuit simulation and weighted model counting. Gain insights into extending these concepts to approximate contraction. Learn from Johnnie Gray of the California Institute of Technology as he presents advanced strategies for tackling complex tensor network geometries and improving computational efficiency.
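The talk's central idea, that the pairwise contraction order (the contraction tree) dominates the cost of evaluating a tensor network, can be illustrated with a minimal sketch. This example uses NumPy's `einsum_path` optimizer rather than the hypergraph-partitioning approach presented in the talk (which is implemented in Johnnie Gray's `cotengra` library); the tensor shapes are illustrative assumptions.

```python
import numpy as np

# A small tensor network: a chain of four tensors contracted to a matrix.
# The contraction tree (the pairwise order) determines the FLOP count;
# np.einsum_path searches for a cheap order, analogous in spirit to the
# hyper-optimized tree search discussed in the talk.
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 64))
B = rng.standard_normal((64, 64))
C = rng.standard_normal((64, 64))
D = rng.standard_normal((64, 8))

expr = "ij,jk,kl,lm->im"
# Find an optimal pairwise contraction order and report its cost.
path, info = np.einsum_path(expr, A, B, C, D, optimize="optimal")
# Contract the network along that tree.
result = np.einsum(expr, A, B, C, D, optimize=path)
print(info)          # compares naive vs optimized FLOP counts
print(result.shape)  # (8, 8)
```

For networks with many tensors and irregular geometry, exhaustive path search becomes infeasible, which motivates the partitioning-based heuristics covered in the talk.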

Syllabus

Introduction
Tensor networks
Example
Contraction trees
Hyperindices
Partitioning
Partition functions
Hypergraph partitioning
Tensor network simplifications
Rank simplification
Detailed simplifications
Low-rank decompositions
Diagonal hyperindices
Gauge freedom
Hybrid reduction
QAOA
Weighted model counting
Approximate contraction
Conclusions

Taught by

Institute for Pure & Applied Mathematics (IPAM)

