
Tensor Decomposition Networks for Fast Machine Learning Interatomic Potential Computations

Abstract

$\rm{SO}(3)$-equivariant networks are the dominant models for machine learning interatomic potentials (MLIPs). The key operation of such networks is the Clebsch-Gordan (CG) tensor product, which is computationally expensive. To accelerate the computation, we develop tensor decomposition networks (TDNs) as a class of approximately equivariant networks in which CG tensor products are replaced by low-rank tensor decompositions, such as the CANDECOMP/PARAFAC (CP) decomposition. With the CP decomposition, we prove (i) a uniform bound on the induced error of $\rm{SO}(3)$-equivariance, and (ii) the universality of approximating any equivariant bilinear map. To further reduce the number of parameters, we propose path-weight sharing, which ties all multiplicity-space weights across the $\mathcal{O}(L^3)$ CG paths into a single path without compromising equivariance, where $L$ is the maximum angular degree. The resulting layer acts as a plug-and-play replacement for tensor products in existing networks, and the computational complexity of tensor products is reduced from $\mathcal{O}(L^6)$ to $\mathcal{O}(L^4)$. We evaluate TDNs on PubChemQCR, a newly curated molecular relaxation dataset containing 105 million DFT-calculated snapshots, as well as on existing datasets, including OC20 and OC22. Results show that TDNs achieve competitive performance with dramatic speedup in computations. Our code is publicly available as part of the AIRS library (https://github.com/divelab/AIRS/tree/main/OpenMol/TDN).
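The core speedup idea can be illustrated in isolation. A dense bilinear map (such as a CG tensor product) contracts a 3-way tensor $T \in \mathbb{R}^{m \times n \times p}$ with two input vectors, costing $\mathcal{O}(mnp)$; a rank-$R$ CP decomposition $T \approx \sum_r a_r \otimes b_r \otimes c_r$ reduces this to $\mathcal{O}(R(m+n+p))$. The sketch below is illustrative only: the dimensions, rank, and random factors are hypothetical stand-ins, not the paper's actual irrep feature shapes or learned weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; in a TDN layer these would correspond to
# irrep feature dimensions and the chosen decomposition rank.
m, n, p, rank = 16, 16, 16, 8

# CP factors: the dense bilinear tensor T[i, j, k] is approximated
# as sum_r A[i, r] * B[j, r] * C[k, r].
A = rng.standard_normal((m, rank))
B = rng.standard_normal((n, rank))
C = rng.standard_normal((p, rank))

def cp_bilinear(x, y):
    """Bilinear map via CP factors: O(rank*(m+n+p)) work instead of
    O(m*n*p) for contracting a dense 3-way tensor."""
    return C @ ((A.T @ x) * (B.T @ y))

# Materialize the equivalent dense tensor only to verify agreement.
T = np.einsum('ir,jr,kr->ijk', A, B, C)

x = rng.standard_normal(m)
y = rng.standard_normal(n)

dense_out = np.einsum('ijk,i,j->k', T, x, y)
assert np.allclose(cp_bilinear(x, y), dense_out)
```

Because the factored form never materializes the full tensor, the same trick applied per CG path (with path-weight sharing collapsing the $\mathcal{O}(L^3)$ paths into one) is what drives the reported $\mathcal{O}(L^6) \to \mathcal{O}(L^4)$ reduction.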
Authors (9)
Yuchao Lin
Cong Fu
Zachary Krueger
Haiyang Yu
Maho Nakata
Jianwen Xie
+3 more
Submitted
July 1, 2025
arXiv Category
cs.LG
arXiv PDF

Key Contributions

Introduces Tensor Decomposition Networks (TDNs) as a class of approximately SO(3)-equivariant networks that replace computationally expensive Clebsch-Gordan tensor products with low-rank tensor decompositions. This significantly accelerates MLIP computations while providing theoretical bounds on equivariance error and enabling parameter reduction through path-weight sharing.

Business Value

Enables substantially faster simulations of molecular behavior at competitive accuracy, accelerating materials discovery, drug development, and chemical process optimization.