
Return of ChebNet: Understanding and Improving an Overlooked GNN on Long Range Tasks

πŸ“„ Abstract

Abstract: ChebNet, one of the earliest spectral GNNs, has largely been overshadowed by Message Passing Neural Networks (MPNNs), which gained popularity for their simplicity and effectiveness in capturing local graph structure. Despite their success, MPNNs are limited in their ability to capture long-range dependencies between nodes. This has led researchers to adapt MPNNs through rewiring or make use of Graph Transformers, which compromises the computational efficiency that characterized early spatial message-passing architectures, and typically disregards the graph structure. Almost a decade after its original introduction, we revisit ChebNet to shed light on its ability to model distant node interactions. We find that out-of-box, ChebNet already shows competitive advantages relative to classical MPNNs and GTs on long-range benchmarks, while maintaining good scalability properties for high-order polynomials. However, we uncover that this polynomial expansion leads ChebNet to an unstable regime during training. To address this limitation, we cast ChebNet as a stable and non-dissipative dynamical system, which we coin Stable-ChebNet. Our Stable-ChebNet model allows for stable information propagation, and has controllable dynamics which do not require the use of eigendecompositions, positional encodings, or graph rewiring. Across several benchmarks, Stable-ChebNet achieves near state-of-the-art performance.
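The "polynomial expansion" the abstract refers to is the standard Chebyshev graph filter underlying ChebNet: a degree-K polynomial of the (rescaled) normalized Laplacian is applied to node features via the Chebyshev recurrence, so a single layer aggregates information from up to K hops. The sketch below is a minimal, illustrative implementation; the function name `cheb_filter`, the coefficients `theta`, and the common approximation of the rescaling as L - I (i.e. assuming the largest Laplacian eigenvalue is roughly 2) are our assumptions, not details from this page.

```python
# Hedged sketch of Chebyshev polynomial graph filtering, the mechanism behind ChebNet.
# All names and parameter choices are illustrative, not taken from the paper.
import numpy as np

def cheb_filter(A, x, K, theta):
    """Apply sum_k theta[k] * T_k(L_hat) @ x, where T_k are Chebyshev polynomials
    and L_hat is the symmetrically normalized Laplacian rescaled toward [-1, 1]."""
    n = A.shape[0]
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    # Symmetrically normalized Laplacian: I - D^{-1/2} A D^{-1/2}
    L = np.eye(n) - (A * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]
    # Common approximation: 2L/lambda_max - I with lambda_max ~ 2 gives L - I
    L_hat = L - np.eye(n)
    T_prev, T_curr = x, L_hat @ x          # T_0(L_hat) x and T_1(L_hat) x
    out = theta[0] * T_prev + theta[1] * T_curr
    for k in range(2, K + 1):
        T_next = 2 * (L_hat @ T_curr) - T_prev  # Chebyshev recurrence
        out += theta[k] * T_next
        T_prev, T_curr = T_curr, T_next
    return out

# Tiny 4-node path graph with a scalar feature on each node
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.array([1.0, 0.0, 0.0, 0.0])
y = cheb_filter(A, x, K=3, theta=np.array([1.0, 0.5, 0.25, 0.125]))
print(y.shape)  # (4,)
```

Because a degree-K filter touches K-hop neighborhoods in one application, raising K is what gives ChebNet its long-range reach; the abstract's observation is that this same high-order expansion is also what destabilizes training.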
Authors (9)
Ali Hariri
Álvaro Arroyo
Alessio Gravina
Moshe Eliasof
Carola-Bibiane SchΓΆnlieb
Davide Bacciu
+3 more
Submitted
June 9, 2025
arXiv Category
cs.LG

Key Contributions

Revisits and analyzes ChebNet, demonstrating its competitive advantages over MPNNs and Graph Transformers on long-range dependency tasks while maintaining scalability. Identifies that ChebNet's polynomial expansion, while powerful, can destabilize training, and addresses this with Stable-ChebNet, a reformulation of ChebNet as a stable, non-dissipative dynamical system.
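The page does not give Stable-ChebNet's actual formulation, so as a generic illustration only: one standard way to obtain non-dissipative dynamics in graph ODE models is to use an antisymmetric weight matrix, whose purely imaginary eigenvalues prevent the update from systematically contracting or exploding the features. The sketch below shows that trick in isolation; it is not the paper's method, and every name in it is hypothetical.

```python
# Hedged illustration (NOT Stable-ChebNet itself): non-dissipative-style updates
# via an antisymmetric weight matrix, a standard trick in graph ODE architectures.
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 8                       # 5 nodes, 8 feature channels (arbitrary)
W = rng.standard_normal((d, d))
W_anti = W - W.T                  # antisymmetric: eigenvalues are purely imaginary
X = rng.standard_normal((n, d))   # node features

eps = 0.05                        # small forward-Euler step size
for _ in range(100):
    # Bounded, rotation-like update: features evolve without blowing up,
    # in contrast to repeated application of an unconstrained linear map.
    X = X + eps * np.tanh(X @ W_anti)

print(np.isfinite(X).all())  # True
```

The point of such constructions is the one the abstract makes: controllable, stable propagation over many steps without eigendecompositions, positional encodings, or rewiring.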

Business Value

Offers a potentially more efficient and scalable approach for analyzing complex networks where long-range interactions are important, such as social networks or biological networks.