A Short Note on Upper Bounds for Graph Neural Operator Convergence Rate

Abstract

Graphons, as limits of graph sequences, provide a framework for analyzing the asymptotic behavior of graph neural operators. Spectral convergence of sampled graphs to graphons yields operator-level convergence rates, enabling transferability analyses of GNNs. This note summarizes known bounds in three regimes: no regularity assumptions, global Lipschitz continuity, and piecewise-Lipschitz continuity. It highlights the trade-offs between assumptions and rates, and illustrates the empirical tightness of the bounds on synthetic and real data.
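
For orientation, here is a schematic of the objects involved, using standard graphon definitions; the constant C(W) and rate ε(n) are illustrative placeholders, not the note's stated bounds:

```latex
% A sketch with standard definitions; C(W) and \varepsilon(n) are
% illustrative placeholders, not the note's exact results.
\[
  (T_W f)(x) \;=\; \int_0^1 W(x, y)\, f(y)\, \mathrm{d}y,
  \qquad f \in L^2([0,1]),
\]
\[
  \big\| T_{W_n} - T_W \big\| \;\le\; C(W)\, \varepsilon(n),
  \qquad \varepsilon(n) \to 0 \text{ as } n \to \infty,
\]
% where W_n is the step-function graphon induced by an n-node graph
% sampled from W; stronger regularity of W (global or piecewise
% Lipschitz) yields a faster \varepsilon(n).
```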
Authors: Roxanne Holden, Luana Ruiz
Submitted: October 23, 2025
arXiv Category: stat.ML

Key Contributions

This note summarizes and analyzes known upper bounds on the convergence rate of graph neural operators, derived from the spectral convergence of sampled graphs to graphons. It makes explicit the trade-off between regularity assumptions on the graphon (none, global Lipschitz, piecewise Lipschitz) and the resulting convergence rates, providing a theoretical foundation for understanding GNN transferability.
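
As a minimal synthetic sketch of the empirical convergence the note examines (hypothetical, not the authors' experiments): for the Lipschitz graphon W(x, y) = xy, the operator T_W is rank one with top eigenvalue 1/3, so the top eigenvalue of A/n for an n-node graph sampled from W should approach 1/3 as n grows.

```python
import numpy as np

# Minimal synthetic sketch (hypothetical, not the note's experiments):
# spectral convergence of a sampled graph operator to its graphon limit
# for W(x, y) = x * y, whose operator T_W has top eigenvalue 1/3.

rng = np.random.default_rng(0)

def sample_graph(n, rng):
    u = rng.uniform(size=n)          # latent node positions in [0, 1]
    p = np.outer(u, u)               # edge probabilities W(u_i, u_j)
    a = (rng.uniform(size=(n, n)) < p).astype(float)
    a = np.triu(a, 1)                # keep upper triangle, no self-loops
    return a + a.T                   # symmetrize

for n in [100, 400, 1600]:
    A = sample_graph(n, rng)
    top = np.linalg.eigvalsh(A / n)[-1]   # top eigenvalue of A/n
    print(f"n={n:5d}  top eigenvalue={top:.4f}  gap to 1/3={abs(top - 1/3):.4f}")
```

The observed gap shrinks as n grows; the rate at which it shrinks is what the surveyed bounds control, and it depends on the regularity of W (here W is globally Lipschitz on [0,1]²).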

Business Value

Strengthens the theoretical foundations of graph-based machine learning models, supporting more robust and predictable performance in applications involving network data.