Can Classic GNNs Be Strong Baselines for Graph-level Tasks? Simple Architectures Meet Excellence

📄 Abstract

Message-passing Graph Neural Networks (GNNs) are often criticized for their limited expressiveness, issues like over-smoothing and over-squashing, and challenges in capturing long-range dependencies. Conversely, Graph Transformers (GTs) are regarded as superior due to their use of global attention mechanisms, which potentially mitigate these challenges. The literature frequently suggests that GTs outperform GNNs in graph-level tasks, especially for graph classification and regression on small molecular graphs. In this study, we explore the untapped potential of GNNs through an enhanced framework, GNN+, which integrates six widely used techniques: edge feature integration, normalization, dropout, residual connections, feed-forward networks, and positional encoding, to effectively tackle graph-level tasks. We conduct a systematic re-evaluation of three classic GNNs (GCN, GIN, and GatedGCN) enhanced by the GNN+ framework across 14 well-known graph-level datasets. Our results reveal that, contrary to prevailing beliefs, these classic GNNs consistently match or surpass the performance of GTs, securing top-three rankings across all datasets and achieving first place in eight. Furthermore, they demonstrate greater efficiency, running several times faster than GTs on many datasets. This highlights the potential of simple GNN architectures, challenging the notion that complex mechanisms in GTs are essential for superior graph-level performance. Our source code is available at https://github.com/LUOyk1999/GNNPlus.
Authors (3)
Yuankai Luo
Lei Shi
Xiao-Ming Wu
Submitted
February 13, 2025
arXiv Category
cs.LG

Key Contributions

This paper proposes an enhanced framework (GNN+) that integrates six widely used techniques to improve the performance of classic GNNs on graph-level tasks. It systematically re-evaluates GCN, GIN, and GatedGCN with this framework across 14 datasets, challenging the notion that Graph Transformers are always superior.
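To make the six techniques concrete, here is a minimal NumPy sketch of what one GNN+-style layer might look like: GCN-like mean aggregation with scalar edge weights, plus dropout, residual connections, layer normalization, and a feed-forward network (positional encodings are assumed to be concatenated into the node features beforehand). All names and shapes here are illustrative assumptions, not the authors' implementation; see the linked repository for the real code.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Per-node normalization over the feature dimension.
    return (x - x.mean(-1, keepdims=True)) / (x.std(-1, keepdims=True) + eps)

def dropout(x, p, rng, training):
    # Inverted dropout; identity at evaluation time.
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def gnn_plus_layer(h, adj, edge_w, params, rng, p=0.1, training=False):
    """Hypothetical GNN+-style layer (illustrative sketch).

    h:      (N, d) node features (assumed to already include positional encodings)
    adj:    (N, N) binary adjacency with self-loops
    edge_w: (N, N) scalar edge features folded into the aggregation
    params: (W, W1, W2) weight matrices of shapes (d, d), (d, 4d), (4d, d)
    """
    W, W1, W2 = params
    # Message passing: mean aggregation, with each message scaled by its edge weight.
    a = adj * edge_w
    deg = a.sum(1, keepdims=True) + 1e-9
    m = ((a @ h) / deg) @ W
    # Residual connection + dropout + normalization around the message-passing step.
    h = layer_norm(h + dropout(np.maximum(m, 0.0), p, rng, training))
    # Position-wise feed-forward network, again wrapped in residual + norm.
    f = np.maximum(h @ W1, 0.0) @ W2
    return layer_norm(h + dropout(f, p, rng, training))

# Toy usage on a random 5-node graph.
rng = np.random.default_rng(0)
N, d = 5, 8
h = rng.standard_normal((N, d))
adj = (rng.random((N, N)) < 0.4).astype(float)
np.fill_diagonal(adj, 1.0)            # self-loops
edge_w = rng.random((N, N))
params = (rng.standard_normal((d, d)) * 0.1,
          rng.standard_normal((d, 4 * d)) * 0.1,
          rng.standard_normal((4 * d, d)) * 0.1)
out = gnn_plus_layer(h, adj, edge_w, params, rng, training=True)
print(out.shape)  # (5, 8)
```

The point of the sketch is that none of these pieces is exotic: each is a standard deep-learning component, and the paper's claim is that wiring them around a classic GCN/GIN/GatedGCN backbone is enough to match Graph Transformers on graph-level benchmarks.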

Business Value

By demonstrating that enhanced classic GNNs can be strong baselines, this research could lead to more efficient and effective graph-based machine learning models for applications in drug discovery, materials science, and social network analysis, potentially reducing reliance on more complex transformer architectures.