Abstract
Graphs model latent variable relationships in many real-world systems, and
Message Passing Neural Networks (MPNNs) are widely used to learn such
structures for downstream tasks. While edge-based MPNNs effectively capture
local interactions, their expressive power is theoretically bounded, limiting
the discovery of higher-order relationships. We introduce the Higher-Order
Graph Attention (HoGA) module, which constructs a k-order attention matrix by
sampling subgraphs to maximize diversity among feature vectors. Unlike existing
higher-order attention methods that greedily resample similar k-order
relationships, HoGA targets diverse modalities in higher-order topology,
reducing redundancy and expanding the range of captured substructures. Applied
to two single-hop attention models, HoGA achieves at least a 5% accuracy gain
on all benchmark node classification datasets and outperforms recent baselines
on six of eight datasets. Code is available at
https://github.com/TB862/Higher_Order.
Authors (3)
Thomas Bailie
Yun Sing Koh
Karthik Mukkavilli
Submitted
November 18, 2024
Key Contributions
Introduces the Higher-Order Graph Attention (HoGA) module, which enhances Message Passing Neural Networks by constructing k-order attention matrices through diversity-aware k-hop sampling. This method effectively captures higher-order relationships in graphs by targeting diverse modalities in higher-order topology, reducing redundancy and improving performance on node classification tasks.
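To make the mechanism concrete, below is a minimal Python sketch of one way diversity-aware k-hop sampling and a single row of a k-order attention matrix could be realized. The helper names (`khop_neighbors`, `sample_diverse_khop`, `k_order_attention_row`) and the farthest-point-sampling heuristic are illustrative assumptions, not the paper's exact procedure; the authors' implementation is in the linked repository.

```python
# Illustrative sketch only -- not the HoGA implementation.
import numpy as np

def khop_neighbors(adj: np.ndarray, node: int, k: int) -> np.ndarray:
    """Indices of nodes reachable from `node` within k hops (excluding itself)."""
    reach = np.zeros(adj.shape[0], dtype=bool)
    frontier = np.zeros_like(reach)
    frontier[node] = True
    for _ in range(k):
        # Nodes adjacent to the current frontier that are not yet reached.
        frontier = (adj[frontier].sum(axis=0) > 0) & ~reach
        reach |= frontier
    reach[node] = False
    return np.flatnonzero(reach)

def sample_diverse_khop(x: np.ndarray, candidates: np.ndarray, m: int) -> np.ndarray:
    """Farthest-point sampling in feature space: pick up to m candidates whose
    feature vectors are maximally spread out -- one proxy for the paper's
    'maximize diversity among feature vectors' objective."""
    if len(candidates) <= m:
        return candidates
    chosen = [candidates[0]]
    # Distance from each candidate to its nearest already-chosen node.
    d = np.linalg.norm(x[candidates] - x[chosen[0]], axis=1)
    for _ in range(m - 1):
        nxt = candidates[np.argmax(d)]
        chosen.append(nxt)
        d = np.minimum(d, np.linalg.norm(x[candidates] - x[nxt], axis=1))
    return np.array(chosen)

def k_order_attention_row(x: np.ndarray, node: int, sampled: np.ndarray,
                          temperature: float = 1.0) -> np.ndarray:
    """Softmax over dot-product scores from `node` to its sampled k-hop
    neighbours: one row of a (sparse) k-order attention matrix."""
    scores = x[sampled] @ x[node] / temperature
    scores -= scores.max()  # numerical stability
    w = np.exp(scores)
    return w / w.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    adj = (rng.random((20, 20)) < 0.15).astype(int)
    adj = np.maximum(adj, adj.T)          # undirected graph
    np.fill_diagonal(adj, 0)
    x = rng.normal(size=(20, 8))          # node features
    cand = khop_neighbors(adj, node=0, k=2)
    sampled = sample_diverse_khop(x, cand, m=5)
    if len(sampled):
        print(k_order_attention_row(x, 0, sampled))
```

In this sketch, stacking one such row per node for each order k would give the k-order attention matrices; per the abstract, the module is then applied on top of two single-hop attention models.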
Business Value
Improves the accuracy of graph-based machine learning models, leading to better insights and predictions in areas like social network analysis, fraud detection, and drug discovery.