
LLM as GNN: Graph Vocabulary Learning for Text-Attributed Graph Foundation Models

📄 Abstract

Text-Attributed Graphs (TAGs), where each node is associated with text descriptions, are ubiquitous in real-world scenarios. They typically exhibit distinctive structure and domain-specific knowledge, motivating the development of a Graph Foundation Model (GFM) that generalizes across diverse graphs and tasks. Despite large efforts to integrate Large Language Models (LLMs) and Graph Neural Networks (GNNs) for TAGs, existing approaches suffer from decoupled architectures with two-stage alignment, limiting their synergistic potential. Even worse, existing methods assign out-of-vocabulary (OOV) tokens to graph nodes, leading to graph-specific semantics, token explosion, and incompatibility with task-oriented prompt templates, which hinders cross-graph and cross-task transferability. To address these challenges, we propose PromptGFM, a versatile GFM for TAGs grounded in graph vocabulary learning. PromptGFM comprises two key components: (1) a Graph Understanding Module, which explicitly prompts LLMs to replicate the finest GNN workflow within the text space, facilitating seamless GNN-LLM integration and elegant graph-text alignment; (2) a Graph Inference Module, which establishes a language-based graph vocabulary ensuring expressiveness, transferability, and scalability, enabling readable instructions for LLM fine-tuning. Extensive experiments demonstrate our superiority and transferability across diverse graphs and tasks. The code is available at https://github.com/agiresearch/PromptGFM.
Authors (10)
Xi Zhu
Haochen Xue
Ziwei Zhao
Wujiang Xu
Jingyuan Huang
Minghao Guo
+4 more
Submitted
March 5, 2025
arXiv Category
cs.LG
arXiv PDF

Key Contributions

PromptGFM proposes a versatile Graph Foundation Model (GFM) for Text-Attributed Graphs (TAGs) by introducing graph vocabulary learning and prompt-based integration of LLMs and GNNs. It addresses challenges like decoupled architectures and out-of-vocabulary tokens, enabling better cross-graph and cross-task transferability through a unified graph vocabulary and task-oriented prompting.
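To make the "LLM as GNN" idea concrete, the sketch below verbalizes one GNN-style message-passing step as a text prompt over a toy text-attributed graph: the target node's description and its neighbors' descriptions are assembled into an instruction asking an LLM to aggregate and update. This is a minimal illustration of the general approach, not the paper's actual prompt templates; the function name, prompt wording, and toy graph are all assumptions.

```python
# Illustrative sketch (assumed names/wording): verbalize one round of
# GNN message passing as an LLM prompt, per the paper's general idea of
# replicating the GNN workflow in text space.

def build_message_passing_prompt(node_texts, edges, target):
    """Assemble a prompt that asks an LLM to aggregate neighbor
    descriptions and update the target node's description."""
    # Collect neighbors of `target` from an undirected edge list.
    neighbors = {v for u, v in edges if u == target} | {u for u, v in edges if v == target}
    lines = [f"Node {target}: {node_texts[target]}", "Its neighbors are:"]
    for n in sorted(neighbors):
        lines.append(f"- Node {n}: {node_texts[n]}")
    lines.append(
        f"Aggregate the neighbor descriptions and update the description "
        f"of Node {target} in one sentence, as a GNN layer would."
    )
    return "\n".join(lines)

# Toy text-attributed graph.
node_texts = {
    0: "A paper on graph neural networks.",
    1: "A survey of large language models.",
    2: "A benchmark for text-attributed graphs.",
}
edges = [(0, 1), (0, 2)]

prompt = build_message_passing_prompt(node_texts, edges, 0)
print(prompt)
```

Stacking several such prompted rounds, with each round's LLM output replacing the node's description, would mimic multi-layer GNN propagation entirely in text, which is what enables the language-based graph vocabulary described above.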

Business Value

Enables more powerful and adaptable graph-based AI systems that can leverage both structural graph information and rich textual descriptions, leading to better insights in areas like social network analysis, recommendation engines, and scientific discovery.