
Dynamic Bundling with Large Language Models for Zero-Shot Inference on Text-Attributed Graphs

Abstract

Large language models (LLMs) have been applied to many zero-shot learning problems owing to their strong generalization ability. Recently, adopting LLMs for text-attributed graphs (TAGs) has drawn increasing attention. However, this adoption faces two major challenges: limited information on graph structure and unreliable responses. LLMs struggle with text attributes in isolation from the graph topology. Worse still, they yield unreliable predictions due to both insufficient information and inherent weaknesses of LLMs (e.g., hallucination). To this end, this paper proposes a novel method named Dynamic Text Bundling Supervision (DENSE), which queries LLMs with bundles of texts to obtain bundle-level labels and uses these labels to supervise graph neural networks. Specifically, we sample a set of bundles, each containing a set of nodes whose texts are in close proximity. We then query LLMs with the bundled texts to obtain a label for each bundle. The bundle labels are then used to supervise the optimization of graph neural networks, and the bundles are further refined to exclude noisy items. To justify our design, we also provide a theoretical analysis of the proposed method. Extensive experiments across ten datasets validate the effectiveness of the proposed method.

Key Contributions

Proposes DENSE, a novel method for zero-shot inference on text-attributed graphs using LLMs. It addresses LLM limitations by querying LLMs with bundles of texts to obtain bundle-level labels, which then supervise graph neural networks.
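The core idea of bundle-level supervision can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not DENSE's actual implementation: it pools per-node classifier logits within each bundle (here by simple averaging, one of several plausible aggregation choices) and computes a cross-entropy loss against the LLM-provided bundle label. The function name `bundle_loss` and the averaging scheme are hypothetical.

```python
import numpy as np

def bundle_loss(node_logits, bundles, bundle_labels):
    """Bundle-level cross-entropy (illustrative sketch, not DENSE's API).

    node_logits:   (num_nodes, num_classes) array of per-node scores,
                   e.g. the output of a GNN classifier.
    bundles:       list of node-index lists; each bundle groups nodes
                   whose texts were sent to the LLM together.
    bundle_labels: one LLM-provided class label per bundle.
    """
    losses = []
    for nodes, y in zip(bundles, bundle_labels):
        pooled = node_logits[nodes].mean(axis=0)   # aggregate the bundle's logits
        p = np.exp(pooled - pooled.max())          # numerically stable softmax
        p /= p.sum()
        losses.append(-np.log(p[y] + 1e-12))       # cross-entropy vs. bundle label
    return float(np.mean(losses))
```

Training would minimize this loss over the GNN parameters, so that cheap bundle-level LLM labels stand in for per-node supervision; the paper additionally refines bundles to drop noisy members, which this sketch omits.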

Business Value

Enables more accurate and reliable analysis of complex, interconnected data where both text and structure are important, leading to better insights for recommendation systems, knowledge discovery, and network analysis.