The Logical Expressiveness of Temporal GNNs via Two-Dimensional Product Logics

📄 Abstract

In recent years, the expressive power of various neural architectures -- including graph neural networks (GNNs), transformers, and recurrent neural networks -- has been characterised using tools from logic and formal language theory. As the capabilities of basic architectures are becoming well understood, increasing attention is turning to models that combine multiple architectural paradigms. Among them, particularly important and challenging to analyse are temporal extensions of GNNs, which integrate both spatial (graph-structure) and temporal (evolution over time) dimensions. In this paper, we initiate the study of the logical characterisation of temporal GNNs by connecting them to two-dimensional product logics. We show that the expressive power of temporal GNNs depends on how graph and temporal components are combined. In particular, temporal GNNs that apply static GNNs recursively over time can capture all properties definable in the product logic of (past) propositional temporal logic PTL and the modal logic K. In contrast, architectures such as graph-and-time TGNNs and global TGNNs can only express restricted fragments of this logic, in which the interaction between temporal and spatial operators is syntactically constrained. These are the first results on the logical expressiveness of temporal GNNs.
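
The recursive architecture highlighted above can be pictured concretely. The following is a minimal NumPy sketch, not the paper's formal model: a static message-passing layer is applied to each graph snapshot, and its output is combined with the node state carried over from the previous time step. All function names, weight shapes, and the ReLU/tanh nonlinearities are illustrative assumptions.

```python
import numpy as np

def static_gnn_layer(adj, feats, w_self, w_nbr):
    """One message-passing layer of a static GNN: each node combines its own
    features with the sum of its neighbours' features, then applies a ReLU."""
    agg = adj @ feats                                  # sum over neighbours
    return np.maximum(0.0, feats @ w_self + agg @ w_nbr)

def recursive_temporal_gnn(adjs, feats_seq, w_self, w_nbr, w_time):
    """Apply the static GNN recursively over time: the state at time t depends
    on the graph snapshot at t and on the state carried over from time t - 1."""
    num_nodes, dim = feats_seq[0].shape
    state = np.zeros((num_nodes, dim))                 # assumed zero initial state
    for adj, feats in zip(adjs, feats_seq):
        spatial = static_gnn_layer(adj, feats, w_self, w_nbr)
        state = np.tanh(spatial + state @ w_time)      # temporal recursion
    return state                                       # final node embeddings

# Toy example: 3 nodes, 2 time steps, 4-dimensional features.
rng = np.random.default_rng(0)
adjs = [rng.integers(0, 2, size=(3, 3)).astype(float) for _ in range(2)]
feats_seq = [rng.standard_normal((3, 4)) for _ in range(2)]
w_self, w_nbr, w_time = (rng.standard_normal((4, 4)) for _ in range(3))
print(recursive_temporal_gnn(adjs, feats_seq, w_self, w_nbr, w_time).shape)  # (3, 4)
```

Intuitively, the per-node state threaded through time lets spatial aggregation and temporal recursion interleave at every step, which is the kind of unrestricted mixing of graph and time that the full product logic describes.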
Authors (3)
Marco Sälzer
Przemysław Andrzej Wałęga
Martin Lange
Submitted
May 17, 2025
arXiv Category
cs.LG
arXiv PDF

Key Contributions

This paper initiates the study of logical characterisation for temporal GNNs by connecting them to two-dimensional product logics. It shows that the expressive power of temporal GNNs depends on how the graph and temporal components are combined: temporal GNNs that apply static GNNs recursively over time capture all properties definable in the product of (past) propositional temporal logic PTL and the modal logic K, whereas graph-and-time TGNNs and global TGNNs express only syntactically restricted fragments of this logic.
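
As a rough illustration of the kind of two-dimensional language involved, a product of past PTL and the modal logic K can be given a grammar along the following lines. This is a sketch based on standard definitions of the two component logics, not necessarily the exact syntax used in the paper; formulas are evaluated at pairs consisting of a time point and a node.

```latex
\documentclass{article}
\usepackage{amssymb} % for \Diamond
\begin{document}
Sketch of a two-dimensional product language: past temporal operators
move along the time line, while the diamond of K moves along graph
edges; formulas are read at pairs (time point, node).
\[
  \varphi \;::=\; p
      \;\mid\; \neg\varphi
      \;\mid\; \varphi \wedge \varphi
      \;\mid\; \ominus\varphi                        % ``previously'' (past PTL)
      \;\mid\; \varphi \mathbin{\mathcal{S}} \varphi % ``since'' (past PTL)
      \;\mid\; \Diamond\varphi                       % diamond of K over graph edges
\]
\end{document}
```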

Business Value

Provides a theoretical foundation for understanding and designing more expressive temporal graph neural networks, which can guide model selection for learning over dynamic, time-evolving graphs.