Abstract
We consider the problem of federated learning (FL) with graph-structured data
distributed across multiple clients. In particular, we address the prevalent
scenario of interconnected subgraphs, where interconnections between clients
significantly influence the learning process. Existing approaches suffer from
critical limitations, either requiring the exchange of sensitive node
embeddings, thereby posing privacy risks, or relying on
computationally intensive steps, which hinders scalability. To tackle these
challenges, we propose FedLap, a novel framework that leverages global
structure information via Laplacian smoothing in the spectral domain to
effectively capture inter-node dependencies while ensuring privacy and
scalability. We provide a formal privacy analysis of FedLap, demonstrating
that it preserves privacy. Notably, FedLap is the first subgraph
FL scheme with strong privacy guarantees. Extensive experiments on benchmark
datasets demonstrate that FedLap achieves competitive or superior utility
compared to existing techniques.
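As background for the mechanism the abstract names, the sketch below shows (centralized) Laplacian smoothing with NumPy. The function name `laplacian_smoothing` and the weight `lam` are illustrative choices, not identifiers from the paper; FedLap's federated, spectral-domain formulation and its privacy mechanism are not reproduced here.

```python
import numpy as np

def laplacian_smoothing(adj: np.ndarray, features: np.ndarray,
                        lam: float = 1.0) -> np.ndarray:
    """Smooth node features by solving (I + lam * L) X = X0.

    L is the symmetric normalized graph Laplacian; the solution minimizes
    ||X - X0||_F^2 + lam * tr(X^T L X), trading off fidelity to the input
    features against slow variation across edges. This is the generic
    technique, not FedLap's federated variant.
    """
    n = adj.shape[0]
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    # Symmetric normalized Laplacian: L = I - D^{-1/2} A D^{-1/2}
    lap = np.eye(n) - d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]
    # (I + lam * L) is symmetric positive definite for lam > 0,
    # so the linear system has a unique solution.
    return np.linalg.solve(np.eye(n) + lam * lap, features)

# Toy usage: a 3-node path graph carrying a "high-frequency" signal.
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x0 = np.array([[1.0], [-1.0], [1.0]])
print(laplacian_smoothing(adj, x0, lam=2.0))  # neighboring values pulled together
```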
Authors (4)
Javad Aliakbari
Johan Östman
Ashkan Panahi
Alexandre Graell i Amat
Submitted
October 29, 2025
Key Contributions
FedLap is a novel framework for subgraph federated learning that addresses privacy and scalability concerns. It leverages global structure information via Laplacian smoothing in the spectral domain to capture inter-node dependencies while providing strong privacy guarantees, making it the first subgraph FL scheme with such guarantees.
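To make "Laplacian smoothing in the spectral domain" concrete, the standard graph-signal-processing identity below shows why this smoother acts as a low-pass graph filter. The weight \(\mu\) is illustrative; the exact filter FedLap applies is defined in the paper.

```latex
% Generic spectral view of Laplacian smoothing (standard identity;
% FedLap's exact spectral-domain formulation is given in the paper).
% Let L = U \Lambda U^\top with eigenvalues
% \lambda_1 \le \dots \le \lambda_n (graph frequencies). Then
\[
  (I + \mu L)^{-1} X_0
  \;=\; U \,\operatorname{diag}\!\Big(\tfrac{1}{1+\mu\lambda_1},\,
        \dots,\,\tfrac{1}{1+\mu\lambda_n}\Big)\, U^\top X_0 .
\]
% Each spectral component of X_0 is scaled by 1/(1 + \mu \lambda_i):
% high-frequency components (large \lambda_i, rapid variation across
% edges) are attenuated, which couples a node's representation to its
% neighbors -- including neighbors held by other clients, once global
% structure information is shared.
```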
Business Value
Enables secure and efficient collaborative learning on distributed graph data, crucial for applications in social networks, recommendation systems, and cybersecurity where data privacy is paramount.