
Probabilistic Graph Cuts

📄 Abstract

Probabilistic relaxations of graph cuts offer a differentiable alternative to spectral clustering, enabling end-to-end and online learning without eigendecompositions. Prior work, however, centered on RatioCut and lacked general guarantees and principled gradients. We present a unified probabilistic framework that covers a wide class of cuts, including Normalized Cut. The framework provides tight analytic upper bounds on expected discrete cuts via integral representations and Gauss hypergeometric functions, with closed-form forward and backward passes. Together, these results deliver a rigorous, numerically stable foundation for scalable, differentiable graph partitioning across a wide range of clustering and contrastive learning objectives.
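To make the core idea concrete, here is a minimal sketch of a differentiable soft relaxation of Normalized Cut computed from a soft cluster-assignment matrix. This is a generic surrogate for illustration only, not the paper's hypergeometric bound; the function name and setup are hypothetical:

```python
import numpy as np

def soft_normalized_cut(W, P, eps=1e-12):
    """Soft (differentiable) surrogate for Normalized Cut.

    W : (n, n) symmetric nonnegative affinity matrix.
    P : (n, k) soft cluster assignments (rows sum to 1).
    Returns sum over clusters of expected cut(c) / vol(c).
    """
    d = W.sum(axis=1)                          # node degrees
    vol = P.T @ d                              # (k,) soft cluster volumes
    assoc = np.einsum('ic,ij,jc->c', P, W, P)  # within-cluster association
    cut = vol - assoc                          # soft cut mass per cluster
    return float(np.sum(cut / (vol + eps)))
```

Because the expression is smooth in `P`, it can be minimized by gradient descent on assignment logits, which is the sense in which such relaxations enable end-to-end learning without eigendecompositions.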

Key Contributions

This paper unifies probabilistic relaxations for graph cuts, extending beyond RatioCut to include Normalized Cut, providing a rigorous and numerically stable framework. It offers tight analytic upper bounds and closed-form gradients, enabling scalable, end-to-end, and online learning for various clustering and contrastive learning objectives.

Business Value

Facilitates more robust and scalable clustering and segmentation tasks in areas like image analysis and data mining, enabling end-to-end trainable systems.