📄 Abstract
Collaborative perception significantly enhances individual vehicle perception
performance through the exchange of sensory information among agents. However,
real-world deployment faces challenges due to bandwidth constraints and
inevitable calibration errors during information exchange. To address these
issues, we propose mmCooper, a novel multi-agent, multi-stage,
communication-efficient, and collaboration-robust cooperative perception
framework. Our framework leverages a multi-stage collaboration strategy that
dynamically and adaptively balances the intermediate- and late-stage
information shared among agents, enhancing perceptual performance while
maintaining communication efficiency. To support robust collaboration despite
potential misalignments and calibration errors, our framework prevents the
transmission of misleading low-confidence sensing information and refines the
detection results received from collaborators to improve accuracy. Extensive
evaluation on both real-world and simulated datasets demonstrates the
effectiveness of the mmCooper framework and its components.
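To make the confidence-gated, bandwidth-aware message construction concrete, here is a minimal Python sketch assuming a per-cell confidence map over a BEV feature grid and a fixed cell budget. All names (`build_message`, `conf_threshold`, `budget_cells`) and the fixed-threshold gating are illustrative assumptions; the paper's actual balancing and filtering mechanisms are learned rather than hard-coded.

```python
# Hypothetical sketch of confidence-gated message construction; not the
# paper's implementation. Thresholds stand in for learned gating.
import numpy as np

def build_message(feature_map, confidence_map, detections,
                  conf_threshold=0.5, budget_cells=256):
    """Select high-confidence feature cells (intermediate stage) and
    high-confidence boxes (late stage) to share under a bandwidth budget."""
    C, H, W = feature_map.shape
    conf = confidence_map.ravel()

    # Gate: low-confidence cells are never transmitted, so they cannot
    # mislead collaborators.
    candidates = np.flatnonzero(conf >= conf_threshold)

    # Budget: keep only the top-k most confident cells.
    if candidates.size > budget_cells:
        order = np.argsort(conf[candidates])[::-1]
        candidates = candidates[order[:budget_cells]]

    ys, xs = np.unravel_index(candidates, (H, W))
    sparse_features = {
        "indices": np.stack([ys, xs], axis=1),  # (k, 2) cell coordinates
        "values": feature_map[:, ys, xs].T,     # (k, C) feature vectors
    }

    # Late-stage payload: detections that pass the same confidence gate.
    shared_dets = [d for d in detections if d["score"] >= conf_threshold]
    return {"intermediate": sparse_features, "late": shared_dets}

# Toy usage: a 64-channel 100x100 BEV feature map with random confidences.
rng = np.random.default_rng(0)
fmap = rng.standard_normal((64, 100, 100), dtype=np.float32)
cmap = rng.random((100, 100), dtype=np.float32)
dets = [{"box": rng.random(7), "score": s} for s in (0.9, 0.3, 0.7)]
msg = build_message(fmap, cmap, dets)
print(msg["intermediate"]["values"].shape, len(msg["late"]))  # (256, 64) 2
```

Sharing sparse indexed features instead of the dense map is one way to trade accuracy against bandwidth; the threshold controls how aggressively the intermediate stage is pruned before the late-stage boxes are appended.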
Authors (7)
Bingyi Liu
Jian Teng
Hongfei Xue
Enshu Wang
Chuanhui Zhu
Pu Wang
+1 more
Submitted
January 21, 2025
Key Contributions
Proposes mmCooper, a novel multi-agent, multi-stage framework for communication-efficient and collaboration-robust cooperative perception. It dynamically balances intermediate- and late-stage information sharing among agents and refines received detections, sustaining accuracy despite bandwidth constraints and calibration errors.
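To illustrate the refinement step, below is a minimal sketch of one plausible rule-based late-stage fusion, assuming 7-dimensional boxes (x, y, z, l, w, h, yaw) and nearest-center matching. The matching radius, weighting scheme, and function names are hypothetical; mmCooper's refinement module is learned, not rule-based.

```python
# Hypothetical sketch: fuse collaborator detections with ego detections,
# absorbing small calibration-induced offsets. Not the paper's method.
import numpy as np

def refine_detections(ego_boxes, ego_scores, recv_boxes, recv_scores,
                      match_radius=2.0):
    """Confidence-weighted fusion of ego and received boxes."""
    ego_boxes = np.asarray(ego_boxes, dtype=np.float32)
    recv_boxes = np.asarray(recv_boxes, dtype=np.float32).reshape(-1, 7)
    fused_boxes, fused_scores = [], []
    used = np.zeros(len(recv_boxes), dtype=bool)

    for box, score in zip(ego_boxes, ego_scores):
        j = -1
        if len(recv_boxes):
            # Match by BEV center distance; a small radius tolerates the
            # residual misalignment left by imperfect calibration.
            d = np.linalg.norm(recv_boxes[:, :2] - box[:2], axis=1)
            j = int(np.argmin(d))
            if d[j] >= match_radius or used[j]:
                j = -1
        if j >= 0:
            # Confidence-weighted average pulls the box toward the more
            # reliable observation.
            w = np.array([score, recv_scores[j]], dtype=np.float32)
            w /= w.sum()
            fused_boxes.append(w[0] * box + w[1] * recv_boxes[j])
            fused_scores.append(float(max(score, recv_scores[j])))
            used[j] = True
        else:
            fused_boxes.append(box)
            fused_scores.append(float(score))

    # Keep unmatched collaborator boxes: objects only they can see.
    for jj in np.flatnonzero(~used):
        fused_boxes.append(recv_boxes[jj])
        fused_scores.append(float(recv_scores[jj]))
    return np.stack(fused_boxes), np.array(fused_scores)
```

Note that naively averaging the yaw component ignores angle wraparound; a real implementation would fuse orientations more carefully.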
Business Value
Enables more reliable and efficient perception for fleets of autonomous vehicles or robots, improving safety and operational capabilities in complex environments.