Dataset Paper · Intended audience: automated driving researchers, robotics engineers, sensor fusion specialists, AI safety researchers

Occluded nuScenes: A Multi-Sensor Dataset for Evaluating Perception Robustness in Automated Driving

📄 Abstract

Robust perception in automated driving requires reliable performance under adverse conditions, where sensors may be affected by partial failures or environmental occlusions. Although existing autonomous driving datasets inherently contain sensor noise and environmental variability, very few enable controlled, parameterised, and reproducible degradations across multiple sensing modalities. This gap limits the ability to systematically evaluate how perception and fusion architectures perform under well-defined adverse conditions. To address this limitation, we introduce the Occluded nuScenes Dataset, a novel extension of the widely used nuScenes benchmark. For the camera modality, we release both the full and mini versions with four types of occlusions, two adapted from public implementations and two newly designed. For radar and LiDAR, we provide parameterised occlusion scripts that implement three types of degradations each, enabling flexible and repeatable generation of occluded data. This resource supports consistent, reproducible evaluation of perception models under partial sensor failures and environmental interference. By releasing the first multi-sensor occlusion dataset with controlled and reproducible degradations, we aim to advance research on robust sensor fusion, resilience analysis, and safety-critical perception in automated driving.
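To make the idea of a parameterised, reproducible degradation concrete, here is a minimal sketch of a seeded angular-sector dropout for a LiDAR point cloud. This is an illustrative assumption, not the authors' released scripts: the function name, parameters, and the choice of degradation are hypothetical.

```python
import numpy as np

def occlude_lidar_sector(points: np.ndarray,
                         start_deg: float = 30.0,
                         width_deg: float = 60.0,
                         drop_prob: float = 1.0,
                         seed: int = 0) -> np.ndarray:
    """Illustrative sketch: drop LiDAR returns inside an azimuth sector.

    points: (N, >=2) array of x, y[, z, ...] coordinates in the sensor frame.
    start_deg / width_deg: azimuth sector to occlude, in degrees.
    drop_prob: probability of removing a point that falls inside the sector.
    seed: fixed seed so the same parameters always produce the same output.
    """
    rng = np.random.default_rng(seed)  # fixed seed -> reproducible occlusion
    azimuth = np.degrees(np.arctan2(points[:, 1], points[:, 0])) % 360.0
    end_deg = (start_deg + width_deg) % 360.0
    if start_deg <= end_deg:
        in_sector = (azimuth >= start_deg) & (azimuth < end_deg)
    else:  # sector wraps around 0 degrees
        in_sector = (azimuth >= start_deg) | (azimuth < end_deg)
    drop = in_sector & (rng.random(points.shape[0]) < drop_prob)
    return points[~drop]
```

Because the sector bounds, drop probability, and seed fully determine the output, the same degraded point cloud can be regenerated on demand, which is the property the paper highlights for its radar and LiDAR occlusion scripts.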
Authors (7)
Sanjay Kumar
Tim Brophy
Reenu Mohandas
Eoin Martino Grua
Ganesh Sistu
Valentina Donzella
+1 more
Submitted: October 21, 2025
arXiv Category: cs.CV

Key Contributions

Introduces the Occluded nuScenes Dataset, an extension of the nuScenes benchmark for evaluating perception robustness in automated driving under controlled occlusions and sensor degradations. It releases occluded camera data (full and mini versions with four occlusion types) together with parameterised occlusion scripts for radar and LiDAR (three degradation types each), enabling systematic and reproducible evaluation.
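For the camera modality, a comparably simple parameterised degradation could look like the sketch below, which blanks a rectangular patch specified as image-size fractions. Again, this is a hedged illustration of the general approach, not one of the four released occlusion types.

```python
import numpy as np

def occlude_camera_patch(image: np.ndarray,
                         top_frac: float = 0.25,
                         left_frac: float = 0.25,
                         height_frac: float = 0.5,
                         width_frac: float = 0.5,
                         fill_value: int = 0) -> np.ndarray:
    """Illustrative sketch: black out a rectangular patch of an (H, W, C) image.

    The patch is given as fractions of the image size, so the same parameters
    reproduce the same relative occlusion at any resolution.
    """
    h, w = image.shape[:2]
    top, left = int(top_frac * h), int(left_frac * w)
    bottom, right = top + int(height_frac * h), left + int(width_frac * w)
    occluded = image.copy()
    occluded[top:bottom, left:right] = fill_value
    return occluded
```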

Business Value

Facilitates the development and validation of more reliable perception systems for autonomous vehicles, which is crucial for safety and public acceptance, and enables standardised testing and benchmarking of robustness.