
DIsoN: Decentralized Isolation Networks for Out-of-Distribution Detection in Medical Imaging

Abstract

Safe deployment of machine learning (ML) models in safety-critical domains such as medical imaging requires detecting inputs with characteristics not seen during training, known as out-of-distribution (OOD) detection, to prevent unreliable predictions. Effective OOD detection after deployment could benefit from access to the training data, enabling direct comparison between test samples and the training distribution. State-of-the-art OOD detection methods, however, either discard the training data after deployment or assume that test samples and training data are stored together centrally, an assumption that rarely holds in practice: shipping the training data with the deployed model is usually impossible due to the size of training databases as well as proprietary or privacy constraints. We introduce the Isolation Network, an OOD detection framework that quantifies the difficulty of separating a target test sample from the training data by solving a binary classification task. We then propose Decentralized Isolation Networks (DIsoN), which enable the comparison of training and test data when data-sharing is impossible by exchanging only model parameters between the remote computational nodes of training and deployment. We further extend DIsoN with class-conditioning, comparing a target sample only with training data of its predicted class. We evaluate DIsoN on four medical imaging datasets (dermatology, chest X-ray, breast ultrasound, histopathology) across 12 OOD detection tasks. DIsoN performs favorably against existing methods while respecting data privacy. This decentralized framework opens the way for a new type of service that ML developers could offer alongside their models: remote, secure use of their training data for OOD detection. Code: https://github.com/FelixWag/DIsoN
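The core isolation idea can be sketched with a toy example. The snippet below is a hypothetical illustration, not the paper's implementation: it stands in a simple logistic-regression "isolation" classifier for the deep networks the paper uses, labels the single target sample 1 against the training samples 0, and scores the target by the residual loss after training. A target that is easy to isolate (low residual loss) is likely OOD; a target that blends into the training distribution resists separation.

```python
import numpy as np


def isolation_score(train_X, target, steps=200, lr=0.5):
    """Toy isolation score (illustrative sketch, not the paper's code).

    Fits a logistic-regression classifier that labels the target 1 and all
    training samples 0, then returns the residual weighted cross-entropy.
    A low score means the target was easy to isolate, suggesting OOD.
    """
    X = np.vstack([train_X, target[None, :]])
    y = np.concatenate([np.zeros(len(train_X)), [1.0]])
    # Up-weight the single positive so it is not swamped by the negatives.
    sw = np.where(y == 1, float(len(train_X)), 1.0)
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        g = sw * (p - y)                      # weighted logistic gradient
        w -= lr * (X.T @ g) / sw.sum()
        b -= lr * g.sum() / sw.sum()
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    eps = 1e-9
    losses = -(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return float(np.sum(sw * losses) / sw.sum())


rng = np.random.default_rng(0)
train_X = rng.normal(0.0, 1.0, size=(200, 2))   # in-distribution cluster
id_target = np.array([0.1, -0.2])               # looks like training data
ood_target = np.array([8.0, 8.0])               # far from training data

id_score = isolation_score(train_X, id_target)
ood_score = isolation_score(train_X, ood_target)
# The OOD sample is linearly separable from the cluster, so its residual
# loss ends lower than the in-distribution sample's.
```

The up-weighting of the lone positive is one illustrative way to keep the one-vs-all task balanced; any mechanism that prevents the single target from being ignored would serve the same purpose in this sketch.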

Key Contributions

DIsoN introduces a decentralized framework for out-of-distribution (OOD) detection in medical imaging, addressing the practical setting where the training data cannot be shipped or shared with the deployed model. It scores a test sample by how difficult it is to separate from the training distribution with a binary classifier, exchanging only model parameters between the training and deployment nodes.
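The decentralized aspect can likewise be sketched as two nodes alternately updating a shared classifier, with only the parameters crossing the boundary. This is again an illustrative toy under stated assumptions: the node roles, update rule, and round count below are not the paper's protocol, and a linear model stands in for the actual networks.

```python
import numpy as np


def local_step(w, b, X, y, lr=0.5):
    """One gradient step of the shared isolation classifier on local data."""
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    return w - lr * (X.T @ g) / len(y), b - lr * g.mean()


rng = np.random.default_rng(1)
train_X = rng.normal(0.0, 1.0, size=(100, 2))  # stays on the training node
target = np.array([[6.0, 6.0]])                # stays on the deployment node

w, b = np.zeros(2), 0.0
for _ in range(100):
    # Training node: push the shared classifier toward label 0 on its data,
    # then send only (w, b) to the deployment node.
    w, b = local_step(w, b, train_X, np.zeros(len(train_X)))
    # Deployment node: push it toward label 1 on the single target sample,
    # then send only (w, b) back. Raw data never leaves either node.
    w, b = local_step(w, b, target, np.ones(1))

# The classifier's confidence that the target separates from the training
# data serves as the OOD signal: high confidence means easy isolation.
p_target = (1.0 / (1.0 + np.exp(-(target @ w + b)))).item()
```

In this toy run the target sits far from the training cluster, so the alternating rounds settle on a classifier that isolates it with high confidence; an in-distribution target would keep the two nodes' updates in conflict and the confidence low.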

Business Value

Enhances the safety and reliability of AI models deployed in critical applications such as healthcare by detecting and flagging out-of-distribution inputs before they lead to erroneous predictions, thereby building trust and preventing harm, all without requiring the training data to be shared.