
Unified Conformalized Multiple Testing with Full Data Efficiency

Abstract

Conformalized multiple testing offers a model-free way to control predictive uncertainty in decision-making. Existing methods typically use only part of the available data to build score functions tailored to specific settings. We propose a unified framework that puts data utilization at the center: it uses all available data (null, alternative, and unlabeled) to construct scores and calibrate p-values through a full permutation strategy. This unified use of all available data significantly improves power by enhancing non-conformity score quality and maximizing calibration set size while rigorously controlling the false discovery rate. Crucially, our framework provides a systematic design principle for conformal testing and enables automatic selection of the best conformal procedure among candidates without extra data splitting. Extensive numerical experiments demonstrate that our enhanced methods deliver superior efficiency and adaptability across diverse scenarios.
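As a rough illustration only (not the paper's unified full-permutation procedure), the sketch below shows the baseline conformal multiple-testing pipeline this line of work builds on: conformal p-values computed against held-out null calibration scores, followed by the Benjamini-Hochberg step-up rule for false discovery rate control. Function names and the toy data are illustrative assumptions, not code from the paper.

```python
import numpy as np

def conformal_pvalues(cal_scores, test_scores):
    """Standard conformal p-values: rank each test score against held-out null
    calibration scores. Larger scores are treated as stronger evidence against
    the null hypothesis."""
    cal_scores = np.asarray(cal_scores)
    test_scores = np.asarray(test_scores)
    n = cal_scores.size
    # p_j = (1 + #{i : cal_score_i >= test_score_j}) / (n + 1)
    return (1 + (cal_scores[None, :] >= test_scores[:, None]).sum(axis=1)) / (n + 1)

def benjamini_hochberg(pvals, alpha=0.1):
    """Benjamini-Hochberg step-up procedure; returns a boolean rejection mask."""
    pvals = np.asarray(pvals)
    m = pvals.size
    order = np.argsort(pvals)
    thresholds = alpha * np.arange(1, m + 1) / m
    below = pvals[order] <= thresholds
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])       # largest index passing its threshold
        reject[order[: k + 1]] = True
    return reject

# Toy usage: null calibration scores vs. a mix of null and shifted test scores.
rng = np.random.default_rng(0)
cal = rng.normal(size=500)                          # held-out null scores
test = np.concatenate([rng.normal(size=80),         # true nulls
                       rng.normal(loc=2.5, size=20)])  # alternatives
pv = conformal_pvalues(cal, test)
rejections = benjamini_hochberg(pv, alpha=0.1)
print(f"{rejections.sum()} discoveries out of {test.size} tests")
```

The paper's contribution is to go beyond this split-based baseline by also exploiting alternative and unlabeled data for score construction and by calibrating through a full permutation strategy, which enlarges the effective calibration set while preserving FDR control.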

Key Contributions

This paper proposes a unified framework for conformalized multiple testing that achieves full data efficiency by using all available data (null, alternative, unlabeled). This approach significantly improves power and adaptability by enhancing score quality and maximizing calibration set size while rigorously controlling the false discovery rate.

Business Value

Provides a more powerful and data-efficient method for statistical inference and decision-making under uncertainty, applicable in various fields requiring rigorous hypothesis testing.