Improved Predictive Uncertainty using Corruption-based Calibration

Tiago Salvador, Vikram Voleti, Alexander Iannantuono, Adam Oberman

We propose a simple post-hoc calibration method to estimate the confidence (uncertainty) that a model's prediction is correct on data under covariate shift, as represented by the large-scale corrupted-data benchmark of [Ovadia et al., 2019]. We achieve this by synthesizing surrogate calibration sets: we corrupt the calibration set at varying intensities of a known corruption. Our method yields significant improvements on the benchmark across a wide range of covariate shifts.
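The idea of corruption-based surrogate calibration sets can be illustrated with a minimal sketch: corrupt a held-out calibration set at several severities of a known corruption (Gaussian noise here), then fit a standard post-hoc calibrator (temperature scaling, fit by grid search) on each surrogate set. The toy linear "model", the noise corruption, and all names below are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax over the class axis
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def nll(logits, labels, T):
    # negative log-likelihood of temperature-scaled probabilities
    p = softmax(logits / T)
    return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

def fit_temperature(logits, labels):
    # temperature scaling fit by a simple grid search on the calibration set
    grid = np.linspace(0.5, 5.0, 46)
    return float(min(grid, key=lambda T: nll(logits, labels, T)))

# toy stand-in for a trained classifier: a fixed linear map to 3 classes
W = rng.normal(size=(10, 3))
x_cal = rng.normal(size=(500, 10))          # clean calibration inputs
y_cal = softmax(x_cal @ W).argmax(axis=1)   # labels the model gets right when clean

# surrogate calibration sets: corrupt the calibration inputs at increasing
# severity and fit one temperature per severity level
temps = {}
for severity in range(6):  # 0 = clean, 1..5 = increasing Gaussian noise
    x_corr = x_cal + rng.normal(0, 0.3 * severity, x_cal.shape)
    logits = (x_corr @ W) * 4.0  # scale makes the toy model overconfident
    temps[severity] = fit_temperature(logits, y_cal)
```

Under this setup the fitted temperature grows with corruption severity, reflecting that the model becomes more miscalibrated as the shift intensifies; at test time one would pick the temperature matching the estimated severity of the shift.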
