Explanation generation for anomaly detection models applied to the fuel consumption of a vehicle fleet

Alberto Barbado, Óscar Corcho

In this paper we present a complete process for unsupervised anomaly detection over the fuel consumption of a vehicle fleet that is able to explain, in terms of feature relevance, which variables affect consumption. We combine anomaly detection with a surrogate model that provides that feature relevance. As surrogate models, we evaluate both white-box models from the literature, together with novel variations over them, and black-box models combined with local post-hoc feature relevance techniques. The evaluation is carried out on real IoT data and is measured both in terms of model performance and with Explainable AI (XAI) metrics that compare the generated explanations in terms of representativeness, fidelity, stability and contrastiveness. This provides a complete evaluation of both predictive power and explainability. The explanations produce counterfactual recommendations that show what could have been done to reduce a vehicle's fuel consumption and turn it into an inlier. The procedure is combined with domain knowledge expressed as business rules, and adapts the type of explanation to the target user profile.
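The abstract's core pipeline, detecting anomalies without labels and then extracting feature relevance from a white-box surrogate, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic fleet features, the choice of Isolation Forest as the detector, and the shallow decision tree as the surrogate are all assumptions made here for concreteness.

```python
# Hypothetical sketch of the pipeline described in the abstract:
# unsupervised anomaly detection, then a white-box surrogate model
# fitted on the detector's scores to obtain feature relevance.
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
# Synthetic fleet data (assumed features): [litres/100 km, avg speed km/h, load t]
X = rng.normal(loc=[30.0, 60.0, 5.0], scale=[3.0, 10.0, 1.0], size=(500, 3))
X[:10, 0] += 20.0  # inject a few high-consumption outliers

detector = IsolationForest(random_state=0).fit(X)
labels = detector.predict(X)            # -1 = anomaly, 1 = inlier
scores = detector.decision_function(X)  # higher = more normal

# White-box surrogate: approximate the anomaly score with a shallow tree;
# its impurity-based importances serve as a global feature-relevance measure.
surrogate = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, scores)
relevance = dict(zip(["fuel", "speed", "load"], surrogate.feature_importances_))
print(relevance)
```

A counterfactual recommendation in this setting would then be a perturbation of an anomalous row (e.g. lowering the consumption feature) that moves its score back above the inlier threshold.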
