
Foundations and Trends® in Machine Learning, Vol. 16, Issue 4

Conformal Prediction: A Gentle Introduction

By Anastasios N. Angelopoulos, University of California, Berkeley, USA, angelopoulos@berkeley.edu | Stephen Bates, University of California, Berkeley, USA, stephenbates@berkeley.edu

 
Suggested Citation
Anastasios N. Angelopoulos and Stephen Bates (2023), "Conformal Prediction: A Gentle Introduction", Foundations and Trends® in Machine Learning: Vol. 16: No. 4, pp 494-591. http://dx.doi.org/10.1561/2200000101

Publication Date: 27 Mar 2023
© 2023 A. N. Angelopoulos and S. Bates
 
Subjects
Classification and prediction,  Nonparametric methods,  Learning and statistical methods,  Statistical/machine learning
 


Abstract

Black-box machine learning models are now routinely used in high-risk settings, such as medical diagnostics, which demand uncertainty quantification to avoid consequential model failures. Conformal prediction (a.k.a. conformal inference) is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of such models. Critically, the sets are valid in a distribution-free sense: they possess explicit, non-asymptotic guarantees even without distributional assumptions or model assumptions. One can use conformal prediction with any pre-trained model, such as a neural network, to produce sets that are guaranteed to contain the ground truth with a user-specified probability, such as 90%. It is easy to understand, easy to use, and general, applying naturally to problems arising in computer vision, natural language processing, deep reinforcement learning, and other fields.

This hands-on introduction aims to provide the reader with a working understanding of conformal prediction and related distribution-free uncertainty quantification techniques in one self-contained document. We lead the reader through practical theory and examples of conformal prediction and describe its extensions to complex machine learning tasks involving structured outputs, distribution shift, time series, outliers, models that abstain, and more. Throughout, there are many explanatory illustrations, examples, and code samples in Python. With each code sample comes a Jupyter notebook implementing the method on a real-data example; the notebooks can be accessed and easily run by following the code footnotes.
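
To make the idea concrete, here is a minimal sketch of split conformal prediction for classification, in the spirit of the monograph's Python examples. The score function (one minus the softmax probability of the true class), the function name, and the synthetic data are illustrative placeholders of our own, not code taken from the monograph.

import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Form prediction sets with (1 - alpha) marginal coverage.

    cal_probs : (n, K) softmax outputs of a pre-trained classifier on calibration data
    cal_labels: (n,) true labels for the calibration data
    test_probs: (m, K) softmax outputs on new test inputs
    """
    n = len(cal_labels)
    # Conformal score: one minus the softmax probability assigned to the true class.
    cal_scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected (1 - alpha) quantile of the calibration scores.
    q_level = np.ceil((n + 1) * (1 - alpha)) / n
    qhat = np.quantile(cal_scores, q_level, method="higher")
    # Include every class whose softmax probability clears the threshold.
    return test_probs >= 1.0 - qhat  # boolean membership mask of shape (m, K)

# Toy usage with synthetic softmax scores (placeholder data, not a real model):
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(5), size=500)
cal_labels = rng.integers(0, 5, size=500)
test_probs = rng.dirichlet(np.ones(5), size=3)
print(conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1))

The (n + 1) correction in the quantile is what yields a distribution-free, finite-sample coverage guarantee under exchangeability; everything else is ordinary array arithmetic on the model's outputs.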

DOI: 10.1561/2200000101
ISBN (paperback): 978-1-63828-158-0, 114 pp., $80.00
ISBN (e-book, PDF): 978-1-63828-159-7, 114 pp., $150.00
Table of contents:
1. Conformal Prediction
2. Examples of Conformal Procedures
3. Evaluating Conformal Prediction
4. Extensions of Conformal Prediction
5. Worked Examples
6. Full Conformal Prediction
7. Historical Notes on Conformal Prediction
8. Acknowledgements
Appendices
References

Conformal Prediction: A Gentle Introduction

Black-box machine learning models are now routinely used in high-risk settings, such as medical diagnostics, which demand uncertainty quantification to avoid consequential model failures. Conformal prediction is a user-friendly paradigm for creating statistically rigorous uncertainty sets/intervals for the predictions of such models. One can use conformal prediction with any pre-trained model, such as a neural network, to produce sets that are guaranteed to contain the ground truth with a user-specified probability, such as 90%. It is easy to understand, easy to use, and general, applying naturally to problems arising in computer vision, natural language processing, deep reinforcement learning, and other fields.

In this hands-on introduction, the authors provide the reader with a working understanding of conformal prediction and related distribution-free uncertainty quantification techniques. They lead the reader through practical theory and examples of conformal prediction and describe its extensions to complex machine learning tasks involving structured outputs, distribution shift, time series, outliers, models that abstain, and more. Throughout, there are many explanatory illustrations, examples, and code samples in Python. With each code sample comes a Jupyter notebook implementing the method on a real-data example.

This hands-on tutorial, full of practical and accessible examples, is essential reading for students, practitioners, and researchers working on systems that deploy machine learning techniques.

 
MAL-101