PyCP: An Open-Source Conformal Predictions Toolkit

Abstract: The Conformal Predictions framework is a game-theoretic approach to reliable machine learning that provides a methodology for obtaining error calibration in both classification and regression settings. The framework combines principles of transductive inference, algorithmic randomness, and hypothesis testing to provide guaranteed error calibration in online settings (and calibration in offline settings, supported by empirical studies). As the framework is increasingly used in a variety of machine learning settings such as active learning, anomaly detection, feature selection, and change detection, there is a need for algorithmic implementations that researchers and practitioners can use and further improve. In this paper, we introduce PyCP, an open-source implementation of the Conformal Predictions framework that currently supports classification problems in transductive and Mondrian settings. PyCP is modular, extensible, and intended for community sharing and development.
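To illustrate the transductive methodology the abstract describes, the following is a minimal sketch (not PyCP's actual API) of a conformal p-value for classification: a candidate label is provisionally assigned to a new example, a nonconformity score is computed for every example in the augmented set, and the p-value is the fraction of scores at least as large as the new example's. The 1-nearest-neighbour distance ratio used here is one common choice of nonconformity measure; the function name and data are illustrative assumptions.

```python
import numpy as np

def p_value(X_train, y_train, x_new, label):
    """Transductive conformal p-value for assigning `label` to x_new.

    Provisionally adds (x_new, label) to the training bag, scores every
    example with a 1-NN nonconformity measure (distance to the nearest
    same-class neighbour divided by distance to the nearest other-class
    neighbour), and returns the fraction of scores >= the new example's.
    """
    X = np.vstack([X_train, x_new])
    y = np.append(y_train, label)
    n = len(y)
    scores = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                    # exclude the example itself
        same = d[y == y[i]].min()        # nearest same-class neighbour
        other = d[y != y[i]].min()       # nearest other-class neighbour
        scores[i] = same / other         # high score = nonconforming
    return np.mean(scores >= scores[-1])

# Illustrative data: two well-separated classes.
X_train = np.array([[0., 0.], [0., 1.], [1., 0.],
                    [5., 5.], [5., 6.], [6., 5.]])
y_train = np.array([0, 0, 0, 1, 1, 1])
x_new = np.array([0.2, 0.4])             # clearly near class 0

p0 = p_value(X_train, y_train, x_new, 0)
p1 = p_value(X_train, y_train, x_new, 1)
```

A prediction region at significance level ε would then contain every label whose p-value exceeds ε; under the framework's exchangeability assumption, the true label is excluded with probability at most ε.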
Document type: Conference papers
Cited literature: 15 references
Contributor: Hal Ifip
Submitted on: Tuesday, February 7, 2017 - 1:05:39 PM
Last modification on: Thursday, March 5, 2020 - 5:41:28 PM
Long-term archiving on: Monday, May 8, 2017 - 2:09:27 PM


Files produced by the author(s)


Distributed under a Creative Commons Attribution 4.0 International License



Vineeth N. Balasubramanian, Aaron Baker, Matthew Yanez, Shayok Chakraborty, Sethuraman Panchanathan. PyCP: An Open-Source Conformal Predictions Toolkit. 9th Artificial Intelligence Applications and Innovations (AIAI), Sep 2013, Paphos, Greece. pp.361-370, ⟨10.1007/978-3-642-41142-7_37⟩. ⟨hal-01459631⟩


