Better classifier calibration for small datasets
Alasalmi, Tuomo; Suutala, Jaakko; Röning, Juha; Koskimäki, Heli (2020-05-01)
Tuomo Alasalmi, Jaakko Suutala, Juha Röning, and Heli Koskimäki. 2020. Better Classifier Calibration for Small Datasets. ACM Trans. Knowl. Discov. Data 14, 3, Article 34 (May 2020), 19 pages. DOI: https://doi.org/10.1145/3385656
© 2020 Association for Computing Machinery. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record was published in ACM Transactions on Knowledge Discovery from Data, https://doi.org/10.1145/3385656.
https://rightsstatements.org/vocab/InC/1.0/
https://urn.fi/URN:NBN:fi-fe2020052739290
Abstract
Classifier calibration does not always go hand in hand with the classifier’s ability to separate the classes. There are applications where good classifier calibration, i.e., the ability to produce accurate probability estimates, is more important than class separation. When the amount of training data is limited, the traditional approach to improving calibration starts to break down. In this article, we show that generating additional data for calibration can improve calibration algorithm performance in many cases where a classifier does not naturally produce well-calibrated outputs and the traditional approach fails. The proposed approach adds computational cost, but because the main use case is small datasets, this extra cost remains insignificant, and prediction time is comparable to that of other methods. Of the tested classifiers, the largest improvements were observed with the random forest and naive Bayes classifiers. The proposed approach can therefore be recommended at least for those classifiers when the amount of data available for training is limited and good calibration is essential.
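The abstract does not specify how the extra calibration data are generated, so the sketch below only illustrates the general setting it describes: a random forest calibrated with isotonic regression on a small held-out set, and a placeholder augmentation step (noisy copies of the calibration points) showing where generated calibration data would enter the pipeline. The augmentation scheme, the dataset from make_classification, and parameters such as n_copies and the noise scale are illustrative assumptions, not the method of the paper.

# Minimal sketch, assuming scikit-learn; the noise-based augmentation is a stand-in,
# NOT the data-generation method proposed in the article.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.isotonic import IsotonicRegression
from sklearn.metrics import brier_score_loss

rng = np.random.default_rng(0)
# A deliberately small dataset to mimic the limited-data setting.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X_fit, X_test, y_fit, y_test = train_test_split(X, y, test_size=0.5, random_state=0, stratify=y)
# Traditional approach: hold out part of the already small training data for calibration.
X_train, X_cal, y_train, y_cal = train_test_split(X_fit, y_fit, test_size=0.3, random_state=0, stratify=y_fit)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Uncalibrated probabilities on the test set, scored with the Brier score.
p_test = clf.predict_proba(X_test)[:, 1]
print("uncalibrated Brier:", brier_score_loss(y_test, p_test))

# Traditional calibration: fit an isotonic map on the tiny held-out calibration set.
p_cal = clf.predict_proba(X_cal)[:, 1]
iso = IsotonicRegression(out_of_bounds="clip").fit(p_cal, y_cal)
print("held-out isotonic Brier:", brier_score_loss(y_test, iso.predict(p_test)))

# Illustrative "generate more calibration data" step: jitter each calibration point
# several times, keep its label, and fit the isotonic map on the enlarged set.
n_copies = 20
X_gen = np.vstack([X_cal + rng.normal(scale=0.1, size=X_cal.shape) for _ in range(n_copies)])
y_gen = np.tile(y_cal, n_copies)
p_gen = clf.predict_proba(X_gen)[:, 1]
iso_gen = IsotonicRegression(out_of_bounds="clip").fit(p_gen, y_gen)
print("augmented isotonic Brier:", brier_score_loss(y_test, iso_gen.predict(p_test)))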