TY - GEN
T1 - Cal-Net
T2 - 2021 International Joint Conference on Neural Networks, IJCNN 2021
AU - Datta, Arghya
AU - Flynn, Noah R.
AU - Swamidass, S. Joshua
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/7/18
Y1 - 2021/7/18
AB - Datasets in critical domains are often class imbalanced, with a minority class far rarer than the majority class, and classification models struggle to produce calibrated predictions on such datasets. A common approach is to first train a classification model and then apply a post-processing parametric or non-parametric calibration technique that re-scales the model's outputs without tuning any of the model's underlying parameters. In this study, we show that these common approaches are vulnerable to class imbalanced data, often producing unstable results that fail to jointly optimize classification and calibration performance. We introduce Cal-Net, a 'self-calibrating' neural network architecture that simultaneously optimizes classification and calibration performance on class imbalanced datasets in a single training phase, thereby eliminating the need for any post-processing confidence calibration procedure. Empirical results show that Cal-Net outperforms far more complex neural networks and post-processing calibration techniques in both classification and calibration performance on four synthetic and four benchmark class imbalanced binary classification datasets. Furthermore, Cal-Net readily extends to more complicated learning tasks and online learning, and can be incorporated into more complex architectures as the final stage.
UR - http://www.scopus.com/inward/record.url?scp=85108790229&partnerID=8YFLogxK
DO - 10.1109/IJCNN52387.2021.9534411
M3 - Conference contribution
AN - SCOPUS:85108790229
T3 - Proceedings of the International Joint Conference on Neural Networks
BT - IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 18 July 2021 through 22 July 2021
ER -