Abstract

In recent years there has been growing interest in the ROC curve for
characterizing the performance of machine learning algorithms. This is
largely because misclassification costs are unknown in many
real-world problems, so the ROC curve and related metrics such as the
Area Under the ROC Curve (AUC) can be more meaningful performance
measures than accuracy alone.
In this paper, we propose an SVM-based algorithm for AUC
maximization and show that, under certain conditions, this algorithm
is related to 2-norm soft-margin Support Vector Machines. We
present experiments comparing the performance of SVMs to that of
other AUC-maximization algorithms and provide an empirical
analysis of SVM behavior with respect to ROC-based metrics. Our
main conclusion is that SVMs can maximize both AUC and accuracy.
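The AUC discussed above equals the probability that a randomly drawn positive example is scored higher than a randomly drawn negative one (the Wilcoxon-Mann-Whitney statistic). A minimal sketch of that pairwise definition, with illustrative scores not taken from the paper:

```python
def pairwise_auc(pos_scores, neg_scores):
    """AUC as the fraction of correctly ranked (positive, negative)
    score pairs; ties count as half a correct pair."""
    correct = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                correct += 1.0
            elif p == n:
                correct += 0.5
    return correct / (len(pos_scores) * len(neg_scores))

# Example: 8 of the 9 positive/negative pairs are ranked correctly.
auc = pairwise_auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2])  # 8/9 ~ 0.889
```

An AUC-maximizing learner optimizes this pairwise ranking quantity directly, rather than the per-example error rate that accuracy measures.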