I am working on a binary classification problem, trying to evaluate the performance of several classification algorithms (Logistic Regression, Decision Tree, Random Forest, ...). I am using 10-fold cross-validation (to avoid over-fitting) with ROC AUC as the scoring function to compare the algorithms, but I am getting weird results with Random Forest and AdaBoost: I get a perfect ROC AUC score (i.e. = 1) despite the fact that the Recall (TPR) and FPR of these algorithms are different from 1 and 0 respectively.
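For reference, here is a minimal sketch of the evaluation setup described above, assuming scikit-learn is being used (the synthetic dataset and the estimator settings are placeholders, not taken from the question). Note that the `roc_auc` scorer ranks the model's predicted probabilities across all thresholds, while Recall and FPR are computed from hard labels at a single threshold (0.5 by default), so the two need not agree:

```python
# Sketch of 10-fold CV with ROC AUC scoring (assumes scikit-learn).
# The synthetic data below is a placeholder for the actual dataset.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

X, y = make_classification(n_samples=500, random_state=0)
clf = RandomForestClassifier(random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)

# roc_auc scores the ranking of predict_proba outputs over every possible
# threshold; Recall and FPR come from the 0.5-threshold labels, so AUC can
# be 1 even when Recall < 1 and FPR > 0 at that default threshold.
scores = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print(scores.mean())
```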
Duplicate of [Confused AUC ROC score](https://datascience.stackexchange.com/questions/78032/confused-auc-roc-score) – Jonathan Jul 21 '20 at 17:12
