Questions tagged [precision-recall-curve]

8 questions
2
votes
2 answers

Plotting a no-skill model in a precision-recall curve

I am following this tutorial to apply threshold tuning using the precision-recall curve for an imbalanced dataset. Within the tutorial, a no-skill model is defined as: A no-skill model is represented by a horizontal line with a precision that is the…
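A minimal sketch of what that baseline looks like in code: the no-skill precision equals the positive-class prevalence at every recall level. The synthetic dataset and logistic-regression model below are only for illustration; swap in your own labels and scores.

```python
# Sketch of a PR curve with a no-skill baseline (toy data, for illustration only).
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_curve
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, weights=[0.9], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_scores = model.predict_proba(X_test)[:, 1]

precision, recall, _ = precision_recall_curve(y_test, y_scores)

# A no-skill model's precision equals the positive-class prevalence at every recall.
no_skill = y_test.mean()

plt.plot(recall, precision, label="model")
plt.plot([0, 1], [no_skill, no_skill], linestyle="--", label="no skill")
plt.xlabel("Recall")
plt.ylabel("Precision")
plt.legend()
plt.show()
```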
0
votes
0 answers

Mean Average Precision with the 11-point interpolation method: Python libs

I want to calculate mAP with the 11-point interpolation method for object detection, as described here: https://learnopencv.com/mean-average-precision-map-object-detection-model-evaluation-metric/ Which Python libraries implement this algorithm? For…
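If no library function is at hand, the 11-point rule is short enough to write directly: take the maximum precision at each recall level in {0.0, 0.1, …, 1.0} and average the eleven values. A sketch, assuming `recalls` and `precisions` are parallel arrays for one class computed from the ranked detections (the toy arrays below are made up):

```python
import numpy as np

def ap_11_point(recalls, precisions):
    """11-point interpolated AP: mean of the max precision at recall >= r, r in {0.0, ..., 1.0}."""
    ap = 0.0
    for r in np.linspace(0.0, 1.0, 11):
        mask = recalls >= r
        # Interpolated precision: best precision achievable at recall >= r (0 if unreachable).
        ap += precisions[mask].max() if mask.any() else 0.0
    return ap / 11.0

# Toy recall/precision pairs from a ranked list of detections, for illustration only.
recalls = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
precisions = np.array([1.0, 1.0, 0.75, 0.8, 0.71])
print(ap_11_point(recalls, precisions))
```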
0
votes
0 answers

Multilabel metrics: micro vs. macro vs. weighted vs. samples?

I'm working on a multilabel classification problem; there are $N$ classes and each example can belong to anywhere from $0$ to $N$ of those classes. Below you can see the precision and recall computed using various averaging options with…
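A quick way to see how the averaging options differ is to run them side by side on a small multilabel example. The indicator matrices below are made up purely for illustration:

```python
# Compare sklearn's averaging options on a toy multilabel problem.
import numpy as np
from sklearn.metrics import precision_score, recall_score

y_true = np.array([[1, 0, 1],
                   [0, 1, 0],
                   [1, 1, 0],
                   [0, 0, 0]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 1],
                   [1, 0, 0],
                   [0, 0, 0]])

# micro: pool all label decisions; macro: unweighted mean over classes;
# weighted: mean over classes weighted by support; samples: mean over examples.
for avg in ["micro", "macro", "weighted", "samples"]:
    p = precision_score(y_true, y_pred, average=avg, zero_division=0)
    r = recall_score(y_true, y_pred, average=avg, zero_division=0)
    print(f"{avg:>8}: precision={p:.3f} recall={r:.3f}")
```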
0
votes
1 answer

Can I use macro recall to check if my RF model is overfitting?

I have a dataset with 837,377 observations (51% for training, 25% for validation, and 24% for testing) and 19 features. I calculated the macro-averaged recall for train, validation, and test and obtained: Train: 0.9981845060159042 Val:…
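The usual check is to compute the same metric on each split and look at the gap between train and validation/test; a large gap is one sign of overfitting. A sketch with a synthetic dataset and random forest standing in for the real ones:

```python
# Compare macro-averaged recall across train/validation/test splits.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Toy multi-class data for illustration only.
X, y = make_classification(n_samples=5000, n_classes=3, n_informative=6, random_state=0)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, train_size=0.51, random_state=0)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, train_size=0.5, random_state=0)

rf = RandomForestClassifier(random_state=0).fit(X_train, y_train)

for name, X_split, y_split in [("train", X_train, y_train),
                               ("val", X_val, y_val),
                               ("test", X_test, y_test)]:
    score = recall_score(y_split, rf.predict(X_split), average="macro")
    print(f"{name}: macro recall = {score:.4f}")
```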
0
votes
1 answer

Precision vs probability

Say I have a model which predicts a class $C_i$ from an input $X$ with a probability of 0.95, i.e. $P(C_i \mid X)=0.95$. That would mean that, if we did this over and over, we would be correct 95 out of 100 times. Having a model with a precision of 0.95 (for…
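The two quantities live on different levels: the 0.95 is attached to a single input, while precision is a rate over the whole set of predictions for that class. A short sketch in the question's notation, with $Y$ and $\hat{Y}$ introduced here for the true and predicted labels:

```latex
% The 0.95 in the question is a conditional probability for one particular input x:
\[
  P(C_i \mid X = x) = 0.95 .
\]
% Precision is a long-run frequency over all inputs the model assigns to class C_i:
\[
  \operatorname{precision}_i
    = P\bigl(Y = C_i \mid \hat{Y} = C_i\bigr)
    \approx \frac{TP_i}{TP_i + FP_i} .
\]
% The two coincide only under idealised conditions (perfect calibration, with every
% prediction for C_i made at the same score), so a 0.95 score need not imply 0.95 precision.
```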
0
votes
0 answers

High Recall and Low Precision for Binary CNN model

I was training a CNN model for binary classification. The training and validation accuracy seemed good; however, the precision is low and the recall is high (many false positives). Recall of the model is 1.00, precision is 0.71, F1-score…
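One common first step in this situation is to raise the decision threshold on the sigmoid outputs, trading some recall for precision. A sketch; the random labels and scores below are stand-ins for the CNN's validation predictions and are only for illustration:

```python
# Sweep the decision threshold and watch precision rise as recall falls.
import numpy as np
from sklearn.metrics import f1_score, precision_score, recall_score

rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=1000)                        # placeholder labels
probs = np.clip(0.3 * y_val + 0.7 * rng.random(1000), 0, 1)  # placeholder sigmoid outputs

for threshold in (0.5, 0.6, 0.7, 0.8, 0.9):
    y_pred = (probs >= threshold).astype(int)
    print(f"threshold={threshold:.1f} "
          f"precision={precision_score(y_val, y_pred, zero_division=0):.2f} "
          f"recall={recall_score(y_val, y_pred, zero_division=0):.2f} "
          f"f1={f1_score(y_val, y_pred, zero_division=0):.2f}")
```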
0
votes
0 answers

How to interpret a low area under the PR curve with a high area under the ROC curve?

I am still a rookie. I am training a random forest model and I am getting 0.27 for the area under the PR curve and 0.85 for the area under the ROC curve. My data was very imbalanced toward the negative label, and I performed down-sampling to balance it. I understand what ROC…
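A sketch of computing both areas side by side on an imbalanced problem; the synthetic data and logistic model below are only for illustration:

```python
# PR AUC vs. ROC AUC on imbalanced toy data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, weights=[0.95], random_state=1)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=1)

scores = LogisticRegression(max_iter=1000).fit(X_train, y_train).predict_proba(X_test)[:, 1]

print("PR AUC :", average_precision_score(y_test, scores))  # judge against prevalence
print("ROC AUC:", roc_auc_score(y_test, scores))            # judge against 0.5
print("positive prevalence:", y_test.mean())                # the PR no-skill baseline
```

The point of the comparison is that ROC AUC also rewards ranking the abundant negatives correctly, so it can look strong on imbalanced data, while PR AUC should be judged against the positive-class prevalence rather than against 0.5.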
-2
votes
1 answer

How to increase precision and recall values in your deep learning model

I am getting good accuracy, around 80, with precision = 66, recall = 37, and F1 = 47. How can I improve the precision and recall metrics in an anomaly detection scenario? Any suggestions?
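Before changing the model itself, one inexpensive option is to sweep the precision-recall curve on held-out data and pick the operating threshold that maximises F1. A sketch; the random labels and anomaly scores below are stand-ins for real model outputs:

```python
# Pick the score threshold that maximises F1 on a validation set.
import numpy as np
from sklearn.metrics import precision_recall_curve

rng = np.random.default_rng(0)
y_val = rng.integers(0, 2, size=2000)                         # placeholder labels
scores = np.clip(0.3 * y_val + 0.7 * rng.random(2000), 0, 1)  # placeholder anomaly scores

precision, recall, thresholds = precision_recall_curve(y_val, scores)
f1 = 2 * precision * recall / (precision + recall + 1e-12)

best = int(np.argmax(f1[:-1]))  # the final precision/recall pair has no threshold
print(f"best threshold={thresholds[best]:.3f} "
      f"precision={precision[best]:.2f} recall={recall[best]:.2f} f1={f1[best]:.2f}")
```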