
I am training a neural network and using 10-fold cross-validation to measure performance. I have read a lot of documentation and forum posts saying that the weights to save or checkpoint are the ones that give the lowest val_loss rather than the highest val_accuracy, since the former usually results in higher test accuracy.

Out of curiosity, I checkpointed both the highest val_accuracy and the lowest val_loss during training. However, I found that, for some folds, I get better test accuracy when I use the weights with the highest val_accuracy rather than those with the lowest val_loss. So, during cross-validation, I chose whichever set of weights gave the higher test accuracy, regardless of whether it came from the highest val_accuracy or the lowest val_loss checkpoint, and then averaged the resulting test accuracy across the 10 folds.
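
For concreteness, here is a minimal sketch of what such a dual-checkpoint setup for a single fold could look like, assuming a Keras-style training loop; the model, data, and filenames below are placeholders rather than my actual setup:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.callbacks import ModelCheckpoint

# Placeholder data and model for one fold -- not the actual architecture or dataset.
x_train, y_train = np.random.rand(200, 20), np.random.randint(0, 2, 200)
x_val, y_val = np.random.rand(50, 20), np.random.randint(0, 2, 50)
x_test, y_test = np.random.rand(50, 20), np.random.randint(0, 2, 50)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# One checkpoint tracks the lowest validation loss ...
ckpt_loss = ModelCheckpoint("best_val_loss.weights.h5",
                            monitor="val_loss", mode="min",
                            save_best_only=True, save_weights_only=True)
# ... and a second one tracks the highest validation accuracy.
ckpt_acc = ModelCheckpoint("best_val_accuracy.weights.h5",
                           monitor="val_accuracy", mode="max",
                           save_best_only=True, save_weights_only=True)

model.fit(x_train, y_train,
          validation_data=(x_val, y_val),
          epochs=20, batch_size=32, verbose=0,
          callbacks=[ckpt_loss, ckpt_acc])

# Evaluate both checkpoints on the held-out test data for this fold.
model.load_weights("best_val_loss.weights.h5")
test_acc_from_loss_ckpt = model.evaluate(x_test, y_test, verbose=0)[1]
model.load_weights("best_val_accuracy.weights.h5")
test_acc_from_acc_ckpt = model.evaluate(x_test, y_test, verbose=0)[1]
```

I then take the larger of the two test accuracies for the fold and average these values over the 10 folds.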

Is my methodology valid?

  • This might help: [Should I prefer the model with the lowest validation loss or the highest validation accuracy to deploy?](https://ai.stackexchange.com/questions/17975/should-i-prefer-the-model-with-the-lowest-validation-loss-or-the-highest-validat) – serali Oct 20 '21 at 08:56
  • Also, [What is the relationship between the accuracy and the loss in deep learning?](https://datascience.stackexchange.com/questions/42599/what-is-the-relationship-between-the-accuracy-and-the-loss-in-deep-learning#:%7E:text=There%20is%20no%20relationship%20between,you%20made%20on%20the%20data.) – desertnaut Oct 20 '21 at 21:47
