
I would like to train a GCN for protein-ligand binding affinity regression. I use GCNConv from PyTorch Geometric, ReLU for all activations, and Dropout(0.2) after each of the two dense layers.
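Roughly, the model looks like the sketch below; the hidden sizes, the number of GCNConv layers, and the mean-pool readout are placeholders rather than my exact code:

```python
import torch
import torch.nn.functional as F
from torch.nn import Linear, Dropout
from torch_geometric.nn import GCNConv, global_mean_pool

class BindingAffinityGCN(torch.nn.Module):
    """Rough reconstruction of the described model; layer sizes,
    number of GCNConv layers and the pooling choice are assumptions."""
    def __init__(self, in_channels, hidden_channels=64):
        super().__init__()
        self.conv1 = GCNConv(in_channels, hidden_channels)
        self.conv2 = GCNConv(hidden_channels, hidden_channels)
        self.lin1 = Linear(hidden_channels, hidden_channels)
        self.lin2 = Linear(hidden_channels, hidden_channels)
        self.dropout = Dropout(p=0.2)
        self.out = Linear(hidden_channels, 1)  # single affinity value per graph

    def forward(self, x, edge_index, batch):
        x = F.relu(self.conv1(x, edge_index))
        x = F.relu(self.conv2(x, edge_index))
        x = global_mean_pool(x, batch)           # graph-level readout
        x = self.dropout(F.relu(self.lin1(x)))   # dense + ReLU + Dropout(0.2)
        x = self.dropout(F.relu(self.lin2(x)))
        return self.out(x).squeeze(-1)
```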

While my training curves stay at roughly the same level [training loss plot], I noticed strange periodic jumps at the validation stage [validation loss plot]. What might be the issue with the model and/or the training process?

To me it looks as if the network is being re-initialized after each epoch, but that is not what my code appears to be doing.
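One cheap way to check that hypothesis would be to fingerprint the parameters at every epoch boundary: if the value snaps back to its starting point each epoch, something really is rebuilding the model. The epoch loop shown in the comments uses placeholder names for my actual training code:

```python
def param_fingerprint(model):
    # Sum of absolute parameter values; drifts smoothly while training,
    # but jumps back to roughly the same starting value if the model
    # (or its weights) is re-created between epochs.
    return sum(p.detach().abs().sum().item() for p in model.parameters())

# Inside the (hypothetical) epoch loop:
# for epoch in range(num_epochs):
#     print(f"epoch {epoch}: fingerprint = {param_fingerprint(model):.6f}")
#     train_one_epoch(model, train_loader)   # placeholder for the real loop
#     validate(model, val_loader)
```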

  • It seems that your model is not learning at all. One cause could be your optimization function; try changing it or reducing the learning rate. Another cause could be that you do not have enough data for what you are trying to achieve. – Let's try Aug 07 '20 at 11:28
  • It also looks like your validation values are discrete, which should not be an issue for MSE loss (if I am reading the graphs right). It could happen with accuracy when there are few validation samples, so I would check which is the case here by printing out some detailed error values (a sketch of such a check is shown below). – maksym33 Aug 07 '20 at 14:07
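Following up on the last comment, a per-sample check could look like the sketch below; the `model`, `val_loader`, and `device` names, and the assumption that `batch.y` holds one affinity value per graph, are placeholders rather than code from the question:

```python
import torch

# Print a handful of per-sample validation predictions next to their targets
# to see whether the target values are actually discrete.
model.eval()
with torch.no_grad():
    for batch in val_loader:
        batch = batch.to(device)
        pred = model(batch.x, batch.edge_index, batch.batch)
        for p, y in zip(pred.tolist(), batch.y.tolist()):
            print(f"pred={p:.4f}  target={y:.4f}  abs_err={abs(p - y):.4f}")
        break  # one batch is enough for a spot check
```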

0 Answers