I am very new to the field of machine learning and hope this question fits here.
Is there a way to estimate the computational effort required to classify a single input using a previously trained machine learning model, or to simulate that effort in advance?
I've already looked at some articles on this, but I'm not sure I've understood them correctly. For example, this Kaggle thread (https://www.kaggle.com/general/263127) describes the time complexities of various models in big-O terms. But how could I estimate from that how much time inference would actually take on a particular processor?
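The only approach I can think of so far is to benchmark empirically. Here is a minimal sketch of what I mean, assuming scikit-learn is available and using a RandomForestClassifier as a stand-in for "a previously trained model" (any fitted estimator would work the same way):

```python
import time

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Train a stand-in model (in practice this would be the already-trained model).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Time many single-sample predictions and average, to smooth out noise.
sample = X[:1]
n_runs = 1000
start = time.perf_counter()
for _ in range(n_runs):
    model.predict(sample)
elapsed = time.perf_counter() - start
print(f"mean latency per classification: {elapsed / n_runs * 1e3:.3f} ms")
```

But this only tells me the cost after the fact, and only on hardware I already have access to. Is there a principled way to go from the theoretical complexity to a time estimate for a given processor, without running the model on it?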