
I have developed several SVR models for my case study using the linear kernel, and those models were optimized using RMSE as the criterion. Now I'm searching for additional evaluation metrics, and it turns out that most publications use R-squared to compare model performance during the training and validation phases. It is generally suggested to avoid using R-squared to assess a model that uses a non-linear kernel, such as the polynomial or radial basis function kernel, because applying R-squared to non-linear models may lead to wrong conclusions.

Nevertheless, the linear kernel function equation

(1): $\displaystyle{K(x, x_i) = \langle x, x_i \rangle = \sum_j x^{(j)} x_i^{(j)}}$

(the dot product of the two input vectors, summed over the components $j$) still looks quite non-linear, doesn't it?

Could anybody please explain why SVR/SVMs with a linear kernel are considered linear models, and whether using R-squared for evaluation is valid in these cases?
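To make the question concrete, here is a minimal sketch with synthetic data (not from my case study) showing what "linear" means here: with `kernel="linear"`, scikit-learn's `SVR` exposes a single weight vector `coef_` and intercept `intercept_`, and its predictions coincide with the plain linear function $f(x) = w \cdot x + b$, which can then be scored with R-squared.

```python
# Sketch: verify that SVR with a linear kernel is a linear model,
# i.e. its predictions equal w·x + b, and compute R² on it.
# Synthetic data for illustration only.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.normal(size=200)

model = SVR(kernel="linear", C=10.0).fit(X, y)

# For the linear kernel, the dual expansion sum_i alpha_i K(x, x_i) + b
# collapses to a single weight vector w = model.coef_ plus intercept b,
# so the prediction is linear in the inputs:
manual = X @ model.coef_.ravel() + model.intercept_
assert np.allclose(manual, model.predict(X))

print("R^2:", r2_score(y, model.predict(X)))
```

The equivalence checked by the `assert` is exactly why the model counts as linear despite the sum in equation (1): that sum runs over the *components* of one input pair, not over any non-linear transformation of the data.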

Nikos M.
tabumis
  • SVMs are basically linear models. The only thing that can make them non-linear is the kernel, so if the kernel is linear as well, then they are truly linear. – Nikos M. Jun 20 '21 at 18:01
  • @Nikos, thank you for your input. I've just come across this [thread](https://stats.stackexchange.com/questions/2167/applying-the-kernel-trick-to-linear-methods), which implies that kernels are predominantly applied to non-linear models. – tabumis Jun 22 '21 at 12:28
  • The kernel trick allows a problem to be treated via linear methods by applying a non-linear transform to the data. So yes, if the problem is non-linear, then a non-linear kernel may help. But the kernel is linear by default unless explicitly altered. – Nikos M. Jun 22 '21 at 16:42

0 Answers