
I have a univariate time series where I'm trying to predict the current value of a variable from the previous 10 values of the same variable. I tried three approaches:

1. linear regression, using the previous 10 values as features (rough sketch below)
2. an ARIMA model
3. a deep learning (LSTM) model
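For reference, a minimal sketch of the lag-feature setup for the linear regression (the random-walk `series`, the `make_lag_matrix` helper, and the 80/20 chronological split are illustrative stand-ins, not my exact code):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

LAGS = 10

# Stand-in data: a random walk in place of the real series.
rng = np.random.default_rng(0)
series = np.cumsum(rng.normal(size=500))

def make_lag_matrix(series, lags=LAGS):
    """Use the previous `lags` values as features for each target value."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    return X, y

X, y = make_lag_matrix(series)
split = int(len(X) * 0.8)  # chronological 80/20 split, no shuffling
model = LinearRegression().fit(X[:split], y[:split])
pred = model.predict(X[split:])
```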

I computed the RMSE between the predicted and the expected current values. To my surprise, the linear regression model performs slightly better than the other two:

RMSE linear regression = 27.73
RMSE ARIMA = 29
RMSE LSTM = 28.7
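The metric is computed as follows, continuing the sketch above (`y`, `split`, and `pred` come from that snippet):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error between expected and predicted values."""
    return float(np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)))

print(rmse(y[split:], pred))
```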

I know this may be a vague question; please let me know if you need any further information, such as code snippets or plots.

My question is: is this expected? I would have thought the LSTM would be the most powerful approach. Is it possible for a simple linear regression model to outperform an LSTM? Do you have any advice on how to improve the LSTM's performance?

the phoenix
  • I'm not surprised, as I've come across instances where a linear model has outperformed more complex models. To answer your question, it would help to see a sample of the data and also some plots comparing the three models. Also, not sure why you're only training on 10 samples. Seems really low. – fswings May 30 '21 at 15:23
  • Hi @fswings, I updated my post at the following link: https://datascience.stackexchange.com/questions/95227/how-to-know-if-a-time-series-sequence-is-predictibale-or-just-random-univariate. – the phoenix Jun 03 '21 at 13:50

0 Answers