
I have a time series that I want to predict with an LSTM. I can get very good results using 50 datapoints to predict the 51st, but I struggle to get any accuracy using something like 200 datapoints to predict the 220th. After an epoch, my network outputs 0 for all inputs. Is there a technique for predicting multiple timesteps beyond the end of the input sequence with a neural network?

For example, would it make more sense to predict 1 timestep ahead 20 times in a row, feeding each output back in as input, to reach that 20th timestep? Training on a sequence paired with the timestep 20 steps ahead has not worked so far.

Rob

1 Answer


Yes, you could try applying the LSTM iteratively 20 times. In other words: use the first 200 datapoints to predict the 201st; then use datapoints 2..201 to predict the 202nd; and so on, until you predict the 220th. You'll have to evaluate how well this works on a test set; it might work, or it might not.
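
To make that recipe concrete, here is a minimal sketch of the iterative loop in Python. It assumes you already have a trained Keras model (called model here) that maps a window of 200 points, shaped (1, 200, 1), to a single next value; the function name, window size, and shapes are placeholders for illustration, not anything specific to your setup.

    import numpy as np

    def forecast_iteratively(model, series, window_size=200, steps_ahead=20):
        """Predict steps_ahead future points by feeding each prediction back in."""
        window = list(series[-window_size:])          # start from the last observed window
        predictions = []
        for _ in range(steps_ahead):
            x = np.array(window).reshape(1, window_size, 1)    # (batch, timesteps, features)
            next_value = float(model.predict(x, verbose=0)[0, 0])
            predictions.append(next_value)
            window = window[1:] + [next_value]        # slide the window forward by one step
        return np.array(predictions)

    # usage: preds = forecast_iteratively(model, series)
    # preds[-1] is the prediction 20 steps beyond the end of the series

One thing to keep in mind with this scheme is that each predicted value is fed back in as if it were a real observation, so prediction errors can compound the further out you go.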

This could still fail badly. It could even be that there is just no way to predict 20 timesteps out. For instance, it's possible that the short-term correlation is high but the long-term correlation is low. Think of the weather: it's possible to predict tomorrow's weather with relatively high accuracy, but it seems to be extremely hard to predict the weather 3 weeks out. So there might just be fundamental barriers to making predictions that far into the future.

D.W.