Time Series Prediction via Neural Networks
I have been working with neural networks for various purposes lately. I have had great success with digit recognition, XOR, and various other easy, hello-world-ish applications.

I would like to tackle the domain of time series estimation. I do not have a university account at the moment to read all the IEEE/ACM papers on the topic (for free), nor can I find many resources detailing the use of ANNs for time series forecasting.

I would like to know if anyone has suggestions or can recommend resources on using ANNs for forecasting from time series data.

I would assume that to train the NN, you would feed in several immediately preceding timesteps, with the expected output being the next timestep (for example: inputs at timesteps n-5, n-4, n-3, n-2, n-1, with the expected output being the value at timestep n), then slide the window forward some number of timesteps and do it all again.
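
To make what I mean concrete, here is a rough sketch (NumPy) of how I imagine building the training pairs; the window size of 5 and the sine-wave series are just placeholders:

    import numpy as np

    def make_windows(series, window=5):
        """Build (inputs, target) training pairs from a 1-D series using a
        sliding window: values at n-5..n-1 predict the value at n."""
        X, y = [], []
        for i in range(window, len(series)):
            X.append(series[i - window:i])
            y.append(series[i])
        return np.array(X), np.array(y)

    # Toy sine-wave series just for illustration
    series = np.sin(np.linspace(0, 10 * np.pi, 500))
    X, y = make_windows(series, window=5)
    print(X.shape, y.shape)  # (495, 5) (495,)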

Can anyone confirm this or comment on it? I would appreciate it!

Cowpea answered 20/11, 2010 at 2:16
I think that you've got the basic idea: a "sliding window" approach, where a network is trained to use the last k values of a series (T[n-k] ... T[n-1]) to predict the current value (T[n]).

There are a lot of ways you can do this, however. For example:

  • How big should that window be?
  • Should the data be preprocessed in any way (e.g. to remove outliers)?
  • What network configuration (e.g. # of hidden nodes, # of layers) and algorithm should be used?

Often people end up figuring out the best way to learn from their particular data by trial and error.
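As an illustration only (not a recommendation for any of the choices above), here is a minimal sliding-window setup using scikit-learn's MLPRegressor; the window size, hidden-layer size, and train/test split are all arbitrary assumptions:

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Toy series; window size k = 10 is an arbitrary choice.
    series = np.sin(np.linspace(0, 20 * np.pi, 1000))
    k = 10
    X = np.array([series[i - k:i] for i in range(k, len(series))])
    y = series[k:]

    # Hold out the last 100 points for testing (arbitrary split).
    X_train, X_test = X[:-100], X[-100:]
    y_train, y_test = y[:-100], y[-100:]

    # One hidden layer of 20 nodes -- one of the knobs to tune by trial and error.
    net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
    net.fit(X_train, y_train)
    print("test R^2:", net.score(X_test, y_test))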

There are a fair number of publicly accessible papers out there about this stuff. Search Google Scholar for something like "neural network time series forecasting", then follow each paper's citations and the papers that cite it, and you should have plenty to read.

Hamish answered 20/11, 2010 at 12:43
There is a kind of neural network called a recurrent neural network (RNN). One advantage of these models is that you do not have to define a sliding window for the input examples. A variant of RNNs known as Long Short-Term Memory (LSTM) can potentially take into account many instances at previous timesteps, and a "forget gate" is used to decide whether to remember or discard results from earlier timesteps.
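
As a sketch only, here is roughly what an LSTM model for one-step-ahead prediction looks like in Keras; the sequence length, layer sizes, and placeholder data are all arbitrary assumptions:

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    # Placeholder data just to show the expected shapes:
    # (samples, timesteps, features) for inputs, one value per sample as target.
    X = np.random.rand(200, 50, 1)
    y = np.random.rand(200, 1)

    model = Sequential([
        LSTM(32, input_shape=(50, 1)),  # 32 units is an arbitrary choice
        Dense(1),                       # one-step-ahead prediction
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=5, batch_size=32, verbose=0)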

Koziel answered 28/12, 2015 at 20:56
Technically, this is the same as your digit recognition - the network recognizes something and returns what it was...

Well - now your inputs are the previous steps (T-5 ... T-1), and your output or outputs are the predicted steps (T0, T1, ...).
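
For instance, to predict two steps ahead at once (T0 and T1 from T-5 ... T-1), the training targets simply become vectors; a minimal NumPy sketch, where the window and horizon sizes are arbitrary:

    import numpy as np

    def make_multi_step(series, window=5, horizon=2):
        """Inputs are T-5..T-1; targets are the next `horizon` values (T0, T1)."""
        X, y = [], []
        for i in range(window, len(series) - horizon + 1):
            X.append(series[i - window:i])
            y.append(series[i:i + horizon])
        return np.array(X), np.array(y)

    series = np.arange(20.0)
    X, y = make_multi_step(series)
    print(X[0], y[0])  # [0. 1. 2. 3. 4.] [5. 6.]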

The mechanics inside the ANN are the same - you train each layer to detect features, correcting its reconstruction of the input, so that the prediction comes to look like what actually happens next.

(For some more info about what I mean, see this tech talk.)

Aneto answered 21/11, 2010 at 9:49
