Abstract
Error in neural network time series forecasting comprises two components: autocorrelation error, arising from an imperfect model, and random noise, inherent in the data. Both are addressed here. The first is tackled with a two-stage training, growth-network neuron, the autocorrelation error (ACE) neuron; the second is treated as a post-processing noise filtering problem. These techniques are applied to forecasting the sunspot time series, with a comparison of stochastic, BFGS and conjugate gradient solvers.
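The abstract frames noise reduction as a post-processing step applied to the network's forecasts. As a minimal sketch of that idea, the snippet below smooths a forecast series with a centered moving average; this is one generic choice of filter, not the specific scheme used in the paper, and the `moving_average_filter` name and `window` parameter are illustrative assumptions.

```python
import numpy as np

def moving_average_filter(forecast, window=3):
    """Smooth a forecast series with a centered moving average.

    A generic post-processing noise filter; the paper's specific
    filtering scheme is not detailed in the abstract.
    """
    kernel = np.ones(window) / window
    # mode="same" keeps the output the same length as the input;
    # values near the edges are averaged over a zero-padded window.
    return np.convolve(forecast, kernel, mode="same")

# Example: filter a noisy sine series standing in for a raw forecast
t = np.linspace(0, 4 * np.pi, 200)
noisy = np.sin(t) + 0.2 * np.random.default_rng(0).normal(size=t.size)
smoothed = moving_average_filter(noisy, window=5)
```

Filtering after training, rather than during it, leaves the network model untouched and simply attenuates the high-frequency component of the output that the data's inherent noise produces.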
Azoff, E.M. Reducing error in neural network time series forecasting. Neural Comput & Applic 1, 240–247 (1993). https://doi.org/10.1007/BF02098741