Published in: Neural Processing Letters 3/2019

12.06.2019

Time Series Forecasting Using Neural Networks: Are Recurrent Connections Necessary?

Authors: Salihu A. Abdulkarim, Andries P. Engelbrecht


Abstract

Artificial neural networks (NNs) are widely used to model and forecast time series. Since most practical time series are non-stationary, NN forecasters are often implemented with recurrent/delayed connections to handle the temporal component of the time-varying sequence. These recurrent/delayed connections increase the number of weights that must be optimized during training. Particle swarm optimization (PSO) is now an established method for training NNs, and has been shown in several studies to outperform the classical backpropagation training algorithm. The original PSO was, however, designed for static environments. To deal with non-stationary data, modified versions of PSO designed for optimization in dynamic environments are used. These dynamic PSOs have been successfully applied to train NNs on classification problems in non-stationary environments. This paper formulates the training of a NN forecaster as a dynamic optimization problem in order to investigate whether recurrent/delayed connections are necessary in a NN time series forecaster when a dynamic PSO is used as the training algorithm. Experiments were carried out on eight forecasting problems. For each problem, a feedforward NN (FNN) was trained with a dynamic PSO algorithm, and its performance was compared to that of four types of recurrent NNs (RNNs), each trained using gradient descent, a standard PSO for static environments, and the dynamic PSO algorithm. The RNNs employed were an Elman NN, a Jordan NN, a multirecurrent NN, and a time-delay NN. The performance of these forecasting models was evaluated under three different dynamic environmental scenarios. The results show that the FNNs trained with the dynamic PSO significantly outperformed all the RNNs trained with any of the other algorithms considered.
These findings indicate that recurrent/delayed connections are not necessary in NNs used for time series forecasting (for the time series considered in this study), as long as a dynamic PSO algorithm is used as the training method.
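To illustrate the general approach the abstract describes, the sketch below trains a plain feedforward NN for one-step-ahead forecasting using PSO instead of backpropagation: each particle is a flattened weight vector, and its fitness is the network's training MSE. This is a minimal sketch with a standard inertia-weight PSO (using the common constriction-derived coefficients w=0.729, c1=c2=1.49445), not the dynamic PSO variants evaluated in the paper; all function names, the lag/hidden sizes, and the toy sine series are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_windows(series, lag):
    """Sliding-window inputs of length `lag` and one-step-ahead targets."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = series[lag:]
    return X, y

def forward(weights, X, lag, hidden):
    """Feedforward NN (one tanh hidden layer, linear output); weights is flat."""
    i = 0
    W1 = weights[i:i + lag * hidden].reshape(lag, hidden); i += lag * hidden
    b1 = weights[i:i + hidden]; i += hidden
    W2 = weights[i:i + hidden]; i += hidden
    b2 = weights[i]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def mse(weights, X, y, lag, hidden):
    return float(np.mean((forward(weights, X, lag, hidden) - y) ** 2))

def pso_train(X, y, lag, hidden, particles=20, iters=300,
              w=0.729, c1=1.49445, c2=1.49445):
    """Standard gbest PSO over the flattened NN weight vector."""
    dim = lag * hidden + hidden + hidden + 1
    pos = rng.uniform(-1, 1, (particles, dim))
    vel = np.zeros((particles, dim))
    pbest = pos.copy()
    pbest_f = np.array([mse(p, X, y, lag, hidden) for p in pos])
    g = pbest[np.argmin(pbest_f)].copy()
    gf = pbest_f.min()
    for _ in range(iters):
        r1 = rng.random((particles, dim))
        r2 = rng.random((particles, dim))
        # Velocity update: inertia + cognitive + social components.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        f = np.array([mse(p, X, y, lag, hidden) for p in pos])
        improved = f < pbest_f
        pbest[improved] = pos[improved]
        pbest_f[improved] = f[improved]
        if pbest_f.min() < gf:
            gf = pbest_f.min()
            g = pbest[np.argmin(pbest_f)].copy()
    return g, gf

# Toy series: noisy sine wave, forecast one step ahead from 4 lagged values.
t = np.linspace(0, 8 * np.pi, 400)
series = np.sin(t) + 0.05 * rng.standard_normal(len(t))
lag, hidden = 4, 5
X, y = make_windows(series, lag)
weights, err = pso_train(X, y, lag, hidden)
print(f"training MSE: {err:.4f}")
```

A dynamic PSO, as used in the paper, would extend this loop with mechanisms for tracking a moving optimum (e.g. re-evaluating personal bests when the environment changes, or maintaining diversity via charged/quantum particles), which matters once the data stream is non-stationary.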


Metadata
Title
Time Series Forecasting Using Neural Networks: Are Recurrent Connections Necessary?
Authors
Salihu A. Abdulkarim
Andries P. Engelbrecht
Publication date
12.06.2019
Publisher
Springer US
Published in
Neural Processing Letters / Issue 3/2019
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-019-10061-5
