Published in: Neural Processing Letters 3/2016

01.12.2016

Selected an Stacking ELMs for Time Series Prediction

Authors: Zhongchen Ma, Qun Dai


Abstract

The extreme learning machine (ELM) has several attractive and significant properties. In this paper, a novel pruned Stacking ELMs (PS-ELMs) algorithm for time series prediction (TSP) is proposed. It employs ELM as the level-0 algorithm to train several models for Stacking, and our previously proposed ReTSP-Trend pruning technique is used to address the problem that the level-0 learners may make many correlated prediction errors. ReTSP-Trend is an evaluation measure for reduce-error pruning for TSP (ReTSP) that takes into account the time series trend and the direction of the forecasting error. Furthermore, ELM and simple averaging are used to generate the level-1 model. The development of PS-ELMs brings several benefits. First, the essential advantages of ELM are naturally inherited. Second, specific weaknesses of ELM are alleviated to some extent by the ensemble pruning paradigm. Third, ensemble pruning improves the robustness and accuracy of time series forecasting, addressing shortcomings of existing research. Fourth, our previously proposed pruning measure ReTSP-Trend guarantees that the remaining predictor that most complements the subensemble is selected. Finally, the development of PS-ELMs advances our investigation of the popular ensemble technique of Stacked Generalization. Experimental results on four benchmark financial time series datasets verify the validity of the proposed PS-ELMs algorithm.
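
As a rough illustration of the workflow the abstract describes (a pool of level-0 ELMs, greedy pruning of members that make correlated errors, and a simple level-1 combiner), a minimal sketch in Python follows. The SimpleELM class, the forward_prune routine, and all parameter values are simplified assumptions made for illustration only; in particular, the pruning criterion below is a plain reduce-error measure standing in for the authors' ReTSP-Trend, which additionally accounts for the series trend and the direction of the forecasting error.

import numpy as np

class SimpleELM:
    # Minimal single-hidden-layer ELM: random input weights and biases,
    # output weights solved analytically with the Moore-Penrose pseudoinverse.
    def __init__(self, n_hidden=40, seed=None):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))  # random, never trained
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)          # hidden-layer output matrix
        self.beta = np.linalg.pinv(H) @ y         # analytic output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

def forward_prune(models, X_val, y_val, k):
    # Greedy forward selection: repeatedly add the model whose inclusion most
    # reduces the validation MSE of the averaged subensemble (reduce-error pruning).
    preds = [m.predict(X_val) for m in models]
    selected, remaining = [], list(range(len(models)))
    while len(selected) < k and remaining:
        best_i, best_err = None, np.inf
        for i in remaining:
            ens = np.mean([preds[j] for j in selected + [i]], axis=0)
            err = np.mean((ens - y_val) ** 2)
            if err < best_err:
                best_i, best_err = i, err
        selected.append(best_i)
        remaining.remove(best_i)
    return [models[i] for i in selected]

# Toy usage on a synthetic autoregressive series (illustrative data only).
rng = np.random.default_rng(0)
series = np.sin(np.arange(400) / 10.0) + 0.1 * rng.normal(size=400)
lag = 5
X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
y = series[lag:]
X_tr, y_tr = X[:250], y[:250]          # training set for the level-0 ELMs
X_val, y_val = X[250:320], y[250:320]  # validation set used by the pruning step
X_te, y_te = X[320:], y[320:]          # held-out test set

# Level-0: a pool of ELMs that differ only in their random hidden layers.
pool = [SimpleELM(n_hidden=40, seed=s).fit(X_tr, y_tr) for s in range(20)]
# Pruning: keep the 5 members that best complement each other on validation data.
subensemble = forward_prune(pool, X_val, y_val, k=5)
# Level-1: simple averaging of the pruned subensemble.
y_hat = np.mean([m.predict(X_te) for m in subensemble], axis=0)
print("test MSE:", np.mean((y_hat - y_te) ** 2))

In the full PS-ELMs the level-1 combiner may also be another ELM trained on the level-0 outputs; simple averaging is used here only to keep the sketch short.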


Metadata
Title
Selected an Stacking ELMs for Time Series Prediction
Authors
Zhongchen Ma
Qun Dai
Publication date
01.12.2016
Publisher
Springer US
Published in
Neural Processing Letters / Issue 3/2016
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-016-9499-9
