Published in: Neural Computing and Applications 1/2013

01.01.2013 | Cont. Dev. of Neural Compt. & Appln.

Improving combination method of NCL experts using gating network

Authors: Reza Ebrahimpour, Seyed Ali Asghar Abbaszadeh Arani, Saeed Masoudnia



Abstract

Negative Correlation Learning (NCL) is a popular combining method that employs a special error function for the simultaneous training of base neural network (NN) experts. In this article, we propose an improved version of the NCL method in which the capability of a gating network, the combining part of the Mixture of Experts method, is used to combine the base NNs in the NCL ensemble. The special error function of NCL encourages each NN expert to learn different parts or aspects of the training data, so the local competence of the experts should be considered in the combining approach. The gating network provides this needed functionality for combining the NCL experts; the proposed method is therefore called Gated NCL. The improved ensemble method is compared with previous approaches used for combining NCL experts, including winner-take-all (WTA) and average (AVG) combining techniques, on several classification problems from the UCI machine learning repository. The experimental results show that the proposed ensemble method significantly improves performance over the previous combining approaches.
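The abstract's two ingredients can be sketched concretely. Below is a minimal numpy illustration, not the paper's implementation, of the standard NCL per-expert error (squared error plus the negative-correlation penalty p_i = (f_i − f̄) Σ_{j≠i} (f_j − f̄), as in Liu and Yao) and a softmax gating combiner of the Mixture-of-Experts kind; the coefficient `lam` and all toy values are illustrative assumptions.

```python
import numpy as np

def ncl_loss(preds, target, lam=0.5):
    """Per-expert NCL error for one sample:
    0.5 * (f_i - d)^2 + lam * p_i,
    with p_i = (f_i - f_bar) * sum_{j != i} (f_j - f_bar).
    preds: shape (M,) array of the M experts' outputs for the sample."""
    f_bar = preds.mean()
    dev = preds - f_bar              # each expert's deviation from the ensemble mean
    # sum over j != i of (f_j - f_bar) is dev.sum() - dev_i (and dev.sum() ~ 0)
    penalty = dev * (dev.sum() - dev)
    return 0.5 * (preds - target) ** 2 + lam * penalty

def gated_combine(expert_outputs, gate_scores):
    """Combine expert outputs with softmax gating weights, so experts that
    are locally competent for the input can dominate the ensemble output."""
    w = np.exp(gate_scores - gate_scores.max())   # numerically stable softmax
    w = w / w.sum()
    return float(np.dot(w, expert_outputs))
```

With equal gate scores the gating combiner reduces to the plain AVG rule; in Gated NCL the gate scores would instead be produced by a trained gating network as a function of the input, which is what lets the combiner weight each NCL expert by its local competence.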


Metadata
Title
Improving combination method of NCL experts using gating network
Authors
Reza Ebrahimpour
Seyed Ali Asghar Abbaszadeh Arani
Saeed Masoudnia
Publication date
01.01.2013
Publisher
Springer-Verlag
Published in
Neural Computing and Applications / Issue 1/2013
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-011-0746-8
