Published in: Neural Processing Letters 1/2015

01.08.2015

Neighborhood Guided Smoothed Emphasis for Real Adaboost Ensembles

Authors: Anas Ahachad, Adil Omari, Aníbal R. Figueiras-Vidal


Abstract

The exceptional capabilities of the celebrated Real Adaboost ensembles for solving decision and classification problems are widely recognized. These capabilities come from progressively constructing unstable, weak learners that pay more attention to the samples that are harder to classify correctly, and linearly combining them. However, the corresponding emphasis can be excessive, especially under intense noise or in the presence of outliers. Although many modifications have been proposed to control the emphasis, they show limited success on imbalanced or asymmetric problems. In this paper, we use the neighborhood concept to design a simple modification of the emphasis mechanism that can deal with these situations and can also be combined with other emphasis control mechanisms. Experimental results confirm the potential of the proposed modification both by itself and when combined with a previously tested mixed emphasis algorithm. The main conclusions of our work and some suggestions for further research along this line close the paper.
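To illustrate the neighborhood idea described above, the sketch below combines a standard boosting emphasis update with a hypothetical k-nearest-neighbor smoothing of the sample weights, so that an isolated noisy sample cannot accumulate extreme emphasis on its own. It uses simple threshold stumps with discrete (sign-valued) outputs for brevity, rather than the confidence-rated outputs of Real Adaboost; the function `knn_smooth`, the parameter `k`, and the exact smoothing rule are assumptions made for illustration, not the algorithm proposed in the paper.

```python
import numpy as np

def knn_smooth(weights, X, k=5):
    """Smooth each sample's emphasis weight by averaging it with the
    weights of its k nearest neighbors (Euclidean distance).
    Hypothetical smoothing rule, for illustration only."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    idx = np.argsort(d, axis=1)[:, :k + 1]  # includes the sample itself
    smoothed = weights[idx].mean(axis=1)
    return smoothed / smoothed.sum()

def boost_with_smoothing(X, y, rounds=10, k=5):
    """Discrete-Adaboost-style ensemble of threshold stumps (y in {-1,+1}),
    with neighborhood smoothing applied after each emphasis update."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners = []
    for _ in range(rounds):
        # pick the (feature, threshold) stump minimizing weighted error
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                pred = np.where(X[:, j] <= t, -1.0, 1.0)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, t)
        err, j, t = best
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        learners.append((alpha, j, t))
        # standard multiplicative emphasis update, then smoothing
        pred = np.where(X[:, j] <= t, -1.0, 1.0)
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        w = knn_smooth(w, X, k)
    return learners

def predict(learners, X):
    f = sum(a * np.where(X[:, j] <= t, -1.0, 1.0) for a, j, t in learners)
    return np.sign(f)
```

Because each weight is averaged with those of its neighbors, a sample can only receive large emphasis when its whole neighborhood is hard to classify, which is one plausible way to tame the effect of outliers on the emphasis distribution.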


Metadata
Title
Neighborhood Guided Smoothed Emphasis for Real Adaboost Ensembles
Authors
Anas Ahachad
Adil Omari
Aníbal R. Figueiras-Vidal
Publication date
01.08.2015
Publisher
Springer US
Published in
Neural Processing Letters / Issue 1/2015
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-014-9386-1
