
2013 | OriginalPaper | Chapter

5. Meta Net: A New Meta-Classifier Family

Authors : Massimo Buscema, William J. Tastle, Stefano Terzi

Published in: Data Mining Applications Using Artificial Adaptive Systems

Publisher: Springer New York


Abstract

An innovative taxonomy for the classification of classifiers is presented. A new family of meta-classifiers, called Meta-Net, is introduced, defined, and described. Grounded in the theory of independent judges, it is shown to perform very well when compared to other known meta-classifiers.
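The chapter's Meta-Net algorithms themselves are not reproduced on this page. As a generic illustration of the independent-judges idea the abstract refers to, the sketch below fuses the outputs of several base classifiers ("judges"), weighting each judge by an estimated reliability. This is an assumed baseline scheme for classifier combination, not the authors' Meta-Net method; the function name and reliability weights are hypothetical.

```python
# Sketch of classifier fusion in the spirit of "independent judges":
# each base classifier votes with a weight proportional to its
# estimated reliability (e.g., validation accuracy). NOT the
# chapter's Meta-Net algorithm, only a common fusion baseline.

def fuse_judges(judge_outputs, reliabilities):
    """judge_outputs: one list of per-class scores per judge.
    reliabilities: estimated accuracy of each judge, in [0, 1].
    Returns normalized fused per-class scores."""
    n_classes = len(judge_outputs[0])
    combined = [0.0] * n_classes
    for scores, weight in zip(judge_outputs, reliabilities):
        for c, s in enumerate(scores):
            combined[c] += weight * s
    total = sum(reliabilities)
    return [score / total for score in combined]

# Three judges scoring two classes; the first judge disagrees
# with the other two but is also the least reliable.
fused = fuse_judges(
    [[0.9, 0.1], [0.4, 0.6], [0.2, 0.8]],
    [0.6, 0.9, 0.8],
)
print(fused.index(max(fused)))  # prints 1: the majority view wins
```

The key property, echoed in the "independent judges" framing, is that no single classifier decides: errors of individual judges are averaged out as long as the judges' mistakes are not strongly correlated.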


Footnotes
1
We gratefully acknowledge Marco Intraligi (Semeion Staff) who helped the authors during the training sessions.
 
2
This detailed analysis of the shared misclassifications among the algorithms was conducted thanks to a suggestion from Dr. Giulia Massini (Semeion researcher).
 
3
We acknowledge Dr. Giulia Massini (Semeion researcher), who generated this dendrogram using software she wrote (2010–2011).
 
Metadata
Title
Meta Net: A New Meta-Classifier Family
Authors
Massimo Buscema
William J. Tastle
Stefano Terzi
Copyright Year
2013
Publisher
Springer New York
DOI
https://doi.org/10.1007/978-1-4614-4223-3_5
