Pattern Recognition

Volume 30, Issue 11, November 1997, Pages 1893-1904

Structure-driven induction of decision tree classifiers through neural learning

https://doi.org/10.1016/S0031-3203(97)00005-8

Abstract

Decision tree classifiers represent a nonparametric classification methodology that is equally popular in pattern recognition and machine learning. Such classifiers are also popular in neural networks under the label of neural trees. This paper presents a new approach to designing these classifiers. Instead of following the common top-down approach to generating a decision tree, a structure-driven approach for induction of decision trees, SDIDT, is proposed. In this approach, a tree structure of fixed size is first assumed, with empty internal nodes, i.e. nodes without any splitting function, and labeled terminal nodes. Using a collection of training vectors of known classification, a neural learning scheme combining backpropagation and soft competitive learning is then used to determine the splits for all decision tree nodes simultaneously. The advantage of the SDIDT approach is that it generates compact trees with multifeature splits at each internal node, determined on a global rather than local basis; consequently, it produces decision trees that yield better classification and interpretation of the underlying relationships in the data. Several well-known data sets of different complexities and characteristics are used to demonstrate the strengths of the SDIDT method.
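The structure-driven idea in the abstract can be sketched as a small differentiable ("soft") decision tree: fix a depth-2 binary topology with three empty internal nodes and four leaves carrying fixed class labels, give each internal node a multifeature linear split softened by a sigmoid gate, and train all splits at once against a global loss. This is a hypothetical illustration, not the authors' algorithm; the class name, the depth-2 topology, and the use of plain finite-difference gradient descent (standing in for the paper's combination of backpropagation and soft competitive learning) are all assumptions made for the sketch.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SoftDecisionTree:
    """Hypothetical sketch: fixed depth-2 tree, 3 internal nodes, 4 labeled leaves."""

    def __init__(self, n_features, leaf_labels, seed=0):
        rng = np.random.default_rng(seed)
        # One multifeature split (w, b) per internal node: 0 = root, 1 and 2 = its children.
        self.W = rng.normal(scale=0.1, size=(3, n_features))
        self.b = np.zeros(3)
        self.leaf_labels = np.asarray(leaf_labels)  # terminal-node labels, fixed in advance

    def _leaf_probs(self, X):
        # Sigmoid gate at each internal node; gate value = probability of going right.
        g = sigmoid(X @ self.W.T + self.b)
        # Probability of each root-to-leaf path (left = 1 - g, right = g); rows sum to 1.
        return np.stack([(1 - g[:, 0]) * (1 - g[:, 1]),
                         (1 - g[:, 0]) * g[:, 1],
                         g[:, 0] * (1 - g[:, 2]),
                         g[:, 0] * g[:, 2]], axis=1)

    def predict_proba(self, X):
        # P(class 1) = total probability mass routed into leaves labeled 1.
        return self._leaf_probs(X) @ (self.leaf_labels == 1).astype(float)

    def loss(self, X, y):
        return float(np.mean((self.predict_proba(X) - y) ** 2))

    def fit(self, X, y, lr=0.5, epochs=300, eps=1e-5):
        # Numerical gradient descent over every split parameter. Each step adjusts
        # all internal nodes at once, i.e. the splits are found on a global basis
        # rather than node by node (the paper uses a backprop-based scheme instead).
        for _ in range(epochs):
            for arr in (self.W, self.b):
                grad = np.zeros_like(arr)
                for idx in np.ndindex(arr.shape):
                    orig = arr[idx]
                    arr[idx] = orig + eps
                    up = self.loss(X, y)
                    arr[idx] = orig - eps
                    down = self.loss(X, y)
                    arr[idx] = orig
                    grad[idx] = (up - down) / (2 * eps)
                arr -= lr * grad

# XOR: not linearly separable, so no single split suffices, but the fixed
# 3-node tree with leaf labels [0, 1, 1, 0] can represent it exactly.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
tree = SoftDecisionTree(n_features=2, leaf_labels=[0, 1, 1, 0])
before = tree.loss(X, y)
tree.fit(X, y)
after = tree.loss(X, y)
```

Because every gate is differentiable, the loss gradient couples all three splits, which is what lets a neural-style learning rule fill in the empty internal nodes of a pre-assumed structure instead of growing the tree greedily top-down.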

Cited by (27)

    • A novel acoustic emission detection module for leakage recognition in a gas pipeline valve

      2017, Process Safety and Environmental Protection
      Citation Excerpt :

      Then, the effective features were trained and tested using the SVM to determine the level of valve leakage. To verify the SVM models, the performance of the classifiers, including the accuracy, Cohen's kappa number, and training time, was compared to the corresponding data from the k-nearest neighbor classifier (k-NN) (Liao and Vemuri, 2002), neural network classifier (NN) (Yu and Junsheng, 2006), naive Bayes classifier (NB) (Jiang et al., 2012), and decision tree classifier (DT) (Sethi, 1997). Acoustic emission is a spontaneous release of elastic energy during the deformation of a material and can be detected by an AE sensor in all directions.

    • A balanced neural tree for pattern classification

      2012, Neural Networks
      Citation Excerpt :

      On the other hand, one cannot decide an ideal architecture of an NN (number of hidden layers and number of nodes in each hidden layer) for a given training dataset. For this reason, a hybridisation of these two methodologies called neural tree (NT) (Deffuant, 1990; Lippmann, 1987; Sankar & Mammone, 1992; Sethi & Yoo, 1997; Sirat & Nadal, 1990; Utgoff, 1989), has been investigated to combine the advantages of both DTs and NNs. Some approaches to this problem were motivated by the lack of a reliable procedure for defining the appropriate size of feed-forward neural networks in practical applications.

    • Efficient design of neural network tree using a new splitting criterion

      2008, Neurocomputing
      Citation Excerpt :

      The NNTree is a decision tree with each non-terminal node being a neural network. In [35], Sethi and Yoo have proposed a decision tree whose hierarchy of splits is determined in a global fashion by a neural learning algorithm. Recently, Zhou and Chen [45] have introduced a hybrid learning approach named HDT that embeds neural network in some leaf nodes of a binary decision tree.

    • Geno-fuzzy classification trees

      2004, Pattern Recognition
    • Neural network induction graph for pattern recognition

      2004, Neurocomputing
      Citation Excerpt :

      This method is now widely used in data-exploration tasks such as knowledge retrieval from data (also called data mining [7]). Many works use a tree structure to build either a neural tree [19] (the nodes of the tree being neurons which are used as non-linear binary decision functions), or neural network trees [17,18] (nodes of the tree being neural networks which are used as non-linear n-ary decision functions). We propose to define a new structure based on a graph of neural networks which is called an NNIG.
