2006 | Original Paper | Book Chapter

29. Logistic Regression Tree Analysis

Author: Wei-Yin Loh

Published in: Springer Handbook of Engineering Statistics

Publisher: Springer London

Abstract

This chapter describes a tree-structured extension and generalization of the logistic regression method for fitting models to a binary-valued response variable. The technique overcomes a significant disadvantage of logistic regression, namely the difficulty of interpreting the fitted model in the presence of multicollinearity and Simpson's paradox. Section 29.1 summarizes the statistical theory underlying the logistic regression model and the estimation of its parameters. Section 29.2 reviews two standard approaches to model selection for logistic regression, namely model deviance relative to its degrees of freedom and the Akaike information criterion (AIC). A dataset on tree damage during a severe thunderstorm is used to compare the approaches and to highlight their weaknesses. A recently published partial one-dimensional model that addresses some of the weaknesses is also reviewed.
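To make the two selection criteria concrete, here is a minimal sketch (using synthetic data, not the chapter's thunderstorm dataset) that fits a logistic regression by Newton-Raphson and reports its deviance and AIC:

```python
import numpy as np

def fit_logistic(X, y, n_iter=25):
    """Fit a logistic regression by Newton-Raphson (equivalently, IRLS)."""
    Xd = np.column_stack([np.ones(len(X)), X])   # prepend an intercept column
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))     # fitted probabilities
        W = p * (1.0 - p)                        # IRLS weights
        # Newton step: beta <- beta + (X'WX)^{-1} X'(y - p)
        beta += np.linalg.solve(Xd.T @ (W[:, None] * Xd), Xd.T @ (y - p))
    return beta

def deviance_and_aic(X, y, beta):
    """Deviance = -2 * log-likelihood; AIC = deviance + 2 * (number of parameters)."""
    Xd = np.column_stack([np.ones(len(X)), X])
    p = 1.0 / (1.0 + np.exp(-Xd @ beta))
    eps = 1e-12                                  # guard against log(0)
    loglik = np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))
    return -2.0 * loglik, -2.0 * loglik + 2 * Xd.shape[1]

# Synthetic binary response: logit P(y = 1) = 0.5 + 1.5 * x
rng = np.random.default_rng(0)
x = rng.normal(size=(500, 1))
y = (rng.random(500) < 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * x[:, 0])))).astype(float)

beta = fit_logistic(x, y)
dev, aic = deviance_and_aic(x, y, beta)
```

Comparing AIC across candidate models trades fit (deviance) against complexity (the parameter-count penalty); the chapter's point is that neither criterion by itself guarantees an interpretable model.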
Section 29.3 introduces the idea of a logistic regression tree model. The latter consists of a binary tree in which a simple linear logistic regression (i.e., a linear logistic regression using a single predictor variable) is fitted to each leaf node. A split at an intermediate node is characterized by a subset of values taken by a (possibly different) predictor variable. The objective is to partition the dataset into rectangular pieces according to the values of the predictor variables such that a simple linear logistic regression model adequately fits the data in each piece. Because the tree structure and the piecewise models can be presented graphically, the whole model can be easily understood. This is illustrated with the thunderstorm dataset using the LOTUS algorithm.
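The partitioning idea can be illustrated in miniature. The sketch below (hypothetical data and variable names, not the actual LOTUS search) fits a simple linear logistic regression of y on x in each of two pieces defined by thresholding a second variable z, and picks the threshold that minimizes the total deviance:

```python
import numpy as np

def simple_logit_deviance(x, y, n_iter=25):
    """Deviance of a simple linear logistic regression of y on x (Newton fit)."""
    Xd = np.column_stack([np.ones(len(x)), x])
    beta = np.zeros(2)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xd @ beta))
        W = p * (1.0 - p)
        beta += np.linalg.solve(Xd.T @ (W[:, None] * Xd), Xd.T @ (y - p))
    p = 1.0 / (1.0 + np.exp(-Xd @ beta))
    eps = 1e-12
    return -2.0 * np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

def best_split(x, z, y, min_leaf=30):
    """Search thresholds on z; each side gets its own simple logistic model in x."""
    best = (np.inf, None)
    for c in np.quantile(z, np.linspace(0.1, 0.9, 17)):
        left, right = z <= c, z > c
        if left.sum() < min_leaf or right.sum() < min_leaf:
            continue
        total = (simple_logit_deviance(x[left], y[left])
                 + simple_logit_deviance(x[right], y[right]))
        if total < best[0]:
            best = (total, c)
    return best

# Data whose x-effect flips sign at z = 0, so a single global line fits poorly
rng = np.random.default_rng(1)
x, z = rng.normal(size=800), rng.normal(size=800)
slope = np.where(z <= 0, 2.0, -2.0)
y = (rng.random(800) < 1.0 / (1.0 + np.exp(-slope * x))).astype(float)

split_dev, threshold = best_split(x, z, y)
pooled_dev = simple_logit_deviance(x, y)
```

Because the x-effect reverses across the split, the two leaf models fit far better than the pooled one, and each leaf remains a one-predictor model that can be plotted and read directly.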
Section 29.4 describes the basic elements of the LOTUS algorithm, which is based on recursive partitioning and cost-complexity pruning. A key feature of the algorithm is a correction for bias in variable selection at the splits of the tree. Without bias correction, the splits can yield incorrect inferences. Section 29.5 shows an application of LOTUS to a dataset on automobile crash tests involving dummies. This dataset is challenging because of its large size, its mix of ordered and unordered variables, and its large number of missing values. It also provides a demonstration of Simpson's paradox. The chapter concludes with some remarks in Sect. 29.6.
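The spirit of the bias correction can be sketched as follows: instead of choosing the split variable by exhaustively searching all splits (which favors variables offering many candidate splits), each variable is scored with a chi-squared test of association with the response, and the variable with the strongest evidence wins. This is a deliberate simplification of the actual LOTUS selection test (which uses trend-adjusted chi-squared statistics and p-values); the data and variable names here are hypothetical:

```python
import numpy as np

def chi2_per_df(groups, y):
    """Pearson chi-squared statistic per degree of freedom for a 2 x k table."""
    labels = np.unique(groups)
    obs = np.array([[np.sum((groups == g) & (y == v)) for g in labels]
                    for v in (0, 1)], dtype=float)
    row, col, n = obs.sum(1, keepdims=True), obs.sum(0, keepdims=True), obs.sum()
    exp = row @ col / n                       # expected counts under independence
    stat = np.sum((obs - exp) ** 2 / exp)
    return stat / (len(labels) - 1)           # df = (2 - 1) * (k - 1)

def select_variable(candidates, y):
    """Pick the variable most associated with y; ordered variables are binned first."""
    scores = {}
    for name, x in candidates.items():
        g = np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75])) if x.dtype.kind == 'f' else x
        scores[name] = chi2_per_df(g, y)
    return max(scores, key=scores.get), scores

rng = np.random.default_rng(2)
x1 = rng.normal(size=1000)                    # truly predictive variable
x2 = rng.integers(0, 20, size=1000)           # 20-level pure-noise variable
y = (rng.random(1000) < 1.0 / (1.0 + np.exp(-1.5 * x1))).astype(int)

chosen, scores = select_variable({'x1': x1, 'x2': x2}, y)
```

An exhaustive split search would be drawn to the 20-level noise variable simply because it offers many more ways to split the data; scoring every variable on a common association scale removes that advantage.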


Metadata
Title
Logistic Regression Tree Analysis
Author
Wei-Yin Loh
Copyright year
2006
DOI
https://doi.org/10.1007/978-1-84628-288-1_29