Open Access
Generalized Additive Models
Trevor Hastie, Robert Tibshirani
Statist. Sci. 1(3): 297-310 (August, 1986). DOI: 10.1214/ss/1177013604

Abstract

Likelihood-based regression models, such as the normal linear regression model and the linear logistic model, assume a linear (or some other parametric) form for the covariates $X_1, X_2, \cdots, X_p$. We introduce the class of generalized additive models, which replaces the linear form $\sum \beta_j X_j$ by a sum of smooth functions $\sum s_j(X_j)$. The $s_j(\cdot)$'s are unspecified functions that are estimated using a scatterplot smoother, in an iterative procedure we call the local scoring algorithm. The technique is applicable to any likelihood-based regression model: the class of generalized linear models contains many of these. In this class the linear predictor $\eta = \sum \beta_j X_j$ is replaced by the additive predictor $\sum s_j(X_j)$; hence, the name generalized additive models. We illustrate the technique with binary response and survival data. In both cases, the method proves useful in uncovering nonlinear covariate effects. It has the advantage of being completely automatic, i.e., no "detective work" is needed on the part of the statistician. As a theoretical underpinning, the technique is viewed as an empirical method of maximizing the expected log likelihood, or, equivalently, of minimizing the Kullback-Leibler distance to the true model.
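The local scoring idea summarized above can be sketched concretely. The code below is a minimal illustration, not the authors' implementation: it fits an additive logistic model by alternating a Fisher-scoring step (forming a working response and weights from the current fit) with a backfitting loop that smooths partial residuals against each covariate. The Gaussian-kernel smoother, the bandwidth, the iteration counts, and the names `kernel_smooth` and `fit_gam_logit` are all illustrative assumptions, not prescriptions from the paper.

```python
# A minimal sketch of local scoring for a binary response with a logit link.
# Not the authors' implementation; all choices below are illustrative.
import numpy as np

def kernel_smooth(x, z, w, bandwidth=0.3):
    """Weighted Nadaraya-Watson smoother, standing in for a scatterplot smoother."""
    K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / bandwidth) ** 2)
    Kw = K * w[None, :]
    return (Kw @ z) / Kw.sum(axis=1)

def fit_gam_logit(X, y, n_scoring=10, n_backfit=10):
    n, p = X.shape
    alpha = 0.0                    # intercept on the logit scale
    s = np.zeros((n, p))           # fitted values of the smooths s_j(X_j)

    for _ in range(n_scoring):                       # local scoring (outer) loop
        eta = alpha + s.sum(axis=1)                  # additive predictor
        mu = np.clip(1 / (1 + np.exp(-eta)), 1e-6, 1 - 1e-6)
        z = eta + (y - mu) / (mu * (1 - mu))         # working response
        w = mu * (1 - mu)                            # working weights
        alpha = np.average(z, weights=w)
        for _ in range(n_backfit):                   # backfitting (inner) loop
            for j in range(p):
                partial = z - alpha - (s.sum(axis=1) - s[:, j])
                s[:, j] = kernel_smooth(X[:, j], partial, w)
                s[:, j] -= np.average(s[:, j], weights=w)   # center each smooth
    return alpha, s

# Tiny synthetic check: a nonlinear effect of X1 and a linear effect of X2.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(300, 2))
p_true = 1 / (1 + np.exp(-(np.sin(2 * X[:, 0]) + 0.5 * X[:, 1])))
y = rng.binomial(1, p_true)
alpha_hat, s_hat = fit_gam_logit(X, y)   # s_hat[:, 0] should roughly trace sin(2*x1)
```

In this sketch the outer loop plays the role of Fisher scoring for a generalized linear model, while the inner backfitting loop replaces the weighted least-squares solve with repeated smoothing of partial residuals, which is what allows each $s_j$ to remain an unspecified smooth function.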

Citation


Trevor Hastie, Robert Tibshirani. "Generalized Additive Models." Statist. Sci. 1(3): 297-310, August, 1986. https://doi.org/10.1214/ss/1177013604

Information

Published: August, 1986
First available in Project Euclid: 19 April 2007

zbMATH: 0645.62068
MathSciNet: MR858512
Digital Object Identifier: 10.1214/ss/1177013604

Keywords: generalized linear models, nonlinearity, nonparametric regression, partial residuals, smoothing

Rights: Copyright © 1986 Institute of Mathematical Statistics
