In the analysis of experiments, many variable selection algorithms exist for linear models. Most of these approaches select the best model according to a criterion such as AIC. Such criteria, however, do not account for relationships between predictors, whereas in practice the analysis is guided by three principles: effect hierarchy, effect sparsity, and effect heredity. An approach that relies solely on such criteria ignores these principles, so it often selects models that are hard to interpret, for instance models consisting only of interaction terms. In this article, we extend the LASSO method to identify significant interaction terms, focusing mainly on the heredity principle, and we compare the proposed method with the ordinary LASSO and traditional variable selection approaches. As examples, we analyze data obtained from designed experiments such as a Plackett-Burman design and a supersaturated design.
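The idea of combining LASSO selection with the effect heredity principle can be sketched roughly as follows. This is an illustrative approximation, not the estimator proposed in the article: it fits an ordinary scikit-learn LASSO over main effects and two-factor interactions, then applies a post-hoc strong-heredity filter that keeps an interaction only when both parent main effects were selected. The random ±1 design matrix is a hypothetical stand-in for an actual Plackett-Burman design.

```python
import numpy as np
from itertools import combinations
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Two-level design matrix with entries in {-1, +1}; a random matrix here
# stands in for a real Plackett-Burman design (which has a specific
# aliasing structure not reproduced in this sketch).
n, p = 40, 6
X_main = rng.choice([-1.0, 1.0], size=(n, p))

# Append all two-factor interaction columns.
pairs = list(combinations(range(p), 2))
X_int = np.column_stack([X_main[:, i] * X_main[:, j] for i, j in pairs])
X = np.hstack([X_main, X_int])

# Simulated response obeying strong heredity: factors 0 and 1 are active,
# together with their interaction.
y = (3.0 * X_main[:, 0] + 2.0 * X_main[:, 1]
     + 1.5 * X_main[:, 0] * X_main[:, 1]
     + rng.normal(scale=0.5, size=n))

# Ordinary LASSO over main effects and interactions.
coef = Lasso(alpha=0.1).fit(X, y).coef_

# Post-hoc strong-heredity filter: an interaction survives only if both
# of its parent main effects have nonzero LASSO coefficients.
selected_main = {i for i in range(p) if abs(coef[i]) > 1e-6}
selected_int = {
    (i, j) for k, (i, j) in enumerate(pairs)
    if abs(coef[p + k]) > 1e-6 and i in selected_main and j in selected_main
}
print(sorted(selected_main), sorted(selected_int))
```

In this simulated setting the filter recovers the two active main effects and their interaction; the article's method instead builds the heredity constraint into the penalty itself rather than filtering after the fit.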
Breiman, L. (1995). Better subset regression using the non-negative garrote. Technometrics, 37, 373–384.
Chipman, H., Hamada, M., & Wu, C. F. J. (1997). A Bayesian variable-selection approach for analyzing designed experiments with complex aliasing. Technometrics, 39, 372–381.
Efron, B., Hastie, T., Johnstone, I., & Tibshirani, R. (2004). Least angle regression (with discussion). The Annals of Statistics, 32(2), 407–499.
Hamada, M., & Wu, C. F. J. (1992). Analysis of designed experiments with complex aliasing. Journal of Quality Technology, 24, 130–137.
Lin, D. K. J. (1993). A new class of supersaturated designs. Technometrics, 35, 28–31.
Choi, N. H., Li, W., & Zhu, J. (2010). Variable selection with the strong heredity constraint and its oracle property. Journal of the American Statistical Association, 105, 354–364.
Nelder, J. A. (1998). The selection of terms in response-surface models: How strong is the weak-heredity principle? The American Statistician, 52, 315–318.
Tibshirani, R. (1996). Regression shrinkage and selection via the Lasso. Journal of the Royal Statistical Society Series B, 58, 267–288.
Wu, C. F. J., & Hamada, M. (2000). Experiments: Planning, analysis, and parameter design optimization. New York: Wiley.
Yuan, M., Joseph, V., & Lin, Y. (2007). An efficient variable selection approach for analyzing designed experiments. Technometrics, 49(4), 430–439.
Zhang, H. H., & Lu, W. (2007). Adaptive Lasso for Cox's proportional hazards model. Biometrika, 94, 691–703.
Zou, H. (2006). The adaptive Lasso and its oracle properties. Journal of the American Statistical Association, 101(476), 1418–1429.
A Practical Variable Selection for Linear Models
Physica-Verlag HD