Published in: International Journal of Machine Learning and Cybernetics, Issue 3/2011

01.09.2011 | Original Article

Gradient descent algorithms for quantile regression with smooth approximation


Abstract

Gradient-based optimization methods often converge quickly to a local optimum. However, the check loss function used by the quantile regression model is not everywhere differentiable, which prevents gradient-based optimization methods from being applied directly. This paper therefore introduces a smooth function that approximates the check loss, so that gradient-based optimization can be employed to fit the quantile regression model. The properties of the smooth approximation are discussed. Two algorithms are proposed for minimizing the smoothed objective function. The first applies gradient descent directly, yielding the gradient-descent smooth quantile regression algorithm; the second minimizes the smoothed objective in the framework of functional gradient descent, updating the fitted model along the negative gradient direction in each iteration, which yields the boosted smooth quantile regression algorithm. Extensive experiments on simulated and real-world data show that, compared with alternative quantile regression models, the proposed smooth quantile regression algorithms achieve higher prediction accuracy and are more effective at removing noninformative predictors.
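The first algorithm's idea can be sketched in a few lines. The sketch below writes the check loss as \(\rho_\tau(r) = \tau\max(r,0) + (1-\tau)\max(-r,0)\) and smooths each \(\max(\cdot,0)\) with a softplus; this is an illustrative smoothing in the spirit of the paper, not necessarily the paper's exact approximation function, and all names and parameter values here are hypothetical:

```python
import numpy as np

def smooth_check_loss(r, tau, alpha=0.1):
    """Smooth surrogate for the check loss rho_tau(r) = r*(tau - 1{r<0}).

    Replaces max(z, 0) in tau*max(r,0) + (1-tau)*max(-r,0) with the
    softplus alpha*log(1 + exp(z/alpha)), which tends to max(z, 0)
    as alpha -> 0."""
    softplus = lambda z: alpha * np.logaddexp(0.0, z / alpha)
    return tau * softplus(r) + (1.0 - tau) * softplus(-r)

def smooth_check_grad(r, tau, alpha=0.1):
    # Derivative in r; the sigmoid tends to the step 1{r>0} as alpha -> 0.
    s = 0.5 * (1.0 + np.tanh(r / (2.0 * alpha)))  # numerically stable sigmoid
    return tau * s - (1.0 - tau) * (1.0 - s)

def gd_smooth_qreg(X, y, tau, alpha=0.05, lr=0.05, n_iter=2000):
    """Plain gradient descent on the smoothed objective for a linear model."""
    n, p = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])   # prepend an intercept column
    beta = np.zeros(p + 1)
    for _ in range(n_iter):
        r = y - Xb @ beta                  # residuals of the current fit
        # chain rule: d loss / d beta = -X^T (d loss / d r), averaged over n
        grad = -(Xb.T @ smooth_check_grad(r, tau, alpha)) / n
        beta -= lr * grad
    return beta

# Median regression (tau = 0.5) on synthetic data with known coefficients.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = 1.0 + X @ np.array([2.0, -1.0]) + rng.normal(size=500)
beta_hat = gd_smooth_qreg(X, y, tau=0.5)
```

Because the smoothed loss is differentiable everywhere, any off-the-shelf gradient method applies; the boosted variant described above would instead fit a base learner to the negative gradient values at each iteration and add it to the ensemble.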

Footnotes
1
For example, |x| is not smooth at x = 0, and its second-order derivative at x = 0 can be understood as \(\infty\) in a distributional sense.
 
2
A random variable X follows a double exponential (Laplace) distribution if its probability density function is \(f(x)=\frac{1}{2} e^{-|x|}\).
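As an aside (not from the paper), the negative log of this density is \(|x| + \log 2\), which is why minimizing summed absolute residuals is exactly maximum-likelihood location estimation under double exponential noise. A quick numerical check of that identity:

```python
import numpy as np

# The negative log of f(x) = 0.5 * exp(-|x|) equals |x| + log(2), so an
# absolute-error criterion is the log-likelihood up to an additive constant.
rng = np.random.default_rng(1)
x = rng.laplace(loc=0.0, scale=1.0, size=1000)  # standard double exponential
nll = -np.log(0.5 * np.exp(-np.abs(x)))
print(np.allclose(nll, np.abs(x) + np.log(2.0)))  # prints True
```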
 
3
Quantile regression forest was implemented using the R package “quantregForest”, while all other algorithms in this paper were implemented in MATLAB; the computing time of QReg forest is therefore not comparable to that of the other algorithms, and we choose not to report its training time.
 
4
IP-QReg and MM-QReg were implemented based on the MATLAB code downloaded from http://www.stat.psu.edu/~dhunter/code/qrmatlab/. When the dimensionality is greater than the sample size, the software package gives an error message; thus, the performances of IP-QReg and MM-QReg are not reported in that setting.
 
Metadata
Title
Gradient descent algorithms for quantile regression with smooth approximation
Publication date
01.09.2011
Published in
International Journal of Machine Learning and Cybernetics / Issue 3/2011
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-011-0031-2
