Published in: Neural Processing Letters 1/2016

01.02.2016

Enhancing Least Square Support Vector Regression with Gradient Information

Authors: Xiao Jian Zhou, Ting Jiang

Abstract

Traditional methods of constructing least square support vector regression (LSSVR) models do not consider the gradients of the true function; they rely only on the exact responses at the sample points. When gradient information is readily available, it should be used to enhance the surrogate. In this paper, gradient-enhanced least square support vector regression (GELSSVR) is developed with a direct formulation that incorporates gradient information into the traditional LSSVR. The efficiency of this technique is assessed on an analytical function-fitting task and two real-life problems (the recent U.S. actuarial life table and the Borehole function). The results show that GELSSVR provides more reliable predictions than LSSVR alone.
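The paper's exact direct GELSSVR formulation is not reproduced on this page, but the underlying idea of augmenting a kernel surrogate with gradient observations can be illustrated. The sketch below is a minimal one-dimensional stand-in, assuming a Gaussian RBF kernel expansion fitted by regularized least squares to both function values and gradients; it is a ridge-style proxy for the LSSVR dual, not the authors' formulation, and the test function x·sin(x), kernel length scale, and the weights `mu` and `lam` are illustrative assumptions.

```python
import numpy as np

def rbf(x, z, length=1.0):
    """Gaussian RBF kernel k(x, z) = exp(-(x - z)^2 / (2 l^2)) for 1-D inputs.
    Also returns the pairwise differences, used for the kernel derivative."""
    d = x[:, None] - z[None, :]
    return np.exp(-0.5 * (d / length) ** 2), d

def fit(x, y, g=None, length=1.0, lam=1e-6, mu=1.0):
    """Fit f(x) = sum_j a_j k(x, x_j) + b to the responses y and, if given,
    to the observed gradients g at the same sample points (ridge least squares)."""
    K, d = rbf(x, x, length)
    n = len(x)
    rows, rhs = [np.hstack([K, np.ones((n, 1))])], [y]          # value equations
    if g is not None:
        dK = -(d / length ** 2) * K                             # d k(x_i, x_j) / d x_i
        rows.append(np.sqrt(mu) * np.hstack([dK, np.zeros((n, 1))]))
        rhs.append(np.sqrt(mu) * g)                             # gradient equations
    # Tikhonov regularisation on the kernel weights only (not on the bias b).
    R = np.sqrt(lam) * np.eye(n + 1)
    R[-1, -1] = 0.0
    A = np.vstack(rows + [R])
    t = np.concatenate(rhs + [np.zeros(n + 1)])
    coef, *_ = np.linalg.lstsq(A, t, rcond=None)
    return coef[:-1], coef[-1]

def predict(xq, x, alpha, b, length=1.0):
    Kq, _ = rbf(xq, x, length)
    return Kq @ alpha + b

if __name__ == "__main__":
    # Toy experiment with the (hypothetical) test function f(x) = x sin(x).
    rng = np.random.default_rng(0)
    x = np.sort(rng.uniform(0.0, 10.0, 8))
    y, g = x * np.sin(x), np.sin(x) + x * np.cos(x)
    xt = np.linspace(0.0, 10.0, 200)
    ft = xt * np.sin(xt)

    for name, grads in [("values only", None), ("values + gradients", g)]:
        alpha, b = fit(x, y, g=grads, length=1.5)
        rmse = np.sqrt(np.mean((predict(xt, x, alpha, b, length=1.5) - ft) ** 2))
        print(f"{name:>20s}: test RMSE = {rmse:.3f}")
```

Because the model is linear in its coefficients, the gradient observations simply add rows to the same least-squares system; this is the mechanism that a direct gradient-enhanced formulation exploits, although the paper works with the LSSVR dual system rather than the primal ridge fit shown here.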


Metadata
Title
Enhancing Least Square Support Vector Regression with Gradient Information
Authors
Xiao Jian Zhou
Ting Jiang
Publication date
01.02.2016
Publisher
Springer US
Published in
Neural Processing Letters / Issue 1/2016
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-014-9402-5
