Published in: Neural Processing Letters 1/2021

03-01-2021

On Regularization Based Twin Support Vector Regression with Huber Loss

Authors: Umesh Gupta, Deepak Gupta


Abstract

Twin support vector regression (TSVR) is usually formulated with the \( \varepsilon \)-insensitive loss function, which does not handle noise and outliers well. By definition, the Huber loss behaves quadratically for small errors and linearly for larger ones; it performs better than the Gaussian loss and copes more readily with different types of noise and outliers. Recently, TSVR with Huber loss (HN-TSVR) was proposed to handle noise and outliers, but, like TSVR, it suffers from a singularity problem that degrades the model's performance. In this paper, a regularized version of HN-TSVR, called regularization based twin support vector regression (RHN-TSVR), is proposed to avoid this singularity problem by applying the structural risk minimization principle, which makes the model convex and well posed. The proposed RHN-TSVR model handles noise and outliers well while avoiding the singularity issue. To demonstrate its validity and applicability, experiments are performed on several artificially generated datasets containing uniform, Gaussian and Laplacian noise, as well as on various benchmark real-world datasets, and RHN-TSVR is compared with support vector regression, TSVR, \( \varepsilon \)-asymmetric Huber SVR, \( \varepsilon \)-support vector quantile regression and HN-TSVR. All benchmark real-world datasets are corrupted with noise at significance levels of 0%, 5% and 10% for each reported algorithm and for the proposed approach. On both the artificial and the real-world datasets, at every noise level, the proposed RHN-TSVR shows better prediction ability than the other reported models.
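The abstract's key property of the Huber loss — quadratic for small residuals, linear beyond a threshold — can be sketched as follows. This is a minimal illustration of the standard Huber loss with an assumed threshold parameter `delta`, not the paper's exact HN-TSVR formulation:

```python
import numpy as np

def huber_loss(r, delta=1.0):
    """Standard Huber loss on residuals r.

    Quadratic (0.5 * r**2) where |r| <= delta, so small errors are
    penalized like a Gaussian/squared loss; linear (delta * (|r| - delta/2))
    where |r| > delta, so large errors and outliers contribute only
    linearly, which is what makes the loss robust.
    """
    r = np.asarray(r, dtype=float)
    quadratic = 0.5 * r ** 2
    linear = delta * (np.abs(r) - 0.5 * delta)
    return np.where(np.abs(r) <= delta, quadratic, linear)

# Small residual: quadratic branch.   0.5 * 0.5**2 = 0.125
print(float(huber_loss(0.5)))
# Large residual: linear branch.      1.0 * (3.0 - 0.5) = 2.5
print(float(huber_loss(3.0)))
```

The two branches join smoothly at \( |r| = \delta \) (same value and same slope), which keeps the loss differentiable everywhere — one reason it is easier to optimize than a purely \( \varepsilon \)-insensitive loss while still suppressing outliers.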


Metadata
Title
On Regularization Based Twin Support Vector Regression with Huber Loss
Authors
Umesh Gupta
Deepak Gupta
Publication date
03-01-2021
Publisher
Springer US
Published in
Neural Processing Letters / Issue 1/2021
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-020-10380-y
