22.06.2016 | Foundations | Issue 18/2017
A robust algorithm of support vector regression with a trimmed Huber loss function in the primal
- Journal: Soft Computing, Issue 18/2017
- Authors: Chuanfa Chen, Changqing Yan, Na Zhao, Bin Guo, Guolin Liu
Abstract
Support vector machine for regression (SVR) is an efficient tool for solving function estimation problems. However, it is sensitive to outliers because its loss function is unbounded. To reduce the effect of outliers, we propose a robust SVR with a trimmed Huber loss function (SVRT) in this paper. Synthetic and benchmark datasets were employed to assess the performance of SVRT, and its results were compared with those of SVR, least squares SVR (LS-SVR) and a weighted LS-SVR. The numerical tests show that when training samples are subject to normally distributed errors, SVRT is slightly less accurate than SVR and LS-SVR, yet more accurate than the weighted LS-SVR. However, when training samples are contaminated by outliers, SVRT outperforms the other methods. Furthermore, SVRT is faster than the weighted LS-SVR. Experiments on eight benchmark datasets show that SVRT is, on average, more accurate than the other methods when sample points are contaminated by outliers. In conclusion, SVRT can be considered an alternative robust method for fitting contaminated sample points.
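The key idea of the abstract is a bounded ("trimmed") variant of the Huber loss: quadratic for small residuals, linear for moderate ones, and capped beyond a trimming threshold so that gross outliers contribute no additional penalty. The paper's exact formulation is not given here, so the sketch below is only a plausible illustration of such a loss; the names `delta` (Huber transition point) and `t` (trimming threshold) are assumptions, not notation from the paper.

```python
import numpy as np

def trimmed_huber_loss(r, delta=1.0, t=2.0):
    """Illustrative trimmed Huber loss (not the paper's exact definition).

    Quadratic for |r| <= delta, linear for delta < |r| <= t, and
    constant beyond t, so residuals from outliers are bounded in
    their influence on the objective.
    """
    r = np.abs(np.asarray(r, dtype=float))
    quad = 0.5 * r ** 2                   # quadratic zone: |r| <= delta
    lin = delta * (r - 0.5 * delta)       # linear zone: delta < |r| <= t
    cap = delta * (t - 0.5 * delta)       # constant cap beyond the trim point t
    return np.where(r <= delta, quad, np.where(r <= t, lin, cap))
```

With this form, a residual of 100 incurs the same penalty as a residual of `t`, which is the mechanism by which outliers lose their leverage over the fitted function.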