Fuzzy Sets and Systems

Volume 138, Issue 2, 1 September 2003, Pages 283-300

Support vector interval regression networks for interval regression analysis

https://doi.org/10.1016/S0165-0114(02)00570-5

Abstract

In this paper, support vector interval regression networks (SVIRNs) are proposed for interval regression analysis. The SVIRNs consist of two radial basis function networks: one network identifies the upper bound of the data intervals, and the other identifies the lower bound. Because the support vector regression (SVR) approach is equivalent to solving a linearly constrained quadratic programming problem, the number of hidden nodes and the initial values of the adjustable parameters can be obtained easily. Since the choice of the parameter ε in the SVR approach can seriously affect the modeling performance, a two-step approach is proposed to select a proper ε value. After applying the SVR approach with the selected ε, an initial structure for the SVIRNs is obtained. Moreover, outliers do not significantly affect the upper and lower interval bounds obtained through the proposed two-step approach. Consequently, a traditional back-propagation (BP) learning algorithm can be used to adjust the initial network structure of the SVIRNs on training data sets with or without outliers. Because the SVR approach provides a better initial structure for the SVIRNs, their convergence is faster than that of conventional networks trained with BP or robust BP learning algorithms for interval regression analysis. Four examples are provided to show the validity and applicability of the proposed SVIRNs.
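The initialization step described in the abstract can be illustrated with a short sketch. The code below is only a minimal illustration, assuming scikit-learn's ε-SVR with an RBF kernel as the quadratic-programming solver and a fixed candidate ε; the paper's two-step ε selection and its exact BP update rules are not reproduced, and all names and parameter values are illustrative.

```python
# Minimal sketch of the SVIRN initialization idea from the abstract.
# Assumptions (not from the paper): scikit-learn's epsilon-SVR with an RBF
# kernel stands in for the quadratic-programming step, epsilon is a fixed
# candidate rather than the result of the paper's two-step selection, and the
# BP fine-tuning of the two networks is only indicated in comments.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 80).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.1 * rng.standard_normal(80)

eps = 0.1      # candidate epsilon; the paper selects it with a two-step approach
gamma = 1.0
svr = SVR(kernel="rbf", C=10.0, epsilon=eps, gamma=gamma).fit(X, y)

# The support vectors become the hidden nodes of both RBF networks; the dual
# coefficients and bias give the initial values of the adjustable parameters.
centers = svr.support_vectors_        # hidden-node centers, shape (m, 1)
weights = svr.dual_coef_.ravel()      # initial output weights, shape (m,)
bias = svr.intercept_[0]

def rbf_net(x, centers, weights, bias, gamma):
    """Output of a radial basis function network for inputs x of shape (n, 1)."""
    dist2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * dist2) @ weights + bias

f = rbf_net(X, centers, weights, bias, gamma)
upper0 = f + eps    # initial output of the upper-bound network
lower0 = f - eps    # initial output of the lower-bound network
# BP learning (not shown) would then adjust the two networks separately so that
# the interval [lower, upper] covers the training targets.
print(float(np.mean(upper0 >= y)), float(np.mean(lower0 <= y)))
```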

Cited by (124)

  • On pairing Huber support vector regression

    2020, Applied Soft Computing
  • Simplex basis function based sparse least squares support vector regression

    2019, Neurocomputing
    Citation excerpt:

    A novel geometric framework underlies the new SVR models and algorithms proposed in [6], which use convex hull algorithms for data reduction. SVR has also been applied in nonstationary settings, e.g., time-series prediction in nonlinear environments [7] and interval regression analysis in time-varying environments [8]. A major advance in computational efficiency is the introduction of the least squares support vector machine (LSSVM) and least squares support vector regression (LSSVR), which use equality constraints [9] so that closed-form, least-squares-type solutions are available (a minimal sketch of this closed-form idea appears after this list).

  • Development of new correlations for the oil formation volume factor in oil reservoirs using artificial intelligent white box technique

    2018, Petroleum
    Citation excerpt:

    The support vector machine is a type of supervised learning used mostly for regression and pattern recognition [28,29]. Based on the soft-margin hyperplane, support vector machines have been introduced as an artificial intelligence framework for both classification and function approximation [30,31]. Instead of the sigmoidal transfer functions used in artificial neural networks, the support vector machine relies on kernel functions, which allow projection into higher-dimensional spaces and make it possible to solve more complex, highly nonlinear problems [32].

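The second excerpt above notes that LSSVR replaces the inequality constraints of ε-SVR with equality constraints, so training reduces to a single linear system. The code below is a minimal sketch of that closed-form solution; it follows the standard LSSVM formulation rather than any specific cited paper, and all names and parameter values are illustrative.

```python
# Minimal sketch of least squares support vector regression (LSSVR): the
# equality constraints lead to one linear system instead of a QP. This follows
# the standard LSSVM formulation; names and parameter values are illustrative.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def lssvr_fit(X, y, gamma=1.0, C=10.0):
    """Solve [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y] for b and alpha."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvr_predict(X_train, alpha, b, X_test, gamma=1.0):
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b

# Usage: unlike epsilon-SVR, every training point gets a nonzero coefficient,
# so the model is dense but training is a single linear solve.
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
y = np.sin(2.0 * np.pi * X).ravel()
alpha, b = lssvr_fit(X, y)
print(lssvr_predict(X, alpha, b, X[:3]))
```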

This work was supported by the National Science Council under Grant NSC89-2218-E-146-001.
