Rotated Hyperbola Smooth Support Vector Regression
Abstract
ε-support vector regression (ε-SVR), formulated as a constrained minimization problem, can be converted into an unconstrained convex quadratic program. The smoothing function is the essence of ε-smooth support vector regression (ε-SSVR). In this paper, a new rotated hyperbola function is proposed to replace the ε-insensitive loss function, and the resulting ε-rotated hyperbola smooth support vector regression (ε-RHSSVR) model is presented. Theoretical analysis shows that the derived smooth function attains higher approximation accuracy than other smooth approximation functions. The Newton-Armijo algorithm is applied to train the new model, and numerical results and comparisons demonstrate the effectiveness and speed of the method.
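To make the smoothing idea concrete, the sketch below shows one common way a rotated hyperbola can approximate the non-smooth plus function underlying the ε-insensitive loss. The specific form `y = (x + sqrt(x^2 + c))/2` (the upper branch of the rotated hyperbola `y^2 - x*y = c/4`) is an illustrative assumption, not necessarily the exact function derived in this paper; the function names and the smoothing parameter `c` are likewise hypothetical.

```python
import math

def plus(x):
    """Plus function x_+ = max(x, 0), the non-smooth core of the epsilon-insensitive loss."""
    return max(x, 0.0)

def eps_insensitive(x, eps):
    """Epsilon-insensitive loss |x|_eps = max(|x| - eps, 0)."""
    return max(abs(x) - eps, 0.0)

def hyperbola_plus(x, c):
    """Smooth stand-in for x_+ taken from the upper branch of the rotated
    hyperbola y^2 - x*y = c/4, i.e. y = (x + sqrt(x^2 + c)) / 2.
    This form is an illustrative assumption. As c -> 0+ it converges to
    max(x, 0); the gap satisfies 0 <= hyperbola_plus(x, c) - x_+ <= sqrt(c)/2,
    with the maximum attained at x = 0.
    """
    return 0.5 * (x + math.sqrt(x * x + c))

def smooth_eps_loss(x, eps, c):
    """Smooth surrogate for |x|_eps, built from two smoothed plus functions:
    |x|_eps = (x - eps)_+ + (-x - eps)_+.
    """
    return hyperbola_plus(x - eps, c) + hyperbola_plus(-x - eps, c)
```

Because the surrogate is twice continuously differentiable, the unconstrained objective it induces can be minimized with a Newton-type method such as the Newton-Armijo scheme mentioned above, which is precisely why a smooth replacement for the plus function is needed.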