(Reference retrieved automatically from Web of Science, based on the FAPESP grant and its corresponding number as mentioned in the publication by the authors.)

Improving the kernel regularized least squares method for small-sample regression

Author(s):
Braga, Igor [1] ; Monard, Maria Carolina [1]
Total Authors: 2
Affiliation:
[1] Univ Sao Paulo, Inst Math & Comp Sci, BR-13566590 Sao Carlos, SP - Brazil
Total Affiliations: 1
Document type: Journal article
Source: Neurocomputing; v. 163, n. SI, p. 106-114, SEP 2 2015.
Web of Science Citations: 3
Abstract

The kernel regularized least squares (KRLS) method uses the kernel trick to perform non-linear regression estimation. Its performance depends on proper selection of both a kernel function and a regularization parameter. In practice, cross-validation together with the Gaussian RBF kernel has been widely used for carrying out model selection for KRLS. However, when training data is scarce, this combination often leads to poor regression estimation. In order to mitigate this issue, we follow two lines of investigation in this paper. First, we explore a new type of kernel function that is less susceptible to overfitting than the RBF kernel. Then, we consider alternative parameter selection methods that have been shown to perform well for other regression methods. Experiments conducted on real-world datasets show that an additive spline kernel greatly outperforms both the RBF and a previously proposed multiplicative spline kernel. We also find that the parameter selection procedure Final Prediction Error (FPE) is a competitive alternative to cross-validation when using the additive spline kernel. (C) 2015 Elsevier B.V. All rights reserved. (AU)
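For readers who want to experiment with the two ideas the abstract mentions, below is a minimal Python sketch (NumPy only) of KRLS with an additive spline kernel and an Akaike-style FPE criterion for selecting the regularization parameter. The first-order spline kernel is Vapnik's classical form for inputs scaled to [0, 1], and the FPE formula is the standard linear-smoother generalization that uses the trace of the hat matrix as effective degrees of freedom; the paper's exact kernel and penalization may differ, so treat the helper names (additive_spline_kernel, krls_fit, fpe_score) and formulas as illustrative assumptions, not the authors' implementation.

import numpy as np

def spline_kernel_1d(x, z):
    # Vapnik's first-order spline kernel for scalars in [0, 1] (assumed form).
    m = np.minimum(x, z)
    return 1.0 + x * z + x * z * m - 0.5 * (x + z) * m**2 + m**3 / 3.0

def additive_spline_kernel(X, Z):
    # Additive kernel: sum of 1-D spline kernels over the input dimensions.
    K = np.zeros((X.shape[0], Z.shape[0]))
    for d in range(X.shape[1]):
        K += spline_kernel_1d(X[:, d][:, None], Z[:, d][None, :])
    return K

def krls_fit(K, y, lam):
    # KRLS coefficients: solve (K + lam * n * I) alpha = y
    # (one common scaling convention; others omit the factor n).
    n = K.shape[0]
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def fpe_score(K, y, lam):
    # Akaike-style FPE for a linear smoother: MSE * (1 + df/n) / (1 - df/n),
    # with df = trace of the hat matrix H, where y_hat = H y.
    n = K.shape[0]
    H = K @ np.linalg.inv(K + lam * n * np.eye(n))
    df = np.trace(H)
    mse = np.mean((y - H @ y) ** 2)
    return mse * (1 + df / n) / (1 - df / n)

# Toy usage on a deliberately small sample, inputs already in [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(size=(30, 2))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.normal(size=30)
K = additive_spline_kernel(X, X)
lambdas = np.logspace(-6, 0, 13)
best = min(lambdas, key=lambda lam: fpe_score(K, y, lam))  # FPE model selection
alpha = krls_fit(K, y, best)
y_hat = K @ alpha  # in-sample predictions; use additive_spline_kernel(X_new, X) @ alpha for new points

On small samples, sweeping the regularization parameter over a log-spaced grid and minimizing FPE avoids the extra data splitting that cross-validation requires, which is the practical appeal the abstract alludes to.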

FAPESP's process: 09/17773-7 - Selective Inference in Machine Learning: theory, algorithms and applications
Grantee: Ígor Assis Braga
Support type: Scholarships in Brazil - Doctorate