TY - GEN
T1 - Learning sequence kernels
AU - Cortes, Corinna
AU - Mohri, Mehryar
AU - Rostamizadeh, Afshin
PY - 2008
Y1 - 2008
N2 - Kernel methods are used to tackle a variety of learning tasks including classification, regression, ranking, clustering, and dimensionality reduction. The appropriate choice of a kernel is often left to the user, but poor selections may lead to sub-optimal performance. Instead, sample points can be used to learn a kernel function appropriate for the task by selecting one out of a family of kernels determined by the user. This paper considers the problem of learning sequence kernel functions, an important problem for applications in computational biology, natural language processing, document classification, and other text processing areas. For most kernel-based learning techniques, the kernels selected must be positive definite symmetric, a condition that, for sequence data, is satisfied by rational kernels. We give a general formulation of the problem of learning rational kernels and prove that a large family of rational kernels can be learned efficiently using a simple quadratic program, both in the context of support vector machines and kernel ridge regression. This improves upon previous work that generally results in a more costly semi-definite or quadratically constrained quadratic program. Furthermore, in the specific case of kernel ridge regression, we give an alternative solution based on a closed-form expression for the optimal kernel matrix. We also report results of experiments with our kernel learning techniques in classification and regression tasks.
UR - http://www.scopus.com/inward/record.url?scp=58049176470&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=58049176470&partnerID=8YFLogxK
U2 - 10.1109/MLSP.2008.4685446
DO - 10.1109/MLSP.2008.4685446
M3 - Conference contribution
AN - SCOPUS:58049176470
SN - 9781424423767
T3 - Proceedings of the 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
SP - 2
EP - 8
BT - Proceedings of the 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
T2 - 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
Y2 - 16 October 2008 through 19 October 2008
ER -