Title: Out-of-Sample Extensions for Non-Parametric Kernel Methods
Authors: Binbin Pan; Wen-Sheng Chen; Bo Chen; Chen Xu; Jianhuang Lai
Pages: 11
Year: 2017
Language: English
Abstract:
Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies have been devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to exploit the information in the data, which can potentially characterize data similarity better. Kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension to out-of-sample data points and thus cannot be applied to inductive learning. In this paper, we show how to make nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to a corresponding kernel function. A regression approach in the hyper-reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over state-of-the-art parametric kernel methods.
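The core difficulty described above can be illustrated concretely: nonparametric kernel learning yields only a kernel *matrix* over the training points, with no formula to evaluate on unseen points. A minimal sketch of one regression-style extension, assuming a simple ridge fit against a base RBF kernel (this is an illustrative simplification, not the paper's hyper-RKHS method; all function names here are ours):

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """Base (parametric) RBF kernel between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_extension(X_train, K_learned, lam=1e-6):
    """Fit coefficients C so that B @ C @ B approximates K_learned,
    where B is the base kernel matrix on the training set.
    K_learned stands in for a learned nonparametric kernel matrix."""
    n = len(X_train)
    B = rbf(X_train, X_train)
    Binv = np.linalg.solve(B + lam * np.eye(n), np.eye(n))  # ridge-regularized inverse
    return Binv @ K_learned @ Binv

def extend_kernel(X_train, X_new, C):
    """Out-of-sample kernel values between new points and training points:
    k(x_new, x_i) ~= b(x_new) @ C @ B, with b the base kernel row."""
    return rbf(X_new, X_train) @ C @ rbf(X_train, X_train)
```

On the training set itself this fit reproduces the learned matrix (up to the ridge term), while `extend_kernel` gives a smooth interpolation of the learned similarities to new points, which is exactly the inductive capability the transductive methods lack.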