Indexed in:
Abstract:
In this paper, we propose a new method, called Optimal Successive Mappings (OSM), for designing and constructing "good" kernel-induced mappings for classification tasks. Kernel methods such as Support Vector Machines (SVM) cannot achieve satisfactory classification accuracy on some complicated data sets, which remain linearly inseparable even in the feature space. This indicates that kernels designed only by tuning kernel parameters cannot adapt well to the classification of such complicated data sets. Instead of tuning parameters, OSM learns and designs its kernel from the training data, through a sequence of two mappings and the optimization of a criterion function. After the OSM feature mapping, the data in the feature space are not only linearly separable but also intra-class compact and inter-class separate. Since optimizing the criterion function reduces to a generalized eigenvalue problem, OSM is non-iterative and has low computational complexity. Comparative experiments demonstrate the effectiveness of our method.
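The abstract does not specify OSM's criterion function, but it states that optimizing it reduces to a generalized eigenvalue problem, which is why the method is non-iterative. As a hedged illustration of that reduction (not the authors' actual OSM criterion), the sketch below optimizes a standard Fisher-style scatter ratio, where the mapping that makes classes intra-class compact and inter-class separate is obtained in closed form from the generalized eigenproblem S_b w = λ S_w w. The function name `fisher_mapping` and all parameters are hypothetical:

```python
import numpy as np
from scipy.linalg import eigh

def fisher_mapping(X, y, n_components=1, reg=1e-6):
    """Learn a linear mapping maximizing between-class over within-class
    scatter, solved as the generalized eigenproblem S_b w = lambda S_w w.
    (Illustrative stand-in for an eigenproblem-based criterion; not OSM.)"""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    Sw += reg * np.eye(d)  # regularize so S_w is invertible
    # eigh solves the symmetric generalized problem directly: no iteration
    vals, vecs = eigh(Sb, Sw)
    order = np.argsort(vals)[::-1]  # largest scatter ratio first
    return vecs[:, order[:n_components]]

# Two Gaussian blobs: the learned direction separates the class means
# while keeping each class compact around its own mean.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
W = fisher_mapping(X, y)
Z = X @ W  # 1-D projections of the samples
```

The single `eigh` call is what makes such criteria attractive: the optimum is computed in one linear-algebra step, matching the non-iterative, low-complexity property claimed for OSM.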
Keywords:
Corresponding author information: