Abstract:
A great number of dimensionality reduction methods ultimately reduce to solving generalized eigenvalue problems. Optimization techniques offer a promising way to handle parameter selection in these dimensionality reduction methods. The key step in such optimization methods is computing the derivative of the objective function with respect to the parameter, which in turn requires the gradient and Hessian matrix of the resulting eigenvectors and eigenvalues. In this paper, we propose a novel method to compute the gradient of the eigenvalues, and then apply it to tune the parameter in kernel principal component analysis. Experimental results on UCI data sets show that the new method outperforms the original algorithm, especially in time complexity. © 2011 IEEE.
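The abstract does not spell out the proposed gradient formula, but for a symmetric kernel matrix K(σ) with simple eigenvalues, standard first-order perturbation theory gives dλ_i/dσ = v_iᵀ(∂K/∂σ)v_i for a unit eigenvector v_i, which conveys the flavor of such a computation. The NumPy sketch below illustrates this textbook result for an RBF kernel with bandwidth σ; the function names, the choice of kernel, and the omission of kernel centering are assumptions for illustration, not the paper's method.

```python
import numpy as np

def rbf_kernel_and_grad(X, sigma):
    # K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)) and its elementwise
    # derivative in sigma. (Illustrative helper; KPCA would normally
    # also center the kernel matrix, omitted here for brevity.)
    sq = np.sum(X ** 2, axis=1)
    D2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # squared pairwise distances
    K = np.exp(-D2 / (2.0 * sigma ** 2))
    dK = K * D2 / sigma ** 3                         # chain rule on the exponent
    return K, dK

def eigenvalue_gradients(K, dK):
    # First-order perturbation for a symmetric matrix with simple
    # eigenvalues: d lambda_i / d sigma = v_i^T (dK/d sigma) v_i.
    eigvals, eigvecs = np.linalg.eigh(K)             # columns of eigvecs are unit v_i
    grads = np.sum(eigvecs * (dK @ eigvecs), axis=0) # grads[i] = v_i^T dK v_i
    return eigvals, grads

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))                         # toy data standing in for a UCI set
K, dK = rbf_kernel_and_grad(X, sigma=1.0)
vals, grads = eigenvalue_gradients(K, dK)
print(vals[-3:], grads[-3:])                         # largest eigenvalues and their gradients
```

Gradients of this form could drive a gradient-based search over σ, which is the kind of kernel-parameter tuning the abstract describes.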