Abstract:
In the extreme learning machine (ELM) network, the sigmoid activation function is usually chosen for additive hidden neurons. This paper replaces it with the softplus function, a smooth approximation of the rectifier. Being closer to the biological activation model and exhibiting a degree of sparseness, the softplus activation function can further improve network performance. To obtain better classification performance, the ELM optimization model was constrained by an improved Fisher discriminant analysis, and an improved ELM algorithm was proposed. The output weights can thus be obtained analytically and are more conducive to classification. Experiments on a handwritten-digit database and a face database demonstrate the feasibility and superiority of the improved ELM algorithm. © 2015, Beijing University of Technology. All rights reserved.
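To make the abstract's setup concrete, the following is a minimal sketch of a standard ELM with a softplus hidden layer and analytically computed output weights. It shows only the generic ELM pipeline (random input weights, softplus activation, ridge-regularized least squares for the output layer); the paper's Fisher-discriminant constraint is not implemented here, and all names, sizes, and the toy dataset are illustrative assumptions, not the authors' code.

```python
import numpy as np

def softplus(x):
    # Smooth approximation of the rectifier: log(1 + e^x)
    return np.log1p(np.exp(x))

def elm_train(X, T, n_hidden=50, reg=1e-3, seed=0):
    """Basic ELM: random hidden-layer weights, analytic output weights."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random biases
    H = softplus(X @ W + b)                          # hidden-layer output matrix
    # Ridge-regularized least squares gives the output weights in closed form
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return softplus(X @ W + b) @ beta

# Toy two-class problem with one-hot targets (illustrative data only)
rng = np.random.default_rng(42)
X = np.vstack([rng.standard_normal((20, 2)) + 2,
               rng.standard_normal((20, 2)) - 2])
T = np.vstack([np.tile([1.0, 0.0], (20, 1)),
               np.tile([0.0, 1.0], (20, 1))])
W, b, beta = elm_train(X, T)
pred = elm_predict(X, W, b, beta).argmax(axis=1)
```

Because the hidden layer is fixed after random initialization, training reduces to one linear solve, which is the source of the ELM's speed; the paper's contribution changes the objective of that solve, not this overall structure.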