Abstract:
In the extreme learning machine (ELM) network, the sigmoid activation function is usually chosen for the additive hidden neurons. This paper replaces it with a smooth approximation, the softplus function. Being closer to the biological activation model and exhibiting a degree of sparseness, the softplus activation function can further improve network performance. To achieve better classification performance, the ELM optimization model was constrained by an improved Fisher discriminant analysis, and an improved ELM algorithm was proposed. The output weights can thus be obtained analytically and are more conducive to classification. Finally, experiments on a handwritten-digit database and a face database demonstrate the feasibility and superiority of the improved ELM algorithm. © 2015, Beijing University of Technology. All rights reserved.
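The two ingredients the abstract names — softplus hidden activations and analytically computed output weights — can be sketched as follows. This is a minimal illustration of a basic ELM only: the random-weight scheme, function names, and hidden-layer size are assumptions for the sketch, and the improved Fisher-discriminant constraint described in the paper is not included.

```python
import numpy as np

def softplus(x):
    # softplus(x) = ln(1 + e^x): a smooth approximation used here
    # in place of the sigmoid for the additive hidden neurons
    return np.log1p(np.exp(x))

def elm_train(X, T, n_hidden, seed=None):
    """Basic ELM: random input weights and biases, analytic output weights.

    X: (n_samples, n_features) inputs; T: (n_samples, n_outputs) targets.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = softplus(X @ W + b)                          # hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T                     # output weights, solved
    return W, b, beta                                # analytically via pseudoinverse

def elm_predict(X, W, b, beta):
    return softplus(X @ W + b) @ beta
```

The pseudoinverse gives the minimum-norm least-squares solution for the output weights in one step, which is what makes ELM training non-iterative; the paper's contribution adds a discriminative constraint to this optimization model.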
Source:
Journal of Beijing University of Technology
ISSN: 0254-0037
Year: 2015
Issue: 9
Volume: 41
Page: 1341-1348
Cited Count:
WoS CC Cited Count: 0
SCOPUS Cited Count: 7
ESI Highly Cited Papers on the List: 0