
Author:

Han, Hong-Gui | Ma, Miao-Li | Yang, Hong-Yan | Qiao, Jun-Fei

Indexed by:

EI Scopus SCIE

Abstract:

Gradient-based algorithms are commonly used for training radial basis function neural networks (RBFNNs). However, it remains difficult to avoid vanishing gradients and thereby improve learning performance during training. For this reason, in this paper, an accelerated second-order learning (ASOL) algorithm is developed to train RBFNN. First, an adaptive expansion and pruning mechanism (AEPM) of gradient space, based on the integrity and orthogonality of hidden neurons, is designed. Effective gradient information is continually added to the gradient space, while redundant gradient information is eliminated from it. Second, with AEPM, neurons are generated or pruned accordingly. In this way, a self-organizing RBFNN (SORBFNN) is obtained that reduces structural complexity and improves generalization ability. The structure and parameters can then be optimized jointly during learning by the proposed ASOL-based SORBFNN (ASOL-SORBFNN). Third, theoretical analyses are given, covering the efficiency of the proposed AEPM in avoiding the vanishing gradient and the stability of SORBFNN during structural adjustment, which guarantees the successful application of the proposed ASOL-SORBFNN. Finally, to illustrate the advantages of the proposed ASOL-SORBFNN, several experimental studies are conducted. Compared with other existing approaches, the results show that ASOL-SORBFNN performs well in terms of both learning speed and prediction accuracy. (c) 2021 Elsevier B.V. All rights reserved.
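The paper's ASOL algorithm and AEPM are not reproduced here, but the underlying model the abstract refers to, an RBFNN with Gaussian hidden units and a linear output layer, can be sketched as follows. This is a minimal generic illustration (least-squares output weights, fixed centers and widths), not the authors' self-organizing or second-order method; all names and parameter values below are assumptions for the example.

```python
import numpy as np

def rbf_hidden(X, centers, widths):
    # Gaussian radial basis activations:
    # phi_j(x) = exp(-||x - c_j||^2 / (2 * sigma_j^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def rbfnn_predict(X, centers, widths, weights):
    # Network output: linear combination of hidden activations.
    return rbf_hidden(X, centers, widths) @ weights

# Toy regression: fit sin(x) with 10 evenly spaced Gaussian units.
# Output weights are solved by least squares, a common baseline
# before any gradient-based refinement of centers and widths.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0])

centers = np.linspace(-3, 3, 10)[:, None]   # hidden-unit centers c_j
widths = np.full(10, 0.8)                   # hidden-unit widths sigma_j

Phi = rbf_hidden(X, centers, widths)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = rbfnn_predict(X, centers, widths, weights)
```

In the paper's setting, the centers, widths, and weights would instead be updated by the second-order scheme, with AEPM adding or pruning hidden units as the gradient space expands or contracts.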

Keyword:

Adaptive expansion and pruning mechanism (AEPM) of gradient space; Accelerated second-order learning (ASOL) algorithm; Self-organizing radial basis function neural network (SORBFNN)

Author Community:

  • [ 1 ] [Han, Hong-Gui]Beijing Univ Technol, Fac Informat Technol, Engn Res Ctr Digital Community,Minist Educ, Beijing Artificial Intelligence Inst,Beijing Key, Beijing, Peoples R China
  • [ 2 ] [Ma, Miao-Li]Beijing Univ Technol, Fac Informat Technol, Engn Res Ctr Digital Community,Minist Educ, Beijing Artificial Intelligence Inst,Beijing Key, Beijing, Peoples R China
  • [ 3 ] [Qiao, Jun-Fei]Beijing Univ Technol, Fac Informat Technol, Engn Res Ctr Digital Community,Minist Educ, Beijing Artificial Intelligence Inst,Beijing Key, Beijing, Peoples R China
  • [ 4 ] [Han, Hong-Gui]Beijing Univ Technol, Beijing Lab Urban Mass Transit, Beijing, Peoples R China
  • [ 5 ] [Ma, Miao-Li]Beijing Univ Technol, Beijing Lab Urban Mass Transit, Beijing, Peoples R China
  • [ 6 ] [Qiao, Jun-Fei]Beijing Univ Technol, Beijing Lab Urban Mass Transit, Beijing, Peoples R China
  • [ 7 ] [Yang, Hong-Yan]Beijing Univ Technol, Fac Informat Technol, Engn Res Ctr Digital Community, Minist Educ, Beijing, Peoples R China


Source:

NEUROCOMPUTING

ISSN: 0925-2312

Year: 2021

Volume: 469

Page: 1-12

Impact Factor: 6.000 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 87

JCR Journal Grade: 2

SCOPUS Cited Count: 31

ESI Highly Cited Papers on the List: 0
