
Author:

Liang, Hongwei (Liang, Hongwei.) | Zhao, Zehao (Zhao, Zehao.) | Wang, Gongming (Wang, Gongming.)

Indexed by:

EI Scopus

Abstract:

Deep belief network (DBN) is an effective deep learning model that has been widely used to analyze big-data characteristics, extract features, and approximate nonlinear systems. However, owing to its deep structure and numerous parameters, DBN generally suffers from a time-consuming training process. To address this problem and improve training efficiency without reducing accuracy, this paper proposes a self-optimizing deep belief network with adaptive-active learning (SODBN-AAL). In the proposed SODBN-AAL, an adaptive learning algorithm for hyper-parameters is designed to ensure good accuracy. On this basis, an active learning algorithm based on an event-triggered strategy is developed to improve training efficiency by extracting effective data and skipping invalid data. As a self-optimizing model, SODBN-AAL combines the advantages of both adaptive hyper-parameters and event-triggered active learning. Two simulation experiments, on a benchmark problem and on water quality prediction, demonstrate the advantages of SODBN-AAL. The results show that, compared with the basic DBN model, SODBN-AAL improves learning accuracy by 77.13% and learning efficiency by 84.83% on average. © 2024 IEEE.
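The event-triggered active learning described in the abstract — training only on data that still carries a useful error signal, and skipping the rest — can be sketched roughly as below. This is a minimal illustration under assumptions, not the paper's actual algorithm: the function name `event_triggered_select`, the use of per-sample reconstruction error as the trigger signal, and the fixed threshold are all placeholders invented for the sketch.

```python
import numpy as np

def event_triggered_select(errors, threshold):
    """Return indices of samples whose current model error exceeds the
    trigger threshold. Low-error ("invalid") samples are skipped, so
    each training update touches fewer data points."""
    errors = np.asarray(errors, dtype=float)
    return np.flatnonzero(errors > threshold)

# Toy per-sample reconstruction errors from a hypothetical DBN pass.
errors = [0.02, 0.31, 0.07, 0.45, 0.01, 0.28]
selected = event_triggered_select(errors, threshold=0.1)
print(selected.tolist())  # -> [1, 3, 5]
```

In a training loop, only the `selected` subset would be fed to the next parameter update, which is one plausible way the reported efficiency gain could arise: samples the model already reconstructs well trigger no event and cost no gradient computation.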

Keyword:

Deep learning; Benchmarking; Water quality; Learning algorithms; Learning systems; Data mining; Efficiency

Author Community:

  • [ 1 ] [Liang, Hongwei] Qufu Normal University Library, Qufu Normal University, Qufu 273165, China
  • [ 2 ] [Zhao, Zehao] Lvzhiyuan Environmental Group, High Technology Research Center, Rizhao 276801, China
  • [ 3 ] [Wang, Gongming] Beijing University of Technology, Beijing Institute of Artificial Intelligence, Beijing 100124, China


Year: 2024

Page: 1553-1556

Language: English

SCOPUS Cited Count: 1

ESI Highly Cited Papers on the List: 0

