Indexed in:
Abstract:
As an extensively used model for time series prediction, the Long Short-Term Memory (LSTM) neural network suffers from shortcomings such as high computational cost and large memory requirements due to its complex structure. To address these problems, a PLS-based pruning algorithm for a simplified LSTM (PSLSTM) is hereby proposed. First, a hybrid strategy is designed to simplify the internal structure of the LSTM, combining structure simplification with parameter reduction for the gates. Second, partial least squares (PLS) regression coefficients are used as the metric to evaluate the importance of the memory blocks, and the redundant hidden layer is pruned by merging unimportant blocks with their most correlated ones. The Backpropagation Through Time (BPTT) algorithm is used to update the network parameters. Finally, several benchmark and practical time series prediction datasets are used to evaluate the performance of the proposed PSLSTM. The experimental results demonstrate that the PLS-based pruning algorithm achieves a trade-off between good generalization ability and a compact network structure. The computational complexity is reduced by the simple internal structure as well as the compact hidden layer size, without sacrificing prediction accuracy. (C) 2022 Elsevier B.V. All rights reserved.
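The abstract's pruning criterion can be illustrated with a minimal sketch: hidden-state activations of the memory blocks are regressed onto the prediction target with PLS, the magnitude of each block's regression coefficient serves as its importance score, and the least important block is merged into its most correlated peer. The PLS1/NIPALS implementation and the merging rule below are an illustrative reconstruction under assumed data, not the authors' exact procedure.

```python
import numpy as np

def pls1_coefficients(X, y, n_components=2):
    """PLS1 regression coefficients via NIPALS deflation (single target)."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    Xk = X.copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ y                      # weight vector for this component
        w = w / np.linalg.norm(w)
        t = Xk @ w                        # scores
        tt = t @ t
        P.append(Xk.T @ t / tt)           # X loadings
        q.append((y @ t) / tt)            # y loading
        W.append(w)
        Xk = Xk - np.outer(t, P[-1])      # deflate X
    W, P, q = np.column_stack(W), np.column_stack(P), np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)   # B = W (P^T W)^{-1} q

rng = np.random.default_rng(0)
H = rng.standard_normal((300, 6))            # hypothetical memory-block activations
y = 2.0 * H[:, 0] - 1.5 * H[:, 1] + 0.1 * rng.standard_normal(300)

importance = np.abs(pls1_coefficients(H, y)) # |PLS coefficient| per block
least = int(np.argmin(importance))           # least important block

corr = np.corrcoef(H, rowvar=False)
np.fill_diagonal(corr, 0.0)
partner = int(np.argmax(np.abs(corr[least])))  # its most correlated peer

H[:, partner] += H[:, least]                 # merge into the partner block
H_pruned = np.delete(H, least, axis=1)       # drop the pruned column
```

In this toy setting blocks 0 and 1 carry the signal, so the pruning step removes one of the noise blocks and shrinks the hidden layer by one unit per pass.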
Keywords:
Corresponding author:
E-mail address:
Source:
KNOWLEDGE-BASED SYSTEMS
ISSN: 0950-7051
Year: 2022
Volume: 254
Impact factor: 8.800 (JCR@2022)
ESI discipline: COMPUTER SCIENCE
ESI highly cited threshold: 46
JCR quartile: Q1
CAS (Chinese Academy of Sciences) tier: 2
Affiliated department: