Indexed by:
Abstract:
As an extension of traditional echo state networks (ESNs), polynomial echo state networks (PESNs) were proposed in our previous work (Yang et al., 2018) by employing a polynomial function of the complete input variables as the output weight matrix. In practice, the generalization performance and computational burden of PESNs are degraded by redundant or irrelevant inputs. To construct output weights with a suitable subset of input variables, the forward selection based PESN (FS-PESN) and the backward selection based PESN (BS-PESN) are proposed. Firstly, the forward selection method is used in FS-PESN to choose the input variable that yields the maximum reduction of the objective function, while a backward selection scheme is introduced in BS-PESN to remove the input variable that causes the smallest increment of the objective function. Then, iterative updating strategies are designed to avoid repetitive computations in FS-PESN and BS-PESN. In particular, an accelerating scheme is introduced into BS-PESN to simplify the training process. Finally, numerical simulations are carried out to illustrate the effectiveness of the proposed techniques in terms of generalization ability and testing time. (C) 2020 Elsevier B.V. All rights reserved.
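The greedy variable-selection idea described in the abstract can be illustrated with a minimal sketch, shown below. This is not the authors' FS-PESN algorithm; it is a hypothetical Python example of forward selection for a regularized least-squares readout, where the function names (ridge_sse, forward_select), the ridge objective, and all parameter values are assumptions introduced only for illustration.

```python
# Hypothetical sketch: greedy forward selection of input variables for a
# least-squares readout. Illustrates the general idea behind choosing the
# variable that most reduces the objective; not the authors' implementation.
import numpy as np

def ridge_sse(X, y, lam=1e-3):
    """Sum of squared errors of a ridge-regression fit on the columns of X."""
    W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
    r = y - X @ W
    return float(np.sum(r ** 2))

def forward_select(U, y, max_vars, lam=1e-3):
    """Greedily add the input column that yields the largest objective reduction."""
    selected, remaining = [], list(range(U.shape[1]))
    best_obj = np.inf
    while remaining and len(selected) < max_vars:
        scores = {j: ridge_sse(U[:, selected + [j]], y, lam) for j in remaining}
        j_best = min(scores, key=scores.get)
        if scores[j_best] >= best_obj:   # stop once no candidate reduces the objective
            break
        best_obj = scores[j_best]
        selected.append(j_best)
        remaining.remove(j_best)
    return selected

# Usage: pick at most 3 of 10 candidate input variables for a synthetic target.
rng = np.random.default_rng(0)
U = rng.normal(size=(200, 10))
y = U[:, [1, 4]] @ np.array([[1.5], [-2.0]]) + 0.01 * rng.normal(size=(200, 1))
print(forward_select(U, y, max_vars=3))
```

The backward scheme described for BS-PESN would work analogously in this sketch, starting from all candidate columns and repeatedly dropping the one whose removal increases the objective the least.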
Keywords:
Corresponding author:
Email address:
Source:
NEUROCOMPUTING
ISSN: 0925-2312
Year: 2020
Volume: 398
Pages: 83-94
Impact Factor: 6.000 (JCR@2022)
ESI Subject: COMPUTER SCIENCE
ESI Highly Cited Threshold: 132
Affiliated department: