
Authors:

Faraji, Amin | Noohi, Mostafa | Sadrossadat, Sayed Alireza | Mirvakili, Ali | Na, Weicong | Feng, Feng

Indexed in:

EI, Scopus, SCIE

Abstract:

To model high-speed nonlinear circuits, recurrent neural networks (RNNs) have been widely used in the computer-aided design (CAD) area to obtain fast, high-performance models compared with existing models. Despite these advantages, they still face challenges such as long training time and limited test accuracy. In this article, the batch normalization (BN) method is applied to a deep RNN, leading to much shorter training time and more accurate models compared with the conventional RNN. The proposed BN-RNN method works by modifying the distribution of the internal nodes of the deep network during training, which yields much faster training. Specifically, the internal covariate shift is reduced and the training of the deep neural network is accelerated via a normalization step applied to the layers of the RNN. BN-RNN, moreover, has a beneficial effect on gradient flow through the network by reducing the dependence of gradients on the scale of the network parameters or their initial values, which provides a much better learning process without the risk of divergence. To verify the proposed method, time-domain modeling of three high-speed nonlinear circuits operating in the GHz region is provided. Comparisons of training and test errors between RNN and BN-RNN, and evaluation-time comparisons between transistor-level and BN-RNN-based models for these circuits, demonstrate the higher speed of the models obtained from the BN-RNN method. In addition, it is shown that training with the proposed method requires much less CPU time and fewer epochs.
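The following is a minimal sketch, not the authors' actual implementation, of how a batch-normalization step can be inserted between the layers of a stacked RNN so that each layer's hidden activations are normalized over the batch (the step the abstract describes as reducing internal covariate shift). It assumes PyTorch as the framework; the class name BNRNN and all layer sizes are illustrative choices, not taken from the paper.

import torch
import torch.nn as nn

class BNRNN(nn.Module):
    """Stacked RNN with batch normalization applied to each layer's hidden state (illustrative sketch)."""
    def __init__(self, input_size, hidden_size, num_layers=3, output_size=1):
        super().__init__()
        self.cells = nn.ModuleList()
        self.bns = nn.ModuleList()
        for layer in range(num_layers):
            in_size = input_size if layer == 0 else hidden_size
            self.cells.append(nn.RNNCell(in_size, hidden_size))
            # Normalize each layer's hidden activations over the batch; this is
            # the normalization step intended to reduce internal covariate shift.
            self.bns.append(nn.BatchNorm1d(hidden_size))
        self.readout = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # x: (batch, time, input_size), e.g. sampled input waveforms of the circuit
        batch, steps, _ = x.shape
        h = [torch.zeros(batch, cell.hidden_size, device=x.device) for cell in self.cells]
        outputs = []
        for t in range(steps):
            inp = x[:, t, :]
            for i, (cell, bn) in enumerate(zip(self.cells, self.bns)):
                h[i] = bn(cell(inp, h[i]))  # update hidden state, then batch-normalize it
                inp = h[i]
            outputs.append(self.readout(inp))
        return torch.stack(outputs, dim=1)  # (batch, time, output_size) time-domain response

Training such a model with a standard optimizer would then proceed exactly as for a conventional RNN; the only change is the normalization inserted between layers, which is what the article credits with the faster convergence and improved gradient flow.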

Keywords:

Solid modeling; Integrated circuit modeling; Computer-aided design (CAD); Neural networks; Batch normalization (BN); Recurrent neural network (RNN); Nonlinear circuits; Behavioral sciences; Training; Recurrent neural networks; Modeling

Author affiliations:

  • [ 1 ] [Faraji, Amin]Yazd Univ, Dept Comp Engn, Yazd 8915818411, Iran
  • [ 2 ] [Sadrossadat, Sayed Alireza]Yazd Univ, Dept Comp Engn, Yazd 8915818411, Iran
  • [ 3 ] [Noohi, Mostafa]Yazd Univ, Dept Elect Engn, Yazd 8915818411, Iran
  • [ 4 ] [Mirvakili, Ali]Yazd Univ, Dept Elect Engn, Yazd 8915818411, Iran
  • [ 5 ] [Na, Weicong]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 6 ] [Feng, Feng]Tianjin Univ, Sch Microelect, Tianjin 300072, Peoples R China

Corresponding author information:

Source:

IEEE TRANSACTIONS ON MICROWAVE THEORY AND TECHNIQUES

ISSN: 0018-9480

Year: 2022

Issue: 11

Volume: 70

Pages: 4857-4868

Impact Factor (JCR@2022): 4.300

ESI subject category: ENGINEERING

ESI highly cited threshold: 49

JCR quartile: Q2

CAS (Chinese Academy of Sciences) journal division: 2

Citation counts:

Web of Science Core Collection citations: 20

Scopus citations: 21

ESI highly cited papers (currently listed): 0

Wanfang citations:

Chinese-language citations:

Views in the last 30 days: 6

Affiliated department:
