
Authors:

Guo Nan | Gu Ke (Scholar: 顾锞) | Qiao Junfei (Scholar: 乔俊飞) | Bi Jing

Indexed in:

EI, SCIE, PubMed

Abstract:

Recent years have witnessed numerous successful applications of incorporating attention modules into feed-forward convolutional neural networks. Along this line of research, we design a novel lightweight general-purpose attention module that simultaneously takes channel attention and spatial attention into consideration. Specifically, inspired by the characteristics of channel attention and spatial attention, a nonlinear hybrid method is proposed to combine the two types of attention feature maps, which is highly beneficial to better network fine-tuning. Further, the parameters of each attention branch are adjustable, making the attention module more flexible and adaptable. From another point of view, we find that the currently popular SE and CBAM modules are actually two particular cases of our proposed attention module. We also explore the recent ADCM attention module. To validate the module, we conduct experiments on the CIFAR10, CIFAR100, and Fashion-MNIST datasets. Results show that, after integrating our attention module, existing networks tend to be more efficient in the training process and achieve better performance than state-of-the-art competitors. It is also worth stressing the following two points: (1) our attention module can be used in existing state-of-the-art deep architectures and obtain better performance at a small computational cost; (2) the module can be added to existing deep architectures in a simple way, by stacking the integration of network blocks and our module.
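To make the described mechanism concrete, below is a minimal PyTorch-style sketch of a hybrid attention block in the spirit of the abstract: an SE-style channel branch and a CBAM-style spatial branch whose maps are blended nonlinearly through learnable branch weights. The class name HybridAttention, the weights alpha and beta, and the specific pooling choices are illustrative assumptions, not the design published in the paper.

import torch
import torch.nn as nn


class HybridAttention(nn.Module):
    """Illustrative hybrid channel + spatial attention block (a sketch, not the paper's exact design)."""

    def __init__(self, channels: int, reduction: int = 16, spatial_kernel: int = 7):
        super().__init__()
        # Channel branch: global average pooling followed by a bottleneck MLP (SE-style).
        self.channel_branch = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial branch: convolution over pooled channel statistics (CBAM-style).
        self.spatial_branch = nn.Sequential(
            nn.Conv2d(2, 1, kernel_size=spatial_kernel, padding=spatial_kernel // 2),
            nn.Sigmoid(),
        )
        # Adjustable branch weights (assumed); (1, 0) or (0, 1) reduces the block to a
        # purely channel-wise or purely spatial attention module, echoing how SE and
        # CBAM can be seen as particular cases of a more general hybrid module.
        self.alpha = nn.Parameter(torch.tensor(1.0))
        self.beta = nn.Parameter(torch.tensor(1.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        ca = self.channel_branch(x)                    # (N, C, 1, 1) channel attention map
        avg_map = x.mean(dim=1, keepdim=True)          # (N, 1, H, W)
        max_map = x.amax(dim=1, keepdim=True)          # (N, 1, H, W)
        sa = self.spatial_branch(torch.cat([avg_map, max_map], dim=1))  # (N, 1, H, W)
        # Nonlinear blend of the two maps, broadcast to (N, C, H, W) and applied to the input.
        attention = torch.sigmoid(self.alpha * ca + self.beta * sa)
        return x * attention


if __name__ == "__main__":
    block = HybridAttention(channels=64)
    out = block(torch.randn(2, 64, 32, 32))
    print(out.shape)  # torch.Size([2, 64, 32, 32])

Stacking such a block after each convolutional or residual block of an existing backbone is one simple integration strategy, consistent with point (2) of the abstract.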

Keywords:

Convolutional neural networks; Feature map combination; General module; Hybrid attention mechanism

Author affiliations:

  • [ 1 ] [Guo Nan]Beijing Key Laboratory of Computational Intelligence and Intelligent System, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 2 ] [Gu Ke]Beijing Key Laboratory of Computational Intelligence and Intelligent System, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
  • [ 3 ] [Qiao Junfei]Beijing Key Laboratory of Computational Intelligence and Intelligent System, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China. Electronic address: junfeiq@bjut.edu.cn
  • [ 4 ] [Bi Jing]Beijing Key Laboratory of Computational Intelligence and Intelligent System, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China

Corresponding author information:

Email address: junfeiq@bjut.edu.cn

Source:

Neural networks : the official journal of the International Neural Network Society

ISSN: 1879-2782

Year: 2021

Volume: 140

Pages: 158-166

Citation counts:

Web of Science Core Collection citations: 0

Scopus citations: 23

ESI Highly Cited Papers listed: 0

Wanfang citations:

Chinese-literature citations:

Views in the last 30 days: 1

Affiliated department:
