Author:

Guo, Nan | Gu, Ke | Qiao, Junfei | Bi, Jing

Indexed by:

EI; Scopus; SCIE; PubMed

Abstract:

Recent years have witnessed numerous successful applications of incorporating attention modules into feed-forward convolutional neural networks. Along this line of research, we design a novel lightweight general-purpose attention module that simultaneously takes channel attention and spatial attention into consideration. Specifically, inspired by the characteristics of channel attention and spatial attention, we propose a nonlinear hybrid method to combine the two types of attention feature maps, which is highly beneficial for network fine-tuning. Further, the parameters of each attention branch are adjustable, making the attention module more flexible and adaptable. From another point of view, we find that the currently popular SE and CBAM modules are in fact two particular cases of our proposed attention module. We also examine the recent attention module ADCM. To validate the module, we conduct experiments on the CIFAR-10, CIFAR-100, and Fashion-MNIST datasets. Results show that, after integrating our attention module, existing networks train more efficiently and perform better than state-of-the-art competitors. Two points are worth stressing: (1) our attention module can be plugged into existing state-of-the-art deep architectures and yields better performance at a small computational cost; (2) the module can be added to existing deep architectures simply by stacking it with a network block.
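
The abstract outlines the module's structure without giving an implementation. Below is a minimal PyTorch sketch of the described idea, assuming an SE-style channel branch, a CBAM-style spatial branch, and learnable per-branch weights; the class name HybridAttention, the parameters alpha and beta, and the exact combination rule are illustrative assumptions, not the paper's published code.

```python
import torch
import torch.nn as nn


class HybridAttention(nn.Module):
    """Hypothetical sketch of a hybrid channel + spatial attention module."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel branch (SE-style): squeeze with global average pooling,
        # then excite through a bottleneck of two 1x1 convolutions.
        self.channel_branch = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
        )
        # Spatial branch (CBAM-style): a 7x7 convolution over the
        # channel-wise mean and max maps.
        self.spatial_branch = nn.Conv2d(2, 1, kernel_size=7, padding=3)
        # Adjustable per-branch weights (assumed form of the abstract's
        # "adjustable parameters of each attention branch").
        self.alpha = nn.Parameter(torch.ones(1))
        self.beta = nn.Parameter(torch.ones(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        ca = self.channel_branch(x)                        # (B, C, 1, 1)
        pooled = torch.cat(
            [x.mean(dim=1, keepdim=True), x.amax(dim=1, keepdim=True)],
            dim=1,
        )                                                  # (B, 2, H, W)
        sa = self.spatial_branch(pooled)                   # (B, 1, H, W)
        # Nonlinear hybrid of the two maps; setting beta (or alpha) to zero
        # collapses the module toward a channel-only (or spatial-only) case,
        # mirroring the abstract's remark that SE and CBAM arise as
        # particular cases.
        attention = torch.sigmoid(self.alpha * ca + self.beta * sa)
        return x * attention
```

Integration then amounts to stacking, as the abstract describes, e.g. wrapping an existing block as `nn.Sequential(block, HybridAttention(channels))`.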

Keyword:

Hybrid attention mechanism; Convolutional neural networks; Feature map combination; General module

Author Community:

  • [ 1 ] [Guo, Nan]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intelligent Syst, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Gu, Ke]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intelligent Syst, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Qiao, Junfei]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intelligent Syst, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 4 ] [Bi, Jing]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intelligent Syst, Fac Informat Technol, Beijing 100124, Peoples R China

Reprint Author's Address:

  • Qiao, Junfei

    [Qiao, Junfei]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intelligent Syst, Fac Informat Technol, Beijing 100124, Peoples R China

Source:

NEURAL NETWORKS

ISSN: 0893-6080

Year: 2021

Volume: 140

Page: 158-166

Impact Factor: 7.800 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 87

JCR Journal Grade: 1

Cited Count:

WoS CC Cited Count: 22

ESI Highly Cited Papers on the List: 0
