
Authors:

Xu, Cheng | Ji, Junzhong (scholar profile: 冀俊忠) | Zhang, Menglong | Zhang, Xiaodan

Indexed in:

EI

Abstract:

Attention mechanisms have achieved remarkable success in image captioning under neural encoder-decoder frameworks. However, existing methods introduce attention into the language model, e.g., LSTM (long short-term memory), in a straightforward way: the attention is embedded into the LSTM outside its core hidden layer, and the current attention is independent of the previous one. In this paper, by exploring the inner relationship between the attention mechanism and the gates of the LSTM, we propose a new attention-gated LSTM model (AGL) that introduces dynamic attention into the language model. In this method, the visual attention is incorporated into the output gate of the LSTM and propagates along with the sequential cell state. Thus the attention in AGL acquires dynamic characteristics: the currently focused visual region can provide remote guidance to later states. Quantitative and qualitative experiments on the MS COCO dataset demonstrate the advantage of the proposed method. © 2019 IEEE.
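The abstract describes visual attention entering the LSTM's output gate and propagating forward through the cell state. Below is a minimal NumPy sketch of one decoding step under that idea; all weight names (`Wi`, `Wf`, `Wo`, `Wg`, `Wa`, `Wv`) and the exact attention equations are illustrative assumptions, not the paper's published formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def agl_step(x, h_prev, c_prev, regions, params):
    """One step of a hypothetical attention-gated LSTM (AGL) cell.

    x       : word embedding at the current step, shape (m,)
    h_prev  : previous hidden state, shape (d,)
    c_prev  : previous cell state, shape (d,)
    regions : image region features, shape (k, r)
    params  : (Wi, Wf, Wo, Wg, Wa, Wv) -- hypothetical weight matrices
    """
    Wi, Wf, Wo, Wg, Wa, Wv = params
    z = np.concatenate([x, h_prev])

    i = sigmoid(Wi @ z)          # input gate
    f = sigmoid(Wf @ z)          # forget gate
    g = np.tanh(Wg @ z)          # candidate cell update
    c = f * c_prev + i * g       # cell state: carries attention influence forward

    # Soft attention over image regions, conditioned on the new cell state.
    scores = regions @ (Wa @ c)
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()         # attention weights over k regions
    ctx = alpha @ regions        # attended visual context, shape (r,)

    # The "attention-gated" part: visual context enters the output gate.
    o = sigmoid(Wo @ z + Wv @ ctx)
    h = o * np.tanh(c)
    return h, c, alpha
```

Because the attended context modulates the output gate, and the cell state that conditions the attention is itself carried across steps, each step's focus can influence later ones, which is the "dynamic" behavior the abstract claims.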

Keywords:

Behavioral research; Computational linguistics; Dynamics; Long short-term memory

Author affiliations:

  • [ 1 ] [Xu, Cheng]Beijing Municipal Key Laboratory of Multimedia and Intelligent Software Technology, Beijing University of Technology, China
  • [ 2 ] [Xu, Cheng]Beijing Artificial Intelligence Institute, Beijing University of Technology, China
  • [ 3 ] [Ji, Junzhong]Beijing Municipal Key Laboratory of Multimedia and Intelligent Software Technology, Beijing University of Technology, China
  • [ 4 ] [Ji, Junzhong]Beijing Artificial Intelligence Institute, Beijing University of Technology, China
  • [ 5 ] [Zhang, Menglong]Beijing Municipal Key Laboratory of Multimedia and Intelligent Software Technology, Beijing University of Technology, China
  • [ 6 ] [Zhang, Menglong]Beijing Artificial Intelligence Institute, Beijing University of Technology, China
  • [ 7 ] [Zhang, Xiaodan]Beijing Municipal Key Laboratory of Multimedia and Intelligent Software Technology, Beijing University of Technology, China
  • [ 8 ] [Zhang, Xiaodan]Beijing Artificial Intelligence Institute, Beijing University of Technology, China

Corresponding author:

  • [Zhang, Xiaodan] Beijing Artificial Intelligence Institute, Beijing University of Technology, China; [Zhang, Xiaodan] Beijing Municipal Key Laboratory of Multimedia and Intelligent Software Technology, Beijing University of Technology, China

Year: 2019

Pages: 172-177

Language: English

Citations:

WoS Core Collection citations: 0

Scopus citations: 1

ESI Highly Cited Paper listings: 0
