
Author:

Wang, Ming Yang | Li, Chen Liang | Sun, Jian Dong | Xu, Wei Ran | Gao, Sheng | Zhang, Ya Hao | Wang, Pu | Li, Jun Liang

Indexed by:

EI Scopus

Abstract:

Text comprehension and information retrieval are two essential tasks that can be improved by modeling the semantic similarity of sentences and phrases. However, traditional LSTM-based methods for processing input sentences share a general problem: the resulting semantic vectors cannot fully represent the entire input sequence, and the information carried by the earliest inputs is diluted or overwritten by later information. The longer the input sequence, the more serious this phenomenon becomes. To address these problems, we propose a new method based on self-attention. It incorporates the weights of special words and highlights the comparison of similarity between key words, going beyond normal self-attention, which can only incorporate key-word weights into the plain sentences and describe positional information through position encoding. Our experiments show that the new method improves the performance of the model. © 2018 IEEE.
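The record does not include the paper's code. As a rough illustration only of the general idea the abstract describes — attention weights over per-time-step LSTM hidden states, so that key words contribute more to the sentence vector than early, diluted inputs — here is a minimal sketch. All names and the toy matrices are hypothetical, and plain numpy stands in for a deep-learning framework; this is not the authors' implementation.

```python
import numpy as np

def attention_pool(H, w):
    """Hypothetical attention pooling over a sequence of hidden states.

    H : (T, d) array, one hidden state per time step (e.g. LSTM outputs).
    w : (d,) learned scoring vector (here just fixed by hand).
    Returns the attention-weighted sentence vector and the weights.
    """
    scores = H @ w                      # one scalar relevance score per step
    scores = scores - scores.max()      # shift for numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()  # softmax over time steps
    return alpha @ H, alpha             # weighted sum of hidden states

# Toy usage: 4 time steps, hidden size 3.
H = np.array([[1., 0., 0.],
              [0., 1., 0.],
              [0., 0., 1.],
              [1., 1., 0.]])
w = np.array([1., 0., 0.])              # scores steps by their first feature
vec, alpha = attention_pool(H, w)
```

Steps whose hidden states score higher under `w` (here steps 0 and 3) receive larger weights, so their content dominates the pooled vector regardless of where they occur in the sequence — the property the abstract contrasts with plain LSTM encoding, where early information fades.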

Keyword:

Semantics; Digital integrated circuits; Long short-term memory

Author Community:

  • [ 1 ] [Wang, Ming Yang]Beijing University of Posts and Telecommunications, China
  • [ 2 ] [Li, Chen Liang]Beijing University of Posts and Telecommunications, China
  • [ 3 ] [Sun, Jian Dong]Beijing University of Posts and Telecommunications, China
  • [ 4 ] [Xu, Wei Ran]Beijing University of Posts and Telecommunications, China
  • [ 5 ] [Gao, Sheng]Beijing University of Posts and Telecommunications, China
  • [ 6 ] [Zhang, Ya Hao]Beijing University of Technology, China
  • [ 7 ] [Wang, Pu]Beijing University of Posts and Telecommunications, China
  • [ 8 ] [Li, Jun Liang]Luoyang Electronic Equipment Test Center, China



Source :

Year: 2018

Page: 16-19

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 3

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:
