
Author:

Yuan, YingQi

Indexed in:

EI

Abstract:

BiLSTM has been widely used in text classification, but the model still cannot accurately measure the importance of each word and cannot extract text features effectively. To solve this problem, this paper proposes the BiLSTM-WSAttention neural network model for text classification. The model combines the context of words and sentences, extracting contextual semantic information from two directions: front to back and back to front. At the same time, an attention mechanism is introduced. Since text is composed of sentences, sentences are composed of words, and the importance of words and sentences depends on context information, the model includes both word-level and sentence-level attention mechanisms, which assign different weight values to different words and sentences. Finally, the proposed method is compared with the Naive-Bayes, CNN, RNN, and BLSTM classification methods on the same data set. The experimental results show that the proposed BiLSTM-WSAttention model is more effective than the other classification methods on this data set. © 2021 IEEE.
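The attention pooling the abstract describes (scoring each hidden state against a learned context vector, normalizing the scores with softmax, and taking the weighted sum) can be sketched as follows. This is a minimal pure-Python illustration of the general mechanism, not the authors' implementation; the function names and toy vectors are hypothetical, and the same operation would be applied once at the word level and once at the sentence level.

```python
import math

def softmax(scores):
    # Numerically stable softmax: weights are positive and sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden, u):
    # hidden: T hidden vectors (each of length d) from a BiLSTM.
    # u: a learned context vector that scores each timestep's importance.
    scores = [sum(h_i * u_i for h_i, u_i in zip(h, u)) for h in hidden]
    alpha = softmax(scores)  # per-timestep attention weights
    d = len(hidden[0])
    # Weighted sum of hidden states -> a single pooled representation.
    pooled = [sum(a * h[j] for a, h in zip(alpha, hidden)) for j in range(d)]
    return pooled, alpha

# Toy example: three hidden states of dimension 2.
hidden = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
u = [1.0, 1.0]
pooled, alpha = attention_pool(hidden, u)
```

Here the third hidden state scores highest against `u`, so it receives the largest weight in the pooled vector; at the word level `pooled` would serve as the sentence vector fed to the sentence-level attention layer.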

Keywords:

Classification (of information); Recurrent neural networks; Semantics; Text processing

Author affiliation:

  • [ 1 ] [Yuan, YingQi]Beijing University of Technology, Faculty of Information Technology, Beijing, China

Corresponding author information:

Email address:

Source:

Year: 2021

Pages: 2235-2239

Language: English

Citation counts:

WoS Core Collection citations: 0

SCOPUS citations: 4

ESI Highly Cited Papers: 0

Wanfang citations:

Chinese citations:

Views in last 30 days: 2

Affiliated department:
