
Author:

Xue, Bingxin | Zhu, Cui | Wang, Xuan | Zhu, Wenjun

Indexed by:

Scopus SCIE

Abstract:

The Graph Convolutional Network (GCN) is widely used in text classification and has proven effective on tasks with a rich relational structure. However, because the adjacency matrix it constructs is sparse, GCN cannot make full use of context-dependent information in text classification and is poor at capturing local features. Conversely, Bidirectional Encoder Representations from Transformers (BERT) captures contextual information within a sentence or document well, but is limited in capturing global (corpus-level) information about vocabulary, which is precisely GCN's strength. This paper therefore proposes an improved model that addresses both weaknesses. The original GCN builds its text graph from word co-occurrence alone, so word connections are not abundant enough to capture context dependencies well; we therefore introduce a semantic dictionary and syntactic dependencies when constructing the graph. Because the resulting model still lacks the ability to model sequences, we further introduce BERT and a Bidirectional Long Short-Term Memory (BiLSTM) network to learn deeper text features, improving classification performance. Experimental results on four text classification datasets show that our model outperforms previously reported approaches.
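The graph-convolution step at the core of such text-GCN models can be sketched as follows. This is a generic NumPy illustration of one symmetrically normalized GCN layer over a small word/document graph, not the authors' implementation; the toy adjacency matrix and feature dimensions are assumptions for demonstration only.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution step: ReLU(D^-1/2 (A + I) D^-1/2 H W)."""
    A_hat = A + np.eye(A.shape[0])              # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)      # linear transform + ReLU

# Toy text graph: 3 word nodes + 1 document node,
# edges from word co-occurrence and word-in-document relations.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [1, 1, 1, 0]], dtype=float)
H = np.eye(4)                                   # one-hot node features
W = np.random.default_rng(0).normal(size=(4, 2))  # learnable weights (random here)
out = gcn_layer(A, H, W)
print(out.shape)  # (4, 2): a 2-dimensional embedding per node
```

In the paper's setting, the graph would additionally carry edges derived from the semantic dictionary and syntactic dependencies, and the node embeddings would be combined with BERT/BiLSTM sequence features before classification.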

Keyword:

Bi-directional Long Short-Term Memory; ResNet; graph convolutional network; dependencies; text classification

Author Community:

  • [ 1 ] [Xue, Bingxin]Beijing Univ Technol, Fac Informat Technol, Beijing 100020, Peoples R China
  • [ 2 ] [Zhu, Cui]Beijing Univ Technol, Fac Informat Technol, Beijing 100020, Peoples R China
  • [ 3 ] [Wang, Xuan]Beijing Univ Technol, Fac Informat Technol, Beijing 100020, Peoples R China
  • [ 4 ] [Zhu, Wenjun]Beijing Univ Technol, Fac Informat Technol, Beijing 100020, Peoples R China


Source :

APPLIED SCIENCES-BASEL

Year: 2022

Issue: 16

Volume: 12

Impact Factor: 2.700 (JCR@2022)

ESI Discipline: ENGINEERING

ESI HC Threshold: 49

JCR Journal Grade: 2

CAS Journal Grade: 3

Cited Count:

WoS CC Cited Count: 6

ESI Highly Cited Papers on the List: 0
