Author:

Mu, Zhibo | Zheng, Shuang | Wang, Quanmin

Indexed by:

CPCI-S, EI, Scopus

Abstract:

To address the sparsity of Chinese text features and the mixing of long and short texts, which make it difficult to extract word-vector features, as well as the single convolution kernel and redundant parameters of traditional neural networks, the ACL-RoBERTa-CNN text classification model uses contrastive learning to learn a uniformly distributed vector representation, which regularizes the embedding space. The same sentence is fed through dropout twice to form "positive pairs", replacing traditional data augmentation. The contrastively trained RoBERTa pre-trained model then produces the word vectors, which are passed to a CNN layer where convolution kernels of different sizes capture word information of different lengths in each sample, and a Softmax classifier finally classifies the extracted features. Experimental results on two public datasets show that the classification performance of ACL-RoBERTa-CNN is better than that of TextCNN, TextRNN, LSTM-ATT, RoBERTa-LSTM, RoBERTa-CNN, and other deep learning text classification models.
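
By way of illustration, the following is a minimal PyTorch sketch of the pipeline described above: SimCSE-style contrastive learning that builds positive pairs from two dropout passes over a RoBERTa encoder, followed by a multi-kernel CNN and a linear layer whose logits feed a Softmax cross-entropy loss. The class name ACLRobertaCNN, the checkpoint hfl/chinese-roberta-wwm-ext, the kernel sizes (2, 3, 4), and all other hyperparameters are illustrative assumptions, not the authors' published configuration.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F
    from transformers import AutoModel

    class ACLRobertaCNN(nn.Module):
        def __init__(self, encoder_name="hfl/chinese-roberta-wwm-ext",
                     kernel_sizes=(2, 3, 4), num_filters=128, num_classes=10):
            super().__init__()
            self.encoder = AutoModel.from_pretrained(encoder_name)
            hidden = self.encoder.config.hidden_size
            # One 1-D convolution per kernel size, so n-grams of different
            # lengths are captured in the same forward pass.
            self.convs = nn.ModuleList(
                [nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes])
            self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

        def embed(self, input_ids, attention_mask):
            # Token-level word vectors from the (contrastively fine-tuned) encoder.
            out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
            return out.last_hidden_state  # (batch, seq_len, hidden)

        def forward(self, input_ids, attention_mask):
            x = self.embed(input_ids, attention_mask).transpose(1, 2)  # (B, H, T)
            # Max-over-time pooling per kernel size, then concatenate.
            feats = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
            return self.classifier(torch.cat(feats, dim=1))  # logits; Softmax is applied in the loss

    def contrastive_loss(model, input_ids, attention_mask, temperature=0.05):
        # SimCSE-style objective: encoding the same batch twice in train mode
        # applies two independent dropout masks, so each sentence yields two
        # slightly different vectors -- the "positive pairs" from the abstract.
        z1 = model.embed(input_ids, attention_mask)[:, 0]  # [CLS] vector, view 1
        z2 = model.embed(input_ids, attention_mask)[:, 0]  # [CLS] vector, view 2
        sim = F.cosine_similarity(z1.unsqueeze(1), z2.unsqueeze(0), dim=-1) / temperature
        # Each sentence's positive is its own second view; every other
        # sentence in the batch acts as an in-batch negative.
        labels = torch.arange(sim.size(0), device=sim.device)
        return F.cross_entropy(sim, labels)

A full training run would first fine-tune the encoder with contrastive_loss and then train the whole model on labelled data with cross-entropy over the CNN logits, which is where the Softmax classifier of the abstract appears.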

Keyword:

CNN; Text Classification; Contrastive Learning; RoBERTa

Author Community:

  • [ 1 ] [Mu, Zhibo]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 2 ] [Zheng, Shuang]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 3 ] [Wang, Quanmin]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China

Source:

2021 INTERNATIONAL CONFERENCE ON BIG DATA ENGINEERING AND EDUCATION (BDEE 2021)

Year: 2021

Page: 193-197

Cited Count:

WoS CC Cited Count: 3

SCOPUS Cited Count: 5

ESI Highly Cited Papers on the List: 0
