
Author:

Mu, Pengyu | He, Jingsha | Zhu, Nafei

Indexed by:

CPCI-S EI Scopus

Abstract:

At present, network pyramid schemes have become a major obstacle to social development. To curb the propagation of network pyramid schemes and effectively identify pyramid scheme text on the network, this study proposes a joint topic model, Paragraph Vector Latent Dirichlet Allocation (PV_LDA), based on the characteristics described in such texts: high yield, high rebates, hierarchical salary structures, and diverse text topics. The model uses the paragraph as the minimum processing unit to generate topic distribution matrices for "high interest rate" and "hierarchical salary" from network pyramid scheme text. Gibbs sampling is used to derive the "pyramid scheme" topic distribution matrix represented by these two features, which is then passed to a classifier; the classification accuracy on network pyramid scheme text reaches 86.25%. The conclusions show that the proposed topic model captures the characteristics of pyramid schemes more effectively.
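The abstract outlines a two-stage pipeline: a topic model turns each paragraph into a topic distribution, and a classifier labels the text from those topic features. The sketch below approximates that pipeline with scikit-learn's standard LDA (variational inference rather than the paper's Gibbs sampling) and a logistic-regression classifier; the corpus, labels, and topic count are hypothetical placeholders, not the authors' PV_LDA model or data.

```python
# Minimal sketch of the abstract's pipeline: topic distributions as features,
# then a classifier.  Standard LDA stands in for PV_LDA; data is hypothetical.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical paragraphs (the paper treats the paragraph as the minimum
# processing unit) and binary labels: 1 = pyramid-scheme text, 0 = normal.
paragraphs = [
    "join now and earn a high rebate on every member you recruit",
    "weekly market report on interest rates and bond yields",
    "hierarchical salary plan: your downline pays your upline every month",
    "the library will be closed for maintenance next weekend",
]
labels = [1, 0, 1, 0]

pipeline = make_pipeline(
    CountVectorizer(),                          # bag-of-words counts per paragraph
    LatentDirichletAllocation(n_components=2,   # per-paragraph topic distribution
                              learning_method="batch",
                              random_state=0),
    LogisticRegression(),                       # classifier over topic features
)
pipeline.fit(paragraphs, labels)
print(pipeline.predict(["earn high interest with our multi-level rebate plan"]))
```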

Keyword:

network pyramid scheme; topic model; topic mining; text classification

Author Community:

  • [ 1 ] [Mu, Pengyu]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 2 ] [He, Jingsha]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 3 ] [Zhu, Nafei]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China

Reprint Author's Address:

  • [Mu, Pengyu]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China


Source:

NLPIR 2019: 2019 3RD INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING AND INFORMATION RETRIEVAL

Year: 2019

Page: 15-19

Language: English

Cited Count:

WoS CC Cited Count: 1

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:

Affiliated Colleges:
