
Author:

Guo Xianjing | Li Yong

Indexed by:

CPCI-S, EI, Scopus

Abstract:

As an important basic task in natural language processing, textual entailment recognition has practical applications in question answering (QA), information retrieval (IR), information extraction, and many other tasks. Traditional approaches to textual entailment include classification methods based on hand-crafted features, methods based on word similarity, and so on; they require extensive manual feature extraction, rule construction, and the like. Deep neural networks avoid both the manual feature engineering of traditional machine learning methods and the error accumulation caused by NLP preprocessing tools. With the good results of deep learning in natural language processing, research on natural language inference and textual entailment recognition has also been increasing. Different tasks focus on different aspects of the same text; for the textual entailment recognition task, we usually focus on whether the sub-events in each text match. If a sentence can be decomposed into its related sub-events, and the sub-events of the hypothesis text can be judged to be contained in the sub-events of the premise text, then the entailment relation between the two sentences can be determined. The mLSTM model achieved an accuracy of 86.1% on the SNLI corpus, the best level at present, but it is less effective on other, smaller corpora. This article aims to improve the mLSTM model by establishing an mGRU model based on the GRU (Gated Recurrent Unit), and verifies the model's effect on the SNLI and MultiNLI corpora.
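The matching scheme the abstract describes — encode premise and hypothesis with a recurrent unit, then, for each hypothesis token, attend over the premise states and feed the attended context into a second "matching" GRU — can be sketched roughly as follows. This is a minimal NumPy sketch of the general mGRU-style word-by-word matching idea, not the authors' implementation; all dimensions, initializations, and function names are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def make_gru_params(input_dim, hidden_dim, rng):
    # Three gates: update (z), reset (r), candidate (n).
    W = rng.normal(0.0, 0.1, (3, hidden_dim, input_dim))   # input weights
    U = rng.normal(0.0, 0.1, (3, hidden_dim, hidden_dim))  # recurrent weights
    b = np.zeros((3, hidden_dim))
    return W, U, b

def gru_step(x, h, params):
    """One GRU step: h' = (1 - z) * h + z * n."""
    W, U, b = params
    z = sigmoid(W[0] @ x + U[0] @ h + b[0])
    r = sigmoid(W[1] @ x + U[1] @ h + b[1])
    n = np.tanh(W[2] @ x + U[2] @ (r * h) + b[2])
    return (1.0 - z) * h + z * n

def encode(seq, params, hidden_dim):
    """Run a GRU over a token-embedding sequence; return all hidden states."""
    h = np.zeros(hidden_dim)
    states = []
    for x in seq:
        h = gru_step(x, h, params)
        states.append(h)
    return np.stack(states)  # (seq_len, hidden_dim)

def mgru_match(premise, hypothesis, enc_params, match_params, hidden_dim):
    """Word-by-word matching: for each hypothesis state, attend over the
    premise states and feed [context; hypothesis state] into the match GRU."""
    P = encode(premise, enc_params, hidden_dim)     # premise states
    H = encode(hypothesis, enc_params, hidden_dim)  # hypothesis states
    m = np.zeros(hidden_dim)
    for h_k in H:
        attn = softmax(P @ h_k)   # attention weights over premise tokens
        context = attn @ P        # weighted premise summary
        m = gru_step(np.concatenate([context, h_k]), m, match_params)
    return m  # final matching vector; a classifier head would sit on top

# Tiny example with hypothetical dimensions (random embeddings stand in
# for real word vectors).
rng = np.random.default_rng(0)
embed_dim, hidden_dim = 8, 16
enc_params = make_gru_params(embed_dim, hidden_dim, rng)
match_params = make_gru_params(2 * hidden_dim, hidden_dim, rng)
premise = rng.normal(size=(5, embed_dim))     # 5 premise tokens
hypothesis = rng.normal(size=(3, embed_dim))  # 3 hypothesis tokens
m = mgru_match(premise, hypothesis, enc_params, match_params, hidden_dim)
```

In the paper's setting, the entailment/contradiction/neutral decision would be made by a softmax classifier over the final matching vector `m`; that head is omitted here.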

Keyword:

GRU; mGRU; Textual Entailment Recognition

Author Community:

  • [ 1 ] [Guo Xianjing]Beijing Univ Technol, Beijing, Peoples R China
  • [ 2 ] [Li Yong]Beijing Univ Technol, Beijing, Peoples R China

Reprint Author's Address:

  • [Guo Xianjing]Beijing Univ Technol, Beijing, Peoples R China


Source:

2018 THE 10TH INTERNATIONAL CONFERENCE ON INFORMATION AND MULTIMEDIA TECHNOLOGY (ICIMT 2018)

Year: 2018

Page: 282-285

Language: English

Cited Count:

WoS CC Cited Count: 1

SCOPUS Cited Count: 2

ESI Highly Cited Papers on the List: 0
