
Author:

Jiang, ZongLi (蒋宗礼) | Zhang, Shuo

Indexed by:

EI; Scopus

Abstract:

Traditional end-to-end task-oriented dialogue models are based on complex recurrent or convolutional neural networks that include an encoder and a decoder. When the data are large and the questions span many types, however, these models perform poorly: their memory cannot effectively record all of the sentence information in the context. To overcome these problems in dialogue tasks, this article uses a modified Transformer model. The Transformer is constructed entirely from attention mechanisms, completely discarding recurrence (RNNs); its structure consists of two sub-parts, an encoder and a decoder. It builds the model from residual connections, layer normalization, and self-attention, and uses positional encoding to capture word-order information, which speeds up training convergence and captures longer-range sentence information. In this paper, we modify the activation function in the Transformer and use label smoothing to optimize training, making the model's expressive ability better than before. © 2019 Published under licence by IOP Publishing Ltd.
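The abstract names the main ingredients of the model: self-attention with residual connections and layer normalization, sinusoidal positional encoding, a modified activation function, and label smoothing. The PyTorch sketch below is a minimal illustration of how those pieces fit together, not the authors' code: the layer sizes are arbitrary, GELU stands in for the paper's unspecified modified activation, and the 0.1 smoothing factor is an assumed value.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Sinusoidal positional encoding as in 'Attention Is All You Need'."""
    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        pe = torch.zeros(max_len, d_model)
        position = torch.arange(0, max_len, dtype=torch.float).unsqueeze(1)
        div_term = torch.exp(torch.arange(0, d_model, 2).float()
                             * (-math.log(10000.0) / d_model))
        pe[:, 0::2] = torch.sin(position * div_term)
        pe[:, 1::2] = torch.cos(position * div_term)
        self.register_buffer("pe", pe.unsqueeze(0))  # (1, max_len, d_model)

    def forward(self, x):                 # x: (batch, seq_len, d_model)
        return x + self.pe[:, : x.size(1)]

class EncoderLayer(nn.Module):
    """One encoder layer: self-attention and a feed-forward block, each
    wrapped in a residual connection followed by layer normalization."""
    def __init__(self, d_model=256, n_heads=4, d_ff=1024,
                 activation=nn.GELU):     # GELU is an assumption; the paper
        super().__init__()                # does not specify its activation.
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), activation(),
                                nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        a, _ = self.attn(x, x, x)         # self-attention: Q = K = V = x
        x = self.norm1(x + a)             # residual + layer norm
        return self.norm2(x + self.ff(x))

# Label smoothing, applied via PyTorch's built-in cross-entropy option.
criterion = nn.CrossEntropyLoss(label_smoothing=0.1)

# Smoke test with hypothetical sizes (vocab 1000, batch 2, length 10).
emb = nn.Embedding(1000, 256)
enc = nn.Sequential(PositionalEncoding(256), EncoderLayer())
tokens = torch.randint(0, 1000, (2, 10))
hidden = enc(emb(tokens))                 # (2, 10, 256)
logits = nn.Linear(256, 1000)(hidden)     # stand-in for a decoder head
# Input tokens reused as placeholder targets, purely for the smoke test.
loss = criterion(logits.reshape(-1, 1000), tokens.reshape(-1))
```

With label_smoothing=0.1, the loss assigns 0.9 to the correct class and spreads the remaining 0.1 over the vocabulary, which regularizes the output distribution in the way the abstract describes.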

Keyword:

Intelligent computing; Decoding; Convolutional neural networks; Signal encoding; Recurrent neural networks

Author Community:

  • [ 1 ] [Jiang, ZongLi]Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 2 ] [Zhang, Shuo]Faculty of Information Technology, Beijing University of Technology, Beijing, China

Reprint Author's Address:

  • Jiang, ZongLi (蒋宗礼)

    [Jiang, ZongLi] Faculty of Information Technology, Beijing University of Technology, Beijing, China

Source: Journal of Physics: Conference Series

ISSN: 1742-6588

Year: 2020

Issue: 1

Volume: 1544

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:
