Authors:

Zhao, Jianyu | Zhan, Zhiqiang | Li, Tong | Li, Rang | Hu, Changjian | Wang, Siyun | Zhang, Yang

Indexed in:

EI Scopus SCIE

Abstract:

Table-to-Text generation aims to generate descriptions for factual tables, which can be viewed as sets of field-value records. Despite significant progress, state-of-the-art models suffer from two major issues: Nonfluency and Divergence. Nonfluency means that descriptions generated by models are not as fluent as those written by humans and can thus be distinguished easily. Divergence refers to the fact that generated sentences contain information which cannot be concluded from the factual tables. This can be attributed to the fact that most neural models are trained with the Maximum Likelihood Estimation (MLE) loss and use divergence-contained references as the ground truth, which to some extent forces the models to learn what cannot be inferred from the source. Motivated by these limitations, we propose a novel GAN-based model with an adversarial learning mechanism, which simultaneously trains a generative model G and a discriminative model D, to address the Nonfluency and Divergence issues in Table-to-Text generation. Specifically, we build the generator G as a reinforcement learning agent with a sequence-to-sequence architecture, which takes the raw table data as input and predicts the generated sentences. Meanwhile, we build the discriminator D with a Convolutional Neural Network (CNN) to calculate rewards that measure the fluency of the generations. To judge the fidelity of the generations with regard to the original table more accurately, we also calculate rewards from BLEU-Table. With the fused rewards from the CNN and BLEU-Table, our methods outperform the baselines by a large margin on the WikiBio and Wiki3C benchmarks evaluated with BLEU, ROUGE, and PARENT. In particular, our models achieve 49.0 (BLEU-4), 37.8 (ROUGE-4), and 45.4 (PARENT) on WikiBio, as well as 12.9 (BLEU-4) and 6.9 (ROUGE-4) on Wiki3C. More importantly, we construct a new Wiki3C dataset that alleviates the shortage of suitable datasets and promotes progress in Table-to-Text generation. (c) 2021 Elsevier B.V. All rights reserved.
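
As a rough illustration of the fused-reward scheme described in the abstract, the Python sketch below combines a fluency reward from a (stubbed) CNN discriminator with a fidelity reward computed against the table's field values. It is only a minimal sketch, not the authors' implementation: the function names (discriminator_score, bleu_table_reward), the simplified n-gram overlap used in place of the paper's BLEU-Table metric, and the fusion weight lam are illustrative assumptions.

```python
# Minimal sketch of fusing a fluency reward (discriminator) with a fidelity
# reward (table-overlap) for a Table-to-Text generator. All names and the
# mixing weight are illustrative assumptions, not the paper's actual code.
from collections import Counter
from typing import Dict, List


def bleu_table_reward(generated: List[str], table: Dict[str, str], max_n: int = 2) -> float:
    """Fidelity reward: average modified n-gram precision of the generated
    tokens against the tokens of the table's field values (a rough stand-in
    for the BLEU-Table reward mentioned in the abstract)."""
    table_tokens = [tok for value in table.values() for tok in value.split()]
    total, orders = 0.0, 0
    for n in range(1, max_n + 1):
        gen = Counter(tuple(generated[i:i + n]) for i in range(len(generated) - n + 1))
        ref = Counter(tuple(table_tokens[i:i + n]) for i in range(len(table_tokens) - n + 1))
        if not gen:
            continue
        matched = sum(min(count, ref[gram]) for gram, count in gen.items())
        total += matched / sum(gen.values())
        orders += 1
    return total / orders if orders else 0.0


def discriminator_score(generated: List[str]) -> float:
    """Fluency reward: placeholder for the probability a trained text-CNN
    discriminator would assign to the sentence being human-written (stub)."""
    return 0.8


def fused_reward(generated: List[str], table: Dict[str, str], lam: float = 0.5) -> float:
    """Fuse fluency and fidelity rewards; `lam` is an assumed mixing weight."""
    return lam * discriminator_score(generated) + (1.0 - lam) * bleu_table_reward(generated, table)


if __name__ == "__main__":
    table = {"name": "Ada Lovelace", "occupation": "mathematician", "born": "1815"}
    sentence = "Ada Lovelace was a mathematician born in 1815 .".split()
    print(f"fused reward: {fused_reward(sentence, table):.3f}")  # e.g. ~0.54 with the stub
```

In a full adversarial setup of the kind the abstract describes, the fused value would serve as the reward that scales a policy-gradient (REINFORCE-style) update of the sequence-to-sequence generator, while the discriminator is trained in parallel to separate human-written from generated descriptions.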

Keywords:

Table-to-Text generation; Natural language generation; Generative adversarial network

Author Affiliations:

  • [ 1 ] [Zhao, Jianyu]Lenovo Res, AI Lab, Beijing, Peoples R China
  • [ 2 ] [Li, Rang]Lenovo Res, AI Lab, Beijing, Peoples R China
  • [ 3 ] [Hu, Changjian]Lenovo Res, AI Lab, Beijing, Peoples R China
  • [ 4 ] [Zhan, Zhiqiang]Lenovo Res, Smart Educ Lab, Beijing, Peoples R China
  • [ 5 ] [Zhang, Yang]Lenovo Res, Smart Educ Lab, Beijing, Peoples R China
  • [ 6 ] [Li, Tong]Beijing Univ Technol, Comp Sci, Beijing, Peoples R China
  • [ 7 ] [Wang, Siyun]Univ Southern Calif, Los Angeles, CA 90089 USA

Corresponding Author:

  • [Zhang, Yang]Lenovo Res, Smart Educ Lab, Beijing, Peoples R China

Source:

NEUROCOMPUTING

ISSN: 0925-2312

Year: 2021

Volume: 452

Pages: 28-36

Impact Factor (JCR@2022): 6.000

ESI Discipline: COMPUTER SCIENCE

ESI Highly Cited Threshold: 87

JCR Quartile: 2

Citation Counts:

WoS Core Collection Citations: 8

Scopus Citations: 6

ESI Highly Cited Papers Listed: 0

Views in the Last 30 Days: 1
