
Authors:

Ji, Zongcheng (Ji, Zongcheng.) | Xiao, Yinlong (Xiao, Yinlong.)

Indexed in:

EI, Scopus

Abstract:

The Flat-LAttice Transformer (FLAT) has achieved notable success in Chinese named entity recognition (NER) by integrating lexical information into the widely used Transformer encoder. FLAT enhances each sentence by constructing a flat lattice, a token sequence containing the characters and the matched lexicon words, and calculating self-attention among the tokens. However, FLAT faces a quadratic complexity challenge, especially with lengthy sentences containing numerous matched words, which significantly increases memory and computational costs. To alleviate this issue, we propose a novel lightweight lexicon-enhanced Transformer (LLET) for Chinese NER. Specifically, we introduce two distinct variants in which characters attend to characters and words, either jointly or separately. Experiments on four public Chinese NER datasets show that both variants achieve significant memory savings while maintaining performance comparable to FLAT. © 2024 IEEE.
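
The abstract contrasts full lattice self-attention with character-only queries. Below is a minimal, hypothetical sketch (not the authors' released code) of that idea: it builds a FLAT-style flat lattice for a toy sentence, then compares the attention-score shapes when every lattice token is a query (FLAT) versus when only the characters act as queries over all lattice tokens (one possible reading of LLET's character-to-characters-and-words attention). Names such as build_flat_lattice are invented for illustration, and the relative position encoding of FLAT is omitted.

```python
import torch
import torch.nn.functional as F

def build_flat_lattice(sentence, lexicon):
    """Return the flat-lattice token list: every character, followed by
    every lexicon word matched as a contiguous substring, each token
    tagged with its (head, tail) character span as in FLAT."""
    tokens = [(ch, i, i) for i, ch in enumerate(sentence)]
    n = len(sentence)
    for head in range(n):
        for tail in range(head + 1, n):
            word = sentence[head:tail + 1]
            if word in lexicon:
                tokens.append((word, head, tail))
    return tokens

sentence = "南京市长江大桥"
lexicon = {"南京", "南京市", "市长", "长江", "长江大桥", "大桥"}
tokens = build_flat_lattice(sentence, lexicon)
n_char = len(sentence)   # 7 characters
n_tok = len(tokens)      # 7 characters + 6 matched words = 13 lattice tokens

d = 16
h = torch.randn(n_tok, d)  # toy embeddings for all lattice tokens
q = k = v = h

# FLAT-style: every lattice token attends to every lattice token,
# so the score matrix is (n_tok x n_tok) -- quadratic in the lattice size.
flat_scores = q @ k.T / d ** 0.5             # (13, 13)
flat_out = F.softmax(flat_scores, dim=1) @ v

# LLET-style sketch: only the characters act as queries, while characters
# and words together remain the keys/values, shrinking the score matrix
# to (n_char x n_tok).
llet_scores = q[:n_char] @ k.T / d ** 0.5    # (7, 13)
llet_out = F.softmax(llet_scores, dim=1) @ v

print(flat_scores.shape, llet_scores.shape)
# torch.Size([13, 13]) torch.Size([7, 13])
```

Since word tokens never serve as queries, the score matrix shrinks from (n_tok x n_tok) to (n_char x n_tok); the more matched words a long sentence contains, the larger the saving, which matches the memory reduction the abstract reports. The "separately" variant described in the abstract would presumably split the keys into a character set and a word set and compute the two attentions independently before combining them.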

Author affiliations:

  • [1] [Ji, Zongcheng] PAII Inc, California, United States
  • [2] [Xiao, Yinlong] Beijing University of Technology, Beijing, China

Source:

ISSN: 1520-6149

Year: 2024

Pages: 12677-12681

Language: English
