
Author:

Hou, Xinyu | Zhu, Cui | Zhu, Wenjun

Indexed by:

EI Scopus

Abstract:

In recent years, numerous models have attempted to improve the performance of the Transformer on Chinese NER tasks. The model can be enhanced in two ways: combining it with lexicon augmentation techniques, or optimizing the Transformer architecture itself. Research suggests that fully connected self-attention scatters the attention distribution, which accounts for the weaker performance of the original Transformer with self-attention. In this paper, we focus on optimizing the Transformer model, in particular its attention layer, and propose a novel attention mechanism, Dilated Shift Window Attention (DSWA), to address this problem. By using window attention, the method improves the model's capacity to handle local information, while the window dilation mechanism allows it to still manage long texts and long-distance dependencies. Experiments on various datasets show that replacing fully connected self-attention with DSWA improves the model's performance on Chinese NER tasks. Copyright © 2023 by KSI Research Inc. and Knowledge Systems Institute, USA.
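The abstract does not reproduce the paper's exact formulation of DSWA, so the following is only a minimal sketch of the general idea it describes: restricting self-attention to a local window whose attended positions are spaced apart by a dilation factor, so that local context stays sharp while some long-distance positions remain reachable. The class name and the window_size and dilation parameters are illustrative assumptions, and the "shift" component of DSWA is omitted.

```python
# Hedged sketch only: not the authors' implementation of Dilated Shift Window
# Attention. Shows window attention with a dilation factor; the shift step is
# omitted and all parameter names are hypothetical.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedWindowAttention(nn.Module):
    def __init__(self, d_model, n_heads, window_size=8, dilation=2):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.window_size = window_size
        self.dilation = dilation
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):
        # x: (batch, seq_len, d_model)
        B, T, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)

        def split(t):  # (B, T, d_model) -> (B, heads, T, d_head)
            return t.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
        q, k, v = split(q), split(k), split(v)

        # Banded, dilated mask: token i may attend to token j only if
        # |i - j| <= window_size * dilation and (j - i) is a multiple of
        # dilation, i.e. a local window whose slots are `dilation` apart.
        idx = torch.arange(T, device=x.device)
        dist = idx[None, :] - idx[:, None]                 # (T, T), j - i
        mask = (dist.abs() <= self.window_size * self.dilation) & \
               ((dist % self.dilation) == 0)

        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        scores = scores.masked_fill(~mask, float('-inf'))
        attn = F.softmax(scores, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, T, -1)
        return self.out(out)
```

Used as a drop-in replacement for a standard multi-head self-attention layer (e.g. DilatedWindowAttention(d_model=256, n_heads=8)(x) for x of shape (batch, seq_len, 256)), such a layer keeps the attention distribution concentrated on nearby tokens while the dilation keeps a path to more distant positions, which is the trade-off the abstract attributes to DSWA.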

Keyword:

Software engineering; Knowledge engineering

Author Community:

  • [ 1 ] Hou, Xinyu, School of Computer Science, Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 2 ] Zhu, Cui, School of Computer Science, Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 3 ] Zhu, Wenjun, School of Computer Science, Faculty of Information Technology, Beijing University of Technology, Beijing, China

Reprint Author's Address:

Email:


Related Keywords:

Related Article:

Source:

Year: 2023

Page: 51-57

Language: English

Cited Count:

WoS CC Cited Count:

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count:

Chinese Cited Count:


Affiliated Colleges:
