Indexed in:
Abstract:
Relation extraction is fundamental to knowledge graph construction. Recent studies have shown that Graph Convolutional Networks can improve relation extraction by leveraging dependency trees. However, noise in automatically generated dependency trees makes it difficult to use syntactic dependency information effectively. In this paper, we propose an Adaptive Graph Attention Network model based on Dependency Type and Direction information, which reduces this noise through direction-aware dependency relations and thereby improves extraction performance. Specifically, we propose an adaptive graph attention mechanism that constructs direction-aware adjacency matrices to precisely aggregate dependency pairs centred on entities as head words, filtering out the noise introduced when entities act as dependent words. Moreover, the model dynamically allocates weights to different dependency types according to their directions; this adaptive allocation strengthens the learned entity representations and improves the encoding and extraction of entity information. Experimental results on two English benchmark datasets demonstrate that introducing direction information significantly improves the model's performance, validating the efficacy of direction-aware encoding for reducing dependency noise in relation extraction.
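To make the mechanism described in the abstract concrete, the following is a minimal, illustrative sketch (not the authors' released code) of a direction-aware graph attention layer over a dependency tree. The layer name, tensor layout, and the per-(type, direction) scalar weights are assumptions introduced here for illustration only; the paper's actual formulation may differ.

```python
# Illustrative sketch, assuming: dependency types indexed 0..num_dep_types-1,
# an adjacency matrix marking dependency edges, and a learnable scalar weight
# for every (dependency type, direction) pair that rescales attention scores.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DirectionAwareGraphAttention(nn.Module):
    def __init__(self, hidden_dim: int, num_dep_types: int):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        # One learnable weight per dependency type and edge direction
        # (0 = head -> dependent, 1 = dependent -> head).
        self.type_dir_weight = nn.Parameter(torch.ones(num_dep_types, 2))

    def forward(self, h, adj, dep_type, direction):
        # h:         (batch, n, hidden)  token representations
        # adj:       (batch, n, n)       1 where a dependency edge exists
        # dep_type:  (batch, n, n)       long tensor of dependency-type ids
        # direction: (batch, n, n)       0/1 long tensor of edge directions
        scores = torch.matmul(self.query(h), self.key(h).transpose(-1, -2))
        scores = scores / h.size(-1) ** 0.5
        # Rescale each edge's score by its (type, direction) weight, so the
        # model can down-weight noisy dependency types in a given direction.
        scores = scores * self.type_dir_weight[dep_type, direction]
        # Keep only real dependency edges before normalising.
        scores = scores.masked_fill(adj == 0, float("-inf"))
        attn = F.softmax(scores, dim=-1)
        attn = torch.nan_to_num(attn)  # rows with no edges become all zeros
        return torch.matmul(attn, self.value(h))
```

In this sketch, direction awareness comes solely from indexing the weight table with both the dependency type and the edge direction; any other way of injecting directionality (e.g. separate projections per direction) would be an equally plausible reading of the abstract.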
Keywords:
Corresponding author:
Email address:
Source:
DOCUMENT ANALYSIS AND RECOGNITION-ICDAR 2024, PT IV
ISSN: 0302-9743
Year: 2024
Volume: 14807
Pages: 287-305
Affiliated department: