A Document-Level Relation Extraction Framework with Dynamic Pruning

Publication Name

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract

Relation extraction (RE) is a fundamental task in natural language processing (NLP): it identifies semantic relations between entity pairs in text. Because sentence-level RE captures only intra-sentence connections and misses inter-sentence ones, researchers have shifted their attention to document-level RE to obtain richer and more complex relations, which may involve logical inference. Prior work on document-level RE suffers from inflexible pruning rules and a lack of sentence-level features, both of which discard valuable information. In this paper, we propose a document-level relation extraction framework with both a dynamic pruning mechanism and sentence-level attention. Specifically, a weight-based flexible pruning mechanism is applied to the document-level dependency tree to remove non-relational edges dynamically and obtain the weight dependency tree (WDT). A graph convolutional network (GCN) is then employed to learn syntactic representations of the WDT. Furthermore, a sentence-level attention and gating selection module is applied to capture the intrinsic interactions between sentence-level and document-level features. We evaluate our framework on three benchmark datasets: DocRED, CDR, and GDA. Experimental results demonstrate that our approach outperforms the baselines and achieves state-of-the-art performance.
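The abstract's pipeline of weight-based pruning followed by graph convolution can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: the edge weights, the fixed threshold, and the single symmetrically normalized GCN layer are all assumptions standing in for the framework's learned scoring and pruning modules.

```python
import numpy as np

np.random.seed(0)

# Toy document-level dependency graph over 5 tokens. The edge weights
# here are random placeholders; in the paper they would come from a
# trained relevance-scoring component (an assumption for illustration).
n, d = 5, 4
W_edge = np.random.rand(n, n)
W_edge = (W_edge + W_edge.T) / 2   # treat the graph as undirected
np.fill_diagonal(W_edge, 1.0)      # self-loops keep every node connected

# Dynamic pruning: drop edges whose weight falls below a threshold,
# yielding a weighted, pruned graph (the WDT in the abstract's terms).
threshold = 0.5
A = np.where(W_edge >= threshold, W_edge, 0.0)

# One standard GCN layer over the pruned weighted graph:
# H' = ReLU(D^{-1/2} A D^{-1/2} H W)
H = np.random.rand(n, d)           # initial token representations
W = np.random.rand(d, d)           # layer parameters
deg = A.sum(axis=1)                # weighted degrees (>= 1 via self-loops)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
H_next = np.maximum(D_inv_sqrt @ A @ D_inv_sqrt @ H @ W, 0.0)

print(H_next.shape)  # (5, 4): one updated representation per token
```

In the full framework these per-token representations would then feed the sentence-level attention and gating selection module; here they simply show how pruning and graph convolution compose.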

Open Access Status

This publication is not available as open access

Volume

14261 LNCS

First Page

13

Last Page

25

Funding Number

61877051

Funding Sponsor

National Natural Science Foundation of China


Link to publisher version (DOI)

http://dx.doi.org/10.1007/978-3-031-44198-1_2