
A Document-Level Relation Extraction Framework with Dynamic Pruning

journal contribution
posted on 2024-11-17, 13:32 authored by Hanyue Zhang, Li Li, Jun Shen
Relation extraction (RE) is a fundamental task in natural language processing (NLP), as it identifies semantic relations among entity pairs in text. Because sentence-level RE captures only intra-sentence connections rather than connections between or among sentences, researchers have shifted their attention to document-level RE to obtain richer and more complex relations that may involve logical inference. Prior work on document-level RE suffers from inflexible pruning rules and a lack of sentence-level features, which leads to the loss of valuable information. In this paper, we propose a document-level relation extraction framework with both a dynamic pruning mechanism and sentence-level attention. Specifically, a weight-based flexible pruning mechanism is applied to the document-level dependency tree to remove non-relational edges dynamically and obtain the weight dependency tree (WDT). Moreover, a graph convolutional network (GCN) is then employed to learn syntactic representations of the WDT. Furthermore, a sentence-level attention and gating selection module is applied to capture the intrinsic interactions between sentence-level and document-level features. We evaluate our framework on three benchmark datasets: DocRED, CDR, and GDA. Experimental results demonstrate that our approach outperforms the baselines and achieves state-of-the-art performance.
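
The abstract describes the pipeline only at a high level. The sketch below illustrates, in PyTorch, one plausible way to combine weight-based edge pruning over a document-level dependency graph with a single graph-convolution layer. It is a minimal illustration under stated assumptions, not the authors' implementation: the class and parameter names (WeightedPruningGCN, edge_scorer, prune_threshold) and the bilinear edge-scoring choice are hypothetical.

```python
# Hypothetical sketch (not the paper's released code): score each dependency
# edge, prune low-weight edges, and run one GCN layer over the weighted graph.
import torch
import torch.nn as nn


class WeightedPruningGCN(nn.Module):
    """Assigns a learned weight to every dependency edge, drops edges whose
    weight falls below a threshold, and propagates node features with a GCN."""

    def __init__(self, hidden_dim: int, prune_threshold: float = 0.1):
        super().__init__()
        # Bilinear scorer: edge weight from the two endpoint representations.
        self.edge_scorer = nn.Bilinear(hidden_dim, hidden_dim, 1)
        self.gcn = nn.Linear(hidden_dim, hidden_dim)
        self.prune_threshold = prune_threshold

    def forward(self, node_feats: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_feats: (n, d) token representations; adj: (n, n) 0/1 dependency edges.
        n = node_feats.size(0)
        src = node_feats.unsqueeze(1).expand(n, n, -1).reshape(n * n, -1)
        dst = node_feats.unsqueeze(0).expand(n, n, -1).reshape(n * n, -1)
        weights = torch.sigmoid(self.edge_scorer(src, dst)).view(n, n)
        # Dynamic pruning: keep a dependency edge only if its learned weight is large enough.
        pruned = torch.where(weights >= self.prune_threshold,
                             weights * adj,
                             torch.zeros_like(weights))
        # One GCN layer over the (row-normalised) weighted dependency graph.
        norm = pruned.sum(dim=-1, keepdim=True).clamp(min=1e-6)
        return torch.relu(self.gcn((pruned / norm) @ node_feats))


# Toy usage: six tokens in a dependency chain.
x = torch.randn(6, 64)
adj = torch.eye(6) + torch.diag(torch.ones(5), 1)
out = WeightedPruningGCN(64)(x, adj)  # (6, 64) syntactic representations
```

In this reading, "dynamic" pruning means the kept edge set depends on learned weights rather than a fixed rule (e.g., shortest dependency path), so edges informative for a given entity pair can survive even when a static heuristic would discard them.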

Funding

National Natural Science Foundation of China (61877051)

History

Journal title

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Volume

14261 LNCS

Pagination

13-25

Language

English
