Document-Level Relation Extraction with Deep Gated Graph Reasoning
International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems (IF 1.5) Pub Date: 2024-03-20, DOI: 10.1142/s0218488524400063
Zeyu Liang

Extracting the relation between two entities at the sentence level has drawn increasing attention in recent years, but document-level relation extraction still faces great challenges due to the inherent difficulty of recognizing relations between entities that span multiple sentences. Previous work shows that graph convolutional neural networks can help the model capture unstructured dependency information among entities. However, these methods usually build the correlation weight matrix from non-adaptive edge weights, which suffers from information redundancy and vanishing gradients. To solve this problem, we propose a deep gated graph reasoning model for document-level relation extraction, namely BERT-GGNNs, which employs an improved gated graph neural network with a learnable correlation weight matrix to build multiple deep gated graph reasoning layers. The proposed deep gated graph reasoning layers make it easier for the model to reason about the relations between entities hidden in the document. Experiments show that the proposed model outperforms most strong baseline models; in particular, it is 0.3% and 0.3% higher than the well-known LSR-BERT model on F1 and Ign F1, respectively.
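To make the gated graph reasoning step concrete, the following is a minimal PyTorch sketch of one such layer. It is not the authors' implementation: the learnable correlation weight matrix is approximated here by a scaled dot-product over projected node features, the gated update reuses a standard GRU cell as in gated graph neural networks, and the class name, hidden size, and number of stacked layers are illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedGraphReasoningLayer(nn.Module):
    # One gated graph reasoning layer over a fully connected entity graph.
    # The correlation weights are learned from the node features themselves
    # rather than taken from fixed, non-adaptive edge weights.
    def __init__(self, hidden_dim):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.message = nn.Linear(hidden_dim, hidden_dim)
        self.gru = nn.GRUCell(hidden_dim, hidden_dim)  # gated update, as in GGNNs

    def forward(self, nodes):
        # nodes: (num_nodes, hidden_dim), e.g. entity/mention vectors from BERT.
        d = nodes.size(-1)
        # Learnable, input-dependent correlation weight matrix; the row-wise
        # softmax keeps the weights normalized so redundant edges are down-weighted.
        scores = self.query(nodes) @ self.key(nodes).t() / d ** 0.5
        weights = F.softmax(scores, dim=-1)                    # (n, n)
        # Weighted message passing followed by a gated (GRU-style) update,
        # which helps mitigate vanishing gradients when stacking many layers.
        messages = weights @ self.message(nodes)               # (n, d)
        return self.gru(messages, nodes)

# Stacking several layers gives the "deep" reasoning over the document graph
# (the depth of 3 and hidden size of 768 are assumptions, not the paper's values).
layers = nn.ModuleList([GatedGraphReasoningLayer(768) for _ in range(3)])
entity_reps = torch.randn(5, 768)   # 5 entity nodes encoded by BERT (dummy data)
for layer in layers:
    entity_reps = layer(entity_reps)
print(entity_reps.shape)            # torch.Size([5, 768])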




Updated: 2024-03-21