Augmenting Trigger Semantics to Improve Event Coreference Resolution
Journal of Computer Science and Technology (IF 1.9), Pub Date: 2023-05-30, DOI: 10.1007/s11390-022-1143-8
Min Huan, Sheng Xu, Pei-Feng Li

Due to the small size of annotated corpora and the sparsity of event trigger words, event coreference resolvers cannot capture enough event semantics, especially trigger semantics, to identify coreferential event mentions. To address these issues, this paper proposes a trigger semantics augmentation mechanism to boost event coreference resolution. First, the mechanism applies a trigger-oriented masking strategy to pre-train a BERT (Bidirectional Encoder Representations from Transformers)-based encoder, Trigger-BERT, which is fine-tuned on Gigaword, a large-scale unlabeled corpus. Second, it combines the event semantic relations from the Trigger-BERT encoder with the event interactions from a soft-attention mechanism to resolve event coreference. Experimental results on both the KBP2016 and KBP2017 datasets show that the proposed model outperforms several state-of-the-art baselines.
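The core idea of the trigger-oriented masking strategy is to bias BERT's masked-language-modeling objective toward event triggers rather than uniformly sampled tokens. The sketch below illustrates that idea only; the function name, the simplified masking split, and the example sentence are illustrative assumptions, not the paper's actual implementation.

```python
import random

def trigger_oriented_mask(tokens, trigger_positions,
                          mask_token="[MASK]", mask_prob=0.8):
    """Sketch of a trigger-oriented masking strategy (hypothetical helper).

    Instead of BERT's uniform random masking, preferentially mask the
    event trigger tokens so the MLM objective concentrates on trigger
    semantics. The 80% replace rate loosely mirrors BERT's 80/10/10
    scheme, simplified here for illustration.
    """
    masked = list(tokens)
    labels = [None] * len(tokens)  # non-None entries are prediction targets
    for pos in trigger_positions:
        labels[pos] = tokens[pos]          # model must recover the trigger
        if random.random() < mask_prob:
            masked[pos] = mask_token       # hide the trigger from the encoder
    return masked, labels

# Illustrative use: "struck" is the event trigger in this sentence.
tokens = ["the", "earthquake", "struck", "the", "city"]
masked, labels = trigger_oriented_mask(tokens, trigger_positions=[2],
                                       mask_prob=1.0)
```

With `mask_prob=1.0` every trigger is replaced by `[MASK]`, while non-trigger tokens are left intact, so the loss is computed only at trigger positions — this is what concentrates the pre-training signal on trigger semantics.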


