Open Access
Event Temporal Relation Extraction with Attention Mechanism and Graph Neural Network
Tsinghua Science and Technology 2022, 27(1): 79-90
Published: 17 August 2021
Abstract

Event temporal relation extraction is an important task in natural language processing. With the development of deep learning, many models have been applied to this task. However, most existing methods cannot accurately capture the degree of association between different tokens and events, nor can they effectively integrate event-related information. In this paper, we propose an event information integration model that integrates event information through a multilayer bidirectional long short-term memory (Bi-LSTM) network and an attention mechanism. Although this scheme improves extraction performance, it can be optimized further. To this end, we also propose a novel relational graph attention network that incorporates edge attributes. In this approach, we first build a semantic dependency graph through dependency parsing, then model the graph, taking the edges' attributes into account, with a top-k attention mechanism to learn hidden semantic contextual representations, and finally predict event temporal relations. We evaluate the proposed models on the TimeBank-Dense dataset. Compared with previous baselines, the Micro-F1 scores obtained by our models improve by 3.9% and 14.5%, respectively.
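To make the graph-based part of the abstract concrete, the sketch below shows one way a top-k attention layer over a dependency graph with edge-attribute embeddings could look. This is not the authors' implementation; the class name, the edge-bias formulation, and parameters such as `edge_dim` and `k` are illustrative assumptions only.

```python
# Illustrative sketch (not the paper's code): one top-k attention layer over a
# dependency graph whose edges carry attribute (dependency-label) embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKEdgeAttention(nn.Module):
    """Each token attends to its graph neighbours; attention scores are biased
    by edge-attribute embeddings, and only the top-k neighbours are kept."""

    def __init__(self, hidden_dim: int, edge_dim: int, k: int = 3):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.edge_bias = nn.Linear(edge_dim, 1)  # scalar bias from edge attributes
        self.k = k

    def forward(self, h, adj, edge_attr):
        # h:         (n, hidden_dim)    token representations (e.g., from a Bi-LSTM)
        # adj:       (n, n)             1 where a dependency edge exists, else 0
        # edge_attr: (n, n, edge_dim)   embedding of the dependency label per edge
        scores = self.query(h) @ self.key(h).t() / h.size(-1) ** 0.5
        scores = scores + self.edge_bias(edge_attr).squeeze(-1)   # edge-aware bias
        scores = scores.masked_fill(adj == 0, float("-inf"))      # restrict to graph

        # Keep only the k highest-scoring neighbours per token.
        k = min(self.k, scores.size(-1))
        top_vals, top_idx = scores.topk(k, dim=-1)
        pruned = torch.full_like(scores, float("-inf")).scatter(-1, top_idx, top_vals)

        attn = F.softmax(pruned, dim=-1)
        attn = torch.nan_to_num(attn)      # isolated tokens get all-zero weights
        return attn @ h                    # (n, hidden_dim) updated representations
```

In such a layer, the top-k pruning limits each token's context to its strongest dependency neighbours, which is one plausible reading of the "top-k attention mechanism" described above.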
