Event Presence Prediction Helps Trigger Detection Across Languages

2020-09-15 15:52:21
Parul Awasthy, Tahira Naseem, Jian Ni, Taesun Moon, Radu Florian

Abstract

The task of event detection and classification is central to most information retrieval applications. We show that a Transformer-based architecture can effectively model event extraction as a sequence labeling task. We propose a combination of sentence-level and token-level training objectives that significantly boosts the performance of a BERT-based event extraction model. Our approach achieves a new state-of-the-art performance on ACE 2005 data for English and Chinese. We also test our model on ERE Spanish, achieving an average gain of 2 absolute F1 points over the prior best-performing model.
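To make the multi-task idea concrete, below is a minimal sketch (not the authors' released code) of a BERT encoder with two heads: a token-level classifier for trigger labels, treated as sequence labeling, and a sentence-level classifier predicting whether the sentence contains any event, with the two losses combined. The model name, label count, and the weight `alpha` are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of joint sentence-level and token-level training, assuming
# a HuggingFace/PyTorch setup. Model name, num_token_labels, and alpha are
# illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel

class TriggerTagger(nn.Module):
    def __init__(self, model_name="bert-base-multilingual-cased",
                 num_token_labels=67, alpha=0.5):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.token_head = nn.Linear(hidden, num_token_labels)  # BIO trigger labels
        self.sent_head = nn.Linear(hidden, 2)                   # event present / absent
        self.alpha = alpha                                      # weight on sentence loss
        self.ce = nn.CrossEntropyLoss(ignore_index=-100)

    def forward(self, input_ids, attention_mask,
                token_labels=None, sent_labels=None):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Token-level logits over the whole sequence: (batch, seq_len, labels)
        token_logits = self.token_head(out.last_hidden_state)
        # Sentence-level logits from the [CLS] representation: (batch, 2)
        sent_logits = self.sent_head(out.last_hidden_state[:, 0])
        loss = None
        if token_labels is not None and sent_labels is not None:
            token_loss = self.ce(token_logits.flatten(0, 1), token_labels.flatten())
            sent_loss = self.ce(sent_logits, sent_labels)
            loss = token_loss + self.alpha * sent_loss
        return loss, token_logits, sent_logits
```

The sentence-level head acts as an auxiliary signal: sentences with no events push the encoder toward representations that suppress spurious trigger predictions, while the token-level head still does the actual trigger detection and typing.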

URL

https://arxiv.org/abs/2009.07188

PDF

https://arxiv.org/pdf/2009.07188