Abstract
We present an approach to minimally supervised relation extraction that combines the benefits of learned representations and structured learning, and accurately predicts sentence-level relation mentions given only proposition-level supervision from a KB. By explicitly reasoning about missing data during learning, our approach enables large-scale training of 1D convolutional neural networks while mitigating the issue of label noise inherent in distant supervision. Our approach achieves state-of-the-art results on minimally supervised sentential relation extraction, outperforming a number of baselines, including a competitive approach that uses the attention layer of a purely neural model.
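The sentence encoder named in the abstract, a 1D convolutional neural network over token embeddings with max-over-time pooling, can be sketched as follows. This is a minimal illustrative sketch; the function name, dimensions, and nonlinearity are assumptions, not the paper's actual implementation.

```python
import numpy as np

def conv1d_encoder(embeddings, filters):
    """Minimal 1D-CNN sentence encoder sketch (illustrative only).

    embeddings: (seq_len, emb_dim) token embeddings for one sentence.
    filters:    (n_filters, window, emb_dim) convolution filters.
    Returns a fixed-size sentence vector via max-over-time pooling.
    """
    seq_len, emb_dim = embeddings.shape
    n_filters, window, _ = filters.shape
    n_pos = seq_len - window + 1
    feature_maps = np.zeros((n_filters, n_pos))
    for i in range(n_pos):
        # Slide every filter over a window of consecutive token embeddings.
        patch = embeddings[i:i + window]  # (window, emb_dim)
        feature_maps[:, i] = np.tanh(
            np.tensordot(filters, patch, axes=([1, 2], [0, 1]))
        )
    # Max-over-time pooling gives a fixed-size sentence representation,
    # independent of sentence length.
    return feature_maps.max(axis=1)

rng = np.random.default_rng(0)
sent = rng.normal(size=(12, 50))      # 12 tokens, 50-dim embeddings
filt = rng.normal(size=(100, 3, 50))  # 100 filters, window size 3
vec = conv1d_encoder(sent, filt)
print(vec.shape)  # (100,)
```

The resulting fixed-size vector would then feed a relation classifier; under distant supervision, only bag-level (proposition-level) labels from the KB are available to train it.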
URL
https://arxiv.org/abs/1904.00118