Abstract
When answering natural language questions over a knowledge base (KB), different question components and KB aspects play different roles. However, most existing embedding-based methods for knowledge base question answering (KBQA) ignore the subtle inter-relationships between the question and the KB (e.g., entity types, relation paths, and context). In this work, we propose to directly model the two-way flow of interactions between the questions and the underlying KB via a novel two-layered bidirectional attention network, called BAMnet. Without requiring any external resources or hand-crafted features, our method significantly outperforms existing information retrieval-based methods on the WebQuestions benchmark, and remains competitive with (hand-crafted) semantic parsing-based methods. Moreover, since we use attention mechanisms, our method offers better interpretability than the baselines.
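The core idea, two-way attention between question representations and KB aspect representations, can be sketched as follows. This is a minimal illustration of bidirectional attention in general, not BAMnet's actual layers: the real model's scoring functions, two-layer structure, and dimensions differ, and all names and shapes below are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def two_way_attention(Q, K):
    """Toy two-way (question <-> KB) attention.

    Q: (n_q, d) question token embeddings (illustrative).
    K: (n_k, d) KB aspect embeddings, e.g. entity types,
       relation paths, context (illustrative).
    Returns question-aware KB summaries and KB-aware question summaries.
    """
    scores = Q @ K.T                      # (n_q, n_k) similarity matrix
    q2kb = softmax(scores, axis=1) @ K    # each question token attends over KB aspects
    kb2q = softmax(scores.T, axis=1) @ Q  # each KB aspect attends over question tokens
    return q2kb, kb2q

# Toy usage: 5 question tokens and 3 KB aspects, embedding dim 8.
rng = np.random.default_rng(0)
Q = rng.standard_normal((5, 8))
K = rng.standard_normal((3, 8))
q2kb, kb2q = two_way_attention(Q, K)
```

Because the attention weights are explicit probability distributions over question tokens and KB aspects, they can be inspected directly, which is the source of the interpretability claimed in the abstract.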
URL
https://arxiv.org/abs/1903.02188