Abstract
Recurrent neural networks have become ubiquitous in computing representations of sequential data, especially textual data in natural language processing. In particular, Bidirectional LSTMs are at the heart of several neural models achieving state-of-the-art performance in a wide variety of tasks in NLP. We propose a general and effective improvement to the BiLSTM model which encodes each suffix and prefix of a sequence of tokens in both forward and reverse directions. We call our model Suffix BiLSTM or SuBiLSTM. Using an extensive set of experiments, we demonstrate that using SuBiLSTM instead of a BiLSTM in existing base models leads to improvements in performance in learning general sentence representations, text classification, textual entailment and named entity recognition. We achieve new state-of-the-art results for fine-grained sentiment classification and question classification using SuBiLSTM.
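The core idea above can be sketched in code. The following is a minimal, naive O(n²) illustration, not the paper's actual implementation: it uses a simple tanh RNN cell as a self-contained stand-in for an LSTM, and assumes one plausible reading of the abstract, namely that the representation of token t max-pools the forward encodings of every span ending at t and the reverse encodings of every span starting at t. All function names and the pooling choice here are hypothetical; see the paper for the exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rnn(d_in, d_h):
    # Random parameters for a tanh RNN cell (a stand-in for an LSTM,
    # to keep the sketch dependency-free).
    return (rng.normal(scale=0.1, size=(d_h, d_in)),
            rng.normal(scale=0.1, size=(d_h, d_h)),
            np.zeros(d_h))

def encode(seq, params):
    # Run the RNN cell over a sequence of vectors; return the final state.
    W, U, b = params
    h = np.zeros(U.shape[0])
    for v in seq:
        h = np.tanh(W @ v + U @ h + b)
    return h

def subilstm(tokens, fwd, bwd):
    # Hypothetical SuBiLSTM-style encoder: for each position t,
    # pool encodings of all prefixes ending at t (forward direction)
    # and all suffixes starting at t (reverse direction), then concatenate.
    n = len(tokens)
    reps = []
    for t in range(n):
        f = np.max([encode(tokens[j:t + 1], fwd) for j in range(t + 1)], axis=0)
        r = np.max([encode(tokens[t:k + 1][::-1], bwd) for k in range(t, n)], axis=0)
        reps.append(np.concatenate([f, r]))
    return np.array(reps)

tokens = [rng.normal(size=4) for _ in range(5)]
out = subilstm(tokens, make_rnn(4, 8), make_rnn(4, 8))
print(out.shape)  # (5, 16): one 2*d_h vector per token
```

Note the quadratic cost of encoding every prefix and suffix separately; the appeal of the approach is that this drop-in replacement for a BiLSTM exposes span-level context at each token without changing the downstream model.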
URL
https://arxiv.org/abs/1805.07340