Abstract
We propose a novel model for Neural Machine Translation (NMT). Unlike conventional approaches, our model predicts the length and words of the untranslated content at each decoding time step, so that generation is guided by this future information. With such guidance, the model does not stop generating before it has translated enough of the source content. Experimental results demonstrate that our model significantly outperforms the baseline models, and our analysis shows that it is effective at predicting the length and words of the untranslated content.
URL
https://arxiv.org/abs/1809.00336