Abstract
Prompting has driven significant progress in neural machine translation (NMT) in recent years. In this work, we focus on how to integrate multi-knowledge, i.e., multiple types of knowledge, into NMT models to enhance performance with prompting. We propose a unified framework that effectively integrates multiple types of knowledge, including sentences, terminologies/phrases, and translation templates, into NMT models. We use these types of knowledge as prefix prompts to the input of the encoder and decoder of NMT models to guide the translation process. The approach requires no changes to the model architecture and adapts effectively to domain-specific translation without retraining. Experiments on English-Chinese and English-German translation demonstrate that our approach significantly outperforms strong baselines, achieving high translation quality and terminology-match accuracy.
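To make the prefix-prompt idea concrete, here is a minimal sketch of how the three knowledge types mentioned in the abstract (example sentences, terminology pairs, and translation templates) could be serialized into a single prefix prepended to the source sentence. The function name, the slot labels (`example:`, `term:`, `template:`), and the `<sep>` separator token are illustrative assumptions, not the paper's actual serialization format.

```python
def build_prefix_prompt(source, sentence_pairs=None, term_pairs=None,
                        template=None, sep="<sep>"):
    """Prepend multiple knowledge types to a source sentence as a prefix prompt.

    All knowledge slots are optional; each is serialized to a short labeled
    string, and all parts are joined with a separator token before the source
    text. (Illustrative format only -- the exact serialization used in the
    paper is an assumption here.)
    """
    parts = []
    # Knowledge type 1: retrieved example sentence pairs (source = target).
    if sentence_pairs:
        for src, tgt in sentence_pairs:
            parts.append(f"example: {src} = {tgt}")
    # Knowledge type 2: terminology / phrase constraints.
    if term_pairs:
        for src_term, tgt_term in term_pairs:
            parts.append(f"term: {src_term} = {tgt_term}")
    # Knowledge type 3: a translation template for the target side.
    if template:
        parts.append(f"template: {template}")
    # The source sentence itself comes last, after all knowledge prefixes.
    parts.append(source)
    return f" {sep} ".join(parts)


prompted = build_prefix_prompt(
    "The patient was given aspirin.",
    term_pairs=[("aspirin", "阿司匹林")],
)
print(prompted)
# -> term: aspirin = 阿司匹林 <sep> The patient was given aspirin.
```

The resulting string would then be tokenized and fed to the NMT encoder in place of the bare source sentence, so no architectural change is needed; an analogous prefix can be prepended on the decoder side.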
URL
https://arxiv.org/abs/2312.04807