Abstract
We propose a novel tensor network language model based on the simplest tensor network (i.e., tensor trains), called `Tensor Train Language Model' (TTLM). TTLM represents sentences in an exponentially large space constructed by the tensor product of words, but computes the probabilities of sentences in a low-dimensional fashion. We demonstrate that the architectures of Second-order RNNs, Recurrent Arithmetic Circuits (RACs), and Multiplicative Integration RNNs are, essentially, special cases of TTLM. Experimental evaluations on real language modeling tasks show that the proposed variants of TTLM (i.e., TTLM-Large and TTLM-Tiny) outperform vanilla Recurrent Neural Networks (RNNs) when the number of hidden units is small. (The code is available at this https URL.)
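To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of how a tensor-train contraction can score a sentence: the sentence lives in the tensor-product space of its words, but the contraction only ever carries a low-dimensional vector. The vocabulary size `V`, TT rank `R`, shared core `core`, and boundary vectors `alpha`/`omega` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hedged sketch of a tensor-train style sentence scorer.
# Assumed, hypothetical shapes: vocabulary size V, TT rank R.
V, R = 1000, 32
rng = np.random.default_rng(0)

core = rng.normal(scale=0.1, size=(R, V, R))   # shared TT core A[r, word, r']
alpha = rng.normal(size=R)                     # left boundary vector
omega = rng.normal(size=R)                     # right boundary vector

def tt_score(word_ids):
    """Contract the tensor train along a sentence, one word at a time."""
    h = alpha
    for w in word_ids:
        # Contract the current R-dimensional state with the core slice for word w.
        # Cost is O(R^2) per word; the exponential tensor-product space of the
        # sentence is never materialized.
        h = h @ core[:, w, :]
    return h @ omega

sentence = [3, 17, 256, 42]            # toy word indices
print(tt_score(sentence))              # unnormalized sentence score
```

The per-word update `h @ core[:, w, :]` is also what makes the connection to Second-order RNNs and RACs plausible: each step is a bilinear interaction between the hidden state and the current word, which is the structural pattern those architectures share.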
Abstract (translated)
We propose a novel tensor network language model based on the simplest tensor network (i.e., tensor trains), called `Tensor Train Language Model' (TTLM). TTLM represents sentences in an exponentially large space constructed by the tensor product of words, but computes the probabilities of sentences in a low-dimensional fashion. We demonstrate that the architectures of Second-order RNNs, Recurrent Arithmetic Circuits (RACs), and Multiplicative Integration RNNs are, essentially, special cases of TTLM. Experimental evaluations on real language modeling tasks show that, compared with vanilla Recurrent Neural Networks (RNNs), the proposed variants of TTLM (i.e., TTLM-Large and TTLM-Tiny) perform better when the number of hidden units is small. (The code is available at this https URL.)
URL
https://arxiv.org/abs/2405.04590