Abstract
Reservoir Computing (RC) is a popular methodology for the efficient design of Recurrent Neural Networks (RNNs). Recently, the advantages of the RC approach have been extended to the context of multi-layered RNNs with the introduction of the Deep Echo State Network (DeepESN) model. In this paper, we study the quality of state dynamics in progressively higher layers of DeepESNs, using tools from the areas of information theory and numerical analysis. Our experimental results on RC benchmark datasets reveal the fundamental role played by the strength of inter-reservoir connections in progressively enriching the representations developed in higher layers. Our analysis also offers insights into the potential for effectively exploiting training algorithms based on stochastic gradient descent in the RC field.
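To make the setting concrete, the following is a minimal sketch of the layered state update in a DeepESN: each reservoir layer is driven by the states of the layer below it, and an `inter_scale` parameter (a hypothetical name introduced here) controls the strength of those inter-reservoir connections, which the abstract identifies as key to enriching higher-layer dynamics. The specific sizes, scalings, and the absence of leaky integration are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np

def deep_esn_states(u, n_layers=3, n_units=50, rho=0.9, inter_scale=0.5, seed=0):
    """Run an untrained deep echo state network on a scalar input sequence u
    and return the per-layer reservoir states over time.

    Sketch under illustrative assumptions: tanh reservoirs without leaky
    integration; `inter_scale` sets the strength of the inter-reservoir
    (layer-to-layer) connections.
    """
    rng = np.random.default_rng(seed)
    T = len(u)

    # Recurrent weights per layer, rescaled to spectral radius `rho`
    # (a common heuristic for the echo state property).
    W = []
    for _ in range(n_layers):
        Wl = rng.uniform(-1.0, 1.0, (n_units, n_units))
        Wl *= rho / np.max(np.abs(np.linalg.eigvals(Wl)))
        W.append(Wl)

    # Input weights: layer 0 sees the external input; each higher layer
    # sees the state of the layer below, scaled by `inter_scale`.
    W_in = [rng.uniform(-1.0, 1.0, (n_units, 1))]
    W_in += [inter_scale * rng.uniform(-1.0, 1.0, (n_units, n_units))
             for _ in range(n_layers - 1)]

    x = [np.zeros(n_units) for _ in range(n_layers)]
    states = np.zeros((T, n_layers, n_units))
    for t in range(T):
        inp = np.array([u[t]])
        for l in range(n_layers):
            x[l] = np.tanh(W_in[l] @ inp + W[l] @ x[l])
            inp = x[l]  # this layer's state drives the next layer
        states[t] = np.stack(x)
    return states
```

In this sketch, increasing `inter_scale` strengthens how strongly each layer's dynamics are driven by the layer below, which is the knob whose role the paper's analysis investigates.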
URL
https://arxiv.org/abs/1903.05174