Abstract
Recent advances in neural networks, supported by foundational theoretical insights, highlight the superior representational power of complex numbers. However, their adoption in randomized neural networks (RNNs) has been limited by the lack of effective methods for transforming real-valued tabular datasets into complex-valued representations. To address this limitation, we propose two methods for generating complex-valued representations from real-valued datasets: a natural transformation and an autoencoder-driven method. Building on these mechanisms, we propose RVFL-X, a complex-valued extension of the random vector functional link (RVFL) network. RVFL-X applies complex-valued transformations to real-valued datasets while maintaining the simplicity and efficiency of the original RVFL architecture. By employing complex-valued inputs, weights, and activation functions, RVFL-X processes complex representations and produces real-valued outputs. Comprehensive evaluations on 80 real-valued UCI datasets demonstrate that RVFL-X consistently outperforms both the original RVFL and state-of-the-art (SOTA) RNN variants, showcasing its robustness and effectiveness across diverse application domains.
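The pipeline described above (real-valued features mapped to complex values, a fixed complex random layer, and a real-valued output layer) can be sketched in NumPy as follows. This is a minimal illustration only: the pairing of consecutive features as real/imaginary parts, the `tanh` activation, the hidden width, and the ridge regularizer are all assumptions for demonstration, not the paper's actual RVFL-X specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def to_complex(X):
    """Hypothetical 'natural' transformation: pair consecutive real
    features as real and imaginary parts (the paper's exact mapping
    may differ). Pads with a zero column if the width is odd."""
    if X.shape[1] % 2:
        X = np.hstack([X, np.zeros((X.shape[0], 1))])
    return X[:, ::2] + 1j * X[:, 1::2]

def rvfl_x_fit(X, y, n_hidden=64, reg=1e-3):
    """Fit a toy complex-valued RVFL: fixed complex random weights,
    a complex activation, and a real-valued closed-form output layer."""
    Z = to_complex(X)
    d = Z.shape[1]
    # Random (untrained) complex weights and biases, as in RVFL.
    W = rng.standard_normal((d, n_hidden)) + 1j * rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = np.tanh(Z @ W + b)                  # complex-valued hidden features
    # RVFL direct link: concatenate the (transformed) input with H.
    D = np.hstack([Z, H])
    # Split into real and imaginary parts so the output is real-valued.
    D_real = np.hstack([D.real, D.imag])
    # Ridge-regularized least squares for the output weights.
    beta = np.linalg.solve(
        D_real.T @ D_real + reg * np.eye(D_real.shape[1]),
        D_real.T @ y,
    )
    return W, b, beta
```

The closed-form output layer preserves the key efficiency property of RVFL: only `beta` is solved for, while the complex hidden weights stay random and fixed.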
URL
https://arxiv.org/abs/2510.06278