Abstract
We present ECToNAS, a cost-efficient evolutionary cross-topology neural architecture search algorithm that does not require any pre-trained meta controllers. Our framework selects suitable network architectures for different tasks and hyperparameter settings, independently performing cross-topology optimisation where required. It is a hybrid approach that fuses training and topology optimisation into a single lightweight, resource-friendly process. We demonstrate the validity and power of this approach on six standard data sets (CIFAR-10, CIFAR-100, EuroSAT, Fashion MNIST, MNIST, SVHN), showcasing the algorithm's ability not only to optimise the topology within an architectural type, but also to dynamically add and remove convolutional cells when and where required, thus crossing boundaries between different network types. This enables researchers without a background in machine learning to choose appropriate model types and topologies and to apply machine learning methods in their own domains, using a computationally cheap, easy-to-use cross-topology neural architecture search framework that fully encapsulates topology optimisation within the training process.
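The abstract describes an evolutionary search whose mutations can add or remove convolutional cells, letting candidates cross topology boundaries. The sketch below is a deliberately minimal, hypothetical illustration of that idea, not ECToNAS itself: architectures are plain layer lists, a mutation adds or removes a "conv" cell, and a greedy selection keeps whichever variant scores better. All names (`mutate`, `evolve_step`, `score`) and the toy fitness function are assumptions for illustration only.

```python
import random

def mutate(arch, rng):
    """Return a copy of `arch` with one conv cell added or removed
    (cross-topology mutation: a pure-dense network can gain conv cells,
    a convolutional one can lose them)."""
    arch = list(arch)
    conv_positions = [i for i, layer in enumerate(arch) if layer == "conv"]
    if conv_positions and rng.random() < 0.5:
        arch.pop(rng.choice(conv_positions))               # remove a conv cell
    else:
        arch.insert(rng.randrange(len(arch) + 1), "conv")  # add a conv cell
    return arch

def evolve_step(population, score, rng):
    """One generation: mutate each candidate, keep the better variant."""
    next_pop = []
    for arch in population:
        child = mutate(arch, rng)
        next_pop.append(child if score(child) >= score(arch) else arch)
    return next_pop

# Toy fitness standing in for validation accuracy: pretend exactly two
# conv cells is optimal for the task at hand.
def score(arch):
    return -abs(arch.count("conv") - 2)

rng = random.Random(0)
population = [["dense"], ["conv", "dense"]]
for _ in range(20):
    population = evolve_step(population, score, rng)
```

In the real algorithm the fitness would come from (partially) training each candidate, so folding this loop into the training process, as the paper proposes, avoids retraining every variant from scratch.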
URL
https://arxiv.org/abs/2403.05123