Abstract
Recent algorithms for automatic neural architecture search perform remarkably well, but they are largely undirected in the search space and computationally expensive, since every intermediate architecture must be trained. In this paper, we propose a method for efficient architecture search called EENA (Efficient Evolution of Neural Architecture). Owing to elaborately designed mutation and crossover operations, the evolution process can be guided by information that has already been learned. Consequently, less computational effort is required, and the search and training time can be reduced significantly. On CIFAR-10 classification, EENA, using minimal computational resources (0.65 GPU-days), can design a highly effective neural architecture that achieves 2.56% test error with 8.47M parameters. Furthermore, the best architecture discovered is also transferable to CIFAR-100.
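The mutation- and crossover-driven evolution described above can be illustrated with a minimal sketch of an evolutionary search loop. Everything here is a hypothetical toy: architectures are encoded as lists of layer widths, and the fitness function is a stand-in for actually training and validating each candidate, which is what EENA's learned-information guidance is designed to avoid repeating from scratch.

```python
import random

def mutate(arch):
    """Perturb one layer width (toy stand-in for EENA's designed mutations)."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = max(8, child[i] + random.choice([-8, 8]))
    return child

def crossover(a, b):
    """One-point crossover combining two parent encodings."""
    point = random.randrange(1, min(len(a), len(b)))
    return a[:point] + b[point:]

def fitness(arch):
    """Toy proxy score; a real NAS system would train and evaluate the model."""
    return sum(arch)

def evolve(population, generations=10, seed=0):
    """Select the top half each generation, refill via crossover + mutation."""
    random.seed(seed)
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: len(population) // 2]
        children = []
        while len(parents) + len(children) < len(population):
            a, b = random.sample(parents, 2)
            child = crossover(a, b)
            if random.random() < 0.5:
                child = mutate(child)
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

initial = [[random.choice([16, 32, 64]) for _ in range(4)] for _ in range(8)]
best = evolve(initial)
print(best)
```

In EENA itself, the key difference from this generic loop is that mutation and crossover are function-preserving and reuse learned weights, so each child starts from its parents' training rather than from random initialization.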
URL
https://arxiv.org/abs/1905.07320