
Jie Hao, William Zhu, Architecture self-attention mechanism: nonlinear optimization for neural architecture search

DOI: 10.23952/jnva.5.2021.1.08

Volume 5, Issue 1, 1 February 2021, Pages 119-140

 

Abstract. Neural Architecture Search (NAS) is a prevalent method for automatically designing neural network architectures. It has recently drawn considerable attention since it relieves the manual design labour of neural networks. However, existing NAS methods ignore the interrelationships among candidate architectures in the search space. As a consequence, the objective neural architecture extracted from the search space suffers from unstable performance due to the collapse of these interrelationships. In this paper, we propose an architecture self-attention mechanism for neural architecture search (ASM-NAS) to address this problem. Specifically, the proposed architecture self-attention mechanism constructs the interrelationships among architectures by exchanging information between any two candidate architectures. Through learning these interrelationships, it selectively emphasizes architectures that are important to the network while suppressing unimportant ones, which provides significant references for architecture selection. This strategy improves the performance stability of the architecture search. Besides, our proposed method is highly efficient and executes architecture search with low time and space costs. Compared to other advanced NAS approaches, our ASM-NAS achieves better architecture search performance on the image classification datasets CIFAR10, CIFAR100, fashionMNIST, and ImageNet.
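The abstract describes self-attention applied to candidate architectures rather than to tokens; the paper's exact formulation is in the full text. Below is a minimal illustrative sketch (not the authors' code) of that general idea, assuming a search space where a fixed number of candidate architectures (or operations) must be weighted: each candidate gets a learnable embedding, scaled dot-product self-attention lets every candidate's score be informed by every other candidate, and the attended scores are normalized into emphasis/suppression weights. The names `CandidateAttention`, `num_candidates`, and `embed_dim` are hypothetical.

```python
# Illustrative sketch only: self-attention over candidate-architecture
# embeddings so each candidate's importance reflects its interrelationship
# with all other candidates. Not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CandidateAttention(nn.Module):
    def __init__(self, num_candidates: int, embed_dim: int = 16):
        super().__init__()
        # one learnable embedding per candidate architecture
        self.embed = nn.Parameter(torch.randn(num_candidates, embed_dim))
        self.q = nn.Linear(embed_dim, embed_dim, bias=False)
        self.k = nn.Linear(embed_dim, embed_dim, bias=False)
        self.v = nn.Linear(embed_dim, 1, bias=False)

    def forward(self) -> torch.Tensor:
        # pairwise interaction between any two candidates (self-attention)
        q, k = self.q(self.embed), self.k(self.embed)
        attn = F.softmax(q @ k.t() / q.shape[-1] ** 0.5, dim=-1)  # (N, N)
        # each candidate's score aggregates information from all candidates,
        # then the scores are normalized into selection weights
        scores = (attn @ self.v(self.embed)).squeeze(-1)          # (N,)
        return F.softmax(scores, dim=-1)

weights = CandidateAttention(num_candidates=8)()
print(weights)  # emphasis/suppression weights over the 8 candidates
```

In a differentiable search, weights of this kind could be used to mix the outputs of the candidate architectures during training and to pick the highest-weighted candidate at the end of the search; the details of how ASM-NAS integrates them are given in the paper.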

 

How to Cite this Article:
J. Hao, W. Zhu, Architecture self-attention mechanism: nonlinear optimization for neural architecture search, J. Nonlinear Var. Anal. 5 (2021), 119-140.