GNAS: A Generalized Neural Network Architecture Search Framework

03/19/2021
by Dige Ai, et al.

In practice, the problems encountered when training NAS (Neural Architecture Search) are rarely isolated; a combination of difficulties is usually faced (incorrect compensation estimation, the curse of dimensionality, overfitting, high complexity, etc.). From the standpoint of solving practical problems, this paper draws on and improves previous studies that each address a single NAS problem, and combines them into a practical technical pipeline. We propose a framework that decouples the network structure from the operator search space. Two BOHB (Bayesian Optimization and Hyperband) instances search alternately over the vast network-structure and operator search spaces, and a GCN-based predictor is then trained on feedback from the child models. This approach mitigates the curse of dimensionality while improving efficiency. Since the activation function and the initialization method are also important components of a neural network and affect the generalization ability of the model, this paper introduces an activation-function domain and an initialization-method domain and joins them to the operator search space to form a generalized search space, thereby improving the generalization ability of the child model. Finally, we applied our framework to neural architecture search and achieved significant improvements on multiple datasets.
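The alternating search over the two decoupled spaces can be pictured as follows. This is a minimal sketch only: `bohb_step` is a hypothetical stand-in that merely samples and scores a candidate, whereas a real BOHB iteration (e.g., via the hpbandster library) fits a TPE-style density model and allocates training budgets with Hyperband. The space contents and the `evaluate` function are likewise placeholders, not the paper's actual configuration.

```python
import random

def bohb_step(space, fixed, evaluate):
    # Hypothetical stand-in for one BOHB iteration over `space`: a real
    # implementation would fit a model-based sampler and schedule budgets
    # via Hyperband. Here we simply sample one candidate and score it
    # together with the currently fixed half of the configuration.
    cand = {k: random.choice(v) for k, v in space.items()}
    return cand, evaluate({**fixed, **cand})

# Two decoupled search spaces, as in the framework (contents illustrative).
structure_space = {"depth": [8, 14, 20], "width": [16, 32, 64]}
operator_space = {
    "op":   ["conv3x3", "sep_conv", "skip_connect"],
    "act":  ["relu", "swish"],        # activation-function domain
    "init": ["kaiming", "xavier"],    # initialization-method domain
}

def evaluate(cfg):
    # Placeholder: train and score the child model described by `cfg`.
    return random.random()

best_cfg = {"depth": 14, "width": 32,
            "op": "conv3x3", "act": "relu", "init": "kaiming"}
best_score = evaluate(best_cfg)
for _ in range(10):
    # Alternate: search structure with operators fixed, then vice versa.
    for space in (structure_space, operator_space):
        cand, score = bohb_step(space, best_cfg, evaluate)
        if score > best_score:
            best_cfg, best_score = {**best_cfg, **cand}, score
```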
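The GCN-based predictor maps a candidate architecture to a predicted score so that poor candidates can be filtered without full training. The PyTorch sketch below shows one plausible shape for such a predictor; the class name `GCNPredictor`, the layer sizes, and the graph encoding (a normalized adjacency matrix of the cell's operator DAG plus one-hot operator features per node) are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class GCNPredictor(nn.Module):
    # Regresses a child model's validation accuracy from its cell graph.
    def __init__(self, num_ops, hidden=64):
        super().__init__()
        self.w1 = nn.Linear(num_ops, hidden)
        self.w2 = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, adj, feats):
        # One GCN step: aggregate neighbor features, then transform.
        h = torch.relu(self.w1(adj @ feats))
        h = torch.relu(self.w2(adj @ h))
        # Mean-pool node embeddings into a single graph embedding.
        return self.out(h.mean(dim=-2)).squeeze(-1)

# Toy usage: a 5-node cell with 8 candidate operators.
adj = torch.eye(5) + torch.rand(5, 5).round()    # self-loops + random edges
adj = adj / adj.sum(-1, keepdim=True)            # row-normalize
feats = torch.eye(8)[torch.randint(0, 8, (5,))]  # one-hot operator features
pred = GCNPredictor(num_ops=8)(adj, feats)       # predicted accuracy (scalar)
```

In the search loop, such a predictor would be refit as feedback from newly trained child models arrives, so its rankings improve as the search proceeds.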
