Grid Search, Random Search, Genetic Algorithm: A Big Comparison for NAS

12/12/2019
by   Petro Liashchynskyi, et al.

In this paper, we compare the three most popular algorithms for hyperparameter optimization (Grid Search, Random Search, and Genetic Algorithm) and apply them to neural architecture search (NAS). We use these algorithms to search for the architecture of a convolutional neural network. Experimental results on the CIFAR-10 dataset demonstrate the performance differences between the compared algorithms. The comparison is based on the execution time of each algorithm and the accuracy of the resulting models.
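To make the contrast between the three compared strategies concrete, here is a minimal, self-contained sketch of grid search, random search, and a simple genetic algorithm over a tiny hyperparameter space. Training a real CNN on CIFAR-10 is expensive, so a toy scoring function with a known optimum stands in for validation accuracy; all names, the search space, and the GA settings (population size, crossover, mutation rate) are illustrative assumptions, not the paper's actual setup.

```python
import itertools
import random

random.seed(0)

# Toy stand-in for validation accuracy: the optimum is lr=0.01, depth=4
# (score 0); every other configuration scores lower.
def score(config):
    return -abs(config["lr"] - 0.01) * 100 - abs(config["depth"] - 4)

# Hypothetical discrete search space over two hyperparameters.
space = {"lr": [0.1, 0.03, 0.01, 0.003], "depth": [2, 3, 4, 5]}

def grid_search(space):
    """Exhaustively evaluate every combination in the grid."""
    keys = list(space)
    best = max(itertools.product(*space.values()),
               key=lambda vals: score(dict(zip(keys, vals))))
    return dict(zip(keys, best))

def random_search(space, n_trials=8):
    """Sample n_trials random configurations and keep the best."""
    keys = list(space)
    trials = [{k: random.choice(space[k]) for k in keys}
              for _ in range(n_trials)]
    return max(trials, key=score)

def genetic_search(space, pop_size=6, generations=5, mutation_rate=0.3):
    """Evolve a population of configurations via selection,
    uniform crossover, and random mutation."""
    keys = list(space)
    pop = [{k: random.choice(space[k]) for k in keys}
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        parents = pop[: pop_size // 2]        # selection: keep the fittest half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            # uniform crossover: each gene comes from either parent
            child = {k: random.choice([a[k], b[k]]) for k in keys}
            if random.random() < mutation_rate:
                k = random.choice(keys)       # mutation: resample one gene
                child[k] = random.choice(space[k])
            children.append(child)
        pop = parents + children
    return max(pop, key=score)

print(grid_search(space))
```

Grid search always finds the grid optimum but its cost grows exponentially with the number of hyperparameters, which is exactly the trade-off against the cheaper but stochastic random and genetic searches that the paper measures.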

