GADAM: Genetic-Evolutionary ADAM for Deep Neural Network Optimization

by Jiawei Zhang, et al.

Deep neural network learning can be formulated as a non-convex optimization problem. Existing optimization algorithms, e.g., Adam, can train models quickly, but may easily get stuck in local optima. In this paper, we introduce a novel optimization algorithm, namely GADAM (Genetic-Evolutionary Adam). GADAM learns deep neural network models generation by generation from a population of unit models: it trains the unit models with Adam, and evolves them into new generations with a genetic algorithm. We show that GADAM can effectively escape local optima during learning to obtain better solutions, and prove that GADAM also achieves fast convergence. Extensive experiments have been conducted on various benchmark datasets, and the results demonstrate the effectiveness and efficiency of the GADAM algorithm.
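The abstract's loop of "train each unit model with Adam, then evolve the population with a genetic algorithm" can be sketched in a few dozen lines. The code below is a minimal, hypothetical illustration on a toy non-convex scalar loss rather than the paper's deep networks; all function names (`adam_steps`, `gadam`), hyperparameters, and the uniform-crossover/Gaussian-mutation choices are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy non-convex loss with many local optima (stands in for a DNN loss surface).
def loss(w):
    return np.sum(w**2 + 2.0 * np.sin(3.0 * w))

def grad(w):
    return 2.0 * w + 6.0 * np.cos(3.0 * w)

def adam_steps(w, steps=50, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """Local search: train one unit model with standard Adam updates."""
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g**2
        m_hat = m / (1 - b1**t)          # bias-corrected first moment
        v_hat = v / (1 - b2**t)          # bias-corrected second moment
        w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

def gadam(pop_size=8, dim=4, generations=5):
    """Hypothetical GADAM loop: Adam training + genetic evolution per generation."""
    # Generation 0: randomly initialized unit models.
    pop = [rng.uniform(-3, 3, dim) for _ in range(pop_size)]
    for _ in range(generations):
        # Step 1: train every unit model with Adam (local refinement).
        pop = [adam_steps(w) for w in pop]
        fitness = np.array([loss(w) for w in pop])
        # Step 2: selection -- keep the lowest-loss half as parents.
        order = np.argsort(fitness)
        parents = [pop[i] for i in order[: pop_size // 2]]
        # Step 3: crossover + mutation produce the next generation,
        # letting the population jump out of local optima.
        children = []
        while len(children) < pop_size:
            pa, pb = rng.choice(len(parents), 2, replace=False)
            mask = rng.random(dim) < 0.5                # uniform crossover
            child = np.where(mask, parents[pa], parents[pb])
            child = child + rng.normal(0.0, 0.1, dim)   # Gaussian mutation
            children.append(child)
        pop = children
    # Final fine-tuning, then return the best unit model found.
    pop = [adam_steps(w) for w in pop]
    return min(pop, key=loss)
```

On this toy landscape the evolved population reliably ends up in a much better basin than a single random initialization would, which is the intuition behind combining Adam's fast local convergence with the genetic algorithm's global exploration.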




