Fast-DENSER++: Evolving Fully-Trained Deep Artificial Neural Networks

05/08/2019
by Filipe Assunção, et al.

This paper proposes a new extension to Deep Evolutionary Network Structured Evolution (DENSER), called Fast-DENSER++ (F-DENSER++). The vast majority of NeuroEvolution methods that optimise Deep Artificial Neural Networks (DANNs) evaluate the candidate solutions for only a fixed number of epochs; this makes it difficult to effectively assess the learning strategy, and requires the best generated network to be further trained after evolution. F-DENSER++ enables the training time of the candidate solutions to grow continuously as necessary, i.e., in the initial generations the candidate solutions are trained for shorter times, and as generations proceed longer training cycles are expected to enable better performances. Consequently, the models discovered by F-DENSER++ are fully-trained DANNs, ready for deployment after evolution without the need for further training. The results demonstrate the ability of F-DENSER++ to effectively generate fully-trained DANNs: by the end of evolution, the average performance of the models generated by F-DENSER++ is 88.73%, whereas the models generated by the previous version of DENSER (Fast-DENSER) average 86.91%, which increases to 87.76% when they are allowed to train for longer.
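The core mechanism can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation: the `Candidate`, `train_and_evaluate`, `mutate`, and `evolve` names are hypothetical, the fitness function is a toy stand-in for actually training a network, and the (1+λ)-style selection is only assumed to resemble the scheme used by Fast-DENSER. The sketch shows the key idea that offspring may inherit and extend their parent's training budget, so training time grows across generations.

```python
# Minimal sketch (not the authors' code) of evolving candidates whose
# allowed training time can grow as evolution proceeds, as in F-DENSER++.
import random
from dataclasses import dataclass


@dataclass
class Candidate:
    genome: list          # placeholder for an encoded network architecture
    train_epochs: int     # how long this candidate is allowed to train
    fitness: float = 0.0


def train_and_evaluate(cand: Candidate) -> float:
    """Toy stand-in for training the decoded network for `cand.train_epochs`
    epochs and returning its validation accuracy."""
    # Longer training (noisily) helps, up to a ceiling of 1.0.
    return min(1.0, 0.5 + 0.01 * cand.train_epochs + random.uniform(-0.05, 0.05))


def mutate(parent: Candidate) -> Candidate:
    child = Candidate(genome=list(parent.genome), train_epochs=parent.train_epochs)
    # Structural/learning-strategy mutations would go here; one mutation
    # simply grants the offspring more training time than its parent.
    if random.random() < 0.3:
        child.train_epochs += 5
    return child


def evolve(generations: int = 20, pop_size: int = 10) -> Candidate:
    population = [Candidate(genome=[random.random()], train_epochs=10)
                  for _ in range(pop_size)]
    for cand in population:
        cand.fitness = train_and_evaluate(cand)
    for _ in range(generations):
        parent = max(population, key=lambda c: c.fitness)  # (1+λ)-style selection
        offspring = [mutate(parent) for _ in range(pop_size - 1)]
        for cand in offspring:
            cand.fitness = train_and_evaluate(cand)
        population = [parent] + offspring
    return max(population, key=lambda c: c.fitness)


if __name__ == "__main__":
    best = evolve()
    print(f"best fitness={best.fitness:.3f}, trained for {best.train_epochs} epochs")
```

Because training budgets only increase when mutation grants them, early generations stay cheap while later, more promising candidates receive the longer training they need, which is why the final model requires no additional training after evolution.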


