Automatic Construction of Multi-layer Perceptron Network from Streaming Examples

by Mahardhika Pratama, et al.

Autonomous construction of deep neural networks (DNNs) is desirable for data streams because it potentially offers two advantages: a model capacity matched to the data and a quick reaction to drift and shift. While the self-organizing mechanism of DNNs remains an open issue, the task is even harder for standard multi-layer DNNs than for different-depth structures, because adding a new layer causes loss of previously trained knowledge. A Neural Network with Dynamically Evolved Capacity (NADINE) is proposed in this paper. NADINE features a fully open structure: both its depth and its width can be evolved automatically from scratch, in an online manner, and without problem-specific thresholds. NADINE is built on a standard MLP architecture, and the catastrophic forgetting that arises during the hidden-layer addition phase is resolved by the proposed soft-forgetting and adaptive memory methods. NADINE's advantages, namely its elastic structure and online learning trait, are numerically validated on nine data stream classification and regression problems, where it outperforms prominent algorithms on all of them. In addition, it handles data stream regression and classification problems equally well.
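To make the idea of an elastically structured MLP concrete, here is a minimal sketch of a network whose width and depth can grow online. The growth operations are the only point being illustrated; the `EvolvingMLP` class, its method names, and the near-identity initialization used to approximate soft-forgetting are assumptions for illustration, not NADINE's actual drift-detection or learning criteria.

```python
import numpy as np

class EvolvingMLP:
    """Illustrative width/depth-evolving MLP in the spirit of NADINE.
    Growth triggers (when to call add_node/add_layer) are omitted;
    the paper derives them from drift detection, not shown here."""

    def __init__(self, n_in, n_out, seed=0):
        self.rng = np.random.default_rng(seed)
        # start from scratch: one hidden layer with a single node
        self.weights = [self.rng.normal(0, 0.1, (n_in, 1)),
                        self.rng.normal(0, 0.1, (1, n_out))]

    def forward(self, x):
        h = x
        for w in self.weights[:-1]:
            h = np.tanh(h @ w)          # hidden layers use tanh
        return h @ self.weights[-1]     # linear output layer

    def add_node(self):
        """Widen the last hidden layer by one unit (width evolution)."""
        w_in, w_out = self.weights[-2], self.weights[-1]
        self.weights[-2] = np.hstack(
            [w_in, self.rng.normal(0, 0.1, (w_in.shape[0], 1))])
        self.weights[-1] = np.vstack(
            [w_out, self.rng.normal(0, 0.1, (1, w_out.shape[1]))])

    def add_layer(self):
        """Deepen the network by one hidden layer (depth evolution).
        The new layer starts near the identity so previously learned
        mappings are largely preserved -- a crude stand-in for the
        paper's soft-forgetting idea."""
        width = self.weights[-1].shape[0]
        new_w = np.eye(width) + self.rng.normal(0, 0.01, (width, width))
        self.weights.insert(len(self.weights) - 1, new_w)

# grow the structure while keeping the input/output interface fixed
net = EvolvingMLP(n_in=4, n_out=2)
net.add_node()   # width grows: last hidden layer now has 2 units
net.add_layer()  # depth grows: a near-identity hidden layer is inserted
```

Note the design choice: both operations preserve the input and output dimensions, so the network can keep serving predictions on the stream while its capacity changes.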




