Mixture of ELM based experts with trainable gating network

05/25/2021
by Laleh Armi, et al.
The mixture of experts (ME) method is a neural network based ensemble learning approach with a strong ability to improve overall classification accuracy. It follows the divide-and-conquer principle: the problem space is divided among several experts under the supervision of a gating network. In this paper, we propose an ensemble learning method based on mixture of experts, named mixture of ELM based experts with trainable gating network (MEETG), to reduce the computing cost and speed up the learning process of ME. The standard ME structure uses multilayer perceptrons (MLPs) as the base experts and the gating network, trained with a gradient-based learning algorithm, which is an iterative and time-consuming process. To overcome these problems, we exploit the advantages of the extreme learning machine (ELM) in designing the structure of ME. ELM, a learning algorithm for single-hidden-layer feedforward neural networks, provides a much faster learning process and better generalization ability than some other traditional learning algorithms. In addition, the proposed method applies a trainable gating network that aggregates the outputs of the experts dynamically according to the input sample. Our experimental results and statistical analysis on 11 benchmark datasets confirm that MEETG performs well on classification problems. Furthermore, the results show that the proposed approach outperforms the original ELM in prediction stability and classification accuracy.
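To illustrate the two ingredients the abstract combines, here is a minimal NumPy sketch of an ELM (random, fixed hidden weights; output weights solved in closed form by a pseudoinverse) and of a gating network that mixes expert outputs per input sample. This is an assumed, simplified illustration, not the paper's MEETG implementation: the toy XOR data, the number of hidden units, and the uniform gating targets are all placeholders, and in the actual method the gating network would be trained to weight experts by their competence on each input.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, Y, n_hidden):
    """Single-hidden-layer ELM: hidden weights are random and stay fixed;
    output weights beta are the least-squares solution H beta = Y."""
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weights (never trained)
    b = rng.standard_normal(n_hidden)                # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                           # hidden-layer activations
    beta = np.linalg.pinv(H) @ Y                     # closed-form output weights
    return W, b, beta

def elm_predict(X, params):
    W, b, beta = params
    return np.tanh(X @ W + b) @ beta

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy data: XOR with one-hot class labels.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.eye(2)[[0, 1, 1, 0]]

# Two ELM experts, plus an ELM gating network emitting one score per expert.
# The uniform gating targets here are purely illustrative.
experts = [elm_fit(X, Y, n_hidden=20) for _ in range(2)]
gate = elm_fit(X, np.full((len(X), 2), 0.5), n_hidden=20)

# Dynamic aggregation: per-sample softmax gate weights combine expert outputs.
g = softmax(elm_predict(X, gate))                          # (n_samples, n_experts)
outputs = np.stack([elm_predict(X, e) for e in experts])   # (n_experts, n_samples, n_classes)
mixed = np.einsum('ne,enc->nc', g, outputs)                # gated mixture output
pred = mixed.argmax(axis=1)
# With enough hidden units each expert fits this toy set exactly,
# so the gated mixture recovers the XOR labels.
print(pred)
```

The closed-form `pinv` step is what makes ELM training non-iterative, which is the source of the speedup the abstract claims over gradient-trained MLP experts.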

