Efficient Diversity-Driven Ensemble for Deep Neural Networks

12/26/2021
by Wentao Zhang, et al.

Ensembles of deep neural networks have been shown, both theoretically and empirically, to improve generalization accuracy on unseen test sets. However, the high training cost limits their practicality: an ensemble needs a sufficient number of base models, and each one must be trained separately. Many methods have been proposed to tackle this problem, most of which rely on the observation that a pre-trained network can transfer its knowledge to the next base model and thereby accelerate its training. These methods, however, share a severe drawback: they transfer knowledge without any selection, which leads to low diversity among the ensemble members. Since ensemble learning is most effective when its members are both accurate and diverse, we propose Efficient Diversity-Driven Ensemble (EDDE), a method that addresses both the diversity and the efficiency of an ensemble. To accelerate training, we propose a novel knowledge transfer method that selectively transfers the previously learned generic knowledge. To enhance diversity, we first propose a new diversity measure and then use it to define a diversity-driven loss function for optimization. Finally, we adopt a Boosting-based framework to combine these operations, which further improves diversity. We evaluate EDDE on Computer Vision (CV) and Natural Language Processing (NLP) tasks. Compared with other well-known ensemble methods, EDDE achieves the highest ensemble accuracy with the lowest training cost, demonstrating its efficiency for ensembling neural networks.
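
This page does not give the paper's actual loss formula, so the following is only a minimal sketch of the general idea behind a diversity-driven loss, written in PyTorch with hypothetical names (diversity_driven_loss, the weight lam) and assuming the diversity term is measured as disagreement, via KL divergence, between the new base model and the ensemble of already-trained members:

```python
import torch.nn.functional as F

def diversity_driven_loss(logits_new, logits_ensemble, targets, lam=0.1):
    """Cross-entropy on the new base model plus a reward for disagreeing
    with the members trained so far (hypothetical formulation)."""
    # Standard cross-entropy keeps the new member accurate.
    ce = F.cross_entropy(logits_new, targets)

    # KL divergence between the new member and the (frozen) ensemble output
    # measures agreement; subtracting it rewards disagreement, i.e. diversity.
    log_p_new = F.log_softmax(logits_new, dim=1)
    p_ens = F.softmax(logits_ensemble, dim=1).detach()
    agreement = F.kl_div(log_p_new, p_ens, reduction="batchmean")

    return ce - lam * agreement
```

Here logits_ensemble would be the averaged logits of the members trained so far, and lam trades the new member's accuracy against its disagreement with its predecessors; EDDE's own diversity measure and Boosting-based combination may differ from this sketch.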

Related research

10/03/2021
Boost Neural Networks by Checkpoints
Training multiple deep neural networks (DNNs) and averaging their output...

01/18/2023
HCE: Improving Performance and Efficiency with Heterogeneously Compressed Neural Network Ensemble
Ensemble learning has gained attention in recent deep learning research as...

02/12/2023
Interpretable Diversity Analysis: Visualizing Feature Representations In Low-Cost Ensembles
Diversity is an important consideration in the construction of robust ne...

03/27/2021
Deep Ensemble Collaborative Learning by using Knowledge-transfer Graph for Fine-grained Object Classification
Mutual learning, in which multiple networks learn by sharing their knowl...

05/13/2016
Wisdom of Crowds cluster ensemble
The Wisdom of Crowds is a phenomenon described in social science that su...

08/28/2023
Diversified Ensemble of Independent Sub-Networks for Robust Self-Supervised Representation Learning
Ensembling a neural network is a widely recognized approach to enhance m...

06/27/2022
Transfer learning for ensembles: reducing computation time and keeping the diversity
Transferring a deep neural network trained on one problem to another req...
