Operational Adaptation of DNN Classifiers using Elastic Weight Consolidation

by Abanoub Ghobrial, et al.

Autonomous systems (AS) often use Deep Neural Network (DNN) classifiers to operate in complex, high-dimensional, non-linear, and dynamically changing environments. Because of this complexity, DNN classifiers may misclassify inputs when they encounter tasks in their operational environments that were not identified during development. Removing a system from operation and retraining it on the newly identified task becomes economically infeasible as the number of such autonomous systems increases. Moreover, such misclassifications may cause financial losses and safety threats to the AS or to other operators in its environment. In this paper, we propose to reduce these threats by investigating whether DNN classifiers can adapt their knowledge to learn new information in the AS's operational environment, using only a limited number of observations encountered sequentially during operation. This allows the AS to adapt to newly encountered information and hence increases its reliability in making correct classifications. However, retraining DNNs on observations different from those used in prior training is known to cause catastrophic forgetting, i.e. significant model drift. We investigate whether this problem can be controlled by using Elastic Weight Consolidation (EWC) while learning from limited new observations. We carry out experiments using original and noisy versions of the MNIST dataset to represent known and new information, respectively. Results show that EWC makes the process of adaptation to new information considerably more controlled, allowing for reliable adaptation of ASs to new information in their operational environments.
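The EWC regularizer mentioned in the abstract counters catastrophic forgetting by penalizing changes to parameters that were important for the previously learned task, where importance is estimated by the diagonal of the Fisher information matrix. A minimal sketch of that penalty is shown below; the function names and the use of NumPy arrays are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """Elastic Weight Consolidation penalty (illustrative sketch).

    theta      -- current parameter vector while training on the new task
    theta_star -- parameters learned on the previous (known) task
    fisher     -- diagonal Fisher information, a per-parameter importance
                  estimate for the previous task
    lam        -- strength of the consolidation term

    Parameters with high Fisher values are pulled back strongly toward
    theta_star; unimportant parameters remain free to adapt to the new
    observations.
    """
    theta, theta_star, fisher = map(np.asarray, (theta, theta_star, fisher))
    return 0.5 * lam * float(np.sum(fisher * (theta - theta_star) ** 2))

def total_loss(new_task_loss, theta, theta_star, fisher, lam=1.0):
    """Loss on the new task plus the EWC consolidation term."""
    return new_task_loss + ewc_penalty(theta, theta_star, fisher, lam)
```

For example, a parameter that moved by 2.0 with Fisher importance 1.0 contributes 0.5 * 1.0 * 2.0**2 = 2.0 to the penalty, while an equally large move on a parameter with Fisher importance 0.0 contributes nothing, which is exactly the selective "elasticity" that keeps adaptation controlled.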



Towards continuous learning for glioma segmentation with elastic weight consolidation

When finetuning a convolutional neural network (CNN) on data from a new ...

Domain Expansion in DNN-based Acoustic Models for Robust Speech Recognition

Training acoustic models with sequentially incoming data – while both le...

Memory Efficient Experience Replay for Streaming Learning

In supervised machine learning, an agent is typically trained once and t...

Operational Calibration: Debugging Confidence Errors for DNNs in the Field

Trained DNN models are increasingly adopted as integral parts of softwar...

Boosting Operational DNN Testing Efficiency through Conditioning

With the increasing adoption of Deep Neural Network (DNN) models as inte...

Learning to Continuously Optimize Wireless Resource In Episodically Dynamic Environment

There has been a growing interest in developing data-driven and in parti...

Iterative Assessment and Improvement of DNN Operational Accuracy

Deep Neural Networks (DNN) are nowadays largely adopted in many applicat...
