Non-Iterative Knowledge Fusion in Deep Convolutional Neural Networks

09/25/2018
by Mikhail Iu. Leontev, et al.

Incorporating new knowledge into a neural network while preserving what it has already learned is known to be a nontrivial problem. The problem becomes even harder when the new knowledge is contained not in new training examples but in the parameters (connection weights) of another neural network. Here we propose and test two methods for combining the knowledge contained in separate networks. The first is based on a simple summation of the weights of the constituent networks. The second incorporates the new knowledge by modifying only those weights that are nonessential for preserving the information already stored. We show that with these methods the knowledge of one network can be transferred into another non-iteratively, without any training sessions. The fused network operates efficiently, classifying far above chance level. The efficiency of both methods is quantified on several publicly available data sets in classification tasks, for both shallow and deep neural networks.
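As a rough illustration of the first method, here is a minimal sketch of fusing two networks by combining their weights, assuming both networks share an identical architecture. The function name, the mixing coefficient alpha, and the handling of integer buffers are assumptions made for the example, not the authors' exact procedure.

```python
import torch
import torch.nn as nn

def fuse_by_weight_summation(net_a: nn.Module, net_b: nn.Module,
                             alpha: float = 0.5) -> nn.Module:
    """Return a new network whose weights are a weighted sum of the weights
    of two constituent networks with identical architecture. No iterative
    training is performed on the fused network."""
    fused = type(net_a)()  # assumes a no-argument constructor
    state_a, state_b = net_a.state_dict(), net_b.state_dict()

    fused_state = {}
    for name, w_a in state_a.items():
        w_b = state_b[name]
        if w_a.is_floating_point():
            # Weighted summation of the corresponding weight tensors.
            fused_state[name] = alpha * w_a + (1.0 - alpha) * w_b
        else:
            # Integer buffers (e.g. BatchNorm batch counters) are copied as-is.
            fused_state[name] = w_a.clone()

    fused.load_state_dict(fused_state)
    return fused

# Usage with two small, identically shaped classifiers (hypothetical example):
if __name__ == "__main__":
    class TinyNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.conv = nn.Conv2d(1, 8, 3, padding=1)
            self.fc = nn.Linear(8 * 28 * 28, 10)

        def forward(self, x):
            x = torch.relu(self.conv(x))
            return self.fc(x.flatten(1))

    net_a, net_b = TinyNet(), TinyNet()  # imagine each trained on different classes
    fused = fuse_by_weight_summation(net_a, net_b)
    print(fused(torch.randn(2, 1, 28, 28)).shape)  # torch.Size([2, 10])
```

The second method, which modifies only weights nonessential for already stored information, depends on how weight importance is estimated and is not sketched here.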

Related research

- 02/02/2018: Intriguing Properties of Randomly Weighted Networks: Generalizing While Learning Next to Nothing
- 06/08/2019: Simultaneous Classification and Novelty Detection Using Deep Neural Networks
- 06/23/2014: Committees of deep feedforward networks trained with few data
- 12/05/2016: Improving the Performance of Neural Networks in Regression Tasks Using Drawering
- 02/23/2021: Deep Convolutional Neural Networks with Unitary Weights
- 10/11/2021: Mining the Weights Knowledge for Optimizing Neural Network Structures
- 12/09/2019: Stealing Knowledge from Protected Deep Neural Networks Using Composite Unlabeled Data
