Parallel Weight Consolidation: A Brain Segmentation Case Study

05/28/2018
by Patrick McClure, et al.

Collecting the large datasets needed to train deep neural networks can be very difficult, particularly for the many applications in which sharing and pooling data is complicated by practical, ethical, or legal concerns. However, derivative datasets or predictive models developed within individual sites can often be shared and combined with fewer restrictions. Training networks on distributed datasets and combining them is often framed as continual learning, but existing continual learning methods require the networks to be trained sequentially. In this paper, we introduce parallel weight consolidation (PWC), a continual learning method for consolidating the weights of neural networks trained in parallel on independent datasets. We perform a brain segmentation case study using PWC to consolidate several dilated convolutional neural networks trained in parallel on independent structural magnetic resonance imaging (sMRI) datasets from different sites. We found that PWC improved performance on held-out test sets from the different sites, as well as on a very large and completely independent multi-site dataset. This demonstrates the feasibility of PWC for combining the knowledge learned by networks trained on different datasets.
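As a rough illustration of what "consolidating" parallel-trained weights can look like (a minimal sketch, not the paper's actual code), suppose each site's network is Bayesian with a diagonal Gaussian approximate posterior over its weights. The posteriors can then be combined in closed form via the product rule p(w | D_1, ..., D_n) ∝ p(w) ∏_i [ p(w | D_i) / p(w) ], which for Gaussians reduces to precision-weighted averaging. The function name, the diagonal-Gaussian assumption, and the standard-normal prior are all illustrative assumptions.

```python
import numpy as np

def consolidate_gaussian_posteriors(means, variances,
                                    prior_mean=0.0, prior_var=1.0):
    """Combine per-site diagonal Gaussian weight posteriors in closed form.

    Applies p(w | D_1..D_n) ∝ p(w) * prod_i [ q_i(w) / p(w) ], which for
    Gaussians is precision-weighted averaging. Illustrative sketch only;
    assumes the data terms dominate so the combined precision is positive.
    """
    means = np.asarray(means)                    # shape: (n_sites, n_weights)
    precisions = 1.0 / np.asarray(variances)     # per-site weight precisions
    prior_precision = 1.0 / prior_var
    n = means.shape[0]

    # Combined precision: sum of site precisions, minus the (n - 1) extra
    # copies of the prior precision introduced by the product rule.
    combined_precision = precisions.sum(axis=0) - (n - 1) * prior_precision

    # Combined mean: precision-weighted sum of site means, correcting for
    # the over-counted prior mean.
    combined_mean = (
        (precisions * means).sum(axis=0)
        - (n - 1) * prior_precision * prior_mean
    ) / combined_precision

    return combined_mean, 1.0 / combined_precision

# Example: consolidate two sites' posteriors over three weights.
mu, var = consolidate_gaussian_posteriors(
    means=[[0.5, -1.0, 2.0], [0.7, -0.8, 1.5]],
    variances=[[0.1, 0.2, 0.1], [0.2, 0.1, 0.3]],
)
print(mu, var)
```

Because each site only needs to share its posterior parameters, the sites can train entirely in parallel, in contrast to sequential continual learning methods.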
