Deconstructing Data Reconstruction: Multiclass, Weight Decay and General Losses

07/04/2023
by Gon Buzaglo et al.

Memorization of training data is an active research area, yet our understanding of the inner workings of neural networks is still in its infancy. Recently, Haim et al. (2022) proposed a scheme to reconstruct training samples from multilayer perceptron binary classifiers, effectively demonstrating that a large portion of the training samples are encoded in the parameters of such networks. In this work, we extend their findings in several directions, including reconstruction from multiclass and convolutional neural networks. We derive a more general reconstruction scheme that is applicable to a wider range of loss functions, such as regression losses. Moreover, we study the various factors that contribute to networks' susceptibility to such reconstruction schemes. Intriguingly, we observe that using weight decay during training increases reconstructability, both in the number of samples recovered and in their quality. Additionally, we examine the influence of the number of neurons relative to the number of training samples on reconstructability.
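The abstract leaves the mechanics implicit, but the scheme it extends (Haim et al., 2022) rests on the implicit bias of gradient descent: for homogeneous networks trained to separate binary data, the parameters converge in direction to a KKT point of the max-margin problem, so θ ≈ Σ_i λ_i y_i ∇_θ f(θ; x_i) with λ_i ≥ 0, where the sum ranges over training samples. Reconstruction then amounts to optimizing candidate inputs and multipliers until this stationarity residual vanishes. The PyTorch sketch below is a minimal illustration of that idea in the original binary setting; `kkt_residual`, `reconstruct`, the fixed label guesses, and all hyperparameters are illustrative assumptions, not the authors' released code.

```python
import torch

def kkt_residual(model, xs, ys, lams):
    """Squared norm of theta - sum_i lambda_i * y_i * grad_theta f(theta; x_i),
    the stationarity condition the trained parameters approximately satisfy."""
    params = [p for p in model.parameters()]
    combo = None
    for i in range(xs.shape[0]):
        out = model(xs[i:i + 1]).squeeze()  # scalar margin f(theta; x_i), binary case
        grads = torch.autograd.grad(out, params, create_graph=True)
        term = [lams[i].clamp(min=0) * ys[i] * g for g in grads]  # enforce lambda_i >= 0
        combo = term if combo is None else [c + t for c, t in zip(combo, term)]
    # detach() keeps the trained weights fixed; only xs and lams are optimized
    return sum(((p.detach() - c) ** 2).sum() for p, c in zip(params, combo))

def reconstruct(model, m, dim, steps=1000, lr=0.01):
    """Optimize m candidate inputs and multipliers; candidates that drive the
    residual toward zero tend to land on actual training samples."""
    xs = torch.randn(m, dim, requires_grad=True)   # candidate reconstructions
    lams = torch.rand(m, requires_grad=True)       # candidate KKT multipliers
    ys = torch.tensor([1.0 if i < m // 2 else -1.0 for i in range(m)])  # fixed label guesses
    opt = torch.optim.Adam([xs, lams], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        kkt_residual(model, xs, ys, lams).backward()
        opt.step()
    return xs.detach()
```

The paper's more general scheme extends this residual beyond binary classification to multiclass networks, convolutional architectures, and regression-style losses; the sketch above covers only the binary setting it builds on.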

Related research

Reconstructing Training Data from Multiclass Neural Networks (05/05/2023)
Reconstructing samples from the training set of trained neural networks ...

Mildly Overparametrized Neural Nets can Memorize Training Data Efficiently (09/26/2019)
It has been observed (Zhang et al., 2016) that deep neural networks ca...

GraphConnect: A Regularization Framework for Neural Networks (12/21/2015)
Deep neural networks have proved very successful in domains where large ...

Pre-interpolation loss behaviour in neural networks (03/14/2021)
When training neural networks as classifiers, it is common to observe an...

Reconstruction of training samples from loss functions (05/18/2018)
This paper presents a new mathematical framework to analyze the loss fun...

Omnigrok: Grokking Beyond Algorithmic Data (10/03/2022)
Grokking, the unusual phenomenon for algorithmic datasets where generali...

Polarity is all you need to learn and transfer faster (03/29/2023)
Natural intelligences (NIs) thrive in a dynamic world - they learn quick...
