Exploring the Common Principal Subspace of Deep Features in Neural Networks

by Haoran Liu, et al.

We find that different Deep Neural Networks (DNNs) trained on the same dataset share a common principal subspace in their latent spaces, regardless of the architecture (e.g., Convolutional Neural Networks (CNNs), Multi-Layer Perceptrons (MLPs), and Autoencoders (AEs)) and regardless of whether labels were used in training (e.g., supervised, unsupervised, and self-supervised learning). Specifically, we design a new metric, the 𝒫-vector, to represent the principal subspace of the deep features learned by a DNN, and propose to measure the angles between principal subspaces using 𝒫-vectors. Small angles (with cosine close to 1.0) are found in comparisons between any two DNNs trained with different algorithms or architectures. Furthermore, during training from random initialization, the angle decreases from a large value (usually 70°-80°) to a small one, which coincides with the progress of feature-space learning from scratch to convergence. We then carry out case studies measuring the angle between the 𝒫-vector and the principal subspace of the training dataset, and connect this angle to generalization performance. Extensive experiments with practically used MLPs, AEs, and CNNs on classification, image reconstruction, and self-supervised learning tasks over the MNIST, CIFAR-10, and CIFAR-100 datasets support our claims with solid evidence.

Keywords: Interpretability of Deep Learning, Feature Learning, Subspaces of Deep Features
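As a rough sketch of the idea (not the authors' exact definition), the 𝒫-vector of a feature matrix can be taken as its top principal direction, and the angle between two networks' principal subspaces as the absolute cosine between their 𝒫-vectors. The helper names `p_vector` and `subspace_angle_cosine` below are our own illustrative choices:

```python
import numpy as np

def p_vector(features):
    """Top principal direction of a (n_samples, dim) deep-feature matrix."""
    centered = features - features.mean(axis=0)
    # First right-singular vector of the centered data = first principal component.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def subspace_angle_cosine(feat_a, feat_b):
    """Cosine of the angle between the principal subspaces of two feature sets."""
    va, vb = p_vector(feat_a), p_vector(feat_b)
    # Absolute value: a principal direction is only defined up to sign.
    return abs(np.dot(va, vb))
```

If two networks' features share a dominant direction, this cosine is close to 1.0, matching the small angles the abstract reports; for features from an untrained network it would typically be much smaller.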




