Distilling EEG Representations via Capsules for Affective Computing

04/30/2021
by Guangyi Zhang, et al.

Affective computing with Electroencephalogram (EEG) is a challenging task that requires cumbersome models to effectively learn the information contained in large-scale EEG signals, causing difficulties for real-time smart-device deployment. In this paper, we propose a novel knowledge distillation pipeline to distill EEG representations via capsule-based architectures for both classification and regression tasks. Our goal is to distill information from a heavy model to a lightweight model for subject-specific tasks. To this end, we first pre-train a large model (teacher network) on a large number of training samples. Then, we adopt a lightweight model (student network) that mimics the teacher using privileged knowledge, learning the discriminative features embedded in the teacher's capsules. This privileged information, learned by the teacher, consists of similarities among capsules and is only available during the training stage of the student network. We evaluate the proposed architecture on two large-scale public EEG datasets, showing that our framework consistently enables student networks with different compression ratios to effectively learn from the teacher, even when provided with limited training samples. Lastly, our method achieves state-of-the-art results on one of the two datasets.
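For illustration, below is a minimal PyTorch sketch of the teacher-student distillation step described in the abstract. The capsule architectures, the cosine-similarity measure, and the hyperparameters (temperature T, weights alpha and beta) are assumptions for illustration only, not the authors' exact configuration.

# Sketch of distilling capsule-based EEG representations from a frozen teacher
# into a lightweight student. Architectures and hyperparameters are placeholders.
import torch
import torch.nn.functional as F

def capsule_similarity(capsules):
    # capsules: (batch, num_capsules, capsule_dim)
    # Pairwise cosine similarities among capsule vectors -- one plausible way to
    # express the "privileged" relational knowledge passed from teacher to student.
    caps = F.normalize(capsules, dim=-1)
    return caps @ caps.transpose(1, 2)  # (batch, num_capsules, num_capsules)

def distillation_loss(student_logits, teacher_logits,
                      student_caps, teacher_caps,
                      labels, T=4.0, alpha=0.5, beta=0.1):
    # Standard task loss on the hard labels.
    task = F.cross_entropy(student_logits, labels)
    # Soft-label loss: the student mimics the teacher's softened class distribution.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)
    # Privileged-knowledge loss: match the capsule-to-capsule similarity structure.
    sim = F.mse_loss(capsule_similarity(student_caps),
                     capsule_similarity(teacher_caps))
    return (1 - alpha) * task + alpha * soft + beta * sim

# Training-loop sketch: the pre-trained teacher is frozen; only the student updates.
# teacher, student = pretrained_teacher(), lightweight_student()  # hypothetical builders
# for eeg, labels in loader:
#     with torch.no_grad():
#         t_logits, t_caps = teacher(eeg)
#     s_logits, s_caps = student(eeg)
#     loss = distillation_loss(s_logits, t_logits, s_caps, t_caps, labels)
#     loss.backward(); optimizer.step(); optimizer.zero_grad()

At inference time only the student is deployed, so the similarity term (and the teacher itself) is needed solely during training, matching the paper's notion of knowledge available only at the student's training stage.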

Related research

03/26/2022 - Knowledge Distillation with the Reused Teacher Classifier
05/12/2023 - A Lightweight Domain Adversarial Neural Network Based on Knowledge Distillation for EEG-based Cross-subject Emotion Recognition
12/06/2022 - Enhancing Low-Density EEG-Based Brain-Computer Interfaces with Similarity-Keeping Knowledge Distillation
08/08/2023 - Teacher-Student Architecture for Knowledge Distillation: A Survey
05/25/2019 - ShrinkTeaNet: Million-scale Lightweight Face Recognition via Shrinking Teacher-Student Networks
11/07/2018 - Amalgamating Knowledge towards Comprehensive Classification
04/23/2019 - Student Becoming the Master: Knowledge Amalgamation for Joint Scene Parsing, Depth Estimation, and More