CALDA: Improving Multi-Source Time Series Domain Adaptation with Contrastive Adversarial Learning

by Janardhan Rao Doppa, et al.
Washington State University

Unsupervised domain adaptation (UDA) provides a strategy for improving machine learning performance in data-rich (target) domains where ground truth labels are inaccessible but can be found in related (source) domains. In cases where meta-domain information such as label distributions is available, weak supervision can further boost performance. We propose a novel framework, CALDA, to tackle these two problems. CALDA synergistically combines the principles of contrastive learning and adversarial learning to robustly support multi-source UDA (MS-UDA) for time series data. Like prior methods, CALDA utilizes adversarial learning to align source and target feature representations. Unlike prior approaches, CALDA additionally leverages label information across source domains: it pulls examples with the same label close to each other while pushing apart examples with different labels, reshaping the feature space through contrastive learning. Unlike prior contrastive adaptation methods, CALDA requires neither data augmentation nor pseudo labeling, both of which may be more challenging for time series. We empirically validate our proposed approach. Based on results from human activity recognition, electromyography, and synthetic datasets, we find that utilizing cross-source information improves performance over prior time series and contrastive methods. Weak supervision further improves performance, even in the presence of noise, allowing CALDA to offer generalizable strategies for MS-UDA. Code is available at:
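The cross-source contrastive idea in the abstract can be sketched as a supervised contrastive objective: each anchor is pulled toward other examples sharing its class label and pushed away from examples with different labels. The minimal sketch below is an illustration of that general mechanism only, not the authors' CALDA implementation; the function name, the softmax-over-similarities form, and the `temperature` value are assumptions for the example.

```python
import numpy as np

def contrastive_loss(features, labels, temperature=0.1):
    """Illustrative supervised contrastive loss: for each anchor,
    maximize similarity to same-label examples (positives) relative
    to all other examples (negatives). Hypothetical sketch, not the
    CALDA objective itself."""
    # L2-normalize so dot products are cosine similarities.
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = feats @ feats.T / temperature
    n = len(labels)
    total, count = 0.0, 0
    for i in range(n):
        others = [j for j in range(n) if j != i]
        positives = [j for j in others if labels[j] == labels[i]]
        if not positives:
            continue  # anchors without positives contribute nothing
        denom = np.sum(np.exp(sim[i, others]))
        for j in positives:
            # -log of the softmax probability assigned to the positive.
            total += -np.log(np.exp(sim[i, j]) / denom)
            count += 1
    return total / count
```

With this loss, a feature space where same-label examples cluster together scores lower (better) than one where labels are scattered, which is the "pull same, push different" reshaping the abstract describes.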
