
Contrastive Learning for Unsupervised Domain Adaptation of Time Series

by Yilmazcan Özyurt, et al.
ETH Zurich and Universität München

Unsupervised domain adaptation (UDA) aims to learn a machine learning model from a labeled source domain such that the model performs well on a similar yet different, unlabeled target domain. UDA is important in many applications, such as medicine, where it is used to adapt risk scores across different patient cohorts. In this paper, we develop a novel framework for UDA of time series data, called CLUDA. Specifically, we propose a contrastive learning framework that learns domain-invariant semantics in multivariate time series, so that these semantics preserve label information for the prediction task. Our framework further captures semantic variation between the source and target domains via nearest-neighbor contrastive learning. To the best of our knowledge, ours is the first framework to learn domain-invariant semantic information for UDA of time series data. We evaluate our framework on large-scale, real-world medical time series datasets (MIMIC-IV and AmsterdamUMCdb), demonstrating its effectiveness and showing that it achieves state-of-the-art performance for time series UDA.
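The two ingredients named in the abstract, a contrastive objective that pulls augmented views of the same time series window together, and nearest-neighbor positives drawn from a queue of embeddings from the other domain, can be illustrated with a minimal NumPy sketch. This is not the authors' CLUDA implementation; the function names, the temperature value, and the queue handling are our own illustrative assumptions, and the loss shown is a generic InfoNCE formulation.

```python
import numpy as np

def info_nce(z_anchor, z_positive, temperature=0.1):
    """InfoNCE contrastive loss over a batch of embeddings.

    Row i of z_anchor is pulled toward row i of z_positive and pushed
    away from every other row (in-batch negatives).
    """
    # L2-normalize so dot products become cosine similarities
    za = z_anchor / np.linalg.norm(z_anchor, axis=1, keepdims=True)
    zp = z_positive / np.linalg.norm(z_positive, axis=1, keepdims=True)
    logits = za @ zp.T / temperature              # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # matching pairs sit on the diagonal
    return -np.mean(np.diag(log_prob))

def nearest_neighbor_positives(z_query, queue):
    """Replace each positive by its nearest neighbor in a queue of
    embeddings from the other domain (cosine similarity)."""
    zq = z_query / np.linalg.norm(z_query, axis=1, keepdims=True)
    q = queue / np.linalg.norm(queue, axis=1, keepdims=True)
    idx = (zq @ q.T).argmax(axis=1)
    return queue[idx]
```

In a cross-domain setting, one would embed source and target windows with a shared encoder, use `nearest_neighbor_positives` to pick a target-domain neighbor for each source embedding, and minimize `info_nce` on those pairs, so that the encoder is pushed toward representations in which source and target semantics align.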



Related papers:

- CoTMix: Contrastive Domain Adaptation for Time-Series via Temporal Mixup
- Time Series Domain Adaptation via Sparse Associative Structure Alignment
- Time-Series Domain Adaptation via Sparse Associative Structure Alignment: Learning Invariance and Variance
- Source-Free Domain Adaptation with Temporal Imputation for Time Series Data
- CALDA: Improving Multi-Source Time Series Domain Adaptation with Contrastive Adversarial Learning
- MHCCL: Masked Hierarchical Cluster-wise Contrastive Learning for Multivariate Time Series