
Unsupervised Scalable Representation Learning for Multivariate Time Series

by   Jean-Yves Franceschi, et al.

Time series constitute a challenging data type for machine learning algorithms, due to their highly variable lengths and sparse labeling in practice. In this paper, we tackle this challenge by proposing an unsupervised method to learn universal embeddings of time series. Unlike previous work, the method scales with respect to series length, and we demonstrate the quality, transferability, and practicability of the learned representations through thorough experiments and comparisons. To this end, we combine an encoder based on causal dilated convolutions with a triplet loss employing time-based negative sampling, obtaining general-purpose representations for variable-length and multivariate time series.
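The two ingredients named in the abstract can be illustrated concretely. Below is a minimal NumPy sketch, not the authors' implementation: one causal dilated convolution layer (each output at time t depends only on inputs at or before t, with taps spaced by the dilation factor) and a triplet loss of the form used with time-based negative sampling, where a reference subsegment is pulled toward an overlapping positive subsegment and pushed away from negatives drawn from other series. All shapes, the dot-product similarity, and the function names here are illustrative assumptions.

```python
import numpy as np

def causal_dilated_conv(x, w, dilation):
    """One causal dilated 1D convolution layer (illustrative sketch).

    x: input of shape (in_channels, time)
    w: weights of shape (out_channels, in_channels, kernel_size)
    Left-pads the input so output[t] depends only on x[..., :t+1].
    """
    k = w.shape[-1]
    pad = dilation * (k - 1)                      # causal left padding
    xp = np.pad(x, ((0, 0), (pad, 0)))
    T = x.shape[1]
    out = np.zeros((w.shape[0], T))
    for t in range(T):
        # taps at times t, t-dilation, ..., t-(k-1)*dilation (padded coords)
        taps = xp[:, t : t + pad + 1 : dilation]  # (in_channels, k)
        out[:, t] = np.einsum('oik,ik->o', w, taps)
    return out

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def triplet_loss(ref, pos, negs):
    """Triplet loss with dot-product similarity (illustrative).

    ref, pos: embeddings of shape (d,) -- reference and an overlapping
    positive subsegment of the same series.
    negs: embeddings of shape (n, d) -- subsegments of other series,
    i.e. the time-based negative samples.
    """
    loss = -np.log(_sigmoid(ref @ pos))           # attract positive
    loss -= np.sum(np.log(_sigmoid(-(negs @ ref))))  # repel negatives
    return loss
```

A quick property check: because of the left-only padding, perturbing the input after time t leaves outputs up to t unchanged, which is what makes stacking such layers with growing dilations suitable for encoding arbitrarily long series.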


Discovering Subdimensional Motifs of Different Lengths in Large-Scale Multivariate Time Series

Detecting repeating patterns of different lengths in time series, also c...

TimeAutoML: Autonomous Representation Learning for Multivariate Irregularly Sampled Time Series

Multivariate time series (MTS) data are becoming increasingly ubiquitous...

Similarity Preserving Representation Learning for Time Series Analysis

A considerable amount of machine learning algorithms take instance-featu...

AUTOSHAPE: An Autoencoder-Shapelet Approach for Time Series Clustering

Time series shapelets are discriminative subsequences that have been rec...

Graph Spectral Embedding for Parsimonious Transmission of Multivariate Time Series

We propose a graph spectral representation of time series data that 1) i...

Representation Learning on Variable Length and Incomplete Wearable-Sensory Time Series

The prevalence of wearable sensors (e.g., smart wristband) is enabling a...