Preformer: Predictive Transformer with Multi-Scale Segment-wise Correlations for Long-Term Time Series Forecasting

02/23/2022
by   Dazhao Du, et al.

Transformer-based methods have shown great potential in long-term time series forecasting. However, most of these methods adopt the standard point-wise self-attention mechanism, which not only becomes intractable for long-term forecasting, since its complexity increases quadratically with the length of the time series, but also cannot explicitly capture predictive dependencies from context, since the corresponding key and value are transformed from the same point. This paper proposes a predictive Transformer-based model called Preformer. Preformer introduces a novel, efficient Multi-Scale Segment-Correlation mechanism that divides the time series into segments and uses segment-wise correlation-based attention to encode it. A multi-scale structure aggregates dependencies at different temporal scales and eases the selection of segment length. Preformer further designs a predictive paradigm for decoding, in which the key and value come from two successive segments rather than the same segment. In this way, if a key segment has a high correlation score with the query segment, its successor segment contributes more to the prediction of the query segment. Extensive experiments demonstrate that Preformer outperforms other Transformer-based methods.
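The segment-wise idea in the abstract can be sketched in a few lines: split the series into segments, score each query segment against each key segment with a single correlation per pair (so attention costs scale with the number of segments, not points), and, in the predictive variant, route each key segment's weight to its *successor* value segment. The function below is a hypothetical NumPy illustration under those assumptions, not the authors' implementation; the name `segment_correlation_attention` and its signature are invented for this sketch.

```python
import numpy as np

def segment_correlation_attention(q, k, v, seg_len, predictive=False):
    """Hypothetical sketch of segment-wise correlation attention.

    q, k, v: arrays of shape (seq_len, d_model); seq_len must be divisible
    by seg_len. With predictive=True, each key segment attends on behalf of
    the value segment that *follows* it, mimicking the decoding paradigm
    described in the abstract.
    """
    L, d = q.shape
    n = L // seg_len
    qs = q.reshape(n, seg_len, d)   # query segments
    ks = k.reshape(n, seg_len, d)   # key segments
    vs = v.reshape(n, seg_len, d)   # value segments
    if predictive:
        # pair key segment i with value segment i+1;
        # the last key segment has no successor, so drop it
        ks, vs = ks[:-1], vs[1:]
    # one correlation score per (query segment, key segment) pair:
    # an element-wise product summed over the whole segment
    scores = np.einsum('qtd,ktd->qk', qs, ks) / np.sqrt(seg_len * d)
    # softmax over key segments
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each query segment receives a weighted sum of (successor) value segments
    out = np.einsum('qk,ktd->qtd', weights, vs)
    return out.reshape(L, d)
```

A multi-scale version, as the abstract suggests, would run this for several values of `seg_len` and aggregate the outputs (e.g. by averaging), trading the quadratic point-wise cost for a quadratic cost in the much smaller number of segments at each scale.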


