Dual Aspect Self-Attention based on Transformer for Remaining Useful Life Prediction

06/30/2021
by Zhizheng Zhang, et al.

Remaining useful life (RUL) prediction is one of the key technologies of condition-based maintenance and is important for maintaining the reliability and safety of industrial equipment. While deep learning has achieved great success in RUL prediction, existing methods struggle to process long sequences and to extract information along both the sensor and time-step aspects. In this paper, we propose Dual Aspect Self-attention based on Transformer (DAST), a novel deep RUL prediction method. DAST consists of two encoders that work in parallel to extract features of different sensors and time steps simultaneously. Built solely on self-attention, the DAST encoders process long data sequences more effectively and adaptively learn to focus on the more important parts of the input. Moreover, the parallel feature-extraction design prevents information from the two aspects from interfering with each other. Experimental results on two real turbofan engine datasets show that our method significantly outperforms state-of-the-art methods.
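The dual-aspect idea in the abstract can be illustrated with a minimal numpy sketch: one self-attention pass treats each time step as a token (time-step aspect), a second treats each sensor's series as a token (sensor aspect), and the two feature streams are computed in parallel and fused afterward. This is an illustrative toy, not the authors' DAST implementation; the single-head attention, random weights, dimensions, and mean-pool fusion are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, rng, d_k=8):
    """Single-head scaled dot-product self-attention over a token sequence.

    tokens: (n_tokens, d_model) array; returns (n_tokens, d_k) features.
    """
    n, d = tokens.shape
    Wq, Wk, Wv = (rng.standard_normal((d, d_k)) for _ in range(3))  # toy random projections
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    A = softmax(Q @ K.T / np.sqrt(d_k))  # (n, n) attention weights
    return A @ V

rng = np.random.default_rng(0)
T, S = 30, 14  # time steps x sensors (shapes are illustrative, not from the paper)
X = rng.standard_normal((T, S))  # one run-to-failure window of sensor readings

# Time-step aspect: each time step is a token of S sensor readings.
time_feats = self_attention(X, rng)      # (T, 8)
# Sensor aspect: each sensor's T-length series is a token; attend across sensors.
sensor_feats = self_attention(X.T, rng)  # (S, 8)

# The two encoders run independently, so neither aspect's features influence
# the other before fusion. Here: mean-pool each stream and concatenate,
# yielding a vector a downstream RUL regression head could consume.
fused = np.concatenate([time_feats.mean(axis=0), sensor_feats.mean(axis=0)])
print(fused.shape)  # (16,)
```

Running both attention passes on the raw input, rather than stacking one after the other, is what keeps the sensor-wise and time-wise representations from mixing prematurely.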


