Soft Sensing Transformer: Hundreds of Sensors are Worth a Single Word

11/10/2021
by   Chao Zhang, et al.

With the rapid development of AI technology in recent years, many studies have applied deep learning models to soft sensing. The models, however, have grown more complex while the data sets remain limited: researchers are fitting million-parameter models with hundreds of data samples, which is insufficient to exercise the effectiveness of those models, and they therefore often fail when deployed in industrial applications. To address this long-standing problem, we are releasing large-scale, high-dimensional time-series manufacturing sensor data from Seagate Technology to the public. We demonstrate the challenges and effectiveness of modeling industrial big data with a Soft Sensing Transformer model on these data sets. We use the transformer because it has outperformed state-of-the-art techniques in natural language processing, and has since performed well when applied directly to computer vision without the introduction of image-specific inductive biases. Observing the structural similarity between a sentence and a sequence of sensor readings, we process multi-variable sensor readings over time in the same manner as sentences in natural language: the high-dimensional time-series data is formatted into the same shape as embedded sentences and fed into the transformer model. The results show that the transformer model outperforms the benchmark models in the soft-sensing field based on auto-encoders and long short-term memory (LSTM) networks. To the best of our knowledge, we are the first team in academia or industry to benchmark the performance of the original transformer model on large-scale numerical soft-sensing data.
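The sentence analogy described above can be sketched in a few lines: each timestep's vector of sensor readings is treated like a "word", linearly projected to the model dimension, and given a positional encoding, producing the same `(sequence_length, d_model)` shape a transformer expects for embedded tokens. This is a minimal illustration, not the paper's implementation; the projection weights below are random stand-ins for a learned embedding layer, and the window size and dimensions are assumed for the example.

```python
import numpy as np

def sinusoidal_positions(seq_len, d_model):
    """Standard sinusoidal positional encoding."""
    pos = np.arange(seq_len)[:, None]        # (seq_len, 1)
    i = np.arange(d_model)[None, :]          # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    # even indices get sine, odd indices get cosine
    return np.where(i % 2 == 0, np.sin(angle), np.cos(angle))

def embed_sensor_sequence(readings, d_model, rng=None):
    """Treat each timestep's sensor vector as a 'word': project it to
    d_model dimensions and add positional encoding, yielding the
    (seq_len, d_model) shape of an embedded sentence.

    readings: (timesteps, n_sensors) array of raw sensor values.
    The linear projection is a random stand-in for a learned layer.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    t, s = readings.shape
    w = rng.standard_normal((s, d_model)) / np.sqrt(s)  # stand-in weights
    tokens = readings @ w                               # (t, d_model)
    return tokens + sinusoidal_positions(t, d_model)

# A window of 16 timesteps from 300 sensors becomes 16 "words" of width 64,
# ready to feed to any standard transformer encoder.
x = np.random.default_rng(1).standard_normal((16, 300))
emb = embed_sensor_sequence(x, d_model=64)
print(emb.shape)
```

The point of the sketch is only the shape correspondence: once the sensor window looks like an embedded sentence, the unmodified transformer architecture applies directly.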

