Unsupervised construction of representations for oil wells via Transformers

12/29/2022
by Alina Rogulina, et al.

Determining and predicting reservoir formation properties for newly drilled wells is a significant challenge. One way to evaluate these properties is through well-interval similarity. Many methodologies for similarity learning exist, from rule-based approaches to deep neural networks. Since the data are sequential, recent work has adopted recurrent neural networks to build similarity models. However, such an approach suffers from short-term memory: it pays more attention to the end of a sequence. Neural networks with the Transformer architecture instead attend over the entire sequence when making a decision. To make them more efficient in terms of computational time, we introduce a limited attention mechanism similar to those in the Informer and Performer architectures. We conduct experiments on open datasets covering more than 20 wells, making our experiments reliable and suitable for industrial use. The best results were obtained with our adaptation of the Informer variant of the Transformer, with ROC AUC 0.982. It outperforms classical approaches (ROC AUC 0.824), recurrent neural networks (ROC AUC 0.934), and a straightforward application of Transformers (ROC AUC 0.961).
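The limited-attention idea can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a simplified NumPy illustration, loosely inspired by Informer's ProbSparse attention, in which only the `u` most "active" queries (those with the most peaked attention logits) attend to the keys, while the remaining queries fall back to the mean of the values. The function names and the sparsity heuristic are assumptions made for this example.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def full_attention(Q, K, V):
    # Standard scaled dot-product attention: every query attends to every key.
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

def prob_sparse_attention(Q, K, V, u):
    # Simplified Informer-style limited attention (illustrative, not the
    # paper's exact method): score each query by a sparsity measure
    # (max minus mean of its attention logits), let only the top-u queries
    # attend to all keys, and give the remaining "lazy" queries the mean of V.
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d)
    sparsity = logits.max(axis=1) - logits.mean(axis=1)
    top = np.argsort(-sparsity)[:u]
    out = np.tile(V.mean(axis=0), (Q.shape[0], 1))   # default: mean of values
    out[top] = softmax(logits[top], axis=-1) @ V     # active queries attend fully
    return out
```

With `u` equal to the sequence length this reduces to full attention; smaller `u` trades accuracy for fewer softmax rows, which is the source of the computational savings the abstract refers to.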


research
04/04/2021

TransfoRNN: Capturing the Sequential Information in Self-Attention Representations for Language Modeling

In this paper, we describe the use of recurrent neural networks to captu...
research
04/30/2019

Very Deep Self-Attention Networks for End-to-End Speech Recognition

Recently, end-to-end sequence-to-sequence models for speech recognition ...
research
06/21/2023

Probing the limit of hydrologic predictability with the Transformer network

For a number of years since its introduction to hydrology, recurrent neu...
research
10/25/2020

Attention is All You Need in Speech Separation

Recurrent Neural Networks (RNNs) have long been the dominant architectur...
research
05/21/2023

Temporal Fusion Transformers for Streamflow Prediction: Value of Combining Attention with Recurrence

Over the past few decades, the hydrology community has witnessed notable...
research
04/20/2020

Recurrent Convolutional Neural Networks help to predict location of Earthquakes

We examine the applicability of modern neural network architectures to t...
research
03/26/2021

A Practical Survey on Faster and Lighter Transformers

Recurrent neural networks are effective models to process sequences. How...
