Event-LSTM: An Unsupervised and Asynchronous Learning-based Representation for Event-based Data

05/10/2021
by Lakshmi Annamalai, et al.

Event cameras are activity-driven, bio-inspired vision sensors that offer advantages such as sparsity, high temporal resolution, low latency, and low power consumption. Given the different sensing modality of event cameras and the maturity of the conventional vision paradigm, event processing is predominantly solved by transforming the sparse and asynchronous events into a 2D grid and subsequently applying standard vision pipelines. Despite the promising results of learning-based approaches to 2D grid generation, these approaches treat the task in a supervised manner, and labeled task-specific ground-truth event data is challenging to acquire. To overcome this limitation, we propose Event-LSTM, an unsupervised auto-encoder architecture made up of LSTM layers, as a promising alternative for learning 2D grid representations from event sequences. Compared to competing supervised approaches, ours is a task-agnostic approach ideally suited to the event domain, where task-specific labeled data is scarce. We also tailor the proposed solution to exploit the asynchronous nature of the event stream, which gives it desirable characteristics such as speed-invariant and energy-efficient 2D grid generation. In addition, we push state-of-the-art event de-noising forward by introducing memory into the de-noising process. Evaluations on activity recognition and gesture recognition demonstrate that our approach yields improvements over state-of-the-art approaches while providing the flexibility to learn from unlabeled data.
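To make the event-to-grid setting concrete, the sketch below shows (a) the classic hand-crafted baseline that learned representations such as Event-LSTM aim to replace — accumulating per-pixel event counts into a fixed 2D grid — and (b) how an asynchronous stream of (x, y, t, polarity) tuples can be grouped into per-pixel temporal sequences, the kind of input a per-pixel recurrent model could consume. This is an illustrative sketch only; the function names and data layout are assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def events_to_count_grid(events, height, width):
    """Hand-crafted baseline: a fixed 2D grid of per-pixel event counts.
    Learned representations replace this accumulation with a trainable map."""
    grid = np.zeros((height, width), dtype=np.float32)
    for x, y, t, p in events:
        grid[y, x] += 1.0  # every event at (x, y) bumps that pixel's count
    return grid

def events_to_pixel_sequences(events):
    """Group the asynchronous stream into per-pixel (timestamp, polarity)
    sequences, preserving temporal order within each pixel -- a plausible
    input format for an LSTM that emits one grid value per pixel."""
    seqs = {}
    for x, y, t, p in events:
        seqs.setdefault((y, x), []).append((t, p))
    return seqs

# Toy stream: two events at pixel (0, 0), one at pixel (2, 3).
events = [(0, 0, 0.01, 1), (0, 0, 0.02, -1), (3, 2, 0.05, 1)]
grid = events_to_count_grid(events, height=4, width=4)
seqs = events_to_pixel_sequences(events)
```

Because the per-pixel sequences keep raw timestamps rather than binning them into frames, a recurrent model consuming them can in principle update its output grid event by event, which is what makes an asynchronous, speed-invariant formulation possible.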


