Explainable Action Prediction through Self-Supervision on Scene Graphs

02/07/2023
by Pawit Kochakarn, et al.

This work explores scene graphs as a distilled representation of high-level information for autonomous driving, applied to future driver-action prediction. Given the scarcity and strong imbalance of the data samples, we propose a self-supervision pipeline to infer representative and well-separated embeddings. Interpretability and explainability are key aspects; as such, we embed attention mechanisms in our architecture that produce spatial and temporal heatmaps over the scene graphs. We evaluate our system on the ROAD dataset against a fully-supervised approach, showing the superiority of our training regime.
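
The abstract only sketches the architecture, so the following is an illustrative, assumption-laden sketch rather than the authors' implementation: a minimal PyTorch model in which graph-attention weights over scene-graph nodes serve as spatial heatmaps and an attention pooling over frames serves as a temporal heatmap. All layer choices, dimensions, and the toy input are hypothetical.

```python
# Minimal sketch (not the ROAD pipeline): scene-graph encoder whose spatial and
# temporal attention weights can be read out as explanation heatmaps.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GraphAttentionLayer(nn.Module):
    """Single-head graph attention over a dense adjacency matrix."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor):
        # x: (N, in_dim) node features; adj: (N, N) adjacency with self-loops.
        h = self.proj(x)                                        # (N, out_dim)
        n = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(n, n, -1),
             h.unsqueeze(0).expand(n, n, -1)], dim=-1)          # (N, N, 2*out_dim)
        scores = F.leaky_relu(self.attn(pairs)).squeeze(-1)     # (N, N)
        scores = scores.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(scores, dim=-1)                   # spatial heatmap per node
        return F.elu(alpha @ h), alpha


class SceneGraphSequenceEncoder(nn.Module):
    """Encodes a sequence of scene graphs and attends over time."""

    def __init__(self, node_dim: int, hidden_dim: int, num_actions: int):
        super().__init__()
        self.gat = GraphAttentionLayer(node_dim, hidden_dim)
        self.temporal_query = nn.Parameter(torch.randn(hidden_dim))
        self.head = nn.Linear(hidden_dim, num_actions)

    def forward(self, graphs):
        # graphs: list of (node_features, adjacency) pairs, one per frame.
        frame_embs, spatial_maps = [], []
        for x, adj in graphs:
            h, alpha = self.gat(x, adj)
            frame_embs.append(h.mean(dim=0))                    # graph-level read-out
            spatial_maps.append(alpha)
        frames = torch.stack(frame_embs)                        # (T, hidden_dim)
        t_weights = torch.softmax(frames @ self.temporal_query, dim=0)  # temporal heatmap
        pooled = (t_weights.unsqueeze(-1) * frames).sum(dim=0)
        return self.head(pooled), spatial_maps, t_weights


if __name__ == "__main__":
    # Toy example: 4 frames, each a 5-node scene graph with 8-dim node features.
    torch.manual_seed(0)
    seq = [(torch.randn(5, 8), torch.ones(5, 5)) for _ in range(4)]
    model = SceneGraphSequenceEncoder(node_dim=8, hidden_dim=16, num_actions=6)
    logits, spatial_maps, temporal_weights = model(seq)
    print(logits.shape, temporal_weights)
```

In a self-supervised regime of the kind the abstract describes, the pooled graph embeddings could, for example, first be shaped with a contrastive objective before the action head is fitted; the `spatial_maps` and `temporal_weights` tensors returned above are what would be rendered as heatmaps on the scene graphs.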
