Foundations of Spatial Perception for Robotics: Hierarchical Representations and Real-time Systems

by Nathan Hughes et al.

3D spatial perception is the problem of building and maintaining an actionable and persistent representation of the environment in real time using sensor data and prior knowledge. Despite the fast-paced progress in robot perception, most existing methods either build purely geometric maps (as in traditional SLAM) or flat metric-semantic maps that do not scale to large environments or large dictionaries of semantic labels. The first part of this paper is concerned with representations: we show that scalable representations for spatial perception need to be hierarchical in nature. Hierarchical representations are efficient to store, and lead to layered graphs with small treewidth, which enables provably efficient inference. We then introduce an example of a hierarchical representation for indoor environments, namely a 3D scene graph, and discuss its structure and properties. The second part of the paper focuses on algorithms to incrementally construct a 3D scene graph as the robot explores the environment. Our algorithms combine 3D geometry, topology (to cluster the places into rooms), and geometric deep learning (e.g., to classify the type of rooms the robot is moving across). The third part of the paper focuses on algorithms to maintain and correct 3D scene graphs during long-term operation. We propose hierarchical descriptors for loop closure detection and describe how to correct a scene graph in response to loop closures by solving a 3D scene graph optimization problem. We conclude the paper by combining the proposed perception algorithms into Hydra, a real-time spatial perception system that builds a 3D scene graph from visual-inertial data. We showcase Hydra's performance in photo-realistic simulations and on real data collected by a Clearpath Jackal robot and a Unitree A1 robot. We release an open-source implementation of Hydra at




