Predictive and Semantic Layout Estimation for Robotic Applications in Manhattan Worlds

11/19/2018
by Armon Shariati, et al.

This paper describes an approach to automatically extracting floor plans from the kinds of incomplete measurements that an autonomous mobile robot could acquire. The approach proceeds by reasoning about extended structural layout surfaces that are automatically extracted from the available data. The scheme can be run online to build watertight representations of the environment. The system effectively speculates about room boundaries and free-space regions, which provides useful guidance to subsequent motion-planning systems. Experimental results are presented on multiple data sets.
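A core ingredient of layout estimation in a Manhattan world is the assumption that structural surfaces (walls, floors, ceilings) align with one of three mutually orthogonal dominant directions. The following minimal Python sketch illustrates that idea by snapping noisy wall-surface normals to the nearest Manhattan axis; the function name and the use of a fixed identity-matrix frame are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def snap_to_manhattan(normals):
    """Assign each estimated surface normal to the nearest Manhattan axis.

    Illustrative sketch of the Manhattan-world assumption: every layout
    surface is aligned with one of three mutually orthogonal dominant
    directions. Here the Manhattan frame is taken to be the identity for
    simplicity; in practice it would be estimated from the data.
    Returns an axis index (0, 1, or 2) per input normal.
    """
    axes = np.eye(3)  # dominant directions of the (assumed) Manhattan frame
    normals = np.asarray(normals, dtype=float)
    normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    # Use |cos| similarity so a wall and its opposite-facing wall
    # map to the same axis.
    sims = np.abs(normals @ axes.T)
    return sims.argmax(axis=1)

# Noisy normals for two walls and a floor
labels = snap_to_manhattan([[0.99, 0.05, 0.0],
                            [0.04, -0.98, 0.1],
                            [0.02, 0.03, 1.0]])
print(labels)  # → [0 1 2]
```

Once surfaces are grouped by axis, axis-aligned wall segments can be intersected to hypothesize room boundaries and closed, watertight regions.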
