Probabilistic Egocentric Motion Correction of Lidar Point Cloud and Projection to Camera Images for Moving Platforms

by Mao Shan et al.

The fusion of data from heterogeneous sensors is crucial for robust perception in robotics applications that involve moving platforms, such as autonomous vehicle navigation. In particular, combining camera and lidar sensors enables precise range information about the surrounding environment to be projected onto visual images. It also makes it possible to label each lidar point with visual segmentation/classification results for 3D mapping, which facilitates a higher-level understanding of the scene. The task is, however, non-trivial due to intrinsic and extrinsic sensor calibration and the distortion of lidar points caused by the ego-motion of the platform. Although many lidar ego-motion correction methods exist, the errors introduced into the correction process by uncertainty in the ego-motion estimation cannot be removed completely. It is therefore essential to treat the problem as a probabilistic process in which the ego-motion estimation uncertainty is modelled and handled consistently. This paper investigates probabilistic lidar ego-motion correction and lidar-to-camera projection, incorporating both the uncertainty in the ego-motion estimation and the time jitter in the sensor measurements. The proposed approach is validated both in simulation and on real-world data collected from an electric vehicle retrofitted with wide-angle cameras and a 16-beam scanning lidar.
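As a concrete illustration of the two steps described above, the sketch below (not the authors' implementation) de-skews a single lidar point under an assumed constant-velocity planar ego-motion model, projects it into a camera image with a pinhole model, and propagates the ego-motion uncertainty to a pixel-space covariance. All function names, the intrinsic/extrinsic values, and the motion model are illustrative assumptions; a Monte Carlo approximation is used only to keep the sketch short, and the timing jitter mentioned in the abstract is not modelled here.

```python
import numpy as np


def ego_motion(v, w, dt):
    """Pose of the platform dt seconds later, expressed in its current frame,
    under a constant-velocity planar motion model (forward speed v [m/s],
    yaw rate w [rad/s])."""
    yaw = w * dt
    dx = v * dt * np.cos(0.5 * yaw)  # midpoint approximation of the travelled arc
    dy = v * dt * np.sin(0.5 * yaw)
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[0, :2] = [c, -s]
    T[1, :2] = [s, c]
    T[0, 3], T[1, 3] = dx, dy
    return T


def deskew_and_project(p_lidar, dt, v, w, T_cam_lidar, K):
    """De-skew one lidar point captured dt seconds before the reference
    (camera trigger) time, then project it with a pinhole model."""
    T_now_ref = ego_motion(v, w, dt)              # reference frame seen from the capture frame
    p_ref = np.linalg.inv(T_now_ref) @ np.append(p_lidar, 1.0)
    p_cam = T_cam_lidar @ p_ref                   # homogeneous point in the camera frame
    uvw = K @ p_cam[:3]
    return uvw[:2] / uvw[2]                       # pixel coordinates (u, v)


def project_with_uncertainty(p_lidar, dt, v, w, P_vw, T_cam_lidar, K, n_samples=500):
    """Propagate the ego-motion covariance P_vw = cov([v, w]) into pixel space.
    Sampling keeps the sketch short; a first-order (Jacobian) or sigma-point
    propagation would serve the same purpose."""
    rng = np.random.default_rng(0)
    samples = rng.multivariate_normal([v, w], P_vw, size=n_samples)
    pixels = np.array([deskew_and_project(p_lidar, dt, sv, sw, T_cam_lidar, K)
                       for sv, sw in samples])
    return pixels.mean(axis=0), np.cov(pixels.T)


if __name__ == "__main__":
    # Illustrative intrinsics and lidar-to-camera extrinsics (lidar frame:
    # x forward, y left, z up; camera frame: z forward, x right, y down).
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    T_cam_lidar = np.array([[0.0, -1.0, 0.0, 0.0],
                            [0.0, 0.0, -1.0, 0.0],
                            [1.0, 0.0, 0.0, 0.0],
                            [0.0, 0.0, 0.0, 1.0]])
    p = np.array([10.0, 2.0, 0.5])                # lidar point [m]
    mu, cov = project_with_uncertainty(p, dt=0.05, v=10.0, w=0.2,
                                       P_vw=np.diag([0.25, 0.01]),
                                       T_cam_lidar=T_cam_lidar, K=K)
    print("projected pixel mean:", mu)
    print("pixel covariance:\n", cov)
```

The returned pixel covariance is what makes the projection "probabilistic": instead of a single pixel per lidar point, each projected point carries an uncertainty region in the image that grows with the ego-motion uncertainty and with the time offset between the point and the camera exposure.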




