A Hybrid mmWave and Camera System for Long-Range Depth Imaging

by Diana Zhang et al.

mmWave radars offer excellent depth resolution owing to their high bandwidth at mmWave radio frequencies. Yet they suffer intrinsically from poor angular resolution, which is an order of magnitude worse than that of camera systems, and are therefore not a capable 3-D imaging solution in isolation. We propose Metamoran, a system that combines the complementary strengths of radar and camera systems to obtain accurate, high-azimuth-resolution depth images at distances of several tens of meters, all from a single fixed vantage point. Metamoran enables rich long-range depth imaging outdoors, with applications to roadside safety infrastructure, surveillance, and wide-area mapping. Our key insight is to exploit the high azimuth resolution of cameras using computer vision techniques, including image segmentation and monocular depth estimation, to obtain object shapes, and to use these shapes as priors for our novel specular beamforming algorithm. We also design this algorithm to work in cluttered environments with weak reflections and in partially occluded scenarios. We perform a detailed evaluation of Metamoran's depth imaging and sensing capabilities in 200 diverse scenes in a major U.S. city. Our evaluation shows that Metamoran estimates the depth of an object up to 60 m away with a median error of 28 cm, an improvement of 13× over a naive radar+camera baseline and 23× over monocular depth estimation.
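The fusion idea can be illustrated with a minimal sketch: a camera-derived monocular depth estimate is coarse but resolves which object we are looking at, while the radar range profile is precise but cluttered. A simple (hypothetical, not Metamoran's actual pipeline) combination searches the radar profile only within a window around the camera prior, so a weak true return is not lost to strong clutter elsewhere. All function names and the toy signal model below are assumptions for demonstration.

```python
# Illustrative sketch of camera-prior-guided radar depth selection.
# Not the paper's algorithm: a toy stand-in for the idea of using
# vision-derived priors to disambiguate radar range profiles.
import numpy as np

def fuse_depth(radar_profile, range_bins, mono_prior_m, search_window_m=10.0):
    """Pick the strongest radar return within a window around the coarse
    monocular depth prior, instead of the global maximum (which may be
    clutter). Falls back to the camera prior if no radar bin qualifies."""
    mask = np.abs(range_bins - mono_prior_m) <= search_window_m
    if not mask.any():
        return mono_prior_m
    idx = np.argmax(np.where(mask, radar_profile, -np.inf))
    return float(range_bins[idx])

# Toy scene: strong clutter at 20 m, weak true object at 60 m,
# monocular estimate biased to 50 m.
range_bins = np.linspace(0.0, 100.0, 1001)
profile = 1.0 * np.exp(-(range_bins - 20.0) ** 2 / 0.5)   # clutter peak
profile += 0.6 * np.exp(-(range_bins - 60.0) ** 2 / 0.5)  # weak true return
print(fuse_depth(profile, range_bins, mono_prior_m=50.0))  # → 60.0
```

Without the prior, a global argmax would lock onto the 20 m clutter peak; the windowed search recovers the true 60 m return, mirroring the paper's motivation for shape-and-depth priors in cluttered, weak-reflection scenes.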


