Radar Enlighten the Dark: Enhancing Low-Visibility Perception for Automated Vehicles with Camera-Radar Fusion

05/27/2023
by   Can Cui, et al.

Sensor fusion is a crucial augmentation technique for improving the accuracy and reliability of perception systems for automated vehicles under diverse driving conditions. However, adverse weather and low-light conditions remain challenging: sensor performance degrades significantly, exposing vehicle safety to potential risks. Advanced sensors such as LiDARs can help mitigate the issue, but at extremely high marginal cost. In this paper, we propose a novel transformer-based 3D object detection model, "REDFormer", to tackle low-visibility conditions, offering a more practical and cost-effective solution based on bird's-eye-view (BEV) camera-radar fusion. Using the nuScenes dataset with multi-radar point clouds, weather information, and time-of-day data, our model outperforms state-of-the-art (SOTA) models in classification and detection accuracy. Finally, we provide extensive ablation studies of each model component regarding its contribution to addressing the above-mentioned challenges. In particular, the experiments show that our model achieves a significant performance improvement over the baseline model in low-visibility scenarios, exhibiting a 31.31% increase in rainy scenes and a 46.99% enhancement in nighttime scenes. The source code of this study is publicly available.
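The abstract describes BEV-level camera-radar fusion only at a high level. As a rough illustration of the general idea (not the authors' REDFormer code), the sketch below rasterizes radar returns into a BEV grid and concatenates the result with camera BEV features before a small convolutional mixer. The names (rasterize_radar_bev, BevFusion), the grid size, the BEV extent, and the chosen radar channels are all illustrative assumptions, not details taken from the paper.

# Minimal sketch of BEV-level camera-radar fusion (illustrative only; not REDFormer).
# Assumptions: radar points are (x, y, rcs, vx, vy) in the ego frame, and camera BEV
# features have already been produced by some view-transform backbone.
import torch
import torch.nn as nn


def rasterize_radar_bev(points, grid=(128, 128), extent=51.2):
    """Scatter radar returns into a 3-channel BEV grid: occupancy, RCS, speed.

    points: (N, 5) tensor of (x, y, rcs, vx, vy); extent is the half-range in meters.
    """
    H, W = grid
    bev = torch.zeros(3, H, W)
    # Map metric coordinates in [-extent, extent) to pixel indices.
    ix = ((points[:, 0] + extent) / (2 * extent) * W).long().clamp(0, W - 1)
    iy = ((points[:, 1] + extent) / (2 * extent) * H).long().clamp(0, H - 1)
    speed = torch.linalg.norm(points[:, 3:5], dim=1)
    bev[0, iy, ix] = 1.0           # occupancy
    bev[1, iy, ix] = points[:, 2]  # radar cross-section
    bev[2, iy, ix] = speed         # velocity magnitude
    return bev


class BevFusion(nn.Module):
    """Concatenate camera BEV features with the radar BEV raster and mix with a conv."""

    def __init__(self, cam_channels=256, radar_channels=3, out_channels=256):
        super().__init__()
        self.mix = nn.Sequential(
            nn.Conv2d(cam_channels + radar_channels, out_channels, 3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True),
        )

    def forward(self, cam_bev, radar_bev):
        return self.mix(torch.cat([cam_bev, radar_bev], dim=1))


if __name__ == "__main__":
    radar_points = torch.randn(200, 5) * 20   # stand-in for aggregated radar sweeps
    radar_bev = rasterize_radar_bev(radar_points).unsqueeze(0)
    cam_bev = torch.randn(1, 256, 128, 128)   # stand-in for camera BEV features
    fused = BevFusion()(cam_bev, radar_bev)
    print(fused.shape)                        # torch.Size([1, 256, 128, 128])

In a real pipeline, the rasterized radar channels would come from accumulated nuScenes radar sweeps and the camera BEV features from the transformer's view-transform stage; the concatenate-and-convolve mixer here simply stands in for whatever fusion module the model actually uses.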
