Towards Real-time 3D Reconstruction using Consumer UAVs
We present a near real-time solution for 3D reconstruction from aerial images captured by consumer UAVs. Our core idea is to simplify the multi-view stereo problem into a series of two-view stereo matching problems. The method applies to UAVs equipped with a single camera and does not require a dedicated stereo capture rig. We observe that two neighboring video frames taken by a UAV flying at a mid-to-high cruising altitude can be approximated as the left and right views of a virtual stereo camera. Leveraging GPU-accelerated real-time stereo estimation and efficient Perspective-n-Point (PnP) pose solvers, our system simultaneously estimates scene geometry and camera position and orientation from these virtual stereo pairs. The method also lets users select the virtual baseline length, providing flexibility in the trade-off between camera resolution, effective measuring distance, flight altitude, and mapping accuracy. Our method outputs dense point clouds at a constant rate of 25 frames per second and is validated on a variety of real-world datasets with satisfactory results.
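The pipeline described above, treating neighboring frames as a virtual stereo pair, computing a dense disparity map, and recovering camera pose with PnP, can be illustrated with a short OpenCV sketch. The sketch below is not the authors' implementation: the function names, matcher parameters, and the assumption that consecutive frames arrive approximately rectified are simplified placeholders chosen for illustration only.

```python
# Minimal sketch of the "virtual stereo" idea: two neighboring UAV video
# frames are treated as a stereo pair, a dense disparity map is computed,
# and camera pose is recovered via PnP. All parameter values are
# illustrative assumptions, not the paper's settings.

import cv2
import numpy as np


def virtual_stereo_depth(frame_left, frame_right, K, baseline_m):
    """Recover a dense depth map from two neighboring frames treated as a
    virtual stereo pair. Assumes the frames are approximately rectified,
    which only roughly holds for mid-to-high-altitude, near-level flight."""
    gray_l = cv2.cvtColor(frame_left, cv2.COLOR_BGR2GRAY)
    gray_r = cv2.cvtColor(frame_right, cv2.COLOR_BGR2GRAY)

    # Semi-global block matching; a GPU stereo matcher (e.g. from OpenCV's
    # CUDA module) could be substituted to reach real-time rates, as the
    # paper relies on GPU-accelerated stereo estimation.
    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
    disparity = matcher.compute(gray_l, gray_r).astype(np.float32) / 16.0

    # Pinhole depth from disparity: Z = f * B / d, where B is the
    # user-selected virtual baseline (a longer baseline improves range
    # resolution at the cost of fewer valid matches).
    f = K[0, 0]
    depth = np.zeros_like(disparity)
    valid = disparity > 0
    depth[valid] = f * baseline_m / disparity[valid]
    return depth, disparity


def estimate_pose_pnp(points_3d, points_2d, K):
    """Estimate camera rotation and translation from known 3D scene points
    and their 2D projections in the current frame (PnP with RANSAC)."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        points_3d.astype(np.float32), points_2d.astype(np.float32), K, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec, inliers
```

In use, each new frame would be paired with an earlier frame chosen to realize the desired virtual baseline, `virtual_stereo_depth` would yield the dense geometry, and `estimate_pose_pnp` would register the new view against 3D points back-projected from the previous pair.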