3D Position Estimation in Wide and Unconstrained Indoor Environments
Abstract
In this paper, a system for 3D position estimation in wide, unconstrained indoor environments is presented that employs infrared optical outside-in tracking of rigid-body targets with a stereo camera rig. To overcome limitations of state-of-the-art optical tracking systems, a pipeline for robust target identification and 3D point reconstruction has been investigated that enables camera calibration and tracking in environments with poor illumination, static and moving ambient light sources, occlusions and harsh conditions such as fog. For evaluation, the system has been successfully applied in three different wide and unconstrained indoor environments: 1) user tracking for virtual and augmented reality applications, 2) handheld target tracking for tunneling and 3) machine guidance for mining. The results of each use case are discussed to embed the presented approach into a larger technological and application context. The experimental results demonstrate the system's capability to track targets at distances of up to 100 m. Compared to prior art in optical tracking in terms of range coverage and accuracy, the proposed approach significantly extends the available tracking range while requiring only two cameras, and it provides a relative 3D point accuracy with sub-centimeter deviation up to 30 m and low-centimeter deviation up to 100 m.
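The abstract refers to 3D point reconstruction of rigid-body targets from a calibrated stereo camera rig. As a rough illustration of that step only (this is not code from the paper), the sketch below triangulates a single marker position from one matched pixel pair using standard linear (DLT) triangulation; the projection matrices, calibration values and pixel coordinates are placeholder assumptions, not data from the publication.

```python
# Minimal sketch: linear (DLT) triangulation of one 3D point observed by two
# calibrated cameras. All numbers below are illustrative placeholders.
import numpy as np

def triangulate_point(P1, P2, uv1, uv2):
    """Triangulate one 3D point from its projections in two views.

    P1, P2 : 3x4 camera projection matrices (K [R | t]).
    uv1, uv2 : (u, v) pixel coordinates of the same target blob in each view.
    Returns the 3D point in the common (world) frame.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A X = 0 in the least-squares sense: X is the right singular
    # vector associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Example with placeholder calibration (identical intrinsics, 0.5 m baseline).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                   # left camera at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])   # right camera, 0.5 m to the side
point = triangulate_point(P1, P2, (690.0, 360.0), (640.0, 360.0))
print(point)  # approximately (0.5, 0.0, 10.0): a target about 10 m in front of the rig
```

In practice, the reconstruction stage of such a pipeline would aggregate many such triangulated marker points per rigid-body target and handle outliers and occlusions; the snippet only shows the geometric core of recovering one 3D point from a stereo correspondence.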
Reference
A. Mossel: "3D Position Estimation in Wide and Unconstrained Indoor Environments"; Sensors, Physical Sensors: Sensors for Indoor Mapping and Navigation (2015), 12; 31482–31524.