Monocular vision-based range estimation supported by proprioceptive motion

    Research output: Contribution to journal › Article › Scientific › peer-review

    7 Citations (Scopus)
    395 Downloads (Pure)

    Abstract

    This paper describes an approach for fusing monocular vision measurements with camera motion, odometer, and inertial rate sensor measurements. The motion of the camera between successive images generates a baseline for range computation by triangulation. The recursive estimation algorithm is based on extended Kalman filtering. The depth estimation accuracy is strongly affected by the relative geometry of the observer and the feature point, by the measurement accuracy of the observer motion parameters, and by the line of sight to the feature point. The simulation study investigates how the estimation accuracy depends on the following parameters: linear and angular velocity measurement errors, camera noise, and observer path. These results impose requirements on the instrumentation and observation scenarios. It was found that under favorable conditions the error in distance estimation does not exceed 2% of the distance to the feature point.
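    The abstract outlines the general idea: known camera motion supplies a triangulation baseline, and an extended Kalman filter recursively refines the range to a feature. The paper's exact state parameterization and measurement model are not given here, so the following is only a minimal illustrative sketch of that class of estimator: a planar bearing-only EKF in which the feature position is the state and the (assumed known) camera position from proprioceptive sensors drives the baseline. All names and values are hypothetical.

    ```python
    import numpy as np

    def ekf_bearing_update(x, P, cam, z, R):
        """One EKF measurement update with a bearing observation.

        x   : (2,) estimated feature position [px, py]
        P   : (2, 2) state covariance
        cam : (2,) known camera position (from odometry / inertial sensors)
        z   : measured bearing angle to the feature, in radians
        R   : scalar bearing-noise variance
        """
        dx, dy = x[0] - cam[0], x[1] - cam[1]
        r2 = dx * dx + dy * dy
        h = np.arctan2(dy, dx)                  # predicted bearing
        H = np.array([[-dy / r2, dx / r2]])     # Jacobian of the bearing wrt feature position
        S = H @ P @ H.T + R                     # innovation covariance (1x1)
        K = P @ H.T / S                         # Kalman gain (2x1)
        innov = np.arctan2(np.sin(z - h), np.cos(z - h))  # wrapped angle residual
        x = x + (K * innov).ravel()
        P = (np.eye(2) - K @ H) @ P
        return x, P

    rng = np.random.default_rng(0)
    feature = np.array([10.0, 20.0])            # true (static) feature point
    x = np.array([8.0, 15.0])                   # rough initial guess
    P = np.eye(2) * 25.0
    R = (0.2 * np.pi / 180.0) ** 2              # 0.2 deg bearing noise

    # The observer translates sideways; its known motion generates the baseline.
    for t in range(50):
        cam = np.array([0.5 * t, 0.0])
        true_bearing = np.arctan2(feature[1] - cam[1], feature[0] - cam[0])
        z = true_bearing + rng.normal(0.0, np.sqrt(R))
        x, P = ekf_bearing_update(x, P, cam, z, R)

    range_error = abs(np.linalg.norm(x - cam) - np.linalg.norm(feature - cam))
    ```

    The sketch omits the prediction step (the feature is static and the motion is treated as exactly known), whereas the paper's study explicitly models linear and angular velocity measurement errors; those would enter through a process-noise term and an augmented state.
    
    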
    Original language: English
    Pages (from-to): 150-158
    Journal: Gyroscopy and Navigation
    Volume: 8
    Issue number: 2
    DOIs
    Publication status: Published - 2017
    Publication type: A1 Journal article-refereed

    Publication forum classification

    • Publication forum level 1
