Depth estimation with ego-motion assisted monocular camera

M. Mansour, P. Davidson, O. Stepanov, J. P. Raunio, M. M. Aref, R. Piché

Research output: Contribution to journal › Article › Scientific › peer-review

4 Citations (Scopus)

Abstract

We propose a method to estimate the distance to objects based on the complementary nature of monocular image sequences and camera kinematic parameters. The fusion of camera measurements with the kinematic parameters measured by an IMU and an odometer is performed using an extended Kalman filter. Results of field experiments with a wheeled robot corroborated the results of the simulation study in terms of depth estimation accuracy. The performance of the approach is strongly affected by the mutual geometry of the observer and the feature point, the measurement accuracy of the observer's motion parameters, and the distance covered by the observer. It was found that under favorable conditions the error in distance estimation can be as small as 1% of the distance to a feature point. This approach can be used to estimate the distance to objects located hundreds of meters away from the camera.
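The fusion described above can be illustrated with a minimal sketch: an extended Kalman filter estimates the fixed 2-D position of a feature point from noisy camera bearings while the observer translates with known ego-motion (as the IMU/odometer would supply). All names, noise levels, and geometry below are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

feature_true = np.array([10.0, 5.0])   # unknown feature position (x, y), metres
sigma_bearing = 0.002                  # assumed bearing noise, rad

def bearing(obs_x, feat):
    """Camera bearing to the feature from an observer at (obs_x, 0)."""
    dx, dy = feat[0] - obs_x, feat[1]
    return np.arctan2(dy, dx)

# Observer poses: positions along the x-axis, 0.2 m apart (known odometry).
obs_xs = np.arange(0.0, 10.2, 0.2)

# Initialise the state from the first bearing with an assumed 8 m range.
b0 = bearing(obs_xs[0], feature_true) + rng.normal(0, sigma_bearing)
x = np.array([obs_xs[0] + 8.0 * np.cos(b0), 8.0 * np.sin(b0)])
P = np.diag([16.0, 16.0])              # large initial uncertainty
Q = np.eye(2) * 1e-8                   # static feature: tiny process noise
R = sigma_bearing ** 2

for obs_x in obs_xs[1:]:
    # Predict: the feature does not move, so only the covariance inflates.
    P = P + Q
    # Update: linearise the bearing measurement about the current estimate.
    z = bearing(obs_x, feature_true) + rng.normal(0, sigma_bearing)
    dx, dy = x[0] - obs_x, x[1]
    r2 = dx * dx + dy * dy
    H = np.array([[-dy / r2, dx / r2]])          # bearing Jacobian
    y = np.array([z - np.arctan2(dy, dx)])       # innovation
    S = H @ P @ H.T + R
    K = P @ H.T / S                              # Kalman gain
    x = x + (K @ y).ravel()
    P = (np.eye(2) - K @ H) @ P

print(x)  # estimate approaches the true feature position as baseline grows
```

As the abstract notes, accuracy depends on the observer-feature geometry: the estimate sharpens only as the traversed baseline subtends a usable angle at the feature point.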

Original language: English
Pages (from-to): 111-123
Number of pages: 13
Journal: Gyroscopy and Navigation
Volume: 10
Issue number: 3
DOIs
Publication status: Published - 2019
Publication type: A1 Journal article-refereed

Keywords

  • computer vision
  • depth-from-motion
  • extended Kalman filter
  • image sequence
  • inertial sensing
  • sensor fusion

Publication forum classification

  • Publication forum level 1

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Computer Science(all)
  • Electrical and Electronic Engineering

