Published October 2011 | Version v1
Conference paper

Self Calibration of a vision system embedded in a Visual SLAM framework

Description

This paper presents a novel approach to self-calibrate the extrinsic parameters of a camera mounted on a mobile robot, in the context of fusion with the odometry sensor. Precisely calibrating such a system can be difficult if the camera is mounted on a vehicle whose frame is hard to localize accurately (on a car, for example). However, knowledge of the camera pose in the robot frame is essential to make a consistent fusion of the sensor measurements. Our approach is based on a Simultaneous Localization and Mapping (SLAM) framework: the parameters are estimated while the robot moves in an unknown environment observed only by the camera. First, a study of the observability properties of the system is carried out in order to characterize the conditions its inputs must satisfy for the calibration to be possible. Then, three real experiments with an omnidirectional camera demonstrate the validity of these conditions and the quality of the estimate of the 3D pose of the camera with respect to the odometry frame.
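The quantity being calibrated is the rigid transform from the robot (odometry) frame to the camera frame: once it is known, any odometry pose can be composed with it to predict the camera pose in the world. A minimal planar sketch of that frame composition, with purely illustrative pose values (the paper itself estimates the full 3D extrinsic within the SLAM filter):

```python
import math

def compose(p, q):
    """Compose two planar poses p . q, each given as (x, y, theta):
    express pose q (defined in p's frame) in p's parent frame."""
    x, y, th = p
    qx, qy, qth = q
    return (x + math.cos(th) * qx - math.sin(th) * qy,
            y + math.sin(th) * qx + math.cos(th) * qy,
            th + qth)

# Robot pose in the world, as integrated from odometry (illustrative values).
robot_in_world = (1.0, 2.0, math.pi / 2)

# Hypothetical camera extrinsic: camera pose in the robot frame.
# This is the quantity the self-calibration procedure estimates.
cam_in_robot = (0.3, 0.0, 0.1)

# Predicted camera pose in the world; consistent sensor fusion
# requires this chain, hence the need for an accurate extrinsic.
cam_in_world = compose(robot_in_world, cam_in_robot)
```

Note that if the robot only translates (theta constant), the extrinsic offset enters the prediction through a fixed rotation, which is why, as the paper's observability study formalizes, the trajectory must excite enough degrees of freedom (e.g. include rotations) for the calibration to converge.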


Additional details

Identifiers

URL
https://inria.hal.science/hal-00767402
URN
urn:oai:HAL:hal-00767402v1

Origin repository

UNICA