
Abstract
 This research presents an extended Kalman filter (EKF) based real-time simultaneous localization and mapping (SLAM) method for a robot equipped with an omni-directional camera. In the EKF part, we employ a sequential process that uses the observed features one at a time to update the robot pose estimate. This approach reduces the computational complexity and allows feature search regions to be predicted, so the Kalman recursion can run in real time. In the feature management part, we deal with the initialization and measurement of features. To initialize the covariance of a feature as well as its 3D position, we exploit the uncertainty of the robot pose and the image noise at the same time. For robust measurement, features are reordered according to their cornerness quality and covariance.
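 As a rough illustration of the sequential update, the sketch below processes one feature measurement at a time, so each step only inverts a small 2x2 innovation covariance, and the shrinking state covariance can bound the image search region for the next feature. The function names, measurement interface, and noise model here are assumptions for illustration, not the implementation used in this work.

```python
import numpy as np

def sequential_ekf_update(x, P, observations, h, H_jacobian, R_i):
    """Sequential EKF update: fuse feature measurements one at a time.

    x            : state vector (robot pose + mapped features)
    P            : state covariance
    observations : list of (feature_index, z_i) measured 2D image points
    h            : h(x, i) -> predicted 2D measurement of feature i
    H_jacobian   : H(x, i) -> 2 x len(x) measurement Jacobian for feature i
    R_i          : 2x2 image-noise covariance (assumed identical per feature)

    Interface and names are illustrative assumptions, not the authors' API.
    """
    for i, z_i in observations:
        z_hat = h(x, i)                      # predicted measurement of feature i
        H = H_jacobian(x, i)                 # measurement Jacobian
        S = H @ P @ H.T + R_i                # 2x2 innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain (cheap 2x2 inverse)
        x = x + K @ (z_i - z_hat)            # update state with this feature only
        P = (np.eye(len(x)) - K @ H) @ P     # update covariance
        # The reduced covariance P can now bound the search region
        # (e.g. the 3-sigma ellipse of S) for the next feature.
    return x, P
```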

Overall Flow
 Overall flow of our SLAM system



 Projection of the i-th feature onto the cylindrical image surface. The epipolar plane (EP) passes through the optical center, and the origin is moved to the robot's 2D position.
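 For reference, a minimal projection model consistent with the figure maps a 3D feature to an azimuth and a height on a cylinder centred at the robot's 2D position. The sketch below assumes an idealized, calibrated cylindrical surface of fixed radius and is not the authors' exact camera model.

```python
import numpy as np

def project_to_cylinder(p_w, robot_xy, robot_theta, radius=1.0):
    """Project a 3D feature onto a cylindrical image surface centred on the robot.

    p_w         : 3D feature position in the world frame (x, y, z)
    robot_xy    : robot 2D position (the cylinder axis passes through it)
    robot_theta : robot heading, used as the azimuth origin
    radius      : cylinder radius (pixels or metres, depending on calibration)

    Assumed idealized model for illustration; the feature is assumed
    not to lie on the cylinder axis (rho > 0).
    """
    dx, dy = p_w[0] - robot_xy[0], p_w[1] - robot_xy[1]
    rho = np.hypot(dx, dy)                        # horizontal range to the feature
    azimuth = np.arctan2(dy, dx) - robot_theta    # angle of the epipolar plane
    azimuth = np.arctan2(np.sin(azimuth), np.cos(azimuth))  # wrap to [-pi, pi]
    u = radius * azimuth                          # arc-length coordinate on the cylinder
    v = radius * p_w[2] / rho                     # height on the cylinder
    return np.array([u, v])
```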



Experimental Result
 Experiments on real-time localization and map building with an omni-directional camera were carried out in an indoor environment. On a 2.5 GHz Pentium processor, the system ran at 46.68 Hz with an average of 42.9 observed features per frame. Robot pose estimates were compared with the ground truth in terms of absolute estimation error.
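 As a small illustration of the error metric, the absolute estimation error can be read as the per-axis absolute difference between estimated and ground-truth poses, with the heading difference wrapped to [-pi, pi]. The sketch below assumes poses given as (x, y, theta) and is not tied to the authors' evaluation code.

```python
import numpy as np

def absolute_pose_error(estimated, ground_truth):
    """Per-axis absolute estimation error against ground truth.

    estimated, ground_truth : arrays of shape (N, 3) holding (x, y, theta).
    Returns absolute errors of shape (N, 3); the heading error is wrapped
    to [-pi, pi] before taking the absolute value.
    (Illustrative metric only; the pose format is an assumption.)
    """
    diff = np.asarray(estimated, float) - np.asarray(ground_truth, float)
    diff[:, 2] = np.arctan2(np.sin(diff[:, 2]), np.cos(diff[:, 2]))  # wrap heading
    return np.abs(diff)
```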




2D maps of the robot trajectory and feature positions


References
 Chae-Young Kim, Jong-Sung Kim, and Ki-Sang Hong, "Real-time Simultaneous Localization and Mapping using Omni-directional Vision", Ubiquitous Robot and Ambient Intelligence (URAI), 2007.