http://dx.doi.org/10.5302/J.ICROS.2011.17.2.108

Visibility Sensor with Stereo Infrared Light Sources for Mobile Robot Motion Estimation  

Lee, Min-Young (Hongik University)
Lee, Soo-Yong (Hongik University)
Publication Information
Journal of Institute of Control, Robotics and Systems, vol. 17, no. 2, pp. 108-115, 2011
Abstract
This paper describes a new sensor system for mobile robot motion estimation using stereo infrared light sources and a camera. Visibility information is applied to robotic obstacle-avoidance path planning and localization. Using a simple visibility computation, the environment is partitioned into visibility sectors. Based on the recognized edges, the sector the robot occupies is identified, which greatly reduces the search area for localization. Geometric modeling of the vision system enables estimation of the characteristic pixel position with respect to the robot movement. Finite difference analysis is used for incremental movement, and the error sources are investigated. With two characteristic points in the image, such as vertices, the robot position and orientation are successfully estimated.
Keywords
visibility; localization; stereo infrared light sources;
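The abstract's final step, recovering the robot position and orientation from two characteristic points, amounts to fitting a 2D rigid transform to two point correspondences. The sketch below illustrates this idea only; the function name and the closed form used here are illustrative assumptions, not the paper's actual formulation (which works from image pixel positions via the geometric camera model).

```python
import math

def pose_from_two_points(p1_w, p2_w, p1_r, p2_r):
    """Estimate a 2D robot pose (x, y, theta) from two characteristic
    points measured in the robot frame (p1_r, p2_r) whose world
    coordinates (p1_w, p2_w) are known.

    The rigid transform maps robot-frame coordinates to world
    coordinates: p_w = R(theta) @ p_r + t, with t = (x, y).
    """
    # Heading: difference between the inter-point vector's angle
    # in the world frame and in the robot frame.
    ang_w = math.atan2(p2_w[1] - p1_w[1], p2_w[0] - p1_w[0])
    ang_r = math.atan2(p2_r[1] - p1_r[1], p2_r[0] - p1_r[0])
    theta = ang_w - ang_r
    theta = math.atan2(math.sin(theta), math.cos(theta))  # wrap to (-pi, pi]

    # Translation: t = p1_w - R(theta) @ p1_r
    c, s = math.cos(theta), math.sin(theta)
    x = p1_w[0] - (c * p1_r[0] - s * p1_r[1])
    y = p1_w[1] - (s * p1_r[0] + c * p1_r[1])
    return x, y, theta
```

Two correspondences are the minimum for a 2D pose; with noisy measurements one would instead solve a least-squares fit over more points, as the paper's finite-difference error analysis suggests.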