http://dx.doi.org/10.7746/jkros.2013.8.3.197

Telepresence Robotic Technology for Individuals with Visual Impairments Through Real-time Haptic Rendering  

Park, Chung Hyuk (Mechatronics Engineering, Chungnam National University)
Howard, Ayanna M. (Mechatronics Engineering, Chungnam National University)
Publication Information
The Journal of Korea Robotics Society, vol. 8, no. 3, 2013, pp. 197-205
Abstract
This paper presents a robotic system that provides telepresence to individuals with visual impairments by combining real-time haptic rendering with multi-modal interaction. A virtual-proxy-based haptic rendering process using an RGB-D sensor is developed and integrated into a unified framework for control of, and feedback from, the telepresence robot. We discuss the challenging problem of conveying environmental perception to a user with visual impairments and present our multi-modal interaction solution. We also describe the experimental design and protocols, along with results from human subjects with and without visual impairments. A discussion of the system's performance and our future goals concludes the paper.
Keywords
3D Haptic Rendering; Depth Camera; Telepresence Robot; Assistive Robotics; Visual Impairment;
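
As a rough illustration of the virtual-proxy haptic rendering over RGB-D point clouds mentioned in the abstract, the sketch below shows one proxy update step in NumPy. It assumes a simplified scheme (nearest-point collision check against the cloud and a spring force between the proxy and the haptic interface point); the function name, parameters, and constants are illustrative and are not taken from the paper.

```python
import numpy as np

def update_proxy(proxy, hip, cloud, radius=0.02, step=0.005, k=400.0):
    """One iteration of a simple virtual-proxy update over a point cloud.

    proxy, hip : (3,) arrays, current proxy and haptic-interface-point positions [m]
    cloud      : (N, 3) array, point cloud captured by the depth sensor [m]
    radius     : proxy sphere radius used for collision checking [m]
    step       : maximum distance the proxy moves toward the HIP per iteration [m]
    k          : virtual spring stiffness coupling proxy and HIP [N/m]
    Returns (new_proxy, force), where force is rendered on the haptic device.
    """
    # Free-space motion: move the proxy a small step toward the HIP.
    direction = hip - proxy
    dist = np.linalg.norm(direction)
    candidate = proxy + direction / dist * min(step, dist) if dist > 1e-9 else proxy.copy()

    # Collision check against the point cloud: find the nearest cloud point.
    d = np.linalg.norm(cloud - candidate, axis=1)
    idx = int(np.argmin(d))
    if d[idx] < radius:
        # The proxy sphere penetrates the sensed surface; push the proxy back
        # along the direction away from the offending point so that the point
        # ends up on the sphere boundary.
        normal = candidate - cloud[idx]
        normal /= np.linalg.norm(normal) + 1e-9
        candidate = cloud[idx] + normal * radius

    # The device renders a spring force pulling the HIP toward the proxy.
    force = k * (candidate - hip)
    return candidate, force
```

In a real system this update would run at haptic rates against the most recent point cloud streamed from the depth sensor, so that the rendered force tracks the remote environment in real time.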