http://dx.doi.org/10.5302/J.ICROS.2007.13.8.735

A Portable Mediate Interface 'Handybot' for the Rich Human-Robot Interaction  

Hwang, Jung-Hoon (Department of Mechanical Engineering, KAIST)
Kwon, Dong-Soo (Department of Mechanical Engineering, KAIST)
Publication Information
Journal of Institute of Control, Robotics and Systems, vol. 13, no. 8, 2007, pp. 735-742
Abstract
The importance of a robot's interaction capability increases as robots extend into humans' daily lives. In this paper, a portable mediate interface, the Handybot, is developed with various interaction channels for use with an intelligent home service robot. The Handybot provides a task-oriented icon-language channel as well as a verbal interface. It also has an emotional interaction channel that recognizes a user's emotional state from facial expressions and speech, transmits that state to the robot, and expresses the robot's emotional state to the user. It is expected that the Handybot will reduce the spatial limitations of human-robot interaction, offer a new interaction method, and help create rich and continuous interactions between human users and robots.
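
To make the multi-channel structure described above concrete, the following is a minimal Python sketch of how such a mediate interface could be organized. It is an illustrative assumption, not the authors' implementation: all names (RobotMessage, InteractionChannel, IconLanguageChannel, EmotionChannel) and the placeholder emotion classifier are hypothetical.

# Hypothetical sketch of a multi-channel mediate interface, loosely mirroring
# the channels described in the abstract. Names are illustrative assumptions,
# not the Handybot's actual implementation.
from dataclasses import dataclass
from typing import List, Protocol


@dataclass
class RobotMessage:
    """A message relayed between the user-side device and the robot."""
    channel: str   # e.g. "icon", "speech", "emotion"
    payload: dict  # channel-specific content


class InteractionChannel(Protocol):
    """Common interface each interaction channel is assumed to expose."""
    def to_robot(self, user_input) -> RobotMessage: ...
    def to_user(self, robot_state: RobotMessage) -> str: ...


class IconLanguageChannel:
    """Task-oriented channel: an icon sequence is mapped to a task command."""
    def to_robot(self, icon_sequence: List[str]) -> RobotMessage:
        return RobotMessage("icon", {"task": " ".join(icon_sequence)})

    def to_user(self, robot_state: RobotMessage) -> str:
        return f"Task status: {robot_state.payload.get('status', 'unknown')}"


class EmotionChannel:
    """Emotional channel: estimates the user's state and renders the robot's."""
    def to_robot(self, face_and_speech_features: dict) -> RobotMessage:
        # Placeholder classifier: a real system would fuse facial-expression
        # and speech cues rather than thresholding a single valence value.
        score = face_and_speech_features.get("valence", 0.0)
        state = "positive" if score > 0 else "negative"
        return RobotMessage("emotion", {"user_state": state})

    def to_user(self, robot_state: RobotMessage) -> str:
        return f"Robot feels {robot_state.payload.get('robot_state', 'neutral')}"


if __name__ == "__main__":
    icons = IconLanguageChannel()
    print(icons.to_robot(["GO", "KITCHEN", "FETCH", "CUP"]))
    emotion = EmotionChannel()
    print(emotion.to_robot({"valence": 0.4}))

In such a design, each channel would translate user input into a common message format for the robot and render the robot's state back to the user, which is one way the task-oriented, verbal, and emotional channels could coexist on a single portable device.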
Keywords
human-robot interaction; service robot; mediate interface; Handybot; icon language;