An Emotional Gesture-based Dialogue Management System using Behavior Network


  • Jong-Won Yoon (Dept. of Computer Science, Yonsei University) ;
  • Sung-Soo Lim (Dept. of Computer Science, Yonsei University) ;
  • Sung-Bae Cho (Dept. of Computer Science, Yonsei University)
  • Received : 2010.05.31
  • Accepted : 2010.09.02
  • Published : 2010.10.15

Abstract

Robots have recently come into wide use, and research on human-robot communication is correspondingly active; natural language processing and gesture generation are the methods most commonly applied to human-robot interaction. However, existing methods for communication between robots and humans are limited to static interaction, so a more natural and realistic way of communicating is needed. In this paper, an emotional gesture-based dialogue management system is proposed for more sophisticated human-robot communication. The proposed system conducts dialogue using Bayesian networks and pattern matching, and at the same time generates emotional gestures for the robot in real time, matched to the situation, while the user converses with it. Through these emotional gestures the robot can deliver the dialogue more effectively and interact with the user more realistically. A behavior network is used as the gesture generation method so that the system can cope flexibly with dynamically changing dialogue situations. Finally, a usability test was designed to verify the usefulness of the proposed system by comparing it with an existing dialogue management system that uses neither emotion nor gestures.
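
The paper itself gives no code, but the gesture-selection mechanism it names is a Maes-style behavior network, so a small sketch may help show how activation spreading could pick a gesture that fits the current dialogue situation and emotional goal. Everything below is an illustrative assumption rather than the authors' implementation: the module names (wave_hand, nod_head, smile_gesture), the dialogue/emotion conditions, and the numeric parameters are made up, and the full model's delete lists and conflicter links are omitted for brevity.

```python
"""Minimal sketch of Maes-style behavior-network action selection for
choosing a robot gesture during dialogue.  Names, conditions, and all
numeric constants are illustrative assumptions, not values from the paper."""

from dataclasses import dataclass


@dataclass
class Behavior:
    name: str              # gesture this competence module produces
    preconditions: set     # dialogue/emotion conditions required before it can fire
    add_list: set          # conditions the gesture is expected to bring about
    activation: float = 0.0


class BehaviorNetwork:
    def __init__(self, behaviors, phi=0.2, gamma=0.7, delta=0.5, theta=0.45):
        self.behaviors = behaviors
        self.phi = phi      # energy injected per matching condition in the state
        self.gamma = gamma  # energy injected per still-unachieved goal
        self.delta = delta  # fraction of energy passed along predecessor links
        self.theta = theta  # activation threshold for selection

    def step(self, state, goals):
        """Run one selection cycle and return the fired Behavior (or None)."""
        # 1. Inject activation from the observed dialogue state and from
        #    emotional goals that are not yet satisfied.
        for b in self.behaviors:
            b.activation += self.phi * len(b.preconditions & state)
            b.activation += self.gamma * len(b.add_list & (goals - state))

        # 2. Spread activation backwards: a module with an unmet precondition
        #    passes energy to modules whose add list could satisfy it.
        snapshot = {b.name: b.activation for b in self.behaviors}
        for b in self.behaviors:
            unmet = b.preconditions - state
            for other in self.behaviors:
                if unmet and other is not b and other.add_list & unmet:
                    other.activation += self.delta * snapshot[b.name] / len(unmet)

        # 3. Decay keeps the network responsive to a changing dialogue situation.
        for b in self.behaviors:
            b.activation *= 0.9

        # 4. Fire the executable module with the highest activation above the
        #    threshold, then reset it so other gestures get a chance next cycle.
        runnable = [b for b in self.behaviors
                    if b.preconditions <= state and b.activation >= self.theta]
        if not runnable:
            return None
        chosen = max(runnable, key=lambda b: b.activation)
        chosen.activation = 0.0
        return chosen


# Hypothetical gesture modules tied to dialogue and emotion conditions.
net = BehaviorNetwork([
    Behavior("wave_hand", {"user_greeted"}, {"greeting_returned"}),
    Behavior("nod_head", {"user_speaking"}, {"attention_shown"}),
    Behavior("smile_gesture", {"emotion_joy", "greeting_returned"}, {"joy_expressed"}),
])

state = {"user_greeted", "emotion_joy"}   # situation inferred from the dialogue
goals = {"joy_expressed"}                 # emotional goal of the robot
for _ in range(3):
    fired = net.step(state, goals)
    if fired:
        print("gesture:", fired.name)
        state |= fired.add_list           # the gesture changes the perceived situation
```

Because energy is re-injected from the dialogue state on every cycle and decays otherwise, the selected gesture shifts as soon as the inferred situation or emotion changes, which matches the reason the authors give for choosing a behavior network over static gesture scripts.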
