The process of estimating user response to training stimuli of joint attention using a robot

  • Kim, Da-Young (Korea Institute of Robotics and Technology Convergence) ;
  • Yun, Sang-Seok (Division of Mechanical Convergence Engineering, Silla University)
  • Received : 2021.07.16
  • Accepted : 2021.08.12
  • Published : 2021.10.31

Abstract

In this paper, we propose a psychological-state estimation process that computes a child's attention and tension in response to social interaction training stimuli. Joint attention was adopted as the training stimulus required for behavioral intervention, and the discrete trial training (DTT) technique was applied as the training protocol. Three types of training stimulus content are composed to check the user's attention and tension levels, and the stimuli are delivered by a character-shaped tabletop robot. The user's gaze response to each training stimulus is then estimated with a vision-based head pose recognizer and a geometric calculation model, and the nervous-system response is analyzed from PPG and GSR bio-signals using heart rate variability (HRV) and histogram techniques. Through experiments with the robot, it was confirmed that users' psychological responses to each item of joint attention training content can be quantified.
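The abstract only names the measurement techniques, so the sketch below illustrates one possible way to combine them for a single training trial: a simple angular-tolerance check on the head-pose direction as a stand-in for the geometric gaze-estimation model, time-domain HRV statistics (SDNN, RMSSD) from PPG inter-beat intervals, and a histogram of GSR samples. All function names, thresholds, and sample values here are hypothetical assumptions, not taken from the paper.

```python
# Hedged sketch (not the authors' implementation): quantifying a user's
# gaze response and autonomic response during one DTT trial.
import numpy as np

def gaze_on_target(yaw_deg, pitch_deg, target_yaw_deg, target_pitch_deg,
                   tolerance_deg=10.0):
    """True if the head-pose direction lies within an angular tolerance
    of the stimulus target (a simple geometric proxy for gaze)."""
    return (abs(yaw_deg - target_yaw_deg) <= tolerance_deg and
            abs(pitch_deg - target_pitch_deg) <= tolerance_deg)

def attention_ratio(head_poses, target, tolerance_deg=10.0):
    """Fraction of frames in which the user attends to the target.
    head_poses: iterable of (yaw, pitch) in degrees; target: (yaw, pitch)."""
    hits = [gaze_on_target(y, p, target[0], target[1], tolerance_deg)
            for y, p in head_poses]
    return float(np.mean(hits)) if hits else 0.0

def hrv_features(ibi_ms):
    """Time-domain HRV features from PPG inter-beat intervals (ms)."""
    ibi = np.asarray(ibi_ms, dtype=float)
    sdnn = ibi.std(ddof=1)                       # overall variability
    rmssd = np.sqrt(np.mean(np.diff(ibi) ** 2))  # beat-to-beat variability
    return {"mean_hr_bpm": 60000.0 / ibi.mean(),
            "sdnn_ms": sdnn, "rmssd_ms": rmssd}

def gsr_histogram(gsr_uS, bins=10):
    """Histogram of skin-conductance samples (microsiemens); a shift of
    mass toward higher bins is read here as increased arousal/tension."""
    return np.histogram(np.asarray(gsr_uS, dtype=float), bins=bins)

# Example with simulated data for one trial.
poses = [(2.0, -1.0), (3.5, 0.5), (20.0, 5.0), (1.0, 0.0)]
print("attention ratio:", attention_ratio(poses, target=(0.0, 0.0)))
print("HRV:", hrv_features([820, 815, 790, 805, 830, 810]))
counts, edges = gsr_histogram([2.1, 2.3, 2.2, 3.0, 3.4, 2.8])
print("GSR histogram counts:", counts)
```

Under these assumptions, the attention ratio and the HRV/GSR summaries could be compared across the three stimulus contents to quantify the user's response per content, in line with the experiment described above.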

Keywords

Acknowledgement

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2021R1F1A1063669).
