Non-intrusive Calibration for User Interaction based Gaze Estimation

  • 이태균 (ICT Major, University of Science and Technology);
  • 유장희 (Artificial Intelligence Research Laboratory, ETRI)
  • Received : 2020.05.17
  • Accepted : 2020.06.19
  • Published : 2020.06.30

Abstract

In this paper, we describe a new method that acquires calibration data for gaze estimation from the user interactions occurring continuously during web browsing, so that calibration is performed naturally while the user's gaze is being estimated. The proposed non-intrusive calibration tunes a pre-trained gaze estimation model with the acquired data to adapt it to a new user. To achieve this, a generalized CNN model for gaze estimation is first trained; the non-intrusive calibration then adapts it quickly to each new user through online learning. In experiments, the gaze estimation model was calibrated with various combinations of user interactions to compare their performance, and improved accuracy was achieved over existing methods.

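The adaptation scheme described in the abstract (a generalized, pre-trained gaze CNN corrected for a new user by online learning on interaction-derived labels) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the frozen CNN is stood in for by fixed feature vectors, only a linear output head is adapted, interaction points (e.g. mouse clicks, which the user is assumed to fixate) serve as gaze labels, and every name (`OnlineGazeCalibrator`, `observe`, etc.) is hypothetical.

```python
import numpy as np

class OnlineGazeCalibrator:
    """Adapts a linear output head on top of a frozen, pre-trained gaze
    feature extractor, one interaction event at a time (online SGD)."""

    def __init__(self, feat_dim: int, lr: float = 0.05):
        # Head mapping CNN features -> (x, y) gaze point on screen.
        self.W = np.zeros((feat_dim, 2))
        self.b = np.zeros(2)
        self.lr = lr

    def predict(self, feat: np.ndarray) -> np.ndarray:
        """Estimated gaze point for one feature vector of shape (feat_dim,)."""
        return feat @ self.W + self.b

    def observe(self, feat: np.ndarray, point_xy: np.ndarray) -> None:
        """One non-intrusive calibration step: the user interacted at
        point_xy (a click/tap), taken here as the true gaze point."""
        err = self.predict(feat) - point_xy  # shape (2,)
        # Single SGD step on the squared error between prediction and label.
        self.W -= self.lr * np.outer(feat, err)
        self.b -= self.lr * err
```

In a deployed system the feature vector would come from an internal layer of the pre-trained CNN evaluated on the current camera frame, and `observe` would run on each click, tap, or keystroke during browsing, so the model keeps adapting while gaze is being estimated.
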
Keywords

Acknowledgement

This work was supported by a grant funded by the Korea government (Ministry of Science and ICT) in 2020 (2019-0-00330, Development of behavior and response psycho-cognitive AI technology for the early screening of developmental disabilities in infants and children).

References

  1. X. Wang, K. Liu, and X. Qian, "A Survey on Gaze Estimation", in Proceedings of the 10th International Conference on Intelligent Systems and Knowledge Engineering, pp.260-267, Taipei, Taiwan, Nov. 2015. DOI: https://doi.org/10.1109/iske.2015.12
  2. K. Rayner, "Eye Movements in Reading and Information Processing: 20 Years of Research", Psychological Bulletin, vol.124, no.3, pp.372-422, Nov. 1998. DOI: https://doi.org/10.1037/0033-2909.124.3.372
  3. G. Buscher, A. Dengel, and L. van Elst, "Eye Movements as Implicit Relevance Feedback", in Proceedings of the CHI'08 Extended Abstracts on Human Factors in Computing Systems, pp.2991-2996, Florence, Italy, Apr. 2008. DOI: https://doi.org/10.1145/1358628.1358796
  4. L. E. Nacke, S. Stellmach, D. Sasse, and C. A. Lindley, "Gameplay Experience in a Gaze Interaction Game", arXiv preprint arXiv:1004.0259, 2010. URL: https://arxiv.org/abs/1004.0259
  5. A. Navab, L. Gillespie Lynch, S. P. Johnson, M. Sigman, and T. Hutman, "Eye Tracking as a Measure of Responsiveness to Joint Attention in Infants at Risk for Autism", Infancy, vol.17, no.4, pp.416-431, Jul. 2012. DOI: https://doi.org/10.1111/j.1532-7078.2011.00082.x
  6. M. Eizenman, H. Y. Lawrence, L. Grupp, and E. Eizenman, "A Naturalistic Visual Scanning Approach to Assess Selective Attention in Major Depressive Disorder", Psychiatry Research, vol.118, no.2, pp.117-128, Jun. 2003. DOI: https://doi.org/10.1016/s0165-1781(03)00068-4
  7. S. S. Deepika and G. Murugesan, "A Novel Approach for Human Computer Interface based on Eye Movements for Disabled People", in Proceedings of the IEEE International Conference on Electrical, Computer and Communication Technologies, pp.1-3, Coimbatore, India, Mar. 2015. DOI: https://doi.org/10.1109/icecct.2015.7226124
  8. J. P. Johnson, "Targeted Advertising and Advertising Avoidance", The RAND Journal of Economics, vol.44, no.1, pp.128-144, Apr. 2013. DOI: https://doi.org/10.2139/ssrn.2018938
  9. E. D. Guestrin and M. Eizenman, "Remote Point-of-Gaze Estimation Requiring a Single-point Calibration for Applications with Infants", in Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA), pp.267-274, Savannah, USA, Mar. 2008. DOI: https://doi.org/10.1145/1344471.1344531
  10. Y. Durna and F. Ari, "Design of a Binocular Pupil and Gaze Point Detection System Utilizing High Definition Images", Applied Sciences, vol.7, no.5, pp.1-16, May. 2017. DOI: https://doi.org/10.3390/app7050498
  11. M. S. A. bin Suhaimi, K. Matsushita, M. Sasaki, and W. Njeri, "24-Gaze-Point Calibration Method for Improving the Precision of AC-EOG Gaze Estimation", Sensors, vol.19, no.17, pp.1-13, Aug. 2019. DOI: https://doi.org/10.3390/s19173650
  12. K. Harezlak, P. Kasprowski, and M. Stasch, "Towards Accurate Eye Tracker Calibration-Methods and Procedures", Procedia Computer Science, vol.35, pp.1073-1081, Sep. 2014. DOI: https://doi.org/10.1016/j.procs.2014.08.194
  13. T. Nagamatsu, J. Kamahara, and N. Tanaka, "Calibration-free Gaze Tracking using a Binocular 3D Eye Model", in Proceedings of the CHI'09 Extended Abstracts on Human Factors in Computing Systems, pp.3613-3618, Boston, USA, Apr. 2009. DOI: https://doi.org/10.1145/1520340.1520543
  14. S. W. Shih, Y. T. Wu, and J. Liu, "A Calibration-free Gaze Tracking Technique", in Proceedings of the International Conference on Pattern Recognition (ICPR), pp.201-204, Barcelona, Spain, Sep. 2000. DOI: https://doi.org/10.1109/icpr.2000.902895
  15. K. Wang, S. Wang, and Q. Ji, "Deep Eye Fixation Map Learning for Calibration-free Eye Gaze Tracking", in Proceedings of the 9th Biennial ACM Symposium on Eye Tracking Research & Applications (ETRA), pp.47-55, Charleston, USA, Mar. 2016. DOI: https://doi.org/10.1145/2857491.2857515
  16. Y. Sugano, Y. Matsushita, and Y. Sato, "Calibration-free Gaze Sensing using Saliency Maps", in Proceedings of the International Conference on Computer Vision and Pattern Recognition (CVPR), pp.2667-2674, San Francisco, USA, Jun. 2010. DOI: https://doi.org/10.1109/cvpr.2010.5539984
  17. X. Zhang, Y. Sugano, M. Fritz, and A. Bulling, "It's Written All over Your Face: Full-face Appearance-based Gaze Estimation", in Proceedings of the International Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp.51-60, Honolulu, USA, Jul. 2017. DOI: https://doi.org/10.1109/cvprw.2017.284
  18. D. Sahoo, Q. Pham, J. Lu, and S. C. Hoi, "Online Deep Learning: Learning Deep Neural Networks on The Fly", in Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence (IJCAI), pp.2660-2666, Stockholm, Sweden, Jul. 2018. DOI: https://doi.org/10.24963/ijcai.2018/369
  19. D. J. Liebling and S. T. Dumais, "Gaze and Mouse Coordination in Everyday Work", in Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication, pp.1141-1150, Seattle, USA, Sep. 2014. DOI: https://doi.org/10.1145/2638728.2641692
  20. J. Huang, R. White, and G. Buscher, "User See, User Point: Gaze and Cursor Alignment in Web Search", in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp.1341-1350, Austin, USA, May. 2012. DOI: https://doi.org/10.1145/2207676.2208591
  21. A. Papoutsaki, A. Gokaslan, J. Tompkin, Y. He, and J. Huang, "The Eye of the Typer: a Benchmark and Analysis of Gaze Behavior during Typing", in Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA), pp.1-9, Warsaw, Poland, Jun. 2018. DOI: https://doi.org/10.1145/3204493.3204552
  22. A. Krizhevsky, I. Sutskever, and G. E. Hinton, "ImageNet Classification with Deep Convolutional Neural Networks", in Proceedings of Advances in Neural Information Processing Systems (NIPS), pp.1097-1105, Lake Tahoe, USA, Dec. 2012. URL: https://dl.acm.org/doi/10.5555/2999134.2999257
  23. Tobii Pro X3-120, https://www.tobiipro.com/ko/product-listing/tobii-pro-x3-120/, 2020.
  24. A. Papoutsaki, P. Sangkloy, J. Laskey, N. Daskalova, J. Huang, and J. Hays, "Webgazer: Scalable Webcam Eye Tracking using User Interactions", in Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI), pp.3839-3845, New York, USA, Jul. 2016. URL: https://dl.acm.org/doi/10.5555/3061053.3061156