
Evaluation of Gaze Depth Estimation using a Wearable Binocular Eye Tracker and Machine Learning

Shin, Choonsung (KETI)
Lee, Gun (University of South Australia)
Kim, Youngmin (KETI)
Hong, Jisoo (KETI)
Hong, Sung-Hee (KETI)
Kang, Hoonjong (KETI)
Lee, Youngho (Mokpo National University)
Abstract
In this paper, we propose a gaze depth estimation method based on a binocular eye tracker for virtual reality and augmented reality applications. The proposed method collects a wide range of information about each eye from the eye tracker, such as the pupil center, gaze direction, and inter-pupillary distance. It then builds gaze estimation models using a multilayer perceptron, which infers gaze depth from the eye tracking information. Finally, we evaluated the gaze depth estimation method with 13 participants in two ways: performance based on individual models and performance based on a generalized model. Through the evaluation, we found that the proposed method recognized gaze depth with 90.1% accuracy using the 13 individual models and with 89.7% accuracy using the generalized model trained on all participants.
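The pipeline described in the abstract (per-eye features in, a multilayer perceptron out, discrete gaze depth as the prediction target) can be sketched with scikit-learn, which the paper itself cites as tooling. The feature layout, the three depth levels, the toy vergence model, and the network size below are all illustrative assumptions, not the authors' actual configuration or data:

```python
# Hedged sketch of the abstract's pipeline: per-eye eye-tracker features
# -> multilayer perceptron -> discrete gaze-depth class.
# Depth levels, feature layout, and hidden-layer size are assumptions.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 600

# Hypothetical target depths in metres; each sample is labeled with one.
depth_levels = np.array([0.5, 1.0, 2.0])
y = rng.integers(0, len(depth_levels), size=n)

# Toy vergence signal: convergence offset roughly IPD / depth, plus noise.
vergence = 0.06 / depth_levels[y]
X = np.column_stack([
    rng.normal(size=(n, 4)),                      # pupil centers, both eyes (x, y)
    vergence + rng.normal(scale=0.005, size=n),   # horizontal gaze-direction offset
    0.06 + rng.normal(scale=0.002, size=n),       # inter-pupillary distance
])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

# MLP inferring the gaze-depth class from the eye-tracking features.
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
mlp.fit(scaler.transform(X_train), y_train)
acc = mlp.score(scaler.transform(X_test), y_test)
print(f"held-out accuracy: {acc:.2f}")
```

Training one such model per participant versus one model on pooled data corresponds to the individual and generalized evaluations in the abstract.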
Keywords
Gaze Depth; 3D gaze; Eye tracking; Virtual reality; Augmented reality;
References
1 https://www.facebook.com/spaces [Accessed Aug. 09, 2017]
2 S. Orts-Escolano, C. Rhemann, S. Fanello, W. Chang, A. Kowdle, Y. Degtyarev, D. Kim, P. L. Davidson, S. Khamis, M. Dou, V. Tankovich, C. Loop, Q. Cai, P. A. Chou, S. Mennicken, J. Valentin, V. Pradeep, S. Wang, S. B. Kang, P. Kohli, Y. Lutchyn, C. Keskin, and S. Izadi, "Holoportation: Virtual 3D Teleportation in Real-time," In Proceedings of the 29th Annual Symposium on User Interface Software and Technology (UIST '16), pp. 741-754, 2016.
3 D. W. Hansen and Q. Ji, "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 32, No. 3, pp. 478-500, 2010.
4 C. Cho, J. Lee, E. Lee, and K. Park, "Robust gaze-tracking method by using frontal-viewing and eye-tracking cameras," Opt. Eng., Vol. 48, No. 12, 2009.
5 K. Tan, D. J. Kriegman, and N. Ahuja, "Appearance-based Eye Gaze Estimation," In Proceedings of the Sixth IEEE Workshop on Applications of Computer Vision (WACV 2002), 2002.
6 Y. Sugano, Y. Matsushita, and Y. Sato, "Appearance-based gaze estimation using visual saliency," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 35, No. 2, pp. 329-341, 2013.
7 E. G. Mlot, H. Bahmani, S. Wahl, and E. Kasneci, "3D Gaze Estimation using Eye Vergence," In Proceedings of the 9th International Joint Conference on Biomedical Engineering Systems and Technologies, 2016.
8 T. Toyama, D. Sonntag, J. Orlosky, and K. Kiyokawa, "A Natural Interface for Multi-focal Plane Head Mounted Displays Using 3D Gaze," In Proceedings of the 2014 International Working Conference on Advanced Visual Interfaces (AVI '14), pp. 25-32, 2014.
9 J. Lee, C. Cho, K. Shin, E. Lee, and K. Park, "3D Gaze Tracking Method Using Purkinje Images on Eye Optical Model and Pupil," Optics and Lasers in Engineering, Vol. 50, No. 5, pp. 736-751, 2012.
10 Y. Itoh, J. Orlosky, K. Kiyokawa, T. Amano, and M. Sugimoto, "Monocular Focus Estimation Method for a Freely-Orienting Eye using Purkinje-Sanson Images," In Proceedings of VR 2017.
11 M. Kassner, W. Patera, and A. Bulling, "Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction," In Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication (UbiComp '14 Adjunct), pp. 1151-1160, 2014.
12 scikit-learn http://scikit-learn.org/stable/ [Accessed Aug. 09, 2017]
13 Weka http://www.cs.waikato.ac.nz/~ml/weka/ [Accessed Aug. 09, 2017]
14 Y. Lee, C. Shin, A. Plopski, Y. Itoh, A. Dey, G. Lee, S. Kim, M. Billinghurst, "Estimating Gaze Depth Using Multi-Layer Perceptron," 2017 International Symposium on Ubiquitous Virtual Reality (ISUVR), Nara, Japan, 2017, pp. 26-29.