http://dx.doi.org/10.22156/CS4SMB.2021.11.01.020

Resolution Estimation Technique in Gaze Tracking System for HCI  

Kim, Ki-Bong (Dept. of Health and IT Convergence, Daejeon Health Institute of Technology)
Choi, Hyun-Ho (Dept. of Health and IT Convergence, Daejeon Health Institute of Technology)
Publication Information
Journal of Convergence for Information Technology, v.11, no.1, 2021, pp. 20-27
Abstract
Eye tracking is one of the natural user interface (NUI) technologies; it determines where the user is gazing. This technology allows users to enter text or control a GUI, and further analysis of the user's gaze can be applied to commercial advertising. In an eye tracking system, the allowable error range varies with the quality of the captured image and the degree of freedom of the user's movement, so a method is needed to estimate the accuracy of eye tracking in advance. That accuracy is strongly affected not only by hardware variables but also by how the eye tracking algorithm is implemented. Accordingly, in this paper, we propose a method to estimate how many degrees the gaze changes when the pupil center moves by one pixel, by estimating the maximum possible movement distance of the pupil center in the image.
Keywords
HCI (human-computer interaction); Eye detection; Hough transform; FOV (field of view); Fovea
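The abstract's resolution estimate reduces to a simple ratio: once the maximum possible pupil-center displacement in the eye image is known, dividing the eye's total rotation range by that displacement gives the gaze change corresponding to one pixel of movement. The Python sketch below illustrates this relationship; the symmetric rotation range (theta_max_deg) and the maximum pupil-center travel (d_max_px) are hypothetical values chosen for illustration, not figures or code from the paper.

```python
# Minimal sketch of a per-pixel gaze resolution estimate (illustrative only;
# not the authors' exact formulation). Assumes the gaze sweeps roughly
# linearly through +/- theta_max_deg while the pupil center moves across at
# most d_max_px pixels in the eye image.

def gaze_resolution_deg_per_px(theta_max_deg: float, d_max_px: float) -> float:
    """Degrees of gaze change corresponding to one pixel of pupil-center movement."""
    if d_max_px <= 0:
        raise ValueError("maximum pupil-center travel must be positive")
    # Total angular range (2 * theta_max) spread over the maximum pixel travel.
    return (2.0 * theta_max_deg) / d_max_px


if __name__ == "__main__":
    # Example: +/-30 degrees of eye rotation mapped onto 120 pixels of travel
    # yields an estimated resolution of 0.5 degrees per pixel.
    print(gaze_resolution_deg_per_px(30.0, 120.0))
```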