References
- M. Kassner, W. Patera, and A. Bulling, "Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction," Proc. 2014 ACM Int. Jt. Conf. Pervasive Ubiquitous Comput. Adjun. Publ., pp. 1151-1160, 2014.
- M. L. Mele and S. Federici, "Gaze and eye-tracking solutions for psychological research," Cogn. Process., vol. 13, no. 1 Suppl., 2012.
- K. Wang, S. Wang, and Q. Ji, "Deep eye fixation map learning for calibration-free eye gaze tracking," Proc. Ninth Bienn. ACM Symp. Eye Track. Res. Appl. - ETRA '16, pp. 47-55, 2016.
- A. Wawro, "Gamasutra - Windows to the soul: Fove makes a case for eye-tracking VR games," Gamasutra, 2015. [Online]. Available: https://goo.gl/8Yn3VS.
- S. Julier et al., "Information filtering for mobile augmented reality," Proc. - IEEE ACM Int. Symp. Augment. Reality, ISAR 2000, pp. 3-11, 2000.
- T. Toyama, J. Orlosky, D. Sonntag, and K. Kiyokawa, "A natural interface for multi-focal plane head mounted displays using 3D gaze," Proc. 12th Int. Work. Conf. Adv. Vis. Interfaces (AVI 2014), vol. 2, pp. 25-32, 2014.
- Y. M. Kwon and J. K. Shul, "Experimental researches on gaze-based 3D interaction to stereo image display," Technol. E-Learning Digit. Entertain. Proc., vol. 3942, pp. 1112-1120, 2006.
- J. W. Lee, C. W. Cho, K. Y. Shin, E. C. Lee, and K. R. Park, "3D gaze tracking method using Purkinje images on eye optical model and pupil," Opt. Lasers Eng., vol. 50, no. 5, pp. 736-751, 2012. https://doi.org/10.1016/j.optlaseng.2011.12.001
- Y. Lee et al., "Estimating Gaze Depth Using Multi-Layer Perceptron," in International Symposium on Ubiquitous Virtual Reality (ISUVR), 2017, pp. 26-29.
- B. Benninger, "Google Glass, ultrasound and palpation: The anatomy teacher of the future?," Clin. Anat., vol. 28, no. 2, pp. 152-155, 2015. https://doi.org/10.1002/ca.22480
- R. Furlan, "The future of augmented reality: Hololens - Microsoft's AR headset shines despite rough edges [Resources-Tools and Toys]," IEEE Spectr., vol. 53, no. 6, p. 21, 2016. https://doi.org/10.1109/MSPEC.2016.7473143
- P. Olson, "Epson Smart Glasses Browse YouTube With A Nod And Tilt Of The Head.," Forbes.com, p. 9, 2013.
- 박재승 and 석윤찬, "An advertisement analysis system using a gaze-tracking virtual reality device," Smart Media Journal, vol. 5, no. 3, pp. 62-66, 2016.
- 김영상 and 김영익, "Implementation of immersive-experience app content with object recognition for tourist attractions using augmented reality," Smart Media Journal, vol. 5, no. 1, pp. 122-129, 2016.
- 이영천, "Development of cultural content using markerless-tracking-based augmented reality," Smart Media Journal, vol. 5, no. 4, pp. 90-95, 2016.
- K. Essig, M. Pomplun, and H. Ritter, "A neural network for 3D gaze recording with binocular eye trackers," Int. J. Parallel, Emergent Distrib. Syst., vol. 21, no. 2, pp. 79-95, 2006. https://doi.org/10.1080/17445760500354440
- A. T. Duchowski, B. Pelfrey, D. H. House, and R. Wang, "Measuring gaze depth with an eye tracker during stereoscopic display," in Proceedings of the ACM SIGGRAPH Symposium on Applied Perception in Graphics and Visualization - APGV '11, 2011, p. 15.
- Y. Lee, K. Masai, K. Kunze, M. Sugimoto, and M. Billinghurst, "A Remote Collaboration System with Empathy Glasses," in Adjunct Proceedings of the 2016 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2016, 2016, pp. 342-343.
- F. Pedregosa et al., "Scikit-learn: Machine Learning in Python," J. Mach. Learn. Res., vol. 12, pp. 2825-2830, 2011.