Method for Automatic Switching Screen of OST-HMD using Gaze Depth Estimation

  • Received : 2018.03.10
  • Accepted : 2018.03.30
  • Published : 2018.03.31

Abstract

In this paper, we propose a method for automatically switching the screen of an OST-HMD on and off using gaze depth estimation. The proposed method trains a Multi-layer Perceptron (MLP) on the user's gaze information paired with the distance of the object being viewed, so that the distance can later be estimated from gaze information alone. In the learning phase, eye-related features are collected with a wearable eye tracker and fed into the MLP to train and generate the model. In the inference phase, eye-related features obtained from the eye tracker in real time are input to the MLP to produce an estimated depth value. Finally, this estimate is used to decide whether to turn the HMD display on or off. A prototype was implemented and experiments were conducted to evaluate the feasibility of the proposed method.
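To make the learn-then-infer pipeline concrete, below is a minimal sketch using scikit-learn's MLPRegressor. The feature layout (four per-eye pupil coordinates), network size, 1 m switching threshold, and the on-for-near/off-for-far convention are illustrative assumptions, not details from the paper; the synthetic training data merely stands in for samples collected with the wearable eye tracker at known object distances.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# --- Learning phase ---------------------------------------------------------
# Synthetic stand-in for eye-tracker data: four eye-related features per
# sample (here, hypothetical normalized pupil centers of the left and right
# eyes). The target is the distance (m) of the viewed object; a vergence-like
# feature difference generates it so the example is self-contained.
rng = np.random.default_rng(0)
X_train = rng.uniform(-1.0, 1.0, size=(500, 4))
y_train = (0.5 + 2.0 * np.abs(X_train[:, 0] - X_train[:, 2])
           + rng.normal(0.0, 0.05, size=500))

# Scale features, then fit a small MLP regressor; the layer sizes are an
# arbitrary choice for the sketch, not the paper's configuration.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)

# --- Inference phase --------------------------------------------------------
def display_should_be_on(eye_features, near_threshold_m=1.0):
    """Estimate gaze depth from real-time eye-related features and decide
    the screen state. Both the 1 m threshold and the direction of the
    switch (on for near gaze, off for far gaze) are assumptions."""
    depth_m = model.predict(
        np.asarray(eye_features, dtype=float).reshape(1, -1))[0]
    return depth_m <= near_threshold_m

print(display_should_be_on([0.1, 0.0, -0.1, 0.0]))  # near gaze -> True
```

In an actual deployment, the training pairs would come from a calibration session in which the user fixates targets at measured distances while the eye tracker records features, and the on/off decision would typically be smoothed over several frames to avoid flicker at the threshold.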
