Gaze Tracking with Low-cost EOG Measuring Device

  • Jang, Seung-Tae (Department of Biomedical Engineering, Tongmyung University)
  • Lee, Jung-Hwan (Department of Biomedical Engineering, Tongmyung University)
  • Jang, Jae-Young (Department of Biomedical Engineering, Tongmyung University)
  • Chang, Won-Du (School of Electronic and Biomedical Engineering, Tongmyung University)
  • Received : 2018.08.30
  • Accepted : 2018.11.20
  • Published : 2018.11.28

Abstract

Gaze-tracking experiments were conducted using an Arduino and a low-cost biosignal amplifier, and the results were analyzed. In this study, a simple four-directional eye-movement recognition experiment was carried out together with an eye-writing recognition experiment, in which English alphabet letters and other symbols are written directly with the eyes, in order to evaluate the practicality of the newly assembled electrooculogram (EOG) measuring device and, further, to examine whether a low-cost EOG device can serve as a complex human-computer interaction tool such as eye-writing. For the experiments, a low-cost EOG measuring device was built from PSL-iEOG amplifiers and an Arduino, and dynamic positional warping and the wavelet transform were used for pattern classification. The results show that, with a relatively simple algorithm, the low-cost device recognized gaze direction with an accuracy close to 90% when no external noise was introduced, and for eye-writing a median accuracy of 90% was achieved over five patterns. However, the accuracy dropped sharply as the number of patterns increased; further research, such as the development of algorithms specialized for low-cost devices, is therefore needed to realize the main advantage of eye-writing, namely the direct input of a wide variety of patterns.

This paper describes gaze-tracking experiments that utilize a low-cost electrooculogram (EOG) measuring device. The goal of the experiments is to verify whether the low-cost device can be used as a complex human-computer interaction tool such as eye-writing. Two experiments were conducted for this goal: simple gaze tracking of four-directional eye movements, and eye-writing, in which letters or shapes are drawn in a virtual space. Eye-written letters were acquired with two PSL-iEOG amplifiers and an Arduino Uno; they were classified by dynamic positional warping after being preprocessed with a wavelet function. The results show that the recognition accuracy of the four-directional recognition is close to 90% when noise is controlled, and a similar median accuracy (90.00%) was achieved for eye-writing when the number of writing patterns was limited to five. In future work, additional algorithms for stabilizing the signal need to be developed.
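To make the processing chain sketched in the abstract more concrete, the minimal example below illustrates the general idea of the recognition pipeline: the horizontal and vertical EOG channels are denoised with a wavelet decomposition, and an unknown trace is then assigned to the nearest stored template under an elastic distance. It is a sketch only: plain dynamic time warping is used as a simple stand-in for the dynamic positional warping employed in the paper, and the wavelet (db4), decomposition level, synthetic traces, and class labels are illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch of the pipeline described in the abstract:
# wavelet preprocessing of the two EOG channels, then template matching
# with a dynamic-time-warping distance (a stand-in for dynamic positional
# warping). Sampling, wavelet choice, and data are assumptions.

import numpy as np
import pywt


def wavelet_denoise(signal, wavelet="db4", level=4):
    """Suppress high-frequency noise by zeroing the finest detail coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # keep the approximation and coarse details, zero the two finest detail bands
    coeffs[-1] = np.zeros_like(coeffs[-1])
    coeffs[-2] = np.zeros_like(coeffs[-2])
    return pywt.waverec(coeffs, wavelet)[: len(signal)]


def dtw_distance(a, b):
    """Plain DTW over 2-D (horizontal, vertical) EOG trajectories."""
    n, m = len(a), len(b)
    d = np.full((n + 1, m + 1), np.inf)
    d[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            d[i, j] = cost + min(d[i - 1, j], d[i, j - 1], d[i - 1, j - 1])
    return d[n, m]


def classify(sample, templates):
    """Return the label of the template with the smallest DTW distance.

    `sample` and each template are arrays of shape (T, 2): columns hold the
    denoised horizontal and vertical EOG channels.
    """
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # toy horizontal/vertical traces standing in for recorded EOG epochs
    t = np.linspace(0, 1, 200)
    left = np.stack([-np.sin(np.pi * t), 0.05 * rng.standard_normal(200)], axis=1)
    right = np.stack([np.sin(np.pi * t), 0.05 * rng.standard_normal(200)], axis=1)
    templates = {"left": left, "right": right}

    probe = right + 0.1 * rng.standard_normal(right.shape)
    probe = np.stack([wavelet_denoise(probe[:, 0]), wavelet_denoise(probe[:, 1])], axis=1)
    print(classify(probe, templates))  # expected: "right"
```

In the paper's setting, the templates would be eye-written letter or directional-movement trajectories recorded with the PSL-iEOG amplifiers and the Arduino Uno, rather than the synthetic sine traces used above.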

Keywords

Fig. 1. EOG measuring device

Fig. 2. Location of electrodes.

Fig. 3. Experimental Procedure

Fig. 4. Overall Structure of the Algorithm

Fig. 5. Median accuracy across subjects as the number of classes increases

Table 1. Branches of constraint slope for DPW

Table 2. Confusion matrix for eye-movement recognitions

Table 3. Recognition accuracy of directional eye movements

Table 4. Recognition results of eye-written characters. S, B, and E denote the space, backspace, and enter symbols, respectively.

Table 5. Recognition accuracy according to subjects

Table 6. Recognition accuracy of eye-writing
