http://dx.doi.org/10.9717/kmms.2020.23.12.1454

Improving Eye-gaze Mouse System Using Mouth Open Detection and Pop Up Menu  

Byeon, Ju Yeong (Dept. of The Global School of Media, College of Information Technology, Soongsil University)
Jung, Keechul (Dept. of The Global School of Media, College of Information Technology, Soongsil University)
Abstract
An important factor in an eye-tracking PC interface for patients with general paralysis is the mouse interface used to manipulate the GUI. With a well-implemented mouse interface, users can generate mouse events exactly at the point of their choosing. Defining this selection interaction with gaze alone, however, is difficult, because the pointer follows wherever the user looks and unintended selections occur. This is known as the Midas touch problem and has been a major focus of eye-tracking research. Many attempts have been made to solve it with blink or voice input, but these are unsuitable for patients with general paralysis, some of whom cannot wink or speak. In this paper, we propose a mouth-triggered pop-up, eye-tracking mouse interface that solves the Midas touch problem and is suitable for such patients, using only a common RGB camera. The proposed interface detects the opening and closing of the mouth and, on detection, activates a pop-up menu from which the user selects the desired mouse event. A performance experiment after implementation showed that, compared with the existing method, both the number of malfunctions and the task-completion time were reduced.
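The paper's implementation is not included in this record. As a rough, illustrative sketch of the mouth-open trigger described above, the following Python snippet detects mouth opening from facial landmarks captured by a common RGB camera and raises a pop-up callback. MediaPipe Face Mesh, the landmark indices, the aspect-ratio threshold, and the debounce count are assumptions chosen for illustration, not the authors' actual pipeline.

# Sketch (not the authors' code): mouth-open detection from RGB-camera landmarks.
# Assumptions: MediaPipe Face Mesh, illustrative threshold and debounce values.
import cv2
import mediapipe as mp

UPPER_LIP, LOWER_LIP, LEFT_CORNER, RIGHT_CORNER = 13, 14, 61, 291  # Face Mesh indices
OPEN_RATIO = 0.35       # assumed mouth-aspect-ratio threshold for "open"
CONSEC_FRAMES = 3       # assumed debounce to suppress one-frame false triggers

def mouth_aspect_ratio(lm):
    """Vertical inner-lip gap divided by mouth width (normalized coordinates)."""
    gap = abs(lm[UPPER_LIP].y - lm[LOWER_LIP].y)
    width = abs(lm[LEFT_CORNER].x - lm[RIGHT_CORNER].x)
    return gap / width if width > 0 else 0.0

def show_popup_menu():
    # Placeholder: the actual interface draws a menu of mouse events at the gaze point.
    print("mouth opened -> show pop-up menu")

def run():
    cap = cv2.VideoCapture(0)
    face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
    open_frames = 0
    menu_shown = False
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            open_frames = open_frames + 1 if mouth_aspect_ratio(lm) > OPEN_RATIO else 0
            if open_frames >= CONSEC_FRAMES and not menu_shown:
                menu_shown = True
                show_popup_menu()
            elif open_frames == 0:
                menu_shown = False          # mouth closed again; allow the next trigger
        if cv2.waitKey(1) & 0xFF == 27:     # Esc quits
            break
    cap.release()

if __name__ == "__main__":
    run()

In the proposed interface the pop-up lists mouse events that the user then selects with gaze; the selected event could, for example, be injected through the Win32 API declared in winuser.h (reference 9), though the exact menu contents and event-injection path used by the authors are not shown here.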
Keywords
HCI; Interface; Eye Gaze; Landmark Detection; Mouse; Midas Touch Problem; Assistive Technology; Spinal Cord Injury;
Citations & Related Records
Times Cited By KSCI: 2
1 WHO and ISCoS, International Perspectives on Spinal Cord Injury, World Health Organization, Geneva, Switzerland, 2013.
2 X. Zhang, X. Liu, S.M. Yuan, and S.F. Lin, "Eye Tracking Based Control System for Natural Human-computer Interaction," Computational Intelligence and Neuroscience, Vol. 2017, No. 1, pp. 1-9, 2017.
3 H. Singh and J. Singh, "Real-time Eye Blink and Wink Detection for Object Selection in HCI Systems," Journal on Multimodal User Interfaces, Vol. 12, No. 1, pp. 55-65, 2018.
4 L.L. Palmer, "Inability to Wink an Eye and Eye Dominance," Perceptual and Motor Skills, Vol. 42, No. 3, pp. 825-826, 1976.
5 S. Soundarajan and H. Cecotti, "A Gaze-based Virtual Keyboard Using a Mouth Switch for Command Selection," Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 3334-3337, 2018.
6 J.H. Park, M.H. Park, and S.B. Lim, "A Proposal of Eye-voice Method Based on the Comparative Analysis of Malfunctions on Pointer Click in Gaze Interface for the Upper Limb Disabled," Journal of Korea Multimedia Society, Vol. 23, No. 4, pp. 566-573, 2020.
7 W.A. Choi, J.C. Shin, D.Y. Lee, D.H. Kim, S.D. Kim, S.W. Kang, et al., "Noninvasive Respiratory Management for Patients with Cervical Spinal Cord Injury," Annals of Rehabilitation Medicine, Vol. 34, No. 5, pp. 518-523, 2010.
8 H.N. H., K.M. Um, E.B. Song, C.S. Kim, and J.W. Heo, "Development and Clinical Evaluation of Wireless Gyro-mouse for the Upper Extremity Disabled to Use Computer," Science of Emotion and Sensibility, Vol. 9, No. 2, pp. 93-100, 2006.
9 Winuser.h (2019), https://docs.microsoft.com/en-us/windows/win32/api/winuser (accessed July 24, 2020).
10 Interaction Library (2018), https://developer.tobii.com/consumer-eye-trackers-interaction-library (accessed July 9, 2020).
11 W. Barfield and T. Furness, Virtual Environments and Advanced Interface Design, Oxford University Press, New York, 1995.