http://dx.doi.org/10.9717/kmms.2020.23.4.566

A Proposal of Eye-Voice Method based on the Comparative Analysis of Malfunctions on Pointer Click in Gaze Interface for the Upper Limb Disabled  

Park, Joo Hyun (Research Institute of ICT Convergence, Dept. of IT Engineering, Sookmyung Women's University)
Park, Mi Hyun (Dept. of IT Engineering, Sookmyung Women's University)
Lim, Soon-Bum (Research Institute of ICT Convergence, Dept. of IT Engineering, Sookmyung Women's University)
Abstract
The computer is the most common device for accessing the Internet, and it is typically operated with a mouse to select and execute on-screen objects. Eye tracking technology is welcomed as an alternative that lets users who cannot use their hands because of a disability control a computer. However, the pointer execution (click) methods of existing eye tracking interfaces cause frequent malfunctions. In this paper, we therefore developed a gaze tracking interface combined with voice commands to solve the malfunction problem that arises when people with upper limb disabilities use existing gaze tracking technology to execute computer menus and objects. Usability was verified through comparative experiments on the reduction of these malfunctions. Users with upper limb disabilities move the pointer with eye tracking while browsing the computer screen and issue a voice command such as "okay" to click instantly. In comparative experiments on the reduction of pointer execution malfunctions against existing gaze interfaces, we verified that our system, Eye-Voice, lowers the malfunction rate of pointer execution and is effective for users with upper limb disabilities.
Keywords
Eye Tracking; Voice Command; Upper Limb Disabilities; Pointer Control; Accessibility; Device Accessibility
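
The interaction loop described in the abstract (the pointer continuously follows the gaze point, and a spoken trigger word such as "okay" performs the click) can be illustrated with the minimal sketch below. This is not the authors' implementation: it assumes the tobii_research Python bindings for gaze data, the SpeechRecognition package as a stand-in for the Google Cloud Speech API, and pointer movement and click injection through the Windows user32 API via ctypes.

    # Minimal sketch of an Eye-Voice style loop: gaze moves the pointer,
    # the spoken word "okay" clicks at the current pointer position.
    # All package and API choices here are assumptions, not the paper's code.
    import ctypes

    import speech_recognition as sr        # pip install SpeechRecognition
    import tobii_research as tr            # Tobii Pro SDK Python bindings

    user32 = ctypes.windll.user32
    SCREEN_W = user32.GetSystemMetrics(0)
    SCREEN_H = user32.GetSystemMetrics(1)
    MOUSEEVENTF_LEFTDOWN, MOUSEEVENTF_LEFTUP = 0x0002, 0x0004

    def on_gaze(gaze_data):
        """Move the pointer to the average of the left and right gaze points."""
        lx, ly = gaze_data['left_gaze_point_on_display_area']
        rx, ry = gaze_data['right_gaze_point_on_display_area']
        x, y = (lx + rx) / 2, (ly + ry) / 2      # normalized 0..1 coordinates
        if x == x and y == y:                    # skip NaN samples (blinks)
            user32.SetCursorPos(int(x * SCREEN_W), int(y * SCREEN_H))

    def click_here():
        """Issue a left click at the current pointer position."""
        user32.mouse_event(MOUSEEVENTF_LEFTDOWN, 0, 0, 0, 0)
        user32.mouse_event(MOUSEEVENTF_LEFTUP, 0, 0, 0, 0)

    def listen_for_commands():
        """Click whenever the trigger word "okay" is recognized."""
        recognizer = sr.Recognizer()
        with sr.Microphone() as mic:
            recognizer.adjust_for_ambient_noise(mic)
            while True:
                audio = recognizer.listen(mic, phrase_time_limit=2)
                try:
                    if 'okay' in recognizer.recognize_google(audio).lower():
                        click_here()
                except sr.UnknownValueError:
                    pass                          # no intelligible speech

    if __name__ == '__main__':
        eyetracker = tr.find_all_eyetrackers()[0]
        eyetracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, on_gaze,
                                as_dictionary=True)
        listen_for_commands()   # blocks; gaze callbacks run on the SDK thread

In this sketch the gaze callback runs on the eye tracker SDK's own thread while the speech loop blocks the main thread, so no explicit threading is needed; a real system would add smoothing of the gaze signal and a lower-latency streaming speech front end.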