http://dx.doi.org/10.9717/kmms.2021.24.7.903

Object Magnification and Voice Command in Gaze Interface for the Upper Limb Disabled  

Park, Joo Hyun (Research Institute of ICT Convergence, Sookmyung Women's University)
Jo, Se-Ran (Dept. of IT Engineering, Sookmyung Women's University)
Lim, Soon-Bum (Dept. of IT Engineering, Sookmyung Women's University)
Abstract
Eye-tracking research for people with upper limb disabilities has proven effective for device control. However, eye tracking alone is not sufficient for web interaction. In a previous study, the Eye-Voice interface, a gaze-tracking interface supplemented with voice commands, was proposed to solve the pointer-execution malfunctions of existing gaze-tracking interfaces, and a comparison experiment against an existing interface confirmed that it reduced the pointer malfunction rate. That work also identified another major cause of malfunction: the difficulty of pointing at small execution objects in the web environment. In this study, we propose an interface that automatically magnifies objects so that people with upper limb disabilities can freely click web content, addressing the pointing and execution difficulties caused by the high density and tight arrangement of execution objects on web pages.
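The core idea of the abstract can be sketched as follows. This is a minimal illustrative Python sketch, not the authors' implementation: when several clickable targets crowd inside the gaze pointer's error radius, the region is zoomed until each target reaches a comfortably selectable size. All names, thresholds, and the zoom formula are assumptions for illustration.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Target:
    """Bounding box of a clickable web element (assumed representation)."""
    x: float  # center x (px)
    y: float  # center y (px)
    w: float  # width (px)
    h: float  # height (px)

GAZE_ERROR_RADIUS = 40.0  # assumed gaze-tracking inaccuracy (px)
MIN_TARGET_SIZE = 48.0    # assumed comfortably selectable size (px)

def targets_near_gaze(targets, gx, gy, radius=GAZE_ERROR_RADIUS):
    """Return the targets whose centers fall inside the gaze error circle."""
    return [t for t in targets if hypot(t.x - gx, t.y - gy) <= radius]

def magnification_factor(targets, gx, gy):
    """Zoom factor so the smallest crowded target reaches MIN_TARGET_SIZE.

    Returns 1.0 (no zoom) when at most one target lies near the gaze point,
    i.e. the pointer can already select it unambiguously.
    """
    near = targets_near_gaze(targets, gx, gy)
    if len(near) <= 1:
        return 1.0
    smallest = min(min(t.w, t.h) for t in near)
    return max(1.0, MIN_TARGET_SIZE / smallest)
```

For example, two 20 px links within the error radius would yield a factor of 48/20 = 2.4, while an isolated target yields 1.0 and the page is left unmagnified.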
Keywords
Eye Tracking; Auto Magnification of Objects; Upper Limb Disabilities; Pointer Control; Accessibility; Device Accessibility
Citations & Related Records
  • Reference
1 Google Cloud Speech API (2020). https://cloud.google.com/speech-to-text/?hl=ko (accessed March 24, 2020).
2 HTML5 (2011). https://www.w3.org/TR/2011/WD-html5-20110405/ (accessed March 24, 2020).
3 Tobii Research API (2020). http://developer.tobiipro.com/ (accessed March 24, 2020).
4 Tobii Eye Tracker X130 (2019). https://www.tobiipro.com/ (accessed March 24, 2020).
5 J.H. Park, M.H. Park, and S.B. Lim, "A Proposal of Eye-Voice Method based on the Comparative Analysis of Malfunctions on Pointer Click in Gaze Interface for the Upper Limb Disabled," Journal of Korea Multimedia Society, Vol. 23, No. 4, pp. 566-573, 2020.
6 J.H. Park, Multimodal Interface to Improve Digital Device Accessibility for the People with Disabilities in Web Environment, Doctoral Dissertation of Sookmyung Women's University, 2020.
7 A. Murata, R. Uetsugi, and T. Hayami, "Study on Cursor Shape Suitable for Eye-Gaze Input System," Proceedings of SICE Annual Conference(SICE), pp. 926-931, 2012.
8 J.-R. Choi, A Digital Publishing Framework for Crowdsourcing based Adaptive ebook Contents, Doctoral Dissertation of Sookmyung Women's University, 2017.
9 A. Chetcuti and C. Porter, "Butterfleye: Supporting the Development of Accessible Web Applications for Users with Severe Motor Impairment," Proceedings of the 30th International BCS Human Computer Interaction Conference, pp. 1-3, 2016.
10 R. Menges, C. Kumar, D.J. Muller, and K. Sengupta, "GazeTheWeb: A Gaze-Controlled Web Browser," Proceedings of the 14th Web for All Conference on The Future of Accessible Work, Article No. 25, 2017.
11 S. Kwak, I. Kim, D. Sim, S.H. Lee, and S.S. Hwang, "A Computer Access System for the Physically Disabled Using Eye-Tracking and Speech Recognition," Journal of the HCI Society of Korea, Vol. 12, No. 4, pp. 5-15, 2017.
12 Python (2020). https://www.python.org/ (accessed March 24, 2020).