• Title/Summary/Keyword: gaze-based user interface

A Study on Interaction of Gaze-based User Interface in Mobile Virtual Reality Environment

  • Kim, Mingyu;Lee, Jiwon;Jeon, Changyu;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society, v.23 no.3, pp.39-46, 2017
  • This study proposes a gaze-based user interface to provide user-oriented interaction suitable for virtual reality environments on mobile platforms. For this purpose, mobile platform-based three-dimensional interactive content is produced to test whether the proposed interface increases user satisfaction through interactions in a mobile virtual reality environment. The gaze-based interface, the most common input method for mobile virtual reality content, is designed around two factors: the field of view and the feedback system. Its performance is analyzed through experiments on whether it motivates user interest, enhances immersion, offers a format differentiated from existing interfaces, and is convenient for operating content.
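
The two design factors named in the abstract, field of view and feedback, are commonly realized as a gaze cursor that selects a target by dwelling on it while the UI animates progress. Below is a minimal sketch of such a dwell-based selector; the class name, timing value, and API are illustrative assumptions, not details from the paper.

```python
import time

DWELL_TIME = 1.0  # seconds the gaze must rest on a target (illustrative value)

class GazeDwellSelector:
    """Selects a target once the gaze has rested on it for DWELL_TIME seconds,
    reporting progress so the UI can draw feedback (e.g., a filling ring)."""

    def __init__(self, dwell_time=DWELL_TIME):
        self.dwell_time = dwell_time
        self.current_target = None
        self.dwell_start = None

    def update(self, hit_target, now=None):
        """hit_target: the UI element under the gaze ray this frame, or None.
        Returns (selected_target_or_None, feedback_progress in [0, 1])."""
        now = time.monotonic() if now is None else now
        if hit_target != self.current_target:
            # Gaze moved to a new target: restart the dwell timer.
            self.current_target = hit_target
            self.dwell_start = now if hit_target is not None else None
        if self.current_target is None:
            return None, 0.0
        progress = (now - self.dwell_start) / self.dwell_time
        if progress >= 1.0:
            selected = self.current_target
            self.current_target, self.dwell_start = None, None
            return selected, 1.0
        return None, progress
```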

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS), v.7 no.4, pp.834-848, 2013
  • To use a smartphone's many functions effectively, several kinds of human-phone interfaces are used, such as touch, voice, and gesture. However, the primary touch interface cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric position between the user's face and the phone screen, and the low resolution of frontal cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for a smartphone-based gaze tracking method. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflective rectangle and the screen. Fifth, a prototype mock-up design module is made to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800, and the average hit ratio on a 5×4 icon grid was 94.6%.
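
The fourth novelty, computing the gaze position from only two LED reflections via a 2D geometric relation, can be illustrated with a glint-normalized mapping. This is a rough sketch under assumed calibration parameters; the paper's exact geometry between the reflective rectangle and the screen is not reproduced here.

```python
import numpy as np

def gaze_from_glints(pupil, glint_left, glint_right, calib_scale, calib_offset,
                     screen_size=(480, 800)):
    """Map the pupil center, expressed relative to two corneal glints, to a
    screen coordinate. calib_scale and calib_offset are 2-vectors assumed to
    come from a user calibration step (illustrative, not the paper's method)."""
    pupil = np.asarray(pupil, dtype=float)
    g_l = np.asarray(glint_left, dtype=float)
    g_r = np.asarray(glint_right, dtype=float)

    origin = (g_l + g_r) / 2.0                # midpoint of the two reflections
    baseline = np.linalg.norm(g_r - g_l)      # normalizes eye-to-camera distance
    normalized = (pupil - origin) / baseline  # scale-invariant gaze offset

    screen = normalized * calib_scale + calib_offset
    return np.clip(screen, [0, 0], np.asarray(screen_size, dtype=float) - 1)
```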

A New Ergonomic Interface System for the Disabled Person

  • Heo, Hwan;Lee, Ji-Woo;Lee, Won-Oh;Lee, Eui-Chul;Park, Kang-Ryoung
    • Journal of the Ergonomics Society of Korea, v.30 no.1, pp.229-235, 2011
  • Objective: To develop a new ergonomic interface system, based on a camera vision system, that helps disabled people in the home environment. Background: The proposed interface system enables disabled users to manipulate consumer electronics. Method: A wearable device that captures the eye image using a near-infrared (NIR) camera and illuminators is proposed for tracking the eye gaze position (Heo et al., 2011). A frontal-viewing camera is attached to the wearable device, which can recognize the consumer electronics to be controlled (Heo et al., 2011). The amount of the user's eye fatigue is measured based on the eye blink rate, and when the fatigue exceeds a predetermined level, the system automatically switches from the gaze-based interface mode to manual selection. Results: The experimental results showed that the gaze estimation error of the proposed method was 1.98 degrees, with successful recognition of the object by the frontal-viewing camera (Heo et al., 2011). Conclusion: We built a new ergonomic interface system based on gaze tracking and object recognition. Application: The proposed system can be used to help disabled people in the home environment.
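
The blink-rate fatigue fallback described in the Method section can be sketched as a sliding-window rate monitor. The threshold and window values below are assumptions for illustration, not the paper's parameters.

```python
from collections import deque
import time

FATIGUE_BLINK_RATE = 25.0  # blinks per minute; an assumed threshold
WINDOW_SECONDS = 60.0      # sliding window for estimating the blink rate

class FatigueMonitor:
    """Tracks blink timestamps and switches the interface from gaze mode to
    manual selection once the blink rate exceeds the fatigue threshold."""

    def __init__(self, threshold=FATIGUE_BLINK_RATE, window=WINDOW_SECONDS):
        self.threshold = threshold
        self.window = window
        self.blinks = deque()
        self.mode = "gaze"

    def on_blink(self, now=None):
        now = time.monotonic() if now is None else now
        self.blinks.append(now)
        # Drop blinks that have fallen out of the sliding window.
        while self.blinks and now - self.blinks[0] > self.window:
            self.blinks.popleft()
        rate_per_minute = len(self.blinks) * 60.0 / self.window
        if rate_per_minute > self.threshold:
            self.mode = "manual"  # fatigue detected: fall back to manual selection
        return self.mode
```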

Performance Comparison of Manual and Touch Interface using Video-based Behavior Analysis

  • Lee, Chai-Woo;Bahn, Sang-Woo;Kim, Ga-Won;Yun, Myung-Hwan
    • Journal of the Ergonomics Society of Korea, v.29 no.4, pp.655-659, 2010
  • The objective of this study is to quantitatively incorporate user observation into the usability evaluation of mobile interfaces, using monitoring techniques from first- and third-person points of view. An experiment was conducted to monitor and record users' behavior using Ergoneers Dikablis, a gaze tracking device. The experiment used two mobile phones, one with a button keypad interface and one with a touchscreen interface, for comparative analysis. The subjects were 20 people with similar experience and proficiency in using mobile devices. Data from the video recordings were coded with Noldus Observer XT to find usage patterns and to gather quantitative data for analysis in terms of effectiveness, efficiency, and satisfaction. Results showed that the button keypad interface was generally better than the touchscreen interface: the movements of the fingers and gaze were much simpler when performing the given tasks on the button keypad. While previous studies have mostly evaluated usability with performance measures that consider only task results, this study contributes a method in which the behavioral patterns of interaction are evaluated.
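
The effectiveness and efficiency measures derived from the coded video data might be aggregated as in the sketch below; the field names are illustrative, since the abstract does not publish the coding scheme.

```python
from dataclasses import dataclass

@dataclass
class CodedTrial:
    """One coded task trial from the video annotations (fields are illustrative;
    the study coded events with Noldus Observer XT)."""
    completed: bool
    duration_s: float   # task time between the coded start and end events
    action_count: int   # coded finger/gaze movements during the task

def usability_metrics(trials):
    """Aggregate coded trials into effectiveness and efficiency measures.
    Satisfaction would come from questionnaires, not from the video coding."""
    done = [t for t in trials if t.completed]
    effectiveness = len(done) / len(trials)  # task success rate
    if not done:
        return effectiveness, float("nan"), float("nan")
    mean_time = sum(t.duration_s for t in done) / len(done)
    mean_actions = sum(t.action_count for t in done) / len(done)
    return effectiveness, mean_time, mean_actions

# Comparing the two interfaces amounts to calling usability_metrics() once per
# condition (button keypad vs. touchscreen) and comparing the resulting tuples.
```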

MyWorkspace: VR Platform with an Immersive User Interface

  • Yoon, Jong-Won;Hong, Jin-Hyuk;Cho, Sung-Bae
    • Proceedings of the HCI Society of Korea Conference, 2009.02a, pp.52-55, 2009
  • With the recent development of virtual reality, user interfaces for immersive interaction have been actively investigated. Immersive user interfaces improve the efficiency and capability of information processing in virtual environments that provide various services, and they enable effective interaction in ubiquitous and mobile computing. In this paper, we propose a virtual reality platform, MyWorkspace, which renders a 3D virtual workspace through an immersive user interface. We develop an interface that integrates an optical see-through head-mounted display, a Wii remote controller, and a helmet with infrared LEDs. It estimates the user's gaze direction as horizontal and vertical angles based on a model of head movements. MyWorkspace expands the current monitor-based 2D workspace into a layered 3D workspace and renders the part of the 3D virtual workspace corresponding to the gaze direction. The user can arrange various tasks in the virtual workspace and switch between them by moving his or her head. We also verify the performance of the immersive user interface and its usefulness through a usability test.
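
Mapping the estimated gaze direction (horizontal and vertical angles from the head movement model) to a panel of the layered 3D workspace could look like the sketch below; the panel size and grid layout are assumptions, not values from MyWorkspace.

```python
PANEL_ANGLE_DEG = 30.0  # assumed angular width/height of one workspace panel

def panel_from_head_pose(yaw_deg, pitch_deg, columns=3, rows=2):
    """Pick the workspace panel the user is facing from head yaw/pitch angles.
    A minimal sketch; the paper's HMD + Wii remote + IR helmet pipeline that
    produces the angles is not reproduced here."""
    col = int(round(yaw_deg / PANEL_ANGLE_DEG)) + columns // 2
    row = int(round(pitch_deg / PANEL_ANGLE_DEG)) + rows // 2
    # Clamp to the grid so extreme head turns still land on an edge panel.
    col = max(0, min(columns - 1, col))
    row = max(0, min(rows - 1, row))
    return row, col
```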


Gaze Detection Based on Facial Features and Linear Interpolation on Mobile Devices

  • Ko, You-Jin;Park, Kang-Ryoung
    • Journal of Korea Multimedia Society, v.12 no.8, pp.1089-1098, 2009
  • Recently, much research on making more comfortable input devices based on gaze detection technology has been performed in the field of human-computer interfaces. Previous studies were performed in computer environments with large monitors. With the recent increase in the use of mobile devices, the need for gaze-detection-based interfaces in mobile environments has also grown. In this paper, we study a gaze detection method using a UMPC (Ultra-Mobile PC) and its embedded camera, based on face and facial feature detection by an AAM (Active Appearance Model). This paper has three novel contributions. First, unlike previous research, we propose a method for tracking the user's gaze position on a mobile device with a small screen. Second, we use an AAM to detect facial feature points. Third, gaze detection accuracy is not degraded by changes in Z distance, owing to the normalization of input features using features obtained in an initial user calibration stage. Experimental results showed that the gaze detection error was 1.77 degrees, and it was reduced by mouse dragging based on additional facial movement.
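
The linear interpolation idea can be sketched as mapping the current gaze feature into the square spanned by the features recorded while the user looked at the four screen corners during calibration. This is a simplified, axis-aligned version; the paper's AAM feature extraction and Z-distance normalization are not reproduced.

```python
def gaze_by_linear_interpolation(feature, corners, screen_w, screen_h):
    """feature: the gaze feature (x, y) for the current frame.
    corners: features recorded at the four screen corners during calibration,
    e.g. {"tl": (x, y), "tr": (x, y), "bl": (x, y), "br": (x, y)}."""
    fx, fy = feature
    (tlx, tly), (trx, _) = corners["tl"], corners["tr"]
    (_, bly) = corners["bl"]
    # Normalize the feature into [0, 1] along each axis of the calibration square.
    u = (fx - tlx) / (trx - tlx)
    v = (fy - tly) / (bly - tly)
    u = max(0.0, min(1.0, u))
    v = max(0.0, min(1.0, v))
    return u * screen_w, v * screen_h
```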


Adaptive Zoom-based Gaze Tracking for Enhanced Accuracy and Precision

  • Song, Hyunjoo;Jo, Jaemin;Kim, Bohyoung;Seo, Jinwook
    • KIISE Transactions on Computing Practices, v.21 no.9, pp.610-615, 2015
  • The accuracy and precision of video-based remote gaze trackers are affected by numerous factors (e.g., head movement of the participant). However, it is challenging to control all influential factors, and doing so (e.g., using a chin rest to control geometry) can forfeit the main benefit of gaze trackers: the ecological validity of their unobtrusive nature. We propose an adaptive zoom-based gaze tracking technique, ZoomTrack, that addresses this problem by improving the resolution of the gaze tracking results. Our approach magnifies a region of interest (ROI) and retrieves gaze points at a higher resolution under two different zooming modes: only when the gaze reaches the ROI (temporary) or whenever the participant stares at the stimuli (omnipresent). We compared these against a base case without magnification in a user study and use the results to summarize the advantages and limitations of our technique.
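
The arithmetic behind the accuracy gain is simple: gaze measured on a magnified ROI is divided by the zoom factor when mapped back, so a fixed tracker error covers proportionally fewer stimulus pixels. A minimal sketch with illustrative names:

```python
def remap_zoomed_gaze(gaze, roi_origin, zoom_factor):
    """Map a gaze point measured on the zoomed view back to original stimulus
    coordinates. ZoomTrack's mode switching (temporary vs. omnipresent zoom)
    would sit on top of a remapping like this."""
    gx, gy = gaze
    ox, oy = roi_origin  # top-left corner of the ROI in original coordinates
    return ox + gx / zoom_factor, oy + gy / zoom_factor

# With zoom_factor = 4, a 40-pixel tracker error on the zoomed view shrinks to
# a 10-pixel error on the original stimulus.
```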

A Study of Secure Password Input Method Based on Eye Tracking with Resistance to Shoulder-Surfing Attacks

  • Kim, Seul-gi;Yoo, Sang-bong;Jang, Yun;Kwon, Tae-kyoung
    • Journal of the Korea Institute of Information Security & Cryptology, v.30 no.4, pp.545-558, 2020
  • The gaze-based input provides feedback to confirm that the typed text is correct as the user types. Many studies have demonstrated that feedback can increase the usability of gaze-based input. However, because the typed text is revealed through the feedback, it can become a target for shoulder-surfing attacks. Appropriate feedback is needed to improve security without compromising the usability that the original feedback provides. In this paper, we propose a new gaze-based input method, FFI (Fake Flickering Interface), that resists shoulder-surfing attacks. Through experiments and questionnaires, we evaluate the usability and security of FFI compared to gaze-based input with the original feedback.
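
The abstract does not detail FFI's design, so the sketch below is only one plausible reading of "fake flickering": flicker feedback on randomly chosen decoy keys alongside the actually selected key, so that an onlooker cannot tell which flicker was real while the typist, who knows where they looked, still gets confirmation.

```python
import random

def feedback_targets(selected_key, all_keys, decoys=2):
    """Return the keys to flicker as feedback: the selected key plus a few
    random decoys, shuffled so their order reveals nothing. A hypothetical
    illustration; the paper's actual FFI design may differ."""
    others = [k for k in all_keys if k != selected_key]
    fakes = random.sample(others, min(decoys, len(others)))
    targets = fakes + [selected_key]
    random.shuffle(targets)
    return targets
```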

A Study on Gamepad/Gaze based Input Processing for Mobile Platform Virtual Reality Contents

  • Lee, Jiwon;Kim, Mingyu;Jeon, Changyu;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society, v.22 no.3, pp.31-41, 2016
  • This study proposes an input processing technique suitable for producing mobile platform-based virtual reality content. First, we produce mobile platform-based interactive virtual reality content to be used in experiments, designed so that users who experience it become immersed and interested and can control it easily. We then design the input processing technique in two ways: with a gamepad, which is readily accessible on mobile devices, and directly through the user's gaze on the interface. Through experiments with the proposed input processing techniques, we analyze their effects on user immersion and interest and whether they provide convenient content control. We also verify whether they introduce negative psychological elements such as sickness or fatigue.
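
The two input methods can be routed through a single selection event so the content logic stays identical under either interface; a sketch under assumed names, since the paper does not specify an API:

```python
from enum import Enum, auto

class InputMode(Enum):
    GAMEPAD = auto()
    GAZE = auto()

class VRInputRouter:
    """Routes both input methods into one 'select' event so the content logic
    does not depend on which interface is active."""

    def __init__(self, mode, gaze_selector):
        self.mode = mode
        self.gaze_selector = gaze_selector  # e.g., a dwell-based gaze selector

    def poll(self, gamepad_pressed, gaze_target):
        """Returns the target selected this frame, or None."""
        if self.mode is InputMode.GAMEPAD:
            # Gamepad: the reticle target is confirmed with a button press.
            return gaze_target if gamepad_pressed else None
        # Gaze-only: selection is confirmed by dwelling on the target, where
        # update() is assumed to return (selected_or_None, progress).
        selected, _progress = self.gaze_selector.update(gaze_target)
        return selected
```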

EOG-based User-independent Gaze Recognition using Wavelet Coefficients and Dynamic Positional Warping

  • Chang, Won-Du;Im, Chang-Hwan
    • Journal of Korea Multimedia Society, v.21 no.9, pp.1119-1130, 2018
  • Writing letters or patterns in a virtual space by moving one's gaze is called "eye writing," a promising tool for various human-computer interface applications. This paper investigates the use of conventional eye writing recognition algorithms for user-independent recognition of eye-written characters. Two algorithms are presented to build the user-independent system: eye-written region extraction using wavelet coefficients, and template generation. The experimental results demonstrated that with dynamic positional warping, an F1 score of 79.61% was achieved for 12 eye-written patterns, indicating the feasibility of user-independent eye writing.
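
Dynamic positional warping is a DTW-style alignment of positional signals. As a stand-in, the sketch below matches an eye-written trace against labeled templates with classic dynamic time warping; DPW itself uses different alignment rules and is not reproduced here.

```python
import numpy as np

def dtw_distance_2d(trace_a, trace_b):
    """Dynamic time warping distance between two (N, 2) arrays of gaze points."""
    a, b = np.asarray(trace_a, dtype=float), np.asarray(trace_b, dtype=float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])   # Euclidean point distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify_pattern(trace, templates):
    """Nearest-template classification: templates maps label -> template trace."""
    return min(templates, key=lambda label: dtw_distance_2d(trace, templates[label]))
```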