• Title/Summary/Keyword: Eye-gaze Interface

A Study of Secure Password Input Method Based on Eye Tracking with Resistance to Shoulder-Surfing Attacks (아이트래킹을 이용한 안전한 패스워드 입력 방법에 관한 연구 - 숄더 서핑 공격 대응을 중심으로)

  • Kim, Seul-gi;Yoo, Sang-bong;Jang, Yun;Kwon, Tae-kyoung
    • Journal of the Korea Institute of Information Security & Cryptology / v.30 no.4 / pp.545-558 / 2020
  • Gaze-based input provides feedback so that users can confirm their typing as they enter text. Many studies have shown that such feedback increases the usability of gaze-based input. However, because the feedback reveals the typed text, it can become a target for shoulder-surfing attacks. Appropriate feedback is needed to improve security without compromising the usability that the original feedback provides. In this paper, we propose a new gaze-based input method, FFI (Fake Flickering Interface), that resists shoulder-surfing attacks. Through experiments and questionnaires, we evaluated the usability and security of FFI against gaze-based input with the original feedback.

A Study on Manipulating Method of 3D Game in HMD Environment by using Eye Tracking (HMD(Head Mounted Display)에서 시선 추적을 통한 3차원 게임 조작 방법 연구)

  • Park, Kang-Ryoung;Lee, Eui-Chul
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.2 / pp.49-64 / 2008
  • Recently, much research on gaze-detection-based input devices has been done in human-computer interfacing. However, complicated hardware makes such systems expensive, and complicated user-calibration procedures make them difficult to use. In this paper, we propose a new gaze-detection method based on 2D analysis and a simple user calibration. Our method uses a small USB (Universal Serial Bus) camera attached to an HMD (Head-Mounted Display), a hot mirror, and an IR (Infra-Red) illuminator. Because the HMD moves with the user's head, the gaze-detection performance is not affected by facial movement. We apply the system to a 3D first-person shooting game: the game character's gaze direction is controlled by our method and can be used to target and shoot enemy characters, which increases immersion and interest. Experimental results show that the game and the gaze-detection system run in real time on one desktop computer with a gaze-detection accuracy of 0.88 degrees, indicating that our gaze-detection technology can replace the conventional mouse in 3D first-person shooting games.
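The simple user calibration described above maps pupil-image coordinates to screen coordinates. A minimal sketch of such a mapping, assuming an affine model fitted by least squares (the paper's actual 2D-analysis model is not specified here):

```python
import numpy as np

def fit_calibration(pupil_pts, screen_pts):
    """Fit an affine map from pupil coordinates to screen coordinates.

    pupil_pts, screen_pts: (N, 2) arrays of corresponding points collected
    while the user fixates known calibration targets. Returns a (3, 2)
    matrix M so that [px, py, 1] @ M gives the screen point.
    """
    A = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])
    M, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)
    return M

def gaze_point(M, pupil_xy):
    """Map one pupil position through the fitted calibration."""
    return np.array([pupil_xy[0], pupil_xy[1], 1.0]) @ M
```

With four or more calibration targets the least-squares fit averages out measurement noise in the pupil positions.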

The Development of Human-mobile communication system (E-mobile system) Using EOG (ElectroOculoGraphy)

  • K., Youngmin;D., Nakju;Y., Youngil;C., Wankyun
    • Institute of Control, Robotics and Systems (제어로봇시스템학회) Conference Proceedings / 2000.10a / pp.266-266 / 2000
  • This paper develops a human-mobile interface system that uses saccadic eye movements as an aid for the disabled. EOG (ElectroOculoGraphy) is used to measure the potentials of rapid eye movements because the method is inexpensive and the device is simple. However, because the resolution and accuracy of the EOG signal are poor, a drift-removal algorithm based on an ideal velocity shape is applied to process the signals. The mobile robot (POSTUR-II) used in this system was developed in the Robot & Bio-mechatronics laboratory at POSTECH and has a tele-operation system for communication with a main computer. Our research aims to help people who are physically disabled except for their eye movements to perform tasks with the mobile robot. Experimental tests in which point information is given to the robot by eye gaze demonstrate the system's feasibility.
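The drift-handling idea above rests on the fact that saccades produce sharp velocity spikes while electrode drift changes slowly. A minimal sketch of velocity-threshold saccade detection on an EOG trace (the threshold value and units are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def detect_saccades(eog, fs, vel_thresh=100.0):
    """Detect saccades in an EOG trace by thresholding signal velocity.

    Slow baseline drift has low velocity, while a saccadic jump produces
    a sharp spike in the first difference; thresholding the derivative
    separates the two.
    eog: 1-D array of amplitude samples; fs: sampling rate in Hz.
    Returns sample indices where |d(eog)/dt| exceeds vel_thresh (units/s).
    """
    vel = np.diff(eog) * fs                 # finite-difference velocity
    return np.nonzero(np.abs(vel) > vel_thresh)[0]
```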

A Study on the Gaze Flow of Internet Portal Sites Utilizing Eye Tracking (아이트래킹을 활용한 인터넷 포털사이트의 시선 흐름에 관한 연구)

  • Hwang, Mi-Kyung;Kwon, Mahn-Woo;Lee, Sang-Ho;Kim, Chee-Yong
    • Journal of the Korea Convergence Society / v.13 no.2 / pp.177-183 / 2022
  • This study used eye tracking to investigate the gaze paths audiences follow when searching portal sites (Naver, Daum, Zoom, and Nate). Layout analysis according to the search engines' gaze paths showed that the four main pages, which can be considered the gateway to information search, follow a Z-shaped layout. The news and search pages of each site follow an F-shape: the eyes move from the top toward the right, then read left to right (LTR) while moving sequentially toward the bottom. Analysis with eye tracking's visual indicators of heat map, gaze plot, and cluster showed that gaze concentrates most on photos and head copy in the heat map, which indicates high interest in that information. The gaze flows downward from the top left toward the right, and clusters concentrate most at the top of the portal site. Website designers should focus their layout designs on improving the accessibility and readability of the information users want, and periodic interface changes informed by surveys and analyses of the main users' tendencies and behavioral patterns are required.

Eye Gaze for Human Computer Interaction (눈동자의 움직임을 이용한 휴먼 컴퓨터 인터랙션)

  • 권기문;이정준;박강령;김재희
    • Proceedings of the IEEK Conference / 2003.11b / pp.83-86 / 2003
  • This paper proposes a user interface to the computer based on gaze detection in an HMD (head-mounted display) environment. The system works as follows. First, the camera in the HMD is calibrated, which determines the geometric relationship between the monitor and the captured image. Second, the center of the pupil is detected using a center-of-mass algorithm, and the gaze position is represented on the computer screen. If the user blinks or stares at a position for a while, a message is sent to the computer. Experimental results show that the center of mass is robust against glint effects, with detection errors of 7.1% and 4.85% in the vertical and horizontal directions, respectively. Fine adjustment of the mouse position takes an additional 0.8 s. Blinking was detected successfully 98% of the time, and clicking 94% of the time.
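The pupil-detection step named above can be sketched as a center-of-mass computation over dark pixels, assuming a dark-pupil image; the intensity threshold below is an illustrative assumption:

```python
import numpy as np

def pupil_center_of_mass(gray, threshold=40):
    """Estimate the pupil center as the centroid of dark pixels.

    Dark-pupil assumption: pupil pixels fall below `threshold` while the
    iris, sclera, and glints are brighter, so glint pixels are simply
    excluded from the mass. Returns (row, col) or None if no pixel
    qualifies.
    """
    mask = gray < threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

Because bright corneal glints never enter the dark-pixel mask, the centroid stays stable even when a glint overlaps the pupil, which is consistent with the robustness the abstract reports.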

Autonomous Wheelchair System Using Gaze Recognition (시선 인식을 이용한 자율 주행 휠체어 시스템)

  • Kim, Tae-Ui;Lee, Sang-Yoon;Kwon, Kyung-Su;Park, Se-Hyun
    • Journal of Korea Society of Industrial Information Systems / v.14 no.4 / pp.91-100 / 2009
  • In this paper, we propose an autonomous intelligent wheelchair system that recognizes commands via gaze recognition and avoids detected obstacles by sensing distances with range sensors while driving. The user's commands are recognized by a gaze recognizer that uses the centroid of the eye pupil and two reflection points, extracted with a camera fitted with an infrared filter and two infrared LEDs; these control the wheelchair through the user interface. The wheelchair then detects obstacles using 10 ultrasonic sensors and helps avoid collisions with them. The proposed system consists of a gaze recognizer, an autonomous driving module, a sensor control board, and a motor control board. The gaze recognizer interprets the user's commands through the user interface, and the motor control board drives the wheelchair using the recognized commands. Obstacle information detected by the ultrasonic sensors is transferred via the sensor control board to the autonomous driving module, which detects the obstacles and sends avoidance commands to the motor control board. The experimental results confirm that the proposed system improves the efficiency of obstacle avoidance and provides a convenient user interface.
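The avoidance logic above can be sketched as a simple rule over the 10 ultrasonic range readings; the sensor ordering, thresholds, and command names below are illustrative assumptions, not the paper's actual controller:

```python
def avoid_command(distances_cm, safe_cm=50):
    """Pick a drive command from ultrasonic range readings (cm).

    Sensors are assumed ordered left-to-right across the front arc.
    Go straight while the centre readings are clear; otherwise turn
    toward the side with more free space; stop when boxed in.
    """
    n = len(distances_cm)
    centre = distances_cm[n // 3 : 2 * n // 3]
    if min(centre) > safe_cm:
        return "forward"
    left = min(distances_cm[: n // 2])
    right = min(distances_cm[n // 2 :])
    if max(left, right) <= safe_cm:
        return "stop"
    return "turn_left" if left > right else "turn_right"
```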

Facial Feature Tracking and Head Orientation-based Gaze Tracking

  • Ko, Jong-Gook;Kim, Kyungnam;Park, Seung-Ho;Kim, Jin-Young;Kim, Ki-Jung;Kim, Jung-Nyo
    • Proceedings of the IEEK Conference / 2000.07a / pp.11-14 / 2000
  • In this paper, we propose a fast and practical head-pose estimation scheme for an eye- and head-controlled human-computer interface with unconstrained backgrounds. The method uses complete graph matching on thresholded images, selecting the two blocks with the greatest similarity as the eyes; the mouth and nostrils are then located in turn using the eye location and size information. The average computing time per image (360x240) is under 0.2 s, and head pose is estimated by template matching on the angles between facial features. The method has been tested on several sequential facial images with different illumination conditions and varied head poses, and it returned satisfactory performance in both speed and accuracy.
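Template matching on angles between facial features, as described above, can be sketched by computing the interior angles of the eye-eye-mouth triangle and picking the nearest stored template; the feature set and template values below are illustrative assumptions, not the authors' actual templates:

```python
import math

def interior_angle(p, q, r):
    """Interior angle at vertex q of triangle p-q-r, in degrees."""
    v1 = (p[0] - q[0], p[1] - q[1])
    v2 = (r[0] - q[0], r[1] - q[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))

def match_pose(left_eye, right_eye, mouth, templates):
    """Match the facial-feature triangle to the closest template pose.

    templates: {pose_name: (angle_at_left_eye, angle_at_right_eye,
    angle_at_mouth)} in degrees; nearest template by summed absolute
    angle difference wins.
    """
    obs = (interior_angle(mouth, left_eye, right_eye),
           interior_angle(left_eye, right_eye, mouth),
           interior_angle(right_eye, mouth, left_eye))
    return min(templates,
               key=lambda k: sum(abs(a - b) for a, b in zip(obs, templates[k])))
```

Angles are invariant to image translation and scale, which is why angle templates tolerate the head moving toward or away from the camera.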

Improved Feature Extraction of Hand Movement EEG Signals based on Independent Component Analysis and Spatial Filter

  • Nguyen, Thanh Ha;Park, Seung-Min;Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.22 no.4 / pp.515-520 / 2012
  • In a brain-computer interface (BCI) system, the most important part is classifying human thoughts in order to translate them into commands: the more accurate the classification, the more effective the BCI system. To increase the quality of the BCI system, we propose reducing noise and artifacts in the recorded data before analysis. We used auditory rather than visual stimuli to eliminate eye movements, unwanted visual activation, and gaze control. We applied the independent component analysis (ICA) algorithm to purify the sources that construct the raw signals. One of the best-known spatial filters in the BCI context is common spatial patterns (CSP), which maximizes the variance of one class while minimizing that of the other using covariance matrices. ICA and CSP thus act as coarse and refining filters, respectively, and increase the classification performance of linear discriminant analysis (LDA).
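The CSP step described above can be sketched with the standard whitening-plus-eigendecomposition construction (a generic CSP sketch, not the authors' exact implementation):

```python
import numpy as np

def csp_filters(trials_a, trials_b, n_pairs=1):
    """Common Spatial Patterns via whitening and eigendecomposition.

    trials_*: lists of (channels, samples) arrays for each class.
    Returns a (2*n_pairs, channels) matrix of spatial filters: the first
    n_pairs rows maximise class-B variance relative to class A, and the
    last n_pairs rows maximise class-A variance relative to class B.
    """
    def avg_cov(trials):
        covs = [x @ x.T / np.trace(x @ x.T) for x in trials]
        return np.mean(covs, axis=0)

    Ca, Cb = avg_cov(trials_a), avg_cov(trials_b)
    d, U = np.linalg.eigh(Ca + Cb)
    P = U @ np.diag(d ** -0.5) @ U.T        # whitening transform
    d2, V = np.linalg.eigh(P @ Ca @ P.T)    # eigenvalues in ascending order
    W = V.T @ P                             # rows are spatial filters
    idx = np.r_[np.arange(n_pairs), np.arange(len(d2) - n_pairs, len(d2))]
    return W[idx]
```

The log-variance of each trial projected through these filters is the feature vector typically handed to LDA.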

A Study for Detecting a Gazing Point Based on Reference Points (참조점을 이용한 응시점 추출에 관한 연구)

  • Kim, S.I.;Lim, J.H.;Cho, J.M.;Kim, S.H.;Nam, T.W.
    • Journal of Biomedical Engineering Research / v.27 no.5 / pp.250-259 / 2006
  • Information about eye movement is used in various fields such as psychology, ophthalmology, physiology, rehabilitation medicine, web design, and HMI (human-machine interface). Various devices to detect eye movement have been developed, but they are expensive. Common eye-movement tracking methods include EOG (electro-oculography), Purkinje image trackers, the scleral search coil technique, and video-oculography (VOG). The purpose of this study is to implement an algorithm that tracks the location of the gazing point from the pupil. Two kinds of location data were compared to track the gazing point: reference points (infrared LEDs) reflected from the eyeball, and the center point of the pupil obtained with a CCD camera. The reference points were captured with the CCD camera and infrared light, which is invisible to the human eye. Images were captured and saved both with and without infrared light shone on the eyeball, and the reflected reference points were detected from the brightness difference between the two saved images. The circumcenter of a triangle was used to find the center of the pupil, and the location of the gazing point was indicated relative to the pupil center and the reference points.
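The circumcenter construction mentioned above follows from the fact that all pupil-boundary points are equidistant from the pupil center, so the circumcenter of any three boundary points estimates that center. A minimal sketch using the closed-form circumcenter formula:

```python
def circumcenter(p1, p2, p3):
    """Circumcenter of the triangle through three boundary points.

    For three points sampled on the (approximately circular) pupil
    boundary, the circumcenter is the pupil-center estimate.
    Raises ZeroDivisionError for collinear points.
    """
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    d = 2 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    ux = ((ax**2 + ay**2) * (by - cy) + (bx**2 + by**2) * (cy - ay)
          + (cx**2 + cy**2) * (ay - by)) / d
    uy = ((ax**2 + ay**2) * (cx - bx) + (bx**2 + by**2) * (ax - cx)
          + (cx**2 + cy**2) * (bx - ax)) / d
    return ux, uy
```

Averaging the circumcenters of several boundary-point triples reduces the effect of noise in any single edge detection.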

A Study on Eye Tracking Techniques using Wearable Devices (웨어러블향(向) 시선추적 기법에 관한 연구)

  • Jaehyuck Jang;Jiu Jung;Junghoon Park
    • Smart Media Journal / v.12 no.3 / pp.19-29 / 2023
  • Eye-tracking technology has spread throughout society and now demonstrates excellent precision and convenience, suggesting new possibilities for interfaces operated without touching a screen. The technology can become a new means of conversation for people such as patients with Lou Gehrig's disease, whose bodies become paralyzed part by part until only eye movement remains. Formerly, such patients could do nothing but await death, unable even to communicate with their families. An interface that harnesses the eyes as a means of communication, despite its great difficulty, can help them. Dedicated eye-tracking systems and equipment do exist on the market; nevertheless, obstacles such as operational complexity and high prices of over 12 million won ($9,300) hinder universal supply and coverage for these patients. Therefore, this paper proposes a wearable eye-tracking device that can support minorities and vulnerable people at low cost, studies eye-tracking methods to maximize the possibility of future development worldwide, and finally proposes a way to design and develop a reduced-cost eye-tracking system based on a highly efficient wearable device.