• Title/Abstract/Keywords: gaze-eye tracking

상지장애인을 위한 시선 인터페이스에서 포인터 실행 방법의 오작동 비교 분석을 통한 Eye-Voice 방식의 제안 (A Proposal of Eye-Voice Method based on the Comparative Analysis of Malfunctions on Pointer Click in Gaze Interface for the Upper Limb Disabled)

  • 박주현;박미현;임순범
    • 한국멀티미디어학회논문지 / Vol. 23, No. 4 / pp.566-573 / 2020
  • Computers are the most common tool for using the Internet, with a mouse used to select and execute objects. Eye tracking technology is welcomed as an alternative that helps users who cannot use their hands because of a disability to control a computer. However, the pointer execution methods of existing eye tracking techniques cause many malfunctions. In this paper, we therefore developed a gaze tracking interface combined with voice commands to solve the malfunction problem that arises when users with upper limb disabilities execute computer menus and objects with existing gaze tracking technology. Usability was verified through comparative experiments on the reduction of malfunctions. Hand-impaired users move the pointer with eye tracking and speak a voice command such as "okay" while browsing the computer screen to click instantly. In comparative experiments on pointer execution malfunctions against existing gaze interfaces, we verified that our system, Eye-Voice, reduces the malfunction rate of pointer execution and is effective for users with upper limb disabilities.
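The separation of roles in this abstract, gaze for pointing and voice for execution, can be sketched as a small event loop. This is only an illustrative sketch; the class name and trigger vocabulary below are our own, not the paper's:

```python
# Minimal sketch of an Eye-Voice style pointer: gaze samples move the
# cursor continuously, but a click is issued only when a recognized
# trigger word (e.g. "okay") arrives from the speech recognizer.

class EyeVoicePointer:
    TRIGGER_WORDS = {"okay", "click"}   # assumed trigger vocabulary

    def __init__(self):
        self.position = (0, 0)          # current pointer position (px)
        self.clicks = []                # positions where clicks fired

    def on_gaze(self, x, y):
        """Gaze samples only move the pointer; they never click."""
        self.position = (x, y)

    def on_voice(self, word):
        """A recognized trigger word clicks at the current gaze point."""
        if word.lower() in self.TRIGGER_WORDS:
            self.clicks.append(self.position)
            return True
        return False

pointer = EyeVoicePointer()
pointer.on_gaze(120, 80)     # user looks around the screen
pointer.on_gaze(300, 210)
pointer.on_voice("hello")    # non-trigger speech is ignored
pointer.on_voice("okay")     # click fires at the current gaze point
print(pointer.clicks)
```

Because dwelling never triggers a click in this design, unintended executions from merely looking at an object (the malfunction the paper targets) cannot occur.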

실시간 눈과 시선 위치 추적 (Real Time Eye and Gaze Tracking)

  • 황선기;김문환;차샘;조은석;배철수
    • 한국정보전자통신기술학회논문지 / Vol. 2, No. 3 / pp.61-69 / 2009
  • This paper proposes a new real-time gaze tracking method. Existing gaze tracking methods could produce incorrect results from even slight head movement and required a calibration procedure for each user. The proposed method uses infrared illumination and Generalized Regression Neural Networks (GRNN) to achieve robust and accurate gaze tracking without calibration, even under large head movements, and generalizes the mapping function so that the per-user calibration step can be omitted and users who did not participate in training can also be tracked. In experiments, the method achieved an average gaze tracking accuracy of 90% under facial movement and 85% for unseen users.
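The GRNN mapping described above can be sketched in a few lines: a GRNN prediction is a Gaussian-kernel-weighted average of the calibration targets. A minimal illustration, with feature values and the σ parameter invented for the example rather than taken from the paper:

```python
import math

def grnn_predict(train_x, train_y, x, sigma=0.5):
    """GRNN regression: the prediction is a Gaussian-kernel-weighted
    average of the training targets (here, screen coordinates)."""
    weights = []
    for xi in train_x:
        d2 = sum((a - b) ** 2 for a, b in zip(x, xi))
        weights.append(math.exp(-d2 / (2.0 * sigma ** 2)))
    total = sum(weights)
    sx = sum(w * y[0] for w, y in zip(weights, train_y)) / total
    sy = sum(w * y[1] for w, y in zip(weights, train_y)) / total
    return (sx, sy)

# Toy calibration set: normalized eye-feature vector -> screen point.
features = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
screen = [(0, 0), (800, 0), (0, 600), (800, 600)]

# A feature midway between the calibration points maps to the
# middle of the screen.
print(grnn_predict(features, screen, (0.5, 0.5), sigma=0.1))
```

Because the mapping needs no iterative training, new calibration pairs can be added on the fly, which is one reason GRNNs suit gaze mapping.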

착용형 양안 시선추적기와 기계학습을 이용한 시선 초점 거리 추정방법 평가 (Evaluation of Gaze Depth Estimation using a Wearable Binocular Eye tracker and Machine Learning)

  • 신춘성;이건;김영민;홍지수;홍성희;강훈종;이영호
    • 한국컴퓨터그래픽스학회논문지 / Vol. 24, No. 1 / pp.19-26 / 2018
  • This paper proposes a gaze depth estimation technique based on a binocular eye tracker for virtual and augmented reality. The method first acquires various eye- and gaze-related measurements from the binocular eye tracker, then estimates gaze depth from them with a gaze tracking and recognition model based on a multilayer perceptron. To validate the method, 13 participants were recruited and performance was analyzed for per-user and universal gaze models. The per-user models achieved 90.1% accuracy, and the universal model covering all users achieved 89.7%.
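For intuition, the binocular geometry behind gaze depth can be illustrated with a symmetric vergence calculation. Note that the paper itself estimates depth with a learned multilayer perceptron rather than this closed-form baseline, and the function name and values below are ours:

```python
import math

def vergence_depth_mm(ipd_mm, inward_angle_deg):
    """Depth (in mm) at which two symmetric gaze rays converge,
    viewed top-down: depth = (ipd / 2) / tan(inward angle)."""
    return (ipd_mm / 2.0) / math.tan(math.radians(inward_angle_deg))

# Eyes 64 mm apart, each rotated inward by the angle that points
# both gaze rays at a target 500 mm away.
angle = math.degrees(math.atan((64 / 2.0) / 500.0))
print(vergence_depth_mm(64, angle))
```

A learned model can improve on this baseline because real gaze measurements are noisy and the eyes rarely verge perfectly symmetrically.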

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung;Kim, Jae-Hean;Yoo, Dong-Hyun;Lee, Young-Jin;Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering / Vol. 3, No. 4 / pp.229-235 / 2001
  • Recently, the service sector has become an emerging field of robotic applications. Although assistant robots play an important role for the disabled and the elderly, these users still struggle to operate the robots with conventional interface devices such as joysticks or keyboards. In this paper we propose an efficient computer interface based on a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. Because the camera and the receiver of the magnetic sensor are stationary with respect to the head, the measured data are sufficient to describe both eye and head movement, so the system can obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the validity of the proposed system in practical use and verify its feasibility as a new computer interface for the disabled.

상지장애인을 위한 시선 인터페이스에서의 객체 확대 및 음성 명령 인터페이스 개발 (Object Magnification and Voice Command in Gaze Interface for the Upper Limb Disabled)

  • 박주현;조세란;임순범
    • 한국멀티미디어학회논문지 / Vol. 24, No. 7 / pp.903-912 / 2021
  • Eye tracking research for people with upper limb disabilities has shown benefits for device control. However, eye tracking technology alone is not sufficient for web interaction. In our previous study, the Eye-Voice interface, a gaze tracking interface supplemented with voice commands was proposed to solve the pointer execution malfunctions of existing gaze interfaces, and a comparison experiment confirmed the reduced pointer malfunction rate. In that process, the difficulty of pointing caused by the small size of execution objects in the web environment was identified as another important source of malfunction. In this study, we propose an auto-magnification interface for objects so that people with upper limb disabilities can freely click web content, addressing the difficulty of pointing and executing caused by the high density of execution objects and their arrangement on web pages.

머리 움직임이 자유로운 안구 응시 추정 시스템 (Eye Gaze Tracking System Under Natural Head Movements)

  • 김수찬
    • 전자공학회논문지SC / Vol. 41, No. 5 / pp.57-64 / 2004
  • We propose an eye gaze estimation system that permits free head movement, using a single camera, two mirrors with adjustable reflection angles, and a separate infrared light source. The mirror rotation angles are computed from spatial coordinates and linear equations so that the eye stays on the camera's optical axis. The system allowed head movement within 90 cm horizontally and 60 cm vertically, with a gaze-point spatial resolution of 6° and 7° respectively and a temporal resolution of 10-15 frames/sec. The extracted parameters were mapped to monitor coordinates using a hierarchical generalized regression neural network (H-GRNN), in which two stages of GRNN are applied in sequence. A single GRNN gave 85% accuracy, while the H-GRNN reached 94%, about 9% higher. Normalizing the input parameters removed the inconvenience of recalibration and kept the same performance even under slight facial rotation. Although its spatial resolution is not very high, the system is significant in that it allows free head movement, improving stability and reducing constraints on the subject's activity.

실시간 눈과 시선 위치 추적 (Real Time Eye and Gaze Tracking)

  • 조현섭;김희숙
    • 한국산학기술학회논문지 / Vol. 6, No. 2 / pp.195-201 / 2005
  • This paper proposes a new real-time gaze tracking method. Existing gaze tracking methods could produce incorrect results from even slight head movement and required a calibration procedure for each user. The proposed method therefore uses infrared illumination and Generalized Regression Neural Networks (GRNN) to achieve robust and accurate gaze tracking without calibration, even under large head movements. GRNN made the mapping smooth, and head movement is properly reflected in the gaze mapping function, so tracking remains possible while the face moves; generalizing the mapping function allows the per-user calibration step to be omitted, so users who did not participate in training can also be tracked. In experiments, the method achieved an average gaze tracking accuracy of 90% under facial movement and 85% for unseen users.

Gaze Differences between Expert and Novice Teachers in Science Classes

  • Kim, Won-Jung;Byeon, Jung-Ho;Lee, Il-Sun;Kwon, Yong-Ju
    • 한국과학교육학회지 / Vol. 32, No. 9 / pp.1443-1451 / 2012
  • This study investigates the gaze patterns of two expert and two novice teachers during a one-hour lecture-type class. Teachers recruited from the same middle school each conducted a class while wearing an eye tracker. Gaze rate and gaze movement patterns were analyzed. The scene the teachers faced in the classroom was categorized into three zones: the student zone, the material zone, and the non-teaching zone. The student zone was divided into nine areas of interest to examine the gaze distribution within it. Expert teachers focused their gaze on the student zone, while novice teachers' gaze rate at the non-teaching zone was significantly higher than that of the experts. Within the student zone, expert teachers' gaze spread to the rear areas, whereas novice teachers' gaze stayed narrowly in the middle areas. This difference produced distinct eye movement patterns: a T pattern for experts and an I pattern for novices. Both groups showed the lowest gaze rate toward the front-left and front-right areas. We discuss what changes teachers' gaze behavior requires and what must be considered to make teacher gaze effective in the classroom.

도시가로환경 구성요소의 우선순위에 관한 연구 - 아이트래킹 실험을 통한 관심영역설정 분석을 중심으로 - (A Study on the Priorities of Urban Street Environment Components - Focusing on An Analysis of AOI (Area of Interest) Setup through An Eye-tracking Experiment -)

  • 이선화;이창노
    • 한국실내디자인학회논문집 / Vol. 25, No. 1 / pp.73-80 / 2016
  • The street is the most fundamental component of a city and a place that promotes diverse human activities. Pedestrians gaze at various street environments; visual gaze indicates elements of interest, and such elements should be improved first in street environment improvement projects. This study therefore sets priorities among street environment components by analyzing eye movements from a pedestrian perspective. Street environment components were classified into road, street facility, building (facade), and sky, and three "Streets of Youth" in Busan, situated in Gwangbok-ro, Seomyeon, and Busan University, were selected as street environment images. The experiment targeted 30 males and females in their twenties to forties. After setting the angle of sight through a calibration test, an eye-tracking experiment on the three images was conducted, and the subjects then filled in questionnaires. Three conclusions were obtained. First, building was the top priority among street environment components, followed by street facility, road, and sky. Second, components regarded as important showed a fast 'Sequence', many 'Fixation Counts' and 'Visit Counts', a short 'Time to First Fixation', and long 'Fixation Duration' and 'Visit Duration'. Third, after voluntary eye movements, the subjects recognized the objects with the highest and the lowest gaze frequency.
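The AOI measures named in this abstract ('Fixation Counts', 'Time to First Fixation', 'Fixation Duration') can be computed from a fixation log in a straightforward way. A minimal sketch, with AOI rectangles and fixation data invented for illustration:

```python
# Each fixation is (timestamp_ms, x, y, duration_ms); each AOI is a
# named rectangle (left, top, right, bottom) in screen pixels.

def point_in_rect(x, y, rect):
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

def aoi_metrics(fixations, aois):
    """Per-AOI fixation count, total fixation duration (ms), and
    time to first fixation (None if the AOI was never fixated)."""
    stats = {name: {"count": 0, "duration": 0, "ttff": None}
             for name in aois}
    for t, x, y, dur in fixations:
        for name, rect in aois.items():
            if point_in_rect(x, y, rect):
                s = stats[name]
                s["count"] += 1
                s["duration"] += dur
                if s["ttff"] is None:
                    s["ttff"] = t   # first hit = time to first fixation
    return stats

aois = {"sky": (0, 0, 400, 300), "building": (0, 300, 400, 600)}
fixations = [(0, 50, 100, 120),    # first fixation lands on the sky
             (200, 100, 400, 300), # then the building facade
             (600, 200, 450, 250)] # and the building again
stats = aoi_metrics(fixations, aois)
print(stats["building"])
```

Commercial analysis tools report these same quantities; computing them by hand mainly requires careful AOI boundary definitions so that no fixation falls into two regions at once.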

Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Jiyoung LIM;Seonjae LEE;Junbeom KIM;Yunseo KIM;Hae-Duck Joshua JEONG
    • 한국인공지능학회지 / Vol. 11, No. 1 / pp.9-15 / 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems beyond traditional tools such as keyboards and mice. Many people use their own bodies as a pointer to enter simple information on a mobile device, but body-contact methods have limitations: psychological factors make the contact method unstable, especially during a pandemic, and they are vulnerable to shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that uses gaze-tracking technology to input passwords and control web surfing with non-contact gaze alone. The proposed system recognizes an input when the user stares at a specific location on the screen in real time, using intelligent gaze-tracking technology. We analyze the relationship between the gaze input box, gaze time, and average input time, and report experimental results on how varying the size of the gaze input box and the required gaze time affects reaching 100% input accuracy. The system mitigates the challenges of contact-based input methods and provides a non-contact alternative that is both secure and convenient.
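A gaze input box of the kind described above is typically driven by dwell time: an input registers only after the gaze has stayed inside the box continuously for a threshold. A minimal sketch, with threshold and box sizes that are illustrative rather than the paper's parameters:

```python
# samples: list of (timestamp_ms, x, y) gaze points;
# boxes: name -> (left, top, right, bottom) input-box rectangles.

def detect_dwell_inputs(samples, boxes, dwell_ms=800):
    """Return the sequence of box names selected by dwelling:
    a box is selected once the gaze stays inside it for dwell_ms."""
    selected = []
    current, entered = None, 0
    for t, x, y in samples:
        hit = None
        for name, (left, top, right, bottom) in boxes.items():
            if left <= x <= right and top <= y <= bottom:
                hit = name
                break
        if hit != current:               # gaze moved to a new box (or off)
            current, entered = hit, t
        elif hit is not None and t - entered >= dwell_ms:
            selected.append(hit)         # dwell threshold reached: select
            current, entered = None, t   # reset so one dwell = one input
    return selected

boxes = {"A": (0, 0, 100, 100), "B": (200, 0, 300, 100)}
samples = [(0, 50, 50), (400, 55, 52), (900, 52, 48),   # long dwell on "A"
           (1000, 250, 50), (1300, 251, 49)]            # too short on "B"
print(detect_dwell_inputs(samples, boxes))
```

The trade-off the paper measures falls directly out of these two parameters: a larger box and a shorter dwell speed up input but raise the risk of unintended selections.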