• Title/Summary/Keyword: Intelligent gaze-tracking


Development of a Non-contact Input System Based on User's Gaze-Tracking and Analysis of Input Factors

  • Jiyoung LIM;Seonjae LEE;Junbeom KIM;Yunseo KIM;Hae-Duck Joshua JEONG
    • Korean Journal of Artificial Intelligence / v.11 no.1 / pp.9-15 / 2023
  • As mobile devices such as smartphones, tablets, and kiosks become increasingly prevalent, there is growing interest in developing alternative input systems beyond traditional tools such as keyboards and mice. Many people use their own bodies as pointers to enter simple information on mobile devices. However, body-based contact methods have limitations: psychological factors make contact input feel unsafe, especially during a pandemic, and they are exposed to shoulder-surfing attacks. To overcome these limitations, we propose a simple information input system that uses gaze-tracking technology to enter passwords and control web surfing with non-contact gaze alone. The proposed system recognizes an input when the user stares at a specific location on the screen in real time, using intelligent gaze-tracking technology. We analyze the relationship among the gaze input box, gaze time, and average input time, and report experimental results on how varying the size of the gaze input box and the required gaze time affects achieving 100% input accuracy. Through this paper, we demonstrate that our system mitigates the challenges of contact-based input methods and provides a non-contact alternative that is both secure and convenient.
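The dwell-based selection the abstract describes (an input fires once the gaze has stayed inside a target box for the required gaze time) can be sketched as follows. This is a hypothetical illustration, not the authors' implementation; the box geometry, sample rate, and function names are invented.

```python
def inside(box, x, y):
    """box = (left, top, width, height) in screen pixels."""
    bx, by, bw, bh = box
    return bx <= x <= bx + bw and by <= y <= by + bh

def dwell_select(samples, box, dwell_s):
    """samples: time-ordered iterable of (timestamp_s, x, y) gaze points.
    Returns the timestamp at which the dwell threshold is met, else None."""
    start = None
    for t, x, y in samples:
        if inside(box, x, y):
            if start is None:
                start = t            # gaze entered the box: start the timer
            if t - start >= dwell_s:
                return t             # dwell threshold reached: input fires
        else:
            start = None             # gaze left the box: reset the timer
    return None

# Example: a 200x200 px box, 0.5 s dwell, synthetic 60 Hz gaze stream
stream = [(i / 60, 110, 120) for i in range(60)]
print(dwell_select(stream, (10, 20, 200, 200), 0.5))  # → 0.5
```

The paper's two experimental factors map directly onto `box` size and `dwell_s`: a larger box or shorter dwell speeds input but risks false selections.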

Real-Time Eye Tracking Using IR Stereo Camera for Indoor and Outdoor Environments

  • Lim, Sungsoo;Lee, Daeho
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.8 / pp.3965-3983 / 2017
  • We propose a novel eye tracking method that can estimate 3D world coordinates using an infrared (IR) stereo camera in indoor and outdoor environments. The method first detects dark evidences such as eyes, eyebrows, and mouths by fast multi-level thresholding. Among these evidences, eye-pair evidences are detected by evidential reasoning and geometrical rules. For robust accuracy, two classifiers based on a multilayer perceptron (MLP) using gradient local binary patterns (GLBPs) verify whether the detected evidences are real eye pairs. Finally, the 3D world coordinates of detected eyes are calculated by region-based stereo matching. Compared with other eye detection methods, the proposed method can detect the eyes of people wearing sunglasses thanks to the use of the IR spectrum. In particular, when people are in dark environments such as driving at nighttime, driving in an indoor carpark, or passing through a tunnel, human eyes can still be detected robustly because we use active IR illuminators. Experimental results show that the proposed method detects eye pairs with high performance in real time under variable illumination conditions. The method can therefore contribute to human-computer interaction (HCI) and intelligent transportation system (ITS) applications such as gaze tracking, windshield head-up displays, and drowsiness detection.
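Once an eye is matched in both IR images, its 3D position follows from the stereo disparity by standard pinhole triangulation. A minimal sketch, with illustrative calibration values (focal length `f` in pixels, baseline `B` in meters, principal point `(cx, cy)`) rather than the paper's:

```python
def triangulate(xl, yl, xr, f=800.0, B=0.06, cx=320.0, cy=240.0):
    """xl, xr: x pixel coords of the same eye in the left/right images
    (rectified, so both share row yl). Returns (X, Y, Z) in meters."""
    d = xl - xr                      # disparity in pixels
    if d <= 0:
        raise ValueError("non-positive disparity")
    Z = f * B / d                    # depth from similar triangles
    X = (xl - cx) * Z / f            # back-project through the pinhole model
    Y = (yl - cy) * Z / f
    return X, Y, Z

X, Y, Z = triangulate(400.0, 240.0, 352.0)
print(round(Z, 3))  # → 1.0 (a 48 px disparity maps to 1 m here)
```

Region-based matching, as used in the paper, supplies the correspondence `(xl, xr)`; the triangulation step itself is the same regardless of how the match is found.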

A Study on the Mechanism of Social Robot Attitude Formation through Consumer Gaze Analysis: Focusing on the Robot's Face (소비자 시선 분석을 통한 소셜로봇 태도 형성 메커니즘 연구: 로봇의 얼굴을 중심으로)

  • Ha, Sangjip;Yi, Eunju;Yoo, In-jin;Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.28 no.1 / pp.243-262 / 2022
  • In this study, eye tracking was applied to the appearance of a social robot as part of robot design research. Each part of the social robot was designated as an AOI (Area of Interest), and user attitude was measured through a design evaluation questionnaire to construct a design research model for the social robot. The eye-tracking indicators used were Fixation, First Visit, Total Viewed, and Revisits, and the AOIs were the face, eyes, lips, and body of the social robot. The questionnaire collected consumer beliefs about the robot, such as Face-highlighted, Human-like, and Expressive, with attitude toward the robot as the dependent variable. Through this design, we sought to uncover the specific mechanism by which users form attitudes toward robots and to derive concrete insights that can be referenced when designing robots.
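The four eye-tracking indicators named in the abstract can all be derived from a time-ordered sequence of fixations already mapped to AOIs. A hypothetical sketch (the field layout and names are illustrative, not the study's data format):

```python
def aoi_metrics(fixations, aoi):
    """fixations: time-ordered list of (start_s, duration_s, aoi_name)."""
    hits = [(s, d) for s, d, a in fixations if a == aoi]
    # count transitions into the AOI; every entry after the first is a revisit
    entries, prev = 0, None
    for _, _, a in fixations:
        if a == aoi and prev != aoi:
            entries += 1
        prev = a
    return {
        "fixation_count": len(hits),                      # Fixation
        "first_visit": hits[0][0] if hits else None,      # First Visit
        "total_viewed": round(sum(d for _, d in hits), 3),  # Total Viewed
        "revisits": max(entries - 1, 0),                  # Revisits
    }

fx = [(0.0, 0.2, "face"), (0.2, 0.3, "eyes"), (0.5, 0.2, "face"),
      (0.7, 0.4, "body"), (1.1, 0.1, "face")]
print(aoi_metrics(fx, "face"))
```

For the synthetic sequence above, the face AOI gets 3 fixations, a first visit at 0.0 s, 0.5 s total viewing, and 2 revisits.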

Analysis of the Effect of Yellow Carpet Installation according to Driving Behavior with Eye Tracking Data (가상주행실험 기반 운전자 시각행태에 따른 옐로카펫 설치 효과 분석)

  • Sungkab Joo;Dohoon Kim;Hyemin Mun;Homin Choi
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.22 no.5 / pp.43-52 / 2023
  • Traffic accidents involving children have decreased after the installation of yellow carpets. However, the causal relationship between yellow carpet installation and traffic accidents remains insufficiently explained. We analyzed the yellow carpet effect in greater depth using virtual reality (VR) simulation experiments, covering situations that could not be evaluated in existing real-vehicle studies because of implementation difficulties or risks. A target site with an actual yellow carpet installed was selected and implemented in a virtual environment. Subjects wore gaze-measurement equipment and drove the simulator. Visual and driving behavior before and after yellow carpet installation were compared, and a t-test was performed for statistical verification. All results were found to be statistically significant.
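The before/after comparison reported here is a paired test on per-subject measurements. A minimal sketch of the paired t statistic with invented numbers (e.g. seconds of gaze on the crossing per trial, before vs. after installation); the study's actual variables and values are not reproduced:

```python
import math
from statistics import mean, stdev

def paired_t(before, after):
    """Returns (t statistic, degrees of freedom) for paired samples."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    # t = mean difference / standard error of the differences
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1

before = [1.2, 0.9, 1.1, 1.0, 1.3, 0.8]   # invented per-subject values
after = [1.6, 1.4, 1.5, 1.3, 1.7, 1.2]
t, df = paired_t(before, after)
print(round(t, 2), df)
```

A large |t| at the given degrees of freedom corresponds to a small p-value, i.e. a statistically significant before/after difference.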

Intelligent Wheelchair System Using Gaze Recognition (시선인식을 이용한 지능형 휠체어 시스템)

  • Kim, Tae-Ui;Lee, Sang-Yoon;Kwon, Kyung-Su;Park, Se-Hyun
    • Proceedings of the Korea Society for Industrial Systems Conference / 2009.05a / pp.88-92 / 2009
  • This paper describes an intelligent wheelchair system based on gaze recognition. The intelligent wheelchair uses ultrasonic sensors to detect and avoid obstacles, and we propose an interface that lets severely disabled users who cannot operate a joystick drive the powered wheelchair through gaze recognition and tracking. The system consists of a gaze recognition and tracking module, a user interface, an obstacle avoidance module, a motor control module, and an ultrasonic sensor module. The gaze recognition and tracking module uses an infrared camera and two light sources to create two reflection points on the corneal surface of the user's eye, finds their midpoint, and tracks gaze using the pupil center and the midpoint of the two reflections. Commands corresponding to the gazed-at location are issued through the user interface, and the motor control module drives the two left and right motors connected to the motor control board using the issued command and the obstacle distance information returned by the sensors. While the wheelchair is moving, the sensor module periodically receives distance values from the sensors to detect walls or obstacles, and the obstacle avoidance module steers the wheelchair around them. Experiments showed that the proposed interface allows users to command the intelligent wheelchair by gaze, and that the wheelchair effectively detects arbitrarily placed obstacles and avoids them accurately.
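The gaze estimate described here, the offset of the pupil center from the midpoint of the two corneal glints, can be turned into a coarse wheelchair command. A simplified sketch: the command set, dead zone, and mapping are hypothetical (a real system calibrates per user), only the glint-midpoint geometry follows the paper.

```python
def gaze_command(pupil, glint1, glint2, dead_zone=5.0):
    """All points are (x, y) pixel coords in the IR eye image.
    Returns one of 'forward', 'left', 'right', 'stop'."""
    mx = (glint1[0] + glint2[0]) / 2   # midpoint of the two corneal glints
    my = (glint1[1] + glint2[1]) / 2
    dx = pupil[0] - mx                 # pupil offset from the glint midpoint
    dy = pupil[1] - my
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return "stop"                  # looking roughly straight ahead
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "forward" if dy < 0 else "stop"

print(gaze_command((100, 60), (110, 62), (130, 62)))  # → left
```

The dead zone keeps small fixational eye movements from jittering the wheelchair; in the paper's design such a command would then be combined with ultrasonic distance readings before the motors act.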


A Study on Face Detection Parameter Measurement Technology through 3D Image Object Recognition (3D영상 객체인식을 통한 얼굴검출 파라미터 측정기술에 대한 연구)

  • Choi, Byung-Kwan;Moon, Nam-Mee
    • Journal of the Korea Society of Computer and Information / v.16 no.10 / pp.53-62 / 2011
  • With the development of high-tech IT convergence and complex technologies, video object recognition has evolved from a specialized technology into one available even on smartphones and other personal portable terminals. Face detection based on 3D recognition, which identifies objects through intelligent video recognition, has been developing rapidly on the basis of image recognition technology. In this paper, image-processing-based face recognition is applied to an IP camera, and measurement techniques for identifying human faces, including facial features such as the mouth, are proposed and studied. The results are as follows: 1) a face-model-based face tracking technique was developed and applied; 2) with a PC-based algorithm, the basic face parameters could be tracked while measuring the CPU load of the recognition process; and 3) the bilateral distance and gaze angle could be tracked in real time, demonstrating the method's effectiveness.