• Title/Summary/Keyword: 눈추적 (eye tracking)

Search Results: 201

A Study on the Mechanism of Social Robot Attitude Formation through Consumer Gaze Analysis: Focusing on the Robot's Face (소비자 시선 분석을 통한 소셜로봇 태도 형성 메커니즘 연구: 로봇의 얼굴을 중심으로)

  • Ha, Sangjip;Yi, Eunju;Yoo, In-jin;Park, Do-Hyung
    • Journal of Intelligence and Information Systems
    • /
    • v.28 no.1
    • /
    • pp.243-262
    • /
    • 2022
  • In this study, eye tracking was applied to the appearance of a social robot as part of a robot design study. Each part of the social robot was designated as an Area of Interest (AOI), and user attitudes were measured through a design evaluation questionnaire to construct a design research model for social robots. The eye-tracking indicators used were Fixation, First Visit, Total Viewed, and Revisits, and the AOIs were the face, eyes, lips, and body of the social robot. The design evaluation questionnaire collected consumer beliefs about the robot such as Face-highlighted, Human-like, and Expressive, with attitude toward the robot as the dependent variable. Through this, the study sought to uncover the mechanism by which users form attitudes toward robots and to derive specific insights that can be referenced when designing robots.
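The four indicators named in the abstract can be derived from a raw fixation log; a minimal sketch, assuming a hypothetical list of (timestamp, duration, AOI) fixations and AOI names mirroring the paper's regions:

```python
from collections import defaultdict

def aoi_metrics(fixations):
    """Compute the four eye-tracking indicators per AOI:
    fixation count, first-visit time, total viewed time, revisits."""
    metrics = defaultdict(lambda: {"fixations": 0, "first_visit": None,
                                   "total_viewed": 0, "revisits": 0})
    last_aoi = None
    visited = set()
    for t, dur, aoi in sorted(fixations):
        m = metrics[aoi]
        m["fixations"] += 1
        if m["first_visit"] is None:
            m["first_visit"] = t
        m["total_viewed"] += dur
        # a revisit: returning to an AOI after looking elsewhere
        if aoi != last_aoi and aoi in visited:
            m["revisits"] += 1
        visited.add(aoi)
        last_aoi = aoi
    return dict(metrics)

# Toy log: (timestamp_ms, duration_ms, aoi)
log = [(0, 200, "eyes"), (250, 150, "lips"), (450, 300, "eyes"), (800, 100, "body")]
print(aoi_metrics(log)["eyes"])
```

The exact definitions of these metrics vary between eye-tracking vendors; this sketch uses the most common interpretations.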

Moving object tracking using active camera (능동 카메라를 이용한 이동 물체 추적)

  • Park Hyun-Suk;Han Jong-Won;Jo Jin-Su;Lee Yill-Byung
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2006.06b
    • /
    • pp.364-366
    • /
    • 2006
  • This paper proposes a system that imitates the object-tracking function of the human eye: motion information is extracted from image data captured in real time by a CCD camera using a feature-based matching method, and pan-tilt hardware is then controlled to efficiently track a moving object in real time. Previous studies mostly tracked objects by their color values, so they could lose a moving object or mistakenly track another object of similar color under changes in lighting or camera conditions. To resolve this, feature-based matching is applied to extract the moving object from a moving camera, and tracking is performed by using the object's coordinates to control the pan-tilt hardware. Experimental results show that the system detects a moving object, controls the pan-tilt hardware correctly, compensates for camera motion, and consistently tracks the actually moving object within a globally moving image.
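The tracking loop described above can be sketched in two steps: find the moving region between two frames, then compute pan/tilt steps to re-center the camera on it. The toy grids, the change threshold, and the gain are illustrative assumptions, not the paper's tuned values:

```python
def motion_centroid(prev, curr, thresh=30):
    """Return the centroid (row, col) of pixels that changed by more
    than `thresh` between two frames, or None if nothing moved."""
    rows = cols = n = 0
    for r, (p_row, c_row) in enumerate(zip(prev, curr)):
        for c, (p, q) in enumerate(zip(p_row, c_row)):
            if abs(q - p) > thresh:
                rows += r; cols += c; n += 1
    return (rows / n, cols / n) if n else None

def pan_tilt_command(centroid, shape, gain=0.1):
    """Error between object centroid and image center, scaled to
    pan/tilt steps (positive pan = right, positive tilt = down)."""
    cy, cx = (shape[0] - 1) / 2, (shape[1] - 1) / 2
    return ((centroid[1] - cx) * gain, (centroid[0] - cy) * gain)

prev = [[0] * 5 for _ in range(5)]
curr = [[0] * 5 for _ in range(5)]
curr[1][3] = 255          # object appears up and to the right of center
c = motion_centroid(prev, curr)
pan, tilt = pan_tilt_command(c, (5, 5))
print(c, pan, tilt)
```

A real system would replace the per-pixel Python loops with vectorized frame differencing and add the camera-motion compensation the paper emphasizes.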


Robot vision system for face tracking using color information from video images (로봇의 시각시스템을 위한 동영상에서 칼라정보를 이용한 얼굴 추적)

  • Jung, Haing-Sup;Lee, Joo-Shin
    • Journal of Advanced Navigation Technology
    • /
    • v.14 no.4
    • /
    • pp.553-561
    • /
    • 2010
  • This paper proposes a face tracking method that can be effectively applied to a robot vision system. The proposed algorithm tracks facial areas after detecting regions of motion in the video. Motion is detected by computing difference images from two consecutive frames, then removing noise with a median filter and erosion/dilation operations. To extract skin color from the moving area, the color information of sample images is used. The skin color region and the background are separated by generating membership functions from MIN-MAX values as fuzzy data and evaluating similarity. Within the face candidate region, the eyes are detected from the C channel of the CMY color space and the mouth from the Q channel of the YIQ color space. The face region is then tracked using the features of the eyes and mouth detected from a knowledge base. Experiments covered 1,500 video frames from 10 subjects (150 frames per subject). The results show a 95.7% detection rate (motion areas detected in 1,435 frames) and 97.6% successful face tracking (1,401 faces tracked).
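The fuzzy MIN-MAX separation mentioned above can be sketched as trapezoid-style membership functions built from the value range of sampled skin colors, combined with a fuzzy AND (minimum). The sample Cb/Cr values, margin, and cutoff below are illustrative assumptions; real ranges would come from the training images:

```python
def make_membership(samples, margin=10):
    """Membership function that is 1 inside the MIN-MAX range of the
    samples and falls off linearly over `margin` outside it."""
    lo, hi = min(samples), max(samples)
    def mu(x):
        if lo <= x <= hi:
            return 1.0
        if lo - margin < x < lo:
            return (x - (lo - margin)) / margin
        if hi < x < hi + margin:
            return ((hi + margin) - x) / margin
        return 0.0
    return mu

skin_cb = make_membership([105, 110, 120, 125])   # sample Cb values of skin
skin_cr = make_membership([135, 140, 150, 155])   # sample Cr values of skin

def is_skin(cb, cr, cut=0.5):
    # similarity = weakest membership across channels (fuzzy AND / MIN)
    return min(skin_cb(cb), skin_cr(cr)) >= cut

print(is_skin(115, 145), is_skin(90, 200))
```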

Determining UAV Flight Direction Control Method for Shooting the images of Multiple Users based on NUI/NUX (NUI/NUX 기반 복수의 사용자를 촬영하기 위한 UAV 비행방향 제어방법)

  • Kwak, Jeonghoon;Sung, Yunsick
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2018.05a
    • /
    • pp.445-446
    • /
    • 2018
  • Recently, cameras mounted on unmanned aerial vehicles (UAVs) have been used to provide footage of users from new viewpoints rather than at eye level. To track and film a user, Bluetooth Low Energy (BLE) signals, video, and Natural User Interface/Natural User Experience (NUI/NUX) techniques are used. When tracking a user with BLE signals, the UAV follows from behind and can only track and film a single user, so a method is needed to track and film multiple users from the front. This paper describes a method for determining the flight direction of a UAV so that it can track multiple users and film them from the front. The UAV measures the BLE signals obtainable from the multiple users and determines its flight direction from the changes in these signals.
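A minimal sketch of the kind of decision rule the abstract outlines: the UAV samples each user's BLE RSSI at two instants and steers based on the average change. The dead band and the approach/retreat mapping are illustrative assumptions, not the paper's exact method:

```python
def flight_direction(rssi_prev, rssi_curr, dead_band=2.0):
    """rssi_* map user id -> RSSI in dBm (less negative = closer).
    Returns 'approach', 'retreat', or 'hold' from the mean RSSI change."""
    deltas = [rssi_curr[u] - rssi_prev[u] for u in rssi_prev]
    mean_delta = sum(deltas) / len(deltas)
    if mean_delta < -dead_band:      # signals fading: users moving away
        return "approach"
    if mean_delta > dead_band:       # signals growing: users closing in
        return "retreat"
    return "hold"

prev = {"u1": -60.0, "u2": -65.0}
curr = {"u1": -66.0, "u2": -70.0}
print(flight_direction(prev, curr))
```

In practice, RSSI is noisy, so a real controller would smooth the measurements (e.g. a moving average) before applying any threshold.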

Real-Time Automatic Tracking of Facial Feature (얼굴 특징 실시간 자동 추적)

  • 박호식;배철수
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.8 no.6
    • /
    • pp.1182-1187
    • /
    • 2004
  • Robust, real-time, fully automatic tracking of facial features is required for many computer vision and graphics applications. In this paper, we describe a fully automatic system that tracks eyes and eyebrows in real time. The pupils are tracked using the red-eye effect by an infrared-sensitive camera equipped with infrared LEDs. Templates are used to parameterize the facial features. For each new frame, the pupil coordinates are used to extract cropped images of the eyes and eyebrows. The template parameters are recovered by projecting these extracted images onto a PCA basis constructed during the training phase from example images. The system runs at 30 fps and requires no manual initialization or calibration. The system is shown to work well on sequences with considerable head motion and occlusion.
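The PCA recovery step can be sketched as follows: build a basis from training crops, then describe each new crop by its projection onto that basis. The 8-pixel "images" and the 3-component basis are toy-scale assumptions standing in for real eye/eyebrow crops:

```python
import numpy as np

rng = np.random.default_rng(0)
train = rng.normal(size=(20, 8))          # 20 training crops, 8 pixels each

mean = train.mean(axis=0)
# PCA basis via SVD of the centered training set
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
basis = vt[:3]                            # keep 3 principal components

def template_params(crop):
    """Project a cropped image onto the PCA basis (the template parameters)."""
    return basis @ (crop - mean)

def reconstruct(params):
    """Back-project parameters to an approximate crop."""
    return mean + params @ basis

crop = train[0]
p = template_params(crop)
print(p.shape, np.linalg.norm(crop - reconstruct(p)))
```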

Design and Implementation of Eye-Gaze Estimation Algorithm based on Extraction of Eye Contour and Pupil Region (눈 윤곽선과 눈동자 영역 추출 기반 시선 추정 알고리즘의 설계 및 구현)

  • Yum, Hyosub;Hong, Min;Choi, Yoo-Joo
    • The Journal of Korean Association of Computer Education
    • /
    • v.17 no.2
    • /
    • pp.107-113
    • /
    • 2014
  • In this study, we design and implement an eye-gaze estimation system based on extraction of the eye contour and pupil region. To effectively extract the eye contour and pupil region, face candidate regions were extracted first. For face detection, a YCbCr value range for typical Asian face color was defined from a preliminary study of Asian face images. The largest skin-color region was taken as the face candidate region, and the eye regions were extracted by applying contour and color feature analysis to the upper 50% of that region. The detected eye region was divided into three segments and the pupil pixels in each segment were counted. The eye gaze was classified into one of three directions, left, center, or right, by the number of pupil pixels in the three segments. In experiments using 5,616 images of 20 test subjects, the eye gaze was estimated with about 91 percent accuracy.
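The three-segment gaze rule can be sketched directly: split the detected eye region into three horizontal thirds, count pupil pixels in each, and pick the segment with the most. The binary mask below (1 = pupil pixel) is a toy example, and the mapping of image-side to gaze direction is an assumption (it flips if the image is mirrored):

```python
def gaze_direction(eye_mask):
    """eye_mask: 2-D list of 0/1 pupil pixels. Returns 'left', 'center',
    or 'right' by which third of the region holds the most pupil pixels."""
    w = len(eye_mask[0])
    thirds = [0, 0, 0]
    for row in eye_mask:
        for x, px in enumerate(row):
            if px:
                thirds[min(3 * x // w, 2)] += 1
    return ("left", "center", "right")[thirds.index(max(thirds))]

mask = [
    [0, 0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1, 0],
]
print(gaze_direction(mask))
```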


Realtime Facial Expression Data Tracking System using Color Information (컬러 정보를 이용한 실시간 표정 데이터 추적 시스템)

  • Lee, Yun-Jung;Kim, Young-Bong
    • The Journal of the Korea Contents Association
    • /
    • v.9 no.7
    • /
    • pp.159-170
    • /
    • 2009
  • Extracting expression data and capturing a face image from video is very important for online 3D face animation. Recently, there has been much research on vision-based approaches that capture an actor's expression in a video and apply it to a 3D face model. In this paper, we propose an automatic data extraction system that extracts and tracks a face and its expression data from real-time video input. Our system proceeds in three steps: face detection, facial feature extraction, and face tracking. In face detection, we detect skin pixels using a YCbCr skin color model and verify the face area with a Haar-based classifier. We use brightness and color information to extract the eye and lip data related to facial expression, extracting 10 feature points from the eye and lip areas in line with the FAPs defined in MPEG-4. We then track the displacement of the extracted features across consecutive frames using a color probability distribution model. Experiments showed that our system can track expression data at about 8 fps.
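The YCbCr skin test used in the detection step can be sketched per pixel. The conversion below is the standard BT.601 full-range formula; the Cb/Cr bounds are commonly cited skin ranges and are assumptions, not the paper's tuned values:

```python
def rgb_to_ycbcr(r, g, b):
    """BT.601 full-range RGB -> YCbCr conversion."""
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin_pixel(r, g, b):
    """Classify a pixel as skin if its Cb/Cr fall in a typical skin range."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173

print(is_skin_pixel(220, 170, 140), is_skin_pixel(30, 90, 200))
```

Note that the Y (luma) channel is deliberately ignored: a key reason YCbCr is popular for skin detection is that skin chrominance clusters tightly regardless of brightness.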

Alarm Device Using Eye-Tracking Web-camera (웹카메라를 이용한 시선 추적식 졸음 방지 디바이스)

  • Kim, Seong-Joo;Kim, Yoo-Hyun;Shin, Eun-Jung;Lee, Kang-Hee
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2013.01a
    • /
    • pp.321-322
    • /
    • 2013
  • This paper proposes an eye-tracking anti-drowsiness device using a web camera. It is designed in two parts, hardware and software: a web camera recognizes the user's eyes, and the device is built on Arduino and Max/MSP. Eye-tracking technology is applied to determine the user's state and trigger an appropriate anti-drowsiness response. The device also performs various other functions, such as serving as a desk lamp. By using this eye-tracking alarm device via a web camera, users are offered a new experience. The authors present it as a world-first device using eye-tracking technology that anyone, regardless of age or gender, can use while working.


Estimation of a Driver's Physical Condition Using Real-time Vision System (실시간 비전 시스템을 이용한 운전자 신체적 상태 추정)

  • Kim, Jong-Il;Ahn, Hyun-Sik;Jeong, Gu-Min;Moon, Chan-Woo
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.9 no.5
    • /
    • pp.213-224
    • /
    • 2009
  • This paper presents a new algorithm for estimating a driver's physical condition using a real-time vision system, with experiments on real facial image data. The system relies on face recognition to robustly track the center points and sizes of the driver's two pupils and the two side edge points of the mouth. The face recognition combines color statistics in the YUV color space with a geometrical model of a typical face. The system can classify head rotation in all viewing directions, detect eye/mouth occlusion, eye blinking, and eye closure, and recover the three-dimensional gaze of the eyes. These cues are used to determine the carelessness and drowsiness of the driver. Experimental results demonstrate the validity and applicability of the proposed method for estimating a driver's physical condition.
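Turning per-frame eye-closure cues like those above into a drowsiness flag is often done with a PERCLOS-style ratio; a hedged sketch, where the window length and the 0.3 threshold are common illustrative choices rather than this paper's classifier:

```python
def drowsiness_score(eye_closed, window=10):
    """eye_closed: sequence of booleans, one per frame.
    Returns the fraction of the last `window` frames with eyes closed."""
    recent = list(eye_closed)[-window:]
    return sum(recent) / len(recent)

def is_drowsy(eye_closed, threshold=0.3):
    """Flag drowsiness when the recent eye-closure ratio is high."""
    return drowsiness_score(eye_closed) >= threshold

alert  = [False] * 8 + [True] * 2     # brief blink
drowsy = [False] * 4 + [True] * 6     # sustained closure
print(is_drowsy(alert), is_drowsy(drowsy))
```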
