• Title/Summary/Keyword: EyeTracking

Search results: 454

Exploring the Analysis of Male and Female Shopper's Visual Attention to Online Shopping Information Contents: Emphasis on Human Brand Image (온라인 쇼핑정보에 대한 남성과 여성 간 시각 주의도 탐색 연구: 휴먼 브랜드 이미지를 중심으로)

  • Hwang, Yoon Min; Lee, Kun Chang
    • The Journal of the Korea Contents Association / v.19 no.2 / pp.328-339 / 2019
  • Shopping information contents shown on online shopping sites reflect online retailers' intention to draw potential consumers' visual attention. Unfortunately, previous studies show that most shopping information contents are designed naively to appeal to consumers' visual attention, without systematic analysis of how visual reactions may differ by gender. To fill this research gap, this study proposes an eye-tracking approach to investigate how gender affects consumers' visual attention to human brand image contents on online shopping sites. For the eye-tracking experiments, two types of products were adopted: a notebook computer as a utilitarian product and a perfume as a hedonic product. Results revealed that female consumers pay more visual attention to human brand image contents than male consumers. Moreover, the gender difference is larger when the human brand image contents accompany a hedonic product (perfume) than a utilitarian product (notebook computer). From these eye-tracking results, the study offers theoretical background on gender differences in response to online shopping information contents and the associated human brand image contents.
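The gender-by-product-type comparison summarized in this abstract is the kind of design typically tested with a two-way analysis of AOI fixation metrics. Below is a minimal sketch of such an analysis; the data frame, column names, and values are illustrative assumptions, not material from the paper.

```python
# Hypothetical sketch: two-way ANOVA of fixation duration on the human-brand-image AOI
# by gender and product type (utilitarian vs. hedonic). The data frame is synthetic.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "gender": ["F", "F", "F", "F", "M", "M", "M", "M"] * 3,
    "product_type": ["hedonic", "utilitarian"] * 12,
    "fixation_duration": [420, 310, 455, 330, 300, 290, 310, 295] * 3,  # ms, made up
})

model = ols("fixation_duration ~ C(gender) * C(product_type)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects of gender/product and their interaction
```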

A Study on the Visual Attention of Popular Animation Characters Utilizing Eye Tracking (아이트래킹을 활용한 인기 애니메이션 캐릭터의 시각적 주의에 관한 연구)

  • Hwang, Mi-Kyung; Kwon, Mahn-Woo; Park, Min-Hee; Yin, Shuo-Han
    • The Journal of the Korea Contents Association / v.19 no.6 / pp.214-221 / 2019
  • Visual perception information acquired through the eyes reveals much about how people view visual stimuli; with eye-tracking technology, consumers' visual information can be acquired and analyzed as quantitative data. Such measurements can capture emotions that customers feel unconsciously, and the search response to a character can be collected directly as numerical data through eye tracking. In this study, we defined areas of interest (AOIs) on popular animation characters and analyzed average fixation duration, fixation count, average visit duration, visit count, and time to first fixation. The analysis showed more cognitive processing and higher visual attention on the character's face than on its body. The visual attention given to attractiveness factors also confirmed that attractiveness is an important factor in determining preference for a character. Building on these results, further studies of more characters will be conducted, and the quantitative interpretation method can serve as basic data for character development and as a factor to consider in character design.
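The AOI metrics named in this abstract (fixation duration and count, visit duration and count, time to first fixation) can all be derived from an exported fixation log. The sketch below is a hypothetical illustration under an assumed record format of `(start_ms, duration_ms, aoi)`; it is not the authors' code.

```python
# Hypothetical sketch: aggregate standard AOI metrics from a fixation log.
# Each record is (start_ms, duration_ms, aoi), e.g. aoi in {"face", "body"}.
from collections import defaultdict

def aoi_metrics(fixations):
    stats = defaultdict(lambda: {"fix_count": 0, "fix_dur_ms": 0.0,
                                 "visit_count": 0, "ttff_ms": None})
    prev_aoi = None
    for start, dur, aoi in sorted(fixations):
        s = stats[aoi]
        s["fix_count"] += 1
        s["fix_dur_ms"] += dur
        if aoi != prev_aoi:              # a new visit begins when the gazed AOI changes
            s["visit_count"] += 1
        if s["ttff_ms"] is None:         # time to first fixation on this AOI
            s["ttff_ms"] = start
        prev_aoi = aoi
    return dict(stats)                   # averages follow as duration totals / counts

fixations = [(0, 180, "face"), (200, 220, "face"), (450, 160, "body"), (640, 300, "face")]
print(aoi_metrics(fixations))
```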

A Study on the Mechanism of Social Robot Attitude Formation through Consumer Gaze Analysis: Focusing on the Robot's Face (소비자 시선 분석을 통한 소셜로봇 태도 형성 메커니즘 연구: 로봇의 얼굴을 중심으로)

  • Ha, Sangjip; Yi, Eunju; Yoo, In-jin; Park, Do-Hyung
    • Journal of Intelligence and Information Systems / v.28 no.1 / pp.243-262 / 2022
  • In this study, eye tracking was applied to the robot's appearance as part of social robot design research. Each part of the social robot was designated as an area of interest (AOI), and user attitude was measured with a design evaluation questionnaire to construct a design research model for social robots. The eye-tracking indicators used were Fixation, First Visit, Total Viewed, and Revisits, and the AOIs were the face, eyes, lips, and body of the social robot. The questionnaire collected consumer beliefs about the robot, such as Face-highlighted, Human-like, and Expressive, with attitude toward the robot as the dependent variable. Through this design, we sought to uncover the specific mechanism by which users form attitudes toward the robot and to derive insights that can be referenced when designing robots.
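One straightforward way to probe the attitude-formation mechanism described here is to regress the questionnaire-based attitude score on the gaze indicators and the belief variables. The sketch below uses illustrative variable names and synthetic data; it is not the authors' model specification.

```python
# Hypothetical sketch: regress attitude toward the robot on gaze indicators and
# consumer beliefs. Variable names and the synthetic data are illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 60                                              # pretend 60 evaluation sessions
df = pd.DataFrame({
    "fixation_face": rng.uniform(0.5, 3.0, n),      # seconds of fixation on the face AOI
    "total_viewed_eyes": rng.uniform(0.2, 2.0, n),
    "revisits_lips": rng.integers(0, 5, n),
    "belief_humanlike": rng.integers(1, 8, n),      # 7-point questionnaire items
    "belief_expressive": rng.integers(1, 8, n),
})
df["attitude"] = (0.6 * df["fixation_face"] + 0.4 * df["belief_humanlike"]
                  + rng.normal(0, 0.5, n))          # synthetic outcome

result = smf.ols("attitude ~ fixation_face + total_viewed_eyes + revisits_lips "
                 "+ belief_humanlike + belief_expressive", data=df).fit()
print(result.summary())   # coefficients indicate which gaze/belief factors relate to attitude
```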

Study on the Expression of Sensory Visualization through AR Display Connection - Focusing on Eye Tracking (AR 디스플레이 연결을 통한 감각시각화에 대한 표현 검토)

  • Ma, Xiaoyu
    • The Journal of the Convergence on Culture Technology / v.10 no.2 / pp.357-363 / 2024
  • As AR display technology spreads widely into public learning and everyday life, the way reality and the virtual are connected is also changing. The purpose of this paper is to study how the gaze-directed, planar visualization of sensory information can be expressed as a 3D, virtual-reality-enhanced sensory visualization experience. This is analyzed by comparing the basic setup methods of current AR display applications with flat visualization cases. The scope of the paper centers on eye-tracking technology among the four categories of AR display interaction design (gesture, eye tracking, voice, and sensor input), with the goal of giving users a better sensory visualization experience. By focusing on eye tracking in AR display interaction and comparing it with current flat visualization cases, the paper shows that geometric consistency of visual figures, consistency of light and color, combinations of multi-sensory interaction methods, rational content layout, and smart push present sensory visualization in virtual reality more realistically and conveniently, providing a simple and convenient sensory visualization experience for the audience.

Design and Implementation of Real-time three dimensional Tracking system of gazing point (삼차원 응시 위치의 실 시간 추적 시스템 구현)

  • 김재한
    • Proceedings of the IEEK Conference / 2003.07c / pp.2605-2608 / 2003
  • This paper presents the design and implementation of a real-time system for tracking the three-dimensional gazing point. The proposed method is based on three-dimensional processing of eye images in world coordinates. The system hardware consists of two conventional CCD cameras for stereoscopic image acquisition and a computer for processing. The advantages of the proposed algorithm and the test results are also described.
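The core computation behind such a stereoscopic setup is triangulating a 3D point from its projections in two calibrated cameras. The sketch below uses OpenCV's linear triangulation with made-up calibration values and pixel coordinates; it illustrates the general technique, not the paper's implementation.

```python
# Hypothetical sketch: recover a 3D gaze point from its pixel positions in two
# calibrated CCD cameras via linear triangulation. All calibration values are examples.
import numpy as np
import cv2

f, cx, cy = 800.0, 320.0, 240.0                                   # assumed shared intrinsics
K = np.array([[f, 0, cx], [0, f, cy], [0, 0, 1]])

P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                 # left camera at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0.0], [0.0]])]) # right camera, 10 cm baseline

# Pixel coordinates of the same gaze-related feature in both images (2xN arrays).
pts_left = np.array([[320.0], [240.0]])
pts_right = np.array([[280.0], [240.0]])

point_h = cv2.triangulatePoints(P1, P2, pts_left, pts_right)      # homogeneous 4x1 result
point_3d = (point_h[:3] / point_h[3]).ravel()
print("estimated 3D gaze point (m):", point_3d)                   # roughly (0, 0, 2)
```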


Design and Implementation of Eye-Gaze Estimation Algorithm based on Extraction of Eye Contour and Pupil Region (눈 윤곽선과 눈동자 영역 추출 기반 시선 추정 알고리즘의 설계 및 구현)

  • Yum, Hyosub; Hong, Min; Choi, Yoo-Joo
    • The Journal of Korean Association of Computer Education / v.17 no.2 / pp.107-113 / 2014
  • In this study, we design and implement an eye-gaze estimation system based on the extraction of the eye contour and pupil region. To extract the eye contour and pupil region effectively, face candidate regions are extracted first. For face detection, a YCbCr value range for typical Asian skin color was defined from a pre-study of Asian face images. The largest skin-color region is taken as the face candidate region, and the eye regions are extracted by applying contour and color feature analysis to the upper 50% of the face candidate region. Each detected eye region is divided into three segments, and the pupil pixels in each segment are counted. The eye gaze is classified into one of three directions (left, center, or right) according to the number of pupil pixels in the three segments. In experiments using 5,616 images of 20 test subjects, the eye gaze was estimated with about 91% accuracy.
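The final classification step of this pipeline (splitting the detected eye region into three segments and comparing pupil pixel counts) can be sketched roughly as follows, assuming the eye region has already been cropped to a grayscale image. The threshold and the synthetic image are illustrative assumptions, not the authors' values.

```python
# Hypothetical sketch: classify gaze direction (left/center/right) by counting
# dark pupil pixels in three vertical segments of a cropped grayscale eye image.
import numpy as np

def estimate_gaze(eye_gray, pupil_thresh=60):
    # Pupil pixels are assumed darker than the threshold (the value is an assumption).
    pupil_mask = eye_gray < pupil_thresh
    h, w = pupil_mask.shape
    segments = [pupil_mask[:, :w // 3],
                pupil_mask[:, w // 3:2 * w // 3],
                pupil_mask[:, 2 * w // 3:]]
    counts = [int(seg.sum()) for seg in segments]      # pupil pixels per segment
    # Note: the left/right labels may need flipping if the camera image is mirrored.
    return ["left", "center", "right"][int(np.argmax(counts))], counts

# Synthetic 30x90 eye crop: bright sclera with a dark "pupil" blob in the right third.
eye = np.full((30, 90), 200, dtype=np.uint8)
eye[10:20, 70:80] = 30
print(estimate_gaze(eye))                              # ('right', [0, 0, 100])
```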


Elementary Teacher's Science Class Analysis using Mobile Eye Tracker (이동형 시선추적기를 활용한 초등교사의 과학 수업 분석)

  • Shin, Won-Sub; Kim, Jang-Hwan; Shin, Dong-Hoon
    • Journal of Korean Elementary Science Education / v.36 no.4 / pp.303-315 / 2017
  • The purpose of this study is to analyze elementary teachers' science classes objectively and quantitatively using a mobile eye tracker. The mobile eye tracker is worn like eyeglasses, and because the data are collected as video, it is well suited to capturing objective data on the teacher's classroom situation in real time. The participants were two elementary teachers teaching sixth-grade science in Seoul. Each participant taught a 40-minute class while wearing the mobile eye tracker. Eye movements were collected at 60 Hz and analyzed with SMI BeGaze 3.7. Class-related areas were set as areas of interest, and the teachers' visual occupancy was analyzed, along with the linguistic interaction between teacher and students. The results are as follows. First, we analyzed the visual occupancy of meaningful areas in teaching-learning activities by class stage. Second, the analysis of eye movements during teacher-student interaction showed that teacher A gazed at students' faces for a high proportion of the time, whereas teacher B showed high visual occupancy in areas unrelated to the class. Third, the participants' linguistic interaction was analyzed across questions, attention-focusing language, elementary science teaching terminology, daily interaction, humor, and unnecessary words. This study shows that elementary science classes can be analyzed objectively and quantitatively through visual occupancy analysis with mobile eye tracking, and that teachers' visual attention during teaching activities can serve as an index for analyzing the form of language interaction.
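Visual occupancy, as used in this abstract, is essentially the share of total gaze time spent on each class-related AOI. The sketch below computes it from an exported fixation list; the AOI names and durations are illustrative, not the study's coding scheme.

```python
# Hypothetical sketch: visual occupancy = dwell time per AOI / total recorded gaze time.
from collections import Counter

def visual_occupancy(fixations):
    """fixations: iterable of (aoi_name, duration_ms)."""
    dwell = Counter()
    for aoi, dur in fixations:
        dwell[aoi] += dur
    total = sum(dwell.values())
    return {aoi: round(100.0 * t / total, 1) for aoi, t in dwell.items()}

sample = [("students", 1200), ("blackboard", 800), ("textbook", 400), ("students", 600)]
print(visual_occupancy(sample))   # percentage of gaze time per AOI
```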

Influence of Endorser's Gaze Direction on Consumer's Visual Attention, Attitude and Recognition: Focused on the Eye Movement (광고 모델의 위치와 시선 방향이 소비자의 시각적 주의, 태도 및 재인에 미치는 효과: 안구운동추적기법을 중심으로)

  • Chung, Hyenyeong; Lee, Ji-Yeon; Nam, Yun-Ju
    • (The) Korean Journal of Advertising / v.29 no.7 / pp.29-53 / 2018
  • In this study, we investigated the effects of endorser position and endorser gaze direction (direct / averted toward the image / averted toward the text) on advertising attitude, purchase intention, and brand recognition using eye tracking. Focusing on printed cosmetics ads, where the role of the endorser is important and the indirect persuasion route is relatively emphasized, we conducted an experiment with 36 participants in their 20s. Consistent with prior studies, participants attended more, and more quickly, to the specific element the endorser was gazing at, but this was not directly reflected in ad attitude or purchase intention. When the endorser was positioned on the left side, purchase intention was highest in the direct-gaze condition, whereas when the endorser was on the right side, ad attitude was highest when the endorser gazed at the image. In addition, the brand recognition task following the eye-tracking experiment showed higher recognition accuracy only in the condition where the endorser was on the left side looking at the product image. These results demonstrate that the endorser's gaze direction acts as attentional guidance, leading the customer's attention to a particular region of the printed ad, but that the effect varies with the endorser's position and the type of information being gazed at. Therefore, to increase ad attitude and purchase intention, the endorser's gaze direction must be considered together with the endorser's position and other elements of the ad.

Correcting the gaze depth by using DNN (DNN을 이용한 응시 깊이 보정)

  • Han, Seok-Ho; Jang, Hoon-Seok
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.16 no.3 / pp.123-129 / 2023
  • If we know what a person is looking at, we can obtain a great deal of information. With the development of eye tracking, information on the gaze point can be obtained through the software provided with various eye-tracking devices. However, it is difficult to estimate accurate information such as the actual gaze depth. If the eye tracker can be calibrated to the actual gaze depth, realistic and accurate results with reliable validity can be derived in fields such as simulation, digital twins, and VR. Therefore, in this paper, we acquire and calibrate raw gaze depth using an eye tracker and its software. The experiment involves designing a deep neural network (DNN) model and acquiring the gaze depth values reported by the software for specified distances from 300 mm to 10,000 mm. The acquired data are trained through the designed DNN model and calibrated to correspond to the actual gaze depth. With the calibrated model, we obtained gaze depth values of 297 mm, 904 mm, 1,485 mm, 2,005 mm, 3,011 mm, 4,021 mm, 4,972 mm, 6,027 mm, 7,026 mm, 8,043 mm, 9,021 mm, and 10,076 mm for the specified distances from 300 mm to 10,000 mm.
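A calibration model of the kind described here maps the raw gaze depth reported by the tracker onto the true viewing distance. The sketch below is a minimal regression example in PyTorch with an assumed layer layout and synthetic data; the abstract does not specify the actual architecture or training setup.

```python
# Hypothetical sketch: small MLP that maps raw gaze depth (mm) to calibrated depth (mm).
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(1, 32), nn.ReLU(),
                      nn.Linear(32, 32), nn.ReLU(),
                      nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# raw_depth: values reported by the eye-tracking software; true_depth: marker distances
# (300 mm to 10,000 mm). Synthetic placeholders stand in for the collected data.
true_depth = torch.linspace(300, 10_000, 200).unsqueeze(1)
raw_depth = true_depth * 0.8 + 150 + 50 * torch.randn_like(true_depth)

# Scale inputs and targets to a comparable range for stable training.
x, y = raw_depth / 10_000, true_depth / 10_000
for _ in range(2000):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

with torch.no_grad():
    calibrated_mm = model(torch.tensor([[3000.0 / 10_000]])) * 10_000
print(float(calibrated_mm))   # calibrated depth for a raw reading of 3,000 mm
```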

Development of Real-Time Vision-based Eye-tracker System for Head Mounted Display (영상정보를 이용한 HMD용 실시간 아이트랙커 시스템)

  • Roh, Eun-Jung; Hong, Jin-Sung; Bang, Hyo-Choong
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.35 no.6 / pp.539-547 / 2007
  • In this paper, the development and testing of a real-time eye-tracker system are discussed. The system tracks the user's gaze point from eye movements by means of vision-based pupil detection, which has the advantage of detecting the exact positions of the user's eyes. An infrared camera and an LED are used to acquire the user's pupil image and to extract the pupil region, which is hard to extract with software alone, from the obtained image. We develop a pupil-tracking algorithm based on a Kalman filter and process the pupil images on a DSP (Digital Signal Processing) system for real-time image processing. The real-time eye tracker follows the movements of the user's pupils to project their gaze point onto a background image.
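The pupil-tracking step can be illustrated with a constant-velocity Kalman filter over the detected pupil centre, as provided by OpenCV. The noise settings and detections below are assumptions for illustration; this is not the authors' DSP implementation.

```python
# Hypothetical sketch: smooth/predict the pupil centre with a constant-velocity
# Kalman filter. State = (x, y, vx, vy); measurement = detected pupil centre (x, y).
import numpy as np
import cv2

kf = cv2.KalmanFilter(4, 2)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2     # assumed noise levels
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

detections = [(120.0, 80.0), (122.5, 81.0), (126.0, 83.5)]  # pupil centres per frame
for cx, cy in detections:
    prediction = kf.predict()                               # a-priori state estimate
    estimate = kf.correct(np.array([[cx], [cy]], dtype=np.float32))
    print("predicted:", prediction[:2].ravel(), "filtered:", estimate[:2].ravel())
```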