• Title/Summary/Keyword: Gaze Data


A Study on the Characteristics of Consumer Visual-Perceptional Information Acquisition in Commercial Facilities in Regard to its Construction of Space from Real-Time Eye Gaze Tracking (상업시설 공간구성의 실시간 시선추적에 나타난 소비자 시지각 정보획득 특성 연구)

  • Park, Sunmyung
    • Science of Emotion and Sensibility
    • /
    • v.21 no.2
    • /
    • pp.3-14
    • /
    • 2018
  • To satisfy consumer needs, commercial facilities require a variety of sale-related spatial expressions and eye-catching product arrangements; space composition can also serve as a direct marketing strategy. The human eye is the sensory organ that acquires the largest amount of information, and analyzing visual information helps in understanding visual relations. However, existing studies mostly analyze still frames of experimental images, and few studies analyze gaze information based on moving images of commercial spaces. Therefore, this study analyzed emotional responses through the gaze information of actual space users, using a video of a movement route through a commercial facility. The analysis targeted straight sections of the route; based on the acquired data, the sectional characteristics of five gaze-intensity ranges were examined. Section A, the starting point of the route, showed low gaze intensity, while section B showed the highest. This indicates that the subjects needed time to adapt to the experimental video and, from section B onward, explored the space in a stable way. Regarding the spatial characteristics of the gaze-concentrated areas, the display formats of the stores on the right received greater attention in four of six sections. Consumers' gaze focused mostly on props, and substantial gaze concentration appeared in the showcase display formats of the stores. In conclusion, this analysis method can provide highly useful design data on merchandise display and the arrangement of merchandise components based on consumer visual preference.

Eye Gaze Information and Game Level Design according to FPS Gameplay Beats

  • Choi, GyuHyeok;Kim, Mijin
    • Journal of information and communication convergence engineering
    • /
    • v.16 no.3
    • /
    • pp.189-196
    • /
    • 2018
  • A player's actions in a game arise from gameplay experiences in a play space designed by the developer according to preset gameplay beats. Focusing on the beats that drive a first-person shooter (FPS) game's main gameplay, this paper analyzes the differences in eye-gaze information among players during gameplay. To this end, the study divides the beat areas in which play actions appear at a typical FPS game level, repeatedly conducts tests according to player experience level (novice and expert groups), and collects and analyzes eye-gaze data in three types of beat areas. The analysis results suggest concrete game-level design guidelines for each beat area based on the player's experience level. This empirical method and its results can reduce repetitive modification work in game-level design and consequently be used to optimize a game level to the developer's intention.

A Human-Robot Interface Using Eye-Gaze Tracking System for People with Motor Disabilities

  • Kim, Do-Hyoung;Kim, Jae-Hean;Yoo, Dong-Hyun;Lee, Young-Jin;Chung, Myung-Jin
    • Transactions on Control, Automation and Systems Engineering
    • /
    • v.3 no.4
    • /
    • pp.229-235
    • /
    • 2001
  • Recently, the service sector has become an emerging field of robotic applications. Although assistant robots play an important role for the disabled and the elderly, users still have difficulty operating them with conventional interface devices such as joysticks or keyboards. In this paper, we propose an efficient computer interface using a real-time eye-gaze tracking system. The inputs to the proposed system are images taken by a camera and data from a magnetic sensor. The measured data are sufficient to describe eye and head movement because the camera and the receiver of the magnetic sensor are stationary with respect to the head. Thus the proposed system can obtain the eye-gaze direction despite head movement, as long as the distance between the system and the transmitter of the magnetic position sensor is within 2 m. Experimental results show the practical validity of the proposed system and verify its feasibility as a new computer interface for the disabled.


Robust Gaze-Fixing of an Active Vision System under Variation of System Parameters (시스템 파라미터의 변동 하에서도 강건한 능동적인 비전의 시선 고정)

  • Han, Youngmo
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.1 no.3
    • /
    • pp.195-200
    • /
    • 2012
  • Steering a camera is based on the system parameters of the vision system. However, the system parameters at the time of use may differ from those at the time of measurement. To compensate for this problem, this study proposes a gaze-steering method based on linear matrix inequalities (LMI) that is robust to variations in the system parameters of the vision system. Simulation results show that the proposed method produces less gaze-tracking error than a contemporary linear method and more stable gaze-tracking error than a contemporary nonlinear method. Moreover, the proposed method is fast enough for real-time processing.

Steering Gaze of a Camera in an Active Vision System: Fusion Theme of Computer Vision and Control (능동적인 비전 시스템에서 카메라의 시선 조정: 컴퓨터 비전과 제어의 융합 테마)

  • Han, Youngmo
    • Journal of the Institute of Electronics Engineers of Korea SC
    • /
    • v.41 no.4
    • /
    • pp.39-43
    • /
    • 2004
  • A typical theme of active vision systems is gaze-fixing of a camera. Here, gaze-fixing means steering the orientation of the camera so that a given point on the object always stays at the center of the image. This requires combining a function that analyzes image data with a function that controls camera orientation. This paper presents an algorithm for gaze-fixing in which image analysis and orientation control are designed within a single framework. To avoid implementation difficulties and to target real-time applications, the algorithm is designed as a simple closed form without using any information related to camera calibration or structure estimation.
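The gaze-fixing loop described above can be illustrated with a minimal sketch: pan/tilt angles are nudged proportionally to the image-plane error so that a tracked point drifts toward the image center. This does not reproduce the paper's closed-form, calibration-free algorithm; the gain and the idealized camera model are assumptions made purely for illustration.

```python
# Hypothetical sketch of a gaze-fixing control loop (not the paper's method).

def gaze_fix_step(px, py, cx, cy, gain=0.2):
    """One control step: compute pan/tilt increments from the image-plane
    error and return the point's new (idealized) image position."""
    ex, ey = px - cx, py - cy            # error of the point from center
    dpan, dtilt = gain * ex, gain * ey   # proportional steering command
    # Idealization: rotating the camera shifts the projected point by the
    # same amount in the opposite direction.
    return dpan, dtilt, px - dpan, py - dtilt

px, py = 120.0, 90.0     # tracked point in image coordinates (assumed)
cx, cy = 160.0, 120.0    # image center (assumed)
for _ in range(60):
    _, _, px, py = gaze_fix_step(px, py, cx, cy)
# after enough iterations the point sits numerically at the image center
```

Each iteration shrinks the error by the gain factor, so the tracked point converges geometrically to the center under this idealized model.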

Correcting the gaze depth by using DNN (DNN을 이용한 응시 깊이 보정)

  • Seok-Ho Han;Hoon-Seok Jang
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.16 no.3
    • /
    • pp.123-129
    • /
    • 2023
  • If we know what a person is looking at, we can obtain a great deal of information. With the development of eye tracking, information on the gaze point can be obtained through the software provided by various eye-tracking devices. However, it is difficult to estimate accurate information such as the actual gaze depth. If the eye tracker can be calibrated to the actual gaze depth, realistic and accurate results with reliable validity can be derived in fields such as simulation, digital twins, and VR. Therefore, in this paper, we experiment with acquiring and calibrating raw gaze depth using an eye tracker and its software. The experiment involves designing a deep neural network (DNN) model and acquiring the gaze depth values provided by the software for specified distances from 300 mm to 10,000 mm. The acquired data are trained through the designed DNN model and calibrated to correspond to the actual gaze depth. With the calibrated model, we obtained gaze depth values of 297 mm, 904 mm, 1,485 mm, 2,005 mm, 3,011 mm, 4,021 mm, 4,972 mm, 6,027 mm, 7,026 mm, 8,043 mm, 9,021 mm, and 10,076 mm for the specified distances from 300 mm to 10,000 mm.
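The calibration idea — learning a mapping from raw, biased gaze-depth readings to true depth — can be sketched with a simple least-squares linear fit standing in for the paper's DNN. All data values below are fabricated for illustration; only the structure (fit on known distances, then correct raw readings) follows the abstract.

```python
# Hypothetical sketch: calibrating raw gaze depth to true depth.
# A linear least-squares fit replaces the paper's DNN; data are made up.

def fit_linear_calibration(raw, true):
    """Fit true ≈ a*raw + b by ordinary least squares."""
    n = len(raw)
    mean_r = sum(raw) / n
    mean_t = sum(true) / n
    cov = sum((r - mean_r) * (t - mean_t) for r, t in zip(raw, true))
    var = sum((r - mean_r) ** 2 for r in raw)
    a = cov / var
    b = mean_t - a * mean_r
    return a, b

def calibrate(raw_value, a, b):
    """Apply the fitted correction to one raw reading."""
    return a * raw_value + b

# Synthetic example: raw depths carry a fabricated systematic bias.
true_depths = [300, 1000, 2000, 5000, 10000]       # mm, paper's range
raw_depths = [0.8 * d + 50 for d in true_depths]   # assumed sensor bias

a, b = fit_linear_calibration(raw_depths, true_depths)
corrected = [calibrate(r, a, b) for r in raw_depths]
```

Because the synthetic bias is exactly linear, the fit recovers the true depths; a real eye tracker's depth error is generally nonlinear, which is why the paper uses a DNN instead.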

A Design of the Finite State Machine to Control User's Gaze on a Screen (화면 응시 제어를 위한 유한 상태 기계 설계)

  • Moon, Bong-Hee
    • Journal of the Korea Society of Computer and Information
    • /
    • v.16 no.5
    • /
    • pp.127-134
    • /
    • 2011
  • A finite state machine was designed to control a user's gaze on the screen while the user is monitoring it. The machine consists of a set of situations in which the pupils are detected and a set of states that decide between gazing at the screen and sleeping. The states are further classified into main states, pre-states, and potential states. The machine uses a situation history, deciding the current state from the continuous previous situations and the current situation, which improves the accuracy of gaze control on the screen. We implemented the machine with data obtained using a pupil detection method and tested the system with monitoring operations. Experiments using data obtained from real images show the advantage of deciding whether a gaze is temporary or long-term.
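The state machine described above can be sketched as follows: a short history of per-frame pupil "situations" drives transitions between gazing, transitional, and sleeping states. The state names and the history length are assumptions for illustration, not taken from the paper.

```python
# Hypothetical FSM sketch: decide gaze state from a pupil-detection history.

from collections import deque

class GazeFSM:
    def __init__(self, history_len=3):
        self.state = "SLEEPING"
        self.history = deque(maxlen=history_len)  # recent situations

    def update(self, pupil_detected: bool) -> str:
        """Feed one frame's situation and return the decided state."""
        self.history.append(pupil_detected)
        if len(self.history) == self.history.maxlen:
            if all(self.history):        # pupils seen in every recent frame
                self.state = "GAZING"
            elif not any(self.history):  # pupils absent in every recent frame
                self.state = "SLEEPING"
            else:
                self.state = "PRE_GAZE"  # mixed evidence: transitional state
        return self.state

fsm = GazeFSM()
trace = [fsm.update(s) for s in [True, True, True, False, False, False]]
```

Requiring agreement over several consecutive situations is what lets such a machine distinguish a brief glance from sustained gazing, as the abstract describes.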

Analysis of Gaze Related to Cooperation, Competition and Focus Levels (협력, 경쟁, 집중 수준에 따른 시선 분석)

  • Cho, Ji Eun;Lee, Dong Won;Park, MinJi;Whang, Min-Cheol
    • The Journal of the Korea Contents Association
    • /
    • v.17 no.9
    • /
    • pp.281-291
    • /
    • 2017
  • Emotional interaction in virtual reality is necessary for social communication, yet few attempts have been made to recognize social emotion quantitatively. This study aimed to determine the gaze characteristics of social emotion in the business domain. A total of 417 emotion words were collected, and 16 were selected through a goodness-of-fit test. The emotion words were mapped onto a two-dimensional space through multidimensional scaling analysis; the X axis was then defined as cooperation versus competition and the Y axis as low versus high focus through a focus group discussion (FGD). Fifty-two subjects were presented with emotional stimuli, and their gaze movement data were collected. Independent t-test results showed that gaze increased in the face, eye, and nose areas during cooperation, and in the right-face and nose areas at low focus. These results are expected to serve as basic research for evaluating the emotions needed in business environments in virtual space.

A Study on the Sensory Motor Coordination to Visual and Sound Stimulation (빛과 소리 자극에 대한 지각 운동의 협력에 관한 연구)

  • Kim, Nam-Gyun;Ko, Yong-Ho;Ifukube, T.
    • Journal of Biomedical Engineering Research
    • /
    • v.15 no.1
    • /
    • pp.77-82
    • /
    • 1994
  • We investigated the characteristics of sensory-motor coordination by measuring hand pointing and gaze movement in response to visual and sound stimulation. Our results showed that gaze velocity in response to sound stimulation did not depend on the stimulation direction, but gaze lagged 0.2 s behind in the peripheral direction in response to visual stimulation. For both visual and sound stimulation, the hand-pointing error increased as eccentricity increased.


Visual Modeling and Content-based Processing for Video Data Storage and Delivery

  • Hwang Jae-Jeong;Cho Sang-Gyu
    • Journal of information and communication convergence engineering
    • /
    • v.3 no.1
    • /
    • pp.56-61
    • /
    • 2005
  • In this paper, we present a video rate control scheme for storage and delivery in which time-varying viewing interests are controlled by human gaze. To track the gaze, the pupil's movement is detected using a three-step process: detecting the face region, the eye region, and the pupil point. To control bit rates, the quantization parameter (QP) is changed by considering the static parameters, the video object priority derived from pupil tracking, the target PSNR, and the weighted distortion value of the coder. As a result, we achieved a human-interfaced visual model and a corresponding region-of-interest rate control system.
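The region-of-interest rate control idea above can be sketched minimally: regions with higher gaze-derived priority get a lower QP (finer quantization, more bits). The weighting formula and parameter values below are assumptions for illustration, not the paper's coder model.

```python
# Hypothetical sketch: gaze-priority-driven QP adjustment per region.

def adjust_qp(base_qp, priority, qp_min=10, qp_max=51):
    """Lower the QP for high-priority (gazed-at) regions.

    priority is assumed to lie in [0, 1]; the linear offset of up to 10 QP
    steps is an arbitrary illustrative choice, clamped to a typical range.
    """
    qp = round(base_qp - 10 * priority)
    return max(qp_min, min(qp_max, qp))

# Background, mid-priority, and gaze-focused regions from one frame.
qps = [adjust_qp(30, p) for p in (0.0, 0.5, 1.0)]
```

In a real coder the QP offset would also fold in the target PSNR and the weighted distortion value mentioned in the abstract; this sketch isolates only the gaze-priority term.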