• Title/Summary/Keyword: Gaze Direction

The Effects of Gaze Direction on the Stability and Coordination of the Lower Limb Joint during Drop-Landing (드롭랜딩 시 시선 방향의 차이가 하지관절의 안정성과 협응에 미치는 영향)

  • Kim, Kewwan; Ahn, Seji
    • Korean Journal of Applied Biomechanics / v.31 no.2 / pp.126-132 / 2021
  • Objective: The purpose of this study was to investigate how three gaze directions (bottom, normal, up) affect the coordination and stability of the lower limb during drop landing. Method: 20 female adults (age: 21.1±1.1 yrs, height: 165.7±6.2 cm, weight: 59.4±5.9 kg) participated in this study. Participants performed a single-leg drop-landing task from a 30 cm height and a 20 cm horizontal distance away from the force plate. Kinetic and kinematic data were obtained using 8 motion-capture cameras and 1 force plate, and leg stiffness, loading rate, and DPSI were calculated. All statistical analyses were computed using SPSS 25.0. One-way repeated-measures ANOVA was used to compare the variables across gaze directions, and Bonferroni post hoc tests were applied where significance was observed. Results: The hip flexion angle and ankle plantar flexion angle were significantly smaller when the gaze direction was up. Among the kinetic variables, the loading rate and DPSI were significantly higher for the upward gaze than for the other gaze directions. Conclusion: Hip and ankle flexion angles decreased and the loading rate and DPSI increased when the gaze direction was up, suggesting that differences in visual information can increase the risk of lower-limb injury during landing.
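
The statistical workflow above (one-way repeated-measures ANOVA with Bonferroni post hoc over three gaze conditions) can be sketched in Python as follows; the subject IDs and loading-rate values are random placeholders and the study itself used SPSS 25.0, so this is only an illustrative analogue, not the authors' analysis.

```python
# Sketch of a one-way repeated-measures ANOVA with Bonferroni post hoc,
# analogous to the SPSS analysis described in the abstract above.
import itertools
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
directions = ["bottom", "normal", "up"]

# Long-format table: one loading-rate value per participant x gaze direction.
# The numbers are random placeholders standing in for the measured values.
df = pd.DataFrame({
    "subject": np.repeat(np.arange(1, 21), len(directions)),
    "gaze": directions * 20,
    "loading_rate": rng.normal(50.0, 5.0, 20 * len(directions)),
})

# Omnibus one-way repeated-measures ANOVA across the three gaze directions.
print(AnovaRM(df, depvar="loading_rate", subject="subject", within=["gaze"]).fit())

# Bonferroni post hoc: paired t-tests with corrected p-values.
pairs = list(itertools.combinations(directions, 2))
for a, b in pairs:
    x = df[df.gaze == a].sort_values("subject")["loading_rate"].to_numpy()
    y = df[df.gaze == b].sort_values("subject")["loading_rate"].to_numpy()
    t, p = stats.ttest_rel(x, y)
    print(f"{a} vs {b}: corrected p = {min(p * len(pairs), 1.0):.3f}")
```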

Robust pupil detection and gaze tracking under occlusion of eyes

  • Lee, Gyung-Ju; Kim, Jin-Suh; Kim, Gye-Young
    • Journal of the Korea Society of Computer and Information / v.21 no.10 / pp.11-19 / 2016
  • As displays grow larger and more varied in form, previous gaze-tracking methods no longer apply; mounting the gaze-tracking camera above the display removes the constraints imposed by the display's size and height, but it also means the corneal reflections of infrared illumination used by previous methods are unavailable. This paper proposes a pupil-detection method that is robust to occlusion of the eyes and a method for simply computing the gaze position from the pupil center, the inner eye corner, and the face-pose information. In the proposed method, the camera switches between wide-angle and narrow-angle modes according to the person's position: if a face is detected within the field of view (FOV) in wide-angle mode, the camera switches to narrow-angle mode after the face position is calculated. Frames captured in narrow-angle mode contain the gaze-direction information of a person at a long distance. Computing the gaze direction consists of a face-pose estimation step and a gaze-direction calculation step. The face pose is estimated by mapping feature points of the detected face onto a 3D model. To compute the gaze direction, an ellipse is first fitted to edge segments of the iris; if the pupil is occluded, its position is estimated with a deformable template. The gaze position on the display is then calculated from the pupil center, the inner eye corner, and the face-pose information. Experiments at varying distances demonstrate that the proposed algorithm removes the constraints imposed by display form and effectively computes the gaze direction of a person at a long distance using a single camera.
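
The ellipse-fitting step for locating the pupil center, as described in the abstract above, might look roughly like the OpenCV sketch below; the image path, threshold, and largest-contour heuristic are assumptions for illustration, and the paper's deformable-template fallback and head-pose mapping are omitted.

```python
# Sketch of pupil-center localization by fitting an ellipse to iris edge points,
# loosely following the detection step described in the abstract.
import cv2
import numpy as np

# Hypothetical cropped eye-region image; replace with a real frame.
eye = cv2.imread("eye_region.png", cv2.IMREAD_GRAYSCALE)
assert eye is not None, "provide an eye-region image"

# Dark-region threshold to isolate the pupil/iris, then extract contours (OpenCV 4.x API).
_, mask = cv2.threshold(eye, 60, 255, cv2.THRESH_BINARY_INV)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)

# Fit an ellipse to the largest dark contour; its center approximates the pupil center.
pupil_center = None
if contours:
    largest = max(contours, key=cv2.contourArea)
    if len(largest) >= 5:  # cv2.fitEllipse needs at least 5 points
        (cx, cy), (w, h), angle = cv2.fitEllipse(largest)
        pupil_center = (cx, cy)

print("estimated pupil center:", pupil_center)
# The paper additionally falls back to a deformable template when the pupil is
# occluded and combines the pupil center with the inner eye corner and head pose
# to map the gaze onto the display; those steps are not reproduced here.
```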

Deep Learning-based Gaze Direction Vector Estimation Network Integrated with Eye Landmark Localization (딥 러닝 기반의 눈 랜드마크 위치 검출이 통합된 시선 방향 벡터 추정 네트워크)

  • Joo, Heeyoung; Ko, Min-Soo; Song, Hyok
    • Journal of Broadcast Engineering / v.26 no.6 / pp.748-757 / 2021
  • In this paper, we propose a gaze estimation network in which eye-landmark detection and gaze-direction vector estimation are integrated into a single deep learning network. The proposed network uses the Stacked Hourglass Network as a backbone and consists of three parts: a landmark detector, a feature-map extractor, and a gaze-direction estimator. The landmark detector estimates the coordinates of 50 eye landmarks, and the feature-map extractor generates a feature map of the eye image for estimating the gaze direction. The gaze-direction estimator then combines the two outputs to estimate the final gaze-direction vector. The network was trained on virtual synthetic eye images and landmark coordinates generated with the UnityEyes dataset, and the MPIIGaze dataset of real human eye images was used for performance evaluation. In the experiments, the gaze estimation error was 3.9 and the network ran at 42 frames per second (FPS).
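
A rough PyTorch skeleton of the three-part structure described above (landmark detector, feature-map extractor, gaze-direction estimator); the layer sizes are guesses and a small convolutional stack stands in for the Stacked Hourglass backbone, so this is not the authors' architecture.

```python
# Rough skeleton of the three-part network described in the abstract: a shared
# backbone feeds a landmark detector, a feature-map extractor, and a gaze
# estimator. Layer sizes are illustrative; a small conv stack replaces the
# Stacked Hourglass backbone used in the paper.
import torch
import torch.nn as nn


class GazeLandmarkNet(nn.Module):
    def __init__(self, num_landmarks: int = 50):
        super().__init__()
        # Placeholder backbone (the paper uses a Stacked Hourglass Network).
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Landmark detector: regresses (x, y) for each eye landmark.
        self.landmark_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, num_landmarks * 2)
        )
        # Feature-map extractor: condenses backbone features for gaze estimation.
        self.feature_head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 128), nn.ReLU()
        )
        # Gaze estimator: combines landmarks and features into a 3D gaze vector.
        self.gaze_head = nn.Linear(num_landmarks * 2 + 128, 3)

    def forward(self, eye_image: torch.Tensor):
        feats = self.backbone(eye_image)
        landmarks = self.landmark_head(feats)
        descriptor = self.feature_head(feats)
        gaze = self.gaze_head(torch.cat([landmarks, descriptor], dim=1))
        return landmarks, gaze


# Usage with a dummy grayscale eye-image batch.
model = GazeLandmarkNet()
landmarks, gaze = model(torch.randn(2, 1, 36, 60))
print(landmarks.shape, gaze.shape)  # (2, 100), (2, 3)
```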

An Implementation of Gaze Recognition System Based on SVM (SVM 기반의 시선 인식 시스템의 구현)

  • Lee, Kue-Bum; Kim, Dong-Ju; Hong, Kwang-Seok
    • The KIPS Transactions: Part B / v.17B no.1 / pp.1-8 / 2010
  • Research on gaze recognition, which determines where the user is currently looking, has advanced rapidly and has many applications. Most existing gaze-recognition studies have been limited by their reliance on expensive equipment such as infrared (IR) LEDs, IR cameras, and head-mounted devices. This study proposes and implements an SVM-based gaze-recognition system that uses a single PC web camera. The proposed system divides 36 gaze locations into groups of 9 and 4 and recognizes the user's gaze among 9 directions and 4 directions, respectively. It also applies an image-filtering method based on difference-image entropy to improve recognition performance. To evaluate the proposed system, we compared it with a gaze-recognition system using the eye corner and eye center and with a PCA-based gaze-recognition system. In the experiments, the proposed SVM-based system achieved recognition rates of 94.42% for 4 directions and 81.33% for 9 directions, which improved to 95.37% and 82.25% when the difference-image-entropy filtering was applied. These results demonstrate higher performance than existing gaze-recognition systems.
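
The two main ingredients described above, difference-image entropy for frame filtering and an SVM over eye images for 4- or 9-direction classification, can be sketched as follows; the data shapes, kernel choice, and entropy threshold are placeholders, not the paper's settings.

```python
# Sketch of difference-image entropy plus an SVM gaze-direction classifier,
# loosely mirroring the two components described in the abstract. All data
# below is synthetic and the threshold is a placeholder.
import numpy as np
from sklearn.svm import SVC


def difference_image_entropy(frame: np.ndarray, reference: np.ndarray) -> float:
    """Entropy of the histogram of the difference image between two frames."""
    diff = frame.astype(np.int16) - reference.astype(np.int16)
    hist, _ = np.histogram(diff, bins=511, range=(-255, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())


# Hypothetical training data: flattened eye-region images and 9 gaze labels.
rng = np.random.default_rng(0)
X_train = rng.random((180, 32 * 32))
y_train = np.repeat(np.arange(9), 20)

clf = SVC(kernel="rbf")  # the paper's exact kernel/parameters are not given
clf.fit(X_train, y_train)

# At run time, keep a frame only if its difference-image entropy is acceptable,
# then predict one of the 9 gaze directions.
frame = (rng.random((32, 32)) * 255).astype(np.uint8)
reference = (rng.random((32, 32)) * 255).astype(np.uint8)
if difference_image_entropy(frame, reference) < 5.0:
    print("predicted direction:", clf.predict(frame.reshape(1, -1) / 255.0))
```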

Human Activity Recognition using Model-based Gaze Direction Estimation (모델 기반의 시선 방향 추정을 이용한 사람 행동 인식)

  • Jung, Do-Joon; Yoon, Jeong-Oh
    • Journal of Korea Society of Industrial Information Systems / v.16 no.4 / pp.9-18 / 2011
  • In this paper, we propose a method that recognizes human activity using model-based gaze-direction estimation in an indoor environment. The method consists of two steps. First, we detect the head region and estimate its gaze direction as prior information for human activity recognition. Color and shape information is used to detect the head region, and a Bayesian network model representing the relationship between the head and the face is used to estimate the gaze direction. Second, we recognize the events and scenarios that describe the human activity. Changes of human state are used for event recognition, and a rule-based method combining events with constraints is used for scenario recognition. We define 4 types of scenarios related to gaze direction and report the performance of both gaze-direction estimation and human activity recognition in experiments.
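
The rule-based scenario step described above might be organized roughly as in the sketch below; the event names, scenario labels, and rules are illustrative assumptions rather than the paper's actual definitions.

```python
# Minimal sketch of rule-based scenario recognition over events (changes of
# human state) combined with the estimated gaze direction. Events, scenarios,
# and rules here are illustrative, not the paper's definitions.
from dataclasses import dataclass


@dataclass
class Event:
    name: str   # e.g. "enter", "approach_object", "leave"
    gaze: str   # estimated gaze direction, e.g. "toward_object"


def recognize_scenario(events: list) -> str:
    """Apply simple ordering/constraint rules over events to label a scenario."""
    names = [e.name for e in events]
    if "enter" in names and "approach_object" in names:
        # Constraint: the person must be looking at the object while approaching.
        approach = events[names.index("approach_object")]
        if approach.gaze == "toward_object":
            return "intends_to_use_object"
    if names[-1:] == ["leave"]:
        return "leaving"
    return "unknown"


print(recognize_scenario([Event("enter", "forward"),
                          Event("approach_object", "toward_object")]))
```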

A Gaze Tracking based on the Head Pose in Computer Monitor (얼굴 방향에 기반을 둔 컴퓨터 화면 응시점 추적)

  • 오승환; 이희영
    • Proceedings of the IEEK Conference / 2002.06c / pp.227-230 / 2002
  • In this paper we concentrate on the overall gaze direction, based on head pose, for human-computer interaction. To determine the user's gaze direction in an image, the facial features must be located accurately. To do this, we binarize the input image and search for the two eyes and the mouth using the similarity of each block (aspect ratio, size, and average gray value) and the geometric information of the face. To determine the head orientation, we construct an imaginary plane, the virtual facial plane, on the lines through the features of the real face and the pinhole of the camera. The position of the virtual facial plane is estimated from the facial features projected onto the image plane, and the gaze direction is obtained from the surface normal vector of the virtual facial plane. Because it uses a popular PC camera, this study contributes to the practical use of gaze-tracking technology.
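
The core geometric idea, taking the gaze direction as the surface normal of the virtual facial plane, reduces to a cross product once the facial features are placed in 3D; the coordinates in the sketch below are placeholders.

```python
# Sketch of the surface-normal idea from the abstract: with the two eyes and the
# mouth located in 3D (the "virtual facial plane"), the gaze direction is the
# unit normal of that plane. Coordinates below are hypothetical.
import numpy as np

# Hypothetical 3D positions of the facial features (camera coordinates, cm).
left_eye = np.array([-3.0, 1.0, 50.0])
right_eye = np.array([3.0, 1.0, 50.0])
mouth = np.array([0.0, -4.0, 52.0])

# Two in-plane vectors spanning the virtual facial plane.
v1 = right_eye - left_eye
v2 = mouth - left_eye

# The unit surface normal of the plane approximates the head/gaze direction.
normal = np.cross(v1, v2)
gaze_direction = normal / np.linalg.norm(normal)
print("gaze direction (unit normal):", gaze_direction)
```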

Difference in Rotation Pattern of Toric Soft Contact Lenses with Different Axis Stabilization Design (축 안정화 디자인이 상이한 토릭소프트콘택트렌즈의 회전 양상 차이)

  • Park, So Hyun; Kim, Dong Yeon; Choi, Joo Hee; Byun, Hyun Young; Kim, So Ra; Park, Mijung
    • Journal of Korean Ophthalmic Optics Society / v.20 no.2 / pp.133-140 / 2015
  • Purpose: This study investigated whether two different axis-stabilization designs of toric soft contact lenses change the rotational axis and the degree of rotation of the lenses according to body posture and gaze direction. Methods: Toric soft contact lenses with the Lo-Torque™ design and the ASD (accelerated stabilized design) were fitted on 52 eyes of subjects in their 20s and 30s, and the degree of rotation was measured at five gaze directions, including the front gaze, and in the lying position. Results: When gazing to the front and in vertical directions in the upright posture, the lens rotated mostly toward the nasal side with the Lo-Torque™ design and toward the temporal side with the ASD design. When gazing in horizontal directions, lenses of both designs rotated against the gaze direction. The degree of rotation was smallest for the superior gaze and largest for the nasal gaze. Rotation of less than 5° was more frequent with the Lo-Torque™ design when gazing to the front and in vertical directions, and more frequent with the ASD design when gazing in horizontal directions. In addition, the Lo-Torque™ lens rotated less than the ASD lens immediately after lying down, whereas the ASD lens rotated less than the Lo-Torque™ lens 1 minute after lying down. Conclusions: This study confirmed that the axis rotation induced by gaze direction and posture during toric soft contact lens wear differs according to the axis-stabilization design.

Robot Control Interface Using Gaze Recognition (시선 인식을 이용한 로봇 인터페이스 개발)

  • Park, Se Hyun
    • IEMEK Journal of Embedded Systems and Applications / v.7 no.1 / pp.33-39 / 2012
  • In this paper, we propose a robot control interface using gaze recognition that is not limited by head motion. Most existing gaze-recognition methods work well only when the head is fixed, and they require a calibration process for each person. The interface in this paper uses a camera with a built-in infrared filter and two LED light sources to determine which direction the pupils have turned and sends command codes to control the system, so no per-person calibration is needed. The experimental results show that the proposed interface can control the system accurately by recognizing the user's gaze direction.
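
The mapping from recognized gaze direction to robot command codes could be as simple as the sketch below; the offsets, threshold, and command names are placeholders rather than the paper's actual protocol.

```python
# Minimal sketch of the interface idea in the abstract: classify which direction
# the pupil has turned relative to its neutral position and emit a command code.
# Thresholds and command codes are placeholders, not the paper's values.
def gaze_to_command(pupil_x: float, pupil_y: float,
                    center_x: float, center_y: float,
                    threshold: float = 8.0) -> str:
    """Map the pupil offset from its neutral position to a command code."""
    dx, dy = pupil_x - center_x, pupil_y - center_y
    if abs(dx) < threshold and abs(dy) < threshold:
        return "STOP"
    if abs(dx) >= abs(dy):
        return "TURN_RIGHT" if dx > 0 else "TURN_LEFT"
    return "FORWARD" if dy < 0 else "BACKWARD"


print(gaze_to_command(152.0, 98.0, 140.0, 100.0))  # -> "TURN_RIGHT"
```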

3D First Person Shooting Game by Using Eye Gaze Tracking (눈동자 시선 추적에 의한 3차원 1인칭 슈팅 게임)

  • Lee, Eui-Chul; Park, Kang-Ryoung
    • The KIPS Transactions: Part B / v.12B no.4 s.100 / pp.465-472 / 2005
  • In this paper, we propose a method for controlling the gaze direction of a 3D FPS game character using eye-gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method is composed of three parts. First, we detect the user's pupil center from the successive input images with a real-time image-processing algorithm. Second, in the calibration part, the geometric relationship between the gaze position on the monitor and the detected pupil center is determined while the user gazes at the monitor plane. Finally, the gaze position on the HMD monitor is tracked and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands, and that it can increase interest and immersion by synchronizing the gaze direction of the game player with the view direction of the game character.
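
The calibration step described above, relating detected pupil centers to gaze positions on the HMD monitor, can be approximated with a least-squares fit as sketched below; the affine model and the calibration points are assumptions, since the abstract does not specify the exact geometric relationship used.

```python
# Sketch of a calibration step mapping pupil-center coordinates to screen
# coordinates: the user fixates known screen points, an affine mapping is
# fitted by least squares, and it is then applied to new frames. The points
# below are hypothetical and the affine model is a stand-in.
import numpy as np

# Pupil centers recorded while the user fixates four known screen positions (pixels).
pupil = np.array([[110.0, 95.0], [150.0, 96.0], [112.0, 120.0], [152.0, 121.0]])
screen = np.array([[0.0, 0.0], [800.0, 0.0], [0.0, 600.0], [800.0, 600.0]])

# Fit screen ≈ [px, py, 1] @ A by least squares.
P = np.hstack([pupil, np.ones((len(pupil), 1))])
A, *_ = np.linalg.lstsq(P, screen, rcond=None)


def pupil_to_screen(px: float, py: float) -> np.ndarray:
    """Map a detected pupil center to an estimated gaze point on the screen."""
    return np.array([px, py, 1.0]) @ A


print(pupil_to_screen(130.0, 108.0))  # roughly the middle of the screen
```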

Effects of ad endorser's gaze directions on social perceptions and advertising effectiveness (광고 모델의 시선 효과: 모델의 사회적 특성 지각과 광고 효과성)

  • Kang, Jungsuk
    • Science of Emotion and Sensibility / v.18 no.1 / pp.3-14 / 2015
  • An ad endorser's gaze direction is a salient nonverbal cue that consumers use when responding to advertisements. The gaze direction influences consumers' social perceptions of the endorser (e.g., attractiveness, credibility) and advertising effectiveness (e.g., advertising attitudes, brand attitudes). In particular, the cerebral emotional asymmetry hypothesis suggests that, for right-handed consumers, an ad endorser's left-averted gaze can produce more positive social perceptions and advertising effectiveness than a right-averted gaze. This study examined the effects of three gaze directions (direct, left-averted, and right-averted) of an unknown female ad endorser on Korean males' advertising responses (attractiveness-, credibility-, and ad-effectiveness-related responses) using an online experiment. The results indicated that the endorser's direct gaze was more likely than her right-averted gaze to increase both positive (correspondence bias) and negative (suspicion, perceived deceptiveness) social perceptions of her. The direct gaze also produced greater advertising effectiveness (more positive advertising attitudes) than the right-averted gaze. However, the study found no consistent differences between the left-averted gaze and either the direct or the right-averted gaze.