• Title/Summary/Keyword: 3D gaze

50 search results

Object detection within the region of interest based on gaze estimation (응시점 추정 기반 관심 영역 내 객체 탐지)

  • Seok-Ho Han;Hoon-Seok Jang
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.16 no.3
    • /
    • pp.117-122
    • /
    • 2023
  • Gaze estimation, which automatically recognizes where a user is currently looking, combined with object detection around the estimated gaze point, can be a more accurate and efficient way to understand human visual behavior. In this paper, we propose a method to detect objects within a region of interest around the gaze point. Specifically, after estimating the 3D gaze point, a region of interest is created around it so that object detection runs only inside that region. In our experiments, we compared general object detection with the proposed region-of-interest detection and found processing times of 1.4 ms and 1.1 ms per frame, respectively, showing that the proposed method is faster.
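
The ROI-gated pipeline described in this abstract can be sketched as follows; the camera intrinsics, ROI half-size, and the `detector` callable are illustrative assumptions, not details taken from the paper.

```python
import numpy as np  # image frames are handled as numpy arrays

def project_gaze_to_image(gaze_3d, fx, fy, cx, cy):
    """Project a 3D gaze point (camera coordinates) to pixel coordinates."""
    x, y, z = gaze_3d
    return (fx * x / z + cx, fy * y / z + cy)

def roi_around(point, frame_shape, half=112):
    """Clip a square region of interest around the gaze pixel."""
    h, w = frame_shape[:2]
    u, v = int(round(point[0])), int(round(point[1]))
    return (max(0, u - half), max(0, v - half),
            min(w, u + half), min(h, v + half))

def detect_in_roi(frame, roi, detector):
    """Run the detector only on the cropped ROI, then shift boxes back
    into full-frame coordinates."""
    x0, y0, x1, y1 = roi
    boxes = detector(frame[y0:y1, x0:x1])
    return [(bx + x0, by + y0, bw, bh) for bx, by, bw, bh in boxes]
```

Because the detector sees only the crop, per-frame cost shrinks roughly with the ROI area, which is consistent with the speed-up the abstract reports.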

Style Synthesis of Speech Videos Through Generative Adversarial Neural Networks (적대적 생성 신경망을 통한 얼굴 비디오 스타일 합성 연구)

  • Choi, Hee Jo;Park, Goo Man
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.11 no.11
    • /
    • pp.465-472
    • /
    • 2022
  • In this paper, a style synthesis network is trained via StyleGAN, together with a video synthesis network, to generate style-synthesized video. To address the problem that gaze and expression do not transfer stably, 3D face reconstruction is applied so that key attributes such as head pose, gaze, and expression can be controlled using 3D face information. In addition, by training the Head2Head discriminators for dynamics, mouth shape, image, and gaze, a stable style-synthesized video with greater plausibility and consistency can be created. Using the FaceForensics and MetFaces datasets, we confirmed improved performance: one video is converted into another while the consistent movement of the target face is maintained, and natural data are generated through video synthesis using 3D face information from the source video's face.

The Effect of Gaze Angle on Muscle Activity and Kinematic Variables during Treadmill Walking

  • Kim, Bo-Suk;Jung, Jae-Hu;Chae, Woen-Sik
    • Korean Journal of Applied Biomechanics
    • /
    • v.27 no.1
    • /
    • pp.35-43
    • /
    • 2017
  • Objective: The purpose of this study was to determine how gaze angle affects muscle activity and kinematic variables during treadmill walking, and to provide scientific information for an effective and safe treadmill training environment. Method: Ten male subjects with no musculoskeletal disorders were recruited. Eight pairs of surface electrodes were attached to the right side of the body to monitor the upper trapezius (UT), rectus abdominis (RA), erector spinae (ES), rectus femoris (RF), biceps femoris (BF), tibialis anterior (TA), medial gastrocnemius (MG), and lateral gastrocnemius (LG). Two digital camcorders were used to obtain 3-D kinematics of the lower extremity. Each subject walked on a treadmill at a speed of 5.0 km/h with a TV monitor at three different heights (eye level, EL; 20% above eye level, AE; 20% below eye level, BE). For each analyzed trial, five critical instants and four phases were identified from the video recording. For each dependent variable, one-way ANOVA with repeated measures was used to determine whether there were significant differences among the three conditions (p<.05). When a significant difference was found, post hoc analyses were performed using the contrast procedure. Results: Average and peak IEMG values for EL were generally smaller than the corresponding values for AE and BE, but the differences were not statistically significant. There were also no significant changes in kinematic variables among the three gaze angles. Conclusion: Based on these results, gaze angle does not affect muscle activity or kinematic variables during treadmill walking. However, it is interesting to note that walking with BE may increase the muscle activity of the trapezius and the lower extremity, and may hinder proper dorsiflexion during the landing phase. Thus, it seems reasonable to suggest that an inappropriate gaze angle should be avoided in treadmill walking. Increased walking speed would likely cause significant changes in the biomechanical parameters used here; future studies similar to the present investigation but using different walking speeds are recommended.
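
The one-way repeated-measures ANOVA used in this study can be sketched directly from its sums of squares; the toy data below are invented for illustration and are not the study's measurements.

```python
import numpy as np

def rm_anova_f(data):
    """F statistic for a one-way repeated-measures ANOVA.
    data: subjects x conditions array of one dependent variable."""
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()    # between conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_err = ((data - grand) ** 2).sum() - ss_cond - ss_subj  # residual
    return (ss_cond / (k - 1)) / (ss_err / ((n - 1) * (k - 1)))
```

The resulting F value would then be compared against the F(k-1, (n-1)(k-1)) critical value at p = .05, with post hoc contrasts only when the omnibus test is significant.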

Depth Video Post-processing for Immersive Teleconference (원격 영상회의 시스템을 위한 깊이 영상 후처리 기술)

  • Lee, Sang-Beom;Yang, Seung-Jun;Ho, Yo-Sung
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.37 no.6A
    • /
    • pp.497-502
    • /
    • 2012
  • In this paper, we present an immersive videoconferencing system that enables gaze correction between users in an internet protocol TV (IPTV) environment. The proposed system synthesizes gaze-corrected images using depth estimation and virtual view synthesis, two of the most important techniques of 3D video systems. Conventional processing, however, causes several problems, especially temporal inconsistency of the depth video, which leads to flickering artifacts that discomfort viewers. Therefore, to reduce the temporal inconsistency, we exploit a joint bilateral filter extended to the temporal domain, and additionally apply an outlier reduction operation in the temporal domain. Experimental results verify that the proposed system generates natural gaze-corrected images and realizes immersive videoconferencing.
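
A joint bilateral filter extended to the temporal domain, as described above, weights neighboring depth samples by spatio-temporal distance and by color similarity in the guide video. This per-pixel sketch shows the idea; the window radius and sigma values are illustrative assumptions.

```python
import numpy as np

def temporal_joint_bilateral(depths, colors, t, y, x, r=2, sc=10.0, ss=2.0):
    """Filter the depth at (t, y, x) using the color video as guide,
    pooling samples from a spatio-temporal window.
    depths, colors: (frames, height, width) arrays (grayscale guide)."""
    T, H, W = depths.shape
    num = den = 0.0
    for dt in range(-r, r + 1):
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                tt, yy, xx = t + dt, y + dy, x + dx
                if not (0 <= tt < T and 0 <= yy < H and 0 <= xx < W):
                    continue
                # spatial/temporal closeness weight
                w_s = np.exp(-(dy * dy + dx * dx + dt * dt) / (2 * ss * ss))
                # color-similarity weight from the guide video
                dc = colors[tt, yy, xx] - colors[t, y, x]
                w_c = np.exp(-(dc * dc) / (2 * sc * sc))
                num += w_s * w_c * depths[tt, yy, xx]
                den += w_s * w_c
    return num / den
```

A temporally inconsistent depth spike is pulled toward the neighborhood consensus, which is precisely what suppresses the flickering artifacts mentioned in the abstract.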

Gaze Detection System using Real-time Active Vision Camera (실시간 능동 비전 카메라를 이용한 시선 위치 추적 시스템)

  • 박강령
    • Journal of KIISE:Software and Applications
    • /
    • v.30 no.12
    • /
    • pp.1228-1238
    • /
    • 2003
  • This paper presents a new and practical computer-vision method for detecting the monitor position where the user is looking. In general, the user moves both the face and the eyes to gaze at a certain monitor position. Previous research uses only one wide-view camera that captures the user's whole face; in that case the image resolution is too low and the fine movements of the user's eyes cannot be detected exactly. We therefore implement the gaze detection system with a dual-camera setup (a wide-view and a narrow-view camera). To locate the user's eye position accurately, the narrow-view camera performs auto focusing and auto panning/tilting based on the 3D facial feature positions detected by the wide-view camera. In addition, we use dual IR-LED illuminators to detect facial features, especially eye features. Experimental results show that the system runs in real time and that the accuracy between computed and actual gaze positions is about 3.44 cm RMS error.
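
Aiming a pan/tilt camera at a 3D feature position reduces to two angles. This is a minimal geometric sketch under an assumed coordinate convention (x right, y down, z forward), not the paper's actual control law.

```python
import math

def pan_tilt_for(target_3d):
    """Pan/tilt angles (degrees) that aim a narrow-view camera at a 3D
    feature position expressed in the camera platform's coordinate frame."""
    x, y, z = target_3d
    pan = math.degrees(math.atan2(x, z))                   # left/right rotation
    tilt = math.degrees(math.atan2(y, math.hypot(x, z)))   # up/down rotation
    return pan, tilt
```

In a real system these angles would be converted to motor steps and corrected by the camera-to-platform calibration.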

MyWorkspace: VR Platform with an Immersive User Interface (MyWorkspace: 몰입형 사용자 인터페이스를 이용한 가상현실 플랫폼)

  • Yoon, Jong-Won;Hong, Jin-Hyuk;Cho, Sung-Bae
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2009.02a
    • /
    • pp.52-55
    • /
    • 2009
  • With the recent development of virtual reality, user interfaces for immersive interaction have been actively investigated. Immersive user interfaces improve the efficiency and capability of information processing in virtual environments that provide various services, and enable effective interaction in ubiquitous and mobile computing. In this paper, we propose a virtual reality platform, MyWorkspace, which renders a 3D virtual workspace through an immersive user interface. We develop an interface that integrates an optical see-through head-mounted display, a Wii remote controller, and a helmet with infrared LEDs, and estimates the user's gaze direction as horizontal and vertical angles based on a model of head movements. MyWorkspace expands the current monitor-based 2D workspace into a layered 3D workspace and renders the part of the 3D virtual workspace corresponding to the gaze direction. The user can arrange various tasks in the virtual workspace and switch between them by moving the head. We also verify the performance of the immersive user interface, as well as its usefulness, with a usability test.
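
Switching tasks by head movement amounts to mapping the estimated yaw/pitch angles to a cell of the layered workspace. The grid size and fields of view in this sketch are invented for illustration; the paper does not specify them.

```python
def select_task(yaw_deg, pitch_deg, cols=3, rows=2, fov_h=90.0, fov_v=60.0):
    """Map head yaw/pitch (degrees) to the (row, column) of the focused
    cell in a rows x cols layered workspace grid."""
    # normalize each angle into [0, 1) across the usable field of view
    u = min(max((yaw_deg + fov_h / 2) / fov_h, 0.0), 1.0 - 1e-9)
    v = min(max((pitch_deg + fov_v / 2) / fov_v, 0.0), 1.0 - 1e-9)
    return int(v * rows), int(u * cols)
```

Looking straight ahead selects the central cell; turning the head far enough left and up selects the top-left one.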


Eye Contact System Using Depth Fusion for Immersive Videoconferencing (실감형 화상 회의를 위해 깊이정보 혼합을 사용한 시선 맞춤 시스템)

  • Jang, Woo-Seok;Lee, Mi Suk;Ho, Yo-Sung
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.52 no.7
    • /
    • pp.93-99
    • /
    • 2015
  • In this paper, we propose a gaze correction method for realistic video teleconferencing. Typically, cameras used in teleconferencing are installed at the side of the display monitor rather than at its center, which makes it difficult for users to make eye contact; eye contact, however, is essential for immersive videoconferencing. In the proposed method, we use a stereo camera and a depth camera to correct the gaze. The depth camera is a Kinect, which is relatively inexpensive and estimates depth information efficiently, but it has some inherent disadvantages. Therefore, we fuse the Kinect depth with the stereo camera to compensate for those disadvantages. Subsequently, the gaze-corrected image is produced by view synthesis, performed as 3D warping according to the depth information. Experimental results verify that the proposed system is effective in generating natural gaze-corrected images.
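
The 3D warping step can be sketched per pixel: back-project with the depth, transform into the virtual (gaze-corrected) view, and re-project. The intrinsic matrix `K` and pose `[R|t]` here are placeholders, not calibration values from the paper.

```python
import numpy as np

def warp_pixel(u, v, depth, K, R, t):
    """Warp pixel (u, v) with known depth from the source view into a
    virtual view related by rotation R and translation t."""
    p = np.linalg.inv(K) @ np.array([u, v, 1.0]) * depth  # 3D point, source view
    q = R @ p + t                                         # 3D point, virtual view
    uvw = K @ q                                           # re-project
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

With an identity pose the pixel maps to itself, which is a handy sanity check; a real renderer would also resolve occlusions and fill disocclusion holes.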

Active eye system for tracking a moving object (이동물체 추적을 위한 능동시각 시스템 구축)

  • 백문홍
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 1996.10b
    • /
    • pp.257-259
    • /
    • 1996
  • This paper presents an active eye system for tracking a moving object in 3D space. A prototype able to track a moving object is designed and implemented. The mechanical system enables control of a platform carrying a binocular camera pair, as well as control of each camera's vergence angle via step motors; each camera has two degrees of freedom. The image features of the object are extracted from a complicated environment using zero-disparity filtering (ZDF). From the centroid of the image features, the gaze point on the object is calculated, and the vergence angle of each camera is controlled by the step motors. The proposed method is implemented on the prototype and achieves robust operation with fast computation.
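
Given a fixation point, the vergence angle of each camera follows from simple geometry; the baseline length below is an assumed value, and the cameras are taken to sit symmetrically about the platform's x axis.

```python
import math

def vergence_angles(gaze_point, baseline=0.1):
    """Angles (degrees) by which each camera of a binocular head must
    rotate about its vertical axis to fixate a 3D point; cameras sit at
    x = -baseline/2 (left) and x = +baseline/2 (right)."""
    x, y, z = gaze_point
    left = math.degrees(math.atan2(x + baseline / 2, z))
    right = math.degrees(math.atan2(x - baseline / 2, z))
    return left, right
```

For a point on the midline the two angles are equal and opposite, and they grow as the object approaches, which is what the step motors track.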

  • PDF

3-D Facial Motion Estimation using Extended Kalman Filter (확장 칼만 필터를 이용한 얼굴의 3차원 움직임량 추정)

  • 한승철;박강령;김재희
    • Proceedings of the IEEK Conference
    • /
    • 1998.10a
    • /
    • pp.883-886
    • /
    • 1998
  • In order to detect the user's gaze position on a monitor by computer vision, accurate estimates of the 3D positions and 3D motion of facial features are required. In this paper, we apply an extended Kalman filter (EKF) to estimate 3D motion, assuming the motion is "smooth" in the sense of a constant-velocity translational and rotational model. Rotational motion is defined about the origin of a face-centered coordinate system, while translational motion is defined about that of a camera-centered coordinate system. For the experiments, we use 3D facial motion data generated by computer simulation. Experimental results show that the EKF estimates closely match the simulated data.
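
As a simplified stand-in for the paper's EKF, a linear Kalman filter with a constant-velocity state on a single coordinate illustrates the predict/update cycle; the process and measurement noise levels are illustrative.

```python
import numpy as np

def kf_predict(x, P, dt, q=1e-3):
    """Constant-velocity prediction: state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition
    return F @ x, F @ P @ F.T + q * np.eye(2)

def kf_update(x, P, z, r=1e-2):
    """Correct the prediction with a position measurement z."""
    H = np.array([[1.0, 0.0]])             # we observe position only
    S = H @ P @ H.T + r                    # innovation covariance
    K = P @ H.T / S                        # Kalman gain
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

The full 3D rigid-motion case replaces this 2-state model with translational and rotational states and linearizes the projection, which is where the "extended" part of the EKF comes in.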


Gaze Tracking System Using Feature Points of Pupil and Glints Center (동공과 글린트의 특징점 관계를 이용한 시선 추적 시스템)

  • Park Jin-Woo;Kwon Yong-Moo;Sohn Kwang-Hoon
    • Journal of Broadcast Engineering
    • /
    • v.11 no.1 s.30
    • /
    • pp.80-90
    • /
    • 2006
  • A simple 2D gaze tracking method using a single camera and the Purkinje image is proposed. The method employs a single camera with an infrared filter to capture one eye, and two infrared light sources that create reflection points used to estimate the corresponding gaze point on the screen. The camera, the infrared light sources, and the user's head may all move slightly, yielding a simple and flexible system that requires neither inconvenient fixed equipment nor the assumption of a fixed head. The system also includes a simple and accurate personal calibration procedure: before using the system, each user only has to stare at two target points for a few seconds so that the system can initialize the user-specific factors of the estimation algorithm. The proposed system runs in real time at over 10 frames per second at XGA (1024×768) resolution. Test results for nine targets viewed by three subjects show an average estimation error of less than 1 degree.
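
The two-point personal calibration can be sketched as a per-axis linear map from pupil-minus-glint vectors to screen coordinates; the vectors and target pixels below are made-up examples, not the system's actual calibration data.

```python
def fit_axis(v1, v2, t1, t2):
    """Fit t = a*v + b from two calibration samples on one axis."""
    a = (t2 - t1) / (v2 - v1)
    return a, t1 - a * v1

def make_gaze_mapper(pg1, s1, pg2, s2):
    """Build a screen-point estimator from two calibration fixations.
    pg1, pg2: pupil-minus-glint vectors; s1, s2: known target pixels."""
    ax, bx = fit_axis(pg1[0], pg2[0], s1[0], s2[0])
    ay, by = fit_axis(pg1[1], pg2[1], s1[1], s2[1])
    return lambda pg: (ax * pg[0] + bx, ay * pg[1] + by)
```

A production system would add more calibration points or a polynomial map to absorb nonlinearity, but the two-fixation version already captures the idea of initializing user-specific factors.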