• Title/Summary/Keyword: Eye direction

Fish-eye camera calibration and artificial landmarks detection for the self-charging of a mobile robot (이동로봇의 자동충전을 위한 어안렌즈 카메라의 보정 및 인공표지의 검출)

  • Kwon, Oh-Sang
    • Journal of Sensor Science and Technology
    • /
    • v.14 no.4
    • /
    • pp.278-285
    • /
    • 2005
  • This paper describes techniques for camera calibration and artificial-landmark detection for the automatic charging of a mobile robot equipped with a fish-eye camera facing its direction of operation for movement or surveillance purposes. For identification against the surrounding environment, three landmarks fitted with infrared LEDs were installed at the charging station. When the robot reaches a certain point, a signal is sent to activate the LEDs, which allows the robot to easily detect the landmarks with its vision camera. To eliminate the effect of outside light interference during this process, a difference image was generated by comparing two images taken with the LEDs on and off, respectively. A fish-eye lens was used for the robot's vision camera, but the wide-angle lens caused significant image distortion. The radial lens distortion was corrected after a linear perspective projection transformation based on the pin-hole model. In experiments, the designed system showed a sensing accuracy of ±10 mm in position and ±1° in orientation at a distance of 550 mm.
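
The LED on/off difference-image step can be sketched in Python; the array shapes, threshold, and single-blob centroid below are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def detect_led_landmarks(img_on, img_off, threshold=50):
    """Locate bright LED landmarks by differencing two grayscale frames.

    img_on / img_off: uint8 frames captured with the infrared LEDs switched
    on and off. Steady ambient light cancels in the difference, so only the
    activated LEDs produce strong positive changes.
    """
    diff = img_on.astype(np.int16) - img_off.astype(np.int16)
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return []
    # Crude single-blob centroid; a real system would label connected
    # components to recover one centroid per landmark.
    return [(float(xs.mean()), float(ys.mean()))]
```

With three LEDs, a practical implementation would label connected components and return one centroid per landmark before solving for the robot's position and orientation.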

Effect of the Cylindrical Fly-eye Lens's Precision on Long-axis Uniformity and Steepness of a Line Beam (실린더 잠자리 눈 렌즈의 정밀도가 선형 빔의 장축 균일도 및 경사도에 미치는 영향)

  • Lee, Seungmin;Song, Hyunsu;Woo, Hee;Kim, Daeyong;Jung, Jinho
    • Korean Journal of Optics and Photonics
    • /
    • v.32 no.6
    • /
    • pp.296-305
    • /
    • 2021
  • This paper reports a study on the long-axis performance of the line-beam optics used in laser lift-off equipment for the OLED manufacturing process. The centration errors of a cylindrical lens are classified and defined in seven categories, and measurement methods are presented. The cylindrical fly-eye lens is analyzed theoretically and experimentally to find the influence of surface-shape error and decentering error on the long-axis performance of the line-beam optics. A future research direction for improving the long-axis performance is also presented.

3D First Person Shooting Game by Using Eye Gaze Tracking (눈동자 시선 추적에 의한 3차원 1인칭 슈팅 게임)

  • Lee, Eui-Chul;Park, Kang-Ryoung
    • The KIPS Transactions:PartB
    • /
    • v.12B no.4 s.100
    • /
    • pp.465-472
    • /
    • 2005
  • In this paper, we propose a method for controlling the gaze direction of a 3D FPS game character by eye-gaze detection from successive images captured by a USB camera attached beneath an HMD. The proposed method consists of three parts. First, we detect the user's pupil center with a real-time image-processing algorithm on the successive input images. Second, in a calibration stage, as the user gazes at points on the monitor plane, the geometric relationship between the gaze position on the monitor and the detected pupil center is determined. Third, the final gaze position on the HMD monitor is tracked, and the 3D view in the game is controlled by the gaze position based on the calibration information. Experimental results show that our method can be used by handicapped game players who cannot use their hands. It can also increase interest and immersion by synchronizing the gaze direction of the game player with the view direction of the game character.
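
The calibration part, relating detected pupil centers to monitor coordinates, can be sketched as a least-squares fit; the affine model and calibration-target layout are assumptions, since the abstract only states that a geometric relationship is determined.

```python
import numpy as np

def fit_gaze_mapping(pupil_pts, screen_pts):
    """Fit an affine map from pupil-center to screen coordinates.

    pupil_pts, screen_pts: (N, 2) arrays collected while the user fixates
    known calibration targets on the monitor. Returns a (3, 2) matrix A
    such that [px, py, 1] @ A approximates [sx, sy].
    """
    P = np.hstack([pupil_pts, np.ones((len(pupil_pts), 1))])
    A, *_ = np.linalg.lstsq(P, screen_pts, rcond=None)
    return A

def gaze_to_screen(A, pupil_xy):
    """Map one detected pupil center to a screen position."""
    return np.array([pupil_xy[0], pupil_xy[1], 1.0]) @ A
```

After calibration, each new pupil-center detection is pushed through `gaze_to_screen` to steer the game view.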

Eye Detection Using Texture Filters (질감 필터를 이용한 눈 검출)

  • Park, Chan-Woo;Kim, Yong-Min;Park, Ki-Tae;Moon, Young-Shik
    • Journal of the Institute of Electronics Engineers of Korea CI
    • /
    • v.46 no.6
    • /
    • pp.70-78
    • /
    • 2009
  • In this paper, we propose a novel method for eye detection using two texture filters that consider the textural and structural characteristics of eye regions. Human eyes have two characteristics: 1) the eyes are horizontally elongated, and 2) the pupils are circular. Considering these two characteristics, two texture filters are utilized for eye detection: a Gabor filter for detecting eye shapes in the horizontal direction, and the ART descriptor for detecting pupils of circular shape. To detect eye regions effectively, the proposed method consists of four steps. The first step extracts facial regions using the AdaBoost method. The second step normalizes the illumination using local information. The third step estimates candidate eye regions by merging the results of the two texture filters. The final step locates the exact eye regions using geometric information of the face. In experiments, the performance of the proposed method improved by 2.9-4.4% compared to existing methods.
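
A Gabor kernel tuned to horizontally elongated structures, in the spirit of the first texture filter, can be sketched as follows; the kernel size and all parameter values are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, wavelength=4.0, theta=0.0):
    """Real part of a Gabor kernel.

    With theta = 0 the carrier oscillates along y, so the kernel responds
    strongly to horizontal stripe-like structures such as the eye region.
    All parameter values here are illustrative assumptions.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotated coordinates
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + yr ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * yr / wavelength)
    return envelope * carrier
```

Convolving the normalized face image with this kernel gives high responses over horizontally elongated eye shapes, which are then merged with the circular-shape responses of the ART descriptor.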

A Study for Rationalization of Lifting Lug Design of Ship Block (선박블록 탑재용 러그구조의 설계합리화를 위한 연구)

  • 함주혁
    • Journal of Ocean Engineering and Technology
    • /
    • v.11 no.4
    • /
    • pp.249-261
    • /
    • 1997
  • A basic study on lifting lug design has been performed through a rational and systematic process. In order to evaluate the proper design-load distribution around the lug eye, the contact force between the lifting lug and the shackle pin is investigated using nonlinear parametric analysis idealized with gap-element models. Gap-element modeling and the nonlinear analysis procedure are illustrated and discussed based on MSC/NASTRAN. Analysis and design guides are suggested through consideration of several important effects, such as the stress distribution pattern, the circumferential contact-force distribution along the lug eye face, the load share between the lug main plate and the doubler, the effect of loading direction, the relation between applied force and deflection, and the size effect of the shackle pin radius. Additionally, optimum design studies are performed, and general trends with respect to the design parameters are suggested.


The Vectorization of EOG for Man-Machine Interfacing (Man-Machine Interfacing을 위한 EOG의 벡터화)

  • Park, Jong-Hwan;Cheon, Woo-Young;Park, Hyung-Jun
    • Proceedings of the KIEE Conference
    • /
    • 1998.07b
    • /
    • pp.604-606
    • /
    • 1998
  • As a basic study of man-machine interfacing techniques, this paper presents the vectorization of the EOG (electrooculogram) generated by eye movement. The EOG is the electric potential difference between the positive potential of the cornea and the negative potential of the retina; its magnitude and polarity depend on the direction of eye movement and the gaze angle. To vectorize the EOG, signals were measured for vertical and horizontal eye movements. This vectorization of the EOG is expected to aid man-machine interfacing techniques and the development of other useful equipment.

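
The vectorization idea, combining the horizontal and vertical EOG channels into a single gaze vector, might look like the sketch below; the linear gain and its numeric value are assumptions for illustration, not values from the paper.

```python
import math

def eog_to_vector(h_uv, v_uv, gain_deg_per_uv=0.05):
    """Combine horizontal and vertical EOG amplitudes into a gaze vector.

    EOG amplitude grows roughly linearly with gaze angle over a limited
    range, and its polarity encodes direction; the gain here (degrees per
    microvolt) is an assumed illustrative constant.
    Returns (angle in degrees from the horizontal axis, magnitude in degrees).
    """
    hx = h_uv * gain_deg_per_uv   # horizontal channel -> degrees
    vy = v_uv * gain_deg_per_uv   # vertical channel -> degrees
    return math.degrees(math.atan2(vy, hx)), math.hypot(hx, vy)
```

A positive horizontal amplitude with zero vertical amplitude then maps to a purely rightward gaze vector, and the two channels together cover arbitrary gaze directions.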

Eye Movements Produced by the Inferior Oblique Muscle in the Rabbits (가토(家兎)에 있어서 하사근(下斜筋)의 작용(作用)으로 초래(招來)되는 안구운동(眼球運動))

  • Kim, Jae-Hyub
    • The Korean Journal of Physiology
    • /
    • v.11 no.2
    • /
    • pp.73-78
    • /
    • 1977
  • In urethane-anaesthetized rabbits, reflex contraction of the inferior oblique muscle of one eye was evoked by stimulation of the relevant vestibular canal nerve. The eye movement evoked by the inferior oblique muscle contraction was carefully observed with the naked eye and recorded by electrooculographic and electronystagmographic methods. The following results were obtained. 1) Contraction of the inferior oblique muscle evoked by canal-nerve excitation produced excycloduction of the eyeball associated with depression (downward rotation) instead of elevation. 2) Such depression of the eyeball was demonstrated even after resection of the inferior oblique muscle. This experimental evidence indicates that the rotatory action (the secondary action) of the inferior oblique muscle in rabbits differs in direction from that reported in binocular animals such as the cat, dog, and monkey.


Evaluation of Postural Control by Off-vertical Axis Rotation (탈수직축 회전자극을 이용한 자세조절기능의 평가)

  • 김규겸;이태호;김주환;고종선;박병림
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 1999.11a
    • /
    • pp.111-114
    • /
    • 1999
  • An off-vertical axis rotator was developed to differentiate the functions of the canals and otoliths in the vestibular system and to evaluate subjective symptoms during postural change. Eye movement induced by various types of rotation was measured in normal subjects. Nystagmus with a fast component corresponding to the direction of rotation was produced by sinusoidal earth-vertical axis rotation, and the gain of eye movement in the vestibulo-ocular reflex (VOR) was lower than in the visual vestibulo-ocular reflex (VVOR) and higher than in the vestibulo-ocular reflex with visual fixation (VFX). The degree of dizziness was proportional to the gain. Off-vertical axis rotation produced more severe dizziness than earth-vertical axis rotation. These results suggest that stimulation of the otoliths should be minimized to provide a stable and pleasant condition during work and travel.

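
The gain compared across the VOR, VVOR, and VFX conditions is conventionally the ratio of peak eye velocity to peak head (chair) velocity; the traces and units in this sketch are assumptions for illustration.

```python
import numpy as np

def vor_gain(eye_velocity, head_velocity):
    """Gain of the vestibulo-ocular reflex during sinusoidal rotation.

    Defined as the ratio of peak slow-phase eye velocity to peak head
    (chair) velocity; a fully compensatory VOR gives a gain near 1.
    Both arguments are assumed to be velocity traces in deg/s sampled
    at the same rate.
    """
    eye = np.abs(np.asarray(eye_velocity, dtype=float))
    head = np.abs(np.asarray(head_velocity, dtype=float))
    return float(eye.max() / head.max())
```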

An Implementation of Gaze Recognition System Based on SVM (SVM 기반의 시선 인식 시스템의 구현)

  • Lee, Kue-Bum;Kim, Dong-Ju;Hong, Kwang-Seok
    • The KIPS Transactions:PartB
    • /
    • v.17B no.1
    • /
    • pp.1-8
    • /
    • 2010
  • Research on gaze recognition, which finds the location a user is currently gazing at, has developed rapidly and has many applications. Most existing gaze recognition approaches have problems because they use expensive equipment such as infrared (IR) LEDs, IR cameras, and head-mounted devices. This study proposes and implements a gaze recognition system based on an SVM using a single PC web camera. The proposed system divides 36 gaze locations into 9 and 4 regions to recognize the user's gaze in 9 and 4 directions. It also applies an image-filtering method based on difference-image entropy to improve recognition performance. To evaluate the system, experiments compared the proposed difference-image-entropy-based system with a gaze recognition system using eye corners and eye centers and with a PCA-based system. In the experiments, the recognition rate of the proposed SVM-based system was 94.42% for 4 directions and 81.33% for 9 directions, and 95.37% and 82.25%, respectively, when the difference-image-entropy filtering was applied. These results show higher performance than existing gaze recognition systems.
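
The difference-image entropy used for frame filtering can be sketched as follows; the histogram binning and how the entropy value is thresholded are illustrative assumptions.

```python
import numpy as np

def difference_image_entropy(frame_a, frame_b, bins=256):
    """Shannon entropy (bits) of the difference image of two uint8 frames.

    Near-zero entropy means the frames are almost identical, so a frame
    can be filtered out before SVM classification; the binning here is an
    illustrative assumption.
    """
    diff = frame_a.astype(np.int16) - frame_b.astype(np.int16)  # -255..255
    hist, _ = np.histogram(diff, bins=bins, range=(-255, 256))
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins; 0 * log 0 := 0
    return float(-(p * np.log2(p)).sum())
```

Identical frames give zero entropy, while eye or head motion spreads the difference histogram and raises the entropy, flagging the frame as informative.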

Omni-directional Vision SLAM using a Motion Estimation Method based on Fisheye Image (어안 이미지 기반의 움직임 추정 기법을 이용한 전방향 영상 SLAM)

  • Choi, Yun Won;Choi, Jeong Won;Dai, Yanyan;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.8
    • /
    • pp.868-874
    • /
    • 2014
  • This paper proposes a novel mapping algorithm for omni-directional vision SLAM based on extracting obstacle features using Lucas-Kanade optical flow (LKOF) motion detection and images obtained through fish-eye lenses mounted on a robot. Omni-directional image sensors have distortion problems because they use a fish-eye lens or mirror, but they permit real-time image processing for mobile robots because all information around the robot is captured at once. Previous omni-directional vision SLAM research used feature points in fully corrected fish-eye images, whereas the proposed algorithm corrects only the feature points of obstacles, which yields faster processing than previous systems. The core of the proposed algorithm may be summarized as follows. First, we capture instantaneous 360° panoramic images around the robot through downward-facing fish-eye lenses. Second, we remove the feature points of the floor surface using a histogram filter and label the extracted obstacle candidates. Third, we estimate the locations of obstacles from motion vectors using LKOF. Finally, we estimate the robot position using an extended Kalman filter based on the obstacle positions obtained by LKOF and create a map. We confirm the reliability of the mapping algorithm by comparing maps obtained with the proposed algorithm against real maps.
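
The per-window least-squares step at the heart of Lucas-Kanade optical flow can be sketched in a minimal single-level form; the window size and gradient scheme are assumptions for illustration.

```python
import numpy as np

def lucas_kanade_window(prev, curr, x, y, half=2):
    """One Lucas-Kanade least-squares step for a single feature point.

    Solves Ix*u + Iy*v = -It over a small window around (x, y), assuming
    brightness constancy. This is a single-level sketch; practical
    pipelines add image pyramids and iterative refinement.
    """
    win = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    Iy, Ix = np.gradient(prev.astype(float))        # spatial gradients
    It = curr.astype(float) - prev.astype(float)    # temporal gradient
    A = np.stack([Ix[win].ravel(), Iy[win].ravel()], axis=1)
    b = -It[win].ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)  # least-squares flow
    return float(u), float(v)
```

Production systems typically use a pyramidal, iterative variant such as OpenCV's `calcOpticalFlowPyrLK` to handle larger displacements; the estimated obstacle motion vectors would then feed the extended Kalman filter update.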