• Title/Summary/Keyword: Vision-Based Tracking


Recognition Performance of Vestibular-Ocular Reflex Based Vision Tracking System for Mobile Robot (이동 로봇을 위한 전정안반사 기반 비젼 추적 시스템의 인식 성능 평가)

  • Park, Jae-Hong;Bhan, Wook;Choi, Tae-Young;Kwon, Hyun-Il;Cho, Dong-Il;Kim, Kwang-Soo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.15 no.5
    • /
    • pp.496-504
    • /
    • 2009
  • This paper presents the recognition performance of a VOR (Vestibular-Ocular Reflex) based vision tracking system for mobile robots. The VOR is a reflex eye movement that, during head movements, produces an eye movement in the direction opposite to the head movement, thus keeping the image of an object of interest centered on the retina. We applied this physiological concept to a vision tracking system to achieve high recognition performance in mobile environments. The proposed method was implemented in a vision tracking system consisting of a motion sensor module and an actuation module with a vision sensor. We tested the developed system on an x/y stage and a rate table for linear and angular motion, respectively. The experimental results show that the recognition rates of the VOR-based method are three times higher than those of a conventional non-VOR vision system, mainly because the VOR-based vision tracking system keeps the line of sight fixed on the object, reducing image blur in dynamic environments. This suggests that the VOR concept proposed in this paper can be applied efficiently to vision tracking systems for mobile robots.
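
The compensation principle described above can be sketched in a few lines (a minimal model, not the authors' implementation): the camera actuator is driven with the negative of the measured head rotation, so the line of sight stays on the target.

```python
# Minimal sketch of VOR-style gaze stabilization: the camera counter-rotates
# against the head motion measured by the motion sensor module, so the
# combined gaze angle (head + camera) stays fixed on the object.

def vor_compensate(head_rates, gain=1.0, dt=0.01):
    """head_rates: head angular rates (rad/s) from the motion sensor.
    Returns the residual gaze angle (head + camera) at each step."""
    head_angle = 0.0
    cam_angle = 0.0
    residuals = []
    for w in head_rates:
        head_angle += w * dt          # head rotates
        cam_angle += -gain * w * dt   # camera counter-rotates (VOR)
        residuals.append(head_angle + cam_angle)
    return residuals

# With unity gain the gaze error stays at zero despite constant head motion.
errors = vor_compensate([0.5] * 100)
print(max(abs(e) for e in errors))    # → 0.0
```

A gain below 1.0 would leave a residual drift proportional to the head motion, which is the blur source the non-VOR baseline suffers from.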

Hybrid Inertial and Vision-Based Tracking for VR applications (가상 현실 어플리케이션을 위한 관성과 시각기반 하이브리드 트래킹)

  • Gu, Jae-Pil;An, Sang-Cheol;Kim, Hyeong-Gon;Kim, Ik-Jae;Gu, Yeol-Hoe
    • Proceedings of the KIEE Conference
    • /
    • 2003.11b
    • /
    • pp.103-106
    • /
    • 2003
  • In this paper, we present a hybrid inertial and vision-based tracking system for VR applications. One of the most important aspects of VR (Virtual Reality) is providing a correspondence between the physical and virtual worlds. Accurate, real-time tracking of an object's position and orientation is therefore a prerequisite for many applications in virtual environments. Pure vision-based tracking has low jitter and high accuracy but cannot guarantee real-time pose recovery under all circumstances. Pure inertial tracking has high update rates and full 6DOF recovery but lacks long-term stability due to sensor noise. To overcome these individual drawbacks and build a better tracking system, we introduce the fusion of vision-based and inertial tracking. Sensor fusion makes the proposed tracking system robust, fast, and accurate, with low jitter and noise. Hybrid tracking is implemented with a Kalman filter that operates in a predictor-corrector manner. A Bluetooth serial communication module gives the system full mobility and makes it affordable, lightweight, energy-efficient, and practical. Full 6DOF recovery and the full mobility of the proposed system enable the user to interact with mobile devices such as PDAs and provide a natural interface.

  • PDF
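
The predictor-corrector fusion idea can be illustrated with a scalar Kalman filter (a simplified stand-in, not the paper's 6DOF filter): the inertial rate drives the high-rate prediction, and a vision position measurement corrects the accumulated drift whenever one is available.

```python
# Scalar predictor-corrector Kalman filter: inertial data predicts at every
# step; sparse, drift-free vision fixes correct the estimate.

def fuse(inertial_rates, vision_meas, dt=0.01, q=1e-4, r=1e-2):
    """vision_meas maps step index -> measured position (may be sparse)."""
    x, p = 0.0, 1.0                  # state (position) and covariance
    out = []
    for k, w in enumerate(inertial_rates):
        x += w * dt                  # predict: integrate inertial rate
        p += q                       # process noise grows covariance
        z = vision_meas.get(k)       # vision fix, if available this step
        if z is not None:
            kgain = p / (p + r)      # correct: standard Kalman update
            x += kgain * (z - x)
            p *= (1.0 - kgain)
        out.append(x)
    return out

# A biased inertial rate (true position is 0.0) drifts without correction;
# a vision fix every 20 steps pulls the estimate back toward the truth.
est = fuse([0.2] * 200, {k: 0.0 for k in range(0, 200, 20)})
drift_only = fuse([0.2] * 200, {})
print(abs(est[-1]), abs(drift_only[-1]))
```

This is exactly the asymmetry the abstract exploits: the inertial path supplies the update rate, the vision path supplies the long-term stability.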

OnBoard Vision Based Object Tracking Control Stabilization Using PID Controller

  • Mariappan, Vinayagam;Lee, Minwoo;Cho, Juphil;Cha, Jaesang
    • International Journal of Advanced Culture Technology
    • /
    • v.4 no.4
    • /
    • pp.81-86
    • /
    • 2016
  • In this paper, we propose a simple and effective vision-based tracking controller design for autonomous object tracking with a multicopter. A multicopter-based automatic tracking system is usually unstable when the object moves: the tracking process cannot determine the object's position exactly, so when the object moves suddenly, the system loses it and must search for the object again from its home position. In this paper, PID control is used to improve the stability of the tracking system, so that object tracking becomes more stable than before, as can be seen from the tracking error. A computer vision and control strategy is applied to detect a diverse set of moving objects on a Raspberry Pi based platform, and a software-defined PID controller is designed to control the yaw, throttle, and pitch of the multicopter in real time. Finally, based on a series of experimental results, we conclude that PID control makes the tracking system more stable in real time.
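
The control law has the standard PID form; the sketch below (illustrative gains and a toy plant, not the paper's tuning) uses the pixel offset of the tracked object from the image centre as the error and outputs a yaw correction.

```python
# Discrete PID controller driving a tracked object's pixel offset to zero.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy plant: the yaw command directly reduces the pixel offset each frame.
pid = PID(kp=0.5, ki=0.01, kd=0.1, dt=1.0)
offset = 120.0                       # object starts 120 px off-centre
for _ in range(300):
    offset -= pid.update(offset)
print(round(offset, 2))
```

The same structure, with separate gain sets, would be replicated for the throttle and pitch channels mentioned in the abstract.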

A Hybrid Solar Tracking System using Weather Condition Estimates with a Vision Camera and GPS (날씨인식 결과를 이용한 GPS 와 비전센서기반 하이브리드 방식의 태양추적 시스템 개발)

  • Yoo, Jeongjae;Kang, Yeonsik
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.5
    • /
    • pp.557-562
    • /
    • 2014
  • It is well known that solar tracking systems can significantly increase the efficiency of existing solar panels. In this paper, a hybrid solar tracking system is developed using both astronomical estimates from a GPS receiver and the image processing results of a camera vision system. A decision-making process is also proposed to determine the current weather conditions from camera images. Based on the decision results, the proposed hybrid tracking system switches between two tracking control methods: one based on astronomical estimates of the current solar position, and the other based on the solar image processing result. The developed hybrid solar tracking system is implemented on an experimental platform, and the performance of the developed control methods is verified.
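
The switching logic reduces to a small decision rule; this sketch (thresholds and interfaces assumed, not from the paper) follows the camera-measured sun position on clear frames and falls back to the astronomical estimate on cloudy ones.

```python
# Hybrid solar tracking: weather estimate selects between vision-based and
# astronomical (GPS-derived) sun angles.

def track(frames, astro_angle, cloud_threshold=0.5):
    """frames: list of (cloudiness in [0, 1], camera-measured angle or None).
    Returns a (mode, commanded angle) tuple per frame."""
    commands = []
    for cloudiness, cam_angle in frames:
        if cloudiness < cloud_threshold and cam_angle is not None:
            commands.append(("vision", cam_angle))   # clear: trust the camera
        else:
            commands.append(("astro", astro_angle))  # cloudy: fall back to GPS
    return commands

cmds = track([(0.1, 41.8), (0.9, None), (0.3, 42.1)], astro_angle=42.0)
print(cmds)  # → [('vision', 41.8), ('astro', 42.0), ('vision', 42.1)]
```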

Vision-based Line Tracking and steering control of AGVs

  • Lee, Hyeon-Ho;Lee, Chang-Goo
    • Proceedings of the Institute of Control, Robotics and Systems Conference
    • /
    • 2001.10a
    • /
    • pp.180.4-180
    • /
    • 2001
  • This paper describes a vision-based line-tracking system for AGVs and a steering control scheme. To detect the guideline quickly and accurately, we use four line points which complement and predict each other. This low-cost line-tracking system runs efficiently on PC-based real-time vision processing. Steering control is realized through a controller that uses the guideline angle and the line-point error. The method is tested on a typical AGV with a single camera in a laboratory environment.

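
A steering law of the kind described can be sketched as follows (the gains and image geometry are illustrative assumptions): the guideline heading and the line-point lateral error are combined into a single steering command.

```python
# Steering from detected guideline points: angle term + lateral-offset term.

import math

def steering(line_points, k_angle=0.8, k_offset=0.01, image_center_x=160):
    """line_points: [(x, y), ...] along the detected guideline, ordered from
    near (bottom of image, large y) to far (top of image, small y)."""
    (x0, y0), (xn, yn) = line_points[0], line_points[-1]
    angle = math.atan2(xn - x0, y0 - yn)   # guideline heading vs. straight ahead
    offset = x0 - image_center_x           # lateral error at the near point (px)
    return k_angle * angle + k_offset * offset

# A vertical guideline through the image centre needs no steering correction.
print(round(steering([(160, 240), (160, 180), (160, 120), (160, 60)]), 3))  # → 0.0
```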

Entity Matching for Vision-Based Tracking of Construction Workers Using Epipolar Geometry (영상 내 건설인력 위치 추적을 위한 등극선 기하학 기반의 개체 매칭 기법)

  • Lee, Yong-Joo;Kim, Do-Wan;Park, Man-Woo
    • Journal of KIBIM
    • /
    • v.5 no.2
    • /
    • pp.46-54
    • /
    • 2015
  • Vision-based tracking has been proposed as a means to efficiently track a large number of construction resources operating on a congested site. To obtain the 3D coordinates of an object, it is necessary to employ stereo vision. Detecting and tracking multiple objects requires an entity matching process that finds corresponding pairs of detected entities across the two camera views. This paper proposes an efficient entity matching method for tracking construction workers. The proposed method uses epipolar geometry, which represents the relationship between the two fixed cameras: each pixel coordinate in one camera view is projected onto the other camera view as an epipolar line. The method finds the matching pair for a worker entity by comparing the proximity of all detected entities in the other view to the epipolar line. Experimental results demonstrate its suitability for automated entity matching in 3D vision-based tracking of construction workers.
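
The matching step can be sketched directly (the fundamental matrix and coordinates below are hypothetical): a detection in view 1 maps to the epipolar line l = F x in view 2, and the view-2 detection closest to that line is taken as the match.

```python
# Epipolar entity matching: project a view-1 point as a line into view 2,
# then pick the view-2 candidate nearest that line.

import numpy as np

def point_line_distance(line, pt):
    a, b, c = line
    x, y = pt
    return abs(a * x + b * y + c) / np.hypot(a, b)

def match(F, pt1, candidates2):
    x = np.array([pt1[0], pt1[1], 1.0])   # homogeneous view-1 coordinate
    line = F @ x                          # epipolar line in view 2
    dists = [point_line_distance(line, p) for p in candidates2]
    return int(np.argmin(dists))

# Toy rectified stereo pair: F maps a point to the horizontal line y' = y,
# so the matching entity is the one on (nearly) the same image row.
F = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
print(match(F, (100, 50), [(90, 200), (140, 52), (30, 120)]))  # → 1
```

In practice F would be estimated once from the two fixed camera poses, which is what makes this comparison cheap per frame.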

Particle Filter Based Feature Points Tracking for Vision Based Navigation System (영상기반항법을 위한 파티클 필터 기반의 특징점 추적 필터 설계)

  • Won, Dae-Hee;Sung, Sang-Kyung;Lee, Young-Jae
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.40 no.1
    • /
    • pp.35-42
    • /
    • 2012
  • In this study, a feature-point tracking algorithm using a particle filter is proposed for vision-based navigation systems. By applying a dynamic model of the feature point, the tracking performance is improved under high-dynamic conditions, where the conventional KLT (Kanade-Lucas-Tomasi) tracker cannot provide a solution. Furthermore, the particle filter is introduced to cope with the irregular characteristics of vision data. Post-processing of recorded vision data shows that the tracking performance of the suggested algorithm is more robust than that of KLT under high-dynamic conditions.
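
The role of the dynamic model can be shown with a toy particle filter for a single feature coordinate (a simplification of the paper's setup, with made-up noise parameters): particles are propagated with a constant-velocity model and reweighted by their distance to the measured position.

```python
# Toy particle filter tracking one feature coordinate: predict with a
# constant-velocity dynamic model, weight by measurement likelihood, resample.

import math, random
random.seed(0)

def particle_track(measurements, n=500, vel=2.0, motion_noise=1.0, meas_noise=2.0):
    particles = [random.gauss(measurements[0], 5.0) for _ in range(n)]
    estimates = []
    for z in measurements:
        # predict: dynamic model (constant velocity) plus diffusion
        particles = [p + vel + random.gauss(0, motion_noise) for p in particles]
        # weight: Gaussian likelihood of the measured position
        weights = [math.exp(-0.5 * ((p - z) / meas_noise) ** 2) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        estimates.append(sum(p * w for p, w in zip(particles, weights)))
        # resample to concentrate particles on likely positions
        particles = random.choices(particles, weights=weights, k=n)
    return estimates

# A feature moving 2 px/frame is tracked to within a few pixels throughout.
truth = [10.0 + 2.0 * k for k in range(30)]
est = particle_track(truth)
print(max(abs(e - t) for e, t in zip(est, truth)))
```

Dropping the `vel` term in the prediction is the analogue of losing the dynamic model: the particles then lag the feature in exactly the high-dynamic regime the abstract highlights.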

Image-based structural dynamic displacement measurement using different multi-object tracking algorithms

  • Ye, X.W.;Dong, C.Z.;Liu, T.
    • Smart Structures and Systems
    • /
    • v.17 no.6
    • /
    • pp.935-956
    • /
    • 2016
  • With the help of advanced image acquisition and processing technology, vision-based measurement methods have been broadly applied to the structural monitoring and condition identification of civil engineering structures. Many noncontact approaches enabled by different digital image processing algorithms have been developed to overcome the problems of conventional structural dynamic displacement measurement. This paper presents three image processing algorithms for structural dynamic displacement measurement: the grayscale pattern matching (GPM) algorithm, the color pattern matching (CPM) algorithm, and the mean shift tracking (MST) algorithm. A vision-based system programmed with the three algorithms is developed for multi-point structural dynamic displacement measurement. The dynamic displacement time histories of multiple vision points are measured simultaneously by the vision-based system and a magnetostrictive displacement sensor (MDS) during laboratory shaking table tests of a three-story steel frame model. The comparative analysis indicates that the developed vision-based system exhibits excellent performance in structural dynamic displacement measurement with all three image processing algorithms. Field experiments are also carried out on an arch bridge to measure displacement influence lines during loading tests and validate the effectiveness of the vision-based system.
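
The GPM idea can be reduced to one dimension for illustration (a sketch, not the paper's implementation): a template taken from the reference frame is correlated against the current frame, and the shift of the best normalized-correlation match gives the displacement.

```python
# 1-D grayscale pattern matching: find the template's best-matching shift in
# the current frame via normalized cross-correlation.

import numpy as np

def gpm_displacement(reference, current, t_start, t_len):
    template = reference[t_start:t_start + t_len]
    template = template - template.mean()
    best_shift, best_score = 0, -np.inf
    for s in range(len(current) - t_len + 1):
        window = current[s:s + t_len]
        window = window - window.mean()
        denom = np.linalg.norm(template) * np.linalg.norm(window)
        score = template @ window / denom if denom else -np.inf
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift - t_start

# A synthetic intensity profile shifted by 7 pixels is recovered exactly.
rng = np.random.default_rng(1)
ref = rng.random(200)
cur = np.roll(ref, 7)
print(gpm_displacement(ref, cur, t_start=50, t_len=40))  # → 7
```

The 2-D versions used in the paper work the same way, with sub-pixel interpolation around the correlation peak to resolve small structural displacements.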

Vision-Based Mobile Robot Navigation by Robust Path Line Tracking (시각을 이용한 이동 로봇의 강건한 경로선 추종 주행)

  • Son, Min-Hyuk;Do, Yong-Tae
    • Journal of Sensor Science and Technology
    • /
    • v.20 no.3
    • /
    • pp.178-186
    • /
    • 2011
  • Line tracking is a well-defined method of mobile robot navigation. It is simple in concept, technically easy to implement, and already employed in many industrial sites. Among the different line tracking methods, magnetic sensing is widely used in practice. In comparison, vision-based tracking is less popular, mainly due to its sensitivity to surrounding conditions such as brightness and floor characteristics, although vision is the most powerful robotic sensing modality. In this paper, a vision-based robust path-line detection technique is proposed for the navigation of a mobile robot under uncontrollable surrounding conditions. The proposed technique has four processing steps: color space transformation, pixel-level line sensing, block-level line sensing, and robot navigation control. It uses hue and saturation color values in the line sensing so as to be insensitive to brightness variation. Block-level line finding not only makes the technique immune to line-pixel detection errors but also simplifies the robot control. The proposed technique was tested on a real mobile robot and proved effective.
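
The hue/saturation test can be sketched per pixel (the thresholds below are illustrative, not the paper's values): because the decision ignores the V channel, a brightness change such as a shadow does not change the classification.

```python
# Brightness-insensitive line-pixel test: classify by hue and saturation only.

import colorsys

def is_line_pixel(r, g, b, hue_range=(0.55, 0.70), min_sat=0.4):
    """RGB in [0, 1]; True for sufficiently saturated blue-ish pixels."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)   # v is deliberately unused
    return hue_range[0] <= h <= hue_range[1] and s >= min_sat

bright_blue = (0.2, 0.3, 0.9)
dark_blue = (0.07, 0.10, 0.30)   # the same line colour under a shadow
gray_floor = (0.5, 0.5, 0.5)
print(is_line_pixel(*bright_blue), is_line_pixel(*dark_blue),
      is_line_pixel(*gray_floor))  # → True True False
```

The block-level stage would then accept a block as part of the line only if enough of its pixels pass this test, which is what absorbs isolated pixel-level errors.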

Classification between Intentional and Natural Blinks in Infrared Vision Based Eye Tracking System

  • Kim, Song-Yi;Noh, Sue-Jin;Kim, Jin-Man;Whang, Min-Cheol;Lee, Eui-Chul
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.4
    • /
    • pp.601-607
    • /
    • 2012
  • Objective: The aim of this study is to classify intentional and natural blinks in a vision-based eye tracking system. With this classification method, we expect that an eye tracking method can be designed that supports both navigation and selection interactions well. Background: Eye tracking is now widely used to increase user immersion and interest by supporting a natural user interface. Although conventional eye tracking systems handle navigation interaction well by tracking pupil movement, there is no established selection interaction method. Method: To determine the classification threshold between intentional and natural blinks, we performed an experiment capturing eye images, including intentional and natural blinks, from 12 subjects. By analyzing successive eye images, two features were collected: the eye-closed duration and the pupil size variation after eye opening. The classification threshold was then determined by SVM (Support Vector Machine) training. Results: Experimental results showed that the average detection accuracy for intentional blinks was 97.4% in a wearable eye tracking environment. The detection accuracy in a non-wearable camera environment was 92.9% with the same SVM classifier. Conclusion: By combining the two features with an SVM, we implemented an accurate selection interaction method for vision-based eye tracking systems. Application: The results of this research may help improve the efficiency and usability of vision-based eye tracking by supporting a reliable selection interaction scheme.
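
A trained linear SVM over the two features reduces at decision time to a function of the form w · x + b > 0. As a stand-in for the paper's trained classifier, this sketch uses hand-set (made-up) weights over the same two features to show the shape of the decision.

```python
# Blink classification over (eye-closed duration, pupil-size variation):
# a linear decision function of the form an SVM would produce after training.
# The weights and bias here are illustrative, not trained values.

def is_intentional(duration_ms, pupil_var, w=(0.01, 2.0), b=-4.0):
    score = w[0] * duration_ms + w[1] * pupil_var + b
    return score > 0

# A long, deliberate blink versus a quick reflexive one.
print(is_intentional(duration_ms=600, pupil_var=0.5))   # → True
print(is_intentional(duration_ms=150, pupil_var=0.1))   # → False
```

Combining both features is what separates the classes: a long closure alone or a pupil change alone is weaker evidence than the two together.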