• Title/Summary/Keyword: Visual tracking

Search Results: 523

Object Tracking System Using Kalman Filter (칼만 필터를 이용한 물체 추적 시스템)

  • Xu, Yanan;Ban, Tae-Hak;Yuk, Jung-Soo;Park, Dong-Won;Jung, Hoe-kyung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2013.10a / pp.1015-1017 / 2013
  • Object tracking is, in general, a challenging problem. Difficulties can arise from abrupt object motion, changing appearance of both the object and the scene, non-rigid object structures, object-to-object and object-to-scene occlusions, and camera motion. Tracking is usually performed in the context of higher-level applications that require the location or shape of the object in every frame. This paper describes an object tracking system based on active vision with two cameras. Building on a single-camera tracking algorithm, an active visual tracking and object lock-on system based on the Extended Kalman Filter (EKF) is introduced; by analyzing the filter data, the next motion state of the object can be predicted. After tracking is performed at each camera, the individual tracks are fused (combined) to obtain the final system track.
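
The two-stage scheme described above (per-camera filtering followed by track fusion) can be sketched as follows. This is a minimal illustration, not the authors' implementation: a linear Kalman filter with a constant-velocity model stands in for the paper's EKF, and the covariance-weighted fusion rule and all gains are assumptions.

```python
import numpy as np

def kalman_step(x, P, z, dt=1.0, q=1e-2, r=1.0):
    """One predict/update cycle; state x = [px, py, vx, vy]."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt                      # position += velocity * dt
    H = np.zeros((2, 4))
    H[0, 0] = H[1, 1] = 1.0                     # only position is measured
    Q, R = q * np.eye(4), r * np.eye(2)
    x, P = F @ x, F @ P @ F.T + Q               # predict
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + K @ (z - H @ x)                     # update with measurement z
    P = (np.eye(4) - K @ H) @ P
    return x, P

def fuse_tracks(x1, P1, x2, P2):
    """Covariance-weighted fusion of two independent per-camera tracks."""
    P = np.linalg.inv(np.linalg.inv(P1) + np.linalg.inv(P2))
    x = P @ (np.linalg.inv(P1) @ x1 + np.linalg.inv(P2) @ x2)
    return x, P
```

Fusing two identical estimates leaves the mean unchanged and halves the covariance, which is the intended behaviour of combining independent tracks.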

Development of Intelligent Filler Wire Feeding Device for Improvement of Weld quality (용접부 품질향상을 위한 지능형 용접 와이어 공급 장치 개발)

  • Lee J.S.;Sohn Y.I.;Park K.Y.;Lee K.D.
    • Proceedings of the Korean Society of Precision Engineering Conference / 2005.06a / pp.950-955 / 2005
  • This paper describes an intelligent filler wire feeding device that can control three-dimensional seam tracking and the filler wire speed by measuring the joint position and gap width in laser welding. By using a visual sensor to track the seam in three dimensions and to fill the missing material into the joint gap, alignment errors caused by manufacturing tolerances and the limited repeatability of fixtures and the welding robot can be compensated, yielding an even seam quality that avoids weld defects. In this paper, we assessed the weld quality of 2 mm Al6061 sheets with various gap widths using the intelligent filler wire feeding device.
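
As a rough illustration of how a measured gap could drive the wire feed rate, one can balance the wire volume deposited per unit time against the gap volume swept by the torch. This volume-balance relation is purely an assumption for illustration, not the device's actual control law.

```python
import math

def wire_feed_speed(gap_width_mm, sheet_thickness_mm,
                    travel_speed_mms, wire_diameter_mm):
    """Volume balance: gap area * travel speed = wire area * feed speed."""
    gap_area = gap_width_mm * sheet_thickness_mm         # joint cross-section, mm^2
    wire_area = math.pi * (wire_diameter_mm / 2.0) ** 2  # wire cross-section, mm^2
    return gap_area * travel_speed_mms / wire_area       # required feed speed, mm/s
```

A zero gap demands no filler wire, and the feed rate grows linearly with the measured gap width at a fixed travel speed.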

Analysis of Visual Attention in Mobile Messenger Emoticons using Eye-Tracking (시선추적장치를 활용한 모바일 메신저 이모티콘의 시각적 주의집중 분석)

  • Park, Min Hee;Hwang, Mi Kyung;Kwon, Mahn Woo
    • Journal of Korea Multimedia Society / v.23 no.3 / pp.508-515 / 2020
  • For the success of mobile messenger emoticons, it is important to grab the attention of users or consumers and to identify the factors that produce empathy and emotional satisfaction. In this study, subjective evaluations of mobile messenger emoticons were first collected through a preliminary survey, and eye-tracking experiments were then conducted to identify the factors that attract the subjects' visual attention within the emoticons. The study revealed that emoticons such as Ompangi and Onaeui yeosin, which highlight their characters, mainly drew attention to the characters (faces), whereas the Gyuiyomjueui and Handprinting emoticons drew attention to the text. Contrary to earlier studies, these results suggest that people focus on salient elements such as the size, form, color, and location of the visually exposed elements rather than primarily taking a keen interest in the characters.

Robust Control of Robot Manipulators using Vision Systems

  • Lee, Young-Chan;Jie, Min-Seok;Lee, Kang-Woong
    • Journal of Advanced Navigation Technology / v.7 no.2 / pp.162-170 / 2003
  • In this paper, we propose a robust controller for the trajectory control of n-link robot manipulators using feature-based visual feedback. To reduce the tracking error of the robot manipulator due to parametric uncertainties, integral action is included in the dynamic control part of the inner control loop. The desired trajectory for tracking is generated from features extracted by the camera mounted on the end effector. The stability of the robust state feedback control system is shown by the Lyapunov method. Simulation and experimental results on a 5-link robot manipulator with two degrees of freedom show that the proposed method has good tracking performance.
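
The benefit of the integral action mentioned above can be seen on a toy joint model: under pure state feedback, a constant disturbance (standing in for parametric uncertainty) leaves a steady-state tracking error, which an integral term removes. The double-integrator model and all gains below are hypothetical, not the paper's controller.

```python
def simulate(ki, steps=4000, dt=0.01, disturbance=0.5):
    """Euler simulation of a double-integrator joint under PD + integral control."""
    pos, vel, integ = 0.0, 0.0, 0.0
    ref = 1.0                        # desired joint position
    kp, kd = 20.0, 9.0               # hypothetical PD gains
    for _ in range(steps):
        e = ref - pos
        integ += e * dt              # integral of the tracking error
        u = kp * e - kd * vel + ki * integ
        acc = u + disturbance        # unmodeled constant torque
        vel += acc * dt
        pos += vel * dt
    return abs(ref - pos)            # final tracking error
```

With ki = 0 the error settles near disturbance/kp; with a modest ki the integral state absorbs the disturbance and the error converges toward zero.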

Visual Target Tracking and Relative Navigation for Unmanned Aerial Vehicles in a GPS-Denied Environment

  • Kim, Youngjoo;Jung, Wooyoung;Bang, Hyochoong
    • International Journal of Aeronautical and Space Sciences / v.15 no.3 / pp.258-266 / 2014
  • We present a system for the real-time visual relative navigation of a fixed-wing unmanned aerial vehicle in a GPS-denied environment. An extended Kalman filter is used to construct a vision-aided navigation system by fusing the image processing results with barometer and inertial sensor measurements. Using a mean-shift object tracking algorithm, an onboard vision system provides pixel measurements to the navigation filter. The filter is slightly modified to deal with delayed measurements from the vision system. The image processing algorithm and the navigation filter are verified by flight tests, whose results show that the proposed aerial system is able to keep circling around a target without using GPS data.
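
The mean-shift step that supplies pixel measurements to the filter can be sketched as follows: a window repeatedly moves to the weighted centroid of the pixels it covers until it stops moving. The weight map would in practice come from something like a histogram back-projection; the window size and termination rule here are illustrative assumptions.

```python
import numpy as np

def mean_shift(weights, start, win=5, iters=20):
    """Shift a (2*win+1)-sized window to the local mode of a 2-D weight map."""
    y, x = start
    h, w = weights.shape
    for _ in range(iters):
        y0, y1 = max(0, y - win), min(h, y + win + 1)
        x0, x1 = max(0, x - win), min(w, x + win + 1)
        patch = weights[y0:y1, x0:x1]
        total = patch.sum()
        if total == 0:
            break                              # nothing to track in the window
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = int(round((ys * patch).sum() / total))  # weighted centroid row
        nx = int(round((xs * patch).sum() / total))  # weighted centroid column
        if (ny, nx) == (y, x):                 # converged to the local mode
            break
        y, x = ny, nx
    return y, x
```

Started near a bright blob, the window climbs the weight gradient and settles on the blob's center, which is the pixel measurement passed to the navigation filter.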

Implementation of tracking and grasping the moving object using visual feedback (영상궤환을 이용한 이동체의 주적 및 잡기 작업의 구현)

  • Kwon, Chul;Kang, Hyung-Jin;Park, Mig-Non
    • Proceedings of the KIEE Conference / 1995.11a / pp.579-582 / 1995
  • Vision systems have found a wide and growing range of applications owing to the rich information they provide. In the control field especially, vision systems have been applied to industrial robots. In this paper, object tracking and grasping tasks are accomplished by a robot vision system with a camera mounted on the robot hand. A camera setting method is proposed to implement the task in a simple way. Despite calibration errors, stable grasping is achieved using a tracking control algorithm based on visual features.

Mashup Web of Visual Structure using the Mechanism of GPIS Tracking (GPIS Tracking 메커니즘의 시각구조 Mashup Web)

  • An, Sung-Eun;Kim, Jung-Joong
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.2 no.4 / pp.65-71 / 2009
  • Web content has evolved from static and dynamic content to mashup services. In recent years, mashups, which create new content by combining different services, have become a major topic in web services. However, mashup work has so far concentrated only on making new content to provide to clients. In this paper, we propose a policy for composing services together with a client-friendly UI.

Pose Tracking of Moving Sensor using Monocular Camera and IMU Sensor

  • Jung, Sukwoo;Park, Seho;Lee, KyungTaek
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.8 / pp.3011-3024 / 2021
  • Pose estimation of a sensor is an important issue in many applications such as robotics, navigation, tracking, and augmented reality. This paper proposes a visual-inertial integration system appropriate for dynamically moving sensors. The orientation estimated from an Inertial Measurement Unit (IMU) is used to calculate the essential matrix based on the intrinsic parameters of the camera. Using epipolar geometry, outliers among the feature point matches are eliminated across the image sequence; the IMU helps eliminate erroneous point matches in images of dynamic scenes at an early stage. After the outliers are removed, the remaining feature point correspondences are used to calculate a precise fundamental matrix, from which the pose of the sensor is finally estimated. The proposed procedure was implemented and tested against existing methods, and the experimental results show the effectiveness of the proposed technique.
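
The outlier rejection step has a compact form. Given the IMU-derived rotation R and a candidate translation t, the essential matrix is E = [t]x R, and a match (x1, x2) in normalized image coordinates is kept only when the epipolar residual |x2' E x1| is small. This is a generic sketch of that geometric test, not the authors' code, and the threshold is an arbitrary example.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_inliers(R, t, pts1, pts2, thresh=1e-3):
    """pts1, pts2: Nx3 normalized homogeneous image coordinates per view."""
    E = skew(t) @ R                                  # essential matrix
    residuals = np.abs(np.einsum('ni,ij,nj->n', pts2, E, pts1))
    return residuals < thresh                        # boolean inlier mask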

A study on visual tracking of the underwater mobile robot for nuclear reactor vessel inspection

  • Cho, Jai-Wan;Kim, Chang-Hoi;Choi, Young-Soo;Seo, Yong-Chil;Kim, Seung-Ho
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2003.10a / pp.1244-1248 / 2003
  • This paper describes the visual tracking procedure of an underwater mobile robot for nuclear reactor vessel inspection, which is required to find foreign objects such as loose parts. The yellowish body of the underwater robot presents a strong contrast to the boron-solute cold water of the reactor vessel, which is tinged with indigo by the Cherenkov effect. In this paper, we locate and track the underwater mobile robot using these two colors, yellow and indigo. The center-coordinate extraction procedure is as follows. The first step is to segment the robot body from the indigo cold-water background: from the RGB components of the monitoring image taken with a color CCD camera, we select the red component. In the selected red image, we extract the position of the underwater mobile robot through binarization, labelling, and centroid extraction. In an experiment carried out at the Youngkwang unit 5 nuclear reactor vessel, we tracked the center position of the underwater robot submerged near the cold-leg and hot-leg passages, at a depth of about 10 m.
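
The binarization, labelling, and centroid pipeline described above can be sketched on a single colour channel. The BFS connected-component labelling and the threshold value below are illustrative stand-ins, not the system's actual implementation.

```python
import numpy as np
from collections import deque

def largest_blob_centroid(channel, thresh=128):
    """Binarize a channel, label 4-connected blobs, return the centroid of the largest."""
    mask = channel >= thresh                     # binarization
    seen = np.zeros(mask.shape, dtype=bool)
    best, best_size = None, 0
    h, w = mask.shape
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                q = deque([(sy, sx)])            # BFS labels one component
                seen[sy, sx] = True
                pix = []
                while q:
                    y, x = q.popleft()
                    pix.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(pix) > best_size:         # keep only the largest blob
                    best_size = len(pix)
                    ys, xs = zip(*pix)
                    best = (sum(ys) / len(pix), sum(xs) / len(pix))
    return best
```

Selecting the largest component suppresses small bright speckles, so the returned centroid corresponds to the robot body rather than noise.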

Image-based Visual Servoing Through Range and Feature Point Uncertainty Estimation of a Target for a Manipulator (목표물의 거리 및 특징점 불확실성 추정을 통한 매니퓰레이터의 영상기반 비주얼 서보잉)

  • Lee, Sanghyob;Jeong, Seongchan;Hong, Young-Dae;Chwa, Dongkyoung
    • Journal of Institute of Control, Robotics and Systems / v.22 no.6 / pp.403-410 / 2016
  • This paper proposes a robust image-based visual servoing scheme using a nonlinear observer for a monocular eye-in-hand manipulator. The proposed control method is divided into a range estimation phase and a target-tracking phase. In the range estimation phase, the range from the camera to the target is estimated under a non-moving-target condition to resolve the uncertainty of the interaction matrix. In the target-tracking phase, the feature point uncertainty caused by the unknown motion of the target is estimated, and the feature point errors converge sufficiently close to zero through compensation for this uncertainty.
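
The role of the range estimate can be made concrete with the classical point-feature interaction matrix of image-based visual servoing: the depth Z appears directly in the matrix, which is exactly the uncertain quantity the paper estimates. The control law v = -lambda * pinv(L) * e below is the textbook scheme, shown only as a sketch, not the paper's robust controller.

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix of a normalized point feature (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(features, desired, depths, lam=0.5):
    """Stack point-feature errors and return the commanded camera twist."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    e = (np.asarray(features) - np.asarray(desired)).ravel()
    return -lam * np.linalg.pinv(L) @ e          # v = -lambda * L^+ * e
```

When the features already coincide with their desired positions the commanded twist is zero; a wrong depth Z distorts L and hence the commanded motion, which is why the range estimation phase matters.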