• Title/Abstract/Keywords: Visual tracking

Search results: 528 items (processing time 0.032 s)

칼만 필터를 이용한 물체 추적 시스템 (Object Tracking System Using Kalman Filter)

  • 서아남;반태학;육정수;박동원;정회경
    • 한국정보통신학회 학술대회논문집 / 한국정보통신학회 2013 Fall Conference / pp.1015-1017 / 2013
  • Methods for tracking the motion of an object face a number of difficulties, because tracking is governed by the object's appearance in the scene, the structure of non-rigid objects, occlusions between objects and between an object and the scene, camera motion, and changes in the patterns of moving objects. Tracking is typically carried out inside high-level applications or systems that require the position or shape of the object in every frame. In this paper, we implement an active visual tracking and object lock-on system based on the extended Kalman filter (EKF) and analyze the resulting data. Building on the single-camera tracking algorithm thus introduced, we describe an object tracking system that uses two cameras, each with its own view: after the object's state is identified and its motion is tracked by each camera, the individual tracks are combined into the final system's motion track, and we study the tracking system used in this way.

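To illustrate the filtering idea behind such trackers, here is a minimal linear Kalman filter with a constant-velocity model (a generic sketch, not the paper's two-camera EKF lock-on system; all matrices and noise values are assumptions):

```python
import numpy as np

# Constant-velocity Kalman filter for 2-D object tracking.
# State x = [px, py, vx, vy]; we observe only the pixel position.
dt = 1.0
F = np.array([[1., 0., dt, 0.],
              [0., 1., 0., dt],
              [0., 0., 1., 0.],
              [0., 0., 0., 1.]])        # state transition (constant velocity)
H = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.]])        # measurement model: position only
Q = 0.01 * np.eye(4)                   # process noise covariance
R = np.eye(2)                          # measurement noise covariance

def kf_step(x, P, z):
    """One predict/update cycle of the Kalman filter; z = measured (px, py)."""
    x = F @ x                          # predict state
    P = F @ P @ F.T + Q                # predict covariance
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(4) - K @ H) @ P

x, P = np.zeros(4), np.eye(4)
for t in range(50):                    # target moves with velocity (1, 2)
    x, P = kf_step(x, P, np.array([float(t), 2.0 * t]))
print(np.round(x[:2], 1))              # position estimate tracks (49, 98)
```

With consistent measurements the estimated velocity converges to (1, 2), which is what lets the filter predict the target position through short occlusions.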

용접부 품질향상을 위한 지능형 용접 와이어 공급 장치 개발 (Development of Intelligent Filler Wire Feeding Device for Improvement of Weld quality)

  • 이재석;손영일;박기영;이경돈
    • 한국정밀공학회 학술대회논문집 / 한국정밀공학회 2005 Spring Conference Proceedings / pp.950-955 / 2005
  • This paper describes an intelligent filler wire feeding device which can control 3-dimensional seam tracking and the filler wire speed by measuring the gap position and the joint gap width in laser welding. By means of vision-sensor-controlled filling of the missing material into the joint gap and 3-dimensional seam tracking, alignment errors arising from manufacturing tolerances and from the repeatability of alignment jigs and the welding robot can be compensated, giving an even seam quality that avoids weld defects. In this paper, we assessed weld quality in 2 mm sheets of Al6061 with various gap widths by using the intelligent filler wire feeding device.

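The gap-adaptive wire feeding concept can be sketched as a simple volume balance between the joint gap and the filler wire (the function, constants, and the volume-balance law are illustrative assumptions, not the paper's device logic):

```python
import math

# Illustrative volume balance: the wire must be fed fast enough that the
# deposited wire volume per second equals the missing gap volume per second.
def wire_feed_speed(gap_width_mm, travel_speed_mm_s, wire_dia_mm=1.0, sheet_t_mm=2.0):
    """Wire feed speed (mm/s) that fills a rectangular joint gap."""
    gap_area = gap_width_mm * sheet_t_mm              # missing cross-section (mm^2)
    wire_area = math.pi * (wire_dia_mm / 2.0) ** 2    # wire cross-section (mm^2)
    return gap_area * travel_speed_mm_s / wire_area

# A 0.5 mm gap welded at 10 mm/s needs roughly 12.7 mm/s of 1.0 mm wire.
print(round(wire_feed_speed(0.5, 10.0), 1))  # 12.7
```

The point of measuring the gap width online is exactly this coupling: a wider gap demands proportionally faster wire feed at the same travel speed.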

시선추적장치를 활용한 모바일 메신저 이모티콘의 시각적 주의집중 분석 (Analysis of Visual Attention in Mobile Messenger Emoticons using Eye-Tracking)

  • 박민희;황미경;권만우
    • 한국멀티미디어학회논문지 / Vol. 23, No. 3 / pp.508-515 / 2020
  • For mobile messenger emoticons to succeed, it is important to grab the attention of users or consumers and to identify the influence factors that can satisfy empathy and emotional satisfaction. In this study, a subjective evaluation of mobile messenger emoticons by the subjects was first collected through a preliminary survey, and then eye-tracking experiments were conducted to identify the influence factors that attract the subjects' visual attention within the emoticons. The study revealed that emoticons such as Ompangi and Onaeui yeosin, which highlight their characters, mainly focused attention on the characters (faces). Secondly, the Gyuiyomjueui and Handprinting emoticons focused attention on the text. Contrary to earlier studies, these results suggest that people focus on characteristic elements such as the size, form, color, and location of visually exposed elements rather than primarily having a keen interest in the characters.
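The kind of analysis described, attributing fixations to areas of interest (AOIs) such as a character's face or the text region, can be sketched as follows (the AOI rectangles and fixation points are made up for illustration, not the study's data):

```python
# Sketch: count eye-tracking fixations inside rectangular areas of interest.
AOIS = {
    "character_face": (40, 20, 120, 100),   # (x0, y0, x1, y1) in pixels
    "text": (10, 110, 150, 150),
}

def aoi_hits(fixations, aois):
    """Map each AOI name to the number of fixations landing inside it."""
    counts = {name: 0 for name in aois}
    for fx, fy in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= fx <= x1 and y0 <= fy <= y1:
                counts[name] += 1
    return counts

fixations = [(60, 50), (80, 70), (100, 130), (200, 200)]
print(aoi_hits(fixations, AOIS))  # {'character_face': 2, 'text': 1}
```

Comparing such per-AOI counts (or dwell times) across emoticons is what supports conclusions like "character-centric emoticons draw fixations to the face, others to the text."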

Robust Control of Robot Manipulators using Vision Systems

  • 이영찬;지민석;이강웅
    • 한국항행학회논문지 / Vol. 7, No. 2 / pp.162-170 / 2003
  • In this paper, we propose a robust controller for trajectory control of n-link robot manipulators using feature-based visual feedback. In order to reduce the tracking error of the robot manipulator due to parametric uncertainties, integral action is included in the dynamic control part of the inner control loop. The desired trajectory for tracking is generated from features extracted by the camera mounted on the end effector. The stability of the robust state feedback control system is shown by the Lyapunov method. Simulation and experimental results on a 5-link robot manipulator with two degrees of freedom show that the proposed method has good tracking performance.

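The role of the integral action, removing the steady-state error caused by parametric uncertainty, can be illustrated on a 1-DOF arm with a mismatched mass parameter (the gains, values, and simplified model are assumptions for illustration, not the paper's controller):

```python
import math

# 1-DOF arm with an uncertain mass, controlled with PD plus integral action.
# The nominal model underestimates the mass, so gravity is under-compensated;
# the integral term absorbs the resulting constant disturbance.
dt, m_true, m_nom, g = 0.001, 1.2, 1.0, 9.81
Kp, Kd, Ki = 100.0, 20.0, 200.0

q, qd, integ, q_ref = 0.0, 0.0, 0.0, 1.0      # start at rest, 1 rad step
for _ in range(20000):                        # simulate 20 s
    e = q_ref - q
    integ += e * dt                           # integral of the tracking error
    tau = m_nom * (Kp * e - Kd * qd + Ki * integ)
    qdd = (tau - m_true * g * math.sin(q)) / m_true   # true, uncertain plant
    qd += qdd * dt
    q += qd * dt
print(round(q, 3))                            # settles at the 1 rad reference
```

Without the `Ki * integ` term the arm would settle below the reference, since the uncompensated gravity torque acts as a constant disturbance that pure PD control cannot reject.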

Visual Target Tracking and Relative Navigation for Unmanned Aerial Vehicles in a GPS-Denied Environment

  • Kim, Youngjoo;Jung, Wooyoung;Bang, Hyochoong
    • International Journal of Aeronautical and Space Sciences / Vol. 15, No. 3 / pp.258-266 / 2014
  • We present a system for the real-time visual relative navigation of a fixed-wing unmanned aerial vehicle in a GPS-denied environment. An extended Kalman filter is used to construct a vision-aided navigation system by fusing the image processing results with barometer and inertial sensor measurements. Using a mean-shift object tracking algorithm, an onboard vision system provides pixel measurements to the navigation filter. The filter is slightly modified to deal with delayed measurements from the vision system. The image processing algorithm and the navigation filter are verified by flight tests. The results show that the proposed aerial system is able to maintain circling around a target without using GPS data.
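The mean-shift step used by the onboard tracker, moving a search window to the centroid of pixel weights until it stops shifting, can be sketched as follows (the weight image is synthetic; a real tracker would use histogram back-projection of the target model):

```python
import numpy as np

def mean_shift(weights, cx, cy, half=10, iters=20):
    """Shift (cx, cy) toward the weighted centroid of a window until fixed."""
    h, w = weights.shape
    for _ in range(iters):
        x0, x1 = max(0, cx - half), min(w, cx + half + 1)
        y0, y1 = max(0, cy - half), min(h, cy + half + 1)
        win = weights[y0:y1, x0:x1]
        if win.sum() == 0:                 # no target weight in the window
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        nx = int(round((xs * win).sum() / win.sum()))
        ny = int(round((ys * win).sum() / win.sum()))
        if (nx, ny) == (cx, cy):           # converged
            break
        cx, cy = nx, ny
    return cx, cy

# A uniform blob of weight centred at (60, 45); start the search at (50, 50).
img = np.zeros((100, 100))
img[40:51, 55:66] = 1.0
print(mean_shift(img, 50, 50))  # (60, 45)
```

The converged window centre is the pixel measurement that gets fed into the navigation filter each frame.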

영상궤환을 이용한 이동체의 추적 및 잡기 작업의 구현 (Implementation of Tracking and Grasping of a Moving Object Using Visual Feedback)

  • 권철;강형진;박민용
    • 대한전기학회 학술대회논문집 / 대한전기학회 1995 Fall Conference Proceedings (학회본부) / pp.579-582 / 1995
  • Recently, vision systems have found a wide and growing field of application on account of the vast information obtained through the visual mechanism. In the control field especially, vision systems have been applied to industrial robots. In this paper, the object tracking and grasping task is accomplished by a robot vision system with a camera in the robot hand. A camera setting method is proposed to implement the task in a simple way. In spite of calibration error, a stable grasping task is achieved using a tracking control algorithm based on the vision feature.


GPIS Tracking 메커니즘의 시각구조 Mashup Web (Mashup Web of Visual Structure using the Mechanism of GPIS Tracking)

  • 안성은;김정중
    • 한국정보전자통신기술학회논문지 / Vol. 2, No. 4 / pp.65-71 / 2009
  • Internet web content has evolved from static and dynamic data to Mashup services. The Mashup services that have recently become a hot topic combine different services to create content that can provide new forms of information. In composing these services, however, the current reality is that the focus is placed only on the content itself rather than on the utilization of system resources. In this paper, we propose a policy for composing Mashup services and a user-friendly screen layout scheme.


Pose Tracking of Moving Sensor using Monocular Camera and IMU Sensor

  • Jung, Sukwoo;Park, Seho;Lee, KyungTaek
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 15, No. 8 / pp.3011-3024 / 2021
  • Pose estimation of a sensor is an important issue in many applications such as robotics, navigation, tracking, and augmented reality. This paper proposes a visual-inertial integration system appropriate for dynamically moving conditions of the sensor. The orientation estimated from an Inertial Measurement Unit (IMU) is used to calculate the essential matrix based on the intrinsic parameters of the camera. Using epipolar geometry, the outliers of the feature point matching are eliminated in the image sequences. The pose of the sensor can then be obtained from the feature point matching. The use of the IMU helps to initially eliminate erroneous point matches in images of a dynamic scene. After the outliers are removed from the feature points, the selected feature point matching relations are used to calculate the precise fundamental matrix. Finally, with the feature point matching relation, the pose of the sensor is estimated. The proposed procedure was implemented and tested in comparison with existing methods, and the experimental results show the effectiveness of the proposed technique.
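The IMU-aided outlier rejection step can be sketched with the epipolar constraint x2ᵀE x1 = 0 (the essential matrix, points, and threshold below are synthetic illustrations; a real system would build E from the IMU rotation and an estimated translation):

```python
import numpy as np

def skew(t):
    """3x3 skew-symmetric matrix of a translation vector t."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

def epipolar_inliers(E, pts1, pts2, thresh=1e-3):
    """Indices of matches consistent with essential matrix E.

    pts1, pts2 are Nx2 normalized image coordinates of matched features.
    """
    keep = []
    for i, (p1, p2) in enumerate(zip(pts1, pts2)):
        x1 = np.array([p1[0], p1[1], 1.0])
        x2 = np.array([p2[0], p2[1], 1.0])
        if abs(x2 @ E @ x1) < thresh:      # algebraic epipolar residual
            keep.append(i)
    return keep

# Synthetic example: pure translation along x with R = I, so E = [t]_x.
E = skew([1.0, 0.0, 0.0])
pts1 = np.array([[0.1, 0.2], [0.3, -0.1], [0.0, 0.5]])
pts2 = pts1 + np.array([0.05, 0.0])        # inliers: shifted along x only
pts2_bad = pts2.copy()
pts2_bad[1] += np.array([0.0, 0.2])        # corrupt one match vertically
print(epipolar_inliers(E, pts1, pts2_bad))  # [0, 2]
```

Because the residual here reduces to the vertical disparity, the match perturbed off the epipolar line is rejected while the consistent matches survive for the later fundamental-matrix estimation.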

A study on visual tracking of the underwater mobile robot for nuclear reactor vessel inspection

  • Cho, Jai-Wan;Kim, Chang-Hoi;Choi, Young-Soo;Seo, Yong-Chil;Kim, Seung-Ho
    • 제어로봇시스템학회 학술대회논문집 / ICCAS 2003 / pp.1244-1248 / 2003
  • This paper describes the visual tracking procedure of an underwater mobile robot for nuclear reactor vessel inspection, which is required to find foreign objects such as loose parts. The yellowish body of the underwater robot presents a strong contrast to the boron-solute cold water of the reactor vessel, which is tinged indigo by the Cerenkov effect. In this paper, we find and track the position of the underwater mobile robot using these two color cues, yellow and indigo. The center-coordinate extraction procedure is as follows. The first step is to segment the underwater robot body from the cold water with its indigo background. From the RGB color components of the entire monitoring image taken with a color CCD camera, we select the red component. In the selected red image, we extract the position of the underwater mobile robot using the following processing sequence: binarization, labelling, and centroid extraction. In an experiment carried out at the Youngkwang unit 5 nuclear reactor vessel, we tracked the center position of the underwater robot submerged near the cold-leg and hot-leg ways, at a depth of about 10 m.

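The described pipeline (red channel, binarization, connected-component labelling, centroid extraction) can be sketched as follows (the image and threshold are synthetic illustrations, not the reactor footage):

```python
import numpy as np
from collections import deque

def largest_blob_centroid(rgb, thresh=128):
    """Centroid (cx, cy) of the largest bright region in the red channel."""
    red = rgb[:, :, 0].astype(int)            # red channel separates the yellow body
    binary = red > thresh                     # binarization
    h, w = binary.shape
    labels = np.zeros((h, w), dtype=int)      # 0 = unlabelled background
    blobs = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and labels[sy, sx] == 0:
                blobs.append([])              # flood-fill one 4-connected component
                label = len(blobs)
                queue = deque([(sy, sx)])
                labels[sy, sx] = label
                while queue:
                    y, x = queue.popleft()
                    blobs[-1].append((x, y))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = label
                            queue.append((ny, nx))
    if not blobs:
        return None
    biggest = max(blobs, key=len)             # keep the largest labelled component
    xs = [p[0] for p in biggest]
    ys = [p[1] for p in biggest]
    return sum(xs) / len(xs), sum(ys) / len(ys)

img = np.zeros((60, 80, 3), dtype=np.uint8)
img[20:30, 30:50, 0] = 255                    # bright rectangle = robot body
img[20:30, 30:50, 1] = 255                    # (yellow = red + green)
print(largest_blob_centroid(img))             # (39.5, 24.5)
```

Taking the largest component before the centroid step makes the tracker robust to small bright speckles elsewhere in the frame.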

목표물의 거리 및 특징점 불확실성 추정을 통한 매니퓰레이터의 영상기반 비주얼 서보잉 (Image-based Visual Servoing Through Range and Feature Point Uncertainty Estimation of a Target for a Manipulator)

  • 이상협;정성찬;홍영대;좌동경
    • 제어로봇시스템학회논문지 / Vol. 22, No. 6 / pp.403-410 / 2016
  • This paper proposes a robust image-based visual servoing scheme using a nonlinear observer for a monocular eye-in-hand manipulator. The proposed control method is divided into a range estimation phase and a target-tracking phase. In the range estimation phase, the range from the camera to the target is estimated under the non-moving target condition to resolve the uncertainty of the interaction matrix. Then, in the target-tracking phase, the feature point uncertainty caused by the unknown motion of the target is estimated, and the feature point errors converge sufficiently close to zero through compensation for the feature point uncertainty.
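The interaction matrix whose depth dependence motivates the range estimation phase has, in the standard IBVS formulation for a point feature, the following form (the feature values, depth, and gain below are illustrative assumptions):

```python
import numpy as np

# Standard IBVS interaction matrix of a point feature: it maps the camera
# twist (vx, vy, vz, wx, wy, wz) to the feature velocity, and it depends on
# the unknown depth Z -- which is why Z must be estimated first.
def interaction_matrix(x, y, Z):
    """2x6 image Jacobian for normalized coordinates (x, y) at depth Z."""
    return np.array([
        [-1 / Z, 0, x / Z, x * y, -(1 + x ** 2), y],
        [0, -1 / Z, y / Z, 1 + y ** 2, -x * y, -x],
    ])

# One IBVS step: the camera twist v = -lambda * pinv(L) @ e makes the
# feature error decay as de/dt = -lambda * e.
lam = 0.5
e = np.array([0.1, -0.05])              # current minus desired feature
L = interaction_matrix(0.2, 0.1, Z=1.5)
v = -lam * np.linalg.pinv(L) @ e        # commanded 6-DOF camera twist
print(np.allclose(L @ v, -lam * e))     # True: error decays exponentially
```

An error in the assumed depth Z scales the translational columns of L, which distorts the commanded twist; estimating Z first (as the paper's range estimation phase does) removes that distortion.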