• Title/Summary/Keyword: vision-based tracking

405 search results

Target Tracking of the Wheeled Mobile Robot using the Combined Visual Servo Control Method (혼합 비주얼 서보 제어 기법을 이용한 이동로봇의 목표물 추종)

  • Lee, Ho-Won;Kwon, Ji-Wook;Hong, Suk-Kyo;Chwa, Dong-Kyoung
    • The Transactions of The Korean Institute of Electrical Engineers, v.60 no.6, pp.1245-1254, 2011
  • This paper proposes a target tracking algorithm for wheeled mobile robots used in various fields. For stable tracking, we apply a vision system to a mobile robot so that it can extract targets through image processing algorithms. Furthermore, this paper presents an algorithm to position the mobile robot at the desired location relative to the target by estimating the target's relative position and attitude. We show the problem in the tracking method using Position-Based Visual Servo (PBVS) control, and propose a tracking method that achieves stable tracking performance by combining PBVS control with Image-Based Visual Servo (IBVS) control. When the target is located around the outskirts of the camera image, it can disappear from the field of view. The proposed algorithm therefore combines the two control inputs using a switching function of hyperbolic form to solve this problem. Through both simulations and experiments with the mobile robot, we have confirmed that the proposed visual servo control method enhances stability compared to methods using only PBVS or only IBVS control.
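The hyperbolic switching idea can be sketched as a tanh-weighted blend of the two control inputs. The scalar `border_err` (distance of the target toward the image border), the gain `k`, and the scalar blend are illustrative assumptions, not the paper's actual formulation:

```python
import math

def combined_control(u_pbvs, u_ibvs, border_err, k=2.0):
    """Blend PBVS and IBVS control inputs with a hyperbolic (tanh)
    switching function: as the target drifts toward the image border
    (border_err >= 0 grows), weight shifts from PBVS to IBVS, which
    keeps the target inside the field of view.

    border_err, k, and the scalar blend are illustrative assumptions."""
    s = math.tanh(k * border_err)  # 0 at the image centre, -> 1 near the border
    return (1.0 - s) * u_pbvs + s * u_ibvs
```

At the image centre the controller behaves as pure PBVS; near the border it smoothly hands over to IBVS instead of switching discontinuously.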

Development of Low-Cost Vision-based Eye Tracking Algorithm for Information Augmented Interactive System

  • Park, Seo-Jeon;Kim, Byung-Gyu
    • Journal of Multimedia Information System, v.7 no.1, pp.11-16, 2020
  • Deep learning has become the most important technology in the field of artificial intelligence and machine learning, with its high performance overwhelming existing methods in various applications. In this paper, an interactive window service based on object recognition technology is proposed. The main goal is to use deep learning-based object recognition to replace existing eye tracking technology, which requires users to wear eye tracking devices, with an eye tracking technology that uses only ordinary cameras to track the user's eyes. We design an interactive system based on an efficient eye detection and pupil tracking method that can verify the user's eye movement. To estimate the view direction of the user's eye, we first initialize a reference (origin) coordinate. The view direction is then estimated from the extracted eye pupils relative to the origin coordinate. We also propose a blink detection technique based on the eye aspect ratio (EAR). With the extracted view direction and eye action, we provide augmented information of interest, without the existing complex and expensive eye-tracking systems, for various service topics and situations. For verification, a user guiding service is implemented as a prototype with a school map to present the location of a desired place or building.
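EAR-based blink detection can be sketched with the standard six-landmark formula; the landmark ordering and the 0.2 threshold are common defaults assumed here, not values taken from the paper:

```python
import math

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|), where p1 and p4 are the
    eye corners and p2,p3 / p6,p5 are upper/lower lid landmarks. The
    ratio stays roughly constant while the eye is open and collapses
    toward zero during a blink."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

def is_blinking(ear, threshold=0.2):
    """A frame counts as a blink when the EAR drops below the threshold
    (0.2 is a commonly used default, assumed here)."""
    return ear < threshold
```

Because EAR is a ratio of distances, it is largely invariant to the eye's scale in the image, which is what makes it usable with an ordinary uncalibrated camera.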

Voting based Cue Integration for Visual Servoing

  • Cho, Che-Seung;Chung, Byeong-Mook
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings, 2003.10a, pp.798-802, 2003
  • The robustness and reliability of vision algorithms is the key issue in robotic research and industrial applications. In this paper, robust real-time visual tracking in complex scenes is considered. A common approach to increasing the robustness of a tracking system is to use different models (CAD models etc.) known a priori. Fusion of multiple features also facilitates robust detection and tracking of objects in scenes of realistic complexity. Because voting is very simple and requires little or no model for fusion, voting-based fusion of cues is applied. The algorithm is tested on a 3D Cartesian robot that tracks a toy vehicle moving along a 3D rail, and a Kalman filter is used to estimate the motion parameters, namely the system state vector of the moving object with unknown dynamics. Experimental results show that fusion of cues and motion estimation give the tracking system robust performance.
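A minimal sketch of model-free voting-based cue fusion; the binary per-cue detections and the vote threshold are illustrative assumptions:

```python
from collections import Counter

def vote_fusion(cue_detections, min_votes=2):
    """Each visual cue (color, edges, motion, ...) independently nominates
    candidate target locations; a location is accepted when at least
    min_votes cues agree. No model of any individual cue is required,
    which is what makes voting such a simple fusion scheme."""
    votes = Counter()
    for detections in cue_detections:
        votes.update(set(detections))  # one vote per cue per location
    return {loc for loc, n in votes.items() if n >= min_votes}
```

For example, if a color cue nominates {(1, 1), (2, 2)}, a motion cue {(1, 1)}, and an edge cue {(2, 2), (3, 3)}, the fused result with `min_votes=2` is {(1, 1), (2, 2)}; the accepted locations could then feed a Kalman filter as measurements.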


Emulation of Anti-alias Filtering in Vision Based Motion Measurement (비전 센서의 앨리어싱 방지 필터링 모방 기법)

  • Kim, Jung-Hyun
    • The Journal of Korea Robotics Society, v.6 no.1, pp.18-26, 2011
  • This paper presents a method, Exposure Controlled Temporal Filtering (ECF), applied to visual motion tracking, that can cancel the temporal aliasing of periodic camera vibrations and illumination fluctuations through control of the exposure time. We first present a theoretical analysis of the exposure-induced image time-integration process and how it samples periodically fluctuating light impinging on the sensor. Based on this analysis we develop a simple method to cancel high-frequency vibrations that are temporally aliased onto sampled image sequences and thus onto subsequent motion tracking measurements. Simulations and experiments using the 'Center of Gravity' and Normalized Cross-Correlation motion tracking methods were performed on a microscopic motion tracking system to validate the analytical predictions.
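The cancellation mechanism can be illustrated by treating the camera exposure as a temporal box filter applied to a sinusoidal fluctuation; the unit amplitude and continuous-time model are simplifying assumptions, not the paper's full analysis:

```python
import math

def exposed_sample(freq, t, exposure):
    """Average of a unit sinusoid sin(2*pi*freq*x) over the exposure
    window [t, t + exposure] -- the temporal box filter that image
    integration applies to the incoming light before sampling."""
    w = 2.0 * math.pi * freq
    return (math.cos(w * t) - math.cos(w * (t + exposure))) / (w * exposure)
```

Setting the exposure to an integer multiple of the fluctuation period (`exposure = k / freq`) makes the window average exactly zero for every sample time `t`, so the periodic component never reaches the sampled image sequence and therefore cannot alias into the motion measurements.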

Preceding Vehicle Detection and Tracking with Motion Estimation by Radar-vision Sensor Fusion (레이더와 비전센서 융합기반의 움직임추정을 이용한 전방차량 검출 및 추적)

  • Jang, Jaehwan;Kim, Gyeonghwan
    • Journal of the Institute of Electronics and Information Engineers, v.49 no.12, pp.265-274, 2012
  • In this paper, we propose a method for preceding vehicle detection and tracking with motion estimation based on radar-vision sensor fusion. The proposed motion estimation not only corrects the inaccurate lateral position error observed on a radar target, but also enables adaptive detection and tracking of a preceding vehicle by compensating for changes in the geometric relation between the ego-vehicle and the ground caused by driving. Furthermore, the feature-based motion estimation, employed to lessen the computational burden, reduces the number of invocations of the vehicle validation procedure. Experimental results prove that the correction by the proposed motion estimation improves vehicle detection and makes tracking accurate, with high temporal consistency, under various road conditions.
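One way to picture why the lateral correction helps: automotive radar typically measures range accurately but bearing coarsely, while vision gives an accurate bearing. A deliberately simplified fusion (not the paper's estimator) keeps the radar range and the vision bearing:

```python
import math

def fuse_position(radar_range, vision_azimuth):
    """Combine the radar's accurate range with the vision sensor's
    accurate bearing (radians, 0 = straight ahead) into lateral x /
    longitudinal y coordinates of the preceding vehicle. A simplified
    sketch: the real system also estimates motion over time."""
    x = radar_range * math.sin(vision_azimuth)  # lateral offset
    y = radar_range * math.cos(vision_azimuth)  # longitudinal distance
    return x, y
```

A vehicle 20 m ahead reported by the radar with a noisy bearing is placed at the lateral position implied by the vision bearing instead, which is the kind of lateral correction the abstract describes.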

Study of Intelligent Vision Sensor for the Robotic Laser Welding

  • Kim, Chang-Hyun;Choi, Tae-Yong;Lee, Ju-Jang;Suh, Jeong;Park, Kyoung-Taik;Kang, Hee-Shin
    • Journal of the Korean Society of Industry Convergence, v.22 no.4, pp.447-457, 2019
  • An intelligent sensory system is required to ensure accurate welding performance. This paper describes the development of an intelligent vision sensor for robotic laser welding. The sensor system includes a PC-based vision camera and a stripe-type laser diode. A set of robust image processing algorithms is implemented. The laser-stripe sensor can measure the profile of the welding object and obtain the seam line. Moreover, the working distance of the sensor can be changed, with the rest of the configuration adjusted accordingly. The robot, the seam tracking system, and a CW Nd:YAG laser are used for the laser welding robot system. A simple and efficient control scheme for the whole system is also presented. Profile measurement and seam tracking experiments were carried out to validate the operation of the system.
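Laser-stripe profile extraction and seam location can be sketched with two common simple heuristics, assumed here rather than taken from the paper: per-column peak picking to recover the stripe profile, and a median-deviation rule to find the seam:

```python
def extract_profile(stripe_image):
    """For each image column, take the row of the brightest pixel as the
    stripe position -- simple peak detection on a row-major grid, giving
    the height profile of the welding object across the joint."""
    return [max(range(len(col)), key=lambda r: col[r])
            for col in zip(*stripe_image)]

def seam_column(profile):
    """The seam shows up as the largest deviation of the stripe from its
    median position (the joint gap deflects the laser line); return the
    column index of that deviation."""
    median = sorted(profile)[len(profile) // 2]
    return max(range(len(profile)), key=lambda c: abs(profile[c] - median))
```

In a real sensor the peak picking would use sub-pixel interpolation and outlier rejection, but the two-stage structure (profile first, then seam feature) is the same.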

Implementation of Improved Object Detection and Tracking based on Camshift and SURF for Augmented Reality Service (증강현실 서비스를 위한 Camshift와 SURF를 개선한 객체 검출 및 추적 구현)

  • Lee, Yong-Hwan;Kim, Heung-Jun
    • Journal of the Semiconductor & Display Technology, v.16 no.4, pp.97-102, 2017
  • Object detection and tracking have become one of the most active research areas in the past few years, and play an important role in computer vision applications in daily life. Many tracking techniques have been proposed; Camshift is an effective algorithm for real-time dynamic object tracking, but it uses only color features, so the algorithm is sensitive to illumination and other environmental factors. This paper presents and implements effective moving object detection and tracking that reduces the influence of illumination interference and improves tracking performance against similarly colored backgrounds. The implemented prototype system recognizes objects using invariant features and reduces the dimension of the feature descriptor to rectify these problems. The experimental results show that the system is superior to existing methods in processing time and remains robust in various environments.
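The core of Camshift, the mean-shift window update over a color back-projection, can be sketched in plain Python. The grid of weights and the convergence rule are illustrative; the paper's SURF-based improvements and the adaptive window resizing that distinguishes Camshift from plain mean shift are not shown:

```python
def mean_shift(backproj, window, max_iters=20):
    """Repeatedly move the search window so it is centred on the centroid
    of the back-projection weights it contains; stop when it no longer
    moves. backproj is a row-major grid of per-pixel color likelihoods,
    window is (x, y, w, h) with (x, y) the top-left corner."""
    rows, cols = len(backproj), len(backproj[0])
    x, y, w, h = window
    for _ in range(max_iters):
        cx = cy = total = 0.0
        for r in range(y, min(y + h, rows)):
            for c in range(x, min(x + w, cols)):
                wgt = backproj[r][c]
                cx += wgt * c
                cy += wgt * r
                total += wgt
        if total == 0.0:                 # window lost the target entirely
            break
        nx = min(max(int(round(cx / total - w / 2)), 0), cols - w)
        ny = min(max(int(round(cy / total - h / 2)), 0), rows - h)
        if (nx, ny) == (x, y):           # converged
            break
        x, y = nx, ny
    return x, y, w, h
```

Because the weights come from a color histogram, a similarly colored background pulls the centroid off target, which is exactly the failure mode the paper's invariant-feature extension addresses.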


Position Improvement of a Mobile Robot by Real Time Tracking of Multiple Moving Objects (실시간 다중이동물체 추적에 의한 이동로봇의 위치개선)

  • Jin, Tae-Seok;Lee, Min-Jung;Tack, Han-Ho;Lee, In-Yong;Lee, Joon-Tark
    • Journal of the Korean Institute of Intelligent Systems, v.18 no.2, pp.187-192, 2008
  • The Intelligent Space (ISpace) provides challenging research fields for surveillance, human-computer interfacing, networked camera conferencing, industrial monitoring, and service and training applications. ISpace is a space in which many intelligent devices, such as computers and sensors, are distributed. For these devices to cooperate, it is very important that the system knows the location information needed to offer useful services. To achieve these goals, we present a method for representing, tracking, and human following by fusing distributed multiple vision systems in ISpace, with application to pedestrian tracking in a crowd. This paper describes appearance-based unknown object tracking with the distributed vision system in the intelligent space. First, we discuss how object color information is obtained and how the color appearance-based model is constructed from this data. Then, we discuss the global color model based on the local color information. The process of learning within the global model and the experimental results are also presented.
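The local-to-global color model idea can be sketched as normalized color histograms averaged across cameras; the uniform camera weighting is an assumption, not necessarily the paper's learning rule:

```python
from collections import Counter

def local_color_model(pixels):
    """Normalized color histogram of the object pixels seen by one
    camera -- the local appearance model."""
    counts = Counter(pixels)
    n = len(pixels)
    return {color: k / n for color, k in counts.items()}

def global_color_model(local_models):
    """Global appearance model built from the local histograms reported
    by the distributed cameras, each camera weighted equally here."""
    merged = Counter()
    for model in local_models:
        for color, p in model.items():
            merged[color] += p / len(local_models)
    return dict(merged)
```

A new camera joining the ISpace can match its local histogram against the global model to decide whether it is seeing an already-tracked pedestrian, which is what makes the hand-off between distributed vision systems possible.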

A Study on Multi-Object Tracking Method using Color Clustering in ISpace (컬러 클러스터링 기법을 이용한 공간지능화의 다중이동물체 추척 기법)

  • Jin, Tae-Seok;Kim, Hyun-Deok
    • Journal of the Korea Institute of Information and Communication Engineering, v.11 no.11, pp.2179-2184, 2007
  • The Intelligent Space (ISpace) provides challenging research fields for surveillance, human-computer interfacing, networked camera conferencing, industrial monitoring, and service and training applications. ISpace is a space in which many intelligent devices, such as computers and sensors, are distributed. For these devices to cooperate, it is very important that the system knows the location information needed to offer useful services. To achieve these goals, we present a method for representing, tracking, and human following by fusing distributed multiple vision systems in ISpace, with application to pedestrian tracking in a crowd. This paper describes appearance-based unknown object tracking with the distributed vision system in the intelligent space. First, we discuss how object color information is obtained and how the color appearance-based model is constructed from this data. Then, we discuss the global color model based on the local color information. The process of learning within the global model and the experimental results are also presented.

Improved Tracking System and Realistic Drawing for Real-Time Water-Based Sign Pen (향상된 트래킹 시스템과 실시간 수성 사인펜을 위한 사실적 드로잉)

  • Hur, Hyejung;Lee, Ju-Young
    • Journal of the Korea Society of Computer and Information, v.19 no.2, pp.125-132, 2014
  • In this paper, we present a marker-less fingertip and brush tracking system using an inexpensive web camera. Parallel computation using CUDA is applied to the tracking system, which can run in inexpensive environments such as a laptop or desktop and supports real-time applications. We also present a realistic water-based sign pen drawing model and its implementation. The realistic drawing application, together with our inexpensive real-time fingertip and brush tracking system, offers a glimpse of the art class of the future and could be utilized as a test-bed for future high-technology education environments.
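The water-based pen behaviour can be caricatured as ink deposition with bleeding into neighbouring cells; the bleed fraction and 4-neighbour spread are invented parameters for illustration, not the paper's drawing model:

```python
def deposit_ink(canvas, x, y, amount, bleed=0.2):
    """Deposit ink at cell (x, y) of a row-major canvas; a fraction of it
    bleeds to the four neighbours, mimicking how a water-based sign pen
    spreads on paper. Ink is conserved when all neighbours exist."""
    rows, cols = len(canvas), len(canvas[0])
    canvas[y][x] += amount * (1.0 - bleed)
    share = amount * bleed / 4.0
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < cols and 0 <= ny < rows:
            canvas[ny][nx] += share
    return canvas
```

Applying this per tracked fingertip or brush position each frame produces the soft edges characteristic of a water-based pen; a real implementation would run the per-cell updates in parallel (e.g. on CUDA) rather than in a Python loop.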