• Title/Summary/Keyword: camera vision

Search results: 1,377 (processing time: 0.027 seconds)

Vision-based Small UAV Indoor Flight Test Environment Using Multi-Camera (멀티카메라를 이용한 영상정보 기반의 소형무인기 실내비행시험환경 연구)

  • Won, Dae-Yeon;Oh, Hyon-Dong;Huh, Sung-Sik;Park, Bong-Gyun;Ahn, Jong-Sun;Shim, Hyun-Chul;Tahk, Min-Jea
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.37 no.12 / pp.1209-1216 / 2009
  • This paper presents pose estimation of a small UAV using visual information from low-cost cameras installed indoors. To overcome the limitations of outdoor flight experiments, an indoor flight test environment based on a multi-camera system is proposed. The computer vision algorithms for the proposed system include camera calibration, color marker detection, and pose estimation. The well-known extended Kalman filter is used to obtain accurate position and pose estimates for the small UAV. The paper concludes with several experimental results illustrating the performance and properties of the proposed vision-based indoor flight test environment.
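
The Kalman-filter step the abstract mentions can be sketched as follows. This is a minimal, illustrative reduction to a linear constant-velocity model in one axis; the camera-derived marker position plays the role of the measurement `z`, and all numeric values are invented, not taken from the paper.

```python
# Sketch of the Kalman filter predict/update cycle used for pose estimation,
# reduced to one axis with a constant-velocity model. Noise levels and the
# measurement sequence are illustrative only.
import numpy as np

def kf_step(x, P, z, dt=0.1, q=0.01, r=0.05):
    """One predict/update cycle for state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    H = np.array([[1.0, 0.0]])              # cameras observe position only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the camera-derived position measurement z
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([0.0, 0.0]), np.eye(2)
for z in [0.11, 0.22, 0.30, 0.41]:          # noisy position readings (meters)
    x, P = kf_step(x, P, np.array([z]))
print(x)  # filtered position/velocity estimate
```

The actual system uses the extended (nonlinear) form over full 3D pose; the structure of each predict/update cycle is the same.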

Vision Based Position Control of a Robot Manipulator Using an Elitist Genetic Algorithm (엘리트 유전 알고리즘을 이용한 비젼 기반 로봇의 위치 제어)

  • Park, Kwang-Ho;Kim, Dong-Joon;Kee, Seok-Ho;Kee, Chang-Doo
    • Journal of the Korean Society for Precision Engineering / v.19 no.1 / pp.119-126 / 2002
  • In this paper, we present a new approach based on an elitist genetic algorithm for aligning the position of a robot gripper using CCD cameras. The vision-based control scheme for aligning the gripper with the desired position is implemented using image information. The relationship between camera-space locations and robot joint coordinates is estimated using a camera-space parameter model that generalizes known manipulator kinematics to accommodate unknown relative camera position and orientation. To find the joint angles that bring the manipulator to the target position in image space, we apply an elitist genetic algorithm instead of a nonlinear least-squares method. Since a genetic algorithm employs parallel search, it performs well on such optimization problems. To improve convergence speed, real coding and geometric constraint conditions are used. Experiments demonstrate the effectiveness of vision-based control using an elitist genetic algorithm with real coding.
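
An elitist, real-coded genetic algorithm of the kind described can be sketched as below. The fitness function is a stand-in quadratic error over two hypothetical joint angles, not the paper's camera-space model; population size, mutation rate, and bounds are likewise invented.

```python
# Minimal sketch of an elitist real-coded genetic algorithm minimizing a
# position error over joint angles. The error function and all hyperparameters
# are illustrative stand-ins, not the paper's camera-space model.
import random

random.seed(0)

def error(angles):
    target = (0.5, -0.3)   # hypothetical target joint configuration (radians)
    return sum((a - t) ** 2 for a, t in zip(angles, target))

def evolve(pop_size=30, gens=60, bounds=(-3.14, 3.14), elite=2):
    pop = [[random.uniform(*bounds) for _ in range(2)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=error)
        next_pop = pop[:elite]                   # elitism: best survive unchanged
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)  # select from the fittest tier
            alpha = random.random()              # arithmetic (real-coded) crossover
            child = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
            if random.random() < 0.2:            # Gaussian mutation
                i = random.randrange(2)
                child[i] += random.gauss(0, 0.1)
            next_pop.append(child)
        pop = next_pop
    return min(pop, key=error)

best = evolve()
print(best, error(best))
```

Elitism guarantees the best solution found so far is never lost between generations, which is the convergence-speed advantage the abstract highlights.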

Implementation of Enhanced Vision for an Autonomous Map-based Robot Navigation

  • Roland, Cubahiro;Choi, Donggyu;Kim, Minyoung;Jang, Jongwook
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2021.10a / pp.41-43 / 2021
  • The Robot Operating System (ROS) has been a prominent and successful framework in both the robotics industry and academia. However, the framework has long been focused on, and limited to, robot navigation and manipulation of objects in the environment. This focus leaves out other important fields such as speech recognition and vision. Our goal is to take advantage of ROS's capacity to integrate additional libraries of programming functions aimed at real-time computer vision with a depth-image camera. In this paper we focus on the implementation of upgraded vision using a depth camera, which provides high-quality data for a much enhanced and more accurate understanding of the environment. The varied data from the cameras are then incorporated into the ROS communication structure for any potential use. In this particular case, the system uses OpenCV libraries to manipulate the camera data and provide face-detection capability to the robot while it navigates an indoor environment. The whole system has been implemented and tested on the latest Turtlebot3 and Raspberry Pi 4 hardware.
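
The core of what a depth camera contributes can be sketched as a pinhole back-projection: each pixel plus its depth value becomes a 3D point in the camera frame, the kind of per-pixel data fed into the ROS communication structure. The intrinsics below (fx, fy, cx, cy) are generic illustrative values, not the actual Turtlebot3 camera parameters.

```python
# Sketch of back-projecting a depth-camera pixel to a 3D point in the camera
# frame. Intrinsic parameters here are illustrative, not from the paper's setup.
def pixel_to_point(u, v, depth_m, fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Pinhole back-projection: image pixel (u, v) + depth -> camera-frame XYZ."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# A pixel at the principal point maps straight down the optical axis:
print(pixel_to_point(319.5, 239.5, 1.0))  # (0.0, 0.0, 1.0)
```

In a ROS pipeline this conversion is normally done for you (e.g. by publishing a point cloud from the depth image), but the geometry is exactly this.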


Evaluation of Defects in the Bonded Area of Shoes using an Infrared Thermal Vision Camera

  • Kim, Jae-Yeol;Yang, Dong-Jo;Kim, Chang-Hyun
    • International Journal of Control, Automation, and Systems / v.1 no.4 / pp.511-514 / 2003
  • An infrared camera detects the infrared radiation emitted by an object in order to visualize its temperature distribution, and infrared diagnosis systems can be applied to various fields. In a total inspection system for specialty shoes, defect discrimination can be automated or mechanized. This study introduces a method for nondestructive total inspection of specialty shoes; the performance of the proposed method is demonstrated through thermal images.
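
The screening step such a system implies can be sketched as flagging thermal-image cells whose temperature deviates strongly from their surroundings. The 8x8 "thermal image", temperatures, and threshold below are all invented for illustration; the paper's actual discrimination method is not specified in the abstract.

```python
# Illustrative sketch: flag cells of a thermal image that deviate from the
# mean by more than k standard deviations. All values are invented.
import statistics

def flag_defects(temps, k=2.0):
    """Return (row, col) cells deviating more than k sigma from the mean."""
    flat = [t for row in temps for t in row]
    mu, sigma = statistics.mean(flat), statistics.pstdev(flat)
    return [(r, c) for r, row in enumerate(temps)
            for c, t in enumerate(row) if abs(t - mu) > k * sigma]

image = [[30.0] * 8 for _ in range(8)]
image[3][4] = 36.0          # a poorly bonded spot retains heat
print(flag_defects(image))  # [(3, 4)]
```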

An Efficient Camera Calibration Method for Head Pose Tracking (머리의 자세를 추적하기 위한 효율적인 카메라 보정 방법에 관한 연구)

  • Park, Gyeong-Su;Im, Chang-Ju;Lee, Gyeong-Tae
    • Journal of the Ergonomics Society of Korea / v.19 no.1 / pp.77-90 / 2000
  • The aim of this study is to develop and evaluate an efficient camera calibration method for vision-based head tracking. Tracking head movements is important in the design of an eye-controlled human/computer interface, and a vision-based head tracking system was proposed to accommodate the user's head movements in such an interface. We propose an efficient camera calibration method to track the 3D position and orientation of the user's head accurately, and we evaluate its performance. The experimental error analysis shows that the proposed method provides more accurate and stable camera pose (i.e., position and orientation) than the conventional direct linear transformation method commonly used for camera calibration. The results of this study can be applied to head-movement tracking for eye-controlled human/computer interfaces and virtual reality technology.
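
The direct linear transformation (DLT) baseline the abstract compares against can be sketched as follows: estimate the 3x4 projection matrix from known 3D-2D correspondences via an SVD null-space solve, then check reprojection. The points below are synthetic, generated from a known projection matrix; this is the classical DLT, not the paper's improved method.

```python
# Sketch of the classical DLT camera calibration used as the baseline:
# solve for the 3x4 projection matrix P from 3D-2D point pairs via SVD.
# All points and intrinsics are synthetic.
import numpy as np

def dlt(points3d, points2d):
    A = []
    for (X, Y, Z), (u, v) in zip(points3d, points2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 4)        # null-space vector -> projection matrix

def project(P, pt):
    u, v, w = P @ np.append(pt, 1.0)
    return np.array([u / w, v / w])

# Synthetic ground truth: focal length 500 px, principal point (320, 240)
P_true = np.array([[500, 0, 320, 0], [0, 500, 240, 0], [0, 0, 1, 0]], float)
pts3d = [(0.1, 0.2, 2.0), (-0.3, 0.1, 2.5), (0.4, -0.2, 3.0),
         (-0.1, -0.3, 1.5), (0.2, 0.4, 2.2), (0.0, 0.0, 2.8)]
pts2d = [project(P_true, p) for p in pts3d]
P_est = dlt(pts3d, pts2d)
err = max(np.linalg.norm(project(P_est, p) - q) for p, q in zip(pts3d, pts2d))
print(err)  # reprojection error; ~0 for noise-free points
```

With noisy real measurements the DLT's linear solve is sensitive to error propagation, which is the instability the proposed method aims to improve on.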


A New Hand-eye Calibration Technique to Compensate for the Lens Distortion Effect (렌즈왜곡효과를 보상하는 새로운 Hand-eye 보정기법)

  • Chung, Hoi-Bum
    • Proceedings of the KSME Conference / 2000.11a / pp.596-601 / 2000
  • In a robot/vision system, the vision sensor, typically a CCD array sensor, is mounted on the robot hand. The problem of determining the relationship between the camera frame and the robot hand frame is referred to as hand-eye calibration. In the literature, various methods have been suggested for camera calibration and sensor registration. Recently, a one-step approach combining camera calibration and sensor registration was suggested by Horaud & Dornaika; in this approach, the camera's extrinsic parameters need not be determined at every robot configuration. In this paper, by modifying the camera model and including the lens distortion effect in the perspective transformation matrix, a new one-step approach to hand-eye calibration is proposed.
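
The lens distortion effect being folded into the camera model can be sketched with the standard two-parameter radial model. The coefficients k1 and k2 below are illustrative values, not from the paper.

```python
# Sketch of the radial lens distortion a calibration must compensate for:
# normalized image coordinates are scaled by a polynomial in r^2.
# Coefficients are illustrative (negative k1 = barrel distortion).
def distort(x, y, k1=-0.2, k2=0.05):
    """Apply two-parameter radial distortion to normalized coordinates (x, y)."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point on the optical axis is unaffected; off-axis points pull inward:
print(distort(0.0, 0.0))   # (0.0, 0.0)
print(distort(0.5, 0.0))   # x < 0.5 due to barrel distortion
```

Ignoring this term biases the estimated hand-eye transform most strongly for points imaged near the edge of the lens, where r^2 is largest.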


Control of an Underwater Stereo Camera Embedded in a Single Canister Capable of Measuring Distance (거리측정이 가능한 단동형 수중 스테레오 카메라의 제어)

  • 이판묵;전봉환;이종무
    • Proceedings of the Korea Committee for Ocean Resources and Engineering Conference / 2000.10a / pp.90-95 / 2000
  • This paper presents the vergence control of a parallel stereo camera and its application to an underwater stereo camera to enhance the working efficiency of underwater vehicles equipped with manipulators in seabed operations. The stereo camera consists of two parallel lenses mounted on a laterally moving base and two CCD cameras mounted on a longitudinally moving base, embedded in a small pressure canister for underwater use. Because the lateral shift is related to the backward shift by a nonlinear relation, only one control input is needed to control the vergence and focus of the camera with a special driving device. The camera provides clear stereo vision over the full range of objects in air and in water, especially for short-range objects. The control system is simple enough that a small stereo camera system can be realized and applied as a stereo vision system for underwater vehicles. This paper also shows how to acquire distance information for an underwater object with this stereo camera: whenever the camera is focused on an underwater object, three-dimensional images and distance information are obtained in real time.
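
The ranging principle behind any calibrated stereo pair can be sketched in one line: with baseline B and focal length f (in pixels), an object observed at horizontal disparity d pixels lies at range Z = f·B/d. The baseline and focal length below are illustrative, not the canister camera's actual parameters.

```python
# Sketch of stereo triangulation: range from disparity, Z = f * B / d.
# Baseline and focal length are illustrative values.
def stereo_range(disparity_px, baseline_m=0.12, focal_px=800.0):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite range")
    return focal_px * baseline_m / disparity_px

print(stereo_range(48.0))  # 2.0 m
```

Because range is inversely proportional to disparity, precision is best at short range, which matches the camera's emphasis on short-range objects.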


Control of an Underwater Stereo Camera Embedded in a Single Canister Capable of Measuring Distance (거리측정이 가능한 단동형 수중 스테레오 카메라의 제어)

  • 이판묵;전봉환;이종무
    • Journal of Ocean Engineering and Technology / v.15 no.1 / pp.79-84 / 2001
  • This paper presents the control of the image disparity of a parallel stereo camera and its application to an underwater stereo camera to enhance the working efficiency of underwater vehicles equipped with manipulators in seabed operations. The stereo camera consists of two parallel lenses mounted on a laterally moving base and two CCD cameras mounted on a longitudinally moving base, embedded in a small pressure canister for underwater use. Because the lateral shift is related to the backward shift by a nonlinear relation, only one control input is needed to control the vergence and focus of the camera with a special driving device. The camera provides clear stereo vision over the full range of objects in air and in water, especially for short-range objects. The control system is simple enough that a small stereo camera system can be realized and applied as a stereo vision system for underwater vehicles. This paper also shows how to acquire distance information for an underwater object with this stereo camera: whenever the camera is focused on an underwater object, three-dimensional images and distance information are obtained in real time.


An Improved Fast Camera Calibration Method for Mobile Terminals

  • Guan, Fang-li;Xu, Ai-jun;Jiang, Guang-yu
    • Journal of Information Processing Systems / v.15 no.5 / pp.1082-1095 / 2019
  • Camera calibration is an important part of machine vision and close-range photogrammetry. Since current calibration methods fail to obtain ideal internal and external camera parameters efficiently with the limited computing resources of mobile terminals, this paper proposes an improved fast camera calibration method for mobile terminals. Building on the traditional camera calibration method, the new method introduces second-order radial and tangential distortion models to establish a camera model with nonlinear distortion terms. The Levenberg-Marquardt (L-M) nonlinear least-squares algorithm is used to optimize the parameter iteration, so the new method can quickly obtain precise internal and external camera parameters. Experimental results show that the new method improves both the efficiency and the precision of camera calibration. A terminal simulation experiment on a PC indicates that the time consumed by parameter iteration was reduced from 0.220 seconds to 0.063 seconds (0.234 seconds on mobile terminals) and the average reprojection error was reduced from 0.25 pixels to 0.15 pixels. The new method is therefore well suited to mobile-terminal camera calibration and can expand the application range of 3D reconstruction and close-range photogrammetry on mobile terminals.
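
The L-M iteration the abstract describes can be sketched on a toy problem: fitting a single radial distortion coefficient k1 so that a model r_d = r(1 + k1·r²) matches synthetic observations. The data, damping schedule, and problem size are all invented; the paper optimizes the full set of intrinsic, extrinsic, and distortion parameters the same way.

```python
# Sketch of a Levenberg-Marquardt iteration, reduced to one parameter (a
# radial distortion coefficient). Observations are synthetic, generated from
# a known k1_true; all other values are illustrative.
import numpy as np

r = np.linspace(0.1, 0.7, 20)            # undistorted radii (normalized)
k1_true = -0.25
obs = r * (1 + k1_true * r ** 2)         # "measured" distorted radii

def residuals(k1):
    return r * (1 + k1 * r ** 2) - obs

k1, lam = 0.0, 1e-3                      # initial guess and damping factor
for _ in range(20):
    res = residuals(k1)
    J = r ** 3                           # d(residual)/d(k1)
    # Damped normal equation: (J^T J + lambda) * delta = -J^T res
    delta = -(J @ res) / (J @ J + lam)
    if np.sum(residuals(k1 + delta) ** 2) < np.sum(res ** 2):
        k1, lam = k1 + delta, lam * 0.5  # accept step, reduce damping
    else:
        lam *= 10.0                      # reject step, increase damping
print(k1)  # converges toward k1_true = -0.25
```

The damping term is what distinguishes L-M from plain Gauss-Newton: large damping makes small, safe gradient-like steps; small damping recovers fast Newton-like convergence near the optimum.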

Entity Matching for Vision-Based Tracking of Construction Workers Using Epipolar Geometry (영상 내 건설인력 위치 추적을 위한 등극선 기하학 기반의 개체 매칭 기법)

  • Lee, Yong-Joo;Kim, Do-Wan;Park, Man-Woo
    • Journal of KIBIM / v.5 no.2 / pp.46-54 / 2015
  • Vision-based tracking has been proposed as a means to efficiently track the large number of construction resources operating in a congested site. Obtaining the 3D coordinates of an object requires stereo-vision theory, and detecting and tracking multiple objects requires an entity matching process that finds corresponding pairs of detected entities across the two camera views. This paper proposes an efficient entity matching method for tracking construction workers. The proposed method uses epipolar geometry, which represents the relationship between the two fixed cameras: each pixel coordinate in one camera view projects onto the other camera view as an epipolar line. The method finds the matching pair for a worker entity by comparing the proximity of all detected entities in the other view to that epipolar line. Experimental results demonstrate its suitability for automated entity matching in 3D vision-based tracking of construction workers.
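
The matching criterion can be sketched directly: given the fundamental matrix F relating the two views, a point x in view 1 maps to the epipolar line l = F·x in view 2, and the detected entity closest to that line is taken as the match. The F and the worker coordinates below are synthetic (a pure horizontal-shift, rectified geometry), not from the paper's site setup.

```python
# Sketch of epipolar-line entity matching: project a view-1 point to its
# epipolar line in view 2 via the fundamental matrix F, then pick the
# nearest candidate. F and all coordinates are synthetic.
import numpy as np

def point_line_distance(line, pt):
    a, b, c = line
    x, y = pt
    return abs(a * x + b * y + c) / np.hypot(a, b)

def match_entity(F, x1, candidates):
    """Return the view-2 candidate nearest the epipolar line of view-1 point x1."""
    line = F @ np.array([x1[0], x1[1], 1.0])
    return min(candidates, key=lambda p: point_line_distance(line, p))

# For a rectified pair shifted along x, corresponding points share a row (y):
F = np.array([[0, 0, 0], [0, 0, -1], [0, 1, 0]], float)
workers_view2 = [(140.0, 80.0), (200.0, 205.0), (90.0, 310.0)]
print(match_entity(F, (120.0, 200.0), workers_view2))  # nearest to row y = 200
```

Once matched, each pair of image points can be triangulated with the two camera matrices to recover the worker's 3D position.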