• Title/Abstract/Keyword: Camera orientation

Search results: 312

카메라 캘리브레이션을 이용한 이동로봇의 위치 및 자세 추정 (Estimation of the position and orientation of the mobile robot using camera calibration)

  • 정기주;최명환;이범희;고명삼
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 제어로봇시스템학회 1992년도 한국자동제어학술회의논문집(국내학술편); KOEX, Seoul; 19-21 Oct. 1992
    • /
    • pp.786-791
    • /
    • 1992
  • When a mobile robot moves from one place to another, position error occurs due to the limited accuracy of the robot and the effect of environmental noise. In this paper, an accurate method of estimating the position and orientation of a mobile robot using camera calibration is proposed. A Kalman filter is used as the estimation algorithm. The uncertainty in the position of the camera with respect to the robot base frame is considered as well as the position error of the robot. Besides developing the mathematical model for the mobile robot calibration system, the effect of the relative position between the camera and the calibration points is analyzed, and a method to select the most accurate calibration points is also presented.

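The entry above describes a Kalman-filter estimator for a mobile robot's position and orientation driven by camera-calibration measurements. The following is a minimal sketch of such a predict/update cycle for a pose state [x, y, theta]; the unicycle motion model, the direct pose measurement, and the noise covariances are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch: extended-Kalman-style pose update for a mobile robot.
# The unicycle motion model and the direct pose measurement are assumptions
# made for illustration; they are not taken from the cited paper.
import numpy as np

def predict(x, P, v, w, dt, Q):
    """Propagate pose [x, y, theta] with a unicycle motion model."""
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0, 1]])
    return x_pred, F @ P @ F.T + Q

def update(x, P, z, R):
    """Correct the pose with a camera-derived pose measurement z."""
    H = np.eye(3)                      # measurement = full pose (assumed)
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

x, P = np.zeros(3), np.eye(3) * 0.1
x, P = predict(x, P, v=0.5, w=0.1, dt=0.1, Q=np.eye(3) * 1e-3)
x, P = update(x, P, z=np.array([0.05, 0.01, 0.012]), R=np.eye(3) * 1e-2)
print(x)
```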

인터넷기반의 원격제어 카메라 시스템 개발 (Development of a remote control camera system based on internet)

  • 최기훈;김영탁
    • 한국정밀공학회:학술대회논문집
    • /
    • 한국정밀공학회 1997년도 추계학술대회 논문집
    • /
    • pp.504-506
    • /
    • 1997
  • A CCD camera is usually used as a monitoring device, and in most cases the monitoring is performed from a long distance. In this study, a camera system controlled through the internet is developed. Using the system, we can not only obtain image information in real time but also control the orientation of the camera from a long distance. Furthermore, wireless communication is carried out between the camera and the server computer for a variety of applications.

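As a rough illustration of the remote-orientation control described above, the sketch below sends a pan/tilt command to a hypothetical camera server over TCP. The host, port, and text command format are invented for this example; the cited system's actual protocol is not described in the abstract.

```python
# Hedged sketch: sending a pan/tilt command to a hypothetical networked
# camera server. The address and the text command format are invented
# for illustration only.
import socket

def send_orientation(host: str, port: int, pan_deg: float, tilt_deg: float) -> str:
    command = f"PAN {pan_deg:.1f} TILT {tilt_deg:.1f}\n".encode()
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall(command)
        return sock.recv(64).decode().strip()   # e.g. an "OK" acknowledgement

# Example (assumes a server is listening at this address):
# print(send_orientation("192.168.0.10", 9000, pan_deg=15.0, tilt_deg=-5.0))
```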

레이저 카메라를 이용한 용접선의 추적 (A seam tracking algorithm based on laser vision)

  • 조현중;류현;오세영
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 제어로봇시스템학회 1996년도 한국자동제어학술회의논문집(국내학술편); 포항공과대학교, 포항; 24-26 Oct. 1996
    • /
    • pp.593-596
    • /
    • 1996
  • A seam tracking control system with tool position control and camera orientation control has been developed. For the camera orientation control, a SOFNN was used to learn the expert control signal. The SOFNN algorithm can adjust the fuzzy set parameters and determine the fuzzy logic structure.

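The abstract above says a SOFNN learns the expert's camera-orientation control signal by adjusting fuzzy set parameters. The sketch below is a much-reduced stand-in: a fixed set of Gaussian memberships whose consequent weights are fitted to an assumed expert law by gradient updates. The single-input structure, membership parameters, and training rule are illustrative assumptions, not the paper's algorithm.

```python
# Hedged sketch: a tiny fuzzy inference system whose rule weights are fitted
# to an "expert" control signal, loosely in the spirit of the SOFNN-based
# camera-orientation control described above. All parameters are assumptions.
import numpy as np

centers = np.linspace(-1.0, 1.0, 5)      # Gaussian membership centers
sigma = 0.4
weights = np.zeros(5)                    # consequent (singleton) weights

def fuzzy_out(e):
    """Normalized weighted-average defuzzification for a scalar error e."""
    mu = np.exp(-((e - centers) ** 2) / (2 * sigma ** 2))
    return mu, float(mu @ weights / (mu.sum() + 1e-9))

def train(errors, expert_signals, lr=0.1, epochs=50):
    """LMS-style update of the rule weights toward the expert signal."""
    for _ in range(epochs):
        for e, u_exp in zip(errors, expert_signals):
            mu, u = fuzzy_out(e)
            weights[:] += lr * (u_exp - u) * mu / (mu.sum() + 1e-9)

# Fit to a made-up expert law u = -0.8 * e, then query the controller.
e_train = np.linspace(-1, 1, 21)
train(e_train, -0.8 * e_train)
print(fuzzy_out(0.5)[1])   # roughly -0.4
```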

De Jong-Bouman Camera를 이용한 진동사진을 촬영하기 위하여 결정방향을 속히 맞추는 방법 (A Quick Adjustment Method for Crystal Orientation in Oscillation Photography using the de Jong-Bouman Camera)

  • 서일환;이진호
    • 한국결정학회지
    • /
    • 제4권2호
    • /
    • pp.49-53
    • /
    • 1993
  • The theory of a method for adjusting the specimen orientation in oscillation photography using the de Jong-Bouman camera is discussed. By measuring the deviations of the zero-layer diffraction spots from the equatorial line and taking their sum and difference, one obtains the corrections for the horizontal and vertical arcs, respectively.

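According to the abstract above, the sum and difference of the zero-layer spot deviations from the equatorial line give the horizontal and vertical arc corrections. The sketch below turns two assumed symmetric deviations into angular corrections at an assumed crystal-to-film radius; the paper's exact geometry is not reproduced here.

```python
# Hedged sketch: converting two measured spot deviations from the equatorial
# line (in mm) into horizontal/vertical goniometer-arc corrections (degrees).
# The symmetric-spot assumption and the sum/difference assignment follow the
# abstract's wording; the actual geometry in the paper may differ.
import math

def arc_corrections(d1_mm, d2_mm, radius_mm):
    horiz = math.degrees(math.atan((d1_mm + d2_mm) / radius_mm))
    vert = math.degrees(math.atan((d1_mm - d2_mm) / radius_mm))
    return horiz, vert

print(arc_corrections(0.6, 0.2, 57.3))   # small angular corrections in degrees
```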

Feasibility of Using an Automatic Lens Distortion Correction (ALDC) Camera in a Photogrammetric UAV System

  • Jeong, Hohyun;Ahn, Hoyong;Park, Jinwoo;Kim, Hyungwoo;Kim, Sangseok;Lee, Yangwon;Choi, Chuluong
    • 한국측량학회지
    • /
    • 제33권6호
    • /
    • pp.475-483
    • /
    • 2015
  • This study examined the feasibility of using an automatic lens distortion correction (ALDC) camera as the payload for a photogrammetric unmanned aerial vehicle (UAV) system. First, lens distortion for the interior orientation (IO) parameters was estimated. Although previous studies have largely ignored decentering distortion, this study revealed that more than 50% of the distortion of the ALDC camera was caused by decentering distortion. Second, we compared the accuracy of bundle adjustment for camera calibration using three image types: raw imagery without the ALDC option; imagery corrected using lens profiles; and imagery with the ALDC option. The results of image triangulation, the digital terrain model (DTM), and the orthoimage using the IO parameters for the ALDC camera were similar to or slightly better than the results using self-calibration. These results confirm that the ALDC camera can be used in a photogrammetric UAV system using only self-calibration.
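
For reference, the interior-orientation distortion discussed above (radial plus decentering terms) is commonly written in the Brown-Conrady form sketched below; the coefficient values are placeholders, not the ones estimated in the study.

```python
# Hedged sketch: Brown-Conrady lens distortion with radial (k1, k2) and
# decentering (p1, p2) terms, applied to normalized image coordinates.
# Coefficient values are placeholders for illustration.
import numpy as np

def distort(x, y, k1, k2, p1, p2):
    """Apply radial + decentering distortion to normalized image coords."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 ** 2
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

print(distort(0.3, -0.2, k1=-0.15, k2=0.02, p1=1e-3, p2=-5e-4))
```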

고해상도 카메라와의 동시 운영을 통한 드론 다분광카메라의 외부표정 및 영상 위치 정밀도 개선 연구 (Improving Precision of the Exterior Orientation and the Pixel Position of a Multispectral Camera onboard a Drone through the Simultaneous Utilization of a High Resolution Camera)

  • 백승일;변민수;김원국
    • 한국측량학회지
    • /
    • 제39권6호
    • /
    • pp.541-548
    • /
    • 2021
  • Recently, the use of multispectral cameras, in particular mounted on drones, has been increasing in various fields such as agriculture, forest management, and coastal environment monitoring. The resulting multispectral images are usually georeferenced using the GPS (Global Positioning System) or IMU (Inertial Measurement Unit) sensors onboard the drone, and directly surveyed ground control points are sometimes used for higher accuracy. However, because of the cost and time of direct surveying, or for areas that are difficult to access, georeferencing often has to be performed without ground reference values. This study investigates a way to use the imagery of a co-mounted high-resolution RGB camera to improve the georeferencing precision of multispectral camera imagery when ground control points are not available. The exterior orientation parameters of the cameras were first estimated from the drone images through bundle adjustment and were compared with the exterior orientation and position results obtained using ground control points. The experiments confirmed that, when the high-resolution images were included in the bundle adjustment, the georeferencing error of the multispectral camera images decreased dramatically compared with using the multispectral images alone. In addition, an analysis of the error in estimating the direction angle from a ground point to the drone showed that including the high-resolution RGB images in the bundle adjustment likewise reduced the direction-angle error by more than an order of magnitude.
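
The bundle adjustment referred to above minimizes image-space misclosures of the collinearity equations over all exposures, whether multispectral or RGB. A minimal sketch of that residual is given below; the rotation, exposure station, ground point, and focal length are illustrative values, not data from the study.

```python
# Hedged sketch: the collinearity-equation residual that a bundle adjustment
# minimizes when multispectral and RGB images are triangulated together.
# All numeric values are illustrative placeholders.
import numpy as np

def reproject(X, R, C, f):
    """Project ground point X with rotation R, camera centre C, focal length f."""
    p = R @ (X - C)                        # point in the camera frame
    return np.array([-f * p[0] / p[2], -f * p[1] / p[2]])

def residual(obs_xy, X, R, C, f):
    return obs_xy - reproject(X, R, C, f)  # image-space misclosure

R = np.eye(3)                              # level camera (assumed)
C = np.array([10.0, 20.0, 100.0])          # exposure station
X = np.array([12.0, 18.0, 5.0])            # ground point
obs = np.array([0.00021, -0.00021])        # observed image coordinates (m)
print(residual(obs, X, R, C, f=0.01))      # small misclosure for these values
```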

단사진의 외부표정요소 결정을 위한 후방교회법 알고리즘의 비교 (Comparisons of Single Photo Resection Algorithms for the Determination of Exterior Orientation Parameters)

  • 김의명;서홍덕
    • 한국측량학회지
    • /
    • 제38권4호
    • /
    • pp.305-315
    • /
    • 2020
  • The purpose of this study is to compare single photo resection algorithms that determine the exterior orientation parameters used in fields such as photogrammetry, computer vision, and robotics. For this purpose, experimental data were generated by simulating terrain for cameras used in aerial and close-range photogrammetry, and the algorithms were compared. In the experiment with an aerial photogrammetric camera, which images almost vertically, the exterior orientation parameters could be determined with only three ground control points, although the Procrustes algorithm was sensitive to the arrangement of the ground control points. In the experiment with a close-range photogrammetric camera, whose attitude angles vary widely, the Procrustes algorithm was again sensitive to the arrangement of the ground control points, and all algorithms required at least six ground control points. The experiments with the two types of cameras showed that the law-of-cosines-based resection requires few iterations and no explicit initial values, and therefore performs comparably to the traditional photogrammetric algorithm.
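
The law-of-cosines formulation mentioned above reduces single photo resection to solving for the distances from the projection centre to three ground control points. A minimal numerical sketch follows; the control points and inter-ray angle cosines are made-up but mutually consistent values.

```python
# Hedged sketch: the law-of-cosines core of single photo resection. Given
# three ground control points and the cosines of the angles between their
# image rays, solve for the camera-to-point distances s1, s2, s3; the camera
# pose then follows from those distances. All values below are made up.
import numpy as np
from scipy.optimize import least_squares

P = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])  # GCPs
d12 = np.linalg.norm(P[0] - P[1])
d13 = np.linalg.norm(P[0] - P[2])
d23 = np.linalg.norm(P[1] - P[2])
cos12, cos13, cos23 = 0.995, 0.995, 0.990   # inter-ray angle cosines (assumed)

def equations(s):
    s1, s2, s3 = s
    return [s1**2 + s2**2 - 2*s1*s2*cos12 - d12**2,
            s1**2 + s3**2 - 2*s1*s3*cos13 - d13**2,
            s2**2 + s3**2 - 2*s2*s3*cos23 - d23**2]

sol = least_squares(equations, x0=[50.0, 50.0, 50.0])
print(sol.x)   # roughly [100, 100, 100] for these consistent made-up values
```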

Multi-camera System Calibration with Built-in Relative Orientation Constraints (Part 1) Theoretical Principle

  • Lari, Zahra;Habib, Ayman;Mazaheri, Mehdi;Al-Durgham, Kaleel
    • 한국측량학회지
    • /
    • 제32권3호
    • /
    • pp.191-204
    • /
    • 2014
  • In recent years, multi-camera systems have been recognized as an affordable alternative for the collection of 3D spatial data from physical surfaces. The collected data can be applied for different mapping (e.g., mobile mapping and mapping inaccessible locations) or metrology applications (e.g., industrial, biomedical, and architectural). In order to fully exploit the potential accuracy of these systems and ensure successful manipulation of the involved cameras, a careful system calibration should be performed prior to the data collection procedure. The calibration of a multi-camera system is accomplished when the individual cameras are calibrated and the geometric relationships among the different system components are defined. In this paper, a new single-step approach is introduced for the calibration of a multi-camera system (i.e., individual camera calibration and estimation of the lever-arm and boresight angles among the system components). In this approach, one of the cameras is set as the reference camera and the system mounting parameters are defined relative to that reference camera. The proposed approach is easy to implement and computationally efficient. The major advantage of this method, when compared to available multi-camera system calibration approaches, is the flexibility of being applied for either directly or indirectly geo-referenced multi-camera systems. The feasibility of the proposed approach is verified through experimental results using real data collected by a newly-developed indirectly geo-referenced multi-camera system.
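
A minimal sketch of the pose composition implied by the mounting parameters described above: given the reference camera's exterior orientation plus the lever-arm and boresight of a secondary camera, the secondary camera's exterior orientation follows directly. All numeric values are placeholders, not calibration results from the paper.

```python
# Hedged sketch: composing a reference camera pose with relative orientation
# parameters (lever-arm and boresight) to obtain a secondary camera's pose.
import numpy as np

def rot_z(deg):
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0],
                     [np.sin(a),  np.cos(a), 0],
                     [0, 0, 1]])

R_ref = rot_z(30.0)                    # reference camera rotation (placeholder)
C_ref = np.array([100.0, 200.0, 50.0]) # reference camera position in the world
R_rel = rot_z(5.0)                     # boresight: secondary w.r.t. reference
lever = np.array([0.2, 0.0, 0.1])      # lever-arm in the reference camera frame

R_sec = R_ref @ R_rel                  # secondary camera rotation
C_sec = C_ref + R_ref @ lever          # secondary camera position
print(R_sec, C_sec)
```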

Defects Length Measurement using an Estimation Algorithm of the Camera Orientation and an Inclination Angle of a Laser Slit Beam

  • Kim, Young-Hwan;Yoon, Ji-Sup;Kang, E-Sok
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 제어로봇시스템학회 2004년도 ICCAS
    • /
    • pp.1452-1457
    • /
    • 2004
  • In this paper, a method of measuring the length of defects on a wall and reconstructing the defect image is proposed, based on an algorithm that estimates the camera orientation from the inclination angle of a laser slit beam. The estimation algorithm for the horizontally inclined angle of the CCD camera adopts a 3-dimensional coordinate transformation of the image plane on which both the laser beam and the original image of the defects exist. The estimation equation is obtained by using the information of the beam projected on the wall, and the parameters of this equation are obtained experimentally. With this algorithm, the original image of the defect can be reconstructed into an image normal to the wall. A series of experiments shows that the defect length is measured within a 0.5% error bound of the real defect size for horizontally inclined angles of up to 30 degrees. The proposed algorithm provides a way to reconstruct an image taken at an arbitrary horizontally inclined angle into an image normal to the wall, and thus enables accurate measurement of defect lengths using only a single camera and a laser slit beam.

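One way to realize the reconstruction step described above is a rotation-only homography that rewarps the obliquely viewed wall into a wall-normal view once the camera's horizontal inclination is known. The sketch below assumes illustrative intrinsics and a 25-degree angle; it is not the paper's exact formulation.

```python
# Hedged sketch: rectifying an obliquely viewed planar wall with the
# rotation-only homography H = K R K^{-1}, using an estimated camera
# inclination. Intrinsics and angle are illustrative assumptions.
import numpy as np

K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
theta = np.radians(25.0)                       # assumed pan of camera vs. wall
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
H = K @ R @ np.linalg.inv(K)                   # maps oblique pixels -> rectified

def rectify_point(u, v):
    p = H @ np.array([u, v, 1.0])
    return p[:2] / p[2]

print(rectify_point(400.0, 250.0))
# In the rectified (wall-normal) view the planar wall has constant depth, so
# pixel distances scale uniformly and a defect length can be measured from a
# single camera plus the slit-beam range cue.
```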

불확실한 환경에서 매니퓰레이터 위치제어를 위한 실시간 비젼제어기법에 관한 연구 (A Study on the Real-Time Vision Control Method for Manipulator's position Control in the Uncertain Circumstance)

  • 정완식;김경석;신광수;주철;김재확;윤현권
    • 한국정밀공학회지
    • /
    • 제16권12호
    • /
    • pp.87-98
    • /
    • 1999
  • This study concentrates on the development of a real-time estimation model and a vision control method, together with experimental tests. The proposed method permits a kind of adaptability not otherwise available, in that the relationship between the camera-space location of manipulable visual cues and the vector of manipulator joint coordinates is estimated in real time. This is done based on an estimation model that generalizes known manipulator kinematics to accommodate unknown relative camera position and orientation as well as the uncertainty of the manipulator. This vision control method is robust and reliable, and overcomes the difficulties of conventional approaches, such as precise calibration of the vision sensor, exact kinematic modeling of the manipulator, and correct knowledge of the position and orientation of the CCD camera with respect to the manipulator base. Finally, evidence of the ability of the real-time vision control method for manipulator position control is provided by performing thin-rod placement in space with a two-cue test model, which is completed without prior knowledge of camera or manipulator positions. This feature opens the door to a range of manipulation applications, including a mobile manipulator with stationary cameras tracking and providing information for control of the manipulator.

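The real-time estimation described above refreshes the mapping between camera-space cue locations and manipulator joint coordinates as new observations arrive. The sketch below substitutes a simple affine map fitted by recursive least squares; the paper's camera-model-plus-kinematics parameterization is richer, so this is only an illustrative stand-in.

```python
# Hedged sketch: a recursive least-squares estimator that refreshes, in real
# time, an affine map from manipulator joint coordinates to the camera-space
# location of a visual cue. The affine map and all numbers are assumptions.
import numpy as np

n_joints = 3
theta_hat = np.zeros((n_joints + 1, 2))       # affine parameters (with bias row)
P = np.eye(n_joints + 1) * 1e3                # parameter covariance
lam = 0.98                                    # forgetting factor

def rls_update(q, cue_xy):
    """q: joint angles; cue_xy: observed camera-space cue position."""
    global P
    phi = np.append(q, 1.0)                   # regressor with bias term
    K = P @ phi / (lam + phi @ P @ phi)       # gain vector
    theta_hat[:] += np.outer(K, cue_xy - phi @ theta_hat)
    P = (P - np.outer(K, phi @ P)) / lam

# Feed simulated samples from a made-up true map, then predict a cue position.
rng = np.random.default_rng(0)
A_true = rng.normal(size=(n_joints + 1, 2))
for _ in range(200):
    q = rng.uniform(-1, 1, n_joints)
    rls_update(q, np.append(q, 1.0) @ A_true + rng.normal(scale=0.01, size=2))
print(np.append([0.2, -0.4, 0.1], 1.0) @ theta_hat)   # predicted cue location
```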