• Title/Summary/Keyword: camera alignment

A Study of Alignment Tolerance's Definition and Test Method for Airborne Camera (항공기 탑재용 카메라 정렬오차 정의 및 시험방안 연구)

  • Song, Dae-Buem;Yoon, Yong-Eun;Lee, Hang-Bok
    • Journal of the Korea Institute of Military Science and Technology / v.16 no.2 / pp.154-159 / 2013
  • The alignment tolerance of an EO/IR airborne camera that uses common optics is an important factor in stabilization accuracy and geo-pointing accuracy. Before an airborne camera is mounted on the aircraft, defining its alignment tolerance and verifying it is essential in production as well as in research and development. In this paper we establish the basic concept and elements of alignment tolerance for an airborne camera and propose how to measure each of those elements. The components and measurement sequence of the alignment tolerance are as follows: 1) tolerance of alignment between the EO and IR lines of sight (LOS); 2) tolerance of sensor alignment; 3) tolerance of position reporting accuracy; 4) tolerance of mount alignment.
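
The paper defines the tolerance components and their measurement sequence; it does not state here how they are combined. As an editorial illustration only, the sketch below combines assumed component values by root-sum-square (RSS) against an assumed overall line-of-sight budget; none of the numbers come from the paper.

```python
import math

# Hypothetical angular tolerance components (mrad); the values are illustrative
# assumptions, not figures from the paper.
components = {
    "eo_ir_los_alignment": 0.30,   # 1) EO vs. IR line-of-sight alignment
    "sensor_alignment": 0.25,      # 2) sensor alignment
    "position_reporting": 0.40,    # 3) position reporting accuracy
    "mount_alignment": 0.50,       # 4) mount alignment
}

def rss(errors):
    """Combine independent angular error contributors by root-sum-square."""
    return math.sqrt(sum(e ** 2 for e in errors))

budget_mrad = 1.0  # assumed overall LOS alignment budget
total = rss(components.values())
status = "within" if total <= budget_mrad else "exceeds"
print(f"RSS of components: {total:.3f} mrad ({status} the {budget_mrad} mrad budget)")
```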

Computer-Aided Alignment of an Earth Observation Camera (컴퓨터를 이용한 지구관측 카메라의 광학정렬)

  • Kim, Eugene D.;Choi, Young-Wan;Kang, Myung-Seok;Kim, Ee-Eul;Yang, Ho-Soon
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.32 no.10 / pp.142-146 / 2004
  • Spaceborne earth-observation and astronomical payloads often use Cassegrain-type telescopes because of limits on mass and volume. Precise optical alignment of such a telescope is vital to the success of the mission. This paper describes the alignment simulation and experimental application of a computer-aided alignment method during the assembly of the MAC (Medium-sized Aperture Camera) telescope for spaceborne earth observation.
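
Computer-aided alignment of a two-mirror telescope is commonly posed as a linear sensitivity problem: measured aberration coefficients are related to secondary-mirror misalignments through a sensitivity matrix, and the corrective moves are obtained by least squares. The sketch below shows only that generic formulation; the sensitivity matrix and measured values are placeholders, not data from the MAC telescope.

```python
import numpy as np

# Sensitivity matrix (placeholder numbers): rows are measured aberration terms
# (e.g., coma x/y, astigmatism x/y), columns are secondary-mirror degrees of
# freedom (decenter x, decenter y, tilt x, tilt y).
S = np.array([
    [0.80, 0.00, 0.50, 0.00],
    [0.00, 0.80, 0.00, 0.50],
    [0.10, 0.00, 0.70, 0.00],
    [0.00, 0.10, 0.00, 0.70],
])

# Aberration coefficients measured after assembly (placeholder values).
measured = np.array([0.12, -0.08, 0.05, 0.02])

# Solve S @ x = measured for the misalignment state; command the negative as the correction.
misalignment, *_ = np.linalg.lstsq(S, measured, rcond=None)
print("estimated misalignment:", np.round(misalignment, 4))
print("corrective adjustment: ", np.round(-misalignment, 4))
```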

A Wafer Pre-Alignment System Using One Image of a Whole Wafer (하나의 웨이퍼 전체 영상을 이용한 웨이퍼 Pre-Alignment 시스템)

  • Koo, Ja-Myoung;Cho, Tai-Hoon
    • Journal of the Semiconductor & Display Technology / v.9 no.3 / pp.47-51 / 2010
  • This paper presents a wafer pre-alignment system that is improved by using an image of the entire wafer area. In the previous method, image acquisition takes about 80% of the total pre-alignment time. The proposed system uses only one image of the entire wafer area captured by a high-resolution CMOS camera, so image acquisition accounts for nearly 1% of the total process time. The larger field of view (FOV) required to image the entire wafer worsens camera lens distortion, so a camera calibration using high-order polynomials is used for accurate lens distortion correction, and template matching is used to find the correct notch position. The performance of the proposed system was demonstrated by experiments on wafer center alignment and notch alignment.
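
A polynomial calibration of the kind the abstract mentions can be fitted from a grid of target points whose true positions are known: distorted pixel coordinates are mapped to corrected coordinates with bivariate polynomials. The sketch below is a generic third-order fit with NumPy on synthetic data; the grid, distortion model, and polynomial order are assumptions for illustration, not the paper's calibration.

```python
import numpy as np

def poly_terms(u, v, order=3):
    """Bivariate polynomial terms u^i * v^j with total degree <= order."""
    return np.column_stack([u**i * v**j
                            for i in range(order + 1)
                            for j in range(order + 1 - i)])

def fit_distortion_map(pixel_uv, world_xy, order=3):
    """Least-squares fit of a pixel -> world mapping (one polynomial per axis)."""
    A = poly_terms(pixel_uv[:, 0], pixel_uv[:, 1], order)
    cx, *_ = np.linalg.lstsq(A, world_xy[:, 0], rcond=None)
    cy, *_ = np.linalg.lstsq(A, world_xy[:, 1], rcond=None)
    return cx, cy, order

def apply_distortion_map(pixel_uv, cx, cy, order):
    A = poly_terms(pixel_uv[:, 0], pixel_uv[:, 1], order)
    return np.column_stack([A @ cx, A @ cy])

# Synthetic calibration grid: true positions plus a mild radial distortion.
gx, gy = np.meshgrid(np.linspace(-50, 50, 11), np.linspace(-50, 50, 11))
world = np.column_stack([gx.ravel(), gy.ravel()])
r2 = (world ** 2).sum(axis=1, keepdims=True)
pixels = world * (1 + 1e-5 * r2)           # distorted observations

cx, cy, order = fit_distortion_map(pixels, world)
corrected = apply_distortion_map(pixels, cx, cy, order)
print("max residual after correction:", np.abs(corrected - world).max())
```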

A Wafer Pre-Alignment System Using a High-Order Polynomial Transformation Based Camera Calibration (고차 다항식 변환 기반 카메라 캘리브레이션을 이용한 웨이퍼 Pre-Alignment 시스템)

  • Lee, Nam-Hee;Cho, Tai-Hoon
    • Journal of the Semiconductor & Display Technology / v.9 no.1 / pp.11-16 / 2010
  • Wafer pre-alignment finds the center and orientation of a wafer and moves the wafer to the desired position and orientation. In this paper, an area-camera-based pre-alignment method is presented that captures eight wafer images at regular intervals during a 360-degree rotation. From the images, wafer edge positions are extracted and used to estimate the wafer's center and orientation by least-squares circle fitting; these data are then used to align the wafer properly. For accurate alignment, camera calibration methods using high-order polynomials are used to convert pixel coordinates into real-world coordinates. A complete pre-alignment system was constructed from mechanical and optical components and tested. Experimental results show that the wafer center and orientation can be aligned with standard deviations of 0.002 mm and 0.028 degrees, respectively.
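
The least-squares circle fit named in the abstract can be done in closed form by linearizing the circle equation (the Kåsa formulation): x² + y² + Dx + Ey + F = 0 is linear in D, E, F. The sketch below is a generic implementation on synthetic edge points; it is not the paper's code, and the wafer radius, offset, and noise level are assumed.

```python
import numpy as np

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    Solves x^2 + y^2 + D*x + E*y + F = 0 for (D, E, F), then converts
    to center (cx, cy) and radius r.
    """
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x ** 2 + y ** 2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    return cx, cy, np.sqrt(cx ** 2 + cy ** 2 - F)

# Synthetic edge points from 8 rotation steps: a 100 mm radius wafer whose
# center is offset by (0.8, -0.5) mm, with a little measurement noise.
angles = np.deg2rad(np.arange(0, 360, 45))
edge = np.column_stack([0.8 + 100.0 * np.cos(angles),
                        -0.5 + 100.0 * np.sin(angles)])
edge += np.random.normal(scale=0.01, size=edge.shape)

cx, cy, r = fit_circle(edge)
print(f"center offset: ({cx:+.3f}, {cy:+.3f}) mm, radius: {r:.3f} mm")
```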

A study on the relationship between rubbing scratches on an alignment film and rubbing cloths using a high-speed camera

  • Inoue, Y.;Kuramoto, Y.;Hattori, M.;Adachi, M.;Kimura, M.;Akahane, T.
    • Journal of Information Display / v.12 no.3 / pp.125-128 / 2011
  • Alignment failure sometimes occurs during the rubbing process because the rubbing cloth comes into direct contact with the surface of the alignment film. A number of studies that observed and evaluated the surface of the alignment film after the rubbing process have been reported; however, the rubbing process itself has not yet been observed directly in real time. In this study, the movement of the piles of the rubbing cloth during the rubbing process was observed with a high-speed camera, and the relationship between the rubbing scratches on the alignment films and the movement of the piles was investigated. It was found that the movement of the piles affected the rubbing scratches.

Automatic Alignment and Mounting of FPCs Using Machine Vision (머신비전을 이용한 FPC의 자동정렬 및 장착)

  • Shin, Dong-Won
    • Journal of the Korean Society of Manufacturing Process Engineers / v.6 no.3 / pp.24-30 / 2007
  • FPCs (flexible printed circuits) are currently used in electronic products such as digital cameras and cellular phones because of their flexible material characteristics. Because an FPC is usually small and flexible, it does not enter the chip-mounting process on its own; instead, several FPCs are placed on a large rigid pallet, which then enters the chip-mounting process. Currently, mounting FPCs on the pallet is done entirely by hand. The goal of this research is therefore to develop an automatic machine for mounting FPCs on the pallet using vision-based alignment. Instead of using two cameras or one moving camera, the proposed vision system uses a single fixed camera. Moreover, two picker heads that can handle two FPCs simultaneously are used to shorten the process time. The procedure is to first measure the alignment error of the FPC, then correct it, and finally mount the well-aligned FPC on the pallet. Vision technology is used to measure the alignment error accurately, and precision motion control is used to correct the error and mount the FPC.
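
The alignment error the abstract refers to is typically expressed as the rotation and translation that map measured fiducial marks onto their reference positions. The sketch below computes such a rigid correction from two fiducials; the coordinates are illustrative, and the paper's own error model and camera setup may differ.

```python
import math

def alignment_error(ref_a, ref_b, meas_a, meas_b):
    """Rotation (deg) and translation mapping measured fiducials onto reference ones."""
    # Rotation from the angle between the fiducial-to-fiducial vectors.
    ref_ang = math.atan2(ref_b[1] - ref_a[1], ref_b[0] - ref_a[0])
    meas_ang = math.atan2(meas_b[1] - meas_a[1], meas_b[0] - meas_a[0])
    dtheta = ref_ang - meas_ang

    # Translation: rotate the measured midpoint, then compare with the reference midpoint.
    mx, my = (meas_a[0] + meas_b[0]) / 2.0, (meas_a[1] + meas_b[1]) / 2.0
    rx = mx * math.cos(dtheta) - my * math.sin(dtheta)
    ry = mx * math.sin(dtheta) + my * math.cos(dtheta)
    tx = (ref_a[0] + ref_b[0]) / 2.0 - rx
    ty = (ref_a[1] + ref_b[1]) / 2.0 - ry
    return math.degrees(dtheta), (tx, ty)

# Placeholder fiducial positions in mm (reference vs. measured on the incoming FPC).
rot_deg, (tx, ty) = alignment_error((0.0, 0.0), (30.0, 0.0),
                                    (0.4, -0.2), (30.35, 0.35))
print(f"correction: rotate {rot_deg:+.3f} deg, translate ({tx:+.3f}, {ty:+.3f}) mm")
```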

Camera Calibration and Pose Estimation for Tasks of a Mobile Manipulator (모바일 머니퓰레이터의 작업을 위한 카메라 보정 및 포즈 추정)

  • Choi, Ji-Hoon;Kim, Hae-Chang;Song, Jae-Bok
    • The Journal of Korea Robotics Society / v.15 no.4 / pp.350-356 / 2020
  • In recent years, workers have been replaced by mobile manipulators for factory automation. A typical automated task is for a mobile manipulator to move to a target location and pick and place an object on a worktable. However, due to the pose estimation error of the mobile platform, the robot cannot reach the exact target position, which prevents the manipulator from accurately picking and placing the object on the worktable. In this study, we developed an automatic alignment system using a low-cost camera mounted on the end-effector of a collaborative robot, and we also propose camera calibration and pose estimation methods for this system. The algorithm uses a marker board composed of markers to calibrate the camera and then precisely estimate the camera pose. Experimental results demonstrate that the mobile manipulator can perform successful pick-and-place tasks under various conditions.
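
The abstract does not detail the pose estimation step; a minimal sketch, assuming the marker corners have already been detected in the image and the intrinsics come from a prior calibration, is a perspective-n-point solve such as OpenCV's solvePnP. All numeric values below are placeholders.

```python
import numpy as np
import cv2

# 3D corners of one marker in the marker-board frame (meters); size is an assumption.
marker_size = 0.04
object_points = np.array([[0.0, 0.0, 0.0],
                          [marker_size, 0.0, 0.0],
                          [marker_size, marker_size, 0.0],
                          [0.0, marker_size, 0.0]], dtype=np.float64)

# Corresponding pixel corners detected in the image (placeholder values).
image_points = np.array([[612.0, 388.0],
                         [701.0, 392.0],
                         [698.0, 480.0],
                         [609.0, 476.0]], dtype=np.float64)

# Intrinsics from a prior calibration (placeholder values).
camera_matrix = np.array([[900.0, 0.0, 640.0],
                          [0.0, 900.0, 360.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)                     # board-to-camera rotation
print("camera position in board frame:", (-R.T @ tvec).ravel())
```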

Depth error correction for maladjusted stereo cameras with the calibrated pixel distance parameter (화소간격 파라미터 교정에 의한 비정렬 스테레오 카메라의 거리오차 보정)

  • 김종만;손홍락;김성중
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / 1996.10b / pp.268-272 / 1996
  • A depth error correction effect for maladjusted stereo cameras using a calibrated pixel distance parameter is presented. Camera calibration is a necessary procedure for stereo vision-based depth computation: intrinsic and extrinsic parameters must be obtained experimentally to determine the relation between image and world coordinates. One difficulty is camera alignment for parallel installation, i.e., placing the two CCD arrays in a common plane. No effective methods for such alignment have been presented before, and some depth error caused by non-parallel installation of the cameras is inevitable. If the pixel distance parameter, one of the intrinsic parameters, is calibrated with known points, this error can be compensated to some extent. The error compensation effect of the calibrated pixel distance parameter is demonstrated with experimental results.
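
In a parallel stereo setup, depth follows from the disparity in pixels and the physical pixel distance: Z = f·B / (d·p). Calibrating p from a point at known depth therefore absorbs part of the error left by imperfect alignment, which appears to be the idea the abstract describes. The sketch below only illustrates that relationship with assumed numbers; it is not the paper's experimental procedure.

```python
# Parallel-stereo depth: Z = f * B / (d * p), with focal length f (mm),
# baseline B (mm), disparity d (pixels), and pixel distance p (mm/pixel).
def depth_mm(focal_mm, baseline_mm, disparity_px, pixel_distance_mm):
    return focal_mm * baseline_mm / (disparity_px * pixel_distance_mm)

f, B = 16.0, 120.0        # assumed focal length and baseline (mm)
p_nominal = 0.010         # assumed pixel distance from the sensor datasheet (mm/pixel)

# Calibration: a reference point at known depth yields an effective pixel distance
# that absorbs part of the error from the slightly non-parallel installation.
known_depth_mm = 2000.0
measured_disparity_px = 93.0
p_calibrated = f * B / (measured_disparity_px * known_depth_mm)

# Depth computed for a new disparity with the nominal vs. calibrated parameter.
d_new = 70.0
print("nominal p   :", round(depth_mm(f, B, d_new, p_nominal), 1), "mm")
print("calibrated p:", round(depth_mm(f, B, d_new, p_calibrated), 1), "mm")
```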

A distance perception model for AVG based on a moving camera

  • António Cunha;João Barroso;Cruz, José-Bulas;João L. Monteiro
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2003.09a / pp.248-251 / 2003
  • This paper presents a distance perception model based on a moving camera, in the context of driving a self-guided vehicle. Images acquired by a moving camera and aligned by their escape (vanishing) points show objects at different positions depending on their relative distance to the camera: objects far from the observer (the camera) remain aligned, but they gradually lose this alignment as the distance diminishes. With the current setup, this lack of alignment is noticeable up to a distance of 10 meters. The results of tests on real imagery are presented and discussed.
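
The distance dependence described here can be reproduced with a simple pinhole model: after aligning two frames on the vanishing point, a point at depth Z still shifts by roughly f·t/Z pixels when the camera has moved laterally by t between the frames, so nearby objects show visible misalignment while distant ones do not. The numbers below are editorial assumptions, not the paper's setup.

```python
# Residual image displacement (parallax) after vanishing-point alignment:
# shift ≈ f * t / Z for a lateral camera displacement t between frames.
focal_px = 800.0         # focal length in pixels (assumed)
lateral_move_m = 0.0125  # lateral camera displacement between frames (assumed)

for depth_m in (1, 2, 5, 10, 20, 50):
    shift_px = focal_px * lateral_move_m / depth_m
    flag = "noticeable" if shift_px >= 1.0 else "negligible"
    print(f"object at {depth_m:>2} m -> residual shift {shift_px:5.2f} px ({flag})")
```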

Mathematical Modeling for the Physical Relationship between the Coordinate Systems of IMU/GPS and Camera (IMU/GPS와 카메라 좌표계간의 물리적 관계를 위한 수학적 모델링)

  • Chon, Jae-Choon;Shibasaki, R.
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.26 no.6 / pp.611-616 / 2008
  • When extracting geo-referenced 3D data from cameras mounted on mobile mapping systems, one of the important factors for the accuracy of the extracted data is the alignment of the relative translation (lever-arm) and rotation (bore-sight) between the coordinate systems of the Inertial Measurement Unit (IMU)/Global Positioning System (GPS) and the cameras. Because the conventional method calculates the absolute camera orientation using ground control points (GCPs), the alignment is determined in a single coordinate system (the GPS coordinate system) and fundamentally requires GCPs. We propose a mathematical model for this alignment that uses the initially uncoupled data of the cameras and the IMU/GPS, without GCPs.
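
For reference, the lever-arm/bore-sight relation used in mobile mapping is commonly written as below; this is the standard textbook form, and the paper's own notation and estimation procedure may differ.

```latex
% r^w        : mapping-frame coordinates of the observed point
% r^w_{GPS}  : IMU/GPS position in the mapping frame
% R^w_b      : IMU body-to-mapping-frame rotation
% R^b_c      : bore-sight rotation (camera frame to IMU body frame)
% l^b        : lever-arm offset of the camera in the IMU body frame
% s, r^c     : scale factor and image ray in the camera frame
\[
  \mathbf{r}^{w} = \mathbf{r}^{w}_{GPS}
    + R^{w}_{b}\!\left( R^{b}_{c}\, s\,\mathbf{r}^{c} + \mathbf{l}^{b} \right)
\]
```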