• Title/Summary/Keyword: fisheye-lens


The Study of Fisheye Lens for the Causes of Rapid Illumination Drop and the Ways to Correct on an Image Sensor due to an Ultra Wide Angle of View (어안렌즈 시야각의 광각화에 따른 조도저하의 원인과 개선방안에 관한 연구)

  • Rim, Cheon-Seog
    • Korean Journal of Optics and Photonics, v.23 no.5, pp.179-188, 2012
  • Lenses with an ultra wide angle of view are usually called fisheye lenses, since a fish can see an ultra wide panoramic view under water. As the angle of view of such a lens widens, the illumination on the image sensor drops rapidly. In this paper, we discuss the causes of this rapid drop and ways to correct it. First, we treat the sign convention for direction cosine vectors and normal vectors on curved surfaces by means of analytic geometry. Then, building on this discussion, we numerically analyze the various causes of the rapid illumination drop using geometrical optics and radiometry, as well as the Fresnel equations derived from electromagnetic boundary conditions. As a result, we gain a full understanding of the rapid illumination drop and propose ways to correct the effects of a wide angle of view.
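The falloff the abstract analyzes has, as its classical first-order contribution, the cos⁴ law of radiometry. A minimal sketch of that textbook baseline (not the paper's full numerical model, which also treats vignetting and Fresnel losses):

```python
import math

def cos4_relative_illumination(field_angle_deg):
    """Classical cos^4 falloff of image-plane irradiance with field angle.
    First-order radiometric approximation; ignores vignetting, Fresnel
    reflection losses, and pupil aberrations that the paper also analyzes."""
    theta = math.radians(field_angle_deg)
    return math.cos(theta) ** 4

# Relative illumination at increasing field angles: drops fast off-axis.
for angle in (0, 30, 60, 80):
    print(f"{angle:2d} deg: {cos4_relative_illumination(angle):.3f}")
```

At 60° off-axis this simple model already predicts only 1/16 of the on-axis illumination, which illustrates why ultra wide angle designs need correction.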

3D Omni-directional Vision SLAM using a Fisheye Lens Laser Scanner (어안 렌즈와 레이저 스캐너를 이용한 3차원 전방향 영상 SLAM)

  • Choi, Yun Won; Choi, Jeong Won; Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems, v.21 no.7, pp.634-640, 2015
  • This paper proposes a novel three-dimensional mapping algorithm for omni-directional vision SLAM based on a fisheye image and laser scanner data. The performance of SLAM has been improved by various estimation methods, sensors with multiple functions, and sensor fusion. Conventional 3D SLAM approaches, which mainly employ RGB-D cameras to obtain depth information, are not suitable for mobile robot applications because an RGB-D system with multiple cameras is bulkier and too slow at computing depth information for omni-directional images. In this paper, we use a fisheye camera installed facing downwards and a two-dimensional laser scanner mounted at a constant distance from the camera. We compute fusion points from the plane coordinates of obstacles obtained by the two-dimensional laser scanner and the outlines of obstacles obtained by the omni-directional image sensor, which acquires a surround view at the same time. The effectiveness of the proposed method is confirmed by comparing maps obtained with the proposed algorithm against real maps.

Comparison the Mapping Accuracy of Construction Sites Using UAVs with Low-Cost Cameras

  • Jeong, Hohyun; Ahn, Hoyong; Shin, Dongyoon; Choi, Chuluong
    • Korean Journal of Remote Sensing, v.35 no.1, pp.1-13, 2019
  • The advent of a fourth industrial revolution, built on advances in digital technology, has coincided with studies using various unmanned aerial vehicles (UAVs) being performed worldwide. However, the accuracy of different sensors and their suitability for particular research studies are factors that need to be carefully evaluated. In this study, we evaluated UAV photogrammetry using smart technology. To assess the performance of digital photogrammetry, we measured the accuracy of common procedures for generating orthomosaic images and digital surface models (DSMs) against terrestrial laser scanning (TLS). Two types of non-surveying camera (a smartphone camera and a fisheye camera) were attached to the UAV platform. For the fisheye camera, lens distortion was corrected by considering the characteristics of the lens. The accuracy of the generated orthoimages and DSMs was comparatively analyzed using aerial and TLS data. The accuracy comparison proceeded as follows. First, we used the orthomosaic image to compare check points over a certain area. In addition, the vertical errors of the camera DSMs were compared and analyzed against the TLS data. In this study, we propose and evaluate the feasibility of UAV photogrammetry, which can acquire 3-D spatial information at low cost on a construction site.

An Interpolation Method for a Barrel Distortion Using Nearest Pixels on a Corrected Image (방사왜곡을 고려한 보정 영상 위최근접 화소 이용 보간법)

  • Choi, Changwon; Yi, Joonhwan
    • Journal of the Institute of Electronics and Information Engineers, v.50 no.7, pp.181-190, 2013
  • We propose an interpolation method for correcting the barrel distortion of a fisheye lens that uses the nearest pixels on the corrected image. Correction of barrel distortion comprises coordinate transformation and interpolation; this paper focuses on the interpolation step. Unlike existing techniques, the proposed method uses the four nearest coordinates on the corrected image rather than on the distorted image. Experimental results show that both subjective and objective image quality are improved.
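The conventional baseline this paper improves on is inverse mapping followed by bilinear interpolation among the four nearest pixels of the *distorted* image. A minimal sketch of that baseline, using a hypothetical one-parameter radial model for illustration (the paper's scheme instead selects its four nearest pixels on the corrected image):

```python
import math

def corrected_to_distorted(xc, yc, cx, cy, k1):
    """Map a corrected-image coordinate back into the distorted image using a
    simple one-parameter radial (barrel) model x_d = x_u * (1 + k1 * r^2).
    The model and k1 are illustrative stand-ins, not the paper's calibration."""
    xu, yu = xc - cx, yc - cy
    r2 = xu * xu + yu * yu
    return cx + xu * (1 + k1 * r2), cy + yu * (1 + k1 * r2)

def bilinear(img, x, y):
    """Bilinear interpolation from the four nearest pixels around (x, y)
    in the distorted image (row-major list-of-lists, img[y][x])."""
    x0, y0 = int(math.floor(x)), int(math.floor(y))
    dx, dy = x - x0, y - y0
    p00, p01 = img[y0][x0], img[y0][x0 + 1]
    p10, p11 = img[y0 + 1][x0], img[y0 + 1][x0 + 1]
    top = p00 * (1 - dx) + p01 * dx
    bot = p10 * (1 - dx) + p11 * dx
    return top * (1 - dy) + bot * dy
```

Each output pixel is produced by mapping its coordinate back through the distortion model and interpolating; the paper's contribution is to choose the interpolation neighbours in corrected-image space instead.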

Verification Method of Omnidirectional Camera Model by Projected Contours (사영된 컨투어를 이용한 전방향 카메라 모델의 검증 방법)

  • Hwang, Yong-Ho; Lee, Jae-Man; Hong, Hyun-Ki
    • Proceedings of the HCI Society of Korea Conference, 2007.02a, pp.994-999, 2007
  • Because an omnidirectional camera system can acquire a large amount of information about the surrounding scene from relatively few images, research on self-calibration and 3D reconstruction using omnidirectional images is being actively pursued. This paper proposes a new method for verifying the accuracy of a projection model estimated with previously proposed calibration methods. The many straight-line segments present in the real world are projected onto an omnidirectional image as contours, and their trajectories can be estimated from the projection model and the coordinates of the two endpoints of each contour. The accuracy of the omnidirectional camera's projection model can then be verified from the distance error between the estimated contour trajectory and the contour present in the image. To evaluate the performance of the proposed method, the algorithm was applied to spherically mapped synthetic images and to real images acquired with a fisheye lens to judge the accuracy of the projection model.


Localization of Mobile Robots by Full Detection of Ceiling Outlines (천장 외곽선 전체 검출에 의한 모바일 로봇의 위치 인식)

  • Kim, Young-Gyu; Park, Tae-Hyoung
    • The Transactions of The Korean Institute of Electrical Engineers, v.65 no.7, pp.1283-1289, 2016
  • In this paper, we propose a new localization system using ceiling outlines. We acquire the entire ceiling image with a fisheye lens camera and extract the outlines by binarization and segmentation. The optical flow algorithm is then applied to identify the ceiling region among the segmented regions. Finally, we obtain the position and orientation of the robot from the center position and moments of the ceiling region. Since we use the fully detected outlines, the accuracy and reliability of the localization system are improved. Experimental results are presented to show the effectiveness of the proposed method.
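Recovering a pose from a segmented region's center and moments, as the abstract describes, is a standard image-moment computation. A sketch under that assumption (the paper's exact formulation may differ):

```python
import math

def region_pose(pixels):
    """Estimate position (centroid) and orientation (principal-axis angle)
    of a segmented region from its first- and second-order image moments.
    pixels: iterable of (x, y) coordinates belonging to the region."""
    pts = list(pixels)
    n = len(pts)
    cx = sum(x for x, _ in pts) / n
    cy = sum(y for _, y in pts) / n
    # Central second-order moments of the region.
    mu11 = sum((x - cx) * (y - cy) for x, y in pts)
    mu20 = sum((x - cx) ** 2 for x, _ in pts)
    mu02 = sum((y - cy) ** 2 for _, y in pts)
    # Orientation of the principal axis.
    theta = 0.5 * math.atan2(2 * mu11, mu20 - mu02)
    return (cx, cy), theta
```

Tracking how the centroid and angle change between frames gives the robot's translation and rotation relative to the ceiling.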

Development of Camera Module for Vehicle Safety Support (차량 안전 지원용 카메라 모듈 개발)

  • Shin, Seong-Yoon; Cho, Seung-Pyo; Shin, Kwang-Seong; Lee, Hyun-Chang
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2022.05a, pp.672-673, 2022
  • In this paper, we discuss a camera that is fixed to the same view as the TOF sensor and can be installed horizontally along the vehicle's direction of travel. The camera uses a 1280×720 resolution to improve object recognition accuracy, outputs images at 30 fps, and can accept a wide-angle fisheye lens with a field of view of 180° or more.


Image Distortion Correction Processing System Realization for Fisheye Lens Camera (어안렌스 카메라의 영상왜곡보정처리 시스템 구현)

  • Ryu, Kwang-Ryol; Kim, Ja-Hwan
    • Journal of the Korea Institute of Information and Communication Engineering, v.11 no.11, pp.2116-2120, 2007
  • An image distortion correction processing system realized with a DSP processor is presented in this paper. The distortion correction algorithm is implemented on the DSP processor with a focus on real-time processing rather than image quality. The lens and camera distortion coefficients are handled through lookup tables, and the correction algorithm applies reverse mapping for the geometrical transform. Experiments show a processing time of about 31.3 ms for a 720×480 wide-range image, and the corrected image remains stable, with an average PSNR variation of about 8.3% as the wide angle changes.
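The lookup-table plus reverse-mapping approach the abstract describes can be sketched as follows: precompute, once per lens configuration, the distorted-image source coordinate for every corrected-image pixel, then apply the table per frame. This is an illustrative sketch (`inverse_map` is a hypothetical stand-in for the calibrated lens model), not the paper's DSP implementation:

```python
def build_reverse_lut(width, height, inverse_map):
    """Precompute a lookup table: for each corrected-image pixel (x, y),
    store the distorted-image source coordinate inverse_map(x, y).
    Trades memory for per-frame speed, as in LUT-based real-time correction."""
    return [[inverse_map(x, y) for x in range(width)]
            for y in range(height)]

def correct_image(distorted, lut):
    """Apply the LUT by reverse mapping with nearest-neighbour sampling:
    each output pixel copies the distorted pixel its LUT entry points at."""
    out = []
    for row in lut:
        out.append([distorted[int(round(sy))][int(round(sx))]
                    for (sx, sy) in row])
    return out
```

Because the table is built offline, the per-frame work reduces to memory reads, which is what makes real-time processing on a modest DSP feasible.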

Camera Calibration and Barrel Undistortion for Fisheye Lens (차량용 어안렌즈 카메라 캘리브레이션 및 왜곡 보정)

  • Heo, Joon-Young; Lee, Dong-Wook
    • The Transactions of The Korean Institute of Electrical Engineers, v.62 no.9, pp.1270-1275, 2013
  • Much research has been conducted on camera calibration and lens distortion for wide-angle lenses. Calibration of a fisheye lens with a field of view (FOV) of 180 degrees or more is especially tricky, so existing research has employed huge calibration patterns or even 3D patterns. It is also important that the calibration parameters (such as the distortion coefficients) be suitably initialized to obtain accurate results; for lenses with relatively narrow FOV (135 or 150 degrees), this can be achieved using manufacturer information or a least-squares method. In this paper, without any prior manufacturer information, camera calibration and barrel undistortion for a fisheye lens with over 180 degrees of FOV are achieved using only one calibration pattern image. We applied QR decomposition for initialization and regularization for optimization. Experimental results verify that our algorithm achieves camera calibration and image undistortion successfully.
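The barrel undistortion step such papers perform can be illustrated with the common equidistant fisheye model (image radius r = f·θ), remapped to a rectilinear perspective image (r = f·tan θ). This is a sketch under that assumed model, not the paper's calibrated projection:

```python
import math

def fisheye_to_rectilinear(xd, yd, cx, cy, f):
    """Map a fisheye image point to rectilinear coordinates, assuming the
    equidistant projection r = f * theta and a perspective target
    r = f * tan(theta). (cx, cy) is the principal point, f the focal length
    in pixels -- illustrative parameters, not a calibration result."""
    dx, dy = xd - cx, yd - cy
    r = math.hypot(dx, dy)
    if r == 0:
        return xd, yd                      # principal point maps to itself
    theta = r / f                          # incidence angle (equidistant model)
    if theta >= math.pi / 2:
        raise ValueError("point beyond 90 deg cannot appear in a rectilinear image")
    scale = (f * math.tan(theta)) / r      # stretch radius toward the edge
    return cx + dx * scale, cy + dy * scale
```

This also shows why a full 180°-FOV fisheye can never be undistorted into a single rectilinear image: tan θ diverges as θ approaches 90°, which is one reason such lenses need the special calibration treatment the paper addresses.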