• Title/Summary/Keyword: fisheye


Real time Omni-directional Object Detection Using Background Subtraction of Fisheye Image (어안 이미지의 배경 제거 기법을 이용한 실시간 전방향 장애물 감지)

  • Choi, Yun-Won;Kwon, Kee-Koo;Kim, Jong-Hyo;Na, Kyung-Jin;Lee, Suk-Gyu
    • Journal of Institute of Control, Robotics and Systems / v.21 no.8 / pp.766-772 / 2015
  • This paper proposes an object detection method based on motion estimation, using background subtraction on fisheye images obtained from an omni-directional camera mounted on a vehicle. Recently, most vehicles have been equipped with a rear camera as a standard option, as well as various camera systems for safety. However, unlike conventional object detection on ordinary camera images, the embedded system installed in a vehicle cannot easily run a complicated algorithm because of its inherently low processing performance; in general, it requires a system-dependent algorithm because its processing power is lower than that of a computer. In this paper, the location of an object is estimated from its motion, obtained by applying a background subtraction method that compares previous frames with the current one. The real-time detection performance of the proposed method is verified experimentally on an embedded board by comparing it with object detection based on LKOF (Lucas-Kanade optical flow).
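The frame-differencing idea behind background subtraction can be sketched in a few lines of plain Python. This is a minimal illustration, not the paper's embedded implementation; the frame format (2D lists of grayscale values) and the threshold are assumptions:

```python
# Minimal background-subtraction sketch: difference the current frame
# against the previous one, threshold the change, and report the bounding
# box of changed pixels as the detected object region.

def detect_motion(prev, curr, thresh=30):
    """Return the bounding box (rmin, cmin, rmax, cmax) of pixels whose
    absolute intensity change exceeds `thresh`, or None if nothing moved."""
    changed = [(r, c)
               for r, row in enumerate(curr)
               for c, v in enumerate(row)
               if abs(v - prev[r][c]) > thresh]
    if not changed:
        return None
    rows = [r for r, _ in changed]
    cols = [c for _, c in changed]
    return (min(rows), min(cols), max(rows), max(cols))

# A static 4x4 background, then an "object" brightening two pixels.
background = [[10] * 4 for _ in range(4)]
frame = [row[:] for row in background]
frame[1][2] = 200
frame[2][2] = 200
print(detect_motion(background, frame))  # (1, 2, 2, 2)
```

On an embedded board, the appeal of this scheme is exactly what the abstract claims: a subtraction and a threshold per pixel, far cheaper than per-feature optical flow.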

Image Distortion Correction Processing System Realization for Fisheye Lens Camera (어안렌스 카메라의 영상왜곡보정처리 시스템 구현)

  • Ryu, Kwang-Ryol;Kim, Ja-Hwan
    • Journal of the Korea Institute of Information and Communication Engineering / v.11 no.11 / pp.2116-2120 / 2007
  • This paper presents an image distortion correction processing system realized on a DSP processor. The distortion correction algorithm is implemented on the DSP with a focus on real-time processing rather than image quality. The lens and camera distortion coefficients are handled through lookup tables, and the correction algorithm uses reverse mapping for the geometrical transform. Experiments show a processing time of about 31.3 ms for a 720×480 wide-range image, and the corrected image remains stable, with an average PSNR variation of about 8.3% as the viewing angle changes.
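The reverse-mapping-with-lookup-table idea can be sketched as follows. This hypothetical pure-Python version (the `lut` layout and the `bilinear` sampler are illustrative assumptions, not the paper's DSP code) fetches, for each destination pixel, its precomputed source coordinate and samples the distorted image by bilinear interpolation:

```python
# Reverse-mapping sketch: each destination pixel looks up its (possibly
# fractional) source coordinate in a precomputed LUT and samples the
# distorted source image by bilinear interpolation.

def bilinear(img, y, x):
    """Bilinearly sample img (a 2D list) at fractional coordinate (y, x)."""
    y0, x0 = int(y), int(x)
    y1 = min(y0 + 1, len(img) - 1)
    x1 = min(x0 + 1, len(img[0]) - 1)
    fy, fx = y - y0, x - x0
    top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
    bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
    return top * (1 - fy) + bot * fy

def remap(src, lut):
    """lut[r][c] = (y, x) source coordinate for destination pixel (r, c)."""
    return [[bilinear(src, *lut[r][c]) for c in range(len(lut[0]))]
            for r in range(len(lut))]

src = [[0, 10], [20, 30]]
# A trivial LUT that samples the center of the source image everywhere.
lut = [[(0.5, 0.5)] * 2 for _ in range(2)]
print(remap(src, lut))  # [[15.0, 15.0], [15.0, 15.0]]
```

Precomputing the LUT is what makes this viable in real time: the per-pixel work at runtime reduces to one table read and four multiplies.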

Panoramic Transform of Fisheye-Lens Image using Bilateral Interpolation (바이래터럴 필터 보간을 활용한 어안렌즈 영상의 파노라마 변환)

  • Choi, Hyeon-Yeong;Ko, Jae-Pil
    • Proceedings of the Korea Information Processing Society Conference / 2017.04a / pp.1005-1007 / 2017
  • This paper proposes a region-partitioning method for converting an omni-directional image acquired through a fisheye lens into a panoramic image. To preserve edges while mitigating the distortion that arises when each partitioned region is transformed, a bilateral-filter interpolation method is proposed as a replacement for conventional bilinear interpolation. Experiments confirm that the proposed method preserves edges in the transformed panoramic image better than the existing approach.
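The fisheye-to-panorama conversion rests on a polar-to-Cartesian coordinate transform. The sketch below shows a generic unwarping formula (not the authors' exact partitioning scheme) that maps a panorama pixel back to a fisheye coordinate, after which an interpolation step, such as the proposed bilateral-filter variant, would sample the source image:

```python
import math

def panorama_to_fisheye(u, v, pano_w, pano_h, cx, cy, radius):
    """Map panorama pixel (u, v) to a source coordinate (x, y) on a
    circular fisheye image centered at (cx, cy) with the given radius.
    Panorama columns correspond to angle, rows to radial distance."""
    theta = 2.0 * math.pi * u / pano_w     # angle around the fisheye circle
    r = radius * v / pano_h                # distance from the fisheye center
    x = cx + r * math.cos(theta)
    y = cy + r * math.sin(theta)
    return x, y

# The top row of the panorama (v = 0) collapses to the fisheye center.
print(panorama_to_fisheye(0, 0, 360, 100, 256.0, 256.0, 256.0))  # (256.0, 256.0)
```

Because the mapped coordinates are fractional, the choice of interpolation at this sampling step is exactly where a bilateral-filter variant can trade smoothing against edge preservation.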

3D Analysis of Scene and Light Environment Reconstruction for Image Synthesis (영상합성을 위한 3D 공간 해석 및 조명환경의 재구성)

  • Hwang, Yong-Ho;Hong, Hyun-Ki
    • Journal of Korea Game Society / v.6 no.2 / pp.45-50 / 2006
  • In order to generate a photo-realistic synthesized image, the light environment must be reconstructed through 3D analysis of the scene. This paper presents a novel method for identifying the positions and characteristics of the lights in a real image, both global and local, which are used to illuminate synthetic objects. First, a High Dynamic Range (HDR) radiance map is generated from omni-directional images taken by a digital camera with a fisheye lens. Then, the positions of the camera and the light sources in the scene are identified automatically from correspondences between the images, without a priori camera calibration. The light sources are classified according to whether they illuminate the whole scene, and the 3D illumination environment is then reconstructed. Experimental results showed that the proposed method, combined with distributed ray tracing, makes photo-realistic image synthesis possible. Animators and lighting experts in the film and animation industries are expected to benefit highly from it.
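HDR radiance maps of this kind are commonly assembled from multiple exposures by a weighted average of per-exposure radiance estimates. The sketch below shows that standard recipe for a single pixel under an assumed linear camera response; the paper does not detail its HDR step, so this is illustrative only:

```python
def hdr_radiance(values, exposures, zmax=255):
    """Estimate scene radiance for one pixel from several exposures.
    `values` are the observed intensities, `exposures` the shutter times.
    A hat weighting de-emphasizes under- and over-exposed samples."""
    def weight(z):
        return z if z <= zmax / 2 else zmax - z
    num = sum(weight(z) * (z / t) for z, t in zip(values, exposures))
    den = sum(weight(z) for z in values)
    return num / den if den else 0.0

# The same pixel seen at three shutter times; each z/t agrees on radiance.
print(hdr_radiance([25, 50, 100], [0.25, 0.5, 1.0]))  # 100.0
```

Repeating this per pixel over the fisheye images yields the radiance map from which light positions and intensities are then extracted.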


An Experimental Comparison on Visualization Techniques of Long Menu-Lists (긴 메뉴항목 리스트의 시각화 기법 비교에 관한 실험적 연구)

  • Seo, Eun-Gyoung;Sung, Hye-Eun
    • Journal of the Korean Society for Information Management / v.24 no.2 / pp.71-87 / 2007
  • With the rapid evolution of the Web and e-transaction applications, search interfaces are providing more powerful search and visualization methods while integrating technology with the task more smoothly. In particular, visualization techniques for long menu lists are applied in retrieval systems with the goal of improving users' ability to select one item from a long list. To review which visualization techniques suit different types of users and data sets, this study compared five visualization browsers: the Tree-structured menu, the Table-of-contents menu, the Roll-over menu, the Click menu, and the Fisheye menu. The overall analysis shows that among the hierarchical methods, the experienced group prefers the Table-of-contents menu, whereas the novice group prefers the Tree-structured menu. Among the linear methods, both groups prefer the Roll-over menu, which is also the most preferred of the five browsers overall.

Collision Avoidance Using Omni Vision SLAM Based on Fisheye Image (어안 이미지 기반의 전방향 영상 SLAM을 이용한 충돌 회피)

  • Choi, Yun Won;Choi, Jeong Won;Im, Sung Gyu;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.22 no.3 / pp.210-216 / 2016
  • This paper presents a novel collision avoidance technique for mobile robots based on omni-directional vision simultaneous localization and mapping (SLAM). The method estimates the avoidance path and speed of a robot from the location of an obstacle, which is detected using Lucas-Kanade optical flow in images obtained through fisheye cameras mounted on the robot. Conventional methods derive avoidance paths by constructing an artificial force field around an obstacle found in the complete map obtained through SLAM; robots can also avoid obstacles using speed commands based on robot modeling and a curved movement path, and recent research has optimized such algorithms for actual robots. However, comparatively little work has studied robots that use omni-directional vision SLAM to acquire the surrounding information at once. A robot running the proposed algorithm avoids obstacles along the avoidance path estimated from the map obtained through omni-directional vision SLAM on fisheye images, and then returns to its original path. In particular, it avoids obstacles at various speeds and directions using acceleration components based on motion information obtained by analyzing the area around the obstacles. The experimental results confirm the reliability of the avoidance algorithm through a comparison between the position obtained by the proposed algorithm and the real position recorded while avoiding the obstacles.
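Flow-based obstacle detection rests on estimating per-region motion vectors between frames. As a simplified stand-in for Lucas-Kanade optical flow, the sketch below estimates a single translation by exhaustive block matching with zero padding; this is an illustrative assumption, not the paper's method:

```python
# Motion-vector sketch: find the integer shift (dy, dx) that minimizes the
# sum of absolute differences (SAD) between the shifted previous frame and
# the current frame, with out-of-range pixels treated as zero.

def block_match(prev, curr, max_shift=2):
    h, w = len(prev), len(prev[0])
    def sad(dy, dx):
        total = 0
        for r in range(h):
            for c in range(w):
                r2, c2 = r + dy, c + dx
                v = curr[r2][c2] if 0 <= r2 < h and 0 <= c2 < w else 0
                total += abs(prev[r][c] - v)
        return total
    return min(((sad(dy, dx), dy, dx)
                for dy in range(-max_shift, max_shift + 1)
                for dx in range(-max_shift, max_shift + 1)))[1:]

# A bright pixel moving one column to the right between frames.
prev = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
curr = [[0, 0, 0], [0, 0, 9], [0, 0, 0]]
print(block_match(prev, curr))  # (0, 1)
```

Applied per image region, such motion vectors are what the avoidance logic analyzes around an obstacle to choose its speed and direction.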

Localization using Ego Motion based on Fisheye Warping Image (어안 워핑 이미지 기반의 Ego motion을 이용한 위치 인식 알고리즘)

  • Choi, Yun Won;Choi, Kyung Sik;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.1 / pp.70-77 / 2014
  • This paper proposes a novel localization algorithm based on ego-motion, which uses Lucas-Kanade optical flow and warping images obtained through fisheye lenses mounted on robots. An omnidirectional image sensor is desirable for real-time view-based recognition because all the information around the robot can be obtained simultaneously. Preprocessing (distortion correction, image merging, etc.) of the omnidirectional image, whether captured by a camera with a reflecting mirror or by combining multiple camera images, is essential because it is difficult to extract information from the original image. The core of the proposed algorithm may be summarized as follows. First, we capture instantaneous 360° panoramic images around the robot through fisheye lenses mounted facing downward. Second, we extract motion vectors from the preprocessed images using Lucas-Kanade optical flow. Third, we estimate the robot's position and angle using the ego-motion method, based on the vector directions and the vanishing point obtained by RANSAC. We confirmed the reliability of the proposed localization algorithm by comparing the positions and angles it produced with those measured by a Global Vision Localization System.
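The vanishing-point step can be illustrated with a small RANSAC loop over the lines defined by the flow vectors. This sketch of the general procedure (the iteration count and tolerance are assumptions; the authors' code is not published here) returns the point most consistent with a set of 2D lines given as (point, direction) pairs:

```python
import random

def intersect(l1, l2):
    """Intersect two lines given as ((px, py), (dx, dy)); None if parallel."""
    (p1, d1), (p2, d2) = l1, l2
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-12:
        return None
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])

def point_line_dist(pt, line):
    """Perpendicular distance from pt to an infinite line."""
    (px, py), (dx, dy) = line
    norm = (dx * dx + dy * dy) ** 0.5
    return abs((pt[0] - px) * dy - (pt[1] - py) * dx) / norm

def ransac_vanishing_point(lines, iters=100, tol=1.0, seed=0):
    """Sample line pairs, intersect them, and keep the candidate point
    that the largest number of lines pass close to (the inliers)."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        cand = intersect(*rng.sample(lines, 2))
        if cand is None:
            continue
        inliers = sum(point_line_dist(cand, l) < tol for l in lines)
        if inliers > best_inliers:
            best, best_inliers = cand, inliers
    return best

# Flow-vector lines that all pass through (5, 5), plus one outlier.
lines = [((0, 0), (1, 1)), ((10, 0), (-1, 1)), ((5, 0), (0, 1)),
         ((0, 3), (1, 0.4)), ((0, 0), (1, 0))]  # last line is the outlier
vp = ransac_vanishing_point(lines)
print(round(vp[0]), round(vp[1]))  # 5 5
```

The direction of the dominant flow field toward this point is what the ego-motion step converts into a heading estimate.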

Calibration of Omnidirectional Camera by Considering Inlier Distribution (인라이어 분포를 이용한 전방향 카메라의 보정)

  • Hong, Hyun-Ki;Hwang, Yong-Ho
    • Journal of Korea Game Society / v.7 no.4 / pp.63-70 / 2007
  • Since a fisheye lens has a wide field of view, it can capture the scene and illumination from all directions using far fewer omnidirectional images. Owing to these advantages, the omnidirectional camera is widely used in surveillance and in reconstructing the 3D structure of a scene. In this paper, we present a new self-calibration algorithm for an omnidirectional camera from uncalibrated images that considers the inlier distribution. First, a parametric non-linear projection model of the omnidirectional camera is estimated with known rotation and translation parameters. After deriving the projection model, we can compute an essential matrix of the camera under unknown motions and then determine the camera information: rotations and translations. Standard deviations are used as a quantitative measure to select a proper inlier set. The experimental results showed that we can achieve a precise estimation of the omnidirectional camera model and the extrinsic parameters, including rotation and translation.
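A common one-parameter projection model for such cameras is the equidistant model r = f·θ. The abstract does not specify the exact model used, so the sketch below illustrates forward projection under that assumption:

```python
import math

def equidistant_project(X, Y, Z, f):
    """Project a 3D point with the equidistant fisheye model r = f * theta,
    where theta is the angle between the ray and the optical axis (+Z)."""
    theta = math.atan2(math.hypot(X, Y), Z)   # angle from the optical axis
    phi = math.atan2(Y, X)                    # azimuth in the image plane
    r = f * theta                             # radial image distance
    return r * math.cos(phi), r * math.sin(phi)

# A point on the optical axis lands at the image center.
print(equidistant_project(0.0, 0.0, 1.0, 300.0))  # (0.0, 0.0)
```

Fitting the single parameter f (plus extrinsics) against image correspondences is the kind of non-linear estimation the self-calibration step performs before the essential matrix is computed.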


Camera Calibration and Barrel Undistortion for Fisheye Lens (차량용 어안렌즈 카메라 캘리브레이션 및 왜곡 보정)

  • Heo, Joon-Young;Lee, Dong-Wook
    • The Transactions of The Korean Institute of Electrical Engineers / v.62 no.9 / pp.1270-1275 / 2013
  • Much research has been done on camera calibration and lens distortion for wide-angle lenses. Calibration for a fisheye lens with a 180-degree or greater FOV (field of view) is especially tricky, so existing research has employed huge calibration patterns or even 3D patterns. It is also important that the calibration parameters (such as the distortion coefficients) are suitably initialized to obtain accurate results; this can be achieved using manufacturer information or a least-squares method for lenses with a relatively narrow FOV (135 or 150 degrees). In this paper, without any prior manufacturer information, camera calibration and barrel undistortion for a fisheye lens with an over-180-degree FOV are achieved using only one calibration pattern image. We apply QR decomposition for initialization and regularization for optimization. The experimental results verify that our algorithm achieves camera calibration and image undistortion successfully.
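Once the coefficients are estimated, undistortion requires inverting the radial distortion function. A minimal sketch, assuming a single-coefficient polynomial model r_d = r_u(1 + k1·r_u²) rather than the paper's full fisheye model, inverts it by fixed-point iteration:

```python
def undistort_radius(r_d, k1, iters=20):
    """Invert r_d = r_u * (1 + k1 * r_u**2) for r_u by fixed-point
    iteration, starting from the distorted radius itself."""
    r_u = r_d
    for _ in range(iters):
        r_u = r_d / (1.0 + k1 * r_u ** 2)
    return r_u

k1 = -0.2          # barrel distortion pulls points toward the center
r_u_true = 0.5
r_d = r_u_true * (1.0 + k1 * r_u_true ** 2)    # distorted radius
print(round(undistort_radius(r_d, k1), 6))     # 0.5
```

In a full pipeline, this inverse per radius would be baked into a remapping table so each undistorted pixel knows where to sample the distorted image.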

Spherical arrangement of biomimetic polymer photonic structures (자연모사를 통한 미세 고분자 포토닉 구조의 구면배열에 관한 연구)

  • Jeong, Ki-Hun
    • Proceedings of the KSME Conference / 2007.05a / pp.403-404 / 2007
  • Compound eyes in nature present intriguing topics in physiological optics due to their unique optical scheme for imaging. For example, a bee's eye has thousands of integrated photonic units called ommatidia, spherically arranged along a curvilinear surface so that each unit points in a different direction. The omni-directionally arranged ommatidia collect incident light within a narrow range of angular acceptance and independently contribute to wide field-of-view (FOV) detection. Artificial implementation of compound eyes has attracted a great deal of research interest because the wide FOV offers huge potential for medical, industrial, and military applications. So far, imaging with a FOV over 90° has been achieved only with fisheye lenses, which rely on bulky and expensive multiple-lens assemblies and require stringent alignment. In this talk, we discuss the spherical 3D arrangement of the photonic structures of biologically inspired artificial compound eyes in a small form factor, with functional and anatomical similarity to natural compound eyes.
