• Title/Summary/Keyword: Fisheye Lens

61 search results (processing time: 0.028 seconds)

3D Analysis of Scene and Light Environment Reconstruction for Image Synthesis (영상합성을 위한 3D 공간 해석 및 조명환경의 재구성)

  • Hwang, Yong-Ho;Hong, Hyun-Ki
    • Journal of Korea Game Society
    • /
    • v.6 no.2
    • /
    • pp.45-50
    • /
    • 2006
  • In order to generate a photo-realistic synthesized image, we should reconstruct the light environment by 3D analysis of the scene. This paper presents a novel method for identifying the positions and characteristics of the lights (both global and local) in the real image, which are used to illuminate the synthetic objects. First, we generate a High Dynamic Range (HDR) radiance map from omni-directional images taken by a digital camera with a fisheye lens. Then, the positions of the camera and light sources in the scene are identified automatically from the correspondences between images, without a priori camera calibration. The light sources are classified according to whether they illuminate the whole scene, and we then reconstruct the 3D illumination environment. Experimental results showed that the proposed method, combined with distributed ray tracing, makes photo-realistic image synthesis achievable. Animators and lighting experts in the film and animation industry are expected to benefit greatly from it.

  • PDF
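The radiance-map step described in the abstract above can be sketched as follows. This is a minimal illustration that assumes a linear camera response and known exposure times (real pipelines, such as Debevec-Malik, first recover the response curve); the function name and hat weighting are illustrative, not the paper's implementation.

```python
import numpy as np

def merge_hdr(images, exposure_times):
    """Merge a stack of differently exposed images into an HDR radiance map.

    Assumes a linear camera response with pixel values in [0, 1] (a
    simplification of full HDR recovery).  A hat weight down-weights
    under- and over-exposed pixels before averaging radiance estimates.
    """
    images = [np.asarray(im, dtype=np.float64) for im in images]
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for im, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * im - 1.0)   # hat weight, peaks at mid-gray
        num += w * (im / t)                # radiance estimate from this exposure
        den += w
    return num / np.maximum(den, 1e-8)
```

For a scene point of true radiance r photographed at exposure time t, the pixel value is r·t under this model, so each term im/t estimates r and the weighted average recovers it.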

A Study on a Comparison of Sky View Factors and a Correlation with Air Temperature in the City (하늘시계지수 비교 및 도시기온 상관성 연구: 강남 선정릉지역을 중심으로)

  • Yi, Chaeyeon;Shin, Yire;An, Seung Man
    • Atmosphere
    • /
    • v.27 no.4
    • /
    • pp.483-498
    • /
    • 2017
  • Sky view factor (SVF) can quantify the influence of complex obstructions. This study aims to evaluate the best available SVF method for representing urban thermal conditions over the land cover of a complex Korean city, and to quantify the correlation between SVF and mean air temperature. The results are as follows. First, a comparison of three SVF methods shows that urban thermal studies should consider forest-canopy effects: toggling the forest canopy on and off changes the SVF over a significant range (up to 0.8 between maximum and minimum), compared with the 0.1~0.3 range of differences among the SVF methods themselves (Fisheye, SOLWEIG and 3DPC). This effect grows as the proportion of forest cover increases. Second, the R-squared between the SVF methods and the urban local mean air temperature is more reliable at night than during the day, and as the SVF increases the slope is positive on summer days and negative on winter nights. Among the SVF calculation methods, the observed Fisheye SVF is closest to the 3DPC SVF, but the grid-based SWG (SOLWEIG) SVF correlates better with air temperature. However, both urban climate monitoring and model/analysis studies need further development, because the SVF and mean-air-temperature correlation results differ in the summer night period; this implies other major factors, such as air cooled by the forest canopy and air warmed by anthropogenic heat emitted from fuel-oil combustion.
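A fisheye-based SVF of the kind compared in this study can be sketched from a binary sky mask of a hemispherical photograph. The sketch below assumes an equidistant fisheye projection with the zenith at the image center; the ring discretization and names are illustrative, not the study's exact procedure.

```python
import numpy as np

def sky_view_factor(sky_mask, n_rings=36):
    """Estimate SVF from a binary sky mask of an equidistant fisheye image.

    sky_mask: 2D bool array, True = sky.  Assumes the image circle fills
    the array, with the zenith at the center (radius proportional to
    zenith angle).  Sums the cosine-weighted solid angle of sky pixels,
    ring by ring, so an unobstructed hemisphere gives SVF = 1.
    """
    h, w = sky_mask.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - cy, xx - cx) / min(cy, cx)   # 0 at zenith, 1 at horizon
    svf = 0.0
    edges = np.linspace(0.0, 1.0, n_rings + 1)
    for r0, r1 in zip(edges[:-1], edges[1:]):
        ring = (r >= r0) & (r < r1)
        if not ring.any():
            continue
        p_sky = sky_mask[ring].mean()              # sky fraction in this ring
        # exact cosine-weighted solid-angle weight of this zenith-angle band
        weight = np.sin(r1 * np.pi / 2) ** 2 - np.sin(r0 * np.pi / 2) ** 2
        svf += p_sky * weight
    return svf
```

The per-ring weight is the integral of 2·sin θ·cos θ over the ring's zenith-angle band, so the weights telescope to 1 over the whole hemisphere.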

An Observation System of Hemisphere Space with Fish eye Image and Head Motion Detector

  • Sudo, Yoshie;Hashimoto, Hiroshi;Ishii, Chiharu
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2003.10a
    • /
    • pp.663-668
    • /
    • 2003
  • This paper presents a new observation system for viewing the scene around a remote-controlled robot. The system is composed of a motionless camera and a head-motion detector with a motion sensor. The motionless camera is fitted with a fish-eye lens to observe a hemispherical space. The head-motion detector uses a motion sensor to define an arbitrary subspace of the hemispherical space seen through the fish-eye lens; by processing the angular information from the motion sensor, the direction of the face is estimated. However, the fish-eye image is distorted and therefore unclear. A partial region of the fish-eye image is selected by the head motion and converted to a perspective image. Since this conversion spatially enlarges the original image and is based on discrete data, gaps are generated in the converted image. To solve this problem, intensity-based interpolation is applied to the gaps in the converted image (the space problem). This paper provides experimental results for the proposed observation system with the head-motion detector and perspective-image conversion, using the proposed conversion and interpolation methods, and discusses the adequacy of the proposed techniques and points for improvement.

  • PDF
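The fisheye-to-perspective conversion described above is commonly implemented by inverse mapping with bilinear interpolation, which avoids the gaps that forward mapping of discrete pixels leaves. The sketch below assumes an equidistant fisheye model (r = f·θ) with the optical axis at the image center; the paper's exact lens model and interpolation are not specified here.

```python
import numpy as np

def fisheye_to_perspective(fisheye, out_size, fov_deg, yaw=0.0, pitch=0.0):
    """Convert a region of an equidistant fisheye image to a perspective view.

    For each output pixel a viewing ray is built, rotated by (yaw, pitch) to
    select the subspace, projected back into the fisheye image, and sampled
    with bilinear interpolation (so no gaps appear in the output).
    """
    h, w = fisheye.shape[:2]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    f_fish = min(cx, cy) / (np.pi / 2)             # 90 deg maps to the rim
    n = out_size
    f_persp = (n / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    v, u = np.mgrid[0:n, 0:n]
    # ray direction for each output pixel (camera looks along +z)
    d = np.stack([u - (n - 1) / 2.0, v - (n - 1) / 2.0,
                  np.full((n, n), f_persp)], axis=-1)
    cyaw, syaw = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cyaw, 0, syaw], [0, 1, 0], [-syaw, 0, cyaw]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    d = d @ (Rx @ Ry).T
    theta = np.arccos(np.clip(d[..., 2] / np.linalg.norm(d, axis=-1), -1.0, 1.0))
    phi = np.arctan2(d[..., 1], d[..., 0])
    r = f_fish * theta                              # equidistant projection
    xs, ys = cx + r * np.cos(phi), cy + r * np.sin(phi)
    # bilinear interpolation at the (generally non-integer) source position
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    fx, fy = xs - x0, ys - y0
    img = fisheye.astype(np.float64)
    return (img[y0, x0] * (1 - fx) * (1 - fy)
            + img[y0, x0 + 1] * fx * (1 - fy)
            + img[y0 + 1, x0] * (1 - fx) * fy
            + img[y0 + 1, x0 + 1] * fx * fy)
```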

Improved Image Restoration Algorithm about Vehicle Camera for Corresponding of Harsh Conditions (가혹한 조건에 대응하기 위한 차량용 카메라의 개선된 영상복원 알고리즘)

  • Jang, Young-Min;Cho, Sang-Bock;Lee, Jong-Hwa
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.51 no.2
    • /
    • pp.114-123
    • /
    • 2014
  • A vehicle black box (Event Data Recorder, EDR) typically captures only the general road environment. In addition, a general EDR has difficulty recognizing images under sudden illumination changes, and its lens introduces severe distortion. As a result, a general EDR often fails to provide clues about the circumstances of an accident. To solve this problem, we first estimate the Normalized Luminance Descriptor (NLD) and Normalized Contrast Descriptor (NCD) values, and correct illumination changes using the Normalized Image Quality (NIQ). Second, we correct lens distortion using the Field of View (FOV) model based on the design method of the fisheye lens. Finally, we propose an integrated algorithm that corrects image distortions by applying gamma correction and lens correction in parallel.
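The FOV distortion model named above (in the form given by Devernay and Faugeras) and a basic gamma correction can be sketched as follows; the normalization and parameter names are illustrative, not the paper's exact pipeline.

```python
import numpy as np

def gamma_correct(img, gamma):
    """Gamma correction on a [0, 1] image (simple illumination compensation)."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)

def fov_distort_radius(r_u, omega):
    """FOV fisheye model: distorted radius from undistorted radius.

    r_d = arctan(2 * r_u * tan(omega / 2)) / omega, where omega is the
    lens field-of-view parameter (radians) and radii are normalized
    image-plane distances from the distortion center.
    """
    return np.arctan(2.0 * r_u * np.tan(omega / 2.0)) / omega

def fov_undistort_radius(r_d, omega):
    """Closed-form inverse of the FOV model (used for lens correction)."""
    return np.tan(r_d * omega) / (2.0 * np.tan(omega / 2.0))
```

Lens correction then amounts to remapping each pixel's radial distance through `fov_undistort_radius`, while gamma correction handles illumination, matching the parallel structure described in the abstract.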

Design of an Achromatic Optical System Using a Symmetry Graphical Method (대칭 그래픽 방식을 이용한 광학계의 색수차 보정 설계)

  • Lim, Tae-Yeon;Ahn, Byoung-In;Jo, Sun-Hyoung;Kim, Jeongyun;Park, Sung-Chan
    • Korean Journal of Optics and Photonics
    • /
    • v.29 no.1
    • /
    • pp.13-18
    • /
    • 2018
  • In this study, we present a symmetry graphical method to design an achromatic optical system composed of many lenses on an achromatic glass map. To take into account the lens spacing and the number of lenses, we use the relative ratio of the paraxial ray height at each lens and the concept of an equivalent single lens. After converting an arbitrary optical system into various doublet systems, the most effective doublet is selected to correct the chromatic aberration through material selection and redistribution of the optical power. By designing a fisheye lens using this approach, an achromatic optical system is effectively obtained over the visible waveband.
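The power redistribution between the two glasses of a doublet, which underlies the doublet-selection step above, follows from the standard thin-lens achromat conditions; the sketch below uses illustrative Abbe numbers and is not the paper's graphical method itself.

```python
def achromatic_doublet_powers(total_power, v1, v2):
    """Split the power of a thin-lens achromatic doublet between two glasses.

    Solves phi1 + phi2 = phi (total power) together with
    phi1 / V1 + phi2 / V2 = 0 (zero longitudinal chromatic aberration),
    where V1 and V2 are the Abbe numbers of the two glasses.
    Returns (phi1, phi2).
    """
    if v1 == v2:
        raise ValueError("glasses must have different Abbe numbers")
    phi1 = total_power * v1 / (v1 - v2)
    phi2 = -total_power * v2 / (v1 - v2)
    return phi1, phi2
```

For example, a crown/flint pair (Abbe numbers roughly 64 and 32) yields a positive crown element about twice as strong as the total power, balanced by a negative flint element.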

Localization of a Mobile Robot Using Multiple Ceiling Lights (여러 개의 조명등을 이용한 이동 로봇의 위치 추정)

  • Han, Yeon-Ju;Park, Tae-Hyoung
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.19 no.4
    • /
    • pp.379-384
    • /
    • 2013
  • We propose a new global positioning method for indoor mobile robots. Multiple indoor lights fixed to the ceiling are used as landmarks of the positioning system. The ceiling images are acquired by a fisheye-lens camera mounted on the moving robot. The positions and orientations of the lights are extracted by binarization and labeling techniques. The boundary lines between the ceiling and walls are also extracted to identify the order of each light. The robot position is then calculated from the extracted positions and the known positions of the lights. The proposed system can increase accuracy and reduce computation time compared with other positioning methods that use natural landmarks. Experimental results are presented to show the performance of the method.
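The binarization-and-labeling step above can be sketched as follows. The threshold and 4-connectivity are illustrative choices, and the paper's subsequent ordering step using the ceiling-wall boundary lines is omitted.

```python
import numpy as np
from collections import deque

def find_ceiling_lights(image, threshold=200):
    """Locate bright ceiling lights by binarization and labeling.

    Thresholds a grayscale ceiling image, labels 4-connected bright blobs
    by breadth-first search, and returns each blob's centroid (row, col),
    in row-major discovery order.
    """
    mask = np.asarray(image) >= threshold
    labels = np.zeros(mask.shape, dtype=int)
    centroids = []
    next_label = 0
    for sy, sx in zip(*np.nonzero(mask)):
        if labels[sy, sx]:
            continue                       # pixel already belongs to a blob
        next_label += 1
        labels[sy, sx] = next_label
        q = deque([(sy, sx)])
        pts = []
        while q:
            y, x = q.popleft()
            pts.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = next_label
                    q.append((ny, nx))
        pts = np.array(pts)
        centroids.append((pts[:, 0].mean(), pts[:, 1].mean()))
    return centroids
```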

Quality Benchmark of 360 Panoramic Image Generation (360 도 파노라마 영상 생성 기법의 품질 측정 기법 비교)

  • Kim, Soo Jie;Park, In Kyu
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2021.06a
    • /
    • pp.212-215
    • /
    • 2021
  • In this paper, 360-degree panoramic images are generated from six original fisheye-lens images using the Insta360 stitcher, AutoStitch [4], and As-Projective-As-Possible (APAP) [5] stitching methods, and their geometric and color distortions are compared and evaluated. For quantitative evaluation, the following 360-degree panorama Image Quality Assessment (IQA) metrics are measured: Natural Image Quality Evaluator (NIQE) [6], Blind/Referenceless Image Spatial Quality Evaluator (BRISQUE) [7], Perception-based Image Quality Evaluator (PIQE) [8], Feature Similarity (FSIM) [9], and Structural Similarity (SSIM) [10] on high-frequency features. Together with a qualitative comparison, this provides a benchmark of panoramic image quality and of the evaluation metrics.

  • PDF
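Of the metrics listed above, SSIM is the simplest to sketch. The following computes a single-window (global) SSIM, a compact approximation of the reference metric's sliding-Gaussian-window form; the constants follow the usual SSIM defaults.

```python
import numpy as np

def ssim_global(x, y, data_range=1.0):
    """Global Structural Similarity between two images.

    Computes SSIM over the whole image as a single window (the standard
    metric averages a windowed version; this global form is a compact
    approximation useful for illustration).
    """
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    c1 = (0.01 * data_range) ** 2        # stabilizing constants
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cxy = ((x - mx) * (y - my)).mean()   # covariance
    return ((2 * mx * my + c1) * (2 * cxy + c2)) / (
        (mx ** 2 + my ** 2 + c1) * (vx + vy + c2))
```

Identical images score exactly 1; structural disagreement (negative covariance) pushes the score down.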

Collision Avoidance Using Omni Vision SLAM Based on Fisheye Image (어안 이미지 기반의 전방향 영상 SLAM을 이용한 충돌 회피)

  • Choi, Yun Won;Choi, Jeong Won;Im, Sung Gyu;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.22 no.3
    • /
    • pp.210-216
    • /
    • 2016
  • This paper presents a novel collision avoidance technique for mobile robots based on omni-directional vision simultaneous localization and mapping (SLAM). The method estimates the avoidance path and speed of a robot from the location of an obstacle, which is detected using Lucas-Kanade optical flow in images obtained through fisheye cameras mounted on the robot. Conventional methods suggest avoidance paths by constructing an artificial force field around the obstacle found in the complete map obtained through SLAM; robots can also avoid obstacles using speed commands based on the robot model and a curved movement path. Recent research has improved these approaches by optimizing the algorithms for actual robots. However, robots that use omni-directional vision SLAM to acquire all of the surrounding information at once have been comparatively less studied. A robot running the proposed algorithm avoids obstacles along the avoidance path estimated from the map obtained through omni-directional vision SLAM on fisheye images, and then returns to its original path. In particular, it avoids obstacles at various speeds and directions using acceleration components based on motion information obtained by analyzing the area around the obstacles. The experimental results confirm the reliability of the avoidance algorithm through a comparison between the positions obtained by the proposed algorithm and the real positions recorded while avoiding the obstacles.
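The basic Lucas-Kanade step used above to detect obstacle motion solves a small least-squares system over image gradients. Below is a minimal single-window sketch; the paper's windowing and fisheye-specific details are not reproduced.

```python
import numpy as np

def lucas_kanade_flow(prev, curr):
    """Estimate a single translational flow vector for an image patch.

    Solves the least-squares system A v = b with A = [Ix Iy] (spatial
    gradients of the previous frame) and b = -It (negative temporal
    difference), the basic Lucas-Kanade step.
    """
    prev = np.asarray(prev, dtype=np.float64)
    curr = np.asarray(curr, dtype=np.float64)
    ix = np.gradient(prev, axis=1)       # horizontal intensity gradient
    iy = np.gradient(prev, axis=0)       # vertical intensity gradient
    it = curr - prev                     # temporal difference
    A = np.stack([ix.ravel(), iy.ravel()], axis=1)
    b = -it.ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v                             # (vx, vy) in pixels per frame
```

For an intensity ramp I = x shifted right by one pixel between frames, the recovered flow is (1, 0), matching the true motion.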

Localization using Ego Motion based on Fisheye Warping Image (어안 워핑 이미지 기반의 Ego motion을 이용한 위치 인식 알고리즘)

  • Choi, Yun Won;Choi, Kyung Sik;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.1
    • /
    • pp.70-77
    • /
    • 2014
  • This paper proposes a novel localization algorithm based on ego-motion, which uses Lucas-Kanade optical flow on warped images obtained through fisheye lenses mounted on the robot. An omnidirectional image sensor is desirable for real-time view-based recognition because all the information around the robot can be obtained simultaneously. Preprocessing (distortion correction, image merging, etc.) of the omnidirectional image, whether obtained by a camera with a reflecting mirror or by combining multiple camera images, is essential because it is difficult to obtain information from the original image. The core of the proposed algorithm may be summarized as follows. First, we capture instantaneous 360° panoramic images around the robot through fisheye lenses mounted facing downward. Second, we extract motion vectors using Lucas-Kanade optical flow in the preprocessed image. Third, we estimate the robot position and angle using an ego-motion method based on the vector directions and the vanishing point obtained by RANSAC. We confirmed the reliability of the proposed localization algorithm through a comparison between the positions and angles obtained using the proposed algorithm and those measured by a Global Vision Localization System.
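The RANSAC vanishing-point step in the third stage above can be sketched as follows, assuming lines are given as homogeneous coefficients (a, b, c) with ax + by + c = 0; tolerances and iteration counts are illustrative, and the subsequent ego-motion computation is omitted.

```python
import numpy as np

def ransac_vanishing_point(lines, iters=200, tol=1e-2, seed=0):
    """Estimate a vanishing point from image lines with RANSAC.

    Repeatedly intersects a random pair of lines (via the cross product of
    their homogeneous coefficients) and keeps the intersection that the
    largest number of lines pass near.  Returns (point_xy, inlier_count).
    """
    rng = np.random.default_rng(seed)
    lines = np.asarray(lines, dtype=np.float64)
    best_vp, best_inliers = None, -1
    for _ in range(iters):
        i, j = rng.choice(len(lines), size=2, replace=False)
        vp = np.cross(lines[i], lines[j])   # intersection, homogeneous coords
        if abs(vp[2]) < 1e-12:
            continue                        # parallel pair: no finite point
        vp = vp / vp[2]
        # perpendicular distance of the candidate point from every line
        d = np.abs(lines @ vp) / np.hypot(lines[:, 0], lines[:, 1])
        inliers = int((d < tol).sum())
        if inliers > best_inliers:
            best_vp, best_inliers = vp[:2], inliers
    return best_vp, best_inliers
```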

Calibration of Omnidirectional Camera by Considering Inlier Distribution (인라이어 분포를 이용한 전방향 카메라의 보정)

  • Hong, Hyun-Ki;Hwang, Yong-Ho
    • Journal of Korea Game Society
    • /
    • v.7 no.4
    • /
    • pp.63-70
    • /
    • 2007
  • Since the fisheye lens has a wide field of view, it can capture the scene and illumination from all directions with far fewer omnidirectional images. Owing to these advantages, the omnidirectional camera is widely used in surveillance and in reconstructing the 3D structure of a scene. In this paper, we present a new self-calibration algorithm for an omnidirectional camera from uncalibrated images, which takes the inlier distribution into account. First, a one-parameter non-linear projection model of the omnidirectional camera is estimated with known rotation and translation parameters. After deriving the projection model, we can compute an essential matrix of the camera with unknown motions, and then determine the camera information: rotations and translations. The standard deviations are used as a quantitative measure to select a proper inlier set. The experimental results showed that we can achieve a precise estimation of the omnidirectional camera model and the extrinsic parameters, including rotation and translation.

  • PDF
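The essential matrix relating rotation and translation, as used above, has the standard form E = [t]× R. The sketch below builds E from a known motion and checks the epipolar constraint; the paper works in the reverse direction, estimating E from correspondences and decomposing it into rotation and translation.

```python
import numpy as np

def essential_from_motion(R, t):
    """Build the essential matrix E = [t]_x R from relative camera motion.

    [t]_x is the skew-symmetric cross-product matrix of the translation.
    For calibrated correspondences with X2 = R X1 + t, the epipolar
    constraint x2^T E x1 = 0 holds for the corresponding ray directions.
    """
    tx = np.array([[0.0, -t[2], t[1]],
                   [t[2], 0.0, -t[0]],
                   [-t[1], t[0], 0.0]])
    return tx @ R

def epipolar_residual(E, x1, x2):
    """Epipolar constraint residual x2^T E x1 (zero for a true match)."""
    return float(x2 @ E @ x1)
```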