• Title/Summary/Keyword: Omnidirectional lens system

Optical Design of a Subminiature Catadioptric Omnidirectional Optical System with an LED Illumination System for a Capsule Endoscope (LED 조명계를 결합한 캡슐내시경용 초소형 반사굴절식 전방위 광학계의 설계)

  • Moon, Tae Sung;Jo, Jae Heung
    • Korean Journal of Optics and Photonics / v.32 no.2 / pp.68-78 / 2021
  • A subminiature catadioptric omnidirectional optical system (SCOOS) with 2 mirrors, 6 plastic aspherical lenses, and an illumination system of 6 light-emitting diodes, intended to observe a 360° panoramic image of the inner intestine, is optically designed and evaluated for a capsule endoscope. The total length, overall length, half field of view (HFOV), and F-number of the SCOOS are 14.3 mm, 8.93 mm, 51°~120°, and 3.5, respectively. The optical system uses a complementary metal-oxide-semiconductor (CMOS) sensor with 0.1 megapixels, and the illumination system of 6 light-emitting diodes (LEDs) with 0.25 lm illuminates the 360° side view of the intestine along the optical axis. As a result, the spatial frequency at a modulation transfer function (MTF) of 0.3, the depth of focus, and the cumulative probability of tolerance at the Nyquist frequency of 44 lp/mm and an MTF of 0.3 of the optimized optical system are obtained as 130 lp/mm, -0.097 mm to +0.076 mm, and 90.5%, respectively. Additionally, the simulated illuminance of the LED illumination system at the inner surface of the intestine within the HFOV, at a distance of 15.0 mm from the optical axis, ranges from a minimum of 315 lx to a maximum of 725 lx, which provides sufficient illumination and visibility.
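
  A quick check of the sensor-side figure quoted above: for a pixelated detector the Nyquist frequency is set by the pixel pitch p, so the stated 44 lp/mm implies a pitch of roughly 11 µm. The pitch itself is not given in the abstract and is inferred here only as an illustration.

      f_{\mathrm{Nyq}} = \frac{1}{2p}
      \quad\Rightarrow\quad
      p = \frac{1}{2 \times 44\ \mathrm{lp/mm}} \approx 11.4\ \mu\mathrm{m}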

Mixing Collaborative and Hybrid Vision Devices for Robotic Applications (로봇 응용을 위한 협력 및 결합 비전 시스템)

  • Bazin, Jean-Charles;Kim, Sung-Heum;Choi, Dong-Geol;Lee, Joon-Young;Kweon, In-So
    • The Journal of Korea Robotics Society / v.6 no.3 / pp.210-219 / 2011
  • This paper studies how to combine devices such as monocular/stereo cameras, pan/tilt motors, fisheye lenses, and convex mirrors in order to solve vision-based robotic problems. To overcome the well-known trade-offs between optical properties, we present two new mixed systems. The first is a robot photographer that combines a conventional pan/tilt perspective camera with a fisheye lens. The second is an omnidirectional detector for a complete 360-degree field-of-view surveillance system. We build an original device that combines a stereo-catadioptric camera and a pan/tilt stereo-perspective camera, and also apply it in a real environment. Compared to previous systems, we show the benefits of the two proposed systems in terms of maintaining both high speed and high resolution with collaborative moving cameras, and of providing an enormous search space with the hybrid configuration. Experimental results are provided to show the effectiveness of the mixed collaborative and hybrid systems.
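
  To make the collaboration between the wide-angle and pan/tilt cameras concrete, the following Python sketch converts a detection in the fisheye image into pan/tilt angles. It assumes an equidistant fisheye projection (r = f·θ) and a pan/tilt unit aligned with the fisheye optical axis; the projection model, focal length, and image size are placeholders, not details from the paper.

      import math

      def fisheye_pixel_to_pan_tilt(u, v, cx, cy, f_px):
          """Map a detection at pixel (u, v) of an equidistant fisheye image
          to pan/tilt angles (degrees) for a co-located pan/tilt camera."""
          dx, dy = u - cx, v - cy
          r = math.hypot(dx, dy)     # radial distance from the image center (pixels)
          theta = r / f_px           # off-axis angle (radians), equidistant model r = f * theta
          phi = math.atan2(dy, dx)   # azimuth around the optical axis
          return math.degrees(phi), math.degrees(theta)   # (pan, tilt)

      # Example: a detection near the border of a 1280 x 1280 fisheye frame.
      pan, tilt = fisheye_pixel_to_pan_tilt(u=1100, v=640, cx=640, cy=640, f_px=330.0)
      print(f"steer pan/tilt camera to pan={pan:.1f} deg, tilt={tilt:.1f} deg")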

Localization using Ego Motion based on Fisheye Warping Image (어안 워핑 이미지 기반의 Ego motion을 이용한 위치 인식 알고리즘)

  • Choi, Yun Won;Choi, Kyung Sik;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.1 / pp.70-77 / 2014
  • This paper proposes a novel localization algorithm based on ego-motion, which uses Lucas-Kanade optical flow and warped images obtained through fisheye lenses mounted on the robot. The omnidirectional image sensor is desirable for real-time view-based recognition by a robot, because all the information around the robot can be obtained simultaneously. Preprocessing (distortion correction, image merging, etc.) of the omnidirectional image, whether it is obtained by a camera viewing a reflecting mirror or by combining multiple camera images, is essential, because it is difficult to extract information from the original image. The core of the proposed algorithm may be summarized as follows: First, we capture instantaneous 360° panoramic images around the robot through fisheye lenses mounted facing downward. Second, we extract motion vectors from the preprocessed images using Lucas-Kanade optical flow. Third, we estimate the robot's position and angle using an ego-motion method based on the vector directions and the vanishing point obtained by RANSAC. We confirmed the reliability of the proposed fisheye-warping-image-based ego-motion localization algorithm by comparing its experimental results (position and angle) with those measured by a Global Vision Localization System.
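
  The second step above, motion-vector extraction with Lucas-Kanade optical flow, can be sketched with OpenCV as follows; the frame file names and tracker parameters are placeholders and are not taken from the paper.

      import cv2

      # Two consecutive preprocessed (warped) frames; file names are placeholders.
      prev_gray = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
      next_gray = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

      # Select corner features in the previous frame, then track them with pyramidal Lucas-Kanade.
      prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300, qualityLevel=0.01, minDistance=7)
      next_pts, status, err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, prev_pts, None,
                                                       winSize=(21, 21), maxLevel=3)

      # Keep only successfully tracked points and form per-feature motion vectors.
      ok = status.flatten() == 1
      good_prev = prev_pts[ok].reshape(-1, 2)
      good_next = next_pts[ok].reshape(-1, 2)
      flow_vectors = good_next - good_prev

      print(f"tracked {len(flow_vectors)} features, mean flow = {flow_vectors.mean(axis=0)}")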

3D Omni-directional Vision SLAM using a Fisheye Lens Laser Scanner (어안 렌즈와 레이저 스캐너를 이용한 3차원 전방향 영상 SLAM)

  • Choi, Yun Won;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.21 no.7 / pp.634-640 / 2015
  • This paper proposes a novel three-dimensional mapping algorithm for omnidirectional vision SLAM based on a fisheye image and laser scanner data. The performance of SLAM has been improved by various estimation methods, multi-function sensors, and sensor fusion. Conventional 3D SLAM approaches, which mainly employ RGB-D cameras to obtain depth information, are not suitable for mobile robot applications, because an RGB-D system with multiple cameras is large and slow when computing depth information for omnidirectional images. In this paper, we use a fisheye camera installed facing downward and a two-dimensional laser scanner mounted at a fixed distance from the camera. We calculate fusion points from the planar coordinates of obstacles obtained by the two-dimensional laser scanner and the obstacle outlines obtained by the omnidirectional image sensor, which acquires the surround view at the same time. The effectiveness of the proposed method is confirmed by comparing maps obtained with the proposed algorithm against real maps.
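
  The fusion step described above pairs each planar laser return with the corresponding obstacle outline in the downward-facing fisheye image. A minimal Python sketch of that pairing is given below; the equidistant fisheye model, the focal length, and the camera-to-scanner offset are illustrative assumptions, not values from the paper.

      import math

      def scan_to_xy(ranges, angle_min, angle_increment):
          """Convert a 2D laser scan (polar) into planar (x, y) points in the scanner frame."""
          pts = []
          for i, r in enumerate(ranges):
              if math.isfinite(r) and r > 0.0:
                  a = angle_min + i * angle_increment
                  pts.append((r * math.cos(a), r * math.sin(a)))
          return pts

      def project_to_fisheye(x, y, z, cx, cy, f_px):
          """Project a point in the camera frame (optical axis along +z, looking down)
          into an equidistant fisheye image, r = f * theta."""
          theta = math.atan2(math.hypot(x, y), z)   # off-axis angle
          phi = math.atan2(y, x)                    # azimuth
          r = f_px * theta
          return cx + r * math.cos(phi), cy + r * math.sin(phi)

      # Example: project two laser returns, assuming the scan plane sits 0.20 m below the camera.
      for (x, y) in scan_to_xy([1.8, 2.1], angle_min=-0.5, angle_increment=0.5):
          u, v = project_to_fisheye(x, y, z=0.20, cx=640, cy=640, f_px=330.0)
          print(f"laser point ({x:.2f}, {y:.2f}) m -> fisheye pixel ({u:.1f}, {v:.1f})")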

The Study on the Fire Monitoring System for Full-scale Surveillance and Video Tracking (전방위 감시와 영상추적이 가능한 화재감시시스템에 관한 연구)

  • Baek, Dong-hyun
    • Fire Science and Engineering / v.32 no.6 / pp.40-45 / 2018
  • The omnidirectional surveillance camera applies an object-detection algorithm that classifies detected objects by level, so that wide-area surveillance can be performed through a fisheye lens; a field experiment was then carried out with a system composed of the omnidirectional surveillance camera and a tracking (PTZ) camera. The omnidirectional surveillance camera accurately detects a moving object, marks it with a square, and tracks it in close cooperation with the tracking camera. In the field test of flame detection and temperature sensing, when a flame is detected during the automatic scan the detection camera stops, moves the corresponding spot to the center of the screen, and displays its temperature. The system can also detect a flame at a distance of 1.5 km, exceeding the reference requirement of detecting a 2,340 kcal heat source at 1 km. In the flame-detection performance test over distance, a flame larger than 56 cm × 90 cm can be detected at a distance of 1 km, and detection remains possible at 1.5 km, so the system is also applicable to forest fires. The system is expected to be very useful for safety purposes, such as preventing internal or surrounding fires and monitoring intrusions, if it is installed at petroleum gas storage facilities or oil storage sites in the future.

Optical Design of an Omnidirectional Illumination System Using an Ultra Wide Converter (초광각 변환기를 이용한 전방위 조명 광학계의 설계)

  • Juho Lee;Jae Myung Ryu
    • Korean Journal of Optics and Photonics / v.35 no.1 / pp.18-23 / 2024
  • In exhibition spaces such as art museums, lighting should primarily illuminate the walls where exhibits are displayed rather than the floor. Commonly used LED luminaires consist of an LED and a diffusion plate and closely resemble a Lambertian light source, whose luminance is nearly uniform at every angle. This type of illumination concentrates light on the floor, where the light is incident nearly normally, and is consequently not well suited to effectively lighting the wall surfaces. Specifically, to illuminate a wall, the luminous intensity must be increased at the large emission angles of the light distribution. In response to this issue, our study proposes an illumination system that uses an ultra wide converter to adjust the divergence angle of the light source to 180 degrees.
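
  As an illustrative relation (not from the paper): a Lambertian emitter has luminous intensity I(θ) = I₀ cos θ, so the intensity toward directions about 75° off the normal, roughly where a wall surface lies for a ceiling-mounted luminaire, is only about a quarter of the on-axis value, which is why the intensity at large angles must be boosted.

      I(\theta) = I_0 \cos\theta
      \quad\Rightarrow\quad
      \frac{I(75^{\circ})}{I(0^{\circ})} = \cos 75^{\circ} \approx 0.26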