• Title/Summary/Keyword: Fisheye Lens


Realtime Vehicle Tracking and Region Detection in Indoor Parking Lot for Intelligent Parking Control (지능형 주차 관제를 위한 실내주차장에서 실시간 차량 추적 및 영역 검출)

  • Yeon, Seungho;Kim, Jaemin
    • Journal of Korea Multimedia Society / v.19 no.2 / pp.418-427 / 2016
  • A smart parking management system must track a vehicle through an indoor parking lot and detect the spot where it parks. An advanced parking system monitors the entire parking lot with CCTV cameras, and these cameras can be reused for vehicle detection and tracking. To cover a wide area with a single camera, a fisheye lens is used; the shape and size of a moving vehicle then vary greatly with its distance and angle to the camera, which makes detection and tracking difficult. Vehicle headlights add a further complication. This paper describes a method of realtime vehicle detection and tracking that is robust to these harsh conditions. In each image frame, we update the region of a vehicle and estimate its movement. First, we approximate the shape of a car with a quadrangle and estimate its four sides using multiple histograms of oriented gradients. Second, we create a template by applying a distance transform to the car region and estimate the motion of the car with template matching.
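The distance-transform template matching in the second step can be illustrated with a chamfer-style sketch. This is a minimal numpy toy, not the authors' implementation: the brute-force distance transform, the 3x3 "car" outline, and all names and values are illustrative.

```python
import numpy as np

def distance_transform(edges):
    """Brute-force Euclidean distance transform: for every pixel, the
    distance to the nearest edge pixel (the 1s in `edges`)."""
    h, w = edges.shape
    pts = np.argwhere(edges).astype(float)            # edge-pixel coordinates
    grid = np.indices((h, w)).reshape(2, -1).T.astype(float)
    d = np.sqrt(((grid[:, None, :] - pts[None, :, :]) ** 2).sum(-1)).min(1)
    return d.reshape(h, w)

def chamfer_match(dist, tmpl_pts):
    """Slide the template's edge points over the distance map and return
    the (row, col) offset with the lowest mean distance to an edge."""
    h, w = dist.shape
    th, tw = tmpl_pts.max(0) + 1
    best, best_off = np.inf, (0, 0)
    for oy in range(h - th + 1):
        for ox in range(w - tw + 1):
            score = dist[tmpl_pts[:, 0] + oy, tmpl_pts[:, 1] + ox].mean()
            if score < best:
                best, best_off = score, (oy, ox)
    return best_off

# Toy example: the outline of a 3x3 "car" placed at rows 5-7, cols 6-8.
edges = np.zeros((12, 12), dtype=np.uint8)
edges[5:8, 6] = edges[5:8, 8] = 1
edges[5, 6:9] = edges[7, 6:9] = 1
tmpl = np.ones((3, 3), dtype=np.uint8)
tmpl[1, 1] = 0
offset = chamfer_match(distance_transform(edges), np.argwhere(tmpl))
```

Matching against a distance map rather than raw edges makes the score degrade gracefully when the vehicle outline is slightly deformed, which suits the fisheye setting described above.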

Mixing Collaborative and Hybrid Vision Devices for Robotic Applications (로봇 응용을 위한 협력 및 결합 비전 시스템)

  • Bazin, Jean-Charles;Kim, Sung-Heum;Choi, Dong-Geol;Lee, Joon-Young;Kweon, In-So
    • The Journal of Korea Robotics Society / v.6 no.3 / pp.210-219 / 2011
  • This paper studies how to combine devices such as monocular/stereo cameras, pan/tilt motors, fisheye lenses, and convex mirrors to solve vision-based robotic problems. To overcome the well-known trade-offs between optical properties, we present two mixed versions of the new systems. The first is a robot photographer with a conventional pan/tilt perspective camera and a fisheye lens. The second is an omnidirectional detector for a complete 360-degree field-of-view surveillance system. We build an original device that combines a stereo-catadioptric camera with a pan/tilt stereo-perspective camera and apply it in a real environment. Compared to previous systems, the two proposed systems maintain both high speed and high resolution through collaborative moving cameras, and offer a much larger search space through the hybrid configuration. Experimental results show the effectiveness of the mixed collaborative and hybrid systems.

Omni-directional Vision SLAM using a Motion Estimation Method based on Fisheye Image (어안 이미지 기반의 움직임 추정 기법을 이용한 전방향 영상 SLAM)

  • Choi, Yun Won;Choi, Jeong Won;Dai, Yanyan;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.8 / pp.868-874 / 2014
  • This paper proposes a novel mapping algorithm for omnidirectional-vision SLAM based on obstacle feature extraction using Lucas-Kanade optical flow (LKOF) motion detection and images obtained through fisheye lenses mounted on robots. Omnidirectional image sensors suffer from distortion because they use a fisheye lens or mirror, but they capture all information around the robot at once, which makes real-time image processing feasible for mobile robots. Previous omnidirectional-vision SLAM research used feature points in fully corrected fisheye images; the proposed algorithm corrects only the feature points of the obstacles, which yields faster processing. The core of the proposed algorithm may be summarized as follows. First, we capture instantaneous 360° panoramic images around a robot through downward-mounted fisheye lenses. Second, we remove the feature points of the floor surface using a histogram filter and label the extracted obstacle candidates. Third, we estimate the locations of obstacles from motion vectors computed with LKOF. Finally, we estimate the robot position using an extended Kalman filter based on the obstacle positions obtained by LKOF and create a map. We confirm the reliability of the mapping algorithm by comparing maps obtained with the proposed algorithm against real maps.
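The LKOF motion vectors in the third step come from the classic Lucas-Kanade least-squares solution. A minimal single-point numpy sketch follows; it is a generic textbook version under the brightness-constancy assumption, not the paper's pipeline, and the Gaussian-blob test frames are synthetic.

```python
import numpy as np

def lucas_kanade(I0, I1, y, x, win=2):
    """Estimate the flow (dy, dx) at pixel (y, x) between frames I0 -> I1
    by solving the Lucas-Kanade least-squares system over a
    (2*win+1)^2 window: [Iy Ix] * u = -It."""
    Iy, Ix = np.gradient(I0.astype(float))           # spatial gradients
    It = I1.astype(float) - I0.astype(float)         # temporal difference
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    A = np.stack([Iy[sl].ravel(), Ix[sl].ravel()], axis=1)
    b = -It[sl].ravel()
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)
    return flow                                       # (dy, dx)

# Toy example: a smooth blob shifted one pixel to the right.
yy, xx = np.mgrid[0:21, 0:21].astype(float)
I0 = np.exp(-((yy - 10) ** 2 + (xx - 10) ** 2) / 20.0)
I1 = np.exp(-((yy - 10) ** 2 + (xx - 11) ** 2) / 20.0)
dy, dx = lucas_kanade(I0, I1, 10, 9)
```

Real implementations run this over pyramids and many feature points; the sketch just shows the per-window linear system.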

Achromatic and Athermal Design of an Optical System with Corrected Petzval Curvature on a Three-dimensional Glass Chart

  • Lim, Tae-Yeon;Kim, Yeong-Sik;Park, Sung-Chan
    • Current Optics and Photonics / v.1 no.4 / pp.378-388 / 2017
  • We present a graphical method for determining a pair of optical materials and powers to design an achromatic and athermal lens system with corrected Petzval curvature. To obtain the solutions graphically, a three-dimensional (3D) glass chart is proposed. Even if a particular material combination is unavailable, we can select an element suitable for a specific lens and continuously change the element powers of an equivalent single lens for aberration correction. Thus, we can iteratively identify the materials and powers on the 3D glass chart. By designing a fisheye lens with this method, an achromatic and athermal system with flat Petzval curvature is obtained over the specified waveband and temperature ranges.
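The "pair of materials and powers" idea can be grounded in the standard thin-lens achromat conditions: the element powers must sum to the total power while the chromatic terms cancel, and a Petzval constraint (the sum of power over refractive index) is the third axis such a 3D chart can visualize. The sketch below solves only the two achromatic conditions; the Abbe numbers and total power are illustrative, not values from the paper.

```python
# Thin-lens achromatic doublet in contact, total power phi:
#   phi_a + phi_b = phi          (power condition)
#   phi_a/Va + phi_b/Vb = 0      (achromatic condition, Abbe numbers Va, Vb)
def achromat_powers(phi, Va, Vb):
    """Solve the two linear conditions for the element powers."""
    phi_a = phi * Va / (Va - Vb)
    phi_b = -phi * Vb / (Va - Vb)
    return phi_a, phi_b

# Example: crown-like glass (V = 64.2) with flint-like glass (V = 32.3),
# total power 0.01 /mm (a 100 mm focal length).
pa, pb = achromat_powers(0.01, 64.2, 32.3)
```

Note the crown element ends up with roughly twice the total power and the flint with a compensating negative power, which is why achromat elements are individually stronger than the combined lens.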

Multi-robot Formation based on Object Tracking Method using Fisheye Images (어안 영상을 이용한 물체 추적 기반의 한 멀티로봇의 대형 제어)

  • Choi, Yun Won;Kim, Jong Uk;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.19 no.6 / pp.547-554 / 2013
  • This paper proposes a novel formation algorithm for identical robots based on an object tracking method using omnidirectional images obtained through fisheye lenses mounted on the robots. Conventional multi-robot formation methods often enlarge the camera's field of view by using a stereo vision system or a vision system with a reflector instead of a general-purpose camera with a small angle of view. In addition, to make up for the lack of image information on the environment, the robots share their position information through communication. The proposed system estimates the regions of the robots using SURF in fisheye images that contain 360° of image information, without merging images. The whole system controls the robot formation based on the moving directions and velocities of the robots, which are obtained by applying Lucas-Kanade optical flow estimation to the estimated robot regions. We confirmed the reliability of the proposed formation control strategy through both simulation and experiment.
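Once a neighbor's position is tracked in the fisheye image, holding formation reduces to driving toward that position plus a desired offset. The sketch below is a deliberately simple proportional controller in 2-D, only meant to show the control loop's shape; the gain, offset, and positions are hypothetical and the paper's actual controller uses the optical-flow direction and velocity estimates.

```python
import numpy as np

def formation_step(follower, leader, offset, gain=0.5):
    """One proportional control step: move the follower toward the
    leader's position plus the desired formation offset."""
    target = leader + offset
    return follower + gain * (target - follower)

leader = np.array([0.0, 0.0])
follower = np.array([3.0, 4.0])
offset = np.array([-1.0, 0.0])        # stay 1 m behind the leader
for _ in range(20):                    # error shrinks by `gain` each step
    follower = formation_step(follower, leader, offset)
```

With a constant gain below 1 the tracking error decays geometrically, so after 20 steps the follower sits essentially at the target slot.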

De-blurring Algorithm for Performance Improvement of Searching a Moving Vehicle on Fisheye CCTV Image (어안렌즈사용 CCTV이미지에서 차량 정보 수집의 성능개선을 위한 디블러링 알고리즘)

  • Lee, In-Jung
    • The Journal of Korean Institute of Communications and Information Sciences / v.35 no.4C / pp.408-414 / 2010
  • When collecting traffic information from CCTV images, a detection zone must be set in the image area while the pan-tilt system is operating. Automating the detection zone with a pan-tilt system is difficult because of mechanical error, so a camera with a fisheye lens or a convex mirror is needed to capture wide-area images. This setup introduces problems of its own: reduced system speed and image distortion. The distortion is caused by occlusion of angled rays, similar to a shaken snapshot from a digital camera. In this paper, we propose two de-blurring methods to overcome the distortion: image segmentation by a nonlinear diffusion equation, and deformation of selected segmented areas. After de-blurring, the image's PSNR increases by 15 dB, and the detection rate for traffic-information collection improves by more than 5% compared with the distorted images.
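Nonlinear diffusion of the kind used for the segmentation step smooths flat regions while preserving strong edges. A compact Perona-Malik sketch with the PSNR metric follows; this is the generic scheme, assumed here as a stand-in for the paper's specific diffusion equation, and the step-edge test image and parameters are synthetic.

```python
import numpy as np

def perona_malik(img, iters=20, kappa=0.1, dt=0.2):
    """Perona-Malik nonlinear diffusion: the conductance
    g = exp(-(|grad|/kappa)^2) suppresses smoothing across strong edges."""
    u = img.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(iters):
        # Differences to the four neighbours (zeroed at the border).
        dn = np.roll(u, -1, 0) - u; dn[-1] = 0
        ds = np.roll(u, 1, 0) - u;  ds[0] = 0
        de = np.roll(u, -1, 1) - u; de[:, -1] = 0
        dw = np.roll(u, 1, 1) - u;  dw[:, 0] = 0
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

def psnr(ref, test, peak=1.0):
    """Peak signal-to-noise ratio in decibels."""
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)

# Noisy step edge: diffusion should cut noise without blurring the step.
rng = np.random.default_rng(0)
clean = np.zeros((32, 32)); clean[:, 16:] = 1.0
noisy = clean + rng.normal(0, 0.05, clean.shape)
smoothed = perona_malik(noisy)
```

Because the step's gradient is far above `kappa` while the noise gradients are near it, the edge conductance is essentially zero and the improvement shows up as a higher PSNR against the clean image.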

Multi-robot Mapping Using Omnidirectional-Vision SLAM Based on Fisheye Images

  • Choi, Yun-Won;Kwon, Kee-Koo;Lee, Soo-In;Choi, Jeong-Won;Lee, Suk-Gyu
    • ETRI Journal / v.36 no.6 / pp.913-923 / 2014
  • This paper proposes a global mapping algorithm for multiple robots using an omnidirectional-vision simultaneous localization and mapping (SLAM) approach, based on an object extraction method that uses Lucas-Kanade optical flow motion detection and images obtained through fisheye lenses mounted on the robots. The multi-robot mapping algorithm draws a global map from the map data of all individual robots. Global mapping normally takes a long time because map data are exchanged among robots while all areas are searched. An omnidirectional image sensor has many advantages for object detection and mapping because it measures all information around a robot simultaneously. The computation of the correction algorithm improves on existing methods by correcting only the objects' feature points. The proposed algorithm has two steps: first, a local map is created for each robot with an omnidirectional-vision SLAM approach; second, a global map is generated by merging the individual maps from multiple robots. The reliability of the proposed mapping algorithm is verified by comparing maps based on the proposed algorithm with real maps.
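The second step, merging per-robot maps into a global map, can be sketched as a cell-wise fusion of occupancy grids. This toy assumes the local maps are already aligned in a common frame (real merging must first estimate the relative poses) and uses a hypothetical encoding of 0 = free, 1 = occupied, -1 = unknown.

```python
import numpy as np

def merge_maps(maps):
    """Fuse per-robot occupancy grids cell by cell: any observation
    overrides 'unknown', and 'occupied' wins over 'free'."""
    stack = np.stack(maps)
    known = stack >= 0                      # cells some robot observed
    merged = np.full(stack.shape[1:], -1, dtype=int)
    any_known = known.any(0)
    occ = np.where(known, stack, 0).max(0)  # 1 if any robot saw occupied
    merged[any_known] = occ[any_known]
    return merged

# Two robots, each observing a different part of a 2x2 world.
a = np.array([[1, -1], [0, -1]])
b = np.array([[-1, 0], [1, -1]])
m = merge_maps([a, b])
```

The "occupied wins" rule is a conservative choice for navigation; a probabilistic merge (e.g. log-odds averaging) would weight conflicting observations instead.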

Geometric Correction of Vehicle Fish-eye Lens Images (차량용 어안렌즈영상의 기하학적 왜곡 보정)

  • Kim, Sung-Hee;Cho, Young-Ju;Son, Jin-Woo;Lee, Joong-Ryoul;Kim, Myoung-Hee
    • Proceedings of the Korean HCI Society Conference / 2009.02a / pp.601-605 / 2009
  • Because a fisheye lens provides a super-wide angle, with a field of view over 180 degrees, using the minimum number of cameras, many vehicles now mount such camera systems. Camera calibration must be performed first, and geometric correction of the radial distortion is needed to provide images for driver assistance. However, vehicle fisheye cameras produce diagonal output images rather than circular ones and exhibit asymmetric distortion beyond the horizontal angle. In this paper, we introduce a camera model and a metric calibration method for vehicle cameras that uses feature points of the image, and we undistort the input image through a perspective projection in which straight lines appear straight. The method was fitted to vehicle fisheye lenses with different fields of view.
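The core of such an undistortion is a backward mapping from the desired perspective view into the fisheye image. The sketch below assumes the simple equidistant fisheye model (r = f·θ), which is only a stand-in for the paper's calibrated vehicle-camera model; focal lengths and the test pixel are illustrative.

```python
import numpy as np

def perspective_to_fisheye(u, v, f_persp, f_fish, cx, cy):
    """Backward mapping for undistortion: a pixel (u, v) of the desired
    perspective view -> source coordinates in an equidistant fisheye
    image (r = f_fish * theta). Straight lines come out straight because
    each output pixel corresponds to a single ray through the center."""
    x, y = u - cx, v - cy
    r_p = np.hypot(x, y)                   # radius in the perspective image
    theta = np.arctan2(r_p, f_persp)       # ray angle from the optical axis
    r_f = f_fish * theta                   # equidistant fisheye radius
    safe_r = np.where(r_p > 0, r_p, 1.0)   # avoid division by zero at center
    scale = np.where(r_p > 0, r_f / safe_r, 0.0)
    return cx + x * scale, cy + y * scale

# A pixel 100 px off-center with f_persp = 100 lies on a 45-degree ray,
# which lands at radius f_fish * pi/4 in the fisheye image.
u_src, v_src = perspective_to_fisheye(400.0, 300.0, 100.0, 100.0, 300.0, 300.0)
```

In practice this mapping is evaluated for every output pixel and the fisheye image is sampled with interpolation (backward warping), exactly so that no output pixel is left unfilled.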


Image Data Loss Minimized Geometric Correction for Asymmetric Distortion Fish-eye Lens (비대칭 왜곡 어안렌즈를 위한 영상 손실 최소화 왜곡 보정 기법)

  • Cho, Young-Ju;Kim, Sung-Hee;Park, Ji-Young;Son, Jin-Woo;Lee, Joong-Ryoul;Kim, Myoung-Hee
    • Journal of the Korea Society for Simulation / v.19 no.1 / pp.23-31 / 2010
  • Because a fisheye lens provides a super-wide angle, with a field of view over 180 degrees, using the minimum number of cameras, many vehicles now mount such camera systems. To use the camera not only as a viewing system but also as a sensor, camera calibration must be performed first, and geometric correction of the radial distortion is needed to provide images for driver assistance. In this paper, we introduce a geometric correction technique that minimizes image data loss for a vehicle fisheye lens with a field of view over 180° and asymmetric distortion. Geometric correction is a process in which a camera model with a distortion model is established and a corrected view is generated after the camera parameters are calculated through calibration. First, the FOV model is used as the distortion model to imitate the asymmetric distortion. Then, because the horizontal view of the vehicle fisheye lens is asymmetrically wide for the driver, we unify the axis ratio and estimate the parameters with a non-linear optimization algorithm. Finally, we create a corrected view by backward mapping and provide a function to optimize the ratio of the horizontal and vertical axes. This minimizes image data loss and improves visual perception when the input image is undistorted through a perspective projection.
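The FOV distortion model (Devernay and Faugeras) relates undistorted and distorted radii through a single parameter ω, and has a closed-form inverse, which is what makes the backward mapping convenient. A sketch of the forward/inverse pair with a round-trip check; the ω value is hypothetical and the radii are in normalized image coordinates.

```python
import numpy as np

def fov_distort(r_u, w):
    """FOV model: undistorted radius -> distorted radius,
    r_d = arctan(2 * r_u * tan(w/2)) / w."""
    return np.arctan(2 * r_u * np.tan(w / 2)) / w

def fov_undistort(r_d, w):
    """Closed-form inverse of the FOV model,
    r_u = tan(r_d * w) / (2 * tan(w/2))."""
    return np.tan(r_d * w) / (2 * np.tan(w / 2))

w = 1.0                                  # distortion parameter (radians)
r_u = np.linspace(0.01, 2.0, 50)         # normalized undistorted radii
r_back = fov_undistort(fov_distort(r_u, w), w)
```

The paper's asymmetric extension effectively lets the horizontal and vertical axes carry different effective parameters, hence the axis-ratio unification step; the symmetric model above is the starting point.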

Panoramic Transform of Fisheye-Lens Image using Bilateral Interpolation (바이래터럴 필터 보간을 활용한 어안렌즈 영상의 파노라마 변환)

  • Choi, Hyeon-Yeong;Ko, Jae-Pil
    • Proceedings of the Korea Information Processing Society Conference / 2017.04a / pp.1005-1007 / 2017
  • This paper proposes a region-partitioning method for converting an omnidirectional image acquired through a fisheye lens into a panoramic image. To preserve edges while mitigating the image distortion that arises when transforming each partitioned region, a bilateral-filter interpolation method is proposed to replace conventional bilinear interpolation. In the resulting panoramic images, the proposed method preserves edges better than the existing approach.
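The panoramic transform itself maps polar coordinates of the circular fisheye image (angle → column, radius → row) back into the source image. The sketch below uses plain bilinear interpolation as the baseline the abstract mentions, not the proposed bilateral-filter interpolation; the image size and the radially symmetric test pattern are synthetic.

```python
import numpy as np

def unwrap_panorama(img, out_h, out_w):
    """Unwrap a circular fisheye image into a panorama: each output column
    is an angle, each row a radius, sampled with bilinear interpolation."""
    h, w = img.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = min(cy, cx)
    rows = np.arange(out_h)[:, None] * (r_max / (out_h - 1))   # radii
    angs = np.arange(out_w)[None, :] * (2 * np.pi / out_w)     # angles
    ys = cy + rows * np.sin(angs)
    xs = cx + rows * np.cos(angs)
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    fy, fx = ys - y0, xs - x0
    return (img[y0, x0] * (1 - fy) * (1 - fx)
            + img[y0, x0 + 1] * (1 - fy) * fx
            + img[y0 + 1, x0] * fy * (1 - fx)
            + img[y0 + 1, x0 + 1] * fy * fx)

# Toy check: a radially symmetric image unwraps to rows of constant value.
yy, xx = np.mgrid[0:33, 0:33].astype(float)
ring = np.hypot(yy - 16, xx - 16)
pano = unwrap_panorama(ring, 16, 64)
```

The proposed method would replace the four-tap bilinear weighting here with range-aware bilateral weights, so that samples on opposite sides of an edge stop averaging across it.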