• Title/Summary/Keyword: Fish eye

Omni-directional Vision SLAM using a Motion Estimation Method based on Fisheye Image (어안 이미지 기반의 움직임 추정 기법을 이용한 전방향 영상 SLAM)

  • Choi, Yun Won;Choi, Jeong Won;Dai, Yanyan;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.8 / pp.868-874 / 2014
  • This paper proposes a novel mapping algorithm for Omni-directional Vision SLAM based on obstacle feature extraction using Lucas-Kanade Optical Flow (LKOF) motion detection and images obtained through fish-eye lenses mounted on robots. Omni-directional image sensors suffer from distortion because they use a fish-eye lens or mirror, but they are well suited to real-time image processing for mobile robots because all the information around the robot is captured at once. Previous Omni-Directional Vision SLAM research used feature points from fully corrected fisheye images, whereas the proposed algorithm corrects only the feature points of obstacles, which yields faster processing than previous systems. The core of the proposed algorithm may be summarized as follows: First, we capture instantaneous $360^{\circ}$ panoramic images around the robot through fish-eye lenses mounted in the downward direction. Second, we remove the feature points belonging to the floor surface using a histogram filter and label the extracted obstacle candidates. Third, we estimate the locations of obstacles based on motion vectors using LKOF. Finally, the robot position is estimated using an Extended Kalman Filter based on the obstacle positions obtained by LKOF, and a map is created. We confirm the reliability of the mapping algorithm based on fisheye-image motion estimation by comparing the maps obtained with the proposed algorithm against real maps.
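
The Lucas-Kanade optical-flow step described in the abstract above can be illustrated with OpenCV. The following is a minimal sketch, not the authors' implementation: the input file names, parameter values, and motion threshold are assumptions, and the paper's histogram-based floor filtering and EKF update are omitted.

```python
# Minimal sketch of tracking obstacle feature points between two fisheye
# frames with pyramidal Lucas-Kanade optical flow (illustrative only).
import cv2
import numpy as np

prev = cv2.imread("fisheye_prev.png", cv2.IMREAD_GRAYSCALE)  # assumed input
curr = cv2.imread("fisheye_curr.png", cv2.IMREAD_GRAYSCALE)  # assumed input

# Detect candidate feature points in the previous frame.
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=300, qualityLevel=0.01, minDistance=7)

# Track them into the current frame with Lucas-Kanade optical flow.
p1, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None,
                                           winSize=(21, 21), maxLevel=3)

# Keep successfully tracked points and compute their motion vectors; points
# with large motion are kept as obstacle candidates (floor points would be
# removed by the paper's histogram filter before this step).
good_old = p0[status.flatten() == 1].reshape(-1, 2)
good_new = p1[status.flatten() == 1].reshape(-1, 2)
motion = np.linalg.norm(good_new - good_old, axis=1)
obstacle_candidates = good_new[motion > 2.0]  # assumed pixel threshold
print(f"{len(obstacle_candidates)} obstacle candidate points tracked")
```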

An Analysis and Evaluation of Urban Landscapes Using Images Taken with a Fish-eye Lens (천공사진(天空寫眞)을 이용한 도시경관의 분석 및 평가)

  • Han Gab-Soo;Yoon Young-Hwal;Jo Hyun-Kil
    • Journal of the Korean Institute of Landscape Architecture / v.33 no.4 s.111 / pp.11-21 / 2005
  • The purpose of this study was to analyze and evaluate landscape characteristics through a classification of landscapes in Chuncheon. A system was developed to convert images taken with a fish-eye lens into panoramic pictures. Landscape characteristics were analyzed according to the number of times each landscape element appeared and the amount of area it occupied in the panoramic picture, and each panoramic picture was classified into five types based on these landscape element factors. Landscape evaluation was carried out using dynamic images converted from the pictures taken with the fish-eye lens. The results of this study can be summarized as follows. The urban landscape can be characterized by four essential factors: interconnectedness, nature, urban centrality, and landscape scale. Five types of landscapes were determined: detached residential building landscape (type 1), street landscape with various elements (type 2), street landscape in the center of a city (type 3), landscape of a housing complex (type 4), and landscape of green space (type 5). Type 5 had the highest degree of landscape satisfaction, and landscape satisfaction increased with the number of appearances of natural elements. The amount of green space was strongly related to landscape satisfaction.

Fish-eye camera calibration and artificial landmarks detection for the self-charging of a mobile robot (이동로봇의 자동충전을 위한 어안렌즈 카메라의 보정 및 인공표지의 검출)

  • Kwon, Oh-Sang
    • Journal of Sensor Science and Technology / v.14 no.4 / pp.278-285 / 2005
  • This paper describes techniques for camera calibration and artificial landmark detection for the automatic charging of a mobile robot equipped with a fish-eye camera facing its direction of operation for movement or surveillance purposes. To make the charging station identifiable against the surrounding environment, three landmarks fitted with infrared LEDs were installed at the station. When the robot reaches a certain point, a signal is sent to activate the LEDs, which allows the robot to easily detect the landmarks using its vision camera. To eliminate the effects of outside light interference during this process, a difference image was generated by comparing two images taken with the LEDs on and off, respectively. A fish-eye lens was used for the vision camera of the robot, but the wide-angle lens resulted in significant image distortion. The radial lens distortion was corrected after a linear perspective projection transformation based on the pin-hole model. In the experiment, the designed system showed sensing accuracy of ${\pm}10$ mm in position and ${\pm}1^{\circ}$ in orientation at a distance of 550 mm.
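
The LED-on/LED-off difference-image idea in the abstract above can be sketched with standard OpenCV calls. This is a minimal illustration under assumed inputs (image paths, threshold), not the paper's implementation; the subsequent distortion correction and pose computation are omitted.

```python
# Minimal sketch of detecting infrared LED landmarks from a difference image
# (illustrative only; paths and threshold are assumptions).
import cv2

led_on = cv2.imread("leds_on.png", cv2.IMREAD_GRAYSCALE)
led_off = cv2.imread("leds_off.png", cv2.IMREAD_GRAYSCALE)

# Subtracting the LED-off frame suppresses ambient light, leaving mostly
# the three infrared landmarks.
diff = cv2.absdiff(led_on, led_off)
_, mask = cv2.threshold(diff, 50, 255, cv2.THRESH_BINARY)  # assumed threshold

# Each bright blob is a landmark candidate; its centroid gives the image
# position that would later be used for pose estimation.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
centers = []
for c in contours:
    m = cv2.moments(c)
    if m["m00"] > 0:
        centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
print("landmark centroids:", centers)
```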

Molecular Characterization of the Ocular EST Clones from Olive Flounder, Paralichthys olivaceus

  • Lee, Jeong-Ho;Noh, Jae-Koo;Kim, Hyun-Chul;Park, Choul-Ji;Min, Byung-Hwa;Ha, Su-Jin;Park, Jong-Won;Kim, Young-Ok;Kim, Jong-Hyun;Kim, Kyung-Kil;Kim, Woo-Jin;Myeong, Jeong-In
    • Development and Reproduction / v.14 no.2 / pp.107-113 / 2010
  • The olive flounder (Paralichthys olivaceus) is one of the most widely cultured flatfish in Korea and Japan. During development, in a process known as metamorphosis, this fish reorients itself to lie on one side, the body flattens, and the eye migrates to the other side of the body. However, few studies have focused on the molecular regulatory mechanisms of eye development in the olive flounder. To reveal the molecular mechanism of eye development, we identified genes expressed in the eye of the olive flounder using an EST and RT-PCR strategy. A total of 270 ESTs were sequenced, of which 178 (65.9%) clones were identified as known genes and 92 (34.1%) as unknown genes. Among the 178 EST clones, 29 (16.3%) clones representing 9 unique genes were identified as homologous to previously reported olive flounder ESTs, and 131 (73.6%) clones representing 107 unique genes were identified as orthologs of known genes from other organisms. We also identified several eye development-associated proteins, indicating that EST analysis is a powerful method for identifying eye development-related genes of fish as well as novel genes. Further functional studies on these genes will provide more information on the molecular regulatory mechanisms of eye development in the olive flounder.

Characteristics of Fatigue Crack Initiation and Fatigue Strength of Nitrided 1 Cr- 1Mo-0.25V Turbine Rotor Steels

  • Suh, Chang-Min;Hwang, Byung-Won;Murakami, Ri-Ichi
    • Journal of Mechanical Science and Technology / v.16 no.8 / pp.1109-1116 / 2002
  • To investigate the effect of the nitrided layer on both fatigue crack initiation and fatigue life, turbine rotor steel (1Cr-1Mo-0.25V steel) specimens were nitrided by the nitemper method and then subjected to rotary bending fatigue tests at room and elevated temperatures. In nitriding, temperature and time were controlled to obtain different nitrided thicknesses. Microstructure analysis, micro-Vickers hardness testing, and scanning electron microscope observation were carried out to evaluate the experiments. As a result, the fatigue cracks of nitrided specimens initiated at inclusions near the interface between the nitrided layer and the substrate, which showed a fish-eye type appearance in the fractographs. The fatigue life of nitrided specimens at every temperature was prolonged compared to that of the non-nitrided specimens. However, no observable improvement in fatigue characteristics was found with increasing nitrided thickness.

Fish Eye OLSR Scaling Properties

  • Adjih, Cedric;Baccelli, Emmanuel;Clausen, Thomas Heide;Jacquet, Philippe;Rodolakis, Georgios
    • Journal of Communications and Networks / v.6 no.4 / pp.343-351 / 2004
  • Scalability is one of the toughest challenges in ad hoc networking. Recent work outlines theoretical bounds on how well routing protocols could scale in this environment. However, none of the popular routing solutions really scales to large networks, i.e., comes close enough to these bounds. In this paper, we study the case of link state routing and OLSR, one of the strongest candidates for standardization. We analyze why these bounds are not reached in this case, and we study how much scalability is enhanced by the use of Fish eye techniques in addition to the link state routing framework. We show that with this enhancement, the theoretical scalability bounds are reached.
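
The fisheye technique referred to above reduces routing overhead by refreshing link-state information about distant nodes less often than information about nearby nodes. The following toy sketch illustrates that scoping idea only; the scope boundaries and intervals are illustrative assumptions and do not come from the paper.

```python
# Toy sketch of fisheye scoping: topology information about far-away nodes
# is re-flooded less frequently (all values below are assumptions).

def fisheye_refresh_interval(hop_distance: int, base_interval: float = 5.0) -> float:
    """Return how often (in seconds) topology information for a node at the
    given hop distance would be refreshed under a simple three-scope scheme."""
    if hop_distance <= 2:        # inner scope: full rate
        return base_interval
    elif hop_distance <= 6:      # middle scope: reduced rate
        return base_interval * 3
    else:                        # outer scope: lowest rate
        return base_interval * 9

if __name__ == "__main__":
    for d in (1, 3, 8):
        print(f"hop distance {d}: refresh every {fisheye_refresh_interval(d)} s")
```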

Phrixocephalus umbellatus (Copepoda : Lernaeidae) from Marine Fish, Branchiostegus japonicus of the Korea Southern Sea

  • Choi, Sang-Duk;Lee, Chang-Hoon;Chang, Dae-Soo;Ha, Dong-Soo
    • Journal of Aquaculture / v.13 no.1 / pp.9-12 / 2000
  • A species of parasitic copepod, Phrixocephalus umbellatus (Lernaeidae: Cyclopoida), from Branchiostegus japonicus is described and reported for the first time in Korea. The parasite was recovered from the eye of the host. P. umbellatus is easily identified by its body shape, the extensive ramification of the antennal processes, and the numerous branches on the thoracic horns. The parasite inserted its head and the anterior portion of the thorax, up to the 4th segment, into the eyeball of the host through a narrow hole which it usually burrowed near the upper margin of the cornea above the crystalline lens. Prevalence of the parasite increased from 3.3% in January to 11.9% in June.


The Development of a Panorama System with Fish-Eye Lens (어안 렌즈를 이용한 파노라마 시스템 개발에 관한 연구)

  • Yi, Un-Kun;Cho, Seog-Bin;Baek, Kwang-Ryul;Kang, Bum-Soo
    • Proceedings of the KIEE Conference / 2001.07d / pp.2428-2430 / 2001
  • Because the field of view of a typical camera is much narrower than that of a human, it is difficult to capture a large object in a single frame or to monitor its movement over a wide area. In this paper, we implemented a panorama system that acquires wide-field images through a fish-eye lens and reconstructs perspective and panoramic images from them in real time. In addition, to compensate for the resolution differences that arise during the image conversion process due to the characteristics of the fish-eye lens, several interpolation methods were applied and their results compared.
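
A fisheye image can be unwrapped into a panorama with a polar-to-Cartesian remapping, in the spirit of the system described above. The sketch below is a minimal illustration (the image path, center, radius, and panorama size are all assumptions), not the authors' system; as in the abstract, different interpolation flags can be compared at the remapping step.

```python
# Minimal sketch of unwrapping a circular fisheye image into a panorama
# by polar-to-Cartesian remapping (illustrative only).
import cv2
import numpy as np

fisheye = cv2.imread("fisheye.png")          # assumed circular fisheye image
h, w = fisheye.shape[:2]
cx, cy, radius = w / 2.0, h / 2.0, min(w, h) / 2.0   # assumed image circle

pano_w, pano_h = 1024, int(radius)           # assumed panorama size
theta = np.linspace(0, 2 * np.pi, pano_w, dtype=np.float32)
r = np.linspace(0, radius, pano_h, dtype=np.float32)
theta_grid, r_grid = np.meshgrid(theta, r)

# Each panorama pixel (column = angle, row = radius) samples the fisheye
# image along a ray from its center.
map_x = (cx + r_grid * np.cos(theta_grid)).astype(np.float32)
map_y = (cy + r_grid * np.sin(theta_grid)).astype(np.float32)

# INTER_NEAREST / INTER_LINEAR / INTER_CUBIC can be compared here, as the
# abstract does when studying resolution differences.
panorama = cv2.remap(fisheye, map_x, map_y, interpolation=cv2.INTER_LINEAR)
cv2.imwrite("panorama.png", panorama)
```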


Position Detection and Gathering Swimming Control of Fish Robot Using Color Detection Algorithm (색상 검출 알고리즘을 활용한 물고기로봇의 위치인식과 군집 유영제어)

  • Akbar, Muhammad;Shin, Kyoo Jae
    • Proceedings of the Korea Information Processing Society Conference / 2016.10a / pp.510-513 / 2016
  • Detecting an object in image processing is essential, but it depends on the object itself and the environment. An object can be detected either by its shape or by its color. Color is essential for pattern recognition and computer vision; it is an attractive feature because of its simplicity, its robustness to scale changes, and its usefulness for detecting the positions of objects. Generally, the perceived color of an object depends on the characteristics of the perceiving eye and brain. Physically, objects can be said to have color because of the light leaving their surfaces. Here, we conducted experiments in an aquarium fish tank in which fish robots of different colors mimic the natural swimming of fish. Unfortunately, in the underwater medium the colors are modified by attenuation, and it is difficult to identify the color of moving objects. We treat the fish robot as a moving object whose coordinates are found at every instant in the aquarium, and we detect its position using OpenCV color detection. In this paper, we propose to identify the position of each fish robot by its color and to use the position data to make the fish robots gather at one point in the fish tank, controlled through serial communication using an RF module. The approach was verified by a performance test of detecting the positions of the fish robots.
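
The OpenCV color-detection step mentioned above can be sketched as HSV thresholding followed by taking the centroid of the largest blob. This is a minimal illustration, not the authors' code: the camera source, the HSV range, and the choice of red as the target color are assumptions, and the RF-based gathering control is omitted.

```python
# Minimal sketch of finding a fish robot's position by HSV color detection
# (illustrative only; camera index and HSV range are assumptions).
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                     # assumed camera over the tank
lower = np.array([0, 120, 70])                # assumed HSV range for "red"
upper = np.array([10, 255, 255])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)

    # The largest blob of the target color is taken as the fish robot;
    # its centroid is the position that would be sent over the RF link.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        c = max(contours, key=cv2.contourArea)
        m = cv2.moments(c)
        if m["m00"] > 0:
            x, y = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            print("fish robot position:", x, y)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
```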

Localization using Ego Motion based on Fisheye Warping Image (어안 워핑 이미지 기반의 Ego motion을 이용한 위치 인식 알고리즘)

  • Choi, Yun Won;Choi, Kyung Sik;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.1 / pp.70-77 / 2014
  • This paper proposes a novel localization algorithm based on ego-motion which uses Lucas-Kanade Optical Flow and warped images obtained through fish-eye lenses mounted on the robot. The omnidirectional image sensor is a desirable sensor for real-time view-based recognition by a robot because all the information around the robot can be obtained simultaneously. Preprocessing (distortion correction, image merging, etc.) of the omnidirectional image, which is obtained by a camera using a reflecting mirror or by combining multiple camera images, is essential because it is difficult to obtain information from the original image. The core of the proposed algorithm may be summarized as follows: First, we capture instantaneous $360^{\circ}$ panoramic images around the robot through fish-eye lenses mounted in the downward direction. Second, we extract motion vectors using Lucas-Kanade Optical Flow in the preprocessed images. Third, we estimate the robot position and angle using an ego-motion method based on the directions of the motion vectors and the vanishing point obtained by RANSAC. We confirmed the reliability of the localization algorithm based on ego-motion from fisheye warping images by comparing the experimental results (position and angle) obtained using the proposed algorithm with those measured by a Global Vision Localization System.
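
The third step above (estimating motion direction from flow vectors and a RANSAC vanishing point) can be illustrated with a small self-contained sketch. This is not the authors' ego-motion method: the synthetic flow field, iteration count, and inlier threshold are assumptions; it only shows how a vanishing point (focus of expansion) could be estimated from flow lines by RANSAC.

```python
# Minimal sketch of RANSAC vanishing-point estimation from optical-flow
# vectors (illustrative only).
import numpy as np

def flow_line(p, d):
    """Homogeneous line through point p along flow direction d."""
    a = np.array([p[0], p[1], 1.0])
    b = np.array([p[0] + d[0], p[1] + d[1], 1.0])
    return np.cross(a, b)

def point_line_distance(q, line):
    a, b, c = line
    return abs(a * q[0] + b * q[1] + c) / np.hypot(a, b)

def ransac_vanishing_point(points, flows, iters=200, thresh=3.0, rng=None):
    rng = rng or np.random.default_rng(0)
    lines = [flow_line(p, d) for p, d in zip(points, flows)]
    best_vp, best_inliers = None, -1
    for _ in range(iters):
        i, j = rng.choice(len(lines), size=2, replace=False)
        vp_h = np.cross(lines[i], lines[j])   # intersection of two flow lines
        if abs(vp_h[2]) < 1e-9:               # nearly parallel lines, skip
            continue
        vp = vp_h[:2] / vp_h[2]
        inliers = sum(point_line_distance(vp, l) < thresh for l in lines)
        if inliers > best_inliers:
            best_vp, best_inliers = vp, inliers
    return best_vp, best_inliers

if __name__ == "__main__":
    # Synthetic flow field radiating from a true vanishing point at (320, 240).
    rng = np.random.default_rng(1)
    pts = rng.uniform(0, 640, size=(100, 2))
    flows = (pts - np.array([320.0, 240.0])) * 0.05 + rng.normal(0, 0.2, (100, 2))
    vp, n = ransac_vanishing_point(pts, flows, rng=rng)
    print("estimated vanishing point:", vp, "inliers:", n)
```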