• Title/Summary/Keyword: fish robot


A Study of Detecting Fish Robot Position using the Comparing Image Data Algorithm (이미지 비교 알고리즘을 이용한 물고기 로봇 위치 탐지 연구)

  • Musunuri, Yogendra Rao;Jeon, UYeol;Shin, KyooJae
    • Proceedings of the Korea Information Processing Society Conference / 2015.10a / pp.1341-1344 / 2015
  • In this paper, a fish robot designed as an aquarium underwater robot is researched and developed. The study addresses how to find the location of fish robots from outside the tank, without dedicated sensors or internal devices. The model detects the position of the robotic fish in MATLAB and Simulink: the tank is observed through a video device such as a camera or camcorder, and an image-processing technique identifies the location of the robotic fish. Two methods are applied: the Horn-Schunck method and a newly proposed comparing-image-data algorithm. The Horn-Schunck method obtains the velocity of each pixel in the image, while the comparing-image-data algorithm obtains the position by comparing two video frames under the assumption of a constant velocity within each frame.
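
The comparing-image-data idea in the abstract above can be sketched as simple frame differencing: threshold the difference of two consecutive grayscale frames and take the centroid of the changed pixels as the robot's position; under the paper's constant-velocity assumption, velocity then follows from two such positions. This is a minimal illustration, not the authors' exact algorithm; the threshold value and function names are assumptions.

```python
import numpy as np

def estimate_position(frame_a: np.ndarray, frame_b: np.ndarray, thresh: int = 30):
    """Estimate the moving robot's (row, col) position as the centroid of
    pixels that changed between two consecutive grayscale frames."""
    diff = np.abs(frame_a.astype(np.int16) - frame_b.astype(np.int16))
    changed = np.argwhere(diff > thresh)   # coordinates of changed pixels
    if changed.size == 0:
        return None                        # no motion detected
    return changed.mean(axis=0)            # centroid (row, col)

def estimate_velocity(pos_a, pos_b, dt: float):
    """Constant-velocity assumption between frames: v = delta position / delta t."""
    return (np.asarray(pos_b) - np.asarray(pos_a)) / dt
```

In practice the robot's body appears in both frames, so the centroid of all changed pixels lies between the old and new positions; tracking only newly bright pixels, or a short sliding window, reduces that bias.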

A study on the straight cruise of fish robot according to biological mimic (생물학적 모방에 따른 물고기 로봇의 직진유영 연구)

  • Park, Jin-Hyun;Lee, Tae-Hwan;Choi, Young-Kiu
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.8 / pp.1756-1763 / 2011
  • This paper researched the straight cruise of a fish robot based on biological mimicry. By simulation, it compared the proposed method, which considers up to the 7th-order components of the Fourier series of Liu's tail motion function, with the approximate method, which uses a plain sine function. If a fish robot has a large number of links and the tail links are long, the end rotary joint trajectory of the tail motion function generally differs from a sine function, so the approximate method, which expresses tail motion trajectories using only the fundamental Fourier component, is inadequate. In computer simulation, the proposed method showed 10% better propulsion and velocity than the conventional method.
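
The difference between the fundamental-only sine approximation and a higher-order Fourier series for the tail joint angle can be sketched as follows. The coefficient values are hypothetical, chosen only for illustration; they are not Liu's actual tail motion function.

```python
import math

def tail_angle(t: float, period: float, coeffs) -> float:
    """Tail joint angle from a truncated Fourier series.
    coeffs: list of (a_k, b_k) pairs for harmonics k = 1..N."""
    w = 2.0 * math.pi / period
    return sum(a * math.cos(k * w * t) + b * math.sin(k * w * t)
               for k, (a, b) in enumerate(coeffs, start=1))

# Fundamental-only (plain sine) approximation vs. a 7th-order series;
# amplitudes below are illustrative placeholders.
sine_only = [(0.0, 0.5)]
seventh   = [(0.0, 0.5), (0.0, 0.0), (0.0, 0.1),
             (0.0, 0.0), (0.0, 0.05), (0.0, 0.0), (0.0, 0.02)]
```

Evaluating both over one period shows how the higher harmonics sharpen the trajectory near the stroke extremes, which is exactly where a long multi-link tail deviates most from a sine.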

NUMERICAL ANALYSIS OF THE AIRFOIL IN SELF-PROPELLED FISH MOTION USING IMMERSED BOUNDARY LATTICE BOLTZMANN METHOD (가상경계볼쯔만법을 이용한 자력추진 물고기 운동 익의 유영해석)

  • Kim, Hyung-Min
    • Journal of computational fluids engineering / v.16 no.2 / pp.24-29 / 2011
  • The immersed boundary lattice Boltzmann method was applied to analyze the characteristics of self-propelled fish-motion swimming robots. A NACA0012 airfoil with a caudal-fin stroke model was considered. The foil, in steady forward motion combined with steady-state harmonic deformation, produces thrust through the formation of a flow downstream of the trailing edge. The harmonic motion of the foil causes unsteady shedding of vorticity from the trailing edge, while vortices also form at the leading edge. The resultant thrust develops from the pressure difference between the upper and lower surfaces of the airfoil, and the time-averaged thrust coefficient increases with Re in the region Re ≤ 700. The suggested numerical method is suitable for developing a fish-motion model to control the swimming robot; however, it would need to be extended to 3D analysis to examine higher Re and to determine the mechanism of thrust production in more detail.
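
The time-averaged thrust coefficient mentioned above can be illustrated with a short numerical sketch: sample an instantaneous thrust signal over one stroke period and average its normalized value. The thrust signal, fluid properties, and function names here are assumptions for illustration; in the paper the instantaneous thrust comes from the immersed boundary lattice Boltzmann simulation itself.

```python
import math

def mean_thrust_coefficient(thrust_fn, rho: float, u: float, chord: float,
                            period: float, n: int = 1000) -> float:
    """Time-average the instantaneous thrust coefficient
    C_T(t) = T(t) / (0.5 * rho * U^2 * c) over one stroke period."""
    dt = period / n
    samples = [thrust_fn(i * dt) / (0.5 * rho * u ** 2 * chord)
               for i in range(n)]
    return sum(samples) / n

# Illustrative thrust signal: a mean thrust plus a harmonic fluctuation
# at the stroke frequency (placeholder values, not simulation output).
thrust = lambda t: 0.1 + 0.05 * math.sin(2.0 * math.pi * t)
```

The oscillatory part averages out over a full period, so only the mean component survives in the time-averaged coefficient.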

Fuzzy Distance Estimation for a Fish Robot

  • Shin, Daejung;Na, Seung-You;Kim, Jin-Young
    • International Journal of Fuzzy Logic and Intelligent Systems / v.5 no.4 / pp.316-321 / 2005
  • We designed and implemented fish robots for various purposes such as autonomous navigation, maneuverability control, posture balancing, and improvement of quick turns in a tank of 120 × 120 × 180 cm. Typically, the fish robots have dimensions of 30-50 × 15-25 × 10-20 cm (length, width, and height, respectively). The ability to turn quickly and smoothly is essential to avoid collisions with obstacles or the walls of the water pool at close distance. Infrared distance sensors detect obstacles, magneto-resistive sensors read direction information, and a two-axis accelerometer is mounted to compensate the output of the direction sensors. Because of the swinging of the robot's head caused by the tail-fin movement, the outputs of an infrared distance sensor contain a large amount of noise around the true distances. With the information from the accelerometer and e-compass, much improved distance data can be obtained by fuzzy-logic-based estimation. Successful swimming and smooth turns without collision demonstrated the effectiveness of the distance estimation.
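
A fuzzy-logic distance correction of the kind described can be sketched as blending the noisy IR reading with the previous estimate, weighted by how strongly the head is currently swinging (as reported by the e-compass and accelerometer). The membership functions, thresholds, and function names below are hypothetical; the paper's actual rule base is not given in the abstract.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_distance(raw_ir: float, prev_estimate: float, swing_deg: float) -> float:
    """Blend the noisy IR reading with the previous estimate.
    The larger the head swing, the less the raw reading is trusted."""
    small = tri(swing_deg, -1.0, 0.0, 15.0)   # swing is small -> trust sensor
    large = tri(swing_deg, 5.0, 20.0, 90.0)   # swing is large -> trust estimate
    total = small + large
    if total == 0.0:
        return raw_ir
    w_sensor = small / total                   # defuzzified sensor weight
    return w_sensor * raw_ir + (1.0 - w_sensor) * prev_estimate
```

With no swing the raw reading passes through unchanged; during a strong tail stroke the estimator leans on the previous estimate instead of the momentarily unreliable IR output.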

Online Trajectory Planning for a PUMA Robot

  • Kang, Chul-Goo
    • International Journal of Precision Engineering and Manufacturing / v.8 no.4 / pp.16-21 / 2007
  • Robotic applications, such as automatic fish cutting, require online trajectory planning because the material properties of the object, such as the bone or flesh conditions, are not known in advance. Different trajectories are required when the material properties vary. An effective online trajectory-planning algorithm is proposed using quaternions to determine the position and orientation of a robot manipulator with a spherical wrist. Quaternions are free of representation singularities and permit computationally efficient orientation interpolations. To prevent singular configurations, the exact locations of the kinematic singularities of the PUMA 560 manipulator are derived and geometrically illustrated when a forearm offset exists and the third link length is not zero.
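
Quaternion-based orientation interpolation of the kind used in the trajectory planner can be sketched with spherical linear interpolation (slerp), which moves between two unit quaternions at a constant angular rate and is free of the representation singularities mentioned above. This is a generic textbook sketch, not the paper's full online planner.

```python
import math

def slerp(q0, q1, t: float):
    """Spherical linear interpolation between unit quaternions.
    Quaternions are (w, x, y, z) tuples; t in [0, 1]."""
    dot = sum(a * b for a, b in zip(q0, q1))
    if dot < 0.0:                        # take the shorter arc
        q1, dot = tuple(-c for c in q1), -dot
    if dot > 0.9995:                     # nearly parallel: fall back to lerp
        out = [a + t * (b - a) for a, b in zip(q0, q1)]
        n = math.sqrt(sum(c * c for c in out))
        return tuple(c / n for c in out)
    theta = math.acos(dot)               # angle between the quaternions
    s0 = math.sin((1.0 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return tuple(s0 * a + s1 * b for a, b in zip(q0, q1))
```

Halfway between the identity and a 90° rotation about z, slerp returns exactly the 45° rotation, which is what makes quaternion interpolation computationally attractive for online orientation planning.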

Omni-directional Vision SLAM using a Motion Estimation Method based on Fisheye Image (어안 이미지 기반의 움직임 추정 기법을 이용한 전방향 영상 SLAM)

  • Choi, Yun Won;Choi, Jeong Won;Dai, Yanyan;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.8 / pp.868-874 / 2014
  • This paper proposes a novel mapping algorithm for omni-directional vision SLAM based on extracting obstacle features using Lucas-Kanade optical flow motion detection on images obtained through fish-eye lenses mounted on robots. Omni-directional image sensors have distortion problems because they use a fish-eye lens or mirror, but they enable real-time image processing for mobile robots because all information around the robot is measured at once. Previous omni-directional vision SLAM research used feature points in fully corrected fisheye images, whereas the proposed algorithm corrects only the feature points of the obstacles, which yields faster processing than previous systems. The core of the proposed algorithm may be summarized as follows: First, we capture instantaneous 360° panoramic images around the robot through downward-mounted fish-eye lenses. Second, we remove the feature points of the floor surface using a histogram filter and label the extracted obstacle candidates. Third, we estimate the locations of obstacles from motion vectors using Lucas-Kanade optical flow. Finally, we estimate the robot position using an extended Kalman filter based on the obstacle positions obtained by optical flow, and create a map. We confirm the reliability of the mapping algorithm by comparing maps obtained using the proposed algorithm with real maps.
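
The Lucas-Kanade step above (estimating a motion vector at a feature point) can be sketched as solving the optical-flow least-squares system over a small window around the pixel. This is a textbook single-level version without the pyramidal refinement a real SLAM front end would typically use, and the function name is an assumption.

```python
import numpy as np

def lucas_kanade(prev: np.ndarray, curr: np.ndarray, y: int, x: int, win: int = 7):
    """Estimate the optical-flow vector (dx, dy) at pixel (y, x) by solving
    the Lucas-Kanade least-squares system over a win x win window."""
    h = win // 2
    p = prev.astype(np.float64)
    c = curr.astype(np.float64)
    iy, ix = np.gradient(p)              # spatial gradients (central differences)
    it = c - p                           # temporal difference
    sl = (slice(y - h, y + h + 1), slice(x - h, x + h + 1))
    A = np.stack([ix[sl].ravel(), iy[sl].ravel()], axis=1)
    b = -it[sl].ravel()
    flow, *_ = np.linalg.lstsq(A, b, rcond=None)  # solves Ix*u + Iy*v = -It
    return flow                          # (dx, dy)
```

On a smooth intensity ramp shifted by one pixel, the solver recovers the unit displacement; in the paper's pipeline such vectors feed the obstacle-position estimate and the EKF update.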

Localization using Ego Motion based on Fisheye Warping Image (어안 워핑 이미지 기반의 Ego motion을 이용한 위치 인식 알고리즘)

  • Choi, Yun Won;Choi, Kyung Sik;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems / v.20 no.1 / pp.70-77 / 2014
  • This paper proposes a novel localization algorithm based on ego-motion, which uses Lucas-Kanade optical flow and warping images obtained through fish-eye lenses mounted on robots. The omnidirectional image sensor is desirable for real-time view-based recognition by a robot because all the information around the robot can be obtained simultaneously. Preprocessing (distortion correction, image merging, etc.) of the omnidirectional image, whether obtained by a camera with a reflecting mirror or by combining multiple camera images, is essential because it is difficult to obtain information from the original image. The core of the proposed algorithm may be summarized as follows: First, we capture instantaneous 360° panoramic images around the robot through downward-mounted fish-eye lenses. Second, we extract motion vectors using Lucas-Kanade optical flow in the preprocessed image. Third, we estimate the robot position and angle using an ego-motion method based on the vector directions and the vanishing point obtained by RANSAC. We confirmed the reliability of the localization algorithm by comparing the position and angle obtained by the proposed algorithm with measurements from a global vision localization system.
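
The RANSAC idea in the ego-motion step can be illustrated on a simplified problem: estimating the dominant translation from a set of noisy motion vectors while rejecting outliers. Estimating a vanishing point works on the same hypothesize-and-count-inliers principle; the parameters and function name below are assumptions.

```python
import random

def ransac_translation(vectors, iters: int = 200, tol: float = 0.5, seed: int = 0):
    """Estimate the dominant image translation (robot ego-motion) from noisy
    motion vectors by RANSAC: pick one vector as a hypothesis, count how many
    vectors agree within tol, then refine with the inlier average."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        gx, gy = rng.choice(vectors)               # one-vector hypothesis
        inliers = [(vx, vy) for vx, vy in vectors
                   if (vx - gx) ** 2 + (vy - gy) ** 2 <= tol ** 2]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    n = len(best_inliers)
    return (sum(v[0] for v in best_inliers) / n,
            sum(v[1] for v in best_inliers) / n)
```

Vectors from moving obstacles or mismatched features land outside the tolerance and are simply ignored, which is why RANSAC-style estimation is robust enough for ego-motion on real fisheye imagery.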

Design of an Autonomous Eating Pet Robot

  • Park, Ch.S.;Choi, B.J.;Park, S.H.;Lee, Y.J.
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2003.10a / pp.855-858 / 2003
  • Pet robots that interact with people have been developed in increasing numbers recently. Examples include robot dogs, robot cats, and robot fish. A pet robot is characterized by its ability to sympathize with and give pleasure to humans; pet robots express delight, sorrow, surprise, and hunger through artificial intelligence. Previously, a pet robot's battery had to be exchanged when it was exhausted. Commercialized robots have a self-recharging function, which is expressed as hunger: the robot dog AIBO, by SONY in Japan, checks its battery to express hunger and finds an energy station to recharge. While AIBO's operation time is 1 hour 30 minutes, its recharging time is 2 hours; recharging takes longer than operation, and the robot does not operate while recharging. This problem motivated the idea of a robot that eats batteries. In this paper, we introduce an Autonomous Eating Pet Robot and propose a design for its realization. The Autonomous Eating Pet Robot implements the most basic instincts: finding food and excreting.
