• Title/Abstract/Keyword: image-based visual servoing

Search results: 53

마커인식과 혼합 비주얼 서보잉 기법을 통한 이동로봇의 자세 안정화 제어 (Posture Stabilization Control for Mobile Robot using Marker Recognition and Hybrid Visual Servoing)

  • 이성구;권지욱;홍석교;좌동경
    • 전기학회논문지
    • /
    • Vol. 60, No. 8
    • /
    • pp.1577-1585
    • /
    • 2011
  • This paper proposes a posture stabilization control algorithm for a wheeled mobile robot using a hybrid visual servo control method that combines position-based and image-based visual servoing (PBVS and IBVS). To overcome the chattering phenomena observed in previous research, which relied on simple threshold-based switching functions, the proposed hybrid visual servo control law introduces a fusion function based on a blending function, eliminating both the chattering problem and abrupt motion of the mobile robot. In addition, unlike previous visual servo control laws based on linear control methods, the nonlinearity of the wheeled mobile robot is taken into account to improve performance. The proposed posture stabilization control law using hybrid visual servoing is verified by theoretical analysis, simulations, and experiments.
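
The blending-based fusion described above can be sketched as follows. This is a minimal illustration, not the paper's control law: the sigmoid weight and its threshold `e0` and slope `k` are assumptions.

```python
import math

def blend_weight(e_img, e0=50.0, k=0.1):
    """Smooth IBVS weight (hypothetical sigmoid): approaches 1 as the
    image-feature error e_img falls below the threshold e0."""
    return 1.0 / (1.0 + math.exp(k * (e_img - e0)))

def hybrid_command(v_pbvs, v_ibvs, e_img):
    """Fuse the PBVS and IBVS velocity commands with a smooth blend
    instead of a hard threshold switch, avoiding chattering at the
    switching boundary."""
    w = blend_weight(e_img)
    return [(1.0 - w) * p + w * i for p, i in zip(v_pbvs, v_ibvs)]
```

At `e_img == e0` the command is an equal mix of both controllers, and because the weight varies continuously, the fused command cannot chatter between the two laws the way a threshold switch can.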

자율주행 모바일 역진자의 비주얼서보잉에 대한 연구 (A Study on the Visual Servoing of Autonomous Mobile Inverted Pendulum)

  • 이준민;이장명
    • 제어로봇시스템학회논문지
    • /
    • Vol. 19, No. 3
    • /
    • pp.240-247
    • /
    • 2013
  • This paper proposes an optimal three-dimensional coordinate implementation of the vision sensor using two CCD cameras. PBVS (position-based visual servoing) is implemented using the positional information obtained from images; stereo vision by the PBVS method, in which every frame is enhanced using calibration parameters, is effective for distance calculation. IBVS (image-based visual servoing) is implemented using the difference between the reference and obtained images; stereo vision by the IBVS method calculates the distance using the rotation angles of the motors corresponding to the eyes and neck, without image enhancement. The two methods are compared in terms of advantages, disadvantages, computing time, and performance. Finally, the IBVS method is applied to the dual-arm manipulator on the mobile inverted pendulum, and the autonomous mobile inverted pendulum is successfully demonstrated using the center of mass of the manipulator.
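
The stereo distance calculation underlying the PBVS approach can be sketched with the textbook pinhole/disparity relation (not the paper's calibrated pipeline; the focal length and baseline values in the test are arbitrary):

```python
def stereo_depth(f_px, baseline_m, x_left_px, x_right_px):
    """Depth from a rectified stereo pair: Z = f * B / d, where the
    disparity d is the horizontal pixel offset of the same point
    between the left and right views."""
    d = x_left_px - x_right_px
    if d <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / d
```

For example, with a 700 px focal length and a 0.1 m baseline, a 70 px disparity corresponds to a depth of 1 m; halving the disparity doubles the depth.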

Robot Manipulator Visual Servoing via Kalman Filter-Optimized Extreme Learning Machine and Fuzzy Logic

  • Zhou, Zhiyu;Hu, Yanjun;Ji, Jiangfei;Wang, Yaming;Zhu, Zefei;Yang, Donghe;Chen, Ji
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 16, No. 8
    • /
    • pp.2529-2551
    • /
    • 2022
  • Visual servoing (VS) based on the Kalman filter (KF), as in KF-based image-based visual servoing (IBVS) systems, suffers from three problems in uncalibrated environments: perturbation noise in the robot system, errors in the noise statistics, and slow convergence. To solve these three problems, this paper combines KF-based IBVS with an extreme learning machine enhanced by the African vultures optimization algorithm (AVOA-ELM) and with fuzzy logic (FL). First, the KF estimates the image Jacobian matrix online, and an AVOA-ELM error compensation model is proposed to compensate for the sub-optimal KF estimate, addressing the disturbance-noise and noise-statistics problems. Next, an FL controller is designed for gain adaptation, which addresses the slow convergence of KF-based IBVS. The resulting visual servoing scheme combines FL and KF-AVOA-ELM (FL-KF-AVOA-ELM). Finally, the algorithm is verified on the 6-DOF robotic manipulator PUMA 560. Compared with existing methods, the proposed algorithm solves the three problems above without camera parameters, a robot kinematic model, or target depth information; in comparisons with other KF-based IBVS methods under different disturbance-noise environments, it achieves the best results on all three evaluation metrics.
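
The online KF estimation of the image Jacobian can be sketched as follows. This is a generic random-walk formulation (the flattened Jacobian as the filter state, with measurement Δs = J Δq), not the paper's exact filter; the noise covariances in the test are assumptions.

```python
import numpy as np

def kf_jacobian_update(j_flat, P, dq, ds, Q, R):
    """One KF step treating the flattened m x n image Jacobian as the
    state. Measurement model: ds = H @ j_flat with H = kron(I_m, dq^T),
    i.e. each image-feature increment equals J[i, :] @ dq."""
    m, n = len(ds), len(dq)
    H = np.kron(np.eye(m), dq.reshape(1, n))
    P = P + Q                                # predict (random-walk state)
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    j_flat = j_flat + K @ (ds - H @ j_flat)  # correct with measured ds
    P = (np.eye(len(j_flat)) - K @ H) @ P
    return j_flat, P
```

After a few joint increments that excite all directions, the estimate recovers the Jacobian without any camera parameters or kinematic model, which is what makes the uncalibrated setting workable.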

카메라 디포커싱을 이용한 로보트의 시각 서보 (Visual Servoing of a Robot Using Camera Defocusing)

  • 신진우;고국현;조형석
    • 한국정밀공학회:학술대회논문집
    • /
    • 한국정밀공학회 1994년도 추계학술대회 논문집
    • /
    • pp.559-564
    • /
    • 1994
  • Recently, visual servoing for an eye-in-hand robot has become an interesting problem. The distance between the camera and a task object is very useful information for visual servoing. In previous work, this distance was obtained from the difference between a reference and a measured feature value of the object, such as its area on the image plane. However, since such features depend on the object, the reference feature value must be changed whenever a different task object is used. To overcome this difficulty, this paper presents a novel method in which blur is used to obtain the distance. Blur, one of the most important image features, depends on the focal length of the camera; since it is not affected by a change of object, the reference feature value need not be changed when a different task object is used. This paper derives the relationship between distance and blur, and defines a feature Jacobian matrix based on camera defocusing to operate the robot. A series of experiments verifies the proposed method.
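
The distance-blur relationship can be sketched with the standard thin-lens defocus model (textbook optics, not the paper's calibrated relation; the focal length, aperture, and sensor distance in the test are arbitrary). The inversion below assumes the object lies beyond the focused distance.

```python
def blur_diameter(u, f, aperture, s):
    """Thin lens: an object at distance u focuses at v (1/v = 1/f - 1/u);
    a sensor at distance s sees a blur circle of diameter A*|s - v|/v.
    The blur depends only on geometry, not on which object is viewed."""
    v = 1.0 / (1.0 / f - 1.0 / u)
    return aperture * abs(s - v) / v

def distance_from_blur(c, f, aperture, s):
    """Invert the model for an object beyond the focused distance (v < s):
    c = A*(s - v)/v  =>  v = A*s/(A + c), then 1/u = 1/f - 1/v."""
    v = aperture * s / (aperture + c)
    return 1.0 / (1.0 / f - 1.0 / v)
```

Because the blur is a property of the lens geometry and the distance alone, the reference feature value stays valid when the task object changes, which is the key advantage claimed over object-dependent features such as image-plane area.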


A Study on Feature-Based Visual Servoing Control of Robot System by Utilizing Redundant Feature

  • Han, Sung-Hyun;Hideki Hashimoto
    • Journal of Mechanical Science and Technology
    • /
    • Vol. 16, No. 6
    • /
    • pp.762-769
    • /
    • 2002
  • This paper presents how effective the use of many features is for improving the speed and accuracy of visual servo systems. Rank conditions relating the image Jacobian to control performance are derived. The focus is on showing that the accuracy of camera position control in the world coordinate system is increased by utilizing redundant features, and it is proven that accuracy improves as the number of features increases. The effectiveness of the redundant features is evaluated by the smallest singular value of the image Jacobian, which is closely related to positioning accuracy with respect to the world coordinate system. The usefulness of redundant features is verified by real-time experiments on a dual-arm robot manipulator made by Samsung Electronics Co., Ltd.
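
The accuracy measure can be sketched with the standard point-feature interaction matrix (the usual IBVS formulation, assumed here; the example points in the test are arbitrary). Adding a redundant feature appends rows to the stacked Jacobian, and since the new J'ᵀJ' equals the old JᵀJ plus a positive semi-definite term, the smallest singular value can only stay the same or grow.

```python
import numpy as np

def interaction_rows(x, y, Z):
    """Standard 2 x 6 interaction-matrix block for a normalized image
    point (x, y) at depth Z, mapping camera twist to feature velocity."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def accuracy_measure(points):
    """Smallest singular value of the stacked image Jacobian; larger
    values indicate a better-conditioned, more accurate servo system."""
    J = np.vstack([interaction_rows(x, y, Z) for (x, y, Z) in points])
    return np.linalg.svd(J, compute_uv=False).min()
```

This monotonicity is exactly why redundant features cannot hurt, and generally help, the accuracy criterion used in the paper.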

로봇 OLP 보상을 위한 시각 서보잉 응용에 관한 연구 (A Study on Visual Servoing Application for Robot OLP Compensation)

  • 김진대;신찬배;이재원
    • 한국정밀공학회지
    • /
    • Vol. 21, No. 4
    • /
    • pp.95-102
    • /
    • 2004
  • Intelligent robot systems must work accurately in, and adapt to, their working environment, and vision sensors have long been studied for this reason. However, camera and robot calibration is difficult to perform because three-dimensional reconstruction and many other processes are required in practice. This paper applies image-based visual servoing to avoid the classical calibration procedure and to support OLP (off-line programming) path compensation. A virtual camera is modeled from the real camera parameters, and the virtual images it produces simplify the perception process. The initial path generated by OLP is then compensated using the pixel-level differences between the real and virtual images. Consequently, the proposed visually assisted OLP teaching removes the calibration and reconstruction steps in the real workspace. In a virtual simulation, improved performance is observed, and the robot path error is corrected from the image differences.

로봇의 운동특성을 고려한 새로운 시각구동 방법 (A novel visual servoing techniques considering robot dynamics)

  • 이준수;서일홍;김태원
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 제어로봇시스템학회 1996년도 한국자동제어학술회의논문집(국내학술편); 포항공과대학교, 포항; 24-26 Oct. 1996
    • /
    • pp.410-414
    • /
    • 1996
  • A visual servoing algorithm is proposed for a robot with a camera in hand. Specifically, novel image features are suggested by employing a perspective-projection viewing model to estimate the relative pitching and yawing angles between the object and the camera. To compensate for the dynamic characteristics of the robot, desired feature trajectories for learning visually guided line-of-sight robot motion are obtained by measuring features with the in-hand camera not over the entire workspace, but along a single linear path along which the robot moves under the control of a commercially provided linear-motion function. Control actions of the camera that follow these desired feature trajectories are then approximated by fuzzy-neural networks. Experimental results with a four-axis SCARA robot and a B/W CCD camera illustrate the validity of the proposed algorithm.


Controlling robot by image-based visual servoing with stereo cameras

  • Fan, Jun-Min;Won, Sang-Chul
    • 한국정보기술응용학회:학술대회논문집
    • /
    • 한국정보기술응용학회 2005년도 6th 2005 International Conference on Computers, Communications and System
    • /
    • pp.229-232
    • /
    • 2005
  • In this paper, an image-based "approach-align-grasp" visual servo control design is proposed for the object-grasping problem, based on a stand-alone binocular system. The basic idea is to treat the vision system as a task-specific sensor included in a servo control loop, and to perform automatic grasping by the classical approach of splitting the task into preparation and execution stages. Once the image-based control model is established, the execution stage can be carried out automatically. The proposed visual servoing scheme ensures convergence of the image features to the desired trajectories by using the Jacobian matrix, as proved by Lyapunov stability theory. The importance of projective-invariant object/gripper alignment is also stressed: the alignment between two solids in 3-D projective space can be represented in a view-invariant way and mapped directly into an image set-point without any knowledge of the camera parameters. The main feature of this method is that the accuracy of the task is not affected by discrepancies between the Euclidean setups at the preparation and execution stages. The set-point is computed from the projective alignment, and the robot gripper moves to the desired position under the image-based control law; a constant Jacobian is adopted online. The method integrates vision, robotics, and automatic control, overcoming the disadvantages of discrepancies between different Euclidean setups and providing a control law for the binocular stand-alone case. Simulation experiments show that the image-based approach is effective in performing precise alignment between the robot end-effector and the object.
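
The constant-Jacobian image-based law can be sketched as follows (a generic IBVS sketch, not the paper's binocular formulation; the Jacobian, gain, and feature values are arbitrary). When the feature kinematics are ds = J v with the same constant J, the image error contracts geometrically, which is the essence of the Lyapunov argument with V = |e|².

```python
import numpy as np

def ibvs_step(s, s_star, J, lam=0.5):
    """Image-based law with a constant Jacobian: v = -lam * J^+ (s - s*)."""
    return -lam * np.linalg.pinv(J) @ (s - s_star)

# Toy closed loop: features evolve as ds = J v with the same constant J,
# so the image error shrinks by the factor (1 - lam) at every step.
J = np.array([[120.0, 0.0], [0.0, 150.0]])   # hypothetical pixel gains
s_star = np.array([320.0, 240.0])            # image set-point (pixels)
s = np.array([400.0, 180.0])                 # current feature position
errs = []
for _ in range(10):
    s = s + J @ ibvs_step(s, s_star, J)      # unit integration step
    errs.append(np.linalg.norm(s - s_star))
```

In practice the true interaction matrix varies with the pose, so a constant Jacobian only guarantees local convergence, but it avoids online depth estimation entirely.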


무인 시스템의 자율 주행을 위한 영상기반 항법기술 동향 (Survey on Visual Navigation Technology for Unmanned Systems)

  • 김현진;서호성;김표진;이충근
    • 한국항행학회논문지
    • /
    • Vol. 19, No. 2
    • /
    • pp.133-139
    • /
    • 2015
  • This paper surveys autonomous navigation technology for unmanned systems based on visual information. Vision-based navigation techniques include visual servoing, visual odometry, and vision-based SLAM (simultaneous localization and mapping). Visual servoing computes the desired velocity input from the feature differences between a target image and the current image, and is used to drive an unmanned robot to a target pose. Visual odometry is a technique by which an unmanned system estimates its own trajectory from visual information, improving accuracy over conventional dead reckoning. Vision-based SLAM is a technique by which an unmanned system uses visual information to build a map of an unknown environment while simultaneously determining its own position within it, and it is essential for operating unmanned vehicles or aircraft in environments that are not precisely known. Trends in vision-based navigation were identified by reviewing overseas research in which these techniques have been applied.