• Title/Abstract/Keywords: robot's visual system

Search results: 74 (processing time: 0.034 s)

로봇의 시각시스템을 위한 물체의 거리 및 크기측정 알고리즘 개발 (Development of an Algorithm for Measuring the Distance and Size of an Object for a Robot's Visual System)

  • 김회인;김갑순
    • 제어로봇시스템학회논문지 / Vol. 17, No. 2 / pp. 88-92 / 2011
  • This paper describes the development of a visual system for robots and of an image processing algorithm that measures the size of an object and the distance from the robot to the object. Robots are usually fitted with a single-camera visual system for these measurements, but such a system cannot measure size and distance accurately when its own location changes or when the object is not on the ground. In this paper, we therefore developed a robot visual system that measures the size of an object and the distance to it using two cameras and a two-degree-of-freedom robot mechanism, developed the corresponding image processing algorithm, and carried out characterization tests of the system. The results indicate that the developed system can accurately measure the size of an object and the distance to it.
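
As a rough illustration of the two-camera geometry described in the abstract above (not the authors' actual algorithm), the following sketch applies the standard rectified-stereo relations Z = f·B/d and W = w·Z/f. The function name, focal length, baseline, and pixel values are hypothetical placeholders.

```python
import numpy as np

def stereo_distance_and_width(x_left, x_right, width_px, focal_px, baseline_m):
    """Estimate depth and physical width of an object from a rectified stereo pair.

    x_left, x_right : horizontal pixel coordinates of the object centre in each image
    width_px        : apparent width of the object in pixels
    focal_px        : focal length expressed in pixels
    baseline_m      : distance between the two camera centres in metres
    """
    disparity = x_left - x_right                   # pixels; assumed > 0 for a valid match
    if disparity <= 0:
        raise ValueError("object must produce positive disparity")
    depth = focal_px * baseline_m / disparity      # pinhole model: Z = f * B / d
    width = width_px * depth / focal_px            # back-project the pixel width to metres
    return depth, width

# Example: f = 800 px, baseline = 0.12 m, 40 px disparity, object spans 100 px
print(stereo_distance_and_width(420.0, 380.0, 100.0, 800.0, 0.12))   # ~ (2.4 m, 0.3 m)
```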

이동 물체 포착을 위한 비젼 서보 제어 시스템 개발 (Development of Visual Servo Control System for the Tracking and Grabbing of Moving Object)

  • 최규종;조월상;안두성
    • 동력기계공학회지 / Vol. 6, No. 1 / pp. 96-101 / 2002
  • In this paper, we address the problem of controlling an end-effector to track and grab a moving target using visual servoing. A visual servo mechanism based on the image-based servoing principle is proposed, using visual feedback to control the end-effector without calibrated robot and camera models. First, we formulate the control problem as a nonlinear least-squares optimization and update the joint angles through a Taylor series expansion. To track a moving target in real time, a Jacobian estimation scheme (dynamic Broyden's method) is used to estimate the combined robot and image Jacobian. Using this algorithm, the objective function value can be driven into a neighborhood of zero. To show the effectiveness of the proposed algorithm, simulation results for a six-degree-of-freedom robot are presented.

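The entry above combines Gauss-Newton style joint updates with a dynamic Broyden estimate of the combined robot-image Jacobian. The sketch below is a minimal, self-contained illustration of those two ingredients on a toy 2-D problem; the function names, the gain, and the linear stand-in map `g` are assumptions, not the paper's implementation.

```python
import numpy as np

def broyden_update(J, dq, dy):
    """Dynamic Broyden rank-one update of the combined robot-image Jacobian estimate.

    J  : current Jacobian estimate (m x n), image-feature change per joint change
    dq : last joint increment (n,)
    dy : observed image-feature increment (m,)
    """
    dq, dy = dq.reshape(-1, 1), dy.reshape(-1, 1)
    denom = float(dq.T @ dq)
    if denom < 1e-12:                      # no motion, nothing to learn from
        return J
    return J + (dy - J @ dq) @ dq.T / denom

def servo_step(J, error, gain=0.5):
    """Damped Gauss-Newton joint correction that drives the image error toward zero."""
    return -gain * np.linalg.pinv(J) @ error

# Toy loop: y = g(q) is an unknown camera/robot map; only its outputs are observed.
g = lambda q: np.array([2.0 * q[0] + 0.3 * q[1], -0.5 * q[0] + 1.5 * q[1]])
y_target = np.array([1.0, 0.5])
q, J = np.zeros(2), np.eye(2)              # crude initial joint state and Jacobian guess
y = g(q)
for _ in range(30):
    dq = servo_step(J, y - y_target)
    y_new = g(q + dq)
    J = broyden_update(J, dq, y_new - y)
    q, y = q + dq, y_new
print(q, y)                                # y should end up close to y_target
```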

지면 특징점을 이용한 영상 주행기록계에 관한 연구 (A Study on the Visual Odometer using Ground Feature Point)

  • 이윤섭;노경곤;김진걸
    • 한국정밀공학회지 / Vol. 28, No. 3 / pp. 330-338 / 2011
  • Odometry is a critical factor in estimating the location of a robot. In a wheeled mobile robot, odometry can be performed using information from the encoders, but the resulting location estimate is inaccurate because of errors caused by wheel misalignment or slip. In general, a visual odometer has been used to compensate for such kinematic errors. However, applying visual odometry to a given robot system requires a kinematic analysis for error compensation, which means conventional visual odometry cannot easily be transferred to other types of robot systems. In this paper, a novel visual odometry method is proposed that uses only a single camera facing the ground, mounted at the center of the bottom of the mobile robot. Feature points of the ground image are extracted using a median filter and a color contrast filter. The linear and angular motion vectors of the mobile robot are then calculated by matching feature points, and visual odometry is performed using these vectors. The proposed method is verified through driving experiments comparing encoder-based odometry with the new visual odometry.
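
The abstract above computes the robot's linear and angular motion from matched ground feature points. A common way to do this, shown in the sketch below, is a least-squares 2-D rigid registration (SVD/Kabsch style) between the matched point sets; the paper's own feature extraction and filtering steps are omitted, and all names and values are illustrative.

```python
import numpy as np

def rigid_transform_2d(prev_pts, curr_pts):
    """Estimate the planar rotation and translation that map matched ground feature
    points in the previous frame onto the current frame (least squares, SVD based).

    prev_pts, curr_pts : (N, 2) arrays of matched feature coordinates (same order).
    Returns (theta, t) with the rotation angle in radians and the translation vector.
    """
    mu_p, mu_c = prev_pts.mean(axis=0), curr_pts.mean(axis=0)
    P, C = prev_pts - mu_p, curr_pts - mu_c
    U, _, Vt = np.linalg.svd(P.T @ C)          # cross-covariance of the centred sets
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                   # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    theta = np.arctan2(R[1, 0], R[0, 0])
    t = mu_c - R @ mu_p
    return theta, t

# Synthetic check: rotate matched points by 5 degrees and shift by (0.02, 0.01)
rng = np.random.default_rng(0)
prev = rng.uniform(-1, 1, size=(20, 2))
ang = np.deg2rad(5.0)
R_true = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
curr = prev @ R_true.T + np.array([0.02, 0.01])
print(rigid_transform_2d(prev, curr))          # ~ (0.087 rad, [0.02, 0.01])
```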

Human-Robot Interaction in Real Environments by Audio-Visual Integration

  • Kim, Hyun-Don;Choi, Jong-Suk;Kim, Mun-Sang
    • International Journal of Control, Automation, and Systems / Vol. 5, No. 1 / pp. 61-69 / 2007
  • In this paper, we developed both a reliable sound localization system, including a VAD (Voice Activity Detection) component, using three microphones and a face tracking system using a vision camera. Moreover, we propose a way to integrate these systems for human-robot interaction, so as to compensate for errors in localizing the speaker and to effectively reject unwanted speech or noise signals arriving from undesired directions. To verify the system's performance, we installed the proposed audio-visual system on a prototype robot called IROBAA (Intelligent ROBot for Active Audition) and demonstrated how the audio-visual components are integrated.
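
The entry above localizes a speaker with three microphones plus VAD. The sketch below illustrates only the basic far-field geometry for one microphone pair: estimate the inter-channel delay by cross-correlation and convert it to an azimuth. It is not the paper's three-microphone algorithm; the spacing, sampling rate, and signals are synthetic placeholders.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s at room temperature

def tdoa_azimuth(sig_a, sig_b, mic_spacing_m, fs):
    """Estimate the azimuth of a sound source from the time delay between two
    microphones, using plain cross-correlation (far-field assumption).

    sig_a, sig_b  : equal-length sample arrays from the two microphones
    mic_spacing_m : distance between the microphones in metres
    fs            : sampling rate in Hz
    """
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)        # delay in samples (signed)
    tau = lag / fs                                  # delay in seconds
    s = np.clip(SPEED_OF_SOUND * tau / mic_spacing_m, -1.0, 1.0)
    return np.degrees(np.arcsin(s))

# Synthetic check: delay a noise burst by 5 samples between the two channels
rng = np.random.default_rng(1)
fs, d = 16000, 0.2
src = rng.standard_normal(1024)
a = np.concatenate([src, np.zeros(5)])
b = np.concatenate([np.zeros(5), src])              # mic B hears the source 5 samples later
print(tdoa_azimuth(a, b, d, fs))    # magnitude ~32 deg; the sign encodes which mic heard it first
```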

조립용 로봇의 3차원 그래픽 시뮬레이터 개발 (Development of a 3D Graphic Simulator for an Assembly Robot)

  • 장영희
    • 한국공작기계학회:학술대회논문집 / 한국공작기계학회 1998년도 춘계학술대회 논문집 (1998 Spring Conference Proceedings) / pp. 227-232 / 1998
  • We developed an off-line graphic simulator for Windows 95 that can simulate a robot model in a 3D graphics space. A four-axis SCARA robot was adopted as the target model. Forward kinematics, inverse kinematics, and robot dynamics modeling were included in the developed program. The interface between users and the off-line program system in the Windows 95 GUI environment was also studied. The development language was Microsoft Visual C++, and the OpenGL graphics library by Silicon Graphics, Inc. was used for 3D graphics.

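The simulator above includes forward kinematics for a four-axis SCARA arm. For reference, a minimal forward-kinematics sketch for the usual SCARA joint arrangement (revolute, revolute, prismatic, revolute) is given below; the link lengths and sign conventions are placeholder assumptions, not tied to any particular robot.

```python
import numpy as np

def scara_forward_kinematics(theta1, theta2, d3, theta4, l1=0.25, l2=0.20):
    """Forward kinematics of a 4-axis SCARA arm (rotate-rotate-prismatic-rotate).

    theta1, theta2, theta4 : joint angles in radians
    d3                     : travel of the prismatic joint in metres
    l1, l2                 : link lengths in metres (placeholder values)
    Returns (x, y, z, phi): tool position plus tool orientation about the Z axis.
    """
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    z = -d3                               # assumed convention: tool moves down as the joint extends
    phi = theta1 + theta2 + theta4        # all revolute axes are parallel to Z
    return x, y, z, phi

print(scara_forward_kinematics(np.deg2rad(30), np.deg2rad(45), 0.05, 0.0))
```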

신경 회로망을 이용한 로보트의 동력학적 시각 서보 제어 (Dynamic Visual Servo Control of Robot Manipulators Using Neural Networks)

  • 박재석;오세영
    • 전자공학회논문지B / Vol. 29B, No. 10 / pp. 37-45 / 1992
  • For precise manipulator control in the presence of environmental uncertainties, it has long been recognized that the robot should be controlled in a task-referenced space. In this respect, an effective visual servo control system for robot manipulators based on neural networks is proposed. In the proposed control system, a backpropagation neural network is first used to learn the mapping between the robot's joint space and the video image space. In the actual control loop, however, this network is not used directly; rather, its first and second derivatives are used to generate servo commands for the robot. Second, an Adaline neural network is used to identify the approximately linear dynamics of the robot and to generate the appropriate joint torque commands. Computer simulations demonstrate the superior performance of the proposed method. Furthermore, the proposed scheme can be effectively utilized in a robot skill acquisition system in which the robot is taught by watching a human perform a task.

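The paper above trains a backpropagation network to map joint angles to image features and then uses the network's derivatives to generate servo commands. The sketch below mimics that structure with a fixed smooth stand-in for the trained network and a finite-difference Jacobian; it illustrates the idea only and is not the authors' network or controller. All names, gains, and targets are assumptions.

```python
import numpy as np

def numerical_jacobian(f, q, eps=1e-5):
    """Finite-difference Jacobian of a learned joint-to-image mapping f at q."""
    y0 = f(q)
    J = np.zeros((y0.size, q.size))
    for j in range(q.size):
        dq = np.zeros_like(q)
        dq[j] = eps
        J[:, j] = (f(q + dq) - y0) / eps
    return J

# Stand-in for the trained backpropagation network; in the paper the network's own
# derivatives would be used, but a fixed smooth map keeps the sketch self-contained.
learned_map = lambda q: np.array([np.cos(q[0]) + 0.5 * np.cos(q[0] + q[1]),
                                  np.sin(q[0]) + 0.5 * np.sin(q[0] + q[1])])

def visual_servo_step(q, y_target, gain=0.3):
    """One task-space correction: push the image features toward their targets."""
    J = numerical_jacobian(learned_map, q)
    error = learned_map(q) - y_target
    return q - gain * np.linalg.pinv(J) @ error

q = np.array([0.2, 0.3])
y_target = np.array([0.9, 0.9])
for _ in range(50):
    q = visual_servo_step(q, y_target)
print(q, learned_map(q))       # the image features should be close to y_target
```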

Vision Navigation System by Autonomous Mobile Robot

  • Shin, S.Y.;Lee, J.H.;Kang, H.
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2001년도 ICCAS (ICCAS 2001) / pp. 146.3-146 / 2001
  • Vision has been integrated into several navigation systems. This paper shows that the proposed system can recognize difficult indoor roads and open areas without any specific markings such as painted guide lines or tape. In this method, the robot navigates with visual sensors, using visual information to guide itself along the road. An artificial neural network is used to decide where to move, and the system is implemented with a USB web camera as the visual sensor.

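The system above lets an artificial neural network decide where to move from camera input. The sketch below shows only the shape of such a decision step: a small feedforward pass from a downsampled grayscale frame to one of three steering commands. The weights here are random placeholders standing in for a trained network, and the layer sizes and action set are assumptions rather than the paper's design.

```python
import numpy as np

ACTIONS = ("turn_left", "go_straight", "turn_right")

def steering_decision(image_gray, W1, b1, W2, b2):
    """One forward pass of a small feedforward network that maps a downsampled
    grayscale camera frame to a steering command (weights assumed already trained)."""
    x = image_gray.astype(np.float64).ravel() / 255.0   # flatten and normalise the frame
    h = np.tanh(W1 @ x + b1)                            # hidden layer
    scores = W2 @ h + b2                                # one score per action
    return ACTIONS[int(np.argmax(scores))]

# Shapes only: a 32x24 input frame, 16 hidden units, 3 output actions.
rng = np.random.default_rng(2)
W1, b1 = rng.standard_normal((16, 32 * 24)) * 0.01, np.zeros(16)
W2, b2 = rng.standard_normal((3, 16)) * 0.01, np.zeros(3)
frame = rng.integers(0, 256, size=(24, 32), dtype=np.uint8)
print(steering_decision(frame, W1, b1, W2, b2))
```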

비행로봇의 항공 영상 온라인 학습을 통한 지상로봇 검출 및 추적 (UGR Detection and Tracking in Aerial Images from UFR for Remote Control)

  • 김승훈;정일균
    • 로봇학회논문지 / Vol. 10, No. 2 / pp. 104-111 / 2015
  • In this paper, we propose a visual information system that gives a tele-operator high maneuverability. The visual information is a bird's-eye-view image from a UFR (Unmanned Flying Robot) showing the area around a UGR (Unmanned Ground Robot). A UGR detection and tracking method is required so that the UFR can follow the UGR at all times. The proposed system uses the TLD (Tracking-Learning-Detection) method to rapidly and robustly estimate the motion of the newly detected UGR between consecutive frames, and the TLD system trains an on-line UGR detector for the tracked target. An extended Kalman filter is used to enhance the performance of the tracker. As a result, the tele-operator is provided with visual information that makes remote control convenient.
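
The tracker above refines TLD's output with an extended Kalman filter. With a linear constant-velocity motion model and a position-only measurement, the filter reduces to the standard linear Kalman recursion sketched below; the frame rate, noise covariances, and measurements are placeholders, and the TLD detector itself is not reproduced.

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Predict step for a constant-velocity target model."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Correct the prediction with the tracker's measured image position z."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 1.0 / 30.0                                  # assumed frame period
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)       # state: [u, v, du, dv] in pixels
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)        # the tracker measures only position
Q = np.eye(4) * 1e-2                             # placeholder process noise
R = np.eye(2) * 4.0                              # placeholder measurement noise
x, P = np.zeros(4), np.eye(4) * 100.0

for frame_idx in range(5):
    x, P = kf_predict(x, P, F, Q)
    z = np.array([320.0 + 2.0 * frame_idx, 240.0])   # stand-in tracker output
    x, P = kf_update(x, P, z, H, R)
print(x)                                         # estimated position and pixel velocity
```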

오프라인 프로그래밍을 이용한 스카라 로봇의 통합제어시스템 설계 (Integrated Control System Design of a SCARA Robot Based on Off-Line Programming)

  • 한성현;정동연
    • 한국공작기계학회논문집 / Vol. 11, No. 3 / pp. 21-27 / 2002
  • In this paper, we developed a Windows 98 off-line programming (OLP) system that can simulate a robot model in a 3D graphics space. A SCARA robot with four joints (FARA SM5) was adopted as the target model. Forward kinematics, inverse kinematics, and robot dynamics modeling were included in the developed program. The interface between users and the OLP system in the Windows 98 GUI environment was also studied. The development language was Microsoft Visual C++, and the OpenGL graphics library by Silicon Graphics, Inc. was used for 3D graphics.
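
The OLP system above includes inverse kinematics for a four-joint SCARA. A minimal closed-form sketch of that computation is shown below, mirroring the conventions of the forward-kinematics sketch earlier in this list; the link lengths are placeholder values, not the FARA SM5's actual parameters.

```python
import numpy as np

def scara_inverse_kinematics(x, y, z, phi, l1=0.25, l2=0.20, elbow=+1):
    """Closed-form inverse kinematics for a 4-axis SCARA arm.

    (x, y, z, phi) : desired tool position and orientation about the vertical axis
    l1, l2         : link lengths in metres (placeholder values)
    elbow          : +1 or -1 selects the elbow-right / elbow-left solution
    Returns (theta1, theta2, d3, theta4) in radians / metres.
    """
    r2 = x * x + y * y
    c2 = (r2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)          # law of cosines for the elbow
    if abs(c2) > 1.0:
        raise ValueError("target is outside the workspace")
    theta2 = elbow * np.arccos(c2)
    theta1 = np.arctan2(y, x) - np.arctan2(l2 * np.sin(theta2), l1 + l2 * np.cos(theta2))
    d3 = -z                               # mirrors the assumed forward-kinematics convention
    theta4 = phi - theta1 - theta2
    return theta1, theta2, d3, theta4

# Example: joint solution for a reachable pose (placeholder link lengths)
print(scara_inverse_kinematics(0.3, 0.2, -0.05, np.deg2rad(75)))
```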

스카라 로봇의 3차원 그래픽 시뮬레이션 툴 개발 (Development of a 3D graphic simulation tool for SCARA robot)

  • 이대영;최재원;이민철
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 1997년도 한국자동제어학술회의논문집 (1997 Korea Automatic Control Conference Proceedings); 한국전력공사 서울연수원; 17-18 Oct. 1997 / pp. 724-727 / 1997
  • In this paper, we developed a Windows 95 off-line programming system that can simulate a robot model in a 3D graphics space. A four-axis SCARA robot (the FARA SM5) was adopted as the target model. Forward kinematics, inverse kinematics, and robot dynamics modeling were included in the developed program. The interface between users and the OLP system in the Windows 95 GUI environment was also studied. The development language was Microsoft Visual C++, and the OpenGL graphics library by Silicon Graphics, Inc. was used for 3D graphics.

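The simulator above animates a SCARA model in 3D graphics. The OpenGL rendering itself is beyond a short sketch, but the kind of joint-space interpolation such a simulator might use to move the model smoothly between two poses is illustrated below; the cubic profile, timing, and joint values are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def cubic_joint_trajectory(q_start, q_goal, duration, steps):
    """Cubic (zero start/end velocity) joint-space trajectory, the kind of smooth
    interpolation an off-line programming simulator can use to animate a robot model.

    q_start, q_goal : joint vectors (radians / metres) at the two ends
    duration        : motion time in seconds
    steps           : number of samples to generate
    Yields (t, q_t) pairs.
    """
    q_start, q_goal = np.asarray(q_start, float), np.asarray(q_goal, float)
    for t in np.linspace(0.0, duration, steps):
        s = 3.0 * (t / duration) ** 2 - 2.0 * (t / duration) ** 3   # smooth-step in [0, 1]
        yield t, (1.0 - s) * q_start + s * q_goal

# Four SCARA joints: two revolute angles, one prismatic travel, one wrist angle
for t, q in cubic_joint_trajectory([0, 0, 0, 0], [1.0, 0.5, 0.05, 1.57], 2.0, 5):
    print(round(t, 2), np.round(q, 3))
```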