• Title/Summary/Keyword: 비젼센서 (vision sensor)

Search Results: 71

A Study on the Real-Time Vision Control Method for Manipulator's Position Control in the Uncertain Circumstance (불확실한 환경에서 매니퓰레이터 위치제어를 위한 실시간 비젼제어기법에 관한 연구)

  • Jang, W.-S.;Kim, K.-S.;Shin, K.-S.;Joo, C.;Yoon, H.-K.
    • Journal of the Korean Society for Precision Engineering / v.16 no.12 / pp.87-98 / 1999
  • This study concentrates on the development of a real-time estimation model and vision control method, together with experimental tests. The proposed method permits a kind of adaptability not otherwise available, in that the relationship between the camera-space location of manipulable visual cues and the vector of manipulator joint coordinates is estimated in real time. This is done with an estimation model that generalizes known manipulator kinematics to accommodate unknown relative camera position and orientation as well as manipulator uncertainty. The vision control method is robust and reliable, and overcomes the difficulties of conventional research such as precise calibration of the vision sensor, exact kinematic modeling of the manipulator, and correct knowledge of the position and orientation of the CCD camera with respect to the manipulator base. Finally, evidence of the ability of the real-time vision control method for manipulator position control is provided by performing thin-rod placement in space with a two-cue test model, completed without prior knowledge of camera or manipulator positions. This feature opens the door to a range of manipulation applications, including a mobile manipulator with stationary cameras tracking and providing information for control of the manipulator. (An illustrative sketch of this kind of camera-space parameter estimation follows this entry.)

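The entry above estimates, in real time, the mapping between the manipulator's nominal kinematics and the camera-space location of visual cues. A minimal sketch of that idea is given below, assuming a simple affine camera model fitted by least squares; the function name, the toy numbers, and the model itself are illustrative assumptions, not the paper's estimation model.

```python
# Hedged sketch: fit a camera-space mapping [u, v]^T ~= A @ [X, Y, Z]^T + b from
# cue positions (given by nominal forward kinematics) and their observed image
# locations.  The paper's model is more general; this only shows the fitting idea.
import numpy as np

def fit_affine_camera(cue_xyz, cue_uv):
    """Least-squares fit of an affine camera model from N >= 4 samples.

    cue_xyz : (N, 3) cue positions from nominal forward kinematics.
    cue_uv  : (N, 2) observed camera-space (pixel) locations of the cues.
    Returns A (2x3) and b (2,) such that uv ~= A @ xyz + b.
    """
    X = np.hstack([cue_xyz, np.ones((len(cue_xyz), 1))])  # (N, 4) design matrix
    theta, *_ = np.linalg.lstsq(X, cue_uv, rcond=None)    # (4, 2) parameters
    return theta[:3].T, theta[3]

# Toy usage: samples gathered while the manipulator moves through its workspace.
rng = np.random.default_rng(0)
xyz = rng.uniform(-0.5, 0.5, size=(20, 3))                # simulated cue positions
A_true = np.array([[800.0, 0.0, 120.0], [0.0, 780.0, -60.0]])
b_true = np.array([320.0, 240.0])
uv = xyz @ A_true.T + b_true + rng.normal(0.0, 0.5, size=(20, 2))
A_est, b_est = fit_affine_camera(xyz, uv)
print(np.round(A_est, 1), np.round(b_est, 1))             # close to A_true, b_true
```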

A Study on the Determination of 3-D Object's Position Based on Computer Vision Method (컴퓨터 비젼 방법을 이용한 3차원 물체 위치 결정에 관한 연구)

  • 김경석
    • Journal of the Korean Society of Manufacturing Technology Engineers / v.8 no.6 / pp.26-34 / 1999
  • This study presents an alternative method for determining an object's position, based on a computer vision method. The approach develops a vision system model that defines the reciprocal relationship between 3-D real space and the 2-D image plane. The developed model involves bilinear six-view parameters, which are estimated using the relationship between camera-space locations and the real coordinates of known positions. Based on the parameters estimated independently for each camera, the position of an unknown object is determined using a sequential estimation scheme that combines data of the unknown points from each camera's 2-D image plane. This vision control method is robust and reliable, and overcomes the difficulties of conventional research such as precise calibration of the vision sensor, exact kinematic modeling of the robot, and correct knowledge of the relative position and orientation of the robot and the CCD camera. Finally, the developed vision control method is tested experimentally by determining object positions in space using the computer vision system. The results show that the presented method is precise and compatible. (A hedged sketch of the estimate-then-locate idea follows this entry.)

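Building on the same kind of per-camera parameter fit as in the previous sketch, the snippet below shows how an unknown point's 3-D position could be recovered from its image locations in two cameras by stacking the per-camera linear equations. The affine camera model, the `locate_point` helper and the toy numbers are assumptions for illustration, not the paper's bilinear six-view formulation or its sequential estimator.

```python
# Hedged sketch: two-stage idea of the entry above.  Stage 1 (not shown) fits each
# camera's parameters from known points; stage 2 recovers an unknown point's 3-D
# position from its 2-D observations in two (or more) cameras by least squares.
import numpy as np

def locate_point(cameras, observations):
    """cameras: list of (A, b) pairs, A (2x3) and b (2,), one per camera.
    observations: list of (2,) image coordinates of the same unknown point.
    Returns the least-squares 3-D position of that point."""
    M = np.vstack([A for A, _ in cameras])                          # (2K, 3)
    r = np.concatenate([uv - b for (_, b), uv in zip(cameras, observations)])
    X, *_ = np.linalg.lstsq(M, r, rcond=None)
    return X

# Toy usage with two synthetic affine cameras observing the same point.
A1, b1 = np.array([[800.0, 0.0, 100.0], [0.0, 800.0, -50.0]]), np.array([320.0, 240.0])
A2, b2 = np.array([[0.0, 760.0, 90.0], [760.0, 0.0, 40.0]]), np.array([310.0, 250.0])
X_true = np.array([0.10, -0.05, 0.30])
obs = [A1 @ X_true + b1, A2 @ X_true + b2]                          # noise-free images
print(np.round(locate_point([(A1, b1), (A2, b2)], obs), 3))         # ~= X_true
```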

Development of a Lane Keeping Assist System using Vision Sensor and DRPG Algorithm (비젼센서와 DRPG알고리즘을 이용한 차선 유지 보조 시스템 개발)

  • Hwang, Jun-Yeon;Huh, Kun-Soo;Na, Hyuk-Min;Jung, Ho-Gi;Kang, Hyung-Jin;Yoon, Pal-Joo
    • Transactions of the Korean Society of Automotive Engineers / v.17 no.1 / pp.50-57 / 2009
  • Lane Keeping Assist Systems (LKAS) require cooperative operation between drivers and active steering angle/torque controllers. An LKAS is proposed in this study in which the desired reference path generation (DRPG) system generates the desired path so as to minimize trajectory overshoot. Based on the reference path from the DRPG system, an optimal controller is designed to minimize a cost function. An HIL (hardware-in-the-loop) simulator is constructed to evaluate the proposed LKAS. A single camera is mounted on the simulator and acquires the monitor images to detect lane markers. The performance of the proposed system is evaluated on the HIL system using CarSim and MATLAB Simulink. (A minimal LQR-style sketch of the cost-minimizing steering step follows this entry.)
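
The abstract mentions an optimal controller that minimizes a cost function over deviations from the DRPG reference path. The sketch below computes a discrete LQR gain for a toy lateral-error model; the state choice, speed, wheelbase and weights are assumed values, and the paper's actual vehicle model, reference path and cost function are not reproduced.

```python
# Hedged sketch: LQR gain for a toy lateral-error model (lateral offset and
# heading error relative to the reference path), as one way to realize a
# cost-minimizing steering controller.
import numpy as np

v, L, dt = 20.0, 2.7, 0.02             # speed [m/s], wheelbase [m], step [s] (assumed)
A = np.array([[1.0, v * dt],           # lateral error integrates heading error
              [0.0, 1.0]])
B = np.array([[0.0],
              [v * dt / L]])           # steering input changes heading error
Q = np.diag([1.0, 0.5])                # penalty on path-following errors
R = np.array([[10.0]])                 # penalty on steering effort

def dlqr(A, B, Q, R, iters=500):
    """Iterate the discrete Riccati recursion until the gain settles."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

K = dlqr(A, B, Q, R)
e, psi = 0.3, 0.05                                  # current errors w.r.t. the DRPG path
delta = float((-K @ np.array([e, psi]))[0])         # steering command [rad]
print(np.round(K, 3), round(delta, 4))
```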

Control of Mobile Robot Navigation Using Vision Sensor Data Fusion by Nonlinear Transformation (비선형 변환의 비젼센서 데이터융합을 이용한 이동로봇 주행제어)

  • Jin Tae-Seok;Lee Jang-Myung
    • Journal of Institute of Control, Robotics and Systems / v.11 no.4 / pp.304-313 / 2005
  • The robots that will be needed in the near future are human-friendly robots able to coexist with humans and support them effectively. To realize this, a robot needs to recognize its position and orientation for intelligent performance in an unknown environment, and mobile robots may navigate by means of a number of monitoring systems such as sonar-sensing or visual-sensing systems. In conventional fusion schemes the measurement depends on the current data sets only, so more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this research, instead of adding more sensors to the system, the temporal sequence of the data sets is stored and utilized for accurate measurement. As a general approach to sensor fusion, a UT-based sensor fusion (UTSF) scheme using the Unscented Transformation (UT) is proposed for either joint or disjoint data structures and applied to landmark identification for mobile robot navigation. The theoretical basis is illustrated by examples and the effectiveness is proved through simulations and experiments. The newly proposed UTSF scheme is applied to the navigation of a mobile robot in both structured and unstructured environments, and its performance is verified by computer simulation and experiment. (A short sketch of the Unscented Transformation itself follows this entry.)
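
Since the scheme is built around the Unscented Transformation, the sketch below shows the UT itself: sigma points of a Gaussian are propagated through a nonlinear sensing function, and the transformed mean and covariance are recovered from weighted sums. The range/bearing function, the kappa value and the toy numbers are illustrative assumptions, not the paper's UTSF architecture.

```python
# Hedged sketch of the Unscented Transformation underlying the UTSF scheme.
import numpy as np

def unscented_transform(mean, cov, f, kappa=1.0):
    """Propagate a Gaussian (mean, cov) through f using sigma points."""
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)                 # matrix square root
    sigmas = [mean] + [mean + S[:, i] for i in range(n)] \
                    + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    ys = np.array([f(s) for s in sigmas])                     # transformed sigma points
    y_mean = w @ ys
    y_cov = sum(wi * np.outer(y - y_mean, y - y_mean) for wi, y in zip(w, ys))
    return y_mean, y_cov

# Toy usage: position uncertainty mapped through a range/bearing observation of a
# landmark at the origin (an illustrative sensing function, not the paper's model).
def range_bearing(p):
    return np.array([np.hypot(p[0], p[1]), np.arctan2(p[1], p[0])])

mu = np.array([2.0, 1.0])                                     # estimated robot position [m]
P = np.diag([0.05, 0.02])                                     # its covariance
z_mean, z_cov = unscented_transform(mu, P, range_bearing)
print(np.round(z_mean, 3))
print(np.round(z_cov, 4))
```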

Development of a Sensor System for Real-Time Posture Measurement of Mobile Robots (이동 로봇의 실시간 자세 추정을 위한 센서 시스템의 개발)

  • 이상룡;권승만
    • Transactions of the Korean Society of Mechanical Engineers / v.17 no.9 / pp.2191-2204 / 1993
  • A sensor system has been developed to measure the posture (position and orientation) of mobile robots working in industrial environments. The proposed sensor system consists of a CCD camera, retro-reflective landmarks, a strobe unit and an image processing board. The proposed hardware can be built at an economical price compared to commercial vision systems. The system is capable of measuring the posture of mobile robots within 60 msec when a 386 personal computer is used as the host computer. The experimental results demonstrated a remarkable performance of the proposed sensor system in posture measurement: the average error in position is less than 3 mm and the average error in orientation is less than 1.5 degrees. (A small geometric sketch of landmark-based posture recovery follows this entry.)
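
As one way to picture landmark-based posture measurement, the sketch below recovers a planar pose (x, y, heading) from two landmarks whose world coordinates are known and whose positions have been measured in the robot frame. The helper name and geometry are assumptions for illustration, not the paper's image-processing pipeline.

```python
# Hedged sketch: planar posture from two landmarks seen in the robot frame.
import numpy as np

def posture_from_two_landmarks(w1, w2, r1, r2):
    """w1, w2: landmark positions in the world frame; r1, r2: the same landmarks
    as measured in the robot (camera) frame.  Returns the robot's (x, y, theta)."""
    dw, dr = w2 - w1, r2 - r1
    theta = np.arctan2(dw[1], dw[0]) - np.arctan2(dr[1], dr[0])
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])                 # world-from-robot rotation
    p = w1 - R @ r1                                 # robot position in the world
    return float(p[0]), float(p[1]), float(theta)

# Toy usage: simulate measurements from a known true posture and recover it.
w1, w2 = np.array([0.0, 0.0]), np.array([4.0, 0.0])
x, y, th = 1.0, 2.0, np.deg2rad(30.0)
Rt = np.array([[np.cos(th), np.sin(th)], [-np.sin(th), np.cos(th)]])  # R(th)^T
r1, r2 = Rt @ (w1 - np.array([x, y])), Rt @ (w2 - np.array([x, y]))
print(posture_from_two_landmarks(w1, w2, r1, r2))   # ~ (1.0, 2.0, 0.524)
```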

Development of Robot Control and Measurement for Unknown Geometric Surface Grinding (미지형상 표면의 연삭 작업을 위한 로봇 제어ㆍ계측 시스템 개발)

  • Choe, Byeong-O;Park, Geun-U;Lee, Min-Gi;Lee, Jung-Hun
    • Transactions of the Korean Society of Mechanical Engineers A / v.24 no.4 s.175 / pp.1039-1046 / 2000
  • This paper introduces the control and measurement of a double parallel robot manipulator applied to grinding of unknown geometric surfaces. A measurement system is developed to recognize the grinding path with a vision camera and to observe the grinding load with a current sensor. With the fused measurement information, an intelligent controller identifies the unknown geometric surface and moves the robot along the grinding path with a constant grinding load. (A simple constant-load feed sketch follows this entry.)
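
To illustrate the constant-grinding-load idea, the sketch below adjusts the feed rate from a spindle-current reading with a simple velocity-form PI loop. The class name, the gains and the use of current as a load proxy are assumptions; the paper's intelligent controller is not reproduced here.

```python
# Hedged sketch: keep the grinding load roughly constant by nudging the feed rate
# so the measured spindle current tracks a target value.
class ConstantLoadFeed:
    """Velocity-form PI loop on the feed rate, driven by a current measurement."""

    def __init__(self, target_current, kp=0.05, ki=0.5,
                 feed_min=0.5, feed_max=10.0, feed0=5.0):
        self.target = target_current    # desired current [A] (assumed load proxy)
        self.kp, self.ki = kp, ki       # illustrative gains, not tuned for a real machine
        self.feed_min, self.feed_max = feed_min, feed_max
        self.feed = feed0               # feed rate along the grinding path [mm/s]
        self.prev_err = 0.0

    def update(self, measured_current, dt):
        err = self.target - measured_current            # load too low -> speed up
        self.feed += self.kp * (err - self.prev_err) + self.ki * err * dt
        self.prev_err = err
        self.feed = min(max(self.feed, self.feed_min), self.feed_max)
        return self.feed

# Usage inside the robot's control loop: vision supplies the path, the current
# sensor supplies the load, and the feed rate is adapted every control step.
ctrl = ConstantLoadFeed(target_current=8.0)
for current in [6.5, 7.2, 7.9, 8.3, 8.1]:               # simulated sensor readings
    print(round(ctrl.update(current, dt=0.01), 3))
```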

Development of a Grinding Robot System for the Engine Cylinder Liner's Oil Groove (실린더 라이너 오일그루브 가공 로봇 시스템 개발)

  • Noh, Tae-Yang;Lee, Yun-Sik;Jung, Chang-Wook;Oh, Yong-Chul
    • Transactions of the Korean Society of Mechanical Engineers A / v.33 no.6 / pp.614-619 / 2009
  • An engine for marine propulsion and power generation consists of several cylinder liner-piston sets, and an oil groove is machined on the inside wall of the cylinder liner for lubrication between the piston and cylinder. The machining of the oil groove has so far been carried out by manual work because of the diversity of groove shapes. Recently, we developed an automatic grinding robot system for oil groove machining of engine cylinder liners. It can cover various types of oil grooves and adjust its position by itself. The grinding robot system consists of a robot, a machining tool head, sensors and a control system. The robot automatically recognizes the cylinder liner's inside configuration by using a laser displacement sensor and a vision sensor after the cylinder liner is placed on the set-up equipment.

A development of remote controlled mobile robot working in a hazard environment (위해환경에서 구동가능한 원격제어 이동 로봇 개발)

  • 박제용;최현석;현웅근
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2002.11a / pp.457-461 / 2002
  • This paper describes the development of a robot working in a hazardous environment. The developed robot consists of a robot controller with a vision system and a host PC program. The robot and camera can each be moved with two degrees of freedom by independent remote control through a user-friendly joystick. The environment is recognized by the vision system and ultrasonic sensors. The visual image and the command data are transmitted through 900 MHz and 447 MHz RF links, respectively. To show the validity of the developed system, operations of the robot in a field area are illustrated.


Characterization of the Vacuum Package Assembly Process for Microbolometer Sensors (마이크로볼로미터 센서용 진공패키지 조립공정 특성평가)

  • Park, Chang-Mo;Han, Myeong-Su;Sin, Gwang-Su;Go, Hang-Ju;Kim, Seon-Hun;Gi, Hyeon-Cheol;Kim, Hyo-Jin
    • Proceedings of the Korean Vacuum Society Conference / 2010.02a / pp.252-252 / 2010
  • Infrared sensors detect the faint infrared (thermal) radiation emitted by objects ahead, day or night and regardless of lighting, and the thermal imaging systems that reproduce it as an image are a cutting-edge, high-value technology applied to fields such as night vision for automotive driver assistance, surveillance of critical facilities, and military use. Quantum-type sensors offer good sensor characteristics but require a cooler (operating temperature: -196 °C) and a high-vacuum dewar package, whereas thermal-type sensors are uncooled infrared sensors that mostly operate at room temperature and need only a thermoelectric cooling module for temperature stabilization, so they can be produced at low cost. In this study, a vacuum package assembly process for infrared sensors and measurement techniques for the packaged sensors were developed. A metal package was fabricated, and the metal vacuum package was assembled with a TE cooler for device cooling, a getter for long-life vacuum maintenance, the sensor chip, and a temperature sensor. A soldering process joined the cap assembly and the base envelope; the package vacuum was maintained with a turbomolecular pump (TMP), and the package was baked out for about five days. The vacuum pressure was kept below 10^-7 torr, the getter was activated, and the assembly was completed with a pinch-off process. The hermeticity of the vacuum package was measured with a He leak tester and maintained at about 10^-9 std cm^3/sec. The temperature stability with the TE cooler operating was below 0.05 K. The responsivity of the bolometer sensor was above 10^2 V/W, and the detectivity was 2x10^8 cm·Hz^1/2/W. The results obtained through this study are expected to be usefully applied to two-dimensional thermal imaging array detectors and wafer-level packaging processes in the future.


Efficient Digitizing in Reverse Engineering By Sensor Fusion (역공학에서 센서융합에 의한 효율적인 데이터 획득)

  • Park, Young-Kun;Ko, Tae-Jo;Kim, Hrr-Sool
    • Journal of the Korean Society for Precision Engineering / v.18 no.9 / pp.61-70 / 2001
  • This paper introduces a new digitization method with sensor fusion for shape measurement in reverse engineering. Digitization can be classified into contact and non-contact types according to the measurement device; the key considerations are speed and accuracy, with the non-contact type excelling in speed and the contact type in accuracy. Sensor fusion in digitization aims to incorporate the merits of both types so that the system can be automated. First, a non-contact vision system rapidly acquires coarse 3D point data; this step is needed to identify and localize an object located at an unknown position on the table. Second, accurate 3D point data are obtained automatically with a scanning probe guided by the previously measured coarse data. In this research, a large number of equally spaced measuring points were commanded along the lines acquired by the vision system. Finally, the digitized 3D point data are approximated by a rational B-spline surface, and the free-form surface information can be transferred to a commercial CAD/CAM system via IGES translation in order to machine the modeled geometric shape. (A hedged SciPy sketch of the final surface-fitting step follows this entry.)

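For the final modelling step, the sketch below fits a plain (non-rational) bivariate B-spline surface to a grid of probed points with SciPy. The grid, smoothing factor and surface shape are made-up stand-ins; the paper fits a rational B-spline and exports it via IGES, which is not reproduced here.

```python
# Hedged sketch: approximate digitized probe points with a bivariate B-spline surface.
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Simulated probe grid: equally spaced measuring points along the scan lines that
# the vision system identified on the part (all values are made up).
x = np.linspace(0.0, 50.0, 11)                   # positions along a scan line [mm]
y = np.linspace(0.0, 30.0, 7)                    # scan-line positions [mm]
X, Y = np.meshgrid(x, y, indexing="ij")
Z = 2.0 + 0.002 * X**2 + 0.5 * np.sin(Y / 5.0)   # probed heights [mm]

# Cubic bivariate B-spline approximation of the digitized points.
surf = RectBivariateSpline(x, y, Z, kx=3, ky=3, s=0.01)
print(round(surf(25.0, 15.0)[0, 0], 4))          # evaluate the fitted surface
print(surf.get_knots()[0])                       # knot vector in the x direction
```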