• Title/Summary/Keyword: Vision Sensor


Development on Tandem GMA Welding System using Seam Tracking System in Pipe Line (용접선 추적시스템을 적용한 탄뎀 원주 용접시스템 개발)

  • Lee, JongPyo;Lee, JiHye;Park, MinHo;Park, CheolKyun;Kim, IllSoo
    • Journal of the Korean Society for Precision Engineering / v.31 no.11 / pp.1007-1013 / 2014
  • In this study, a seam tracking system using a laser vision sensor was applied to a tandem circumferential GMA welding process to improve productivity. The laser vision sensor scans the weld geometry, and a PLC control unit positions the welding torch so that it correctly tracks the welding line. Welding experiments were conducted to evaluate the performance of the laser vision seam tracking sensor in the tandem welding process. Several seam tracking experiments confirmed the reliability of the system, and the welding experiments yielded weld beads of relatively good quality. Furthermore, the PLC program for seam tracking confirmed the validity of applying it to the tandem welding process; the resulting productivity gains are expected to contribute to national competitiveness.
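The correction step in a laser vision seam tracker like the one described can be sketched as below: a laser line profile scanned across a V-groove joint is searched for the seam centre, and a lateral torch offset is returned for the controller. The function name, sampling pitch, and profile data are invented for illustration, not taken from the paper.

```python
def find_seam_offset(profile, pitch_mm, torch_x_mm):
    """Return the lateral correction (mm) that moves the torch onto the seam.

    profile   : list of surface heights (mm) sampled across the joint
    pitch_mm  : lateral spacing between profile samples (mm)
    torch_x_mm: current torch position measured from the first sample (mm)
    """
    # The seam centre of a V-groove is taken as the deepest point of the profile.
    seam_index = min(range(len(profile)), key=lambda i: profile[i])
    seam_x_mm = seam_index * pitch_mm
    # Positive correction moves the torch toward larger x.
    return seam_x_mm - torch_x_mm

# A symmetric V-groove centred 2.0 mm from the first sample, 0.5 mm pitch:
profile = [0.0, -0.5, -1.0, -1.5, -2.0, -1.5, -1.0, -0.5, 0.0]
correction = find_seam_offset(profile, pitch_mm=0.5, torch_x_mm=1.5)
```

A real tracker would filter the profile against arc glare and spatter before the search; this sketch shows only the geometric core.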

Estimation of Angular Acceleration By a Monocular Vision Sensor

  • Lim, Joonhoo;Kim, Hee Sung;Lee, Je Young;Choi, Kwang Ho;Kang, Sung Jin;Chun, Sebum;Lee, Hyung Keun
    • Journal of Positioning, Navigation, and Timing / v.3 no.1 / pp.1-10 / 2014
  • Recently, monitoring of two-body ground vehicles carrying extremely hazardous materials has been considered one of the most important national issues. Accidents involving such vehicles incur large costs in terms of the national economy and social welfare. To monitor and counteract accidents promptly, an efficient methodology is required. For accident monitoring, GPS can be utilized in most cases. However, it is widely known that GPS cannot provide sufficient continuity in urban canyons and tunnels. To complement this weakness of GPS, this paper proposes an accident monitoring method based on a monocular vision sensor. The proposed method estimates angular acceleration from a sequence of image frames captured by a monocular vision sensor. The possibility of using angular acceleration to determine the occurrence of accidents such as jackknifing and rollover is investigated. The feasibility of the proposed method is evaluated by an experiment based on actual measurements.
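One minimal way to obtain angular acceleration from per-frame heading angles (e.g. headings recovered from a monocular image sequence) is repeated finite differencing, sketched below. This is a generic numerical scheme, not the paper's estimator; the frame rate and heading values are invented.

```python
def angular_acceleration(headings_rad, dt):
    """Estimate angular acceleration (rad/s^2) from a list of heading angles.

    headings_rad: heading of the vehicle in each frame (rad)
    dt          : frame interval (s)
    Returns a list with two fewer entries than the input.
    """
    # First difference: angular velocity between consecutive frames.
    omega = [(b - a) / dt for a, b in zip(headings_rad, headings_rad[1:])]
    # Second difference: angular acceleration.
    return [(b - a) / dt for a, b in zip(omega, omega[1:])]

# Headings following theta = 0.5 * alpha * t^2 with alpha = 2 rad/s^2:
dt = 0.1
headings = [0.5 * 2.0 * (k * dt) ** 2 for k in range(5)]
alphas = angular_acceleration(headings, dt)
```

Double differencing amplifies measurement noise, which is why a practical monitor would smooth the headings first; the sketch shows only the kinematics.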

An Weldability Estimation of Laser Welded Specimens (레이저 용접물의 용접성 평가)

  • Lee, Jeong-Ick
    • Transactions of the Korean Society of Machine Tool Engineers / v.16 no.1 / pp.60-68 / 2007
  • A laser vision sensor was used to estimate the weldability of the front bead after high-speed butt laser welding under various conditions. A real-time GUI (Graphic User Interface) system was developed for weldability assessment on the basis of published standards and field quality levels. Using the bead imperfections, the absolute positions of defects, and the defect intensity index of the front bead derived from formability criteria, a weldability estimate and a defect intensity index for the back bead were produced by a back-propagation neural network. Comparing the back-bead data measured by the laser vision sensor with the values predicted by the neural network showed similar results. Finally, given knowledge of the welding conditions on a production line, the weldability of the back bead can be estimated from the front-bead data alone, without a laser vision sensor or welding inspection experts, and the results can serve as the final inspection data for the back bead.
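A toy back-propagation network of the kind invoked above can be written in a few dozen lines. The sketch below maps invented front-bead features (width, defect count, both normalised) to a back-bead quality index in [0, 1]; the data, network size, and feature meanings are assumptions for illustration, not the paper's model.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyMLP:
    """One hidden layer, scalar output, trained by plain back-propagation."""

    def __init__(self, n_in, n_hidden):
        rnd = lambda: random.uniform(-0.5, 0.5)
        self.w1 = [[rnd() for _ in range(n_in)] for _ in range(n_hidden)]
        self.b1 = [rnd() for _ in range(n_hidden)]
        self.w2 = [rnd() for _ in range(n_hidden)]
        self.b2 = rnd()

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                  for row, b in zip(self.w1, self.b1)]
        self.y = sigmoid(sum(w * h for w, h in zip(self.w2, self.h)) + self.b2)
        return self.y

    def train_step(self, x, t, lr=0.5):
        y = self.forward(x)
        # Output delta for squared error with a sigmoid output unit.
        d_out = (y - t) * y * (1.0 - y)
        # Hidden deltas propagated back through the output weights.
        d_hid = [d_out * w * h * (1.0 - h) for w, h in zip(self.w2, self.h)]
        for j, h in enumerate(self.h):
            self.w2[j] -= lr * d_out * h
        self.b2 -= lr * d_out
        for j, dh in enumerate(d_hid):
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * dh * xi
            self.b1[j] -= lr * dh

# Invented data: (front-bead width, defect count) -> back-bead quality index.
data = [([0.2, 0.9], 0.1), ([0.8, 0.1], 0.9),
        ([0.3, 0.8], 0.2), ([0.7, 0.2], 0.8)]
net = TinyMLP(n_in=2, n_hidden=4)
initial_loss = sum(0.5 * (net.forward(x) - t) ** 2 for x, t in data)
for _ in range(2000):
    for x, t in data:
        net.train_step(x, t)
final_loss = sum(0.5 * (net.forward(x) - t) ** 2 for x, t in data)
```

The paper's network would be trained on measured bead geometry rather than this synthetic mapping; the sketch only demonstrates the learning mechanism.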

Intelligent Rain Sensing Algorithm for Vision-based Smart Wiper System (비전 기반 스마트 와이퍼 시스템을 위한 지능형 레인 감지 알고리즘 개발)

  • Lee, Kyung-Chang;Kim, Man-Ho;Im, Hong-Jun;Lee, Seok
    • Proceedings of the Korean Society of Precision Engineering Conference / 2003.06a / pp.1727-1730 / 2003
  • A windshield wiper system plays a key role in ensuring the driver's safety during rainfall. However, because the quantity of rain or snow varies irregularly with time and vehicle speed, the driver of a traditional windshield wiper system must repeatedly change the wiper's speed and operation period to maintain a sufficient field of view. Since manual operation of the wiper distracts the driver and leads to inattentive driving, it is a direct cause of traffic accidents. Therefore, this paper presents the basic architecture of a vision-based smart windshield wiper system and a rain sensing algorithm that automatically regulates the speed and operation period of the wiper according to the quantity of rain or snow. This paper also introduces a fuzzy wiper control algorithm based on human expertise and evaluates the performance of the suggested algorithm on a simulator model. In particular, the vision sensor can measure a relatively wide area compared with an optical rain sensor, and hence assesses the rainfall state more accurately when disturbances occur.
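A fuzzy wiper rule base of the kind mentioned can be sketched in Sugeno style: triangular memberships fire "light/moderate/heavy" rules whose crisp speeds are blended by a weighted average. The breakpoints and output speeds below are invented; the paper's actual rule base is not reproduced.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def wiper_speed(rain):
    """Map a normalised rain intensity in [0, 1] to a wiper speed in [0, 1]."""
    # Rule firing strengths paired with crisp consequents.
    rules = [
        (tri(rain, -0.4, 0.0, 0.4), 0.1),   # light rain    -> slow wipe
        (tri(rain,  0.1, 0.5, 0.9), 0.5),   # moderate rain -> medium wipe
        (tri(rain,  0.6, 1.0, 1.4), 0.9),   # heavy rain    -> fast wipe
    ]
    # Sugeno-style weighted average of the rule outputs.
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0
```

Overlapping memberships make the output vary smoothly between the three speeds, which is the behaviour a driver-like controller needs; intensities between the peaks blend the adjacent rules.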


Development of a Lateral Control System for Autonomous Vehicles Using Data Fusion of Vision and IMU Sensors with Field Tests (비전 및 IMU 센서의 정보융합을 이용한 자율주행 자동차의 횡방향 제어시스템 개발 및 실차 실험)

  • Park, Eun Seong;Yu, Chang Ho;Choi, Jae Weon
    • Journal of Institute of Control, Robotics and Systems / v.21 no.3 / pp.179-186 / 2015
  • In this paper, a novel lateral control system is proposed for the purpose of improving lane keeping performance independently of GPS signals. Lane keeping is a key function in the realization of unmanned driving systems. To this end, a vision sensor based real-time lane detection scheme is developed. Furthermore, the direction data obtained by fusing an IMU sensor with the vision sensor is employed, along with the real-time steering angle of the test vehicle, to improve lane keeping performance. The performance of the proposed system was verified by computer simulations and by field tests using a MOHAVE, a commercial vehicle from Kia Motors of Korea.
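The abstract does not detail the fusion scheme, but one common way to combine an IMU yaw rate with a vision-derived heading is a complementary filter, sketched below under that assumption. The gain and signal names are invented for the example.

```python
def fuse_heading(prev_heading, gyro_rate, vision_heading, dt, k=0.1):
    """Blend the integrated gyro heading (smooth but drifting) with the
    vision heading (absolute but noisy). k weights the vision correction."""
    predicted = prev_heading + gyro_rate * dt            # IMU propagation
    return predicted + k * (vision_heading - predicted)  # vision correction

# Gyro with a constant bias of 0.05 rad/s; vision reports the true heading 0.0.
heading = 0.0
for _ in range(200):
    heading = fuse_heading(heading, gyro_rate=0.05, vision_heading=0.0, dt=0.1)
```

Pure integration of the biased gyro would have drifted by 1.0 rad over these 200 steps; the vision correction bounds the error near the small steady-state value set by the gain.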

Development of Vision Sensor Module for the Measurement of Welding Profile (용접 형상 측정용 시각 센서 모듈 개발)

  • Kim C.H.;Choi T.Y.;Lee J.J.;Suh J.;Park K.T.;Kang H.S.
    • Proceedings of the Korean Society of Precision Engineering Conference / 2006.05a / pp.285-286 / 2006
  • The essential tasks in operating a welding robot include acquiring the position and/or shape of the parent metal. For seam tracking and robot automation, many kinds of contact and non-contact sensors are used; recently, the vision sensor has become the most popular. This paper describes the development of a system that measures the profile of the welding part. The complete system will be assembled into a compact module that can be attached to the head of a welding robot. The system uses a line-type structured laser diode and a vision sensor, and implements Direct Linear Transformation (DLT) for camera calibration as well as radial distortion correction. The three-dimensional shape of the parent metal is obtained after a simple linear transformation, so the system operates in real time. Experiments were carried out to evaluate the performance of the developed system.
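The triangulation geometry behind such a structured-light profile sensor can be sketched as follows: each image pixel defines a viewing ray through the pinhole camera, and intersecting that ray with the calibrated laser plane gives the 3-D surface point. The intrinsics and plane coefficients below are invented; the DLT calibration that would estimate them is not shown.

```python
def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Direction of the viewing ray for pixel (u, v) in camera coordinates."""
    return ((u - cx) / fx, (v - cy) / fy, 1.0)

def intersect_laser_plane(ray, plane):
    """Intersect the ray t * d (from the camera centre) with the laser plane
    a*x + b*y + c*z = d_plane. Returns the 3-D point in metres."""
    a, b, c, d_plane = plane
    dx, dy, dz = ray
    t = d_plane / (a * dx + b * dy + c * dz)
    return (t * dx, t * dy, t * dz)

# Invented laser plane z = 0.5 m in front of the camera (a = b = 0, c = 1):
ray = pixel_to_ray(u=400, v=300, fx=800.0, fy=800.0, cx=320.0, cy=240.0)
point = intersect_laser_plane(ray, plane=(0.0, 0.0, 1.0, 0.5))
```

Because the per-pixel mapping reduces to this linear intersection once calibration is done, the whole laser line can be converted to a 3-D profile in real time, as the abstract states.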


Development and Application of a Profile Measurement Sensor for Remote Laser Welding Robots (원격 레이저 용접 로봇을 위한 형상 측정 센서의 개발과 응용)

  • Kim, Chang-Hyun;Choi, Tae-Yong;Lee, Ju-Jang;Suh, Jeong;Park, Kyoung-Taik;Kang, Hee-Shin
    • Laser Solutions / v.12 no.2 / pp.11-16 / 2009
  • A new profile measurement sensor was developed for remote laser welding robots. The profile sensor uses a stripe laser and a vision camera. A simple sensor-guided control scheme using the developed sensor is also introduced. The sensor can be used to guide the welding head in remote welding applications, where the working distance reaches 450 mm. In experiments, profile measurement and seam tracking were carried out using the developed sensor.


Integrated System for Autonomous Proximity Operations and Docking

  • Lee, Dae-Ro;Pernicka, Henry
    • International Journal of Aeronautical and Space Sciences / v.12 no.1 / pp.43-56 / 2011
  • An integrated guidance, navigation, and control (GNC) system for autonomous proximity operations and docking of two spacecraft was developed. The position maneuvers were determined through the integration of the state-dependent Riccati equation, formulated from nonlinear relative motion dynamics, with relative navigation using a rendezvous laser vision (Lidar) and a vision sensor system. In the vision sensor system, a switch between sensors was made along the approach phase in order to provide continuously effective navigation. As an extension of the rendezvous laser vision system, an automated terminal guidance scheme based on the Clohessy-Wiltshire state transition matrix was used to formulate a "V-bar hopping approach" reference trajectory. A proximity operations strategy was then adapted from the approach strategy used with the automated transfer vehicle. The attitude maneuvers, determined from a linear quadratic Gaussian-type controller with quaternion-based attitude estimation using star trackers or a vision sensor system, provided precise attitude control and robustness under uncertainties in the moments of inertia and external disturbances. These functions were integrated into an autonomous GNC system that can perform proximity operations and meet all conditions for successful docking. A six-degree-of-freedom simulation demonstrated the effectiveness of the integrated system.
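The Clohessy-Wiltshire relative motion underlying reference trajectories such as the "V-bar hopping approach" has a textbook closed-form solution, sketched below for the in-plane components (x radial, y along-track, n the target's mean motion). This is the standard CW solution, not the paper's full guidance scheme, and the mean motion value is an invented low-Earth-orbit example.

```python
import math

def cw_state(x0, y0, vx0, vy0, n, t):
    """Propagate the in-plane CW relative position (x, y) over time t.

    x0, y0  : initial radial / along-track offsets (m)
    vx0, vy0: initial relative velocities (m/s)
    n       : target orbit mean motion (rad/s)
    """
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 * (1 - c) / n) * vy0
    y = (6 * (s - n * t)) * x0 + y0 - (2 * (1 - c) / n) * vx0 \
        + ((4 * s - 3 * n * t) / n) * vy0
    return x, y

# A chaser parked 100 m behind the target on the V-bar (pure along-track
# offset) is an equilibrium of the CW dynamics: it stays put over one period.
n = 0.0011  # rad/s, roughly a low Earth orbit
period = 2 * math.pi / n
x1, y1 = cw_state(x0=0.0, y0=-100.0, vx0=0.0, vy0=0.0, n=n, t=period)
```

Each "hop" of a V-bar approach is generated by applying a small impulse to this solution and propagating to the next hold point; the equilibrium property shown is why the V-bar is the natural station-keeping line.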

Vision Based Sensor Fusion System of Biped Walking Robot for Environment Recognition (영상 기반 센서 융합을 이용한 이족로봇에서의 환경 인식 시스템의 개발)

  • Song, Hee-Jun;Lee, Seon-Gu;Kang, Tae-Gu;Kim, Dong-Won;Seo, Sam-Jun;Park, Gwi-Tae
    • Proceedings of the KIEE Conference / 2006.04a / pp.123-125 / 2006
  • This paper discusses a vision-based sensor fusion system for biped robot walking. Most research on biped walking robots has focused on the walking algorithm itself. However, developing vision systems for biped walking robots is an important and urgent issue, since biped walking robots are ultimately developed not only for research but to be utilized in real life. In this research, systems for environment recognition and tele-operation were developed for task assignment and execution by a biped robot, together with a human robot interaction (HRI) system. To carry out certain tasks, an object tracking system using a modified optical flow algorithm and an obstacle recognition system using enhanced template matching and a hierarchical support vector machine algorithm, fed by a wireless vision camera, are implemented within a sensor fusion system that uses the other sensors installed in the biped walking robot. Systems for robot manipulation and communication with the user have also been developed.
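The core of template matching as mentioned above can be shown in a few lines: slide the template over the image and keep the position with the smallest sum-of-squared-differences. The grayscale values are invented, and the paper's "enhanced" matching and SVM stages are not reproduced.

```python
def match_template(image, template):
    """Return (row, col) of the best template position by minimum SSD."""
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            # Sum of squared differences over the overlapped window.
            ssd = sum((image[r + i][c + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best:
                best, best_pos = ssd, (r, c)
    return best_pos

image = [[0, 0, 0, 0, 0],
         [0, 9, 8, 0, 0],
         [0, 7, 9, 0, 0],
         [0, 0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
pos = match_template(image, template)
```

SSD matching is brittle under lighting changes, which is one reason practical systems normalise the correlation or, as here, add a classifier stage on top of the raw match.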


Integrated Navigation Design Using a Gimbaled Vision/LiDAR System with an Approximate Ground Description Model

  • Yun, Sukchang;Lee, Young Jae;Kim, Chang Joo;Sung, Sangkyung
    • International Journal of Aeronautical and Space Sciences / v.14 no.4 / pp.369-378 / 2013
  • This paper presents a vision/LiDAR integrated navigation system that provides accurate relative navigation performance over a general ground surface in GNSS-denied environments. The ground surface considered during flight is approximated as a piecewise continuous model with flat and sloped surface profiles. In its implementation, the presented system consists of a strapdown IMU and an aiding sensor block comprising a vision sensor and a LiDAR on a stabilized gimbal platform. Two-dimensional optical flow vectors from the vision sensor and range information from the LiDAR to the ground are used to overcome the performance limit of a tactical grade inertial navigation solution without a GNSS signal. In the filter realization, the INS error model is employed, with measurement vectors containing two-dimensional velocity errors and one differenced altitude in the navigation frame. In computing the altitude difference, the ground slope angle is estimated in a novel way, through two bisectional LiDAR signals, under a practical assumption representing a general ground profile. Finally, the overall integrated system is implemented within the extended Kalman filter framework, and its performance is demonstrated through a simulation study with an aircraft flight trajectory scenario.
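The geometry of a two-beam slope estimate like the one described can be sketched as follows: two LiDAR ranges measured at known off-nadir angles from a stabilized gimbal give two ground points in the sensor frame, and the slope angle follows from their height and horizontal differences. The beam angles and ranges below are invented; the paper's filter integration is not shown.

```python
import math

def ground_slope(r1, theta1, r2, theta2):
    """Slope angle (rad) of the ground from two range/angle pairs.

    theta is measured from nadir (straight down); positive theta looks forward.
    """
    # Horizontal offset and depth of each ground hit, in the sensor frame.
    x1, z1 = r1 * math.sin(theta1), -r1 * math.cos(theta1)
    x2, z2 = r2 * math.sin(theta2), -r2 * math.cos(theta2)
    return math.atan2(z2 - z1, x2 - x1)

# Flat ground 100 m below: both beams report ranges consistent with that depth.
flat = ground_slope(100.0 / math.cos(0.1), -0.1, 100.0 / math.cos(0.1), 0.1)

# Ground rising at gradient 0.2 (z = -100 + 0.2 * x in the sensor frame):
r_back = 100.0 / (math.cos(-0.1) + 0.2 * math.sin(-0.1))
r_fore = 100.0 / (math.cos(0.1) + 0.2 * math.sin(0.1))
sloped = ground_slope(r_back, -0.1, r_fore, 0.1)
```

With the slope known, the altitude measurement for the EKF can be referenced to the local ground plane rather than to an assumed flat surface, which is the role the abstract assigns to the two bisectional beams.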