• Title/Summary/Keyword: Robotic camera

Development of a Vision-based Blank Alignment Unit for Press Automation Process (프레스 자동화 공정을 위한 비전 기반 블랭크 정렬 장치 개발)

  • Oh, Jong-Kyu;Kim, Daesik;Kim, Soo-Jong
    • Journal of Institute of Control, Robotics and Systems / v.21 no.1 / pp.65-69 / 2015
  • A vision-based blank alignment unit for a press automation line is introduced in this paper. A press is a machine tool that changes the shape of a blank by applying pressure and is widely used in industries requiring mass production. In traditional press automation lines, a mechanical centering unit consisting of guides and ball bearings is employed to align a blank before a robot inserts it into the press. However, it can only align blanks of limited sizes and shapes, and it cannot be applied to a process in which two or more blanks are inserted simultaneously. To overcome these problems, we developed a vision-based centering unit for press automation lines. The specification of the vision system is determined by considering the blank geometry and the required accuracy. Vision application software with pattern recognition, camera calibration, and monitoring functions is designed to detect multiple blanks reliably. Through experiments with an industrial robot, we validated that the proposed system aligns blanks of various sizes and shapes and successfully detects two or more simultaneously inserted blanks.
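
The paper does not publish its vision software, so the pattern-recognition and calibration steps it names can only be illustrated with a minimal sketch. In the example below, the template image, file names, score threshold, and the 0.5 mm/pixel scale are all hypothetical assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' software): locate a blank in a camera image by
# normalized cross-correlation template matching, then convert the pixel offset
# to press-table millimeters using a previously calibrated scale factor.
import cv2

MM_PER_PIXEL = 0.5          # assumed value from a prior camera calibration
scene = cv2.imread("table_view.png", cv2.IMREAD_GRAYSCALE)      # hypothetical image
template = cv2.imread("blank_template.png", cv2.IMREAD_GRAYSCALE)

# Template matching gives a similarity map; its peak is the blank location.
result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, score, _, top_left = cv2.minMaxLoc(result)

if score > 0.8:             # confidence threshold, tuned per application
    h, w = template.shape
    center_px = (top_left[0] + w / 2.0, top_left[1] + h / 2.0)
    # Offset from the nominal insertion point, expressed in millimeters.
    nominal_px = (scene.shape[1] / 2.0, scene.shape[0] / 2.0)
    dx_mm = (center_px[0] - nominal_px[0]) * MM_PER_PIXEL
    dy_mm = (center_px[1] - nominal_px[1]) * MM_PER_PIXEL
    print(f"blank offset: dx={dx_mm:.1f} mm, dy={dy_mm:.1f} mm (score={score:.2f})")
else:
    print("no blank detected with sufficient confidence")
```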

Implementation of Fish Robot Tracking-Control Methods (물고기 로봇 추적 제어 구현)

  • Lee, Nam-Gu;Kim, Byeong-Jun;Shin, Kyoo-Jae
    • Proceedings of the Korea Information Processing Society Conference / 2018.10a / pp.885-888 / 2018
  • This paper investigates a method for detecting fish robots moving in an aquarium. The fish robot was designed and developed for interaction with humans in aquariums. Because the positions of the moving fish robots must be determined, the study focuses on detecting moving objects in the aquarium. The aim is to recognize the location of the robotic fish using an image processing technique and a video camera. The method estimates the velocity of each pixel in an image and, assuming constant velocity within each video frame, obtains the positions of the fish robots by comparing sequential video frames. Using this positional data, we compute the distance between fish robots with a mathematical expression and determine which fish robot is leading and which is lagging. The lead robot then waits for the lagging robot until it catches up, and the process runs continuously. The system is exhibited at the Busan Science Museum and has passed a performance test of this algorithm.
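
The core idea of comparing sequential video frames to locate the robots can be sketched roughly as follows. This is not the exhibited system: the video file, area and distance thresholds, and the "leader waits" rule along the x-axis are simplifying assumptions for illustration.

```python
# Illustrative sketch: detect moving fish robots by differencing sequential frames,
# take blob centroids as positions, and compare them to decide which robot leads.
import cv2
import numpy as np

cap = cv2.VideoCapture("aquarium.mp4")        # hypothetical recorded video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Pixels that changed between consecutive frames belong to moving robots.
    diff = cv2.absdiff(gray, prev_gray)
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    mask = cv2.dilate(mask, None, iterations=2)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for c in contours:
        if cv2.contourArea(c) < 500:          # ignore small noise blobs
            continue
        m = cv2.moments(c)
        centers.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))

    if len(centers) >= 2:
        centers.sort(key=lambda p: p[0])       # assume swimming direction is +x
        lag, lead = centers[0], centers[-1]
        gap = float(np.hypot(lead[0] - lag[0], lead[1] - lag[1]))
        if gap > 200:                          # pixels; leader should wait
            print(f"gap {gap:.0f}px -> command lead robot to hold position")

    prev_gray = gray
```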

Movement characteristics of pneumatic actuators for the semi-autonomous colonoscopic system (자율이동 대장 내시경을 위한 공압구동기의 이동 특성)

  • Kim, Byung-Kyu;Lee, Jin-Hee;Park, Ji-Sang;Lim, Young-Mo;Park, Jong-Oh;Kim, Soo-Hyun;Hong, Yeh-Sun
    • Proceedings of the KSME Conference / 2001.06b / pp.295-300 / 2001
  • In recent years, as eating habits have changed, colon pathologies have increased annually. For that reason, colonoscopy has become common, but it takes much time to acquire the dexterity needed to perform the procedure, and the procedure is painful for the patient. Therefore, biomedical and robotics researchers are developing a locomotive colonoscope that can travel safely in the colon. In this paper, we propose a novel design and concept for a semi-autonomous colonoscope and two actuators for the micro robot. The micro robot comprises a camera and LED for diagnosis, a steering system to pass through colonic loops, a pneumatic actuator, and bow-shaped flexible supporters that control the contact force and maintain the clearance between the colon wall and the actuator. For the actuating mechanism, we suggest two models: one based on reaction force and the other on impact force. In order to validate the concept and the performance of the actuators, we carried out preliminary experiments in rigid pipes.

Design and Control of Wire-driven Flexible Robot Following Human Arm Gestures (팔 동작 움직임을 모사하는 와이어 구동 유연 로봇의 설계 및 제어)

  • Kim, Sanghyun;Kim, Minhyo;Kang, Junki;Son, SeungJe;Kim, Dong Hwan
    • The Journal of Korea Robotics Society / v.14 no.1 / pp.50-57 / 2019
  • This work presents a design and control method for a flexible, wire-driven robot arm that follows human gestures. When moving the robot arm to a desired position, the required wire travel is calculated and the motors are rotated according to that length. The robotic arm is composed of two modular mechanisms that mimic real human arm motion. Two wires form a closed loop in each module, and universal joints attached to each disk create up, down, left, and right movements. To control the motors, anti-windup PID control was applied to limit the sudden changes usually caused by accumulated error in the integral term. In addition, a master/slave communication protocol and an operation program linking the six motors to the MYO sensor and IMU sensor outputs were developed. This makes it possible to receive images from the camera attached to the robot arm while simultaneously sending control commands to the robot at high speed.
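
Anti-windup PID with integral clamping, the technique named in the abstract, can be written in a few lines. The sketch below uses conditional integration; the gains, output limits, and the toy first-order plant are placeholders, not the paper's parameters.

```python
# Minimal sketch of PID control with integral anti-windup (conditional integration).
class AntiWindupPID:
    def __init__(self, kp, ki, kd, out_min, out_max):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        derivative = (error - self.prev_error) / dt
        self.prev_error = error

        # Tentative integral update; keep it only if the output is not saturated,
        # which prevents the accumulated error from winding up.
        candidate_integral = self.integral + error * dt
        unsat = self.kp * error + self.ki * candidate_integral + self.kd * derivative
        output = max(self.out_min, min(self.out_max, unsat))
        if output == unsat:          # not saturated: accept the integration step
            self.integral = candidate_integral
        return output

# Example: drive a wire-length error toward zero with a saturated motor command.
pid = AntiWindupPID(kp=8.0, ki=2.0, kd=0.1, out_min=-5.0, out_max=5.0)
length = 0.0                          # current wire travel, mm (assumed toy plant)
for _ in range(500):
    command = pid.update(setpoint=10.0, measured=length, dt=0.01)
    length += command * 0.01          # toy first-order plant for illustration
print(f"final wire travel: {length:.2f} mm")
```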

A 2D / 3D Map Modeling of Indoor Environment (실내환경에서의 2 차원/ 3 차원 Map Modeling 제작기법)

  • Jo, Sang-Woo;Park, Jin-Woo;Kwon, Yong-Moo;Ahn, Sang-Chul
    • Proceedings of the HCI Society of Korea Conference / 2006.02a / pp.355-361 / 2006
  • In large-scale environments such as airports, museums, large warehouses, and department stores, autonomous mobile robots will play an important role in security and surveillance tasks. Robotic security guards will survey such environments and report to a human operator, for example whether an object is present or a window is open. Both for visualization of information and as a human-machine interface for remote control, a 3D model can give much more useful information than the typical 2D maps used in many robotic applications today. It is easier to understand, makes the user feel present at the robot's location so that the robot can be operated more naturally from a remote site, and shows structures such as windows and doors that cannot be seen in a 2D model. In this paper we present a simple and easy-to-use method to obtain a 3D textured model. To convey realism, the 3D model must be integrated with real scenes. Most existing 3D modeling methods use two data acquisition devices: one for building the 3D model and another for obtaining realistic textures, typically a 2D laser range-finder and a common camera, respectively. Our algorithm consists of building a measurement-based 2D metric map acquired by the laser range-finder, texture acquisition and stitching, and texture-mapping onto the corresponding 3D model. The algorithm is implemented with a laser sensor for obtaining the 2D/3D metric map and two cameras for gathering textures. Our geometric 3D model consists of planes that model the floor and walls, and the geometry of the planes is extracted from the 2D metric map data. Textures for the floor and walls are generated from images captured by two IEEE 1394 cameras with wide fields of view. Image stitching and cropping are used to generate texture images that correspond to the 3D model. The algorithm is applied to two cases: a corridor and a four-walled, room-like space of a building. The generated 3D map model of the indoor environment is exported in VRML format and can be viewed in a web browser with a VRML plug-in. The proposed algorithm can be applied to 3D model-based remote surveillance systems over the WWW.
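
The geometric step described in the abstract, turning 2D wall segments from the laser-based metric map into vertical 3D planes, can be illustrated with a small sketch. The segment coordinates and the 2.5 m ceiling height below are assumed values, and the texture-mapping and VRML export steps are only mentioned in comments.

```python
# Sketch: extrude 2D wall segments from a metric map into vertical 3D quads.
import numpy as np

CEILING_HEIGHT = 2.5   # meters, assumed

def extrude_wall(p0, p1, height=CEILING_HEIGHT):
    """Return the four 3D corners of a wall quad built from a 2D segment (x, y)."""
    x0, y0 = p0
    x1, y1 = p1
    return np.array([
        [x0, y0, 0.0],      # bottom start
        [x1, y1, 0.0],      # bottom end
        [x1, y1, height],   # top end
        [x0, y0, height],   # top start
    ])

# Wall segments as they might come out of a 2D laser-based metric map (meters).
segments = [((0.0, 0.0), (4.0, 0.0)),
            ((4.0, 0.0), (4.0, 3.0)),
            ((4.0, 3.0), (0.0, 3.0)),
            ((0.0, 3.0), (0.0, 0.0))]

walls = [extrude_wall(p0, p1) for p0, p1 in segments]
for i, quad in enumerate(walls):
    print(f"wall {i}: corners =\n{quad}")
# Each quad would then receive a stitched camera texture and be written out,
# e.g. as a VRML IndexedFaceSet, as described in the abstract.
```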

PHOTOMETRIC OBSERVATIONS AND LIGHT CURVE ANALYSIS OF BL ERIDANI (BL ERIDANI의 측광관측과 광도곡선 분석)

  • Han, Won-Yong;Yim, Hong-Suh;Lee, Chung-Uk;Youn, Jae-Hyuck;Yoon, Joh-Na;Kim, Ho-Il;Moon, Hong-Kyu;Byun, Yong-Ik;Park, Sun-Youp
    • Journal of Astronomy and Space Sciences / v.23 no.4 / pp.319-326 / 2006
  • We present light curves of the short-period binary system BL Eridani. The light curves were obtained with VRI filters using a 50 cm wide-field robotic telescope at Siding Spring Observatory (SSO), equipped with a 2K CCD camera, developed by the Korea Astronomy and Space Science Institute (KASI) and Yonsei University Observatory (YUO). The photometric observations were made over six nights, in automatic operation mode at SSO and in remote observation mode from KASI in Korea. We obtained new VRI CCD light curves and five new times of minima, analyzed the light curves with the 2005 version of the Wilson & Devinney (1971) binary code, and derived new photometric solutions. The mass ratio q=0.48 obtained in this study differs from the values reported by earlier investigators. According to the model analysis, the BL Eri system is considered to be currently in a contact stage of its two components, rather than a near-contact stage.
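
A standard preparatory step in this kind of light-curve analysis is folding the observation times on the orbital period before model fitting. The sketch below shows only that step; the period, epoch, and magnitudes are placeholder numbers, not values from the BL Eri study.

```python
# Illustrative phase-folding of eclipsing-binary photometry (placeholder data).
import numpy as np

P = 0.4169          # orbital period in days (placeholder)
T0 = 2453700.0      # reference epoch of primary minimum, HJD (placeholder)

hjd = 2453700.0 + np.sort(np.random.uniform(0.0, 3.0, 200))   # fake observation times
mag = 9.5 + 0.3 * np.abs(np.sin(np.pi * (hjd - T0) / P))      # fake V magnitudes

phase = ((hjd - T0) / P) % 1.0        # orbital phase in [0, 1)
order = np.argsort(phase)
folded = np.column_stack([phase[order], mag[order]])
print(folded[:5])                      # phased light curve, ready for model fitting
```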

A Real-Time Control Architecture for a Semi-Autonomous Underwater Vehicle (반자율 무인잠수정을 위한 실시간 제어 아키텍쳐)

  • LI JI-HONG;JEON BONG-HWAN;LEE PAN-MOOK;WON HONG-SEOK
    • Proceedings of the Korea Committee for Ocean Resources and Engineering Conference / 2004.05a / pp.198-203 / 2004
  • This paper describes a real-time control architecture for DUSAUV (Dual Use Semi-Autonomous Underwater Vehicle), which has been developed at the Korea Research Institute of Ships & Ocean Engineering (KRISO), KORDI, as a test-bed for the development of underwater navigation and manipulator operation technologies. DUSAUV has three built-in computers, seven thrusters for 6-degree-of-freedom motion control, a 4-function electric manipulator, a pan/tilt unit for the camera, a ballasting motor, a built-in power source, and various sensors such as an IMU, DVL, and sonar. A supervisor control system for the GUI and manipulator operation is mounted on the surface vessel and communicates with the vehicle through a fiber-optic link. QNX, a real-time operating system, runs on the vehicle's built-in control and navigation computers for real-time control, while a Microsoft operating system is used on the supervisor system for convenience in GUI programming. A hierarchical control architecture consisting of three layers (application, real-time, and physical) has been developed for efficient control of this complex underwater robotic system. Experimental results from implementing the layered control architecture for various motion-control tasks of DUSAUV in the KRISO basin are also provided.
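
The three-layer hierarchy named in the abstract can be pictured with a schematic sketch. The class names, the proportional gains, the 10 ms period, and the sensor/thruster stubs below are invented for illustration; they are not DUSAUV code.

```python
# Schematic sketch of an application / real-time / physical control hierarchy.
import time

class PhysicalLayer:
    """Talks to sensors and thrusters (stubbed out here)."""
    def read_sensors(self):
        return {"depth": 5.0, "heading": 90.0}           # fake IMU/DVL readings
    def write_thrusters(self, commands):
        pass                                              # would drive the 7 thrusters

class RealTimeLayer:
    """Runs the periodic control loop on the vehicle computers."""
    def __init__(self, physical):
        self.physical = physical
    def step(self, goal):
        state = self.physical.read_sensors()
        # Proportional depth/heading control as a stand-in for the real controllers.
        cmd = {"vertical": 0.5 * (goal["depth"] - state["depth"]),
               "yaw": 0.1 * (goal["heading"] - state["heading"])}
        self.physical.write_thrusters(cmd)
        return state

class ApplicationLayer:
    """Receives operator goals from the surface supervisor and sequences missions."""
    def __init__(self, realtime):
        self.realtime = realtime
    def run(self, goal, cycles=5, period=0.01):
        for _ in range(cycles):
            state = self.realtime.step(goal)
            time.sleep(period)                            # 10 ms soft period for the demo
        return state

app = ApplicationLayer(RealTimeLayer(PhysicalLayer()))
print(app.run({"depth": 10.0, "heading": 120.0}))
```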

Color Vision System for Intelligent Rehabilitation Robot mounted on the Wheelchair (휠체어 장착형 지능형 재활 로봇을 위한 칼라 비전 시스템)

  • Song, Won-Kyung;Lee, He-Young;Kim, Jong-Sung;Bien, Zeung-Nam
    • Journal of the Korean Institute of Telematics and Electronics S / v.35S no.11 / pp.75-87 / 1998
  • KARES (KAIST Rehabilitation Engineering System) is a rehabilitation robot system consisting of a 6-degree-of-freedom robot arm mounted on a wheelchair to assist the independent living of the disabled and the elderly. An interface device for programming and controlling the robot arm is essential in a rehabilitation robotic system. In particular, manual operation of the robot arm places a cognitive burden on the user and is difficult to perform. As a remedy, a color vision system for autonomous execution of jobs is proposed, and four basic target jobs are specified. The color vision system for KARES is set up with the camera mounted in an eye-in-hand configuration. The target jobs of picking up an object and moving it to the user's face for drinking are successfully performed in real time in an indoor environment.
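
A color-segmentation step in the spirit of the system described above could be sketched as follows. The HSV bounds, the red target color, and the image file name are illustrative guesses, not details from the paper.

```python
# Minimal sketch: threshold an HSV image for a colored target and return its center.
import cv2

frame = cv2.imread("gripper_view.png")                    # hypothetical eye-in-hand image
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Red wraps around hue 0, so combine two hue bands.
mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255)) | \
       cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    target = max(contours, key=cv2.contourArea)
    m = cv2.moments(target)
    if m["m00"] > 0:
        cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
        print(f"target center in image: ({cx:.0f}, {cy:.0f}) px")
        # A visual-servoing loop would now move the eye-in-hand camera so that
        # this center approaches the image center before grasping.
else:
    print("no target of the specified color found")
```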

Locomotive Mechanism Based on Pneumatic Actuators for the Semi-Autonomous Endoscopic System (자율주행 내시경을 위한 공압 구동방식의 이동메카니즘)

  • Kim, Byungkyu;Kim, Kyoung-Dae;Lee, Jinhee;Park, Jong-Oh;Kim, Soo-Hyun;Hong, Yeh-Sun
    • Journal of Institute of Control, Robotics and Systems / v.8 no.4 / pp.345-350 / 2002
  • In recent years, as eating habits have changed, colon pathologies have increased annually. Colonoscopy has become common, but it takes much time to acquire the dexterity needed to perform the procedure, and the procedure is painful for the patient. Biomedical and robotics researchers are therefore developing a locomotive colonoscope that can travel safely in the colon. In this paper, we propose a new actuator and a concept for a semi-autonomous colonoscope. The micro robot comprises a camera and LED for diagnosis, a steering system to pass through colonic loops, a pneumatic actuator, and bow-shaped flexible supporters that control the contact force and help the robot pass over haustral folds in the colon. For locomotion of the semi-autonomous colonoscope, we suggest an actuator based on the impact force between a cylinder and a piston. To validate the concept and the performance of the actuator, we carried out simulations of its moving characteristics and preliminary experiments in rigid pipes and on a pig colon.
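
A back-of-the-envelope model of the impact-driven locomotion principle named in the abstract is sketched below. All masses, the driving force, the stroke, and the friction coefficient are assumed values used only to show the order of the calculation, not the paper's simulation.

```python
# Rough model: a pneumatically accelerated piston strikes the cylinder body, the
# momentum transfer launches the body forward, and friction brings it to rest.
import math

m_piston = 0.005      # kg, assumed
m_body = 0.020        # kg, assumed
force = 2.0           # N, assumed pneumatic driving force on the piston
stroke = 0.01         # m, piston travel before impact
mu = 0.3              # assumed friction coefficient against the pipe/colon wall
g = 9.81

# Piston speed at impact (body held by friction during the stroke, for simplicity).
v_piston = math.sqrt(2.0 * force * stroke / m_piston)

# Perfectly plastic impact: piston and body move together afterwards.
v_after = m_piston * v_piston / (m_piston + m_body)

# Sliding distance until friction stops the robot: v^2 / (2 * mu * g).
step = v_after ** 2 / (2.0 * mu * g)
print(f"impact speed {v_piston:.2f} m/s -> step of {step * 1000:.1f} mm per cycle")
```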

A Study on Development of the Optimization Algorithms to Find the Seam Tracking (용접선 추적을 위한 최적화 알고리즘 개발에 관한 연구)

  • Jin, Byeong-Ju;Lee, Jong-Pyo;Park, Min-Ho;Kim, Do-Hyeong;Wu, Qian-Qian;Kim, Il-Soo;Son, Joon-Sik
    • Journal of Welding and Joining / v.34 no.2 / pp.59-66 / 2016
  • Gas Metal Arc (GMA) welding, also called Metal Inert Gas (MIG) welding, has been an important process in manufacturing industries. A key technology for robotic welding is the seam tracking system, which is critical for improving welding quality and capacity. The objective of this study was to develop intelligent and cost-effective image-processing algorithms for GMA welding based on a laser vision sensor. Welding images were captured by the CCD camera and then processed by the proposed algorithm to track the weld joint location. Algorithms commonly used at present were verified and compared to select the optimal one for each step of the image processing. Finally, the validity of the proposed algorithms was examined using weld seam images obtained in different welding environments. The results showed that the proposed algorithm effectively removed variable noise and extracted the feature points and centerline needed for seam tracking in GMA welding, and that it could be employed in general industrial applications.
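
One common step in laser-vision seam tracking, extracting a sub-pixel stripe centerline and locating the joint feature point, can be illustrated with a short sketch. The synthetic V-groove image, the noise threshold, and the kink-detection rule below are assumptions for illustration, not the algorithms selected in the paper.

```python
# Sketch: intensity-weighted centroid of the laser stripe per column gives a
# sub-pixel centerline; the sharpest slope change marks the joint feature point.
import numpy as np

h, w = 120, 160
img = np.zeros((h, w), dtype=float)
for x in range(w):                                  # synthetic V-groove laser stripe
    y = 40 + abs(x - w // 2) // 4
    img[y - 1:y + 2, x] = 255.0

rows = np.arange(h, dtype=float).reshape(-1, 1)
mask = img > 50.0                                   # suppress background noise
weights = np.where(mask, img, 0.0)
col_sum = weights.sum(axis=0)
valid = col_sum > 0
centerline = np.full(w, np.nan)
centerline[valid] = (rows * weights).sum(axis=0)[valid] / col_sum[valid]

# The joint feature point is where the centerline slope changes most sharply.
slope = np.gradient(centerline)
feature_x = int(np.nanargmax(np.abs(np.gradient(slope))))
print(f"estimated joint feature point at column {feature_x}, "
      f"row {centerline[feature_x]:.1f}")
```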