• Title/Summary/Keyword: Camera Controller

REPRESENTATION OF NAVIGATION INFORMATION FOR VISUAL CAR NAVIGATION SYSTEM

  • Joo, In-Hak;Lee, Seung-Yong;Cho, Seong-Ik
    • Proceedings of the KSRS Conference
    • /
    • 2007.10a
    • /
    • pp.508-511
    • /
    • 2007
  • The car navigation system is one of the most important applications in telematics. The newest trend in car navigation systems is the use of real video captured by a camera mounted on the vehicle, because video can bridge the semantic gap between the map and the real world. In this paper, we suggest a visual car navigation system that visually represents navigation information and route guidance. It can improve drivers' understanding of the real world by capturing real-time video and displaying navigation information overlaid on it. The main services of the visual car navigation system are graphical turn guidance and lane change guidance. We suggest a system architecture that implements these services by integrating conventional route finding and guidance, computer vision functions, and augmented reality display functions. The core part of the system is the visual navigation controller, which controls the other modules and dynamically determines the visual representation method of navigation information according to a determination rule based on the current location and driving circumstances. We briefly show the implementation of the system.
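
The determination rule mentioned in this abstract can be pictured as a small decision function that maps the current location and driving circumstances to a display mode. The sketch below is a hypothetical illustration of such a rule; the context fields, thresholds, and mode names are assumptions, not details from the paper.

```python
# Hypothetical sketch of a visual-navigation determination rule: it selects how
# route guidance is rendered (AR overlay vs. conventional map symbol) from the
# current driving context. Field names and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class DrivingContext:
    distance_to_turn_m: float   # distance from current location to the next maneuver
    lane_detected: bool         # whether lane boundaries are visible in the video
    speed_kmh: float

def select_representation(ctx: DrivingContext) -> str:
    """Return the visual representation method for the next guidance event."""
    if ctx.distance_to_turn_m < 50 and ctx.lane_detected:
        return "ar_turn_arrow"        # overlay a graphical turn arrow on the live video
    if ctx.lane_detected and ctx.speed_kmh > 60:
        return "lane_change_overlay"  # highlight the target lane in the video
    return "map_symbol"               # fall back to conventional map guidance

if __name__ == "__main__":
    print(select_representation(DrivingContext(30.0, True, 40.0)))  # ar_turn_arrow
```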

Development of Monitor & Controller for Tailored Blank Welding (Tailored Blank 용접을 위한 감시제어장치 개발)

  • 장영건;유병길;이경돈
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 1996.11a
    • /
    • pp.323-327
    • /
    • 1996
  • Gap and thickness-difference information between blanks is often necessary in tailored blank welding for weld quality evaluation, optimum welding parameter selection, and evaluation of the accuracy of the shearing machine, the blank allocation device, and the clamping device. We developed a 3D vision system and a camera unit using structured lighting for this purpose. A simple and efficient scheme for gap and thickness feature recognition is developed, as well as the measurement procedure. Experimental results show that the measuring accuracy of this system is 10 μm for the gap and 16 μm for the thickness difference. The data are expected to be useful for preview gap control.
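
The gap and thickness-difference features described here come from a structured-light (laser stripe) profile taken across the blank joint. The sketch below shows one simple way such features could be extracted from a 1-D height profile; the synthetic profile, pixel pitch, and gap-detection threshold are assumptions, not the paper's scheme.

```python
# Illustrative sketch: estimating gap width and thickness difference from a
# structured-light height profile across a tailored-blank joint.
import numpy as np

def gap_and_step(profile_mm: np.ndarray, pixel_pitch_mm: float, drop_mm: float = 0.2):
    """profile_mm: laser-line heights sampled across the joint (one value per pixel)."""
    left_level = np.median(profile_mm[:10])      # blank 1 surface height
    right_level = np.median(profile_mm[-10:])    # blank 2 surface height
    thickness_diff = abs(left_level - right_level)
    # The gap shows up as samples that fall well below both blank surfaces.
    floor = min(left_level, right_level) - drop_mm
    gap_px = int(np.count_nonzero(profile_mm < floor))
    return gap_px * pixel_pitch_mm, thickness_diff

if __name__ == "__main__":
    # Synthetic example: 1.0 mm and 1.2 mm blanks with a 3-pixel gap between them.
    prof = np.r_[np.full(20, 1.0), np.full(3, 0.0), np.full(20, 1.2)]
    gap, step = gap_and_step(prof, pixel_pitch_mm=0.01)
    print(f"gap ≈ {gap:.3f} mm, thickness difference ≈ {step:.3f} mm")
```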

Electrical and Thermal Characteristics of a Burst Dimming Piezoelectric Inverter (버스트 디밍 압전인버터의 전기적 특성 및 온도 특성)

  • Shin, Hoon-Beom;Ahn, Hyung-Keun;Han, Deuk-Young
    • Journal of the Korean Institute of Electrical and Electronic Material Engineers
    • /
    • v.19 no.9
    • /
    • pp.843-848
    • /
    • 2006
  • This paper describes the design and fabrication of an inverter for an LCD backlight using a multilayer piezoelectric transformer. The inverter is operated by the burst dimming method, in which the dimming signal is controlled by the duty ratio of the rectangular switching signal applied to a PWM controller. The brightness of the CCFLs is investigated for various dimming signals, and while the CCFLs are turned on, the temperature distributions on the piezoelectric transformer and the circuit components of the inverter are also investigated with a thermal imaging camera.
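
Burst dimming gates the high-frequency drive of the piezoelectric transformer on and off at a low rate, so the average lamp brightness tracks the duty ratio of the burst signal. The sketch below is a minimal numerical illustration of such a gated drive waveform; the switching frequency, burst frequency, and duty ratio are assumed values, not those of the paper.

```python
# Minimal sketch of burst dimming: a high-frequency switching signal for the
# piezoelectric inverter is gated by a low-frequency rectangular dimming signal
# whose duty ratio sets the average lamp brightness. All numbers are assumed.
import numpy as np

def burst_dimmed_drive(t: np.ndarray, f_switch: float, f_burst: float, duty: float):
    """Return the gated square-wave drive (1 = switch on, 0 = off)."""
    carrier = (np.sin(2 * np.pi * f_switch * t) > 0).astype(float)   # HF switching
    gate = ((t * f_burst) % 1.0) < duty                               # LF burst PWM
    return carrier * gate

if __name__ == "__main__":
    t = np.linspace(0, 0.01, 100_000)                  # 10 ms window
    drive = burst_dimmed_drive(t, f_switch=70e3, f_burst=200.0, duty=0.4)
    # The average on-time scales with the burst duty ratio (~0.4 * 0.5 here).
    print(f"mean drive level: {drive.mean():.2f}")
```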

An Experimental Study on the Measurement of Flame Temperature and Soot Distribution of Emulsified Fuel by the Two-Color Method (2색법에 의한 에멀죤 연료의 화염온도 및 soot 분포 측정에 관한 실험적 연구)

  • Park, Jae-Wan;Park, Gwon-Ha;Heo, Gang-Yeol
    • Proceedings of the Korean Society of Combustion Conference
    • /
    • 1998.10a
    • /
    • pp.103-110
    • /
    • 1998
  • This experiment was performed to investigate the effects of emulsification on the flame temperature and soot formation in a diesel engine. The two-color method is used to measure the flame temperature for combustion of emulsified diesel fuel in a Rapid Compression and Expansion Machine (RCEM). The soot concentration is estimated via calculation of the KL factor. A solenoid valve, an electronic controller, and a needle lift sensor are used to control the exact injection timing and duration under various operating conditions. According to the results, the soot concentration is reduced with increasing W/O ratio, while the flame temperature also decreases. The pressure data and the flame images captured by a high-speed camera show that the ignition delay of the emulsified diesel increases the duration of premixed combustion. The size of the water drops was measured with a microscope to be about 10 μm.
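
The two-color method recovers the true flame temperature T and the soot KL factor from radiance measured at two wavelengths, commonly using Wien's approximation together with the Hottel-Broughton emissivity model ε(λ) = 1 − exp(−KL/λ^α). The sketch below solves the resulting pair of equations numerically from two apparent (brightness) temperatures; the wavelengths, the value of α, and the sample temperatures are illustrative assumptions, not the paper's calibration.

```python
# Sketch of the two-color pyrometry calculation: recover true flame temperature T
# and the soot KL factor from apparent (brightness) temperatures measured at two
# wavelengths, using Wien's law and the Hottel-Broughton emissivity model
#   eps(lambda) = 1 - exp(-KL / lambda**alpha).
# Wavelengths, alpha, and the sample apparent temperatures are assumed values.
import numpy as np
from scipy.optimize import brentq

C2 = 1.4388e-2      # second radiation constant [m*K]
ALPHA = 1.39        # Hottel-Broughton dispersion exponent (visible range, assumed)

def kl_factor(T, Ta, lam):
    """KL that reproduces apparent temperature Ta at wavelength lam for true T."""
    eps = np.exp((C2 / lam) * (1.0 / T - 1.0 / Ta))   # Wien approximation
    return -(lam * 1e6) ** ALPHA * np.log(1.0 - eps)  # lam expressed in micrometres

def two_color(Ta1, Ta2, lam1=550e-9, lam2=650e-9):
    """Find the true temperature at which both wavelengths give the same KL."""
    f = lambda T: kl_factor(T, Ta1, lam1) - kl_factor(T, Ta2, lam2)
    T = brentq(f, max(Ta1, Ta2) + 1.0, 4000.0)
    return T, kl_factor(T, Ta1, lam1)

if __name__ == "__main__":
    T, KL = two_color(Ta1=1900.0, Ta2=1850.0)   # apparent temperatures in kelvin
    print(f"true temperature ≈ {T:.0f} K, KL ≈ {KL:.3f}")
```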

Dynamic Visual Servoing of Robot Manipulators (로봇 메니퓰레이터의 동력학 시각서보)

  • Baek, Seung-Min;Im, Gyeong-Su;Han, Ung-Gi;Guk, Tae-Yong
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.49 no.1
    • /
    • pp.41-47
    • /
    • 2000
  • Better tracking performance can be achieved in controlling a robot manipulator when visual sensors such as CCD cameras are used than when only relative sensors such as encoders are used. However, for precise visual servoing of a robot manipulator, an expensive vision system with a fast sampling rate must be used. Moreover, even if a fast vision system is available for visual servoing, reliable performance cannot be obtained without a robust and stable inner joint servo loop. In this paper, we propose a dynamic control scheme for robot manipulators with an eye-in-hand camera configuration, in which a dynamic learning controller is designed to improve the tracking performance of the robotic system. The proposed control scheme is implemented for tasks of tracking moving objects and is shown to be robust to parameter uncertainty, disturbances, a low sampling rate, etc.
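
The paper's dynamic learning controller is not detailed in the abstract, but the underlying visual-servoing idea can be illustrated with the standard image-based visual servoing (IBVS) kinematic law, where the camera velocity is computed from image-feature errors through the pseudoinverse of the interaction matrix. The sketch below shows that generic textbook law for point features; it is explicitly not the authors' controller, and the feature coordinates, depths, and gain are made up.

```python
# Generic image-based visual servoing (IBVS) sketch for point features:
# camera velocity v = -gain * pinv(L) @ (s - s_desired), where L is the interaction
# (image Jacobian) matrix. This is the textbook law, not the paper's dynamic
# learning controller; features, depths, and gain are illustrative.
import numpy as np

def interaction_matrix(x, y, Z):
    """Interaction matrix of one normalized image point (x, y) at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_velocity(points, desired, depths, gain=0.5):
    """Stack per-point errors and Jacobians, return the 6-DOF camera velocity."""
    L = np.vstack([interaction_matrix(x, y, Z) for (x, y), Z in zip(points, depths)])
    e = (np.asarray(points) - np.asarray(desired)).reshape(-1)
    return -gain * np.linalg.pinv(L) @ e

if __name__ == "__main__":
    current = [(0.10, 0.05), (-0.08, 0.12), (0.02, -0.09), (-0.11, -0.04)]
    target  = [(0.05, 0.05), (-0.05, 0.05), (0.05, -0.05), (-0.05, -0.05)]
    v = ibvs_velocity(current, target, depths=[1.0] * 4)
    print("camera twist [vx vy vz wx wy wz]:", np.round(v, 4))
```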

A Study on Detection of Lane and Displacement of Obstacle for AGV using Vision System (비전시스템을 이용한 자율주행량의 차선내 차량의 변위 검출에 관한 연구)

  • Lee, Jin-Woo;Choi, Sung-Uk;Lee, Chang-Hoon;Lee, Yung-Jin;Lee, Kwon-Soon
    • Proceedings of the KIEE Conference
    • /
    • 2001.07d
    • /
    • pp.2202-2205
    • /
    • 2001
  • This paper is composed of two parts. One is an image preprocessing part that measures the condition of the lane and the vehicle. It finds line information using an RGB ratio cutting algorithm, edge detection, and the Hough transform. The other part obtains the situation of other vehicles using image processing and a viewport. First, the two-dimensional image information derived from the vision sensor is interpreted as three-dimensional information using the angle and position of the CCD camera. Through these processes, once the vehicle knows the driving conditions, namely the lane angle, the distance error, and the real positions of other vehicles, the reference steering angle can be calculated by the steering controller.
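
The line-finding pipeline mentioned here (color-ratio thresholding, edge detection, Hough transform) maps onto standard OpenCV calls. The code below is an illustrative pipeline on an arbitrary road image, not the authors' implementation; the RGB-ratio threshold, brightness cut, and Hough parameters are assumptions.

```python
# Illustrative lane-marking pipeline: RGB-ratio cut to keep bright, near-white pixels,
# Canny edge detection, then a probabilistic Hough transform for line segments.
import cv2
import numpy as np

def detect_lane_lines(bgr: np.ndarray):
    b, g, r = cv2.split(bgr.astype(np.float32) + 1.0)
    # Simple RGB-ratio cut: keep pixels whose channels are nearly balanced and bright.
    ratio_ok = (np.abs(r / g - 1.0) < 0.2) & (np.abs(b / g - 1.0) < 0.2)
    bright = bgr.mean(axis=2) > 150
    mask = (ratio_ok & bright).astype(np.uint8) * 255

    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, 40, minLineLength=30, maxLineGap=10)
    return [] if lines is None else [l[0] for l in lines]   # each is (x1, y1, x2, y2)

if __name__ == "__main__":
    frame = cv2.imread("road.png")          # hypothetical input frame
    if frame is not None:
        for x1, y1, x2, y2 in detect_lane_lines(frame):
            cv2.line(frame, (int(x1), int(y1)), (int(x2), int(y2)), (0, 0, 255), 2)
        cv2.imwrite("road_lanes.png", frame)
```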

Development of an Image Processing System for Classifying the Pig's Thermoregulatory Behavior (돼지의 체온 조절 행동 분류를 위한 영상처리 시스템 개발)

  • 장홍희;장동일;임영일;임정택
    • Journal of Animal Environmental Science
    • /
    • v.5 no.3
    • /
    • pp.139-148
    • /
    • 1999
  • This study was conducted to develop an image processing system that can classify the pigs' thermoregulatory behavior under different environmental conditions. Four 25 kg pigs were housed in an environmentally controlled chamber (1.4 m × 2.2 m floor space). The postural behavior of the pigs was captured with a CCD color camera. The raw behavioral images were processed by thresholding, reduction, separation of slightly contacting pigs, labeling, noise removal, computation of the number of labels, and classification of the pigs' behavior. The correct classification rate of the image processing system was 97.8% (88 out of 90 test images). The results of this study showed that the image processing system could be used for a behavior-based automatic environmental controller.
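
The processing chain in this abstract (thresholding, labeling, counting regions) corresponds closely to connected-component analysis. The sketch below illustrates that chain with a simple label-count rule for huddling versus spreading; the threshold choice, minimum blob area, and decision rule are assumptions, not the authors' algorithm.

```python
# Sketch of the posture-image chain: threshold the pigs against the floor, discard
# small noise blobs, count the remaining connected components, and use the count as
# a crude thermoregulatory-behavior cue (huddled pigs merge into fewer blobs).
import cv2
import numpy as np

def classify_thermo_behavior(gray: np.ndarray, n_pigs: int = 4, min_area: int = 500):
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(binary)
    # Label 0 is the background; keep components large enough to be a pig (or a group).
    blobs = [i for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area]
    if len(blobs) >= n_pigs:
        return "spread_out"     # all pigs separated: likely warm conditions
    if len(blobs) <= max(1, n_pigs // 2):
        return "huddling"       # pigs merged into few blobs: likely cold conditions
    return "intermediate"

if __name__ == "__main__":
    img = cv2.imread("pen.png", cv2.IMREAD_GRAYSCALE)   # hypothetical test image
    if img is not None:
        print(classify_thermo_behavior(img))
```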

Representing Navigation Information on Real-time Video in Visual Car Navigation System

  • Joo, In-Hak;Lee, Seung-Yong;Cho, Seong-Ik
    • Korean Journal of Remote Sensing
    • /
    • v.23 no.5
    • /
    • pp.365-373
    • /
    • 2007
  • The car navigation system is a key application in geographic information systems and telematics. A recent trend in car navigation systems is the use of real video captured by a camera mounted on the vehicle, because video has more power to represent the real world than a conventional map. In this paper, we suggest a visual car navigation system that visually represents route guidance. It can improve drivers' understanding of the real world by capturing real-time video and displaying navigation information overlaid directly on the video. The system integrates real-time data acquisition, conventional route finding and guidance, computer vision, and augmented reality display. We also designed the visual navigation controller, which controls the other modules and dynamically determines the visual representation method of navigation information according to the current location and driving circumstances. We briefly show the implementation of the system.
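
Overlaying route guidance on live video amounts to projecting guidance geometry, known in vehicle or world coordinates, into the image with the camera's intrinsics and pose. The sketch below uses cv2.projectPoints for a made-up turn-arrow polyline; the camera matrix, pose, and arrow coordinates are all illustrative assumptions, not values from the paper.

```python
# Sketch of overlaying a turn-guidance polyline on a video frame: project 3D points
# given in the vehicle frame into the image using assumed camera intrinsics and an
# assumed camera pose, then draw the projected polyline. All numbers are illustrative.
import cv2
import numpy as np

K = np.array([[800.0, 0.0, 640.0],      # assumed pinhole intrinsics
              [0.0, 800.0, 360.0],
              [0.0, 0.0, 1.0]])
rvec = np.array([[0.1], [0.0], [0.0]])  # assumed camera pitch (Rodrigues vector)
tvec = np.array([[0.0], [1.2], [0.0]])  # assumed camera height above the road [m]

# Turn-arrow polyline in vehicle coordinates: x right, y down, z forward [m].
arrow_3d = np.array([[0.0, 0.0, 5.0], [0.0, 0.0, 15.0], [3.0, 0.0, 18.0]])

def draw_guidance(frame: np.ndarray) -> np.ndarray:
    pts_2d, _ = cv2.projectPoints(arrow_3d, rvec, tvec, K, None)
    pts = pts_2d.reshape(-1, 2).astype(int)
    for p, q in zip(pts[:-1], pts[1:]):
        cv2.line(frame, (int(p[0]), int(p[1])), (int(q[0]), int(q[1])), (0, 255, 0), 4)
    return frame

if __name__ == "__main__":
    canvas = np.zeros((720, 1280, 3), dtype=np.uint8)   # stand-in for a video frame
    cv2.imwrite("guidance_overlay.png", draw_guidance(canvas))
```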

Development of Drone Racing Simulator using SLAM Technology and Reconstruction of Simulated Environments (SLAM 기술을 활용한 가상 환경 복원 및 드론 레이싱 시뮬레이션 제작)

  • Park, Yonghee;Yu, Seunghyun;Lee, Jaegwang;Jeong, Jonghyeon;Jo, Junhyeong;Kim, Soyeon;Oh, Hyejun;Moon, Hyungpil
    • The Journal of Korea Robotics Society
    • /
    • v.16 no.3
    • /
    • pp.245-249
    • /
    • 2021
  • In this paper, we present novel simulation content for drone racing and autonomous drone flight. Using a depth camera and SLAM, we mapped a three-dimensional environment with RTAB-Map. The three-dimensional map is represented as point cloud data, which we then reconstructed in Unreal Engine. The reconstructed raw data reflects real data, including noise and outliers. We also built drone racing content such as gates and obstacles in Unreal Engine for evaluating drone flight. We then implemented both HITL and SITL simulation using AirSim, which offers a flight controller and a ROS API. Finally, we demonstrate autonomous drone flight with ROS and AirSim. Because the drone flies in a reconstruction of a real place with realistic sensor properties, it experiences realistic flight even in the simulation world. Our simulation framework is more practical than common simulations that ignore the real environment and sensors.
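
The SITL/HITL flight loop described above is exposed through AirSim's Python and ROS APIs. The sketch below shows a minimal offboard flight script against a running AirSim instance using the multirotor client; the waypoints and speed are arbitrary example values, and a racing setup would replace them with gate positions from the reconstructed map.

```python
# Minimal AirSim offboard-control sketch: connect to a running AirSim simulation,
# take off, fly a few waypoints (e.g., toward racing gates), and land.
# Waypoint coordinates and speed are arbitrary example values.
import airsim

def fly_waypoints(waypoints, speed=5.0):
    client = airsim.MultirotorClient()      # connects to the simulator's RPC server
    client.confirmConnection()
    client.enableApiControl(True)
    client.armDisarm(True)

    client.takeoffAsync().join()
    for x, y, z in waypoints:               # NED coordinates: negative z is up
        client.moveToPositionAsync(x, y, z, speed).join()

    client.landAsync().join()
    client.armDisarm(False)
    client.enableApiControl(False)

if __name__ == "__main__":
    # Hypothetical gate-like waypoints in the reconstructed environment.
    fly_waypoints([(10.0, 0.0, -3.0), (20.0, 5.0, -3.0), (30.0, 0.0, -4.0)])
```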

Development of a Robot System for Automatic De-palletizing of Parcels loaded in Rolltainer (롤테이너 적재 소포를 자동으로 디팔레타이징하기 위한 로봇 시스템 개발)

  • Kim, Donghyung;Lim, Eul Gyoon;Kim, Joong Bae
    • The Journal of Korea Robotics Society
    • /
    • v.17 no.4
    • /
    • pp.431-437
    • /
    • 2022
  • This paper presents a study on an automatic de-palletizing robot for parcels loaded in rolltainers at domestic postal distribution centers. Specifically, we propose a robot system that detects parcels loaded in a rolltainer with a 3D camera and performs de-palletizing using a collaborative robot. In addition, we developed a task flow chart for parcel de-palletizing and a method of generating a retreat motion in case of collision with the rolltainer. We then implemented the proposed methods on the robot's controller by developing a robot program. The proposed robot system was installed at the Anyang Post Distribution Center and field tests were completed. The field tests showed that the robot system has a success rate of over 90% for the de-palletizing task, and the average tact time per parcel was confirmed to be 7.3 seconds.
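
The task flow and collision-retreat behavior described in this abstract can be summarized as a small loop: detect parcels with the 3D camera, pick the next one, and switch to a retreat motion when a collision with the rolltainer is detected. The sketch below is a schematic of that flow with placeholder perception and motion functions; it is not the deployed controller code, and every function here stands in for a module the paper only names.

```python
# Schematic of the de-palletizing task flow: detect parcels with a 3D camera,
# pick the next one with the robot, and fall back to a retreat motion if a
# collision with the rolltainer is detected. All functions are placeholders.
import random

remaining = [{"id": i, "pose": (0.0, 0.2 * i, 0.5)} for i in range(5)]  # simulated stack

def detect_parcels():
    """Placeholder for 3D-camera parcel detection inside the rolltainer."""
    return list(remaining)

def pick_and_place(parcel) -> bool:
    """Placeholder pick motion; returns False when a collision is detected."""
    if random.random() < 0.2:          # simulated occasional contact with the cage
        return False
    remaining.remove(parcel)
    return True

def retreat_motion():
    """Placeholder retreat trajectory away from the rolltainer frame."""
    print("collision detected: executing retreat motion")

def depalletize():
    while True:
        parcels = detect_parcels()
        if not parcels:
            print("rolltainer empty: task complete")
            return
        target = parcels[0]            # e.g., nearest/topmost parcel first
        if pick_and_place(target):
            print(f"parcel {target['id']} de-palletized")
        else:
            retreat_motion()           # then re-detect and retry from a safe pose

if __name__ == "__main__":
    random.seed(1)
    depalletize()
```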