• Title/Summary/Keyword: onboard image processing system

Search Results: 10

An Onboard Image Processing System for Road Images (도로교통 영상처리를 위한 고속 영상처리시스템의 하드웨어 구현)

  • 이운근;이준웅;조석빈;고덕화;백광렬
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.9 no.7
    • /
    • pp.498-506
    • /
    • 2003
  • A computer vision system applied to an intelligent safety vehicle is required to run on small, real-time, special-purpose hardware rather than on a general-purpose computer. In addition, the system should remain highly reliable even under adverse road traffic conditions. This paper presents the design and implementation of an onboard hardware system for high-speed image processing to analyze road traffic scenes. The system is composed of two main parts: an early processing module based on an FPGA and a postprocessing module based on a DSP. The early processing module extracts several image primitives, such as the intensity of a gray-level image and edge attributes, in real time; in particular, the module is optimized for the Sobel edge operation. The DSP postprocessing module uses the image features from the early processing module for image understanding and analysis of a road traffic scene. The performance of the proposed system is evaluated through an experiment on lane-related information extraction. The experiment shows successful results, with a processing speed of twenty-five frames of 320$\times$240 pixels per second.
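The 3$\times$3 Sobel operation this abstract says the FPGA module is optimized for can be sketched in software. The following is a minimal NumPy reference implementation for illustration only, not the paper's hardware design; the function name and the explicit per-pixel loop (which the FPGA would replace with a pipelined convolution) are assumptions:

```python
import numpy as np

def sobel_edges(img):
    """Sobel gradient magnitude of a grayscale image (2-D float array)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal gradient kernel
    ky = kx.T                                                         # vertical gradient kernel
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    # Slide the 3x3 window over the interior pixels (borders left at zero).
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx[y, x] = np.sum(patch * kx)
            gy[y, x] = np.sum(patch * ky)
    return np.hypot(gx, gy)  # edge magnitude
```

A hardware early-processing stage would compute exactly this neighborhood operation, but one pixel per clock with line buffers instead of nested loops.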

Acquisition, Processing and Image Generation System for Camera Data Onboard Spacecraft

  • C.V.R Subbaraya Sastry;G.S Narayan Rao;N Ramakrishna;V.K Hariharan
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.3
    • /
    • pp.94-100
    • /
    • 2023
  • The primary goal of any communication spacecraft is to provide communication in a variety of frequency bands, based on mission requirements, within the Indian mainland. Some of the spacecraft operating in S-band utilize a 6 m or larger aperture Unfurlable Antenna (UFA) for S-band links and provide coverage through five or more S-band spot beams over the Indian mainland. The unfurlable antenna is larger than the satellite, so it is stowed during launch. Upon reaching orbit, the antenna is deployed using motors. The deployment status of any deployment mechanism is monitored and verified through the telemetered values of micro-switch positions before, during, and after the deployment of the whole mechanism. In addition to these micro switches, an onboard camera is used to capture still images during the primary and secondary deployments of the UFA. The proposed checkout system is realized for validating the performance of the onboard camera as part of Integrated Spacecraft Testing (IST) conducted during payload checkout operations. It is designed to acquire the payload data of the onboard camera in real time, followed by archiving, processing, and image generation in near real time. This paper presents the architecture, design, and implementation features of the acquisition, processing, and image generation system for the camera onboard the spacecraft. The system can subsequently be deployed in missions where a similar requirement is envisaged.

EXTRACTION OF LANE-RELATED INFORMATION AND A REAL-TIME IMAGE PROCESSING ONBOARD SYSTEM

  • YI U. K.;LEE W.
    • International Journal of Automotive Technology
    • /
    • v.6 no.2
    • /
    • pp.171-181
    • /
    • 2005
  • The purpose of this paper is two-fold: 1) a novel algorithm for extracting lane-related information from road images is presented; 2) the design specifications of an onboard image processing unit capable of extracting lane-related information in real time are also presented. Obtaining precise information from road images requires many features, owing to the effects of noise, which eventually leads to long processing times. By exploiting an FPGA and a DSP, we solve the problem of real-time processing. Because image processing of road images relies largely on edge features, the FPGA is adopted in the hardware design, and its schematic configuration is optimized to perform 3$\times$3 Sobel edge extraction. The DSP carries out high-level image processing such as recognition, decision, and estimation. The proposed algorithm uses edge features to define an Edge Distribution Function (EDF), a histogram of edge magnitude with respect to the edge orientation angle. The EDF enables the edge-related and lane-related information to be connected. The performance of the proposed system is verified through the extraction of lane-related information. The experimental results show the robustness of the proposed algorithm and a processing speed of more than 25 frames per second, which is considered quite successful.
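The Edge Distribution Function defined in this abstract, a histogram of edge magnitude over edge orientation angle, can be sketched as follows. This is a plausible reading of the definition, not the authors' exact formulation; the bin count and the 0-180 degree orientation range are assumptions:

```python
import numpy as np

def edge_distribution_function(gx, gy, n_bins=90):
    """Histogram of edge magnitude accumulated over edge orientation angle.

    gx, gy : gradient components (2-D arrays of equal shape), e.g. Sobel outputs.
    Returns (bin_centers_deg, edf), where edf[i] is the total edge magnitude
    whose orientation falls in bin i over [0, 180) degrees.
    """
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180.0  # orientation, direction-insensitive
    edf, edges = np.histogram(ang, bins=n_bins, range=(0.0, 180.0), weights=mag)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, edf
```

Under this reading, peaks of the EDF would correspond to the dominant orientations of lane markings in the image, which is what lets edge-level and lane-level information be connected.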

A Study on Intelligent Railway Level Crossing System for Accident Prevention

  • Cho, Bong-Kwan;Jung, Jae-Il
    • International Journal of Railway
    • /
    • v.3 no.3
    • /
    • pp.106-112
    • /
    • 2010
  • Accidents at level crossings account for a large portion of train accidents and cause economic losses through train delays and operational interruptions. Various safety devices are employed to reduce accidents at level crossings, but the existing warning devices and crossing barriers are simple train-oriented protection equipment. In this paper, an intelligent railway level crossing system is proposed to prevent and reduce accidents. For the train driver's prompt action, an image of the level crossing and obstacle warning messages are continuously provided to the driver through wireless communication within the level crossing control zone. The obstacle warning messages, extracted by computer vision processing of the images captured at the level crossing, are conveyed to the train driver through message color, flickering, and warning sounds, helping the driver decide how to act. Meanwhile, for vehicle drivers' attention, the location and speed of the approaching train are provided to roadside equipment. We verified the effect of the proposed system through test installations at the Sea Train and Airport level crossings of the Yeong-dong line.


Implementation of Virtual Instrumentation based Realtime Vision Guided Autopilot System and Onboard Flight Test using Rotary UAV (가상계측기반 실시간 영상유도 자동비행 시스템 구현 및 무인 로터기를 이용한 비행시험)

  • Lee, Byoung-Jin;Yun, Suk-Chang;Lee, Young-Jae;Sung, Sang-Kyung
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.18 no.9
    • /
    • pp.878-886
    • /
    • 2012
  • This paper investigates the implementation and flight testing of a real-time vision-guided autopilot system based on a virtual instrumentation platform. A graphical design process on the virtual instrumentation platform is used throughout for image processing, communication between systems, vehicle dynamics control, and vision-coupled guidance algorithms. A significant objective of the algorithm is to achieve an environment-robust autopilot despite wind and irregular image acquisition conditions. For robust vision-guided path tracking and hovering performance, the flight path guidance logic is combined, on a multi-conditional basis, with a position estimation algorithm coupled with the vehicle attitude dynamics. An onboard flight test with the developed real-time vision-guided autopilot system is conducted using a rotary UAV system with full attitude control capability. Outdoor flight tests demonstrated that the designed vision-guided autopilot system succeeded in hovering the UAV above a ground target to within several meters under generally windy conditions.

Visual Target Tracking and Relative Navigation for Unmanned Aerial Vehicles in a GPS-Denied Environment

  • Kim, Youngjoo;Jung, Wooyoung;Bang, Hyochoong
    • International Journal of Aeronautical and Space Sciences
    • /
    • v.15 no.3
    • /
    • pp.258-266
    • /
    • 2014
  • We present a system for the real-time visual relative navigation of a fixed-wing unmanned aerial vehicle in a GPS-denied environment. An extended Kalman filter is used to construct a vision-aided navigation system by fusing the image processing results with barometer and inertial sensor measurements. Using a mean-shift object tracking algorithm, an onboard vision system provides pixel measurements to the navigation filter. The filter is slightly modified to deal with delayed measurements from the vision system. The image processing algorithm and the navigation filter are verified by flight tests. The results show that the proposed aerial system is able to maintain circling around a target without using GPS data.
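The mean-shift object tracking this abstract relies on for pixel measurements can be sketched as a window that repeatedly moves to the weighted centroid of a likelihood map. The following is a simplified illustration under assumed inputs (a precomputed weight image such as a histogram back-projection of the target model), not the paper's implementation:

```python
import numpy as np

def mean_shift(weights, cx, cy, half_w, half_h, n_iter=20, eps=0.5):
    """Shift a window to the weighted centroid of `weights` until convergence.

    weights : 2-D array, e.g. a histogram back-projection of the target model.
    (cx, cy): initial window center; half_w, half_h: window half-sizes in pixels.
    Returns the converged window center (cx, cy).
    """
    h, w = weights.shape
    for _ in range(n_iter):
        # Clip the window to the image bounds.
        x0, x1 = max(0, int(cx) - half_w), min(w, int(cx) + half_w + 1)
        y0, y1 = max(0, int(cy) - half_h), min(h, int(cy) + half_h + 1)
        win = weights[y0:y1, x0:x1]
        m = win.sum()
        if m == 0:
            break  # no target evidence inside the window
        ys, xs = np.mgrid[y0:y1, x0:x1]
        nx, ny = (xs * win).sum() / m, (ys * win).sum() / m  # weighted centroid
        if np.hypot(nx - cx, ny - cy) < eps:
            cx, cy = nx, ny
            break  # converged
        cx, cy = nx, ny
    return cx, cy
```

The converged window center is the kind of pixel measurement that would then be fed, possibly with delay, to the navigation filter.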

Onboard Active Vision Based Hovering Control for Quadcopter in Indoor Environments (실내 환경에서의 능동카메라 기반 쿼더콥터의 호버링 제어)

  • Jin, Tae-Seok
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.20 no.1
    • /
    • pp.19-26
    • /
    • 2017
  • In this paper, we describe the design and performance of a UAV system aimed at compact and fully autonomous quadrotors that can carry out logistics applications, rescue work, inspection tours, and remote sensing without external assistance systems such as ground station computers, high-performance wireless communication devices, or motion capture systems. We propose a high-speed hovering flight height control method based on state feedback control, using image information from an active camera together with a multirate observer, since the image information is available only every 30 ms. Finally, we show the advantages of the proposed method through simulations and experiments.

Implementation of Automatic Target Tracking System for Multirotor UAVs Using Velocity Command Based PID controller (속도 명령 기반 PID 제어기를 이용한 멀티로터 무인항공기의 표적 자동 추종 시스템 구현)

  • Jeong, Hyeon-Do;Ko, Seon-Jae;Choi, Byoung-Jo
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.13 no.6
    • /
    • pp.321-328
    • /
    • 2018
  • This paper presents an automatic target tracking flight system using a PID controller based on the velocity commands of a multirotor UAV. The automatic flight system includes marker-based onboard target detection and automatic velocity command generation that replaces the manual controller. A quadrotor UAV is equipped with a camera and an image processing computer to detect the marker in real time and estimate the relative distance to the target. The marker tracking system consists of a PID controller and generates velocity commands based on the relative distance. The generated velocity commands are used as inputs to the UAV's original flight controller. The operation of the proposed system was verified through actual flight tests in which a quadrotor UAV tracked a marker mounted on top of a moving vehicle, successfully demonstrating its capability.

A Real-Time Onboard image Processing System for Lane Departure Warning (차선이탈 경보시스템을 위한 실시간 영상처리 하드웨어 구현)

  • Yi, Un-Kun
    • Proceedings of the KIEE Conference
    • /
    • 2004.07d
    • /
    • pp.2507-2509
    • /
    • 2004
  • Developing application systems that adopt vision sensors in intelligent safety vehicles ultimately requires two efforts: processing the large volume of image data in real time to meet the system's control objectives, and miniaturizing the image processing system so that it can easily be mounted in a vehicle, whose environment is harsh compared with a static indoor setting. The real-time image processing hardware implemented in this paper uses an architecture in which repetitive preprocessing, such as edge operations, is handled by an FPGA, while higher-level image processing is performed on a RISC processor. The implemented hardware was evaluated by applying an edge-based lane information extraction and lane departure warning algorithm, and it achieved a processing speed of more than 25 frames per second, a successful result.


DEVELOPMENT OF GOCI/COMS DATA PROCESSING SYSTEM

  • Ahn, Yu-Hwan;Shanmugam, Palanisamy;Han, Hee-Jeong;Ryu, Joo-Hyung
    • Proceedings of the KSRS Conference
    • /
    • v.1
    • /
    • pp.90-93
    • /
    • 2006
  • The first Geostationary Ocean Color Imager (GOCI), onboard the Communication, Ocean and Meteorological Satellite (COMS), is scheduled for launch in 2008. GOCI includes eight visible-to-near-infrared (NIR) bands, 0.5 km pixel resolution, and a coverage region of 2500${\times}$2500 km centered at 36N and 130E. The objectives of GOCI have been broadened to understanding the role of the oceans and ocean productivity in the climate system, biogeochemical variables, and the geological and biological response to physical dynamics, and to detecting and monitoring toxic algal blooms of notable extent through observations of ocean color. A special feature of GOCI is that, like MODIS, MERIS, and GLI, it will include the band triplet 660-680-745 for measuring the sun-induced chlorophyll-a fluorescence signal from the ocean. GOCI will provide SeaWiFS-quality observations, with images acquired eight times during daytime and twice during nighttime. With all these features, GOCI is considered a remote sensing tool with great potential to contribute to a better understanding of coastal ocean ecosystem dynamics and processes by addressing environmental features in a multidisciplinary way. To achieve the objectives of the GOCI mission, we are developing the GOCI Data Processing System (GDPS), which integrates all the basic and advanced techniques necessary to process GOCI data and deliver the desired biological and geophysical products to its user community. Several useful ocean parameters estimated by the in-water and other optical algorithms included in the GDPS will be used for monitoring the ocean environment of Korea and neighbouring countries, and as input to models for climate change prediction.
