• Title/Abstract/Keyword: Vision Systems


모노비전과 퍼지규칙을 이용한 이동로봇의 경로계획과 장애물회피 (Obstacle Avoidance and Path Planning for a Mobile Robot Using Single Vision System and Fuzzy Rule)

  • 배봉규;이원창;강근택
    • 한국지능시스템학회:학술대회논문집 / 한국퍼지및지능시스템학회 2000년도 추계학술대회 학술발표 논문집 / pp.274-277 / 2000
  • In this paper we propose new path-planning and obstacle-avoidance algorithms for an autonomous mobile robot equipped with a single vision system. Distance variation is incorporated into the path planning so that the robot approaches the target point while avoiding obstacles. Fuzzy rules are applied to both trajectory planning and obstacle avoidance to improve the autonomy of the mobile robot. Computer simulations show that the proposed algorithms work well.
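
A minimal sketch of how a fuzzy steering rule of this kind can be written, with a hypothetical rule base and membership breakpoints (the paper's actual rules are not given in the abstract):

```python
# Fuzzy-rule steering sketch: inputs are distance to the nearest obstacle and
# heading error to the target; output is a steering correction obtained by
# weighted-average defuzzification. All breakpoints/outputs are assumptions.

def falling(x, a, b):
    """Left-shoulder membership: 1 below a, 0 above b, linear in between."""
    if x <= a:
        return 1.0
    if x >= b:
        return 0.0
    return (b - x) / (b - a)

def rising(x, a, b):
    """Right-shoulder membership: 0 below a, 1 above b."""
    return 1.0 - falling(x, a, b)

def steering_command(obstacle_dist, heading_err):
    # Fuzzify inputs (metres and radians).
    near = falling(obstacle_dist, 0.5, 2.0)
    far = rising(obstacle_dist, 0.5, 2.0)
    target_left = falling(heading_err, -0.8, 0.0)
    target_right = rising(heading_err, 0.0, 0.8)

    # Rule base: (firing strength, steering output in rad).
    rules = [
        (min(far, target_left), -0.4),   # path clear, turn toward target on the left
        (min(far, target_right), 0.4),   # path clear, turn toward target on the right
        (near, 0.8),                     # obstacle near: swerve away from it
    ]

    num = sum(w * u for w, u in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0.0 else 0.0

print(steering_command(obstacle_dist=0.8, heading_err=-0.3))
```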


차체 부품 누락 방지를 위한 자동검사 시스템 개발 (A Development Auto Inspection System for Prevent an Omission of Motor Body Units)

  • 이용중;이형우;김기대
    • 한국공작기계학회:학술대회논문집 / 한국공작기계학회 2002년도 춘계학술대회 논문집 / pp.146-148 / 2002
  • An automatic inspection vision system was developed for an industrial application: the motor rear side member. The system is connected to nine cameras that inspect the bolts, nuts, and other units on the rear side member product during the process. It can detect missing bolts, nuts, units, etc. quickly and accurately in real time, and performs sophisticated inspections that human workers cannot.
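
A minimal sketch of a presence check of this kind, assuming a fixed region of interest per camera and OpenCV template matching (the function, ROI, thresholds, and file names are illustrative, not the authors' implementation):

```python
# Per-camera presence check: compare a reference template of the part against
# a fixed window of the camera image; a low match score is reported as missing.
import cv2

def part_present(frame_gray, template_gray, roi, threshold=0.7):
    """roi = (x, y, w, h): window where the part should appear (assumed layout)."""
    x, y, w, h = roi
    window = frame_gray[y:y + h, x:x + w]
    result = cv2.matchTemplate(window, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, _ = cv2.minMaxLoc(result)
    return max_val >= threshold

# Hypothetical usage for one of the nine inspection cameras.
frame = cv2.imread("camera_3.png", cv2.IMREAD_GRAYSCALE)
bolt_template = cv2.imread("bolt_template.png", cv2.IMREAD_GRAYSCALE)
if not part_present(frame, bolt_template, roi=(120, 80, 200, 200)):
    print("camera 3: bolt missing")
```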


실시간 처리를 위한 타이어 자동 선별 비젼 시스템 (The automatic tire classfying vision system for real time processing)

  • 박귀태;김진헌;정순원;송승철
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 1992년도 한국자동제어학술회의논문집(국내학술편); KOEX, Seoul; 19-21 Oct. 1992 / pp.358-363 / 1992
  • The tire manufacturing process demands classification of tire types when tires are transferred between internal processes. Although most processes are well automated, the classification still relies largely on human visual inspection, which has been an obstacle to factory automation in tire manufacturing companies. This paper proposes an effective vision system that can be applied to the tire classification process in real time. The system adopts a parallel architecture using multiple transputers and contains preprocessing algorithms for character recognition. The system is easily expandable to handle large volumes of data that can be processed separately.
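
A minimal sketch of a preprocessing step for such character recognition, assuming simple binarization and blob extraction with OpenCV (the paper's transputer-parallel algorithms are not reproduced here):

```python
# Extract character-sized blobs from a tire sidewall image as candidates for a
# recognizer. Otsu thresholding is used as a simple binarization stand-in;
# real tire markings are low-contrast and may need extra contrast enhancement.
import cv2

def character_candidates(gray, min_area=50, max_area=5000):
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    boxes = [cv2.boundingRect(c) for c in contours
             if min_area <= cv2.contourArea(c) <= max_area]
    return sorted(boxes, key=lambda b: b[0])   # left-to-right reading order

# Hypothetical input image.
gray = cv2.imread("tire_sidewall.png", cv2.IMREAD_GRAYSCALE)
print(character_candidates(gray))
```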


Evaluation of Defects in the Bonded Area of Shoes using an Infrared Thermal Vision Camera

  • Kim, Jae-Yeol;Yang, Dong-Jo;Kim, Chang-Hyun
    • International Journal of Control, Automation, and Systems / Vol. 1, No. 4 / pp.511-514 / 2003
  • An infrared camera detects only the infrared radiation emitted from an object in order to visualize its temperature distribution, and infrared diagnosis systems can be applied to various fields. In a total inspection system for special-purpose shoes, defect discrimination can be automated or mechanized. This study introduces a method for nondestructive total inspection of special-purpose shoes; the performance of the proposed method is demonstrated through thermal images.
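
A minimal sketch of one way such defect discrimination could be automated, assuming bonding defects appear as temperature outliers inside the bonded area (the statistical threshold is chosen here for illustration, not taken from the paper):

```python
# Flag pixels of the bonded area whose temperature deviates strongly from the
# area's mean temperature, which on a thermal image can indicate poor bonding.
import numpy as np

def defect_mask(thermal, bond_mask, k=3.0):
    """thermal: 2D array of temperatures; bond_mask: boolean mask of the bonded area."""
    region = thermal[bond_mask]
    mu, sigma = region.mean(), region.std()
    mask = np.zeros_like(bond_mask, dtype=bool)
    mask[bond_mask] = np.abs(thermal[bond_mask] - mu) > k * sigma
    return mask
```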

Vision Based Traffic Data Collection in Intelligent Transportation Systems

  • Mei Yu;Kim, Yong-Deak
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 2000년도 ITC-CSCC -2 / pp.773-776 / 2000
  • Traffic monitoring plays an important role in intelligent transportation systems and can be used to collect real-time data on traffic flow. Passive shadows cast by roadside buildings or trees and active shadows cast by moving vehicles are among the factors that cause errors in vision-based vehicle detection. In this paper, a landmark-based method is proposed for vehicle detection and shadow rejection, and vehicle counts are finally obtained from the landmark detection results.
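
A minimal sketch of the landmark-occlusion idea under an assumed implementation: normalized correlation between a reference patch of the road mark and the live patch tolerates shadows (which darken but largely preserve the pattern), while a vehicle covering the mark destroys it:

```python
# Count vehicles by detecting onsets of landmark occlusion. The correlation
# threshold and the shadow-invariance assumption are illustrative only.
import numpy as np

def normalized_correlation(a, b):
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def count_vehicles(patches, reference, threshold=0.5):
    count, occluded = 0, False
    for patch in patches:            # one landmark patch per video frame
        now_occluded = normalized_correlation(patch, reference) < threshold
        if now_occluded and not occluded:
            count += 1               # rising edge: a vehicle starts covering the mark
        occluded = now_occluded
    return count
```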


컴퓨터비전을 이용한 손동작 인식에 관한 연구 (A Study on Hand Gesture Recognition using Computer Vision)

  • 박창민
    • 경영과정보연구 / Vol. 4 / pp.395-407 / 2000
  • It is necessary to develop a method by which humans and computers can interact through hand gestures without any special device. In this study, a real-time hand gesture recognition system was developed. Using computer vision, the system segments the hand region, recognizes the hand posture, and tracks the movement of the hand. It does not require a blue-screen background, a data glove, or special markers to recognize hand gestures.
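
A minimal sketch of markerless hand segmentation, assuming skin-color thresholding in YCrCb followed by largest-blob selection (a common approach; the abstract does not specify the paper's exact segmentation method):

```python
# Segment the hand as the largest skin-colored region, without a blue screen,
# glove, or markers. The YCrCb bounds are typical values and camera-dependent.
import cv2
import numpy as np

def segment_hand(frame_bgr):
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    return cv2.boundingRect(hand)   # (x, y, w, h) of the hand region
```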


이동로봇의 자율주행을 위한 다중센서융합기반의 지도작성 및 위치추정 (Map-Building and Position Estimation based on Multi-Sensor Fusion for Mobile Robot Navigation in an Unknown Environment)

  • 진태석;이민중;이장명
    • 제어로봇시스템학회논문지 / Vol. 13, No. 5 / pp.434-443 / 2007
  • Presently, the exploration of an unknown environment is an important task for the new generation of mobile service robots, which are navigated by a number of methods using sensing systems such as sonar or vision. To fully exploit the strengths of both the sonar and vision systems, this paper presents a technique for localizing a mobile robot using fused data from multiple ultrasonic sensors and a vision system. The mobile robot is designed to operate in a well-structured environment that can be represented by structural features such as planes, edges, corners, and cylinders. For the ultrasonic sensors, these features provide range information in the form of circular arcs, generally called RCDs (Regions of Constant Depth). Localization is the continual provision of knowledge of position, deduced from the robot's a priori position estimate. The environment of the robot is modeled as a two-dimensional grid map. We define a vision-based environment recognition method and a physically based sonar sensor model, and employ an extended Kalman filter to estimate the position of the robot. The performance and simplicity of the approach are demonstrated with results from sets of experiments using a mobile robot.
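
A minimal sketch of a single EKF predict/update step for a planar robot pose, using a range measurement to a known landmark as a stand-in for the paper's sonar RCD and vision features (state, models, and noise values are illustrative):

```python
# One extended Kalman filter step: predict the pose (x, y, theta) with a
# unicycle odometry model, then correct it with a range measurement to a
# landmark at a known map position.
import numpy as np

def ekf_step(x, P, v, w, dt, z_range, landmark, Q, R):
    # --- predict ---
    px, py, th = x
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0, 1]])
    P_pred = F @ P @ F.T + Q

    # --- update with range to the landmark ---
    dx, dy = landmark[0] - x_pred[0], landmark[1] - x_pred[1]
    r = np.hypot(dx, dy)
    H = np.array([[-dx / r, -dy / r, 0.0]])      # Jacobian of the range model
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + (K @ np.array([[z_range - r]])).ravel()
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

# Illustrative values only.
x, P = np.array([0.0, 0.0, 0.0]), np.eye(3) * 0.1
Q, R = np.eye(3) * 1e-3, np.array([[0.05]])
x, P = ekf_step(x, P, v=0.2, w=0.1, dt=0.1, z_range=4.9,
                landmark=(5.0, 0.0), Q=Q, R=R)
print(x)
```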

Aerial Object Detection and Tracking based on Fusion of Vision and Lidar Sensors using Kalman Filter for UAV

  • Park, Cheonman;Lee, Seongbong;Kim, Hyeji;Lee, Dongjin
    • International journal of advanced smart convergence / Vol. 9, No. 3 / pp.232-238 / 2020
  • In this paper, we study an aerial object detection and position estimation algorithm for the safety of UAVs flying beyond visual line of sight (BVLOS). We use a vision sensor and LiDAR to detect objects: a CNN-based YOLOv2 architecture detects objects in the 2D image, and a clustering method detects objects in the point cloud data acquired from the LiDAR. When a single sensor is used, the detection rate can degrade in certain situations depending on the characteristics of that sensor, so the detection accuracy must be complemented when the single-sensor result is absent or false. To do so, we use a Kalman filter and fuse the results of the individual sensors to improve detection accuracy. We estimate the 3D position of the object from the pixel position of the object and the distance measured by the LiDAR. We verified the performance of the proposed fusion algorithm through simulations using the Gazebo simulator.
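
A minimal sketch of the pixel-plus-range back-projection step, assuming a pinhole camera model with known intrinsics K (the matrix values below are illustrative, not the authors' calibration):

```python
# Combine a detected pixel and a LiDAR range into a 3D point: the pixel defines
# a ray through the camera intrinsics K, and the range scales that ray.
import numpy as np

def pixel_and_range_to_3d(u, v, lidar_range, K):
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing direction, up to scale
    ray /= np.linalg.norm(ray)
    return lidar_range * ray                        # 3D point in the camera frame

# Hypothetical intrinsics and detection.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
print(pixel_and_range_to_3d(350, 230, lidar_range=42.0, K=K))
```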

Guidance Law for Vision-Based Automatic Landing of UAV

  • Min, Byoung-Mun;Tahk, Min-Jea;Shim, Hyun-Chul David;Bang, Hyo-Choong
    • International Journal of Aeronautical and Space Sciences / Vol. 8, No. 1 / pp.46-53 / 2007
  • In this paper, a guidance law for vision-based automatic landing of unmanned aerial vehicles (UAVs) is proposed. Automatic landing is a challenging but crucial capability for UAVs to achieve fully autonomous flight. In an autonomous landing maneuver, deciding where to land and generating the guidance commands to achieve a successful landing are significant problems. This paper focuses on the design of a guidance law applicable to the automatic landing problem of both fixed-wing and rotary-wing UAVs. The proposed guidance law generates an acceleration command as the control input, derived from a specified time-to-go ($t_{go}$) polynomial function. The coefficients of the $t_{go}$ polynomial are determined to satisfy the terminal constraints. Nonlinear simulation results using fixed-wing and rotary-wing UAV models are presented.
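
A minimal sketch of a time-to-go polynomial acceleration command, using the classical minimum-effort form with terminal position and velocity constraints (shown per axis for illustration; the paper's coefficients and constraints may differ):

```python
# Acceleration command that, if held along the minimum-effort trajectory,
# reaches the desired terminal position and velocity as t_go goes to zero:
#   a = 6*(p_f - p - v*t_go)/t_go**2 - 2*(v_f - v)/t_go

def accel_command(p, v, p_f, v_f, t_go):
    """p, v: current position/velocity; p_f, v_f: desired terminal values."""
    dp = p_f - p - v * t_go            # position error if no further control is applied
    dv = v_f - v                       # terminal velocity error
    return 6.0 * dp / t_go**2 - 2.0 * dv / t_go

# Hypothetical vertical-channel example: descend from 50 m altitude with a
# gentle -0.5 m/s touchdown sink rate and 20 s to go.
print(accel_command(p=50.0, v=-3.0, p_f=0.0, v_f=-0.5, t_go=20.0))
```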