• Title/Abstract/Keyword: a vision system

Search results: 3,173

Development of a System to Measure Quality of Cut Flowers of Rose and Chrysanthemum Using Machine Vision

  • 서상룡;최승묵;조남홍;박종률
    • Journal of Biosystems Engineering
    • /
    • Vol. 28 No. 3
    • /
    • pp.231-238
    • /
    • 2003
  • Rose and chrysanthemum are the most popular flowers in Korean floriculture. Sorting cut flowers is a labor-intensive operation in their cultivation and needs to be mechanized, and machine vision is one of the promising solutions for this purpose. This study was carried out to develop the hardware and software of a cut-flower sorting system using machine vision and to test its performance. The results are summarized as follows: 1. Flower length measured by the machine vision system correlated well with actual length, with coefficients of determination (R²) of 0.9948 and 0.9993 for rose and chrysanthemum respectively; average measurement errors were about 2% and 1% of the shortest sample length. These results showed that the machine vision system could be used successfully to measure the length of cut flowers. 2. Stem diameter measured by the system correlated with actual diameter at R² of 0.8429 and 0.9380 for rose and chrysanthemum respectively, with average errors of about 15% and 7.5% of the smallest sample diameter, which could be a serious source of error in grading; this error rate should be considered when setting the grading conditions for each class of cut flowers. 3. Bud maturity judged by the system for 20 flowers of each species coincided with the inspectors' judgements in 80%∼85% of cases for rose and 85%∼90% for chrysanthemum; performance could be improved by establishing more precise maturity criteria with more flower samples. 4. Flower quality judged by stem curvature coincided with the inspectors' judgements for 90% of good and 85% of bad flowers of both rose and chrysanthemum, a level of coincidence considered acceptable for judging flower quality by stem curvature.
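The length measurement described above comes down to finding the extreme foreground pixels of a segmented flower silhouette and applying a pixel-to-millimetre scale. The sketch below illustrates only that idea; the toy mask, the scale factor, and the segmentation itself are illustrative assumptions, not the paper's actual calibration.

```python
# Toy length measurement from a binary silhouette mask.
# mm_per_px is a hypothetical camera calibration constant.

def flower_length_mm(mask, mm_per_px):
    """Return stem-tip-to-bud length from a binary mask (list of rows)."""
    rows = [r for r, row in enumerate(mask) if any(row)]  # rows with flower pixels
    if not rows:
        return 0.0
    return (max(rows) - min(rows) + 1) * mm_per_px

# Toy 6x4 mask: the flower occupies rows 1..4, i.e. 4 pixels tall.
mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 1, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 0],
]
length = flower_length_mm(mask, mm_per_px=2.5)  # 4 px * 2.5 mm/px = 10.0 mm
```

A real system would first segment the flower from the background and correct for perspective, steps omitted here.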

SLAM Aided GPS/INS/Vision Navigation System for Helicopter

  • 김재형;유준;곽휘권
    • Journal of Institute of Control, Robotics and Systems
    • /
    • Vol. 14 No. 8
    • /
    • pp.745-751
    • /
    • 2008
  • This paper presents a framework for a GPS/INS/Vision based navigation system for helicopters. A coupled GPS/INS algorithm has weak points such as GPS blockage and jamming, and a helicopter is a fast, highly dynamic vehicle that is prone to losing the GPS signal. A vision sensor, in contrast, is not affected by signal jamming and does not accumulate navigation error. We therefore implemented a GPS/INS/Vision aided navigation system providing the robust localization suitable for helicopters operating in various environments. The core algorithm is a vision-based simultaneous localization and mapping (SLAM) technique. To verify the SLAM algorithm, we performed flight tests, which confirmed that the developed system is robust enough under GPS blockage. The system design, software algorithm, and flight test results are described.
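At the heart of any GPS/INS/Vision fusion scheme like the one above is a recursive filter that corrects a drifting inertial prediction with an external measurement. A minimal one-dimensional Kalman update, with made-up noise values, sketches that correction step; the paper's SLAM filter is of course far richer (full state vector, landmark map, nonlinear models).

```python
# One scalar Kalman measurement update: a vision- or GPS-derived position
# fix z (variance r) corrects the INS-predicted position x_pred (variance p_pred).
def kalman_update(x_pred, p_pred, z, r):
    k = p_pred / (p_pred + r)       # Kalman gain: trust ratio prediction vs. measurement
    x = x_pred + k * (z - x_pred)   # corrected state estimate
    p = (1.0 - k) * p_pred          # covariance shrinks after the update
    return x, p

# INS predicts 10.0 m with variance 4.0; the vision fix says 11.0 m (variance 1.0).
x, p = kalman_update(10.0, 4.0, 11.0, 1.0)  # -> x = 10.8, p = 0.8
```

The corrected estimate lands closer to the more confident measurement, which is exactly how a vision fix reins in accumulated INS drift.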

Development of a Computer Vision System to Measure Low Flow Rate of Solid Particles

  • 이경환;서상룡;문정기
    • Journal of Biosystems Engineering
    • /
    • Vol. 23 No. 5
    • /
    • pp.481-490
    • /
    • 1998
  • A computer vision system to measure the low flow rate of solid particles was developed and its performance was tested with seven kinds of seeds and tubers of various sizes: perilla, mung bean, paddy, small red bean, black soybean, Cuba bean, and small potato tuber. The test was performed for two types of particle flow, continuous and discontinuous. For the continuous flow, tested with perilla, mung bean, and paddy, the correlation coefficients between the flow rates measured by the computer vision system and by the direct method were about 0.98, and the average errors of the computer vision measurement were in the range of 6∼9%. For the discontinuous flow, tested with small red bean, black soybean, Cuba bean, and small potato tuber, the correlation coefficients were 0.98∼0.99 and the average errors were in the range of 5∼10%. The performance of the computer vision system was also compared with that of a conventional optical particle-counting sensor for discontinuous flow. The comparison, done with black soybean, Cuba bean, and small potato tuber, showed that the computer vision system is considerably more precise than the optical sensor.
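For the discontinuous flow case above, the measurement reduces to counting particles per video frame and converting the count to a mass flow rate. The toy below shows only that conversion; the per-seed mass and frame rate are made-up illustrative numbers, and the actual per-frame counting (segmentation, blob detection) is not shown.

```python
# Convert per-frame particle counts into a mass flow rate (g/s).
# grams_per_particle and fps are hypothetical calibration values.
def flow_rate_g_per_s(counts_per_frame, grams_per_particle, fps):
    total = sum(counts_per_frame)            # particles seen over the clip
    seconds = len(counts_per_frame) / fps    # clip duration
    return total * grams_per_particle / seconds

# 5 frames at 10 fps (0.5 s), 8 seeds counted, 0.2 g per seed.
rate = flow_rate_g_per_s([2, 3, 0, 1, 2], grams_per_particle=0.2, fps=10.0)
# -> 8 * 0.2 g / 0.5 s = 3.2 g/s
```

The direct method the paper compares against would weigh the collected particles over the same interval.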


Implementation of a Stereo Vision Using Saliency Map Method

  • Choi, Hyeung-Sik;Kim, Hwan-Sung;Shin, Hee-Young;Lee, Min-Ho
    • Journal of Advanced Marine Engineering and Technology
    • /
    • Vol. 36 No. 5
    • /
    • pp.674-682
    • /
    • 2012
  • A new intelligent stereo vision sensor system was studied for the motion and depth control of unmanned vehicles. A new bottom-up saliency map model for a human-like active stereo vision system, based on the biological visual process, was developed to select a target object. When the left and right cameras successfully find the same target object, the implemented active vision system with two cameras focuses on a landmark and can detect depth and direction information. Using this information, the unmanned vehicle can approach the target autonomously. A number of tests of the proposed bottom-up saliency map were performed, and their results are presented.
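The core of a bottom-up saliency map is local contrast: a region pops out when it differs from its surround. The crude center-surround sketch below conveys only that principle on a toy intensity grid; the paper's biologically inspired model (multi-scale feature channels, normalization) is far more elaborate.

```python
# Crude center-surround contrast: a pixel's saliency is its absolute
# difference from the mean of its 3x3 neighbourhood. Toy grid only.
def saliency(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for r in range(h):
        for c in range(w):
            nbrs = [img[rr][cc]
                    for rr in range(max(0, r - 1), min(h, r + 2))
                    for cc in range(max(0, c - 1), min(w, c + 2))
                    if (rr, cc) != (r, c)]
            out[r][c] = abs(img[r][c] - sum(nbrs) / len(nbrs))
    return out

img = [[0, 0, 0],
       [0, 9, 0],
       [0, 0, 0]]
s = saliency(img)  # the bright centre pixel dominates the saliency map
```

In the stereo setting, the peak of each camera's saliency map would nominate the common target to fixate on.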

A vision-based system for long-distance remote monitoring of dynamic displacement: experimental verification on a supertall structure

  • Ni, Yi-Qing;Wang, You-Wu;Liao, Wei-Yang;Chen, Wei-Huan
    • Smart Structures and Systems
    • /
    • Vol. 24 No. 6
    • /
    • pp.769-781
    • /
    • 2019
  • Dynamic displacement response of civil structures is an important index for in-construction and in-service structural condition assessment. However, accurately measuring the displacement of large-scale civil structures such as high-rise buildings remains a challenging task. To cope with this problem, a vision-based system using an industrial digital camera and image processing has been developed for long-distance, remote, and real-time monitoring of the dynamic displacement of supertall structures. Instead of acquiring full image signals, the proposed system traces only the coordinates of the target points, enabling real-time monitoring and display of displacement responses at a relatively high sampling rate. This study addresses the in-situ experimental verification of the developed vision-based system on the 600 m high Canton Tower. To facilitate the verification, a GPS system is used to calibrate/verify the structural displacement responses measured by the vision-based system, while an accelerometer deployed in the vicinity of the target point provides frequency-domain information for comparison. Special attention is given to understanding the influence of the surrounding light on the monitoring results; for this purpose, the experimental tests are conducted in daytime and nighttime by placing the vision-based system outside the tower (a bright environment) and inside the tower (a dark environment), respectively. The results indicate that the displacement response time histories monitored by the vision-based system not only match well with those acquired by the GPS receiver, but also have higher fidelity and are less noise-corrupted. In addition, the low-order modal frequencies of the building identified from the vision-based data are in good agreement with those obtained from the accelerometer, the GPS receiver, and an elaborate finite element model. In particular, the vision-based system placed at the bottom of the enclosed elevator shaft offers better monitoring data than the system placed outside the tower. Based on a wavelet filtering technique, the displacement response time histories obtained by the vision-based system are readily decomposed into two parts: a quasi-static component primarily resulting from temperature variation and a dynamic component mainly caused by fluctuating wind load.
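The paper performs the quasi-static/dynamic split with wavelet filtering. As a much cruder stand-in, a moving-average low-pass filter illustrates the same decomposition idea: the slow trend captures quasi-static drift, and the residual holds the dynamic (wind-induced) component.

```python
# Split a displacement record into a slow trend and a fast residual.
# A centred moving average is used here purely for illustration; the
# paper's actual method is wavelet filtering.
def decompose(signal, window):
    half = window // 2
    trend = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        trend.append(sum(signal[lo:hi]) / (hi - lo))  # local mean = quasi-static part
    dynamic = [s - t for s, t in zip(signal, trend)]  # residual = dynamic part
    return trend, dynamic

disp = [1.0, 1.2, 0.8, 1.1, 0.9]          # toy displacement samples (mm)
trend, dyn = decompose(disp, window=3)
# by construction, trend + dynamic reconstructs the original record exactly
```

Any linear low-pass/high-pass pair has this additive-reconstruction property, which is what makes the two components physically interpretable side by side.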

A Three-Degree-of-Freedom Anthropomorphic Oculomotor Simulator

  • Bang Young-Bong;Paik Jamie K.;Shin Bu-Hyun;Lee Choong-Kil
    • International Journal of Control, Automation, and Systems
    • /
    • Vol. 4 No. 2
    • /
    • pp.227-235
    • /
    • 2006
  • For a sophisticated humanoid that explores and learns its environment and interacts with humans, anthropomorphic physical behavior is much desired. The human vision system orients each eye with three degrees of freedom (3-DOF), about the horizontal, vertical, and torsional axes. Thus, in order to accurately replicate the human vision system, it is imperative to have a simulator with a 3-DOF end-effector. We present a 3-DOF anthropomorphic oculomotor system that reproduces realistic human eye movements for human-sized humanoid applications. The parallel-link architecture of the oculomotor system is sized and designed to match the performance capabilities of human vision. In this paper, a biologically inspired mechanical design and the structural kinematics of the prototype are described in detail. The motility of the prototype about each axis of rotation was replicated through computer simulation, and performance tests comparable to human eye movements were recorded.
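Of the three rotations above, torsion spins the eye about the gaze axis without moving the gaze point, so the gaze direction itself is fixed by the horizontal and vertical angles alone. A minimal sketch of that kinematic fact, with an assumed yaw-then-pitch rotation order (the prototype's actual parallel-link kinematics are more involved):

```python
import math

# Gaze direction after a horizontal (yaw) then vertical (pitch) rotation
# of the forward unit vector (1, 0, 0). Torsion is omitted because it
# does not change where the eye is pointing.
def rotate_gaze(yaw, pitch):
    x = math.cos(pitch) * math.cos(yaw)
    y = math.cos(pitch) * math.sin(yaw)
    z = math.sin(pitch)
    return (x, y, z)

g = rotate_gaze(math.radians(30), math.radians(0))  # 30° horizontal, level gaze
```

A simulator controller would invert this mapping: given a target direction, solve for the yaw/pitch commands, then add torsion per Listing's law or a chosen torsion profile.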

INS/Multi-Vision Integrated Navigation System Based on Landmark

  • 김종명;이현재
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • Vol. 45 No. 8
    • /
    • pp.671-677
    • /
    • 2017
  • This paper presents an INS/multi-vision integrated navigation system that improves the performance of integrated navigation combining an Inertial Navigation System (INS) with vision sensors. When a single vision sensor or stereo vision is used, the filter may diverge if only a small number of landmarks can be measured. To address this problem, three vision sensors are mounted at 0°, 120°, and -120° about the vehicle body, and numerical simulations verify that this configuration performs better than the single-sensor case.

A Hierarchical Motion Controller for Soccer Robots with Stand-alone Vision System

  • 이동일;김형종;김상준;장재완;최정원;이석규
    • Journal of the Korean Society for Precision Engineering
    • /
    • Vol. 19 No. 9
    • /
    • pp.133-141
    • /
    • 2002
  • In this paper, we propose a hierarchical motion controller with a stand-alone vision system to enhance the flexibility of the robot soccer system. In addition, we simplify the model of the robot's dynamic environment using a Petri net and a simple state diagram. Based on the proposed model, we designed a robot soccer system with velocity and position controllers organized in a four-level hierarchical structure. Experimental results using the vision system running separately from the host system show improved controller performance owing to the reduced processing time of the vision algorithm.

Development of Web Based Mold Discrimination System using the Matching Process for Vision Information and CAD DB

  • 최진화;전병철;조명우
    • Transactions of the Korean Society of Machine Tool Engineers
    • /
    • Vol. 15 No. 5
    • /
    • pp.37-43
    • /
    • 2006
  • The goal of this study is the development of a web-based mold discrimination system that matches vision information against a CAD database. The use of 2D vision images enables fast mold discrimination across a large database. Image processing such as preprocessing and cleaning is performed to obtain a clear image containing object information. The web-based system is built as an ActiveX control that exchanges messages between a server and a client, and the result of mold discrimination is shown in a web browser. For effective feature classification and extraction, a signature method is used to derive meaningful information from the 2D data. As a result, the feasibility of the proposed system is demonstrated by matching feature information from vision images with CAD database samples.
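A shape "signature" in the sense used above is a 1-D description of a 2-D outline, such as the distance from the shape's centroid to each boundary point. The sketch below shows that centroid-distance idea on a toy square outline; the actual mold contours, sampling scheme, and matching metric used in the paper are not reproduced here.

```python
import math

# Centroid-distance signature of a closed outline given as a list of
# (x, y) boundary points. A toy square stands in for a mold contour.
def centroid_signature(points):
    cx = sum(p[0] for p in points) / len(points)   # outline centroid x
    cy = sum(p[1] for p in points) / len(points)   # outline centroid y
    return [math.hypot(x - cx, y - cy) for x, y in points]

square = [(0, 0), (2, 0), (2, 2), (0, 2)]
sig = centroid_signature(square)  # all four corners are equidistant from the centre
```

Discrimination would then compare this 1-D signature against signatures precomputed from the CAD database, e.g. by correlation or a distance measure, which is far cheaper than comparing raw 2-D images.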

DEVELOPMENT OF A MACHINE VISION SYSTEM FOR WEED CONTROL USING PRECISION CHEMICAL APPLICATION

  • Lee, Won-Suk;David C. Slaughter;D.Ken Giles
    • Korean Society for Agricultural Machinery: Conference Proceedings
    • /
    • Proceedings of the 1996 International Conference on Agricultural Machinery Engineering, Korean Society for Agricultural Machinery
    • /
    • pp.802-811
    • /
    • 1996
  • Farmers need alternatives for weed control due to the desire to reduce the chemicals used in farming. However, conventional mechanical cultivation cannot selectively remove weeds located in the seedline between crop plants, and there are no selective herbicides for some crop/weed situations. Since hand labor is costly, an automated weed control system could be feasible. A robotic weed control system can also reduce or eliminate the need for chemicals. Currently no such system exists for removing weeds located in the seedline between crop plants. The goal of this project is to build a real-time machine vision weed control system that can detect crop and weed locations, remove weeds, and thin crop plants. To accomplish this objective, a real-time robotic system was developed to identify and locate outdoor plants using machine vision technology, pattern recognition techniques, knowledge-based decision theory, and robotics. The prototype weed control system is composed of a real-time computer vision system, a uniform illumination device, and a precision chemical application system. The prototype is mounted on the UC Davis Robotic Cultivator, which finds the center of the seedline of crop plants. Field tests showed that the robotic spraying system correctly targeted simulated weeds (metal coins of 2.54 cm diameter) with an average error of 0.78 cm and a standard deviation of 0.62 cm.
