• Title/Abstract/Keywords: Vision assistant

29 search results

컴퓨터 비전을 이용한 강의 도우미 시스템 (Teaching Assistant System using Computer Vision)

  • 김태준;박창훈;최강선
    • 실천공학교육논문지 / Vol. 5, No. 2 / pp. 109-115 / 2013
  • This paper proposes a teaching assistant system based on computer vision. When a lecturer uses a variety of lecture content, such as lecture notes and related videos, computer operations like switching between materials interrupt the flow of the lecture. In the proposed system, when the lecture computer needs to be operated during class, the lecturer draws a predefined symbol on the board; the system recognizes the symbol and performs the corresponding operation, helping the lecture proceed smoothly without interruption. To recognize arbitrary symbols, the proposed system uses a feature descriptor called shape context.
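
The abstract names shape context as the symbol descriptor but gives no implementation details. Below is a minimal sketch, assuming a binary image of the drawn symbol, of how a shape-context histogram over contour points could be computed; the file name, the 100-point sampling, and the bin counts are illustrative choices, not values from the paper.

```python
# Minimal sketch of a shape-context descriptor for a drawn symbol.
# The input image, sample count, and bin counts are illustrative assumptions.
import cv2
import numpy as np

def shape_context(points, n_r=5, n_theta=12):
    """Log-polar histogram of relative point positions, one per point."""
    n = len(points)
    diff = points[None, :, :] - points[:, None, :]      # pairwise offsets
    dist = np.linalg.norm(diff, axis=2)
    dist /= dist.mean() + 1e-9                          # scale invariance
    theta = np.arctan2(diff[..., 1], diff[..., 0])      # [-pi, pi]
    r_edges = np.logspace(np.log10(0.125), np.log10(2.0), n_r + 1)
    r_bin = np.searchsorted(r_edges, dist) - 1
    t_bin = ((theta + np.pi) / (2 * np.pi) * n_theta).astype(int) % n_theta
    desc = np.zeros((n, n_r, n_theta))
    for i in range(n):
        for j in range(n):
            if i != j and 0 <= r_bin[i, j] < n_r:
                desc[i, r_bin[i, j], t_bin[i, j]] += 1
    return desc.reshape(n, -1)

img = cv2.imread("symbol.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
pts = max(contours, key=cv2.contourArea).reshape(-1, 2).astype(float)
pts = pts[np.linspace(0, len(pts) - 1, 100).astype(int)]  # 100 contour samples
descriptor = shape_context(pts)
```

Matching a drawn symbol against a stored template would then reduce to comparing such histograms, for example with a chi-squared distance.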

Egocentric Vision for Human Activity Recognition Using Deep Learning

  • Malika Douache;Badra Nawal Benmoussat
    • Journal of Information Processing Systems / Vol. 19, No. 6 / pp. 730-744 / 2023
  • This paper addresses the recognition of human activities from egocentric vision, in particular video captured by body-worn cameras, which is useful for video surveillance, automatic search, and video indexing. It could also assist elderly and frail persons and substantially improve their lives. Activity recognition remains a difficult problem because of the large variations in how actions are executed, especially when recognition is performed through an external device, such as a robot acting as a personal assistant. The inferred information is used both online to assist the person and offline to support the personal assistant. The main contribution of this paper is a simple and efficient recognition method, robust against the many sources of variability in action execution, that relies on egocentric camera data alone and uses a convolutional neural network and deep learning. In terms of accuracy, simulation results outperform the current state of the art by a significant margin: 61% when using egocentric camera data only, more than 44% when combining egocentric and several stationary camera data, and more than 12% when using both inertial measurement unit (IMU) and egocentric camera data.
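
The abstract specifies only that a convolutional neural network is applied to egocentric camera data. The sketch below shows a minimal frame classifier of that kind in PyTorch; the input size, layer widths, and the ten activity classes are assumptions for illustration, not the paper's architecture.

```python
# Minimal sketch of a CNN activity classifier for egocentric frames,
# assuming RGB frames resized to 224x224 and an illustrative set of
# 10 activity classes; not the paper's actual architecture.
import torch
import torch.nn as nn

class EgoActivityCNN(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(128, num_classes)

    def forward(self, x):                  # x: (batch, 3, 224, 224)
        h = self.features(x).flatten(1)
        return self.classifier(h)

model = EgoActivityCNN()
frames = torch.randn(4, 3, 224, 224)       # dummy batch of camera frames
logits = model(frames)
pred = logits.argmax(dim=1)                # predicted activity per frame
```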

Automation for Oyster Hinge Breaking System

  • So, J.D.;Wheaton, F.W.
    • 한국농업기계학회 Conference Proceedings / 1996 International Conference on Agricultural Machinery Engineering Proceedings / pp. 658-667 / 1996
  • A computer vision system was developed to automatically detect and locate the oyster hinge line, one step in shucking an oyster. The computer vision system consisted of a personal computer, a color frame grabber, a color CCD video camera with a zoom lens, two video monitors, a specially designed fixture to hold the oyster, a lighting system to illuminate the oyster, and the system software. The software consisted of a combination of commercially available programs and custom programs developed in Microsoft C. Test results showed that image resolution was the most important variable influencing hinge detection efficiency. Whether the trimmed-off flat white surface area was dry or wet, the oyster size relative to the selected image size, and the image processing methods used all influenced the hinge locating efficiency. The best combination of computer software and hardware successfully located 97% of the oyster hinge lines tested. This efficiency was achieved using a camera field of view of 1.9 by 1.5 cm, a 180 by 170 pixel image window, and a dry trimmed-off oyster hinge end surface.
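
To make the setup concrete, the sketch below looks for a dark hinge-like line inside a 180 by 170 pixel window, the window size the paper reports. Otsu thresholding and least-squares line fitting are illustrative stand-ins, not the authors' actual method, and the file name and window origin are assumptions.

```python
# Sketch of locating a hinge-like dark line inside a 180x170 pixel window,
# in the spirit of the paper's setup; thresholding and line fitting here
# are illustrative choices, not the authors' exact method.
import cv2
import numpy as np

frame = cv2.imread("oyster.png", cv2.IMREAD_GRAYSCALE)  # hypothetical capture
x0, y0 = 60, 40                                         # assumed window origin
roi = frame[y0:y0 + 170, x0:x0 + 180]                   # 180x170 image window

_, dark = cv2.threshold(roi, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
ys, xs = np.nonzero(dark)
if len(xs) > 0:
    # Fit a line to the dark pixels: the candidate hinge line.
    vx, vy, cx, cy = cv2.fitLine(
        np.column_stack([xs, ys]).astype(np.float32),
        cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    print(f"hinge line through ({cx + x0:.0f}, {cy + y0:.0f}), "
          f"direction ({vx:.2f}, {vy:.2f})")
```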

발 움직임 검출을 통한 로봇 팔 제어에 관한 연구 (A Study on Robot Arm Control System using Detection of Foot Movement)

  • 지훈;이동훈
    • 재활복지공학회논문지 / Vol. 9, No. 1 / pp. 67-72 / 2015
  • A system was implemented that controls a robot arm through the detection of foot movements, for people with disabilities who cannot freely use their arms. Two cameras were installed in front of the feet to capture images of foot movement. Multiple regions of interest were set on the acquired images using the LabVIEW-based Vision Assistant, and foot movements were detected from the left/right and up/down edges found within the left and right regions. Based on the number of left/right and up/down edges detected in the images of both feet, control data for a six-joint robot arm were transmitted over serial communication, yielding a system in which the robot arm can be steered up/down and left/right with the feet. Experiments showed a response time within 0.5 s and a gesture recognition rate above 88%.
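
The paper implements this pipeline with the LabVIEW-based Vision Assistant; the sketch below reproduces the same idea in Python with OpenCV and pyserial: count edge pixels inside a region of interest per foot camera and send a command byte to the arm over serial. The ROI coordinates, edge thresholds, serial port, and command codes are all illustrative assumptions.

```python
# Sketch of the foot-movement pipeline: per-camera edge counting in an ROI,
# mapped to serial commands for the arm. ROI, thresholds, port, and command
# bytes are assumptions, not values from the paper.
import time
import cv2
import serial  # pyserial

arm = serial.Serial("/dev/ttyUSB0", 9600)        # assumed serial port
cap_left, cap_right = cv2.VideoCapture(0), cv2.VideoCapture(1)
ROI = (100, 100, 200, 200)                       # x, y, w, h (assumed)

def edge_count(cap):
    ok, frame = cap.read()
    if not ok:
        return 0
    x, y, w, h = ROI
    gray = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    return int((edges > 0).sum())

try:
    while True:
        left, right = edge_count(cap_left), edge_count(cap_right)
        if left > 500 and right <= 500:
            arm.write(b"L")                      # e.g. move arm left
        elif right > 500 and left <= 500:
            arm.write(b"R")                      # e.g. move arm right
        time.sleep(0.033)                        # ~30 Hz polling
except KeyboardInterrupt:
    arm.close()
```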

자동차 비전 프로세서 동향 (Trend of Vehicle Vision Processor)

  • 한진호;변경진;엄낙웅
    • 전자통신동향분석 / Vol. 30, No. 4 / pp. 102-109 / 2015
  • In the automotive field, vision-based Advanced Driver Assistance Systems (ADAS) are being developed for driver safety and safe driving. Object recognition with vision systems is used to detect vehicle position and collision risk through lane recognition, pedestrian recognition, and vehicle recognition, and the object recognition requirements at the vehicle level keep growing. Vehicle vision processors have evolved to meet these demands, from about 50 GOPS of compute in early devices to nearly 400 GOPS today, enough to process 720p images at a frame rate of 30 fps. This article reviews trends in vision processors with strengthened vision computation for vehicle vision systems.
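
As a rough consistency check on the figures above, the snippet below computes the per-pixel operation budget implied by roughly 400 GOPS at 720p and 30 fps; this is back-of-envelope arithmetic only, not a figure from the article.

```python
# Back-of-envelope check of the throughput figures cited above:
# operations available per pixel at ~400 GOPS for 720p at 30 fps.
gops = 400e9                      # ~400 GOPS of compute
pixels = 1280 * 720               # one 720p frame
fps = 30
ops_per_pixel = gops / (pixels * fps)
print(f"{ops_per_pixel:.0f} ops per pixel per frame")  # roughly 1.4e4
```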

DEVELOPMENT OF A MACHINE VISION SYSTEM FOR WEED CONTROL USING PRECISION CHEMICAL APPLICATION

  • Lee, Won-Suk;David C. Slaughter;D.Ken Giles
    • 한국농업기계학회 Conference Proceedings / 1996 International Conference on Agricultural Machinery Engineering Proceedings / pp. 802-811 / 1996
  • Farmers need alternatives for weed control due to the desire to reduce the chemicals used in farming. However, conventional mechanical cultivation cannot selectively remove weeds located in the seedline between crop plants, and there are no selective herbicides for some crop/weed situations. Since hand labor is costly, an automated weed control system could be feasible. A robotic weed control system can also reduce or eliminate the need for chemicals. Currently no such system exists for removing weeds located in the seedline between crop plants. The goal of this project is to build a real-time machine vision weed control system that can detect crop and weed locations, remove weeds, and thin crop plants. To accomplish this objective, a real-time robotic system was developed to identify and locate outdoor plants using machine vision technology, pattern recognition techniques, knowledge-based decision theory, and robotics. The prototype weed control system is composed of a real-time computer vision system, a uniform illumination device, and a precision chemical application system. The prototype system is mounted on the UC Davis Robotic Cultivator, which finds the center of the seedline of crop plants. Field tests showed that the robotic spraying system correctly targeted simulated weeds (metal coins of 2.54 cm diameter) with an average error of 0.78 cm and a standard deviation of 0.62 cm.
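
The abstract does not state how plants are segmented from soil; one common approach in outdoor plant vision, shown below as a sketch, thresholds the excess-green index (ExG = 2G - R - B) and takes connected-component centroids as candidate plant or weed locations. The image path, threshold, and minimum blob area are assumptions.

```python
# Sketch of locating plant material against soil with the excess-green
# index (ExG = 2G - R - B), a common choice in outdoor plant vision; the
# abstract does not specify the authors' segmentation method, and the
# image path, threshold, and area filter here are illustrative.
import cv2
import numpy as np

bgr = cv2.imread("seedline.png").astype(np.float32)     # hypothetical image
b, g, r = cv2.split(bgr)
exg = 2 * g - r - b                                     # excess-green index
mask = (exg > 40).astype(np.uint8) * 255                # assumed threshold

n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
for i in range(1, n):                                   # label 0 is background
    if stats[i, cv2.CC_STAT_AREA] > 50:                 # ignore tiny specks
        cx, cy = centroids[i]
        print(f"candidate plant/weed at pixel ({cx:.0f}, {cy:.0f})")
```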

반려 로봇 (Life Companion Robots)

  • 김재홍;서범수;조재일;최정단
    • 전자통신동향분석 / Vol. 36, No. 1 / pp. 12-21 / 2021
  • This article presents the future vision and core technologies of the "Life Companion Robot," which is one of the 12 future concepts introduced in the ETRI Technology Roadmap published in November 2020. Assistant robots, care robots, and life support robots were proposed as the development stages of life companion robots. Further, core technologies for each of the ten major roles that must be directly or indirectly performed by life companion robots are introduced. Finally, this article describes in detail three major artificial intelligence technologies for autonomous robots.

카메라영상에 의한 DGPS-GIS기반 차선변경 지원시스템의 평가 및 신뢰성 검증 (Assessment and Reliability Validation of Lane Departure Assistance System Based on DGPS-GIS Using Camera Vision)

  • 문상찬;이순걸;김민우;주다니
    • 한국자동차공학회논문집 / Vol. 22, No. 6 / pp. 49-58 / 2014
  • This paper proposes a new assessment and reliability validation method for a DGPS-GIS-based Lane Departure Assistance System (LDAS) that measures lanes with camera vision. Lane departure is assessed using the yaw speed measurement and false-alarm determination method of ISO 17361, and performance is validated after generating a departure warning boundary line that accounts for the deviation error of the DGPS-based LDAS. The distance between the wheel and the lane is obtained by extracting the lane line from the camera image with a Hough transform, and validation is performed by comparing this value with the distance reported by the LDAS. Experimental results show that the error of the distance extracted by the LDAS is within 5 cm. This confirms the performance of the DGPS-GIS-based LDAS and the effectiveness of the proposed camera-vision validation method for system reliability.
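
A sketch of the distance measurement described above: extract the lane line from the camera image with a probabilistic Hough transform and compute the perpendicular distance from an assumed wheel point. The wheel pixel position, pixel-to-centimetre scale, and file name are illustrative assumptions.

```python
# Sketch of the wheel-to-lane distance measurement: lane line extraction
# with a probabilistic Hough transform, then perpendicular distance from
# an assumed wheel point. Scale, wheel point, and file name are assumptions.
import cv2
import numpy as np

img = cv2.imread("lane.png", cv2.IMREAD_GRAYSCALE)      # hypothetical frame
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                        minLineLength=100, maxLineGap=10)

wheel = np.array([320.0, 470.0])                        # assumed wheel pixel
if lines is not None:
    # Take the longest detected segment as the lane line.
    x1, y1, x2, y2 = max(lines[:, 0],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    p = np.array([x1, y1], dtype=float)
    d = np.array([x2 - x1, y2 - y1], dtype=float)       # line direction
    # Perpendicular point-to-line distance via the 2D cross product.
    dist_px = abs(d[0] * (wheel[1] - p[1]) - d[1] * (wheel[0] - p[0])) \
              / np.linalg.norm(d)
    cm_per_px = 0.5                                     # assumed calibration
    print(f"wheel-to-lane distance: {dist_px * cm_per_px:.1f} cm")
```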

Implementation of Network System for Bio-physical signal Communication

  • Kim, Jeong Lae;Kang, Jeong Jin;Rothwell, Edward J.
    • International Journal of Advanced Culture Technology / Vol. 1, No. 1 / pp. 1-5 / 2013
  • This network system for home care realizes communication of bio-physical signals to convey physical rhythm. Four displacement functions were considered: Vision, Somatosensory, Vestibular, and CNS. The bio-physical signal was designed around maximum and minimum points with a 0.01-unit resolution at the reference level, and was checked against the compound physical condition of body posture for each sensory organ. Measurements of Vision, Somatosensory, Vestibular, CNS, and BMI were obtained. The home network service can be used to support a health care system for health assistants in a health care center, and is expected to manage physical parameters over network communication.

젖소의 개체인식 및 형상 정보화를 위한 컴퓨터 시각 시스템 개발(II) - 스테레오 영상을 이용한 체위 분석 - (Development of Computer Vision System for Individual Recognition and Feature Information of Cow (II) - Analysis of body parameters using stereo image -)

  • 이종환
    • Journal of Biosystems Engineering / Vol. 28, No. 1 / pp. 65-76 / 2003
  • The analysis of cow body parameters is important to provide useful information for cow management and cow evaluation. Present methods put considerable stress on cows because they are invasive and constrain the cow's posture while body parameters are measured. This study was conducted to develop a stereo vision system for non-invasive analysis of cow body features. Body feature parameters of 16 cows at two farms (A and B) were measured with scales, and nineteen stereo images of cows in walking posture were captured under outdoor illumination. In this study, camera calibration and an inverse perspective transformation technique were established for the stereo vision system. Two calibration results are presented, one for farm A and one for farm B, because the setup distance from camera to cow was 510 cm at farm A and 630 cm at farm B. Calibration errors for the stereo vision system were within 2 cm for farm A and less than 4.9 cm for farm B. Eleven feature points of the cow body were extracted interactively on the stereo images, and five assistant points were determined by the computer program. The 3D world coordinates of these 15 points were calculated by the program and used to compute cow body parameters such as withers height, pelvic arch height, body length, slope body length, chest depth, and chest width. Measurement errors for body parameters were less than 10% for most cows. For a few cows, errors for slope body length and chest width exceeded 10% due to search errors for their feature points at inside-body positions. An equation estimating chest girth from chest depth and chest width is presented. The maximum estimation error for chest girth was within 10% of the real value, and the mean estimation error was 8.2 cm. The analysis of cow body parameters using the stereo vision system was successful, although the body shape in the binocular stereo images was distorted by cow movement.
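
As a sketch of the stereo step, the snippet below triangulates the 3D position of one matched feature point, assuming an ideal calibrated parallel stereo pair where depth follows Z = fB/d for disparity d. The focal length, baseline, principal point, and matched pixel coordinates are illustrative assumptions, not the paper's calibration.

```python
# Sketch of stereo triangulation for one cow feature point, assuming an
# ideal parallel calibrated pair: depth Z = f*B/d, where d is disparity.
# Focal length, baseline, principal point, and pixel coordinates below
# are illustrative assumptions, not the paper's calibration values.
f = 800.0              # focal length in pixels (assumed)
B = 30.0               # baseline between cameras in cm (assumed)
cx, cy = 320.0, 240.0  # principal point in pixels (assumed)

uL, vL = 350.0, 260.0  # feature point in the left image (assumed match)
uR = 310.0             # same point's column in the right image

d = uL - uR                     # disparity in pixels
Z = f * B / d                   # depth in cm (here: 600 cm)
X = (uL - cx) * Z / f           # lateral offset in cm
Y = (vL - cy) * Z / f           # vertical offset in cm
print(f"world point: ({X:.1f}, {Y:.1f}, {Z:.1f}) cm")
```

With the assumed numbers the recovered depth is 600 cm, on the order of the 510-630 cm camera-to-cow setup distances reported above.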