• Title/Summary/Keyword: Human Tracking

Search results: 652 items (processing time: 0.025 s)

Object detection and tracking using a high-performance artificial intelligence-based 3D depth camera: towards early detection of African swine fever

  • Ryu, Harry Wooseuk;Tai, Joo Ho
    • Journal of Veterinary Science
    • /
    • Vol. 23, No. 1
    • /
    • pp.17.1-17.10
    • /
    • 2022
  • Background: Inspection of livestock farms using surveillance cameras is emerging as a means of early detection of transboundary animal diseases such as African swine fever (ASF). Object tracking, a developing technology derived from object detection, aims at the consistent identification of individual objects on farms. Objectives: This study was conducted as a preliminary investigation toward practical application on livestock farms. Using a high-performance artificial intelligence (AI)-based 3D depth camera, the aim was to establish a pathway for utilizing AI models to perform advanced object tracking. Methods: Multiple crossovers by two humans were simulated to investigate the potential of object tracking. Consistent identification after crossing over was taken as evidence of object tracking. Two AI models, a fast model and an accurate model, were tested and compared with regard to their 3D object tracking performance. Finally, a recording of a pig pen was also processed with the aforementioned AI models to test the possibility of 3D object detection. Results: Both AI models successfully provided a 3D bounding box, an identification number, and the distance from the camera for each individual human. The accurate detection model showed stronger evidence of 3D object tracking than the fast detection model and demonstrated potential for application to pigs as livestock. Conclusions: Preparing a custom dataset to train AI models on an appropriate farm is required for proper 3D object detection and object tracking of pigs at an ideal level. This will allow farms to transition smoothly from traditional methods to ASF-preventing precision livestock farming.

A Study on the Relationship of Human Factors Integration In the Defense

  • Ko, NamKyung;Kwon, YongSoo
    • 시스템엔지니어링학술지
    • /
    • Vol. 7, No. 2
    • /
    • pp.45-50
    • /
    • 2011
  • This work presents the relationships between the domains of Human Factors Integration (HFI), with the goal of developing weapon systems by integrating human factors into the defense acquisition program. HFI is a systematic process for identifying, tracking, and resolving human-related issues, ensuring a balanced development of both the technological and human aspects of capability. From this point of view, this paper identifies and analyzes the HFI domains and, based on the results, presents the relationships between them.

인간의 움직임 추출을 이용한 감정적인 행동 인식 시스템 개발 (Emotional Human Body Recognition by Using Extraction of Human Body from Image)

  • 송민국;주영훈;박진배
    • 대한전기학회:학술대회논문집
    • /
    • 대한전기학회 2006년 학술대회 논문집 정보 및 제어부문
    • /
    • pp.214-216
    • /
    • 2006
  • Expressive faces and human body gestures are among the main non-verbal communication channels in human-human interaction. Understanding human emotions through body gestures is a necessary skill both for humans and for computers interacting with their human counterparts. Gesture analysis consists of several processes, such as hand detection, feature extraction, and emotion recognition. Skin-color information for tracking hand gestures is obtained from the face-detection region. We revealed relationships between particular body movements and specific emotions by using a hidden Markov model (HMM) classifier. The performance of emotional human-body recognition was evaluated experimentally.
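
The HMM classification mentioned in the abstract can be sketched with the standard forward algorithm, which scores an observation sequence under a model; an emotion class would be the model giving the highest score. The 2-state, 2-symbol model below is a toy assumption, not the paper's trained parameters.

```python
import numpy as np

def forward_prob(pi, A, B, obs):
    """Return P(obs | model) for initial probs pi (S,), transition matrix
    A (S, S), emission matrix B (S, O), and an observation index sequence."""
    alpha = pi * B[:, obs[0]]          # initialize with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, weight by emission
    return float(alpha.sum())

# Illustrative 2-state model; a gesture sequence would be a list of
# quantized body-movement feature indices.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
p = forward_prob(pi, A, B, [0, 1, 1])
```

In a multi-class setup, one such model is trained per emotion and the sequence is assigned to the model with the largest forward probability.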


점유 센서를 위한 합성곱 신경망과 자기 조직화 지도를 활용한 온라인 사람 추적 (Online Human Tracking Based on Convolutional Neural Network and Self Organizing Map for Occupancy Sensors)

  • 길종인;김만배
    • 방송공학회논문지
    • /
    • Vol. 23, No. 5
    • /
    • pp.642-655
    • /
    • 2018
  • Occupancy sensors installed in buildings and homes turn the lights off when no person is present and on otherwise. At present, pyroelectric infra-red (PIR) sensors are widely used for this purpose. Recently, research on detecting human occupancy with vision camera sensors has been conducted. A camera sensor has the advantage of overcoming the weakness of PIR, which cannot detect stationary people. Tracking both moving and stationary people is thus a key function of a camera-based occupancy sensor. This paper proposes an online human tracking method based on a convolutional neural network model and a self-organizing map. Training a model offline requires a large number of training samples. To solve this problem, we start from an untrained model and update it with training samples collected directly from the test video. Experiments on indoor video captured with an overhead camera demonstrate that the proposed method tracks people effectively.

휴먼마우스 구현을 위한 효율적인 손끝좌표 추적 및 마우스 포인트 제어기법 (Efficient Fingertip Tracking and Mouse Pointer Control for Implementation of a Human Mouse)

  • 박지영;이준호
    • 한국정보과학회논문지:소프트웨어및응용
    • /
    • Vol. 29, No. 11
    • /
    • pp.851-859
    • /
    • 2002
  • This study proposes a new fingertip-coordinate tracking method and a mouse-pointer movement determination method for a human-mouse system that replaces mouse input with hand gestures. For fingertip tracking, we adapt and improve the CAMSHIFT algorithm, which was originally proposed for face-region tracking. For accurate hand-region detection, we apply real-time skin-color learning, which obtains skin-color information optimized for each user's environment, together with a region-limiting technique that accounts for the changes in hand-region size and orientation caused by free hand movement. In addition, a fingertip-coordinate computation based on the principal axis of the hand locates the fingertip quickly and accurately. Because real-time fingertip detection is limited by processing speed, the obtained coordinates lack continuity and the mouse pointer moves discontinuously. To produce continuous pointer movement, we compute the velocity and acceleration of the pointer from the fingertip displacement, define an equation of motion for the pointer, and use it to determine the pointer position. Experiments with the proposed algorithm show that fast and accurate fingertip tracking and natural mouse-pointer movement are possible.
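
The equation-of-motion step described in the abstract can be sketched as follows: estimate velocity and acceleration from the last few fingertip samples by finite differences, then fill in pointer positions between samples with a constant-acceleration model. The sampling interval and substep count here are assumptions for illustration; the paper's actual formulation may differ.

```python
def pointer_path(p0, p1, p2, dt=1.0, substeps=4):
    """Predict pointer positions after sample p2 from the last three
    fingertip samples, via x(t) = p2 + v*t + 0.5*a*t^2 (one coordinate)."""
    v = (p2 - p1) / dt                  # finite-difference velocity
    a = (p2 - 2.0 * p1 + p0) / dt ** 2  # finite-difference acceleration
    h = dt / substeps                   # pointer update interval
    return [p2 + v * (h * k) + 0.5 * a * (h * k) ** 2
            for k in range(1, substeps + 1)]
```

For uniform motion through 0.0, 1.0, 2.0 the acceleration vanishes and the sketch extrapolates evenly spaced pointer positions, bridging the gap until the next fingertip sample arrives.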

Human Spatial Cognition Using Visual and Auditory Stimulation

  • Yu, Mi;Piao, Yong-Jun;Kim, Yong-Yook;Kwon, Tae-Kyu;Hong, Chul-Un;Kim, Nam-Gyun
    • International Journal of Precision Engineering and Manufacturing
    • /
    • Vol. 7, No. 2
    • /
    • pp.41-45
    • /
    • 2006
  • This paper deals with human spatial cognition using visual and auditory stimulation. More specifically, this investigation observes the relationship between the head and eye motor systems in localizing the direction of a visual target in space and attempts to describe the roles of the right-side versus left-side pinna. In the visual-stimulation experiment, nineteen red LEDs (light-emitting diodes, brightness: $210\;cd/m^2$) arrayed in the horizontal plane of the surrounding panel were used, with the LEDs located 10 degrees apart from each other. Physiological parameters such as EOG (electro-oculography), head movement, and their synergic control were measured by a BIOPAC system and 3SPACE FASTRAK. In the auditory-stimulation experiment, the function of one pinna was intentionally distorted by inserting a short tube into the ear canal, and the localization error caused by right- and left-side pinna distortion was investigated. Since a laser pointer showed much less error (0.5%) in localizing target position than the FASTRAK (30%) that has generally been used, a laser pointer was used for the pointing task. It was found that harmonic components were not essential for auditory target localization; rather, non-harmonic nearby frequency components were found to be more important in localizing the direction of a sound. We found that the right pinna carries out one of the most important functions in localizing target direction, and that a pure tone with only one frequency component is difficult to localize. It was also found that the latency time is shorter in self-moved tracking (SMT) than in eye-alone tracking (EAT) and eye-hand tracking (EHT). These results can be used in further studies on the characterization of human spatial cognition.

UAV기반 동적영상센서의 위치불확실성을 통한 보행자 추정 (Tracking of Walking Human Based on Position Uncertainty of Dynamic Vision Sensor of Quadcopter UAV)

  • 이정현;진태석
    • 제어로봇시스템학회논문지
    • /
    • Vol. 22, No. 1
    • /
    • pp.24-30
    • /
    • 2016
  • The accuracy of small, low-cost CCD cameras is insufficient to provide data for precisely tracking unmanned aerial vehicles (UAVs). This study shows how a quadrotor UAV can hover over a targeted walking human by using data from a CCD camera rather than imprecise GPS data. To realize this, quadcopter UAVs need to recognize their position and posture in both known and unknown environments, and their localization must occur naturally. Estimating the UAV's position by resolving this uncertainty is one of the most important problems in quadcopter hovering. In this paper, we describe a method for determining the altitude of a quadcopter UAV using image information of a moving object, such as a walking human. The method combines the position observed from GPS sensors with the position estimated from images captured by a fixed camera to localize the UAV. Using the a priori known path of the quadcopter UAV in world coordinates and a perspective camera model, we derive geometric constraint equations that relate the image-frame coordinates of the moving object to the estimated altitude of the UAV. Since the equations are based on geometric constraints, measurement error is always present. The proposed method utilizes the error between the observed and estimated image coordinates to localize the quadcopter UAV, applying a Kalman filter scheme. Its performance is verified by computer simulation and experiments.
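
The fusion idea behind the Kalman filter scheme can be illustrated with a minimal 1-D filter that blends a predicted altitude with a noisy image-derived measurement. The process/measurement noise values and the measurement sequence below are illustrative assumptions, not the paper's tuned parameters or data.

```python
def kalman_step(x, P, z, Q=0.01, R=1.0):
    """One predict/update cycle for a constant-altitude model.
    x: state estimate, P: its variance, z: new measurement."""
    P = P + Q                  # predict: uncertainty grows by process noise
    K = P / (P + R)            # Kalman gain: how much to trust z over x
    x = x + K * (z - x)        # correct the estimate toward the measurement
    P = (1.0 - K) * P          # the measurement reduces the uncertainty
    return x, P

x, P = 0.0, 10.0                       # vague prior on the UAV altitude (m)
for z in [10.2, 9.8, 10.1, 9.9]:       # hypothetical image-based altitude estimates
    x, P = kalman_step(x, P, z)
```

Even from a poor prior, a few measurements pull the estimate near the true altitude while the variance shrinks, which is the behavior the abstract relies on for localization.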

실외환경에서의 e-레저 모바일 AR에 대한 연구 (A study on e-leisure mobile AR in outdoor environments)

  • 고준호;최유진;이헌주;김윤상
    • 디지털콘텐츠학회 논문지
    • /
    • Vol. 19, No. 6
    • /
    • pp.1027-1032
    • /
    • 2018
  • Recently, new content has been demanded for e-leisure, which includes e-sports and e-games. In response, research on mobile AR for e-leisure that tracks people has been conducted. Because e-leisure mobile AR is used in outdoor environments, tracking performance at long distances is important. However, existing mobile AR applications such as Snow and Snapchat have the drawback of poor tracking performance at long range. This paper therefore proposes an e-leisure mobile AR system for outdoor environments. The proposed system tracks the position of the head at long range outdoors using color markers and human-body proportions, and augments a virtual object at the tracked position. Its performance was examined through experiments measuring tracking performance and computation time.

IGRT를 위한 비침습적인 호흡에 의한 장기 움직임 실시간 추적시스템 (A Non-invasive Real-time Respiratory Organ Motion Tracking System for Image Guided Radio-Therapy)

  • 김윤종;윤의중
    • 대한의용생체공학회:의공학회지
    • /
    • Vol. 28, No. 5
    • /
    • pp.676-683
    • /
    • 2007
  • A non-invasive respiratory-gated radiotherapy system, such as one based on external anatomic motion, is more comfortable for patients during treatment than an invasive system. However, a higher correlation between external and internal anatomic motion is required to increase the effectiveness of non-invasive respiratory gating. Both invasive and non-invasive methods must track the internal anatomy with high precision and rapid response; the non-invasive method has particular difficulty tracking the target position continuously because it relies on image processing alone. We therefore developed a motion-tracking system for non-invasive respiratory gating that accurately finds the dynamic position of internal structures such as the diaphragm and tumor. The respiratory organ-motion tracking apparatus consists of an image-capture board, a fluoroscopy system, and a processing computer. After the image board grabs the motion of the internal anatomy through the fluoroscopy system, the computer acquires the organ-motion tracking data by image processing, without any additional physical markers. During the experiment, patients breathed freely, without forced breath control or coaching. The developed pattern-recognition software extracted the target-motion signal in real time from the acquired fluoroscopic images. The range of mean deviations between the real and acquired target positions was measured for several sample structures in an anatomical model phantom. With standardized movement using a moving stage and the anatomical phantom, the mean and maximum deviations between the real and acquired positions were less than 1 mm and 2 mm, respectively. For real human subjects, the mean and maximum peak-to-trough distances of diaphragm motion were measured as 23.5 mm and 55.1 mm, respectively, for 13 patients. The acquired respiration profiles showed that the human expiration period is longer than the inspiration period. These results can be applied to respiratory-gated radiotherapy.
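
Marker-less tracking of a structure in successive frames is commonly done by template matching; the sketch below shows the generic normalized cross-correlation technique, which is one plausible form of the pattern recognition the abstract describes, not the paper's actual software. Frame and template sizes are toy assumptions; a real system would search only a small window around the previous position for speed.

```python
import numpy as np

def match_template(frame, tmpl):
    """Return the (row, col) with the highest normalized correlation."""
    fh, fw = frame.shape
    th, tw = tmpl.shape
    t = (tmpl - tmpl.mean()) / (tmpl.std() + 1e-9)   # standardized template
    best, pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            win = frame[r:r + th, c:c + tw]
            w = (win - win.mean()) / (win.std() + 1e-9)
            score = float((w * t).mean())            # correlation in [-1, 1]
            if score > best:
                best, pos = score, (r, c)
    return pos

rng = np.random.default_rng(1)
tmpl = rng.random((3, 3))          # appearance template of the target region
frame = np.zeros((12, 12))
frame[5:8, 2:5] = tmpl             # target placed at row 5, col 2
pos = match_template(frame, tmpl)
```

Applying this to each fluoroscopic frame and recording the matched position over time yields the kind of target-motion signal the abstract extracts.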