• Title/Summary/Keyword: Human Tracking


Object detection and tracking using a high-performance artificial intelligence-based 3D depth camera: towards early detection of African swine fever

  • Ryu, Harry Wooseuk;Tai, Joo Ho
    • Journal of Veterinary Science / v.23 no.1 / pp.17.1-17.10 / 2022
  • Background: Inspection of livestock farms using surveillance cameras is emerging as a means of early detection of transboundary animal diseases such as African swine fever (ASF). Object tracking, a developing technology derived from object detection, aims at the consistent identification of individual objects on farms. Objectives: This study was conducted as a preliminary investigation for practical application to livestock farms. Using a high-performance artificial intelligence (AI)-based 3D depth camera, the aim was to establish a pathway for utilizing AI models to perform advanced object tracking. Methods: Multiple crossovers by two humans were simulated to investigate the potential of object tracking; consistent identification after crossing over served as evidence of successful tracking. Two AI models, a fast model and an accurate model, were tested and compared with regard to their 3D object tracking performance. Finally, a recording of a pig pen was also processed with the aforementioned AI models to test the feasibility of 3D object detection. Results: Both AI models successfully provided a 3D bounding box, an identification number, and the distance from the camera for each individual human. The accurate detection model outperformed the fast detection model in 3D object tracking and showed potential for application to pigs as livestock. Conclusions: Preparing a custom dataset to train the AI models on an appropriate farm is required for 3D object detection to support object tracking of pigs at an ideal level. This would allow farms to transition smoothly from traditional methods to ASF-preventing precision livestock farming.
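
The core of maintaining identities across crossovers, as this abstract describes, is re-associating each new detection with an existing track. A minimal sketch of that step, assuming greedy nearest-centroid matching in 3D camera coordinates (the paper's actual AI-model tracker and its matching threshold are not specified; `max_dist` here is an illustrative value in meters):

```python
import math

def match_ids(prev_tracks, detections, max_dist=0.5):
    """Greedy nearest-centroid matching in 3D: each new detection
    inherits the ID of the closest previous track within max_dist;
    otherwise it receives a fresh ID. Illustrative stand-in for the
    paper's tracker; the threshold is an assumption."""
    next_id = max(prev_tracks, default=-1) + 1
    assigned, used = {}, set()
    for det in detections:
        best_id, best_d = None, max_dist
        for tid, centroid in prev_tracks.items():
            if tid in used:
                continue
            d = math.dist(centroid, det)  # Euclidean distance in 3D
            if d < best_d:
                best_id, best_d = tid, d
        if best_id is None:               # no track close enough: new ID
            best_id, next_id = next_id, next_id + 1
        used.add(best_id)
        assigned[best_id] = det
    return assigned
```

After two people cross, each detection snaps back to whichever track centroid it is nearest to, which is why depth (distance from the camera) helps disambiguate overlapping people in the image plane.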

A Study on the Relationship of Human Factors Integration In the Defense

  • Ko, NamKyung;Kwon, YongSoo
    • Journal of the Korean Society of Systems Engineering / v.7 no.2 / pp.45-50 / 2011
  • This work presents the relationships between the domains of Human Factors Integration (HFI) for developing weapon systems by integrating human factors into the defense acquisition program. HFI is a systematic process for identifying, tracking, and resolving human-related issues, ensuring a balanced development of both the technological and human aspects of capability. From this point of view, this paper identifies and analyzes the HFI domains and, based on the results, presents the relationships between them.

Emotional Human Body Recognition by Using Extraction of Human Body from Image (인간의 움직임 추출을 이용한 감정적인 행동 인식 시스템 개발)

  • Song, Min-Kook;Joo, Young-Hoon;Park, Jin-Bae
    • Proceedings of the KIEE Conference / 2006.10c / pp.214-216 / 2006
  • Expressive faces and human body gestures are among the main non-verbal communication channels in human-human interaction. Understanding human emotions through body gestures is a necessary skill both for humans and for computers interacting with their human counterparts. Gesture analysis consists of several processes, such as hand detection, feature extraction, and emotion recognition. Skin color information for tracking hand gestures is obtained from the face detection region. We have revealed relationships between particular body movements and specific emotions by using an HMM (Hidden Markov Model) classifier. The performance of the emotional human body recognition system was evaluated experimentally.
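
HMM-based classification of the kind this abstract uses typically trains one HMM per emotion and assigns a movement sequence to the class whose model gives it the highest likelihood. A minimal sketch of that scoring step, assuming a discrete-observation HMM and the standard forward algorithm (all parameters below are illustrative, not the paper's):

```python
import math

def forward_logprob(obs, pi, A, B):
    """Log-likelihood of an observation sequence under a discrete HMM,
    computed with the forward algorithm. pi: initial state probabilities,
    A: state transition matrix, B: emission probability matrix."""
    n = len(pi)
    # Initialization: alpha[s] = P(first observation, state s)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    # Induction: propagate forward probabilities through each observation
    for o in obs[1:]:
        alpha = [B[s][o] * sum(alpha[t] * A[t][s] for t in range(n))
                 for s in range(n)]
    return math.log(sum(alpha))
```

Classification then reduces to `argmax` over the per-emotion models' log-likelihoods for the observed gesture sequence.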


Online Human Tracking Based on Convolutional Neural Network and Self Organizing Map for Occupancy Sensors (점유 센서를 위한 합성곱 신경망과 자기 조직화 지도를 활용한 온라인 사람 추적)

  • Gil, Jong In;Kim, Manbae
    • Journal of Broadcast Engineering / v.23 no.5 / pp.642-655 / 2018
  • Occupancy sensors installed in buildings and households turn off the lights if a space is vacant. Currently, PIR (pyroelectric infra-red) motion sensors are widely utilized. Recently, research using camera sensors has been carried out to overcome the drawback of PIR sensors, which cannot detect stationary people. Detecting both moving and stationary people is the main functionality of occupancy sensors. In this paper, we propose an online human occupancy tracking method using a convolutional neural network (CNN) and a self-organizing map (SOM). It is well known that a large number of training samples are needed to train a model offline. To solve this problem, we start from an untrained model and update it by collecting training samples online, directly from the test sequences. Using videos captured from an overhead camera, experiments have validated that the proposed method tracks humans effectively.
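
The online-update idea in this abstract — adapting an initially untrained model with samples drawn from the test sequence itself — can be illustrated with a single self-organizing map step. This sketch updates only the best-matching unit, omitting the neighborhood function for brevity; the learning rate and the feature representation are assumptions, not the paper's:

```python
def som_update(weights, sample, lr=0.1):
    """One online SOM step: find the best-matching unit (BMU) for the
    incoming feature vector and move its weights toward the sample.
    Repeated over a video stream, the map adapts without any offline
    training set. Simplified: no neighborhood kernel, fixed lr."""
    def sq_dist(w, x):
        return sum((wi - xi) ** 2 for wi, xi in zip(w, x))
    bmu = min(range(len(weights)), key=lambda i: sq_dist(weights[i], sample))
    # Pull the winning unit toward the observed sample
    weights[bmu] = [wi + lr * (xi - wi) for wi, xi in zip(weights[bmu], sample)]
    return bmu
```

In the paper's setting the samples would be appearance features of detected people, so the map gradually specializes to the humans actually seen by the overhead camera.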

Efficient Fingertip Tracking and Mouse Pointer Control for Implementation of a Human Mouse (휴먼마우스 구현을 위한 효율적인 손끝좌표 추적 및 마우스 포인트 제어기법)

  • 박지영;이준호
    • Journal of KIISE:Software and Applications / v.29 no.11 / pp.851-859 / 2002
  • This paper discusses the design of a working system that visually recognizes hand gestures for the control of a window-based user interface. We present a method for tracking the fingertip of the index finger using a single camera. Our method is based on the CAMSHIFT algorithm and outperforms it in that it reliably tracks the particular hand poses used in the system against complex backgrounds. We describe how the location of the fingertip is mapped to a location on the monitor, and how it is both necessary and possible to smooth the path of the fingertip using a physical model of a mouse pointer. Our method tracks in real time without absorbing a major share of computational resources. The performance of our system shows great promise that this methodology can be used to control computers in the near future.
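
The "physical model of a mouse pointer" mentioned here can be understood as treating the pointer as a mass pulled toward each raw fingertip sample, with damped velocity, so that detection jitter does not translate directly into pointer jitter. A minimal sketch under that interpretation (the gains `k` and `damping` are assumed values, not the paper's):

```python
def smooth_pointer(path, k=0.2, damping=0.7):
    """Smooth a noisy fingertip path with a spring-damper pointer model:
    velocity is damped each frame and accelerated toward the latest raw
    fingertip sample, trading a little lag for a stable cursor."""
    x, y = path[0]
    vx, vy = 0.0, 0.0
    out = [(x, y)]
    for tx, ty in path[1:]:
        vx = damping * vx + k * (tx - x)   # pull toward raw sample
        vy = damping * vy + k * (ty - y)
        x, y = x + vx, y + vy              # integrate pointer position
        out.append((x, y))
    return out
```

With a step input the smoothed pointer approaches the target over several frames instead of jumping, which is exactly the behavior that makes a vision-driven mouse usable.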

Human Spatial Cognition Using Visual and Auditory Stimulation

  • Yu, Mi;Piao, Yong-Jun;Kim, Yong-Yook;Kwon, Tae-Kyu;Hong, Chul-Un;Kim, Nam-Gyun
    • International Journal of Precision Engineering and Manufacturing / v.7 no.2 / pp.41-45 / 2006
  • This paper deals with human spatial cognition using visual and auditory stimulation. More specifically, this investigation observes the relationship between the head and the eye motor systems in localizing the direction of a visual target in space, and attempts to describe the respective roles of the right-side and left-side pinna. In the visual stimulation experiment, nineteen red LEDs (light-emitting diodes, brightness: $210\;cd/m^2$) arrayed in the horizontal plane of the surrounding panel were used, with the LEDs located 10 degrees apart from each other. Physiological parameters such as EOG (electro-oculography), head movement, and their synergic control were measured by a BIOPAC system and 3SPACE FASTRAK. In the auditory stimulation experiment, the function of one pinna was distorted intentionally by inserting a short tube into the ear canal, and the localization error caused by right- and left-side pinna distortion was investigated. Since a laser pointer showed much less error (0.5%) in localizing the target position than FASTRAK (30%), which has generally been used, the laser pointer was used for the pointing task. It was found that harmonic components were not essential for auditory target localization; rather, non-harmonic nearby frequency components were found to be more important in localizing the direction of a sound. We found that the right pinna carries out one of the most important functions in localizing target direction and that a pure tone with only one frequency component is difficult to localize. It was also found that the latency time is shorter in self-moved tracking (SMT) than in eye-alone tracking (EAT) and eye-hand tracking (EHT). These results can be used in further studies on the characterization of human spatial cognition.

Tracking of Walking Human Based on Position Uncertainty of Dynamic Vision Sensor of Quadcopter UAV (UAV기반 동적영상센서의 위치불확실성을 통한 보행자 추정)

  • Lee, Junghyun;Jin, Taeseok
    • Journal of Institute of Control, Robotics and Systems / v.22 no.1 / pp.24-30 / 2016
  • The accuracy of small, low-cost CCD cameras is insufficient to provide data for precisely tracking unmanned aerial vehicles (UAVs). This study shows how a quadrotor UAV can hover over a tracked human target by using data from a CCD camera rather than imprecise GPS data. To realize this, quadcopter UAVs need to recognize their position and posture in both known and unknown environments, and their localization should occur naturally. Estimating position by resolving uncertainty is one of the most important problems in quadcopter UAV hovering. In this paper, we describe a method for determining the altitude of a quadcopter UAV using image information of a moving object such as a walking human. The method combines the position observed from GPS sensors with the position estimated from images captured by a fixed camera to localize the UAV. Using the a priori known path of the quadcopter UAV in world coordinates and a perspective camera model, we derive geometric constraint equations that relate the image-frame coordinates of the moving object to the estimated altitude of the UAV. Since the equations are based on a geometric constraint, measurement error is always present. The proposed method utilizes the error between the observed and estimated image coordinates to localize the quadcopter UAV, applying a Kalman filter scheme. Its performance is verified by computer simulation and experiments.
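
The Kalman filter scheme mentioned in this abstract can be illustrated in its simplest scalar form: fusing a stream of noisy altitude measurements (such as altitudes inferred from the image-based geometric constraint) into a smoothed estimate. This is a generic sketch, not the paper's filter; the process and measurement noise variances `q` and `r` are assumed values:

```python
def kalman_1d(z_seq, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """Scalar Kalman filter with a constant-altitude process model.
    z_seq: noisy altitude measurements; returns the filtered estimates.
    q: process noise variance, r: measurement noise variance (assumed)."""
    x, p = x0, p0
    estimates = []
    for z in z_seq:
        p = p + q                 # predict: uncertainty grows
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update: correct by measurement residual
        p = (1 - k) * p           # uncertainty shrinks after the update
        estimates.append(x)
    return estimates
```

The residual `z - x` plays the same role as the paper's error between observed and estimated image coordinates: each new measurement nudges the state estimate in proportion to how much the filter trusts it.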

A study on e-leisure mobile AR in outdoor environments (실외환경에서의 e-레저 모바일 AR에 대한 연구)

  • Ko, Junho;Choi, Yu Jin;Lee, Hun Joo;Kim, Yoon Sang
    • Journal of Digital Contents Society / v.19 no.6 / pp.1027-1032 / 2018
  • Recently, new content for e-leisure, including e-sports and e-games, has become necessary. To meet this requirement, studies on e-leisure mobile AR that tracks humans are underway. Tracking performance at long distances is important because e-leisure mobile AR is used in outdoor environments. However, conventional mobile AR applications such as SNOW and Snapchat suffer from low tracking performance at long distances. Therefore, we propose an e-leisure mobile AR for outdoor environments. The proposed system estimates the position of the head at long distances outdoors by using color markers and the human body ratio, and then augments a virtual object at the estimated position. Its performance was evaluated by measuring tracking performance and processing time.

A Non-invasive Real-time Respiratory Organ Motion Tracking System for Image Guided Radio-Therapy (IGRT를 위한 비침습적인 호흡에 의한 장기 움직임 실시간 추적시스템)

  • Kim, Yoon-Jong;Yoon, Uei-Joong
    • Journal of Biomedical Engineering Research / v.28 no.5 / pp.676-683 / 2007
  • A non-invasive respiratory-gated radiotherapy system, such as one based on external anatomic motion, offers patients more comfort during treatment than an invasive system. However, a high correlation between external and internal anatomic motion is required for non-invasive respiratory-gated radiotherapy to be effective. Both invasive and non-invasive methods need to track the internal anatomy with high precision and rapid response, and the non-invasive method in particular has more difficulty tracking the target position continuously because it relies on image processing alone. We therefore developed a motion tracking system for non-invasive respiratory gating that accurately finds the dynamic position of internal structures such as the diaphragm and tumor. The respiratory organ motion tracking apparatus consists of an image capture board, a fluoroscopy system, and a processing computer. After the image board grabs the motion of the internal anatomy through the fluoroscopy system, the computer acquires the organ motion tracking data by image processing, without any additional physical markers. The patients breathed freely, without forced breath control or coaching, while the experiment was performed. The developed pattern-recognition software could extract the target motion signal in real time from the acquired fluoroscopic images. The mean deviation between the real and acquired target positions was measured for several sample structures in an anatomical model phantom: with standardized movement using a moving stage and the phantom, the mean and maximum deviations were less than 1 mm and 2 mm, respectively. In real human subjects, the mean and maximum peak-to-trough distances of diaphragm motion were measured as 23.5 mm and 55.1 mm, respectively, for 13 patients. The acquired respiration profiles showed that the human expiration period is longer than the inspiration period. These results could be applied to respiratory-gated radiotherapy.
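
Markerless tracking of a structure like the diaphragm in fluoroscopic images, as this abstract describes, is commonly built on template matching: a patch around the target is chosen on a reference frame and located in each subsequent frame. A minimal sketch using exhaustive sum-of-squared-differences search over small integer-valued images (the paper's actual pattern-recognition software is not specified; this is a generic stand-in):

```python
def track_template(frame, template):
    """Locate a template patch (e.g. a diaphragm region chosen on the
    first fluoroscopic frame) in a new frame by exhaustive
    sum-of-squared-differences (SSD) search. frame and template are
    2D lists of pixel intensities; returns the (row, col) of best match."""
    th, tw = len(template), len(template[0])
    best, best_ssd = (0, 0), float("inf")
    for y in range(len(frame) - th + 1):
        for x in range(len(frame[0]) - tw + 1):
            ssd = sum((frame[y + i][x + j] - template[i][j]) ** 2
                      for i in range(th) for j in range(tw))
            if ssd < best_ssd:
                best, best_ssd = (y, x), ssd
    return best
```

Tracking the matched position frame by frame yields the respiration motion signal; in practice the search would be restricted to a window around the previous match for real-time performance.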