• Title/Summary/Keyword: Camera Tracking


Development of A Multi-sensor Fusion-based Traffic Information Acquisition System with Robust to Environmental Changes using Mono Camera, Radar and Infrared Range Finder (환경변화에 강인한 단안카메라 레이더 적외선거리계 센서 융합 기반 교통정보 수집 시스템 개발)

  • Byun, Ki-hoon;Kim, Se-jin;Kwon, Jang-woo
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.16 no.2 / pp.36-54 / 2017
  • The purpose of this paper is to develop a multi-sensor fusion-based traffic information acquisition system that is robust to environmental changes. The system combines the characteristics of each sensor and is more robust to environmental changes than a video detector alone. Moreover, it is not affected by the difference between day and night, and has a lower maintenance cost than an inductive-loop traffic detector. This is accomplished by synthesizing object tracking information from a radar, vehicle classification information from a video detector, and reliable object detections from an infrared range finder. To prove the effectiveness of the proposed system, experiments were conducted for 6 hours over 5 days during the daytime and early evening on a pedestrian-accessible road. According to the experimental results, the system achieves 88.7% classification accuracy and a 95.5% vehicle detection rate. If the parameters of the system are optimized to adapt to changes in the experimental environment, it is expected to contribute to the advancement of ITS.
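
The paper does not publish code, but the fusion logic it describes can be sketched roughly: a radar track is confirmed by the infrared range finder's reliable point detection, and the video detector contributes the vehicle class. All names, fields, and thresholds below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the radar/camera/IR fusion step described above.
# All field names and thresholds are illustrative, not from the paper.

def fuse_detection(radar_track, camera_class, ir_range, gate_m=2.0):
    """Confirm a radar track with the IR range finder, then attach
    the video detector's vehicle class to the confirmed object."""
    if radar_track is None or ir_range is None:
        return None
    # The IR range finder gives a reliable point detection: accept the
    # radar track only if both sensors agree on range within a gate.
    if abs(radar_track["range_m"] - ir_range) > gate_m:
        return None
    return {
        "range_m": radar_track["range_m"],
        "speed_mps": radar_track["speed_mps"],
        "vehicle_class": camera_class,   # e.g. "car", "bus", "truck"
    }

print(fuse_detection({"range_m": 31.4, "speed_mps": 12.1}, "car", 30.9))
```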

Auto-guiding Performance from IGRINS Test Observations (Immersion GRating INfrared Spectrograph)

  • Lee, Hye-In;Pak, Soojong;Le, Huynh Anh N.;Kang, Wonseok;Mace, Gregory;Pavel, Michael;Jaffe, Daniel T.;Lee, Jae-Joon;Kim, Hwihyun;Jeong, Ueejeong;Chun, Moo-Young;Park, Chan;Yuk, In-Soo;Kim, Kangmin
    • The Bulletin of The Korean Astronomical Society / v.39 no.2 / pp.92.1-92.1 / 2014
  • In astronomical spectroscopy, stable auto-guiding and accurate target centering are critical to achieving high observation efficiency and sensitivity. We developed the instrument control software for the Immersion GRating INfrared Spectrograph (IGRINS), a high-resolution (R = 40,000) near-infrared slit spectrograph. IGRINS is currently installed on the McDonald 2.7 m telescope in Texas, USA, and had successful commissioning observations in March, May, and July of 2014. The role of the IGRINS slit-viewing camera (SVC) is to move the target onto the slit and to provide feedback on tracking offsets for auto-guiding. For a point source, we guide the telescope with the target on the slit; for an extended source, we use a guide star in the field, offset from the slit. Since the slit blocks the center of the point spread function, it is challenging to fit a Gaussian to the profile in order to guide and center the target on the slit. We therefore developed several center-finding algorithms, e.g., 2D-Gaussian fitting, 1D-Gaussian fitting, and center-balancing methods. In this presentation, we show the auto-guiding performance achieved with these algorithms.
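
One of the center-finding algorithms the abstract names, 1D-Gaussian fitting, can be sketched as follows. The slit-mask handling and all constants are assumptions for illustration; this is not IGRINS code.

```python
# Minimal sketch of 1D-Gaussian center finding on a slit-viewing camera
# profile whose core is blocked by the slit. Illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, center, sigma, offset):
    return amp * np.exp(-0.5 * ((x - center) / sigma) ** 2) + offset

def find_center_1d(profile, slit_mask=None):
    """Fit a 1D Gaussian to a stellar profile; pixels hidden by the
    slit can be excluded via slit_mask (True = usable pixel)."""
    x = np.arange(profile.size, dtype=float)
    if slit_mask is not None:
        x, profile = x[slit_mask], profile[slit_mask]
    p0 = [profile.max() - profile.min(), x[np.argmax(profile)], 2.0, profile.min()]
    popt, _ = curve_fit(gaussian, x, profile, p0=p0)
    return popt[1]  # fitted center in pixels

# Synthetic star profile with the central pixels blocked by the slit
x = np.arange(64, dtype=float)
profile = gaussian(x, 100.0, 31.7, 3.0, 5.0)
mask = np.abs(x - 32) > 2             # slit blocks the PSF core
print(find_center_1d(profile, mask))  # ~31.7 despite the blocked core
```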


Development of application for guidance and controller unit for low cost and small UAV missile based on smartphone (스마트폰을 활용한 소형 저가 유도탄 유도조종장치용 어플리케이션 개발)

  • Noh, Junghoon;Cho, Kyongkuk;Kim, Seongjun;Kim, Wonsop;Jeong, Jinseob;Sang, Jinwoo;Park, Chung-Woon;Gong, Minsik
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.45 no.7 / pp.610-617 / 2017
  • In the recent weapon system trend, it is required to develop small and low cost guidance missile to track and strike the enemy target effectively. Controling the such small drone typed weapon demands a integrated electronic device that equipped with not only a wireless network interface, a high resolution camera, various sensors for target tracking, and position and attitude control but also a high performance processor that integrates and processes those sensor outputs in real-time. In this paper, we propose the android smartphone as a solution for that and implement the guidance and control application of the missile. Furthermore, the performance of the implemented guidance and control application is analyzed through the simulation.
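
The abstract does not state which guidance law the application runs; as a hedged illustration of the kind of loop such a guidance and control application would execute, here is a classic proportional-navigation sketch with made-up gains and a camera-based line-of-sight rate estimate.

```python
# Hedged sketch of a proportional-navigation guidance step; the gains,
# sensor values, and helper names are illustrative assumptions.
import math

def pn_acceleration(los_rate, closing_speed, nav_gain=3.0):
    """Classic proportional navigation: commanded lateral acceleration
    a = N * Vc * (d(lambda)/dt), with lambda the line-of-sight angle."""
    return nav_gain * closing_speed * los_rate

def los_rate_from_camera(pixel_offset, focal_px, dt):
    """Approximate LOS rate (rad/s) from the tracked target's pixel drift."""
    return math.atan2(pixel_offset, focal_px) / dt

rate = los_rate_from_camera(pixel_offset=4.0, focal_px=1500.0, dt=0.033)
print(pn_acceleration(rate, closing_speed=250.0))  # lateral command, m/s^2
```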

Development and Validation of a Measurement Technique for Interfacial Velocity in Liquid-gas Separated Flow Using IR-PTV (적외선 입자추적유속계를 이용한 액체-기체 분리유동 시 계면속도 측정기법 개발 및 검증)

  • Kim, Sangeun;Kim, Hyungdae
    • Transactions of the Korean Society of Mechanical Engineers B / v.39 no.7 / pp.549-555 / 2015
  • A technique for measuring the interfacial velocity in air-water separated flow by particle tracking velocimetry using an infrared camera (IR-PTV) was developed. Because infrared light with wavelengths in the range of 3-5 μm can hardly penetrate water, IR-PTV selectively visualizes only the tracer particles at depths of less than 20 μm beneath the air-water interface. To validate the measurement accuracy of the IR-PTV technique, the interfacial velocity of the air-water separated flow was also measured using Styrofoam particles floating on the water. The interfacial velocity values obtained with the two measurement techniques showed good agreement, with errors of less than 5%. The experimental results obtained with the developed technique show that the interfacial velocity increases proportionally with increasing air velocity, likely because of the increased interfacial stress.
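
The particle-tracking step behind any PTV method, including IR-PTV, reduces to matching particle centroids between consecutive frames and scaling the displacement; a minimal nearest-neighbor sketch, with made-up calibration values, follows.

```python
# Sketch of the tracking step of PTV: match each particle in frame t to
# its nearest neighbour in frame t+1 and convert displacement to
# velocity. Frame rate and pixel calibration below are made up.
import numpy as np

def track_velocities(pts_a, pts_b, dt, m_per_px, max_disp_px=15.0):
    """pts_a, pts_b: (N,2) and (M,2) particle centroids in pixels."""
    velocities = []
    for p in pts_a:
        d = np.linalg.norm(pts_b - p, axis=1)
        j = np.argmin(d)
        if d[j] <= max_disp_px:                 # reject lost particles
            velocities.append((pts_b[j] - p) * m_per_px / dt)
    return np.array(velocities)

a = np.array([[100.0, 50.0], [200.0, 52.0]])
b = np.array([[108.0, 50.0], [208.0, 52.0]])    # ~8 px drift per frame
print(track_velocities(a, b, dt=1 / 250, m_per_px=4e-5))  # ~0.08 m/s
```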

Audio-Visual Fusion for Sound Source Localization and Improved Attention (음성-영상 융합 음원 방향 추정 및 사람 찾기 기술)

  • Lee, Byoung-Gi;Choi, Jong-Suk;Yoon, Sang-Suk;Choi, Mun-Taek;Kim, Mun-Sang;Kim, Dai-Jin
    • Transactions of the Korean Society of Mechanical Engineers A / v.35 no.7 / pp.737-743 / 2011
  • Service robots are equipped with various sensors such as vision cameras, sonar sensors, laser scanners, and microphones. Although these sensors have their own functions, some of them can be made to work together to perform more complicated functions. Audio-visual fusion is a typical and powerful combination of audio and video sensors, because audio information is complementary to visual information and vice versa; human beings likewise depend mainly on visual and auditory information in their daily lives. In this paper, we conduct two studies using audio-vision fusion: one on enhancing the performance of sound localization, and the other on improving robot attention through sound localization and face detection.
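
As a rough illustration of the audio half of such a fusion, sound direction can be estimated from the time difference of arrival (TDOA) between two microphones. The sketch below uses plain cross-correlation; the authors' actual localization method and the fusion weighting with face detection are not shown, and all parameters are assumptions.

```python
# Illustrative two-microphone TDOA direction estimate via
# cross-correlation; mic spacing and sample rate are made up.
import numpy as np

def tdoa_angle(sig_l, sig_r, fs, mic_dist_m, c=343.0):
    """Direction of arrival from cross-correlation of two channels."""
    corr = np.correlate(sig_l, sig_r, mode="full")
    lag = np.argmax(corr) - (len(sig_r) - 1)    # delay in samples
    tau = lag / fs
    sin_theta = np.clip(c * tau / mic_dist_m, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))

fs = 16000
rng = np.random.default_rng(0)
src = rng.standard_normal(320)                   # broadband sound burst
sig_r = np.roll(src, 3)                          # right channel lags 3 samples
print(tdoa_angle(src, sig_r, fs, mic_dist_m=0.2))  # ~-19 deg: source off-axis
```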

A Study on Measurement Technology for Face Detection Parameters through 3D Image Object Recognition (3D영상 객체인식을 통한 얼굴검출 파라미터 측정기술에 대한 연구)

  • Choi, Byung-Kwan;Moon, Nam-Mee
    • Journal of the Korea Society of Computer and Information / v.16 no.10 / pp.53-62 / 2011
  • With the development of high-tech IT convergence and the spread of smartphones and other personal portable terminals, video object recognition technology has developed rapidly. Face detection based on 3D recognition, which detects objects through intelligent video recognition, has evolved on top of image recognition technology, and face detection is advancing quickly. In this paper, object recognition image processing is applied to human face recognition through an IP camera, and measurement techniques for identifying the human face are proposed and studied. The study proceeds as follows: 1) a face-model-based face tracking technique was developed and applied; 2) the developed PC-based algorithm was shown, through CPU load measurements, to track the basic parameters of the perceived face; and 3) the distance between the two eyes and the gaze angle can be tracked in real time, which proved effective.
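
The inter-eye distance mentioned in point 3 can be illustrated with OpenCV's stock Haar cascades; this stands in for the authors' face-model-based tracker, which is not published, and the test image path is a placeholder.

```python
# Hedged sketch: measure the pixel distance between detected eye
# centers using OpenCV's bundled Haar cascades (not the paper's model).
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def inter_eye_distance(gray):
    """Return pixel distance between the two detected eye centers."""
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):
        roi = gray[y:y + h, x:x + w]
        eyes = eye_cascade.detectMultiScale(roi, 1.1, 5)
        if len(eyes) >= 2:
            (ex1, ey1, ew1, eh1), (ex2, ey2, ew2, eh2) = eyes[:2]
            c1 = (ex1 + ew1 / 2, ey1 + eh1 / 2)
            c2 = (ex2 + ew2 / 2, ey2 + eh2 / 2)
            return ((c1[0] - c2[0]) ** 2 + (c1[1] - c2[1]) ** 2) ** 0.5
    return None

frame = cv2.imread("face.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder image
if frame is not None:
    print(inter_eye_distance(frame))
```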

Development of A Framework for Robust Extraction of Regions Of Interest (환경 요인에 독립적인 관심 영역 추출을 위한 프레임워크의 개발)

  • Kim, Seong-Hoon;Lee, Kwang-Eui;Heo, Gyeong-Yong
    • Journal of the Korea Society of Computer and Information / v.16 no.12 / pp.49-57 / 2011
  • Extraction of regions of interest (ROIs) is the first and most important step for applications in computer vision, and it affects the rest of the application process. However, ROI extraction is easily affected by the environment, such as illumination and the camera itself. Many applications adopt problem-specific knowledge and/or post-processing to correct errors in ROI extraction. In this paper, we propose a robust framework that can overcome environmental changes and is independent of the rest of the process. The proposed framework uses a differential image and a color distribution to extract ROIs. The color distribution can be learned online, which makes the framework robust to environmental change. Moreover, the components of the framework are independent of each other, which makes the framework flexible and extensible. The usefulness of the proposed framework is demonstrated with the application of hand region extraction in an image sequence.
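
A minimal sketch of the two cues the framework combines, a differential image and an online-learned color distribution, might look as follows; the class structure and constants are assumptions, not the authors' code.

```python
# Assumed-structure sketch: ROI = moving pixels (frame difference)
# whose color is plausible under an online-updated hue histogram.
import numpy as np

class ColorModel:
    """Hue histogram updated online so the model tracks lighting drift."""
    def __init__(self, bins=32, lr=0.05):
        self.hist = np.ones(bins) / bins
        self.bins, self.lr = bins, lr

    def update(self, hue_pixels):
        h, _ = np.histogram(hue_pixels, bins=self.bins, range=(0, 180))
        h = h / max(h.sum(), 1)
        self.hist = (1 - self.lr) * self.hist + self.lr * h

    def likelihood(self, hue):
        idx = np.clip((hue / 180 * self.bins).astype(int), 0, self.bins - 1)
        return self.hist[idx]

def roi_mask(prev_gray, cur_gray, hue, model, diff_thr=15, col_thr=0.02):
    motion = np.abs(cur_gray.astype(int) - prev_gray.astype(int)) > diff_thr
    color = model.likelihood(hue) > col_thr
    return motion & color     # ROI = moving pixels with a plausible color

rng = np.random.default_rng(1)
model = ColorModel()
model.update(rng.uniform(0, 60, 500))          # learn "skin-like" hues online
prev = np.zeros((4, 4), np.uint8)
cur = np.full((4, 4), 40, np.uint8)
hue = np.full((4, 4), 30.0)
print(roi_mask(prev, cur, hue, model))
```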

Detection of Optical Flows on the Trajectories of Feature Points Using the Cellular Nonlinear Neural Networks (셀룰라 비선형 네트워크를 이용한 특징점 궤적 상에서 Optical Flow 검출)

  • Son, Hon-Rak;Kim, Hyeong-Suk
    • Journal of the Institute of Electronics Engineers of Korea CI / v.37 no.6 / pp.10-21 / 2000
  • A Cellular Nonlinear Network structure for the Distance Transform (DT) and a robust optical flow detection algorithm based on the DT are proposed. For some applications of optical flow, such as target tracking and camera ego-motion computation, correct optical flows at a few feature points are more useful than unreliable ones at every pixel. The proposed algorithm detects optical flows only on the trajectories of the feature points: the translation lengths and directions of feature movements are detected on the trajectories of feature points, over which a Distance Transform field is developed. The robustness resulting from the use of the Distance Transform and the ease of hardware implementation with local analog circuits are the key properties of the proposed structure. To verify the performance of the proposed structure and algorithm, simulations were carried out on various images under different noise conditions.
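
The value of a Distance Transform field for feature matching is that a displacement can be read off without an exact hit: a predicted position is simply attracted to the nearest feature in the next frame. Below is a CPU sketch using SciPy, under the assumption that this captures the matching idea; the paper realizes the DT in local analog circuits.

```python
# Sketch of DT-based feature matching: score/match against the nearest
# feature rather than requiring exact correspondence. Illustrative only.
import numpy as np
from scipy.ndimage import distance_transform_edt

def flow_at_feature(feature_map_next, pred_pos):
    """feature_map_next: binary map of features in the next frame.
    Returns the displacement from pred_pos to the nearest feature,
    plus the distance-field value at pred_pos."""
    # distance_transform_edt gives, per pixel, the distance to the
    # nearest zero; invert the map so features are the zeros.
    dist, (iy, ix) = distance_transform_edt(~feature_map_next,
                                            return_indices=True)
    y, x = pred_pos
    return (iy[y, x] - y, ix[y, x] - x), dist[y, x]

fmap = np.zeros((32, 32), dtype=bool)
fmap[12, 17] = True                      # the feature moved here
print(flow_at_feature(fmap, (10, 15)))   # -> ((2, 2), ~2.83)
```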


Evaluation of the Head Mouse System using Gyro-and Opto-Sensors (각속도 및 광센서를 이용한 헤드 마우스의 평가)

  • Park, Min-Je;Kim, Soo-Chan
    • Journal of the HCI Society of Korea / v.5 no.2 / pp.1-6 / 2010
  • In this research, we designed a head mouse system for disabled users and gamers, a mouse controller operated by head movements and eye blinks only, and compared its performance with other mouse controller systems. The mouse pointer was moved via a gyro-sensor, which measures the angular rotation of head movements, and eye blinks were used as the clicking events. The accumulated integration errors that plagued previous head mouse systems were removed periodically and handled as dead zones in a non-linear relative-pointing mapping, so direct control of the mouse pointer was possible from the computed moving distance and acceleration. We used active light sources to minimize the influence of ambient light, so that the head mouse was not affected by changes in external lighting. In a comparison between the head mouse and a gaze-tracking mouse (Quick Glance), the proposed method scored about 21% higher on the clicking experiment called "20 clicks", about 25% higher on the Dasher experiment, and about 37% higher on the on-screen keyboard test, indicating that the proposed head mouse offers better performance.
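
A hedged sketch of the dead-zone plus non-linear relative-pointing mapping described above, with illustrative constants:

```python
# Map gyro angular rate to relative pointer motion. Rates inside the
# dead zone are dropped so integration drift never moves the cursor;
# all constants are assumptions, not the paper's tuned values.
def pointer_delta(gyro_dps, dead_zone_dps=2.0, gain=0.8, expo=1.5):
    """Angular rate (deg/s) -> pixel delta for one update tick."""
    if abs(gyro_dps) < dead_zone_dps:
        return 0.0
    sign = 1.0 if gyro_dps > 0 else -1.0
    mag = abs(gyro_dps) - dead_zone_dps     # re-zero at the dead-zone edge
    return sign * gain * mag ** expo        # faster head turn, faster cursor

for rate in (0.5, 3.0, 20.0):               # drift, slow turn, fast turn
    print(rate, pointer_delta(rate))
```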


Design and Implementation of Robot-Based Alarm System of Emergency Situation Due to Falling of The Eldely (고령자 낙상에 의한 응급 상황의 4족 로봇 기반 알리미 시스템 설계 및 구현)

  • Park, ChulHo;Lim, DongHa;Kim, Nam Ho;Yu, YunSeop
    • Journal of the Korea Institute of Information and Communication Engineering / v.17 no.4 / pp.781-788 / 2013
  • In this paper, we introduce a quadruped robot-based alarm system for monitoring emergency situations caused by falls in the elderly. The quadruped robot includes an FPGA (Field Programmable Gate Array) board running a red-color tracking algorithm. To detect a fall, a sensor node is worn on the chest; the accelerations and angular velocities measured by the sensor node are transferred to the quadruped robot, and an emergency signal is transmitted to the manager if a fall is detected. The manager then controls the robot and judges the situation by monitoring real-time images transmitted from it. If the manager decides that it is an emergency, he calls 119. When the fall detection system used only the sensor nodes, a sensitivity of 100% and a specificity of 98.98% were measured. Using the combination of the fall detection system and the portable camera (robot), emergency situations were detected with 100% accuracy.
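
A threshold detector of the kind such a sensor node might run can be sketched with the signal vector magnitude (SVM) of the accelerometer combined with the gyro rate; the thresholds below are illustrative assumptions, not the paper's tuned values.

```python
# Hedged single-sample fall check: an impact shows up as a large
# acceleration magnitude together with fast rotation. Illustrative only.
import math

def is_fall(ax, ay, az, gx, gy, gz,
            svm_thr_g=2.5, gyro_thr_dps=150.0):
    """ax..az in g, gx..gz in deg/s for one sample."""
    svm = math.sqrt(ax * ax + ay * ay + az * az)    # signal vector magnitude
    gyro = math.sqrt(gx * gx + gy * gy + gz * gz)
    return svm > svm_thr_g and gyro > gyro_thr_dps

print(is_fall(0.1, 0.2, 1.0, 5, 3, 8))      # quiet standing -> False
print(is_fall(2.1, 1.5, 2.0, 160, 90, 40))  # impact-like sample -> True
```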