Efficient Object Selection Algorithm by Detection of Human Activity


  • Park, Wang-Bae (Dept. of Image Engineering, Graduate School of Advanced Image Science, Multimedia, and Film, Chung-Ang University) ;
  • Seo, Yung-Ho (Dept. of Image Engineering, Graduate School of Advanced Image Science, Multimedia, and Film, Chung-Ang University) ;
  • Doo, Kyoung-Soo (Dept. of Image Engineering, Graduate School of Advanced Image Science, Multimedia, and Film, Chung-Ang University) ;
  • Choi, Jong-Soo (Dept. of Image Engineering, Graduate School of Advanced Image Science, Multimedia, and Film, Chung-Ang University)
  • Received : 2009.08.17
  • Published : 2010.05.25

Abstract

This paper presents an efficient object selection algorithm based on detecting and analyzing human activity. Generally, when people point at something, they turn their face toward the target. Therefore, we regard the straight line connecting the face and the fingertip as the pointing direction. First, in order to detect moving objects in the input frames, we extract the objects of interest in real time using background subtraction. Whether the user is walking is then judged by Principal Component Analysis (PCA) and the motion displacement accumulated over a designated time period. When the user is motionless, we estimate the user's pointing direction from the vector running from the head to the hand. Through experiments using multiple views, we confirm that the proposed algorithm estimates the movement and pointing direction of the user efficiently.
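As a rough illustration of the foreground-extraction and movement-judgment steps described above, the Python sketch below assumes OpenCV and NumPy; the subtractor choice (MOG2), thresholds, and function names are hypothetical placeholders, not the authors' implementation.

```python
import cv2
import numpy as np

# Background subtractor for extracting the moving object in real time
# (MOG2 is an assumption; the paper only states that background subtraction is used).
bg_subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=16)

def extract_foreground(frame):
    """Return a binary foreground mask obtained by background subtraction."""
    mask = bg_subtractor.apply(frame)
    _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    return mask

def is_walking(silhouette_points, displacement, disp_thresh=15.0, ratio_thresh=3.0):
    """Judge walking vs. stationary from the silhouette's principal axes (PCA)
    and the displacement accumulated over the designated time period."""
    pts = np.asarray(silhouette_points, dtype=np.float64)
    cov = np.cov((pts - pts.mean(axis=0)).T)
    eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]   # principal-axis variances
    elongation = eigvals[0] / max(eigvals[1], 1e-6)    # upright, still body -> large ratio
    # Hypothetical rule: large displacement or a spread-out silhouette means walking.
    return displacement > disp_thresh or elongation < ratio_thresh
```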

This paper proposes a system that recognizes a person's pointing gesture based on activity detection, selects the object in the indicated direction, and tracks it. In general, when a person points at something, the face is turned toward the target. Therefore, the straight line connecting the face and the fingertip is regarded as the pointing direction, and the indicated object is selected. In the proposed algorithm, background subtraction is applied to the camera images to detect motion regions and extract the motion of the object of interest in real time. Whether the person is walking is determined by Principal Component Analysis (PCA) and the displacement of the object's motion; when the person is stationary, the vector from the head to the hand is computed to determine the user's final pointing direction. Experimental results show that the proposed algorithm accurately estimates a person's pointing direction in images captured from multiple viewpoints with a multi-view camera setup, verifying its effectiveness.
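Complementing the sketch above, the following hypothetical snippet (NumPy assumed; the names and the angular threshold are ours, not the authors') estimates the pointing direction as the head-to-hand vector and selects the object whose centroid lies closest to that ray.

```python
import numpy as np

def pointing_direction(head_xy, hand_xy):
    """Unit vector from the head to the hand, taken as the pointing direction."""
    v = np.asarray(hand_xy, dtype=np.float64) - np.asarray(head_xy, dtype=np.float64)
    return v / (np.linalg.norm(v) + 1e-9)

def select_object(head_xy, hand_xy, object_centroids, angle_thresh_deg=10.0):
    """Select the object whose centroid is angularly closest to the pointing ray."""
    d = pointing_direction(head_xy, hand_xy)
    best, best_angle = None, angle_thresh_deg
    for idx, centroid in enumerate(object_centroids):
        to_obj = np.asarray(centroid, dtype=np.float64) - np.asarray(head_xy, dtype=np.float64)
        to_obj = to_obj / (np.linalg.norm(to_obj) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(float(d @ to_obj), -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = idx, angle
    return best  # index of the selected object, or None if nothing lies within the threshold

# Usage with hypothetical coordinates: the person points from head (100, 50) toward hand (140, 60).
print(select_object((100, 50), (140, 60), [(300, 105), (200, 250)]))  # -> 0
```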

Keywords
