
Gaze Recognition Interface Development for Smart Wheelchair  

Park, S.H. (School of Information and Communication Engineering, Daegu University)
Publication Information
Journal of Rehabilitation Welfare Engineering & Assistive Technology, v.5, no.1, 2011, pp. 103-110
Abstract
In this paper, we propose a gaze recognition interface for a smart wheelchair. The gaze recognition interface is a user interface that recognizes commands through gaze recognition and avoids detected obstacles by measuring distances with range sensors while driving. The smart wheelchair consists of a gaze recognition and tracking module, a user interface module, an obstacle detector, a motor control module, and a range sensor module. The proposed interface uses a camera with a built-in infrared filter and two LED light sources to determine the direction in which the pupils are turned and sends the corresponding command codes to control the system, so it requires no calibration process for each user. Experimental results showed that the proposed interface can control the system accurately by recognizing the user's gaze direction.
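The abstract outlines, at a high level, how recognized gaze direction is turned into command codes and how range sensors override driving near obstacles. The Python sketch below is a minimal illustration of that idea only; the function names, command codes, dead-zone and stop-distance thresholds are hypothetical assumptions, not details taken from the paper.

from enum import Enum

class Command(Enum):
    STOP = 0
    FORWARD = 1
    LEFT = 2
    RIGHT = 3

def gaze_to_command(offset_x, offset_y, dead_zone=0.15):
    """Map a normalized pupil offset (-1..1, relative to the eye-region center,
    negative y = looking up) to a wheelchair command code; offsets inside the
    dead zone mean STOP. Thresholds are assumed, not from the paper."""
    if abs(offset_x) < dead_zone and abs(offset_y) < dead_zone:
        return Command.STOP
    if abs(offset_x) >= abs(offset_y):
        return Command.RIGHT if offset_x > 0 else Command.LEFT
    return Command.FORWARD if offset_y < 0 else Command.STOP

def obstacle_guard(command, ranges_cm, stop_distance_cm=40.0):
    """Override any drive command with STOP when a range sensor reports an
    obstacle closer than the stop distance (assumed value)."""
    if command is not Command.STOP and min(ranges_cm) < stop_distance_cm:
        return Command.STOP
    return command

if __name__ == "__main__":
    cmd = gaze_to_command(0.4, 0.05)          # gaze shifted to the right
    cmd = obstacle_guard(cmd, [120.0, 30.0])  # but an obstacle 30 cm away
    print(cmd)                                # Command.STOP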
Keywords
smart wheelchair; gaze recognition; user interface;
Citations & Related Records
Times Cited by KSCI: 3
1 Takeshi Saitoh, Noriyuki Takahashi, Ryosuke Konishi, "Development of an Intelligent Wheelchair with Visual Oral Motion", IEEE International Conference on Robot & Human Interactive Communication, pp.145-150, 2007.
2 L.M. Bergasa, M. Mazo, A. Gardel, R. Barea, L. Boquete, "Commands Generation by Face Movements Applied to the Guidance of a Wheelchair for Handicapped People", Proceedings of the 15th International Conference on Pattern Recognition, Vol.4, pp.660-663, 2000.
3 박진우, 권용무, 손광훈, "Gaze Tracking System Using the Relationship between Pupil and Glint Feature Points", Journal of Broadcast Engineering, Vol.11, No.1, pp.80-90, 2006.
4 최승억, 정종우, 서영완, "Technology Trends and Applications of Tabletop Devices and Interactive Wall Displays", Communications of KIISE, Vol.26, No.3, pp.5-14, 2008.
5 Y. Nakazato, M. Kanbara, N. Yokoya, "Wearable augmented reality system using invisible visual markers and an IR camera", Proc. of International Symposium on Wearable Computers, pp.198-199, 2005.
6 Y. Kishino, M. Tsukamoto, Y. Sakane, S. Nishio, "Realizing a visual marker using LEDs for wearable computing environment", In Proc. of International Conference on Distributed Computing Systems Workshops, pp.314-319, 2003.
7 Q. Chen, N.D. Georganas, and E.M. Petriu, "Real-time Vision-based Hand Gesture Recognition Using Haar-like Features", Proceedings of the IEEE Instrumentation and Measurement Technology Conference, pp.1-6, 2007.
8 K.H. Kim, H.K. Kim, J.S. Kim, W. Son, and S.Y. Lee, "A Biosignal-based Human Interface Controlling a Power Wheelchair for People with Motor Disabilities", ETRI Journal, pp.111-114, 2006.
9 문인석, 홍원기, 류정탁, "A Study on Obstacle Avoidance Algorithms for Mobile Robots", Proceedings of the 2010 Fall Conference of the Institute of Embedded Engineering of Korea (IEMEK), pp.504-507, 2010.