A New Ergonomic Interface System for the Disabled Person

  • Heo, Hwan (Department of Electronics and Electrical Engineering, Dongguk University) ;
  • Lee, Ji-Woo (Department of Electronics and Electrical Engineering, Dongguk University) ;
  • Lee, Won-Oh (Department of Electronics and Electrical Engineering, Dongguk University) ;
  • Lee, Eui-Chul (Division of Fusion and Convergence of Mathematical Sciences, National Institute for Mathematical Sciences) ;
  • Park, Kang-Ryoung (Division of Electronics and Electrical Engineering, Dongguk University)
  • Received : 2011.01.27
  • Accepted : 2011.02.08
  • Published : 2011.02.28

Abstract

Objective: The aim of this study is to develop a new ergonomic interface system, based on a camera vision system, that helps the handicapped in a home environment. Background: The proposed interface system enables the handicapped to operate consumer electronics. Method: A wearable device that captures the eye image with a near-infrared (NIR) camera and illuminators is proposed for tracking the eye gaze position (Heo et al., 2011). A frontal-viewing camera attached to the wearable device recognizes the consumer electronics device to be controlled (Heo et al., 2011). In addition, the amount of the user's eye fatigue is measured based on the eye blink rate, and when the fatigue exceeds a predetermined level, the proposed system automatically switches from the gaze-based interface mode to the manual selection mode. Results: The experimental results showed that the gaze estimation error of the proposed method was 1.98 degrees, with successful recognition of the object by the frontal-viewing camera (Heo et al., 2011). Conclusion: We developed a new ergonomic interface system based on gaze tracking and object recognition. Application: The proposed system can be used to help the handicapped in a home environment.
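The method combines two mechanisms that lend themselves to a short illustration: recognizing the target appliance in the frontal-viewing camera image with SURF features (Heo et al., 2011; Bay et al., 2006), and switching from gaze-based selection to manual selection once the blink-rate-based fatigue measure passes a threshold. The paper does not publish source code; the Python/OpenCV sketch below is only a minimal illustration of those two ideas, in which the function names, the fatigue threshold, the sliding-window length, and the matching criterion are assumptions rather than values from the paper (SURF also requires an OpenCV build with the non-free xfeatures2d module; ORB is a freely available substitute).

```python
import time
from collections import deque

import cv2

# --- Object recognition in the frontal-viewing camera (SURF + ratio test) ---
# SURF (Bay et al., 2006) needs an OpenCV build with the non-free xfeatures2d
# module; cv2.ORB_create() with cv2.NORM_HAMMING is a drop-in substitute.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
matcher = cv2.BFMatcher(cv2.NORM_L2)

def recognize_object(template_gray, frame_gray, min_matches=15):
    """Return True if the registered appliance template appears in the frame.

    min_matches is an assumed acceptance threshold, not a value from the paper.
    """
    kp_t, des_t = surf.detectAndCompute(template_gray, None)
    kp_f, des_f = surf.detectAndCompute(frame_gray, None)
    if des_t is None or des_f is None:
        return False
    # Lowe's ratio test keeps only distinctive matches (Lowe, 2004).
    pairs = matcher.knnMatch(des_t, des_f, k=2)
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]
    return len(good) >= min_matches

# --- Blink-rate-based fatigue estimate and interface mode switching ---
class FatigueMonitor:
    """Counts blinks in a sliding time window and reports blinks per minute."""

    def __init__(self, window_sec=60.0, fatigue_blinks_per_min=25.0):
        # Both the window length and the threshold are illustrative assumptions.
        self.window_sec = window_sec
        self.threshold = fatigue_blinks_per_min
        self.blink_times = deque()

    def register_blink(self, t=None):
        """Record one detected blink (timestamped now unless t is given)."""
        self.blink_times.append(time.time() if t is None else t)

    def blinks_per_minute(self, now=None):
        now = time.time() if now is None else now
        # Drop blinks that have fallen out of the sliding window.
        while self.blink_times and now - self.blink_times[0] > self.window_sec:
            self.blink_times.popleft()
        return len(self.blink_times) * 60.0 / self.window_sec

    def fatigued(self, now=None):
        return self.blinks_per_minute(now) > self.threshold

def select_mode(monitor):
    """Gaze-based selection by default; manual selection when fatigued."""
    return "manual" if monitor.fatigued() else "gaze"
```

In the actual system, the blink events and gaze coordinates would come from the wearable NIR eye camera, and the recognized appliance would then be selected by the gaze position or, in manual mode, by an explicit user command.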

References

  1. Ando, K., Johanson, C. E., Levy, D. L., Yasillo, N. J., Holzman, P. S. and Schuster, C. R., Effects of phencyclidine, secobarbital and diazepam on eye tracking in rhesus monkeys, Psychopharmacology, 81(4), 295-300, 1983.
  2. Bay, H., Tuytelaars, T. and Gool, L. V., SURF: speeded up robust features, Lecture Notes in Computer Science, 3951, 404-417, 2006. https://doi.org/10.1007/11744023_32
  3. Belongie, S., Malik, J. and Puzicha, J., Shape matching and object recognition using shape contexts, IEEE Trans. Pattern Analysis and Machine Intelligence, 24(4), 509-522, 2002. https://doi.org/10.1109/34.993558
  4. Cheng, D. and Vertegaal, R., "An Eye for an Eye: A Performance Evaluation Comparison of the LC Technologies and Tobii Eye Trackers", Proceedings of Symp. Eye Tracking Research & Application, (p. 61), San Antonio, TX, 2004.
  5. Daugman, J. G., High confidence visual recognition of persons by a test of statistical independence, IEEE Trans. Pattern Analysis and Machine Intelligence, 15(11), 1148-1161, 1993. https://doi.org/10.1109/34.244676
  6. Daugman, J. G., How iris recognition works, IEEE Trans. Circuits and Systems for Video Technology, 14(1), 21-30, 2004. https://doi.org/10.1109/TCSVT.2003.818350
  7. Duygulu, P., Barnard, K., de Freitas, J. F. G. and Forsyth, D. A., Object recognition as machine translation: learning a lexicon for a fixed image vocabulary, Lecture Notes in Computer Science, 2353, 349-354, 2002.
  8. Gonzalez, R. C. and Woods, R. E., Digital Image Processing, 2nd ed., Prentice Hall, 2003.
  9. Heo, H., Lee, W. O., Lee, J. W., Park, K. R., Lee, E. C. and Whang, M. C., "Object recognition and selection method by gaze tracking and SURF algorithm", Proceedings of International Conference on Multimedia and Signal Processing (CMSP'11), Guilin, China, 2011.
  10. Lee, J. J., Park, K. R. and Kim, J., "Gaze Detection System under HMD Environments for User Interface", Proceedings of the Joint Conference of ICANN/ICONIP, Istanbul, Turkey, 2003.
  11. Lee, E. C., Heo, H. and Park, K. R., The comparative measurements of eyestrain caused by 2D and 3D displays, IEEE Trans. Consumer Electronics, 56(3), 1677-1683, 2010. https://doi.org/10.1109/TCE.2010.5606312
  12. Lowe, D. G., "Object recognition from local scale-invariant features", Proceedings of IEEE Conf. Computer Vision (ICCV 99), 2(pp. 1150-1157), Corfu, Greece, 1999.
  13. Lowe, D. G., Distinctive image features from scale-invariant keypoints, International Journal of Computer Vision, 60(2), 91-110, 2004. https://doi.org/10.1023/B:VISI.0000029664.99615.94
  14. Mutch, J. and Lowe, D. G., "Multiclass Object Recognition with Sparse, Localized Features", Proceedings of the IEEE Conf. Computer Vision and Pattern Recognition, 2(pp. 11-18), New York, NY, 2006.
  15. OpenCV, http://sourceforge.net/projects/opencvlibrary/.
  16. Shi, F., Gale, A. and Purdy, K., A new gaze-based interface for environmental control, Lecture Notes in Computer Science, 4555, 996-1005, 2007. https://doi.org/10.1007/978-3-540-73281-5_109
  17. Shih, S. W. and Liu, J., A novel approach to 3-D gaze tracking using stereo cameras, IEEE Trans. Systems, Man, and Cybernetics, Part B: Cybernetics, 34(1), 234-245, 2004. https://doi.org/10.1109/TSMCB.2003.811128
  18. Yoo, D. H. and Chung, M. J., "Non-intrusive Eye Gaze Estimation without Knowledge of Eye Pose", Proceedings of IEEE Conf. Automatic Face and Gesture Recognition (FGR 04), (pp. 785-790), Seoul, Korea, 2004.