A Study on Developmental Direction of Interface Design for Gesture Recognition Technology

  • Received : 2012.07.23
  • Accepted : 2012.07.31
  • Published : 2012.08.31

Abstract

Objective: This study examines how interaction between mobile devices and users is changing, based on an analysis of current trends in gesture interface technology development. Background: To support smooth interaction between machines and users, interface technology has evolved from the command line to the mouse, and touch and gesture recognition are now being researched and put into use. The technology is expected to evolve further into multi-modal interfaces that fuse the visual and auditory senses, and into 3D multi-modal interfaces that employ three-dimensional virtual environments and brain waves. Method: Within the development of computer interfaces that has accompanied the evolution of mobile devices, gesture interfaces under active research and the trends of related technologies are reviewed comprehensively. Based on the techniques used to capture gesture information, gesture interfaces are divided into four categories: sensor-based, touch-based, vision-based, and multi-modal. Each category is examined through its technology trends and existing practical examples. Through this method, the transformation of interaction between mobile devices and humans is studied. Conclusion: Gesture-based interface technology brings intelligent communication to the interaction between users and conventionally static machines. It is therefore a key element technology for making the interaction between humans and machines more dynamic. Application: The results of this study may help in developing the gesture interface designs currently in use.
