http://dx.doi.org/10.5143/JESK.2012.31.4.475

Interacting with Touchless Gestures: Taxonomy and Requirements  

Kim, Huhn (Department of Mechanical System Design Engineering, Seoul National University of Science and Technology)
Publication Information
Journal of the Ergonomics Society of Korea / v.31, no.4, 2012, pp. 475-481
Abstract
Objective: The aim of this study is to develop a taxonomy for classifying diverse touchless gestures and to establish the design requirements that should be considered when selecting suitable gestures in gesture-based interaction design. Background: Touchless gestures are becoming increasingly applicable as the relevant technologies advance. However, before touchless gestures can be widely applied to various devices and systems, an understanding of the nature of human gestures, and their standardization, is a prerequisite. Method: In this study, diverse gesture types were collected from the literature and, based on these, a new taxonomy for classifying touchless gestures was proposed. In addition, many gesture-based interaction design cases and studies were analyzed. Results: The proposed taxonomy consists of two dimensions: shape (deictic, manipulative, semantic, or descriptive) and motion (static or dynamic). The case analysis based on the taxonomy showed that manipulative and dynamic gestures were widely applied. Conclusion: The four core requirements for valuable touchless gestures are intuitiveness, learnability, convenience, and discriminability. Application: The gesture taxonomy can be applied to generate candidate touchless gestures, and the four design requirements can be used as criteria for evaluating those candidates.
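As an illustration of how the two-dimensional taxonomy described above could be applied, the sketch below models the shape and motion dimensions as enumerations and classifies two example gestures. The gesture names and the `cell` helper are illustrative assumptions for this sketch, not definitions from the paper.

```python
from dataclasses import dataclass
from enum import Enum

class Shape(Enum):
    DEICTIC = "deictic"            # pointing at a target
    MANIPULATIVE = "manipulative"  # acting directly on a (virtual) object
    SEMANTIC = "semantic"          # carries a conventional meaning (e.g. "OK")
    DESCRIPTIVE = "descriptive"    # depicts a size, shape, or path

class Motion(Enum):
    STATIC = "static"    # a held pose
    DYNAMIC = "dynamic"  # a movement over time

@dataclass(frozen=True)
class Gesture:
    name: str
    shape: Shape
    motion: Motion

def cell(g: Gesture) -> str:
    """Return the taxonomy cell a gesture falls into (shape/motion)."""
    return f"{g.shape.value}/{g.motion.value}"

# Example classifications (hypothetical gestures, chosen for illustration)
swipe_next = Gesture("swipe to next track", Shape.MANIPULATIVE, Motion.DYNAMIC)
point_select = Gesture("point at menu item", Shape.DEICTIC, Motion.STATIC)
```

Under this encoding, the paper's observation that manipulative and dynamic gestures dominate design cases amounts to most collected gestures falling in the `manipulative/dynamic` cell.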
Keywords
Touchless or non-touch gestures; Gesture taxonomy; Gesture requirements; Gesture-based interaction;
Citations & Related Records
Times Cited By KSCI: 2
1 Zerlina, F. V., Huang, H., Ku, D. H., Lee, B. L. and Lin, Y., Gesture-based Interface Cognitive Work Evaluation in a Driving Context, IE486 Final Project, School of Industrial Engineering, Purdue University.
2 Mantyjarvi, J., Kela, J., Korpipaa, P. and Kallio, S., Enabling Fast and Effortless Customisation in Accelerometer-based Gesture Interaction, MUM '04 Proceedings of the 3rd International Conference on Mobile and Ubiquitous Multimedia, pp. 25-31, 2004.
3 Nam, J. Y., Choe, J. and Jung, E. S., Development of Finger Gestures for Touchscreen-based Web Browser Operation, Journal of the Ergonomics Society of Korea, Vol. 27, No. 4, pp. 109-117, 2008.
4 Nielsen, M., Störring, M., Moeslund, T. B. and Granum, E., A Procedure for Developing Intuitive and Ergonomic Gesture Interfaces for HCI, In: Gesture-Based Communication in Human-Computer Interaction, pp. 105-106, 2004.
5 Pavlovic, V. I., Sharma, R. and Huang, T. S., Visual Interpretation of Hand Gestures for Human-Computer Interaction, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, 1997.
6 Pickering, C. A., Burnham, K. J. and Richardson, M. J., A Research Study of Hand Gesture Recognition Technologies and Applications for Human Vehicle Interaction, 3rd Institution of Engineering and Technology Conference on Automotive Electronics, pp. 1-15, 28-29 June, 2007.
7 Rahman, A. M., Hossain, M. A. and Parra, J., Motion-Path based Gesture Interaction with Smart Home Services, MM '09 Proceedings of the 17th ACM International Conference on Multimedia, pp. 761-764, 2009.
8 Saffer, D., Designing Gestural Interfaces, O'Reilly Media, Inc., 2009.
9 Vatavu, R. D. and Pentiuc, S. G., Multi-level Representation of Gesture as Command for Human Computer Interaction, Computing and Informatics, Vol. 27, pp. 1001-1015, 2008.
10 Visteon, http://www.youtube.com/watch?v=dd8i1fD_ia8, 2010.
11 Fourney, A., Terry, M. and Mann, R., http://www.youtube.com/watch?v=sjoRzqudssk, University of Waterloo, 2009.
12 Henze, N., Locken, A., Boll, S., Hesselmann, T. and Pielot, M., Free-Hand Gestures for Music Playback: Deriving Gestures with a User-Centred Process, MUM '10, December 1-3, 2010.
13 Alpern, M. and Minardo, K., Developing a Car Gesture Interface for Use as a Secondary Task, CHI 2003, pp. 932-933, 2003.
14 Bach, K. M., Jæger, M. G., Skov, M. B. and Thomassen, N. G., You Can Touch, but You Can't Look: Interacting with In-Vehicle Systems, CHI 2008, pp. 1139-1148, April 5-10, 2008.
15 eyeSight, http://www.youtube.com/watch?v=FRJp7b-EFbI, 2010.
16 Hitachi, http://www.youtube.com/watch?v=O21SYHDEPOs&feature=related, 2008.
17 Karam, M. and Schraefel, M. C., A Taxonomy of Gestures in Human Computer Interaction, Technical Report ECSTR-IAM05-009, Electronics and Computer Science, University of Southampton, 2005.
18 Kim, H. S., Hwang, S. W. and Moon, H. J., A Study on Vision Based Gesture Recognition Interface Design for Digital TV, Journal of Korean Society of Design Science, Vol. 20, No. 3, pp. 257-268, 2007.
19 Kim, H. and Park, M. K., A Study on the Constitutional Elements of GUI that Induces Gestures in Touch Screen, Journal of the Korean Society of Design Culture, Vol. 15, No. 2, pp. 146-157, 2009.
20 Kim, H. and Song, H. W., Towards Designing More Intuitive Touchless Operations based on Hand Gestures, Journal of Korean Society of Design Science, Vol. 25, No. 1, pp. 269-277, 2012.