Conditions of Applications, Situations and Functions Applicable to Gesture Interface
Ryu, Tae-Beum (Department of Industrial and Management Engineering, Hanbat National University)
Lee, Jae-Hong (Department of Industrial Engineering, Seoul National University)
Song, Joo-Bong (Department of Industrial Engineering, Seoul National University)
Yun, Myung-Hwan (Department of Industrial Engineering, Seoul National University)
1. Bhuiyan, M. and Picking, R., "Gesture-controlled user interfaces, what have we done and what's next?", 5th Collaborative Research Symposium on Security, E-Learning, Internet and Networking (pp. 59-60), Darmstadt, Germany, 2009.
2. Bhuiyan, M. and Picking, R., A gesture controlled user interface for inclusive design and evaluative study of its usability, Journal of Software Engineering and Applications, 4, 513-521, 2011.
3. Blatt, L. and Schell, A., "Gesture Set Economics for Text and Spreadsheet Editors", Proceedings of the Human Factors and Ergonomics Society 34th Annual Meeting (pp. 410-414), Orlando, FL, 1990.
4. Guo, C. and Sharlin, E., "Exploring the use of tangible user interfaces for human-robot interaction: a comparative study", CHI '08: Proceedings of the Twenty-sixth Annual SIGCHI Conference on Human Factors in Computing Systems (pp. 121-130), Florence, Italy, 2008.
5. Wilson, A. and Oliver, N., "GWindows: Towards Robust Perception-Based UI", First IEEE Workshop on Computer Vision and Pattern Recognition for Human Computer Interaction, 2003.
6. Young, J., Sung, J., Voida, A. and Sharlin, E., Evaluating human-robot interaction, International Journal of Social Robotics, 3, 53-67, 2011.
7. Rauschert, I., Agrawal, P., Sharma, R., Fuhrmann, S., Brewer, I. and MacEachren, A. M., "Designing a human-centered, multimodal GIS interface to support emergency management", Proceedings of the 10th ACM International Symposium on Advances in Geographic Information Systems (pp. 119-124), McLean, VA, 2002.
8. Nesselrath, R., Lu, C., Schulz, C. H., Frey, J. and Alexandersson, J., A gesture based system for context-sensitive interaction with smart homes, In R. Wichert and B. Eberhardt (Eds.), Advanced Technologies and Societal Change, Springer, Berlin, 209-219, 2011.
9. Oviatt, S., DeAngeli, A. and Kuhn, K., Integration and synchronization of input modes during multimodal human-computer interaction, Referring Phenomena in a Multimedia Context and their Computational Treatment, 1-13, 1997.
10. Oviatt, S., Ten myths of multimodal interaction, Communications of the ACM, 42(11), 74-81, 1999.
11. Rico, J., "Usable gestures for mobile interfaces: evaluating social acceptability", Proceedings of the 28th International Conference on Human Factors in Computing Systems (pp. 887-896), Atlanta, GA, 2010.
12. Wachs, J., Kolsch, M. and Stern, H., Vision-based hand-gesture applications, Communications of the ACM, 54(2), 60-71, 2011.
13. Ronkainen, S., Koskinen, E., Liu, Y. and Korhonen, P., Environment analysis as a basis for designing multimodal and multidevice user interfaces, Human-Computer Interaction, 25(2), 148-193, 2010.
14. Rhyne, J., Dialogue management for gestural interfaces, Computer Graphics, 21(2), 137-142, 1987.
15. Shan, C., Gesture control for consumer electronics, Multimedia Interaction and Intelligent User Interfaces, 107-128, 2010.
16. Wickens, C. D. and Hollands, J. G., Engineering Psychology and Human Performance, 3rd ed., Prentice Hall, 1999.
17. Hummels, C. and Stappers, P. J., "Meaningful gestures for human computer interaction: beyond hand postures", Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition (pp. 591-596), Nara, Japan, 1998.
18. Hurtienne, J., Stößel, C. and Sturm, C., Physical gestures for abstract concepts: Inclusive design with primary metaphors, Interacting with Computers, 22(6), 475-484, 2010.
19. Jia, P., Hu, H., Lu, T. and Yuan, K., Head gesture recognition for hands-free control of an intelligent wheelchair, Industrial Robot: An International Journal, 34(1), 60-68, 2007.
20. Kela, J., Korpipää, P., Mäntyjärvi, J., Kallio, S., Savino, G., Jozzo, L. and Marca, D., Accelerometer-based gesture control for a design environment, Personal and Ubiquitous Computing, 10(5), 285-299, 2006.
21. Kühnel, C., Westermann, T. and Hemmert, F., I'm home: Defining and evaluating a gesture set for smart-home control, International Journal of Human-Computer Studies, 69(11), 693-704, 2011.
22. Nishikawa, A., Hosoi, T., Koara, K., Negoro, D., Hikita, A., Asano, S., Kakutani, H., Miyazaki, F., Sekimoto, M., Yasui, M., Miyake, Y., Takiguchi, S. and Monden, M., FAce MOUSe: A novel human-machine interface for controlling the position of a laparoscope, IEEE Transactions on Robotics and Automation, 19(5), 825-841, 2003.
23. Li, J., Communication of emotion in social robots through simple head and arm movements, International Journal of Social Robotics, 3, 125-142, 2010.
24. Mitra, S. and Acharya, T., Gesture recognition: A survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 37(3), 311-324, 2007.
25. Nielsen, M., Störring, M., Moeslund, T. B. and Granum, E., "A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction", Proceedings of the 5th International Gesture Workshop (pp. 1-12), Aalborg, Denmark, 2003.