References
- Bhuiyan, M. and Picking, R., "Gesture-controlled user interfaces, what have we done and what's next?" 5th Collaborative Research Symposium on Security, E-Learning, Internet and Networking (pp. 59-60), Darmstadt, Germany, 2009.
- Bhuiyan, M. and Picking, R., A gesture controlled user interface for inclusive design and evaluative study of its usability, Journal of Software Engineering and Applications, 4(9), 513-521, 2011. https://doi.org/10.4236/jsea.2011.49059
- Blatt, L. and Schell, A., "Gesture Set Economics for Text and Spreadsheet Editors". Proceedings of the Human Factors and Ergonomics Society 34th Annual Meeting (pp. 410-414), Orlando, FL, 1990.
- Guo, C. and Sharlin, E., "Exploring the use of tangible user interfaces for human-robot interaction: a comparative study". CHI '08: Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems (pp. 121-130), Florence, Italy, 2008.
- Hummels, C. and Stappers, P. J., "Meaningful gestures for human computer interaction: beyond hand postures". Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition (pp. 591-596), Nara, Japan, 1998.
- Hurtienne, J., Stößel, C. and Sturm, C., Physical gestures for abstract concepts: Inclusive design with primary metaphors. Interacting with Computers, 22(6), 475-484, 2010. https://doi.org/10.1016/j.intcom.2010.08.009
- Jia, P., Hu, H., Lu, T. and Yuan, K., Head gesture recognition for hands-free control of an intelligent wheelchair. Industrial Robot: An International Journal, 34(1), 60-68, 2007. https://doi.org/10.1108/01439910710718469
- Kela, J., Korpipää, P., Mäntyjärvi, J., Kallio, S., Savino, G., Jozzo, L. and Marca, D., Accelerometer-based gesture control for a design environment, Personal and Ubiquitous Computing, 10(5), 285-299, 2006. https://doi.org/10.1007/s00779-005-0033-8
- Kühnel, C., Westermann, T. and Hemmert, F., I'm home: Defining and evaluating a gesture set for smart-home control, International Journal of Human-Computer Studies, 69(11), 693-704, 2011. https://doi.org/10.1016/j.ijhcs.2011.04.005
- Li, J., Communication of Emotion in Social Robots through Simple Head and Arm Movements, International Journal of Social Robotics, 3, 125-142, 2010.
- Mitra, S. and Acharya, T., Gesture recognition: A survey. IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 37(3), 311-324, 2007. https://doi.org/10.1109/TSMCC.2007.893280
- Nielsen, M., Störring, M., Moeslund, T. B. and Granum, E., "A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction". Proceedings of the 5th International Gesture Workshop (pp. 1-12), Aalborg, Denmark, 2003.
- Nesselrath, R., Lu, C., Schulz, C. H., Frey, J. and Alexandersson, J., A gesture based system for context-sensitive interaction with smart homes, In R. Wichert and B. Eberhardt (Eds.), Advanced Technologies and Societal Change, Springer, Berlin, 209-219, 2011.
- Nishikawa, A., Hosoi, T., Koara, K., Negoro, D., Hikita, A., Asano, S., Kakutani, H., Miyazaki, F., Sekimoto, M., Yasui, M., Miyake, Y., Takiguchi, S. and Monden, M., FAce MOUSe: A novel human-machine interface for controlling the position of a laparoscope. IEEE Transactions on Robotics and Automation, 19(5), 825-841, 2003. https://doi.org/10.1109/TRA.2003.817093
- Oviatt, S., DeAngeli, A. and Kuhn, K., Integration and synchronization of input modes during multimodal human-computer interaction. Referring Phenomena in a Multimedia Context and their Computational Treatment, 1-13, 1997.
- Oviatt, S., Ten Myths of Multimodal Interaction. Communications of the ACM, 42(11), 74-81, 1999. https://doi.org/10.1145/319382.319398
- Rauschert, I., Agrawal, P., Sharma, R., Fuhrmann, S., Brewer, I. and MacEachren, A. M., "Designing a human-centered, multimodal GIS interface to support emergency management". Proceedings of the 10th ACM International Symposium on Advances in Geographic Information Systems (pp. 119-124), McLean, VA, 2002.
- Rhyne, J., Dialogue Management for Gestural Interfaces. Computer Graphics, 21(2), 137-142, 1987. https://doi.org/10.1145/24919.24933
- Rico, J., "Usable gestures for mobile interfaces: evaluating social acceptability", Proceedings of the 28th International Conference on Human Factors in Computing Systems (pp. 887-896), Atlanta, GA, 2010.
- Ronkainen, S., Koskinen, E., Liu, Y. and Korhonen, P., Environment Analysis as a Basis for Designing Multimodal and Multidevice User Interfaces, Human-Computer Interaction, 25(2), 148-193, 2010. https://doi.org/10.1080/07370020903586712
- Shan, C., Gesture Control for Consumer Electronics, Multimedia Interaction and Intelligent User Interfaces, 107-128, 2010. https://doi.org/10.1007/978-1-84996-507-1_5
- Wachs, J., Kolsch, M. and Stern, H., Vision-based hand-gesture applications. Communications of the ACM, 54(2), 60-71, 2011. https://doi.org/10.1145/1897816.1897838
- Wickens, C. D. and Hollands, J. G., Engineering psychology and human performance, 3rd ed., Prentice Hall, 1999.
- Wilson, A. and Oliver, N., "GWindows: Towards Robust Perception-Based UI". First IEEE Workshop on Computer Vision and Pattern Recognition for Human Computer Interaction, 2003.
- Young, J., Sung, J., Voida, A. and Sharlin, E., Evaluating human-robot interaction, International Journal of Social Robotics, 3, 53-67, 2011. https://doi.org/10.1007/s12369-010-0081-8