• Title/Summary/Keyword: Gesture Interface


A Study on Gesture Interface through User Experience (사용자 경험을 통한 제스처 인터페이스에 관한 연구)

  • Yoon, Ki Tae; Cho, Eel Hea; Lee, Jooyoup
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.7 no.6 / pp.839-849 / 2017
  • Recently, the role of the kitchen has evolved from a space of mere subsistence to a space that reflects contemporary life and culture. Along with this change, the use of IoT technology is spreading, leading to the development and diffusion of new smart devices for the kitchen, and the user experience of these devices is becoming increasingly important. For natural interaction between a user and a computer, better interactions can be expected based on context awareness. This paper examines a touch-free Natural User Interface (NUI) built on the user interface (UI) of a smart device used in the kitchen. The method uses image processing to recognize the user's hand gestures with the camera attached to the device and applies the recognized hand shapes to the interface. The gestures are proposed according to the user's context and situation, and five kinds of gestures are classified and used in the interface.
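
The abstract does not give implementation details, so the following is only a minimal Python/OpenCV sketch of camera-based hand-region detection of the kind it describes; the HSV skin-color range, the morphology kernel, and the idea of mapping the detected shape to the five context-dependent gestures are illustrative assumptions, not the authors' method.

```python
# Minimal sketch (not the authors' code): camera-based hand-region detection
# with OpenCV, using HSV skin-color thresholding and contour analysis.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                                        # device camera
lower, upper = np.array([0, 30, 60]), np.array([20, 150, 255])   # rough skin range (assumption)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower, upper)                        # skin-color mask
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)                # assume largest blob is the hand
        hull = cv2.convexHull(hand)
        cv2.drawContours(frame, [hull], -1, (0, 255, 0), 2)
        # A real system would classify the hand shape here and map it
        # to one of the five context-dependent gestures described above.
    cv2.imshow("hand", frame)
    if cv2.waitKey(1) & 0xFF == 27:                              # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```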

Hand Gesture Interface Using Mobile Camera Devices (모바일 카메라 기기를 이용한 손 제스처 인터페이스)

  • Lee, Chan-Su; Chun, Sung-Yong; Sohn, Myoung-Gyu; Lee, Sang-Heon
    • Journal of KIISE: Computing Practices and Letters / v.16 no.5 / pp.621-625 / 2010
  • This paper presents a hand-motion tracking method for a gesture interface that uses the camera of mobile devices such as smartphones and PDAs. When the camera moves with the user's hand gesture, global optical flow is generated, so robust hand-movement estimation is possible by taking the dominant optical flow from a histogram analysis of motion direction. A continuous hand gesture is segmented into unit gestures by estimating the motion state from the motion phase, which is determined by the velocity and acceleration of the estimated hand motion. Feature vectors are extracted during the movement states, and each gesture is recognized at its end state. A support vector machine (SVM), a k-nearest neighbor classifier, and a normal Bayes classifier are used for classification; the SVM achieves an 82% recognition rate on 14 hand gestures.
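
As an illustration of the dominant-optical-flow idea described above, here is a minimal Python/OpenCV sketch that computes dense optical flow between consecutive frames and picks the peak of a magnitude-weighted direction histogram; the bin count, the Farneback parameters, and the webcam source are assumptions, and the motion-phase segmentation and SVM classification stages are omitted.

```python
# Minimal sketch (not the authors' code): estimating the dominant motion
# direction between two frames from dense optical flow and a direction histogram.
import cv2
import numpy as np

def dominant_direction(prev_gray, gray, bins=8):
    """Return the dominant optical-flow direction (radians) between two frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    # Histogram of directions, weighted by flow magnitude.
    hist, edges = np.histogram(ang, bins=bins, range=(0, 2 * np.pi), weights=mag)
    peak = np.argmax(hist)
    return 0.5 * (edges[peak] + edges[peak + 1])   # center of the dominant bin

cap = cv2.VideoCapture(0)
_, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print("dominant direction (rad):", dominant_direction(prev_gray, gray))
    prev_gray = gray
```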

Design of Contactless Gesture-based Rhythm Action Game Interface for Smart Mobile Devices

  • Ju, Da-Young
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.585-591 / 2012
  • Objective: The aim of this study is to propose a contactless gesture-based interface on smart mobile devices, in particular for rhythm action games. Background: Most existing interactions in smart mobile games are taps on the touch screen. However, that approach is undesirable for some users and in some situations, for example for people with disabilities or when touching or tapping the device is inconvenient. More importantly, a new interaction can open new possibilities for an existing game genre. Method: This paper presents a smart mobile game with contactless gesture-based interaction and an interface built on computer vision technology. As preliminary studies, gestures that are easy to recognize were identified and an interaction system suited to games on smart mobile devices was investigated; a combination of augmented reality and contactless gesture interaction was also attempted. Results: The rhythm game allows a user to interact with a smart mobile device using hand gestures, without touching or tapping the screen, while still enjoying the game as with other games. Conclusion: Evaluation results show that users make few errors and that the game recognizes gestures with high precision in real time, so contactless gesture-based interaction has potential for smart mobile games. Application: The results were applied to a commercial game application.

A new study on hand gesture recognition algorithm using leap motion system (Leap Motion 시스템을 이용한 손동작 인식기반 제어 인터페이스 기술 연구)

  • Nam, Jae-Hyun; Yang, Seung-Hun; Hu, Woong; Kim, Byung-Gyu
    • Journal of Korea Multimedia Society / v.17 no.11 / pp.1263-1269 / 2014
  • With the rapid development of new hardware control interface technology, new concepts continue to be proposed and to emerge. In this paper, a new approach based on the Leap Motion system is proposed. Using position information from the sensor, hand gestures are recognized against pre-defined patterns. To do this, we design a recognition algorithm based on hand-gesture and finger patterns. As a representative application, we apply the proposed scheme to a 3-dimensional avatar control and editing software tool for making animations in cyberspace. The proposed algorithm can also be used to control computer systems in medical treatment, games, education, and various other areas.
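
The paper matches sensor-reported hand positions against pre-defined patterns; the sketch below shows one simple way such template matching could look in Python. The Leap Motion SDK calls are omitted, and the `fingertips` array, the two templates, and the distance metric are hypothetical stand-ins rather than the authors' algorithm.

```python
# Minimal sketch (not the authors' algorithm): matching a frame of fingertip
# positions against pre-defined gesture templates by nearest-template distance.
# `fingertips` is assumed to be a 5 x 3 array of (x, y, z) positions already
# read from the sensor; the templates below are illustrative.
import numpy as np

TEMPLATES = {
    "open_hand": np.array([[-60, 40, 0], [-25, 70, 0], [0, 80, 0], [25, 70, 0], [55, 40, 0]], float),
    "fist":      np.array([[-20, 10, 0], [-10, 15, 0], [0, 18, 0], [10, 15, 0], [20, 10, 0]], float),
}

def classify(fingertips: np.ndarray) -> str:
    """Return the name of the template closest to the observed fingertip layout."""
    pts = fingertips - fingertips.mean(axis=0)        # translation-invariant
    scores = {name: np.linalg.norm(pts - (t - t.mean(axis=0)))
              for name, t in TEMPLATES.items()}
    return min(scores, key=scores.get)

# Example: a frame whose fingertips are spread out should match "open_hand".
frame = np.array([[-58, 42, 2], [-24, 68, 1], [1, 79, 0], [27, 69, 1], [54, 41, 2]], float)
print(classify(frame))
```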

Gesture Communication: Collaborative and Participatory Design in a New Type of Digital Communication (제스츄어 커뮤니케이션: 새로운 방식의 디지털 커뮤니케이션의 참여 디자인 제안)

  • Won, Ha Youn
    • Korea Science and Art Forum / v.20 / pp.307-314 / 2015
  • Tele-Gesture is a tangible user interface (TUI) device that allows a user to physically point to a 3D object in real life and have the gesture played back by a robotic finger that points to the same object, either at the same time or at a later point in time. To understand the extent of gestures as a new form of digital collaborative communication, collaboration situations and types were examined through TUI implementations. The design prototype reveals that there is a rich non-verbal component of communication, in the form of gesture clusters and body movements, in digital communication. This analysis can contribute to the fields of communication, human behavior, and interaction with high technology through an interpretive social experience.

A Study on Tangible Gesture Interface Prototype Development of the Quiz Game (퀴즈게임의 체감형 제스처 인터페이스 프로토타입 개발)

  • Ahn, Jung-Ho; Ko, Jae-Pil
    • Journal of Digital Contents Society / v.13 no.2 / pp.235-245 / 2012
  • This paper introduces quiz game content based on a gesture interface. We analyzed off-line quiz games, extracted their presiding components, and digitized them so that the proposed game content can substitute for off-line quiz games. We used a Kinect camera to obtain depth images and performed preprocessing consisting of vertical human segmentation, head detection and tracking, and hand detection, followed by gesture recognition for hand-up, vertical hand movement, fist shape, pass, and fist-and-attraction gestures. In particular, we designed the interface gestures as metaphors for natural real-world gestures so that users can tangibly feel the abstract concepts of movement, selection, and confirmation. Compared with our previous work, we added a card compensation process for completeness, improved the vertical hand-movement and fist-shape recognition methods used for selecting among the given choices, and present an organized test to measure recognition performance. The implemented quiz application was tested in real time and showed very satisfactory gesture recognition results.
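
As a rough illustration of the depth-based pipeline summarized above, the following Python/OpenCV sketch segments the nearest blob in a depth frame and separates a fist from an open hand by contour solidity; acquiring the Kinect depth frame is omitted, and the depth band, solidity threshold, and the solidity heuristic itself are assumptions rather than the paper's method.

```python
# Minimal sketch (not the authors' pipeline): segment the hand's depth band
# and tell a fist from an open hand by how completely the contour fills its
# convex hull. `depth_mm` is assumed to be a 16-bit depth map in millimetres.
import cv2
import numpy as np

def detect_fist(depth_mm: np.ndarray, near=500, far=900, solidity_thresh=0.85):
    """Return (is_fist, centroid) for the nearest blob, or (None, None)."""
    mask = cv2.inRange(depth_mm, near, far)                       # keep the hand's depth band
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, None
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand)
    solidity = cv2.contourArea(hand) / max(cv2.contourArea(hull), 1e-6)
    m = cv2.moments(hand)
    centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"]) if m["m00"] else (0, 0)
    # A closed fist fills its convex hull more completely than spread fingers.
    return solidity > solidity_thresh, centroid
```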

A Development of the Next-generation Interface System Based on the Finger Gesture Recognizing in Use of Image Process Techniques (영상처리를 이용한 지화인식 기반의 차세대 인터페이스 시스템 개발)

  • Kim, Nam-Ho
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.4 / pp.935-942 / 2011
  • This study designs and implements a finger-gesture recognition system that automatically recognizes finger gestures input through a camera and uses them to control a computer. Common CCD cameras were converted into infrared cameras to acquire the images. The captured images are pre-processed to find hand features, the finger gestures are read from them, and events are generated for subsequent mouse control and presentation control; in this way a method of controlling computers is suggested. The finger-gesture recognition system presented in this study has been verified as a next-generation interface that can replace the mouse and keyboard in future information devices.
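
To illustrate the "gesture triggers a mouse event" step described above, here is a minimal Python sketch that maps an already-recognized gesture label to cursor actions; pyautogui stands in for whatever OS event mechanism the authors used, and the gesture names and mappings are purely illustrative.

```python
# Minimal sketch (not the authors' system): dispatching an already-recognized
# finger gesture to mouse events. The gesture recognizer itself is omitted.
import pyautogui

def dispatch(gesture: str, x: int, y: int) -> None:
    """Translate a recognized finger gesture into a mouse action."""
    if gesture == "point":          # one extended finger: move the cursor
        pyautogui.moveTo(x, y)
    elif gesture == "pinch":        # thumb + index pinch: left click
        pyautogui.click(x, y)
    elif gesture == "two_fingers":  # two extended fingers: right click
        pyautogui.click(x, y, button="right")
    # Unknown gestures are ignored.

# Example: a frame classified as "point" at screen position (640, 360).
dispatch("point", 640, 360)
```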

A Hierarchical Bayesian Network for Real-Time Continuous Hand Gesture Recognition (연속적인 손 제스처의 실시간 인식을 위한 계층적 베이지안 네트워크)

  • Huh, Sung-Ju; Lee, Seong-Whan
    • Journal of KIISE: Software and Applications / v.36 no.12 / pp.1028-1033 / 2009
  • This paper presents a real-time hand gesture recognition approach for controlling a computer. We define hand gestures as continuous hand postures and their movements, for easy expression of various gestures, and propose a Two-layered Bayesian Network (TBN) to recognize them. The proposed method can compensate for an incorrectly recognized hand posture and its location using the preceding and following information. To verify the usefulness of the proposed method, we implemented a Virtual Mouse interface, a gesture-based counterpart of a physical mouse. In experiments, the proposed method showed recognition rates of 94.8% and 88.1% for a simple and a cluttered background, respectively, outperforming a previous HMM-based method, which achieved 92.4% and 83.3% under the same conditions.
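
The paper's Two-layered Bayesian Network is not reproduced here; as a much simpler illustration of correcting a posture label from preceding and following frames, the sketch below runs forward-backward smoothing over a three-posture model with an assumed transition matrix and made-up per-frame likelihoods.

```python
# Minimal sketch (not the authors' TBN): correcting a noisy posture label
# using preceding and following frames via forward-backward smoothing.
import numpy as np

POSTURES = ["open", "fist", "point"]
# Postures tend to persist between frames (illustrative transition matrix).
A = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])

def smooth(obs_probs: np.ndarray) -> list:
    """obs_probs: (T, 3) per-frame posture likelihoods from a detector."""
    T, N = obs_probs.shape
    fwd = np.zeros((T, N))
    bwd = np.ones((T, N))
    fwd[0] = obs_probs[0] / obs_probs[0].sum()
    for t in range(1, T):                       # forward pass (preceding info)
        fwd[t] = obs_probs[t] * (fwd[t - 1] @ A)
        fwd[t] /= fwd[t].sum()
    for t in range(T - 2, -1, -1):              # backward pass (following info)
        bwd[t] = A @ (obs_probs[t + 1] * bwd[t + 1])
        bwd[t] /= bwd[t].sum()
    post = fwd * bwd
    return [POSTURES[i] for i in post.argmax(axis=1)]

# A frame mis-detected as "fist" in the middle of an "open" run is corrected.
probs = np.array([[0.9, 0.05, 0.05], [0.4, 0.5, 0.1], [0.9, 0.05, 0.05]])
print(smooth(probs))   # ['open', 'open', 'open'] under these assumptions
```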

Implementation of Hand-Gesture Interface to manipulate a 3D Object of Augmented Reality (증강현실의 3D 객체 조작을 위한 핸드-제스쳐 인터페이스 구현)

  • Jang, Myeong-Soo; Lee, Woo-Beom
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.16 no.4 / pp.117-123 / 2016
  • In this paper, a hand-gesture interface for manipulating a 3D object in augmented reality is implemented by recognizing the user's hand gestures. The proposed method extracts the hand region from the real image and creates the augmented object from a hand marker recognized from the user's hand gesture. 3D object manipulation corresponding to the user's hand gesture is then performed by analyzing the hand-region ratio, the number of fingers, and the variation of the hand-region center. To evaluate the performance of the proposed method, a 3D object was built with the OpenGL library and all processing tasks were implemented with the Intel OpenCV library and C++. As a result, the proposed method successfully achieved an average recognition rate of 90% for the user command modes.
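
The original implementation is in C++ with OpenCV and OpenGL; the following Python/OpenCV sketch only illustrates how the three cues named in the abstract (hand-region area ratio, finger count, and hand-center shift) might be computed from a binary hand mask, with the defect-depth cutoff and other thresholds as assumptions.

```python
# Minimal sketch (not the authors' C++ implementation): computing the hand-region
# area ratio, a convexity-defect-based finger count, and the shift of the hand
# center from a binary hand mask.
import cv2
import numpy as np

def hand_cues(mask: np.ndarray, prev_center=None):
    """mask: binary hand mask (uint8). Returns (area_ratio, fingers, center_shift),
    or None if no hand region is found."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    area_ratio = cv2.contourArea(hand) / float(mask.shape[0] * mask.shape[1])

    # Count fingers from deep convexity defects between extended fingers.
    hull_idx = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull_idx)
    fingers = 0
    if defects is not None:
        fingers = sum(1 for d in defects[:, 0] if d[3] / 256.0 > 20) + 1

    m = cv2.moments(hand)
    center = np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])
    shift = np.linalg.norm(center - prev_center) if prev_center is not None else 0.0
    return area_ratio, fingers, shift
```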