• Title/Summary/Keyword: User-Defined Gesture


Gesture interface with 3D accelerometer for mobile users (모바일 사용자를 위한 3 차원 가속도기반 제스처 인터페이스)

  • Choe, Bong-Whan; Hong, Jin-Hyuk; Cho, Sung-Bae
    • 한국HCI학회:학술대회논문집 / 2009.02a / pp.378-383 / 2009
  • Many systems today try to infer users' intentions and provide corresponding services. People always carry mobile devices equipped with various sensors, and the accelerometer plays a key role in this environment: it collects motion information that is useful for developing gesture-based user interfaces. Because recognizing time-series patterns such as gestures requires heavy computation, an effective method is needed for the mobile environment, which offers relatively limited computational capability. In this paper, we propose a two-stage motion recognizer composed of low-level and high-level motions based on a motion library. The low-level motion recognizer applies dynamic time warping (DTW) to 3D acceleration data, and high-level motions are defined linguistically in terms of the low-level motions (a minimal DTW sketch follows this entry).

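The paper describes its low-level recognizer only at a high level. The sketch below shows how DTW-based template matching over 3D acceleration samples could look; the function names and the template-dictionary interface are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dtw_distance(seq_a, seq_b):
    """Dynamic time warping distance between two 3D acceleration
    sequences, each an (N, 3) array of (x, y, z) samples."""
    seq_a = np.asarray(seq_a, dtype=float)
    seq_b = np.asarray(seq_b, dtype=float)
    n, m = len(seq_a), len(seq_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(seq_a[i - 1] - seq_b[j - 1])  # per-sample Euclidean distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def classify_low_level(query, templates):
    """Return the label of the template gesture (dict: label -> sequence)
    whose DTW distance to the query sequence is smallest."""
    return min(templates, key=lambda label: dtw_distance(query, templates[label]))
```

In such a design, high-level motions could then be expressed as sequences of the low-level labels returned by the classifier, which matches the "linguistic" definition the abstract mentions.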

Human Gesture Recognition Technology Based on User Experience for Multimedia Contents Control (멀티미디어 콘텐츠 제어를 위한 사용자 경험 기반 동작 인식 기술)

  • Kim, Yun-Sik; Park, Sang-Yun; Ok, Soo-Yol; Lee, Suk-Hwan; Lee, Eung-Joo
    • Journal of Korea Multimedia Society / v.15 no.10 / pp.1196-1204 / 2012
  • In this paper, a series of algorithms is proposed for controlling different kinds of multimedia contents and enabling human-computer interaction with a single input device. NUI-based human gesture recognition is presented first. Because the raw camera image is not well suited to further processing, it is transformed into the YCbCr color space, and morphological processing is applied to remove noise. Boundary energy and depth information are then extracted for hand detection. From the detected hand region, PCA is used to recognize hand posture, while difference images and the moment method are used to locate the hand centroid and extract the trajectory of hand movement. Eight direction codes are defined to quantize the gesture trajectory into a symbol sequence, and an HMM is applied to recognize gestures from those symbols (see the quantization sketch after this entry). Using these methods, multimedia contents can be controlled through human gesture recognition. In extensive experiments the proposed algorithms performed well: hand detection reached 94.25%, gesture recognition exceeded 92.6%, hand posture recognition achieved 85.86%, and face detection reached 89.58%. These results show that many kinds of multimedia contents on a computer, such as video players, MP3 players, and e-books, can be controlled effectively.
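
The trajectory quantization step lends itself to a short illustration. The sketch below is a minimal example, not the authors' code: it converts a hand-centroid trajectory into an 8-symbol direction-code sequence of the kind an HMM recognizer would consume. The specific angle-to-code mapping is an assumption; the paper only states that eight direction codes are used.

```python
import math

def direction_code(prev_pt, cur_pt):
    """Quantize the motion between two consecutive centroid points into one
    of 8 direction codes (0 = right, increasing counter-clockwise in
    45-degree steps). The exact mapping is assumed for illustration."""
    dx = cur_pt[0] - prev_pt[0]
    dy = cur_pt[1] - prev_pt[1]
    angle = math.atan2(dy, dx) % (2 * math.pi)           # 0 .. 2*pi
    return int((angle + math.pi / 8) // (math.pi / 4)) % 8

def trajectory_to_symbols(points):
    """Convert a hand-centroid trajectory into the symbol sequence
    that an HMM-based gesture recognizer would take as input."""
    return [direction_code(p, q) for p, q in zip(points, points[1:])]

# Example: a roughly rightward stroke becomes a run of 0-codes.
print(trajectory_to_symbols([(0, 0), (5, 1), (10, 0), (15, -1)]))  # [0, 0, 0]
```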

Gesture Recognition based on Motion Inertial Sensors for Interactive Game Contents (체험형 게임콘텐츠를 위한 움직임 관성센서 기반의 제스처 인식)

  • Jung, Young-Kee; Cha, Byung-Rae
    • Journal of Advanced Navigation Technology / v.13 no.2 / pp.262-271 / 2009
  • The purpose of this study is to propose a gesture recognition method for game content production that recognizes the user's movements with an inertial sensor and lets the user play by comparing the recognized movements with pre-defined ones. In addition, the study provides users with various input methods by having them wear small controllers built around a three-axis accelerometer. Users play by moving according to the action list shown on the screen, and the experiential game progresses according to the accuracy and timing of their movements. Wearing multiple small wireless controllers on the hands and feet and using the proposed method is expected to make the game more engaging and immersive (a scoring sketch follows this entry).

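The abstract says gameplay is scored by comparing recognized movements with a pre-defined action list by accuracy and timing, without giving the rule. The sketch below is one possible scoring scheme under that description; the class name, the time window, and the hit-ratio score are assumptions, not the paper's design.

```python
from dataclasses import dataclass

@dataclass
class ActionCue:
    """One entry of the on-screen action list: which gesture is expected
    and when (seconds from the start of the stage)."""
    gesture: str
    time: float

def score_performance(cues, detections, window=0.5):
    """Score detected gestures against the pre-defined action list.
    `detections` is a list of (gesture_label, timestamp) pairs from the
    inertial-sensor recognizer. A cue counts as a hit when the same gesture
    is detected within `window` seconds of the cue time."""
    hits = 0
    used = set()
    for cue in cues:
        for i, (label, t) in enumerate(detections):
            if i in used or label != cue.gesture:
                continue
            if abs(t - cue.time) <= window:
                hits += 1
                used.add(i)
                break
    return hits / len(cues) if cues else 0.0

# Example: two of three cues matched in time -> score of about 0.67.
cues = [ActionCue("punch", 1.0), ActionCue("kick", 2.0), ActionCue("jump", 3.0)]
detections = [("punch", 1.1), ("kick", 2.8), ("jump", 3.2)]
print(score_performance(cues, detections))
```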

Experience Design Guideline for Smart Car Interface (스마트카의 인터페이스를 위한 경험 디자인 가이드라인)

  • Yoo, Hoon Sik; Ju, Da Young
    • Design Convergence Study / v.15 no.1 / pp.135-150 / 2016
  • With the development of communication technology and the expansion of Intelligent Transport Systems (ITS), the car is changing from a simple mechanical device into a second living space with comprehensive convenience functions, evolving into a platform whose interface supports this role. As the interface area that provides information to passengers expands, research on smart-car-based user experience is becoming more important. This study proposes guidelines for smart car user experience elements. The elements were defined as function, interaction, and surface, and through discussions with UX/UI experts, 8 representative techniques, 14 representative techniques, and 8 glass-window locations were specified for the three elements, respectively. The priorities that smart car users place on these experience elements were then analyzed through a questionnaire survey of 100 drivers. The analysis showed that users' priorities in applying the main techniques were, in order, safety, distance, and sensibility. For input methods, the priorities were voice recognition, touch, gesture, physical button, and eye tracking, and for glass-window locations users prioritized the front of the driver's seat over the back. A demographic analysis by gender found no significant differences except for two functions, indicating that the guidelines can be applied to men and women alike. Through this analysis of user requirements for individual elements, the study provides guidance on which requirements should be prioritized when each element is applied to a commercial product.

User-centric Immersible and Interactive Electronic Book based on the Interface of Tabletop Display (테이블탑 디스플레이 기반 사용자 중심의 실감형 상호작용 전자책)

  • Song, Dae-Hyeon; Park, Jae-Wan; Lee, Chil-Woo
    • The Journal of the Korea Contents Association / v.9 no.6 / pp.117-125 / 2009
  • In this paper, we propose a user-centric, immersive, and interactive electronic book based on a tabletop display interface. Electronic books are typically used by readers who want text combined with multimedia contents such as video, audio, and animation. Because the system is built on a tabletop display platform, conventional input devices such as a keyboard and mouse are not required: users interact with the contents through finger-touch gestures defined for the tabletop interface, which makes the electronic book more effective and engaging to use. The interface also supports multiple users, enabling richer effects than conventional electronic contents designed for a single user. Our method offers a new approach to the conventional electronic book, defines user-centric gestures, and helps users interact with the book easily; we expect it can be applied to many edutainment contents (an illustrative gesture-mapping sketch follows this entry).
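
The abstract mentions finger-touch gestures mapped to e-book operations but does not detail them. The sketch below is a purely illustrative mapping: every gesture name, threshold, and action is assumed rather than taken from the paper.

```python
from typing import Callable, Dict, List, Tuple

def classify_touch_gesture(start: List[Tuple[float, float]],
                           end: List[Tuple[float, float]]) -> str:
    """Tiny gesture classifier for a multi-touch tabletop surface: one finger
    moving mostly horizontally -> swipe, two fingers moving apart -> pinch-out,
    anything else -> tap. Thresholds and labels are illustrative assumptions."""
    if len(start) == 1:
        dx = end[0][0] - start[0][0]
        if abs(dx) > 50:
            return "swipe_right" if dx > 0 else "swipe_left"
        return "tap"
    if len(start) == 2:
        def spread(pts):  # distance between the two finger positions
            return ((pts[0][0] - pts[1][0]) ** 2 + (pts[0][1] - pts[1][1]) ** 2) ** 0.5
        return "pinch_out" if spread(end) > spread(start) else "pinch_in"
    return "tap"

# One possible mapping from gestures to e-book actions (assumed, not the paper's).
ACTIONS: Dict[str, Callable[[], None]] = {
    "swipe_left":  lambda: print("next page"),
    "swipe_right": lambda: print("previous page"),
    "pinch_out":   lambda: print("zoom into figure / play embedded media"),
    "pinch_in":    lambda: print("zoom out"),
    "tap":         lambda: print("select item under finger"),
}

ACTIONS[classify_touch_gesture([(100, 200)], [(20, 205)])]()  # prints "next page"
```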