• Title/Summary/Keyword: Touch gesture

Search Results: 73

Multi-Modal Interface Design for Non-Touch Gesture Based 3D Sculpting Task (비접촉식 제스처 기반 3D 조형 태스크를 위한 다중 모달리티 인터페이스 디자인 연구)

  • Son, Minji;Yoo, Seung Hun
    • Design Convergence Study
    • /
    • v.16 no.5
    • /
    • pp.177-190
    • /
    • 2017
  • This research aims to suggest a multimodal non-touch gesture interface design that improves the usability of 3D sculpting tasks. The tasks and procedures of design sculpting were analyzed across multiple circumstances, from physical sculpting to computer software. The optimal body posture, design process, work environment, gesture-task relationship, and combinations of designers' natural hand gestures and arm movements were defined. Existing non-touch 3D software was also examined, and natural gesture interactions, visual metaphors for the UI, and affordances for behavior guidance were designed. A prototype of the gesture-based 3D sculpting system was developed to validate its intuitiveness and learnability in comparison with current software. The suggested gestures showed higher performance in terms of understandability, memorability, and error rate. The results indicate that gesture interface design for productivity systems should reflect users' natural experience in their previous work domain and provide appropriate visual-behavioral metaphors.

Design of Contactless Gesture-based Rhythm Action Game Interface for Smart Mobile Devices

  • Ju, Da-Young
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.4
    • /
    • pp.585-591
    • /
    • 2012
  • Objective: The aim of this study is to propose a contactless gesture-based interface on smart mobile devices, especially for rhythm action games. Background: Most existing interactions in smart mobile games are taps on the touch screen. However, that approach is undesirable for some users and in some situations, for example for people with disabilities, or because of the inconvenience of having to touch or tap a specific device. More importantly, a new interaction style can open new possibilities for an established game genre. Method: In this paper, I present a smart mobile game with contactless gesture-based interaction, and its interfaces, using computer vision technology. Discovering gestures that are easy to recognize and investigating an interaction system that fits games on smart mobile devices were conducted as preliminary studies. A combination of augmented reality techniques and contactless gesture interaction was also tried. Results: The rhythm game allows a user to interact with a smart mobile device using hand gestures, without touching or tapping the screen, and users can enjoy the game as much as other games. Conclusion: Evaluation results show that users make few errors, and the game is able to recognize gestures with quite high precision in real time. Therefore, contactless gesture-based interaction has potential for smart mobile games. Application: The results were applied to a commercial game application.

Implementation of new gestures on the Multi-touch table

  • Park, Sang Bong;Kim, Beom jin
    • International Journal of Advanced Culture Technology
    • /
    • v.1 no.1
    • /
    • pp.15-18
    • /
    • 2013
  • This paper describes new gestures on a multi-touch table. Two new three-finger gestures are used to minimize all currently open windows and to switch Aero mode. We also implement an FTIR (Frustrated Total Internal Reflection) multi-touch table that consists of a sheet of acrylic, infrared LEDs, a camera, and a rear projector. The operation of the proposed gestures is verified on the implemented multi-touch table.
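The FTIR principle described above can be sketched in a few lines: fingertips pressed against the acrylic frustrate total internal reflection and show up as bright blobs in the infrared camera image, so thresholding plus connected-component labeling yields one blob per touching finger. This is an illustrative sketch, not the paper's implementation; the frame format and threshold value are assumptions.

```python
# Minimal sketch of FTIR touch detection: bright pixels above a threshold
# are grouped into connected blobs; a three-finger gesture is reported
# when exactly three blobs are present. (Frame format and threshold are
# illustrative assumptions, not taken from the paper.)

def find_touch_blobs(frame, threshold=200):
    """Return a list of blobs; each blob is a set of (row, col) pixels."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # flood-fill one connected bright region (4-connectivity)
                stack, blob = [(r, c)], set()
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.add((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                           and frame[ny][nx] >= threshold and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append(blob)
    return blobs

def is_three_finger_gesture(frame):
    return len(find_touch_blobs(frame)) == 3

# Synthetic 8x8 IR frame with three bright spots (three fingertips).
frame = [[0] * 8 for _ in range(8)]
for r, c in [(1, 1), (1, 5), (5, 3)]:
    frame[r][c] = frame[r][c + 1] = 255
print(is_three_finger_gesture(frame))  # three blobs -> True
```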


Design and Implementation of a Smartphone-based User-Convenience Home Network Control System using Gesture (제스처를 이용한 스마트폰 기반 사용자 편의 홈 네트워크 제어 시스템의 설계 및 구현)

  • Jeon, Byoungchan;Cha, Siho
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.11 no.2
    • /
    • pp.113-120
    • /
    • 2015
  • As the penetration of smartphones equipped with a variety of features grows globally, efficient use of their many functions has become increasingly important. In line with this trend, much research has been conducted on remote control methods that use smartphones for consumer products in home networks. Input on current smartphones is typically button-based, through touching. Button input methods are inconvenient for people who are not familiar with touch, so research on alternative input schemes to replace touch methods is required. In this paper, we propose a gesture-based input method to replace the touch-sensitive input of existing smartphone applications, and a way to apply it to home networks. The proposed method uses the three-axis acceleration sensor built into smartphones, and defines six kinds of gesture patterns that may be applied to home network systems, evaluating them by measuring their recognition rates.
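A simple way to picture six accelerometer-based gesture patterns is a strong swing along the positive or negative direction of each of the three axes. The sketch below is an illustrative classifier under that assumption, not the paper's algorithm; the gesture names, the rest vector, and the peak-deviation rule are all assumptions for the example.

```python
# Illustrative sketch (not the paper's method): classify a window of
# 3-axis accelerometer samples into one of six coarse gesture patterns
# by the axis with the largest peak deviation from the resting reading.

GESTURES = {
    ('x', +1): 'swing right',  ('x', -1): 'swing left',
    ('y', +1): 'swing up',     ('y', -1): 'swing down',
    ('z', +1): 'push forward', ('z', -1): 'pull back',
}

def classify(samples, rest=(0.0, 0.0, 9.8)):
    """samples: list of (ax, ay, az) readings for one gesture window."""
    peak = {'x': 0.0, 'y': 0.0, 'z': 0.0}
    for s in samples:
        for axis, value, base in zip('xyz', s, rest):
            d = value - base
            if abs(d) > abs(peak[axis]):
                peak[axis] = d
    axis = max(peak, key=lambda a: abs(peak[a]))
    sign = 1 if peak[axis] > 0 else -1
    return GESTURES[(axis, sign)]

window = [(0.1, 0.0, 9.8), (6.5, 0.2, 9.9), (1.0, -0.1, 9.7)]
print(classify(window))  # dominant +x deviation -> 'swing right'
```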

Human-Object Interaction Framework Using RGB-D Camera (RGB-D 카메라를 사용한 사용자-실사물 상호작용 프레임워크)

  • Baeka, Yong-Hwan;Lim, Changmin;Park, Jong-Il
    • Journal of Broadcast Engineering
    • /
    • v.21 no.1
    • /
    • pp.11-23
    • /
    • 2016
  • These days, the touch interface is the most widely used interface for communicating with digital devices. Because of its usability, touch technology is applied almost everywhere, from watches to advertising boards, and its use keeps growing. However, this technology has a critical weakness: a touch input device normally needs a contact surface with touch sensors embedded in it, so touch interaction through everyday objects like books or documents is still unavailable. In this paper, a human-object interaction framework based on an RGB-D camera is proposed to overcome this limitation. The proposed framework can deal with occluded situations, such as a hand hovering on top of an object, and with objects being moved by hand, situations in which object recognition and hand gesture recognition algorithms may otherwise fail. Our framework handles these complicated circumstances without performance loss. It computes the status of each region with a fast and robust object recognition algorithm to determine whether it is an object or a human hand, and the hand gesture recognition algorithm then controls the context of each object by gestures almost simultaneously.
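One intuition behind distinguishing a hovering hand from the object beneath it is that depth separates the two: with a downward-looking RGB-D camera, the hand sits measurably closer to the camera than the object's top surface. The sketch below illustrates that idea only; the depth layout, distances, and margins are all assumptions, not values from the paper.

```python
# Hedged sketch of depth-based hand/object separation (all constants
# are illustrative assumptions): pixels are split by depth into table
# surface, object on the table, and a hand hovering above the object.

TABLE_DEPTH = 1000   # mm from camera to the empty table (assumed)
OBJECT_HEIGHT = 30   # mm, thickness of a book on the table (assumed)
HOVER_MARGIN = 40    # hand must be this far above the object surface

def label_pixel(depth_mm):
    """Label one depth reading as table, object, hand, or ambiguous."""
    if depth_mm >= TABLE_DEPTH - 5:
        return 'table'
    if depth_mm >= TABLE_DEPTH - OBJECT_HEIGHT - 5:
        return 'object'
    if depth_mm <= TABLE_DEPTH - OBJECT_HEIGHT - HOVER_MARGIN:
        return 'hand'
    return 'ambiguous'

print(label_pixel(1002))  # at table depth      -> 'table'
print(label_pixel(972))   # top of the book     -> 'object'
print(label_pixel(850))   # well above the book -> 'hand'
```

In a real pipeline the 'hand' regions would be passed to the gesture recognizer while object recognition is suppressed there, which is the occlusion-handling behavior the abstract describes.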

Speech-Oriented Multimodal Usage Pattern Analysis for TV Guide Application Scenarios (TV 가이드 영역에서의 음성기반 멀티모달 사용 유형 분석)

  • Kim Ji-Young;Lee Kyong-Nim;Hong Ki-Hyung
    • MALSORI
    • /
    • no.58
    • /
    • pp.101-117
    • /
    • 2006
  • The development of efficient multimodal interfaces and fusion algorithms requires knowledge of usage patterns that show how people use multiple modalities. We analyzed multimodal usage patterns for TV-guide application scenarios (or tasks). In order to collect usage patterns, we implemented a multimodal usage pattern collection system with two input modalities: speech and touch-gesture. Fifty-four subjects participated in our study. Analysis of the collected usage patterns shows a positive correlation between task type and multimodal usage patterns. In addition, we analyzed the timing between speech utterances and their corresponding touch-gestures, which shows when the touch-gesture occurs relative to the duration of the speech utterance. We believe that, to develop efficient multimodal fusion algorithms for an application, a multimodal usage pattern analysis for that application, similar to our work for the TV guide application, has to be done in advance.
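The timing analysis described above can be sketched by expressing each touch-gesture onset as a fraction of its paired utterance's duration: values inside [0, 1] indicate the gesture overlapped the speech, values outside indicate sequential use. The function names and the exact classification boundaries are assumptions for illustration, not the study's definitions.

```python
# Sketch of relative speech/gesture timing (illustrative, not the
# paper's exact measure): normalize the gesture onset by the paired
# utterance's start time and duration, then bucket the result.

def relative_onset(speech_start, speech_end, gesture_time):
    """Gesture onset as a fraction of the utterance duration."""
    duration = speech_end - speech_start
    return (gesture_time - speech_start) / duration

def usage_pattern(speech_start, speech_end, gesture_time):
    r = relative_onset(speech_start, speech_end, gesture_time)
    if r < 0:
        return 'gesture-first (sequential)'
    if r <= 1:
        return 'simultaneous'
    return 'speech-first (sequential)'

# Utterance from t=2.0s to t=3.5s; a touch at t=2.6s falls inside it.
print(usage_pattern(2.0, 3.5, 2.6))  # -> 'simultaneous'
print(usage_pattern(2.0, 3.5, 4.0))  # -> 'speech-first (sequential)'
```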


Developed a golf course scorecard App that improved UI/UX based on C/S (C/S 기반의 UI/UX를 개선한 골프장 스코어카드 App 개발)

  • Jung, Chul-Jong
    • Journal of Digital Contents Society
    • /
    • v.19 no.8
    • /
    • pp.1433-1442
    • /
    • 2018
  • This study develops and improves the EZ Touch scorecard application (app) for smartphones and pads, and links it with a customer management system (C/S). The research addressed the following questions. First, how should the EZ Touch input method be handled on a scorecard? Second, how should the platform of the customer (member) management system (C/S) and the data server system be configured? Third, does the EZ Touch app work organically with the customer management system (C/S)? With the developed EZ Touch, scores are entered into the scorecard through finger-gesture input, and, linked with the C/S system, the app provides a review function, hole information, field coaching based on scores, information management, and differentiated statistics. However, some problems and room for improvement in user convenience remain in real-time use, which need to be addressed in future research. The purpose of this study is to improve the technical competitiveness of the product by developing this application.

Analyzing Input Patterns of Smartphone Applications in Touch Interfaces

  • Bahn, Hyokyung;Kim, Jisun
    • International journal of advanced smart convergence
    • /
    • v.10 no.4
    • /
    • pp.30-37
    • /
    • 2021
  • The touch sensor interface has become the most useful input device on a smartphone. Unlike the keypad/keyboard interfaces used in electronic dictionaries and feature phones, a smartphone's touch interface allows the recognition of various gestures that represent distinct features of each application's input. In this paper, we analyze application-specific input patterns that appear in smartphone touch interfaces. Specifically, we capture touch input patterns from various Android applications and analyze them. Based on this analysis, we observe certain unique characteristics of each application's touch input patterns. This can be utilized in useful areas such as user authentication, preventing application execution by illegal users, or digital forensics based on logged touch patterns.

Analysis of Users' Gestures by Application in Smartphone Touch Interfaces (스마트폰 터치 인터페이스에서 애플리케이션별 사용자 제스처의 분석)

  • Kim, Jisun;Bahn, Hyokyung
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.15 no.2
    • /
    • pp.9-14
    • /
    • 2015
  • The touch interface is widely used in smartphones instead of the keyboard or keypad interfaces adopted in PCs and feature phones, respectively. A touch interface can recognize a variety of gestures that clearly represent the distinct features of each application's input. This paper analyzes users' gestures for each application, as captured by the touch interface of a smartphone. Specifically, we extract touch input traces from various application categories such as games, web browsers, YouTube, image and e-book viewers, video players, cameras, and map applications, and then analyze them. Through this analysis, we observed certain unique characteristics of each application's touch input, which can be utilized in useful areas such as identifying an application's user, preventing an application from being run by an illegal user, or designing a new interface convenient for a specific user.
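The kind of per-application signature such studies observe can be illustrated with simple statistics over logged touch traces: tap-heavy applications (e-book page turns) produce short strokes, drag-heavy ones (maps) produce long ones. The event format and feature choices below are assumptions for the example, not the paper's feature set.

```python
# Illustrative feature extraction from logged touch traces (trace
# format is an assumption): each stroke is a list of (x, y) points;
# tap ratio and mean stroke length separate tap-heavy from drag-heavy
# applications.
import math

def stroke_length(stroke):
    """Total path length of one stroke (0 for a single-point tap)."""
    return sum(math.dist(a, b) for a, b in zip(stroke, stroke[1:]))

def trace_features(strokes, tap_threshold=10.0):
    """strokes: list of strokes, each a list of (x, y) touch points."""
    lengths = [stroke_length(s) for s in strokes]
    taps = sum(1 for length in lengths if length < tap_threshold)
    return {
        'tap_ratio': taps / len(strokes),
        'mean_length': sum(lengths) / len(strokes),
    }

# Synthetic traces: short taps (e-book style) vs long drags (map style).
ebook = [[(100, 200)], [(105, 210)], [(300, 400), (302, 401)]]
maps_ = [[(0, 0), (60, 0), (120, 0)], [(10, 10), (10, 130)]]
print(trace_features(ebook)['tap_ratio'] > trace_features(maps_)['tap_ratio'])
```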

Implementation of non-Wearable Air-Finger Mouse by Infrared Diffused Illumination (적외선 확산 투광에 의한 비장착형 공간 손가락 마우스 구현)

  • Lee, Woo-Beom
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.15 no.2
    • /
    • pp.167-173
    • /
    • 2015
  • Extraction of finger-end points is one of the most important processes for issuing multiple user commands in hand-gesture interface technology. However, most previous works use geometric and morphological methods to extract finger-end points. This paper therefore proposes a method of extracting the user's finger-end points that is motivated by the infrared diffused illumination used for user commands in multi-touch display devices. The proposed air-mouse operates on the number and moving direction of the extracted finger-end points. Our system also includes basic mouse events, as well as a continuous command function for extending user multi-gestures. To evaluate the performance of the proposed method, we applied it to a web-browser application as a command device. As a result, the proposed method showed an average 90% success rate for the various user commands.