• Title/Summary/Keyword: Touch gestures


A Unit Touch Gesture Model of Performance Time Prediction for Mobile Devices

  • Kim, Damee;Myung, Rohae
    • Journal of the Ergonomics Society of Korea / v.35 no.4 / pp.277-291 / 2016
  • Objective: The aim of this study is to propose a unit touch gesture model for predicting performance time on mobile devices. Background: When estimating usability through Model-based Evaluation (MBE) of interfaces, the GOMS model uses 'operators' to predict execution time in the desktop environment. This study applied the operator concept from GOMS to touch gestures: since touch gestures are composed of unit touch gestures, these units can be used to predict performance time on mobile devices. Method: To extract unit touch gestures, subjects' manual movements were recorded at 120 fps with pixel coordinates. Touch gestures were classified by 'out of range', 'registration', 'continuation' and 'termination' of gesture. Results: Six unit touch gestures were extracted: Hold down (H), Release (R), Slip (S), Curved-stroke (Cs), Path-stroke (Ps) and Out of range (Or). The movement time predicted by the unit touch gesture model was not significantly different from the participants' measured execution time, and the six unit touch gestures can also predict the movement time of undefined touch gestures such as user-defined gestures. Conclusion: Touch gestures can be subdivided into six unit touch gestures, which explain almost all current touch gestures including user-defined ones; the model therefore has high predictive power and can be used to predict the performance time of touch gestures. Application: The unit touch gestures can simply be added up to predict the performance time of a new gesture without measuring it.
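The additive prediction described in the Application section can be sketched as a KLM-style model. Note that the per-unit times below are hypothetical placeholders for illustration, not the values measured in the study.

```python
# Sketch of a KLM-style additive predictor over the paper's six unit
# touch gestures. The per-unit times are HYPOTHETICAL placeholders.

# Hypothetical mean execution time (seconds) per unit touch gesture
UNIT_TIMES = {
    "H": 0.10,   # Hold down
    "R": 0.08,   # Release
    "S": 0.12,   # Slip
    "Cs": 0.35,  # Curved-stroke
    "Ps": 0.50,  # Path-stroke
    "Or": 0.20,  # Out of range
}

def predict_time(gesture):
    """Predict performance time by summing the times of the unit
    touch gestures that compose the gesture."""
    return sum(UNIT_TIMES[u] for u in gesture)

# A drag gesture might decompose as hold down -> slip -> release
print(round(predict_time(["H", "S", "R"]), 2))  # 0.3
```

A new gesture is priced by decomposing it into units and summing, which is exactly why no new measurement is needed.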

Towards Establishing a Touchless Gesture Dictionary based on User Participatory Design

  • Song, Hae-Won;Kim, Huhn
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.515-523 / 2012
  • Objective: The aim of this study is to investigate users' intuitive stereotypes of non-touch gestures and establish a gesture dictionary that can be applied to gesture-based interaction designs. Background: Recently, interaction based on non-touch gestures has emerged as an alternative for natural interaction between humans and systems. However, for non-touch gestures to become a universal interaction method, studies on which gestures are intuitive and effective are a prerequisite. Method: As applicable domains for non-touch gestures, four devices (TV, audio, computer, car navigation) and sixteen basic operations (power on/off, previous/next page, volume up/down, list up/down, zoom in/out, play, cancel, delete, search, mute, save) were drawn from a focus group interview and a survey. Then, a user participatory design was performed: participants designed three gestures for each operation on each device and evaluated the intuitiveness, memorability, convenience, and satisfaction of the gestures they derived. Through the participatory design, agreement scores, frequencies, and planning times of each distinct gesture were measured. Results: The derived gestures did not differ across the four devices, but diverse yet common gestures emerged across kinds of operations. In particular, manipulative gestures were suitable for all kinds of operations, whereas semantic or descriptive gestures were appropriate only for one-shot operations like power on/off, play, cancel, or search. Conclusion: A touchless gesture dictionary was established by mapping intuitive and valuable gestures onto each operation. Application: The dictionary can be applied to interaction designs based on non-touch gestures, and it can serve as a basic reference for standardizing non-touch gestures.

Implementation of new gestures on the Multi-touch table

  • Park, Sang Bong;Kim, Beom jin
    • International Journal of Advanced Culture Technology / v.1 no.1 / pp.15-18 / 2013
  • This paper describes new gestures on a multi-touch table. Two new three-finger gestures are used to minimize all open windows and to toggle Aero mode. We also implement an FTIR (Frustrated Total Internal Reflection) multi-touch table consisting of a sheet of acrylic, infrared LEDs, a camera and a rear projector. The operation of the proposed gestures is verified on the implemented multi-touch table.


Direction of Touch Gestures and Perception of Inner Scroll in Smartphone UI (스마트폰 UI에서 터치 제스처의 방향성과 이너 스크롤의 인지)

  • Lee, Young-Ju
    • Journal of Digital Convergence / v.19 no.2 / pp.409-414 / 2021
  • In this paper, we investigated touch gestures for scroll direction in the narrow, elongated UIs that follow from the characteristics of now-ubiquitous smartphone devices. Touch gestures are performed and directed by triggers such as metaphors and affordances grounded in past experience. Different types of touch gestures are used depending on whether they serve navigation, motion, or transformation, but scrolling is the most frequently used among them. Scrolling is generally vertical, but recently design patterns that also scroll left and right inside the page have caused cognitive dissonance in users. With an inner scroll that scrolls left and right while partially covering the right-hand content, mixing in non-scrollable design patterns becomes a factor that demands extra attention from the user. We found that the use of triggers and consistent design patterns can enhance the user experience even in the inner scroll environment.

SATS: Structure-Aware Touch-Based Scrolling

  • Kim, Dohyung;Gweon, Gahgene;Lee, Geehyuk
    • ETRI Journal / v.38 no.6 / pp.1104-1113 / 2016
  • Non-linear document navigation refers to the process of repeatedly reading a document at different levels to provide an overview, including selective reading to search for useful information within a document under time constraints. Currently, this function is not supported well by small-screen tablets. In this study, we propose the concept of structure-aware touch-based scrolling (SATS), which allows structural document navigation using region-dependent touch gestures for non-sequential navigation within tablets or tablet-sized e-book readers. In SATS, the screen is divided into four vertical sections representing the different structural levels of a document, where dragging into the different sections allows navigating from the macro to micro levels. The implementation of a prototype is presented, as well as details of a comparative evaluation using typical non-sequential navigation tasks performed under time constraints. The results showed that SATS obtained better performance, higher user satisfaction, and a lower usability workload compared with a conventional structural overview interface.
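The region-dependent mapping described above can be sketched in a few lines. The four-way split and the level names below are assumptions inferred from the abstract, not details of the paper's actual implementation.

```python
# Hypothetical sketch of SATS-style region-dependent scrolling: the
# screen is split into four vertical sections, and a drag in each
# section navigates a different structural level of the document,
# from macro (left) to micro (right). Names/boundaries are assumed.

LEVELS = ["chapter", "section", "paragraph", "line"]

def level_for_touch(x, screen_width):
    """Map the horizontal touch position to a structural level:
    leftmost section -> coarsest level, rightmost -> finest."""
    section = min(int(x / (screen_width / 4)), 3)
    return LEVELS[section]

print(level_for_touch(50, 800))   # chapter
print(level_for_touch(790, 800))  # line
```

Dragging within a section would then scroll by units of that section's level, which is what lets users move between macro and micro views without a separate overview screen.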

English Input Keypad Method Using Picker-Based Interface

  • Kwon, Soon-Kak;Kim, Heung-Jun
    • Journal of Korea Multimedia Society / v.18 no.11 / pp.1383-1390 / 2015
  • With the development of mobile devices, touch screens provide a variety of character input methods and flexible user interfaces. Currently, the physically simple touch method is widely used for English input, but simple touches do not increase the variety of inputs or the flexibility of the user interface. In this paper, we propose a new method to input English characters continuously by recognizing gestures instead of simple touches. The proposed method places rotational pickers on the screen that change the alphabetical sequence instead of keys, and inputs English characters through flick gestures and touches. Simulation results show that the proposed keypad method outperforms conventional keypad methods.
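The flick-rotated picker idea can be sketched as follows. The step size, wrapping behavior, and API are assumptions for illustration, not the paper's design.

```python
# Hypothetical sketch of a picker-based character input: a flick
# rotates the picker through the alphabet, and a touch commits the
# currently selected letter. Wrapping and step size are assumed.
import string

class Picker:
    def __init__(self):
        self.letters = string.ascii_lowercase
        self.index = 0

    def flick(self, steps):
        """Rotate the picker by `steps` positions (wraps around)."""
        self.index = (self.index + steps) % len(self.letters)

    def touch(self):
        """Commit and return the currently selected letter."""
        return self.letters[self.index]

p = Picker()
p.flick(2)
print(p.touch())  # c
```

Because a flick both navigates the sequence and leads directly into the committing touch, continuous input needs no dedicated key per character.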

The Design of Efficient User Environment on the FTIR Multi Touch (FTIR 멀티터치 테이블에서 효율적인 사용자 환경 개발)

  • Park, Sang-Bong;Ahn, Jung-Seo
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.12 no.2 / pp.85-94 / 2012
  • In this paper, we develop new finger gestures for screen control on an FTIR multi-touch table. We also describe the recognition of mobile devices placed on the table using an infrared camera. The existing Bluetooth connection was inconvenient on the FTIR multi-touch table because the table does not provide an HID (Human Input Device) interface. The proposed data transmission method using mobile devices relieves this inconvenience and makes data transfer more effective. The new gestures and the data transmission method are verified on an FTIR multi-touch table implemented for testing.

Interacting with Touchless Gestures: Taxonomy and Requirements

  • Kim, Huhn
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.475-481 / 2012
  • Objective: The aim of this study is to build a taxonomy for classifying diverse touchless gestures and establish the design requirements that should be considered when determining suitable gestures in gesture-based interaction design. Background: Recently, the applicability of touchless gestures is increasing as the relevant technologies advance. However, before touchless gestures are widely applied to various devices or systems, an understanding of the nature of human gestures and their standardization is a prerequisite. Method: Diverse gesture types were collected from the literature and, based on those, a new taxonomy for classifying touchless gestures was proposed; many gesture-based interaction design cases and studies were then analyzed. Results: The proposed taxonomy consists of two dimensions: shape (deictic, manipulative, semantic, or descriptive) and motion (static or dynamic). The case analysis based on the taxonomy showed that manipulative and dynamic gestures are widely applied. Conclusion: Four core requirements for valuable touchless gestures are intuitiveness, learnability, convenience and discriminability. Application: The gesture taxonomy can be applied to generate alternatives for applicable touchless gestures, and the four design requirements can be used as criteria for evaluating those alternatives.
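The two-dimensional taxonomy (shape x motion) maps naturally onto a small data model. The example gestures classified below are illustrative assumptions, not cases analyzed in the paper.

```python
# Sketch of the touchless gesture taxonomy from the abstract:
# two dimensions, shape and motion. Example classifications are
# HYPOTHETICAL, chosen only to show how the taxonomy is used.
from dataclasses import dataclass
from enum import Enum

class Shape(Enum):
    DEICTIC = "deictic"
    MANIPULATIVE = "manipulative"
    SEMANTIC = "semantic"
    DESCRIPTIVE = "descriptive"

class Motion(Enum):
    STATIC = "static"
    DYNAMIC = "dynamic"

@dataclass
class Gesture:
    name: str
    shape: Shape
    motion: Motion

# Hypothetical examples placed in the taxonomy
swipe = Gesture("swipe to next page", Shape.MANIPULATIVE, Motion.DYNAMIC)
point = Gesture("point at an icon", Shape.DEICTIC, Motion.STATIC)
print(swipe.shape.value, swipe.motion.value)  # manipulative dynamic
```

Enumerating the 4 x 2 cells of the grid is one way to generate candidate gestures for an operation before scoring them against the four requirements.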

Design of Multi-Finger Flick Interface for Fast File Management on Capacitive-Touch-Sensor Device (정전기식 입력 장치에서의 빠른 파일 관리를 위한 다중 손가락 튕김 인터페이스 설계)

  • Park, Se-Hyun;Park, Tae-Jin;Choy, Yoon-Chul
    • Journal of Korea Multimedia Society / v.13 no.8 / pp.1235-1244 / 2010
  • Most emerging smartphones support capacitive touch sensors, which renders existing gesture-based interfaces unsuitable, since they were developed for resistive touch sensors and pen-based input. Unlike the flick gestures of existing gesture interfaces, the finger flick gesture used in this paper cuts the workload roughly in half by selecting both the target and the command to perform on it in a single touch input. Combined with a multi-touch interface, it supports various menu commands without requiring users to learn complex gestures, and it minimizes input error, making it well suited to touch-based devices. This research designs and implements a multi-touch flick interface to provide an effective file management system on smartphones with capacitive touch input. The evaluation shows that the suggested interface is superior to existing methods on capacitive touch input devices.

Dynamic Association and Natural Interaction for Multi-Displays Using Smart Devices (다수의 스마트 디바이스를 활용한 멀티 디스플레이 동적 생성 및 인터랙션)

  • Kim, Minseok;Lee, Jae Yeol
    • Korean Journal of Computational Design and Engineering / v.20 no.4 / pp.337-347 / 2015
  • This paper presents a dynamic association and natural interaction method for multi-displays composed of smart devices. Users can intuitively associate smart devices with shake gestures, flexibly modify the display layout with tilt gestures, and naturally interact with the multi-display through multi-touch interactions. First, users shake their smart devices to create and bind a group for a multi-display with a matrix configuration in an ad-hoc, collaborative situation. After the group is created, the display layout can be flexibly changed by tilt gestures that move the tilted device to the nearest vacant cell in the matrix configuration; during a tilt gesture, the system automatically updates the relation, view, and configuration of the multi-display. Finally, users can interact with the multi-display through multi-touch interactions just as they would with a single large display. Furthermore, depending on the context or role, a synchronous or asynchronous mode is available for providing a split view or a different UI. We show the effectiveness and advantages of the proposed approach by demonstrating implementation results and evaluating the method in a usability study.
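The tilt-gesture layout update ("move the tilted device to the nearest vacant cell") can be sketched as a small grid operation. The grid representation and the Manhattan-distance metric are assumptions, not details from the paper.

```python
# Hypothetical sketch of the tilt-gesture layout update: a tilted
# device moves to the nearest vacant cell of the multi-display
# matrix. Distance metric and grid encoding are assumptions.

def nearest_vacant_cell(grid, row, col):
    """Return the vacant cell (None entry) closest to (row, col)
    by Manhattan distance."""
    vacant = [(r, c) for r in range(len(grid))
              for c in range(len(grid[0])) if grid[r][c] is None]
    return min(vacant, key=lambda rc: abs(rc[0] - row) + abs(rc[1] - col))

def tilt_move(grid, row, col):
    """Move the device at (row, col) into the nearest vacant cell
    and return that cell's coordinates."""
    r, c = nearest_vacant_cell(grid, row, col)
    grid[r][c], grid[row][col] = grid[row][col], None
    return r, c

# 2x2 multi-display with one vacant cell
grid = [["A", "B"], ["C", None]]
print(tilt_move(grid, 0, 1))  # (1, 1)
```

After such a move, the system would recompute each device's view of the shared content from its new cell, which matches the abstract's automatic update of relation, view, and configuration.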