• Title/Summary/Keyword: Touch interaction


A Study on Touch Interaction Styles of Mobile phone (휴대폰의 터치 인터랙션 유형에 관한 연구 (시스템 모델 중심으로))

  • Jo, Han-Kyung; Pan, Young-Hwan; Jeung, Ji-Hong
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.971-975 / 2009
  • In step with the recent market trend of diverse touch-screen mobile phones being released, research on touch interaction types has also been actively conducted. However, existing studies have focused only on the user model of touch input interaction types. A lack of understanding of the system model, and the conflicts arising from the gap between the user model and the system model of touch interaction, lead to incorrect interaction and can confuse users. For this reason, this study focuses on the system model of touch input interaction types. In defining the system model, the touch interaction types were defined along a time axis. Based on the defined system model, six basic input interactions were compared with one another to derive their correlations, and expert interviews were conducted to validate the results. The resulting definition of the system model of touch interaction types and their correlations can be used to develop optimized touch types for various touch-based interactions. It is also expected to help the designers, developers, and planners involved in development understand system functions, to serve as a means of smooth communication among them, and to be used effectively in communication with customers.
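
The abstract does not give the concrete time-axis definition used in the paper, but a minimal sketch of how touch input types can be distinguished along a time axis may help; the thresholds and type names below are illustrative assumptions, not the paper's actual system model.

```python
from dataclasses import dataclass

# Illustrative thresholds (assumptions, not values from the paper).
LONG_PRESS_MS = 500      # a press shorter than this counts as a tap
MOVE_THRESHOLD_PX = 10   # movement beyond this counts as a drag or flick
FLICK_SPEED_PX_MS = 0.5  # release speed above this counts as a flick

@dataclass
class TouchTrace:
    duration_ms: float    # time between touch-down and touch-up
    distance_px: float    # total finger displacement
    release_speed: float  # speed at touch-up, in px/ms

def classify(trace: TouchTrace) -> str:
    """Classify a single-finger touch trace along the time axis."""
    if trace.distance_px < MOVE_THRESHOLD_PX:
        return "tap" if trace.duration_ms < LONG_PRESS_MS else "long press"
    return "flick" if trace.release_speed > FLICK_SPEED_PX_MS else "drag"

print(classify(TouchTrace(duration_ms=120, distance_px=3, release_speed=0.0)))  # tap
```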

General-Purpose Multi-Touch Interaction System for Multi-I/O Content Control (다중 입출력 컨텐츠 제어를 위한 범용 멀티 터치 인터렉션 시스템)

  • Bae, Ki-Tae; Kwon, Doo-Young
    • Journal of the Korea Academia-Industrial cooperation Society / v.12 no.4 / pp.1933-1939 / 2011
  • Early makers of musical instruments already used touch devices for sound control. Since the first multi-touch system appeared in 1982, system performance has improved rapidly through extensive research. In spite of this improvement, popularization of the multi-touch interface appeared difficult. In 2007, however, multi-touch interfaces became popular with the Apple iPhone, and people could easily experience multi-touch interfaces on smartphones. In this paper, we propose a general-purpose multi-touch interaction system to support multi-touch content producers and to invigorate the multi-touch interface market. Real field tests show that the proposed method has benefits in price and performance compared with other techniques.

A Study on User Behavior of Input Method for Touch Screen Mobile Phone (터치스크린 휴대폰 입력 방식에 따른 사용자 행태에 관한 연구)

  • Jun, Hye-Sun; Choi, Woo-Sik; Pan, Young-Hwan
    • Proceedings of the HCI Society of Korea Conference / 2008.02b / pp.173-178 / 2008
  • Due to a rapid increase in demand for larger-screen mobile phones in recent years, many major manufacturers have been releasing touch-screen devices. In this paper, the various touch-screen input methods are summarized into six categories. By tracing each user's finger path, users' input patterns and behavior were recorded and analyzed. Based on this analysis, the considerations to be addressed before designing the UI are presented in detail.
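
As a rough illustration of the kind of finger-path tracing the abstract describes, the following sketch logs timestamped touch positions and summarizes a trial's path length and duration; the field layout and the sampled values are hypothetical, not the study's actual data.

```python
import math

# One trial's touch samples as (time in ms, x, y) tuples; values are made up.
samples = [(0, 100, 200), (16, 104, 205), (33, 115, 220), (50, 130, 240)]

def path_length(points):
    """Sum of Euclidean distances between consecutive touch samples."""
    return sum(
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(points, points[1:])
    )

duration_ms = samples[-1][0] - samples[0][0]
print(f"path length: {path_length(samples):.1f}px over {duration_ms}ms")
```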

Comparing Elder Users' Interaction Behavior to the Younger: Focusing on Tap, Move and Flick Tasks on a Mobile Touch Screen Device

  • Lim, Ji-Hyoun; Ryu, Tae-Beum
    • Journal of the Ergonomics Society of Korea / v.31 no.2 / pp.413-419 / 2012
  • Objective: This study presents an observation and analysis of the behavioral characteristics of older users, compared with younger users, when controlling a display-based interface. Background: A touch interface, which allows users to act directly on the display, is regarded as a pleasant and easy mode of human-computer interaction. Because of this advantage in stimulus-response compatibility, older users, who typically experience difficulties in interacting with computers, would be expected to have a better experience when using touch-based computing devices. Method: Twenty-nine participants over 50 years old and 14 participants in their 20s took part in this study. Three primary touch-interface tasks (tap, move, and flick) were performed by the users. For the tap task, the response time and the point of the touch response were collected, and the response bias was calculated for each trial. For the move task, the delivery time and the distance of finger movement were recorded for each trial. For the flick task, the task completion time and flick distance were recorded. Results: From the collected behavioral data, temporal and spatial differences between young and old users' behavior were analyzed. The older users showed difficulty in completing the move task, which requires eye-hand coordination.
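
The abstract does not spell out how the per-trial response bias was computed; a minimal sketch, under the common assumption that the bias is the offset between the recorded touch point and the target centre, could look like this. The trial coordinates are invented for illustration.

```python
import math

def response_bias(target, touch):
    """Spatial offset of a tap: (dx, dy) components and absolute distance.

    target, touch: (x, y) pixel coordinates of the target centre and the
    recorded touch-response point for one trial.
    """
    dx = touch[0] - target[0]
    dy = touch[1] - target[1]
    return dx, dy, math.hypot(dx, dy)

# Hypothetical trial: target centred at (240, 400), touch landed at (247, 391).
dx, dy, dist = response_bias((240, 400), (247, 391))
print(f"bias: dx={dx}px, dy={dy}px, |bias|={dist:.1f}px")
```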

Validating one-handed interaction modes for supporting touch dead-zone in large screen smartphones (대화면 스마트폰의 한 손 조작 시 터치 사각영역 지원 인터랙션의 유용성)

  • Park, Minji; Kim, Huhn
    • Journal of the HCI Society of Korea / v.12 no.1 / pp.25-32 / 2017
  • The purpose of this study is to evaluate the effectiveness of one-handed interaction modes that support the dead zone, i.e., the screen region where users have difficulty performing touch manipulation with only one hand. To that end, this study analyzed the two existing one-handed modes on iPhone and Android smartphones, and proposed and implemented two additional one-handed modes. To investigate the effectiveness of the one-handed modes, we performed an experiment comparing the normal touch mode with the four one-handed modes. The results showed that all one-handed modes required more time than the normal touch mode because of the time required for both mode change and recognition. However, participants had difficulty performing continuous touches in the dead-zone area with the normal touch mode alone. Moreover, subjective satisfaction was high for the one-handed modes thanks to touch convenience and smooth transition effects on mode change. In particular, the iPhone's one-handed mode was the most effective of the tested modes.

A Unit Touch Gesture Model of Performance Time Prediction for Mobile Devices

  • Kim, Damee; Myung, Rohae
    • Journal of the Ergonomics Society of Korea / v.35 no.4 / pp.277-291 / 2016
  • Objective: The aim of this study is to propose a unit touch gesture model that is useful for predicting performance time on mobile devices. Background: When estimating usability through Model-based Evaluation (MBE) of interfaces, the GOMS model measures 'operators' to predict execution time in the desktop environment. This study therefore applies the operator concept of GOMS to touch gestures. Since touch gestures are composed of unit touch gestures, these unit gestures can be used to predict performance time on mobile devices. Method: To extract the unit touch gestures, subjects' manual movements were recorded at 120fps with pixel coordinates. Touch gestures were classified by 'out of range', 'registration', 'continuation', and 'termination' of the gesture. Results: Six unit touch gestures were extracted: Hold down (H), Release (R), Slip (S), Curved-stroke (Cs), Path-stroke (Ps), and Out of range (Or). The movement time predicted by the unit touch gesture model is not significantly different from the participants' execution time. The six measured unit touch gestures can also predict the movement time of previously undefined touch gestures, such as user-defined gestures. Conclusion: Touch gestures can be subdivided into six unit touch gestures, which can describe almost all current touch gestures, including user-defined gestures, so the model has high predictive power and can be utilized to predict the performance time of touch gestures. Application: The unit touch gestures can simply be added up to predict the performance time without measuring the performance time of a new gesture.
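
A minimal sketch of the additive idea described in the Application section follows; the per-unit times are entirely hypothetical placeholders, since the paper reports its own measured values for H, R, S, Cs, Ps, and Or.

```python
# Hypothetical per-unit execution times in milliseconds; the paper's measured
# values would be substituted here.
UNIT_TIME_MS = {
    "H": 120,   # Hold down
    "R": 80,    # Release
    "S": 150,   # Slip
    "Cs": 350,  # Curved-stroke
    "Ps": 400,  # Path-stroke
    "Or": 200,  # Out of range
}

def predict_time(gesture_units):
    """Predict a gesture's performance time as the sum of its unit times."""
    return sum(UNIT_TIME_MS[u] for u in gesture_units)

# e.g. a drag-like gesture decomposed as Hold down -> Slip -> Release
print(predict_time(["H", "S", "R"]), "ms")
```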

Two camera based touch screen system for human computer interaction (인간과 컴퓨터 상호 작용을 위한 2개의 카메라 기반의 터치 스크린 시스템)

  • Kim, Jin-Kuk; Min, Kyung-Won; Ko, Han-Seok
    • Proceedings of the IEEK Conference / 2006.06a / pp.319-320 / 2006
  • In this paper, we propose a vision-based system employing two cameras to provide an effective touch-screen function. Two main processes, determining touch (or no touch) and finding the contact location on the screen plane, are essential for enabling the touch-screen function. First, the region of interest is found using color characteristics and a histogram to determine the contact mode. Second, if the hand touches the mirror, the fingertip point in the image is found using the correlation coefficient based on the mirror attribute. Subsequently, the fingertip coordinate in the image is transformed to the location on the mirror plane using four predefined points (termed the four-point method) and a bilinear transform. Representative experimental results show that the proposed system is well suited to touch-screen use.
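
The abstract names a four-point bilinear transform from image coordinates to the mirror plane; one common formulation fits X = a0 + a1·x + a2·y + a3·x·y (and likewise for Y) from the four correspondences. The sketch below follows that formulation; the calibration points are invented for illustration and are not the paper's data.

```python
import numpy as np

def fit_bilinear(img_pts, plane_pts):
    """Fit X = a0 + a1*x + a2*y + a3*x*y (and likewise Y) from 4 point pairs."""
    A = np.array([[1, x, y, x * y] for x, y in img_pts], dtype=float)
    bx = np.array([X for X, _ in plane_pts], dtype=float)
    by = np.array([Y for _, Y in plane_pts], dtype=float)
    return np.linalg.solve(A, bx), np.linalg.solve(A, by)

def map_point(coeffs, pt):
    """Map an image pixel to the mirror-plane location with the fitted model."""
    ax, ay = coeffs
    x, y = pt
    basis = np.array([1, x, y, x * y], dtype=float)
    return float(basis @ ax), float(basis @ ay)

# Hypothetical calibration: four image points and their mirror-plane locations.
img_pts = [(52, 40), (590, 35), (598, 430), (60, 442)]
plane_pts = [(0, 0), (400, 0), (400, 300), (0, 300)]
coeffs = fit_bilinear(img_pts, plane_pts)
print(map_point(coeffs, (325, 240)))  # fingertip pixel -> mirror-plane location
```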

Interactive Digital Storytelling Based on Interests (흥미도를 반영한 인터렉티브 디지털 스토리텔링)

  • Kim, Yang-Wook; Kim, Jong-Hun; Park, Jun
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.508-511 / 2009
  • In interactive storytelling, the storyline develops according to the user's interaction. Unlike linear, fixed storytelling, users may select an event or make decisions that affect the plot, so the user's feeling of immersion and interest may be greatly enhanced. In this paper, we used markers and a multi-touch pad for user interaction in interactive storytelling. Users could express their level of interest and provide feedback through the markers and the multi-touch pad, and the storyline developed differently as a result.

Manipulation of the Windows Interface Based on Haptic Feedback (촉각 기반 윈도우 인터페이스)

  • Lee, Jun-Young; Kyung, Ki-Uk; Park, Jun-Seok
    • Proceedings of the HCI Society of Korea Conference / 2008.02a / pp.366-371 / 2008
  • In this paper, we suggest a haptic interface and a framework for interacting with a haptic-feedback-based Windows graphical user interface (GUI) on a computing device with a touch screen. The events that occur while a user interacts with Windows interfaces through the touch screen are filtered by the Windows Interface Message Filter (WIMF) and converted into appropriate haptic feedback information by the Haptic Information Provider (HIP). The haptic information is conveyed to users through a stylus-like haptic interface interacting with the touch screen. Major Windows interaction schemes, including button click, menu selection/pop-up, window selection/movement, icon selection/drag & drop, and scroll, have been implemented, and user tests show improved usability since the haptic feedback supports intuition and precise manipulation.
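
The abstract describes the WIMF/HIP pipeline only at a high level; as an illustration, a toy event-to-haptic-pattern mapping might be organized as below. The event names, pattern parameters, and function names are assumptions for the sketch, not the authors' API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HapticPattern:
    frequency_hz: int   # vibration frequency sent to the stylus actuator
    duration_ms: int    # how long the pattern plays

# Hypothetical mapping from filtered Windows GUI events to haptic patterns.
EVENT_TO_HAPTIC = {
    "button_click":   HapticPattern(frequency_hz=250, duration_ms=30),
    "menu_popup":     HapticPattern(frequency_hz=150, duration_ms=60),
    "window_move":    HapticPattern(frequency_hz=80,  duration_ms=20),
    "icon_drag_drop": HapticPattern(frequency_hz=200, duration_ms=50),
    "scroll_tick":    HapticPattern(frequency_hz=120, duration_ms=10),
}

def provide_haptic_feedback(event_name: str) -> Optional[HapticPattern]:
    """Toy stand-in for the Haptic Information Provider step."""
    return EVENT_TO_HAPTIC.get(event_name)

print(provide_haptic_feedback("button_click"))
```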

Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay; Lee, Jun-Hun; Lee, Beom-Chan; Ryu, Je-Ha
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2005.06a / pp.145-150 / 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory systems for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To realize the sense of touch, an electrotactile display produces controlled, localized touch sensations on the skin by passing small electric currents. In electrotactile stimulation, the mechanoreceptors in the skin may be stimulated individually in order to display sensations of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm, or any suitable location of the body using appropriate electrodes and waveforms. We developed an ETCS and investigated the effectiveness of the proposed system in terms of the perception of surface roughness, by stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the forearm or finger of the user, using an electrode-embedded armband, in order to investigate how subjects recognize displayed patterns and directions of stimulation.
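
As a rough illustration of the positive/negative pulse trains mentioned in the abstract, the following sketch builds a biphasic rectangular pulse train as a sample array. The pulse width, rate, and current values are invented for illustration only and say nothing about the paper's actual or safe stimulation parameters.

```python
import numpy as np

def biphasic_pulse_train(rate_hz=100, pulse_width_us=200, amplitude_ma=1.0,
                         duration_s=0.05, sample_rate_hz=1_000_000):
    """Return a current waveform: a positive pulse immediately followed by a
    negative (charge-balancing) pulse, repeated at rate_hz."""
    n = int(duration_s * sample_rate_hz)
    wave = np.zeros(n)
    period = int(sample_rate_hz / rate_hz)
    width = int(pulse_width_us * 1e-6 * sample_rate_hz)
    for start in range(0, n - 2 * width, period):
        wave[start:start + width] = amplitude_ma               # positive phase
        wave[start + width:start + 2 * width] = -amplitude_ma  # negative phase
    return wave

wave = biphasic_pulse_train()
print(f"{len(wave)} samples, peak {wave.max():.1f}mA, min {wave.min():.1f}mA")
```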
