• Title/Summary/Keyword: Touch user interface

Improving Eye-gaze Mouse System Using Mouth Open Detection and Pop Up Menu (입 벌림 인식과 팝업 메뉴를 이용한 시선추적 마우스 시스템 성능 개선)

  • Byeon, Ju Yeong;Jung, Keechul
    • Journal of Korea Multimedia Society / v.23 no.12 / pp.1454-1463 / 2020
  • An important factor in an eye-tracking PC interface for paralyzed patients is the implementation of a mouse interface for manipulating the GUI. With a successfully implemented mouse interface, users can generate mouse events exactly at the point of their choosing. However, it is difficult to define this interaction in an eye-tracking interface. This problem, known as the Midas touch problem, has been a major focus of eye-tracking research. There have been many attempts to solve it using blinking, voice input, etc., but these are not suitable for all paralyzed patients, because some cannot wink or speak. In this paper, we propose a mouth-pop-up eye-tracking mouse interface that solves the Midas touch problem and suits such patients, using a common RGB camera. The interface detects the opening and closing of the mouth to activate a pop-up menu from which the user can select a mouse event. After implementation, a performance experiment was conducted; both the number of malfunctions and the time to perform tasks were reduced compared to the existing method.
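The mouth-open trigger described in the abstract can be sketched as an aspect-ratio test over facial landmarks. This is a minimal illustration, not the paper's implementation: the landmark source, the threshold value, and the `MouthTrigger` helper are all assumptions.

```python
def mouth_aspect_ratio(top, bottom, left, right):
    """Ratio of vertical mouth opening to mouth width.

    Each argument is an (x, y) facial-landmark coordinate; a ratio
    above a tuned threshold is treated as an open mouth.
    """
    vertical = ((top[0] - bottom[0]) ** 2 + (top[1] - bottom[1]) ** 2) ** 0.5
    horizontal = ((left[0] - right[0]) ** 2 + (left[1] - right[1]) ** 2) ** 0.5
    return vertical / horizontal


class MouthTrigger:
    """Fires once per mouth opening (closed -> open transition),
    i.e. the moment the pop-up menu should appear."""

    def __init__(self, threshold=0.5):  # threshold is an assumed value
        self.threshold = threshold
        self.was_open = False

    def step(self, mar):
        is_open = mar > self.threshold
        fired = is_open and not self.was_open  # rising edge only
        self.was_open = is_open
        return fired
```

Triggering only on the closed-to-open transition prevents a held-open mouth from re-opening the menu every frame.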

Interaction Technique in Smoke Simulations using Mouth-Wind on Mobile Devices (모바일 디바이스에서 사용자의 입 바람을 이용한 연기 시뮬레이션의 상호작용 방법)

  • Kim, Jong-Hyun
    • Journal of the Korea Computer Graphics Society / v.24 no.4 / pp.21-27 / 2018
  • In this paper, we propose a real-time interaction method using the user's mouth wind on a mobile device. In mobile and virtual reality, user interaction technology is important, but the variety of user interface methods is still lacking: most interaction technologies are screen touch or motion recognition. In this study, we propose an interface technology that enables real-time interaction using the user's mouth wind. The direction of the wind is determined from the angle and position between the user and the mobile device, and the size of the wind is calculated from the magnitude of the user's blowing. To show the superiority of the proposed technique, we visualize the flow of the vector field in real time by integrating the mouth-wind interface into the Navier-Stokes equations. We present the results on mobile devices, but the method can also be applied in augmented reality (AR) and virtual reality (VR) fields requiring interface technology.
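The wind-direction step described above might look like the following simplified 2-D sketch. The shared coordinate frame and the sensor inputs are assumptions; the paper derives the angle and position between the user and the device itself.

```python
import math

def mouth_wind_vector(mouth_pos, device_pos, strength):
    """Unit direction from the user's mouth toward the device,
    scaled by the estimated blow strength.

    mouth_pos / device_pos are (x, y) positions in a shared frame
    (an assumption made for this illustration).
    """
    dx = device_pos[0] - mouth_pos[0]
    dy = device_pos[1] - mouth_pos[1]
    dist = math.hypot(dx, dy) or 1.0  # guard against zero distance
    return (strength * dx / dist, strength * dy / dist)
```

In a full pipeline, the returned vector would be injected as an external force term into the velocity field of the Navier-Stokes solver.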

GripLaunch: a Novel Sensor-Based Mobile User Interface with Touch Sensing Housing

  • Chang, Wook;Park, Joon-Ah;Lee, Hyun-Jeong;Cho, Joon-Kee;Soh, Byung-Seok;Shim, Jung-Hyun;Yang, Gyung-Hye;Cho, Sung-Jung
    • International Journal of Fuzzy Logic and Intelligent Systems / v.6 no.4 / pp.304-313 / 2006
  • This paper describes a novel way of applying capacitive sensing technology to a mobile user interface. The key idea is to use the grip pattern, which is naturally produced when a user handles the mobile device, as a clue to determine the application to be launched. To this end, a capacitive touch sensing system is carefully designed and installed underneath the housing of the mobile device to capture the user's grip pattern. The captured data is then recognized by dedicated recognition algorithms. The feasibility of the proposed user interface system is thoroughly evaluated with various recognition tests.
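Grip-pattern recognition of this kind can be sketched as a nearest-neighbour match over per-electrode capacitance readings. This is only an illustration of the idea; the paper uses dedicated recognition algorithms, and the application names below are invented for the example.

```python
def classify_grip(sample, templates):
    """Return the application whose reference grip pattern is closest
    (squared Euclidean distance) to the sampled capacitance vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda app: dist2(sample, templates[app]))
```

Each template would be learned from recordings of how the user actually grips the device when launching that application.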

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS) / v.7 no.4 / pp.834-848 / 2013
  • To effectively use a smartphone's functions, many kinds of human-phone interfaces are used, such as touch, voice, and gesture. However, the most important of these, the touch interface, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of frontal cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. Our proposed method has the following novelties. Firstly, appropriate camera specifications and image resolution are analyzed for a smartphone-based gaze tracking method. Secondly, facial movement is allowed as long as one eye region is included in the image. Thirdly, the method operates in both landscape and portrait screen modes. Fourthly, only two LED reflective positions are used to calculate the gaze position, on the basis of the 2D geometric relation between the reflective rectangle and the screen. Fifthly, a prototype mock-up module is made to confirm feasibility for an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800, and the average hit ratio on a 5×4 icon grid was 94.6%.
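The fourth point, mapping gaze from two LED reflections to a screen position, can be illustrated with a simplified linear mapping. This is an assumption-laden sketch: the paper reconstructs a reflective rectangle from the two glints and the screen geometry, while here the pupil is simply expressed as a fraction of the glint span.

```python
def gaze_to_screen(pupil, glint_a, glint_b, screen_w, screen_h):
    """Express the pupil centre as a fraction of the span between two
    corneal glints along each axis, then scale to screen pixels.

    pupil, glint_a, glint_b are (x, y) positions in the eye image;
    glint_a / glint_b correspond to two screen-corner LEDs.
    """
    u = (pupil[0] - glint_a[0]) / (glint_b[0] - glint_a[0])
    v = (pupil[1] - glint_a[1]) / (glint_b[1] - glint_a[1])
    return (u * screen_w, v * screen_h)
```

Because the mapping is relative to the glints, it tolerates moderate head movement as long as one eye region stays in the image, matching the paper's second novelty.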

Menu Layout for Touch-screen Phones Based on Various Grip Postures (다양한 파지 방법에 따른 터치스크린 폰 메뉴 레이아웃에 관한 연구)

  • Cho, Sung-Il;Park, Sung-Joon;Jung, Eui-S.;Im, Young-Jae;Choe, Jea-Ho
    • Journal of Korean Institute of Industrial Engineers / v.36 no.1 / pp.52-58 / 2010
  • Competition in the cellular phone market has reached its limit, and manufacturers have started to focus their solutions on the user interface. Design issues with controllability led the development and renovation of such products toward the touch-screen phone. Depending on readability, technical advances, portability, and controllability, user satisfaction with touch-screen phones can vary significantly. In this research, controllability was examined with regard to various grip postures, in order to improve menu layouts suited to use with the thumbs of both hands or the thumb of a single hand. Regression models were derived to suggest the locations of buttons on the screen; by redesigning the menu layout accordingly, both controllability and user satisfaction are expected to improve. This result is applicable not only to mobile phone design, but also to the design of various hand-held devices using a touch screen.

A study on user satisfaction in TUI environment (TUI 환경의 유저 사용 만족도 연구)

  • Choi, Heungyeorl;Yang, Seungyong
    • Journal of Korea Society of Digital Industry and Information Management / v.11 no.4 / pp.113-127 / 2015
  • Interfaces in the smart device environment are changing to a TUI (touch user interface) environment, where the system is controlled by physical touch, unlike systems controlled through a conventional mouse and keyboard. What matters most in this TUI environment is to implement the interface in consideration of learnability and cognitive constructivism according to the user's experience. It is therefore necessary to carry out various studies on the smart content design process that go a step further in examining the details of the user's experience factors. Hence, this study looked into what effect a user's experiential traits have on the production of contents, as a measure for improving TUI user satisfaction and effectively realizing contents in the smart environment. Results were obtained through statistical empirical analysis on the basis of a survey, including cross-tabulation analysis according to important variables and users, paired t-tests, multiple response analysis, and preference frequency analysis of user preferences. As a result, a system for implementing the DFSS (Design For Six Sigma) process was presented. The TUI experience factor can be divided into direct habitual experience, direct learning experience, indirect habitual experience, and indirect learning experience. The study found that its important variables had a positive effect on improving overall satisfaction with contents, according to the user convenience of smart contents. This study is expected to have a positive effect on efficient smart-device-based content production, by providing objective information from the empirical analysis to smart-media developers and designers and by presenting a model for improving the changed TUI usability.

A User Interface Style Guide for the Cabinet Operator Module (캐비닛운전원모듈을 위한 사용자인터페이스 스타일가이드)

  • Lee, Hyun-Chul;Lee, Dong-Young;Lee, Jung-Woon
    • Proceedings of the KIEE Conference / 2005.05a / pp.203-205 / 2005
  • A reactor protection system (RPS) generates the reactor trip signal and the engineered safety features (ESF) actuation signal when the monitored plant processes reach predefined limits. A Korean project group is developing a new digitalized RPS, together with the Cabinet Operator Module (COM) of the RPS, which an equipment operator uses for RPS integrity testing and monitoring. A flat panel display (FPD) with touch screen capability is provided as the main user interface for RPS operation. To support the RPS COM user interface design (specifically, the FPD screen design), we developed a user interface style guide, because the system designers could not properly deal with the many general human factors design guidelines. To develop the style guide, we carried out the gathering of various design guidelines, a walk-through with a video recorder, guideline selection with respect to user interface design elements, determination of the properties of the design elements, discussions with the system designers, and a conversion of the properties into a screen design. This paper describes the process in detail and the findings made in the course of the style guide development.

The Application of a Quantitative Performance Assessment Model in Accordance with Button Menu Form Changes in Touch Screen Input Methods (터치스크린 입력방식에서 버튼메뉴의 형상변화에 따른 정량적 수행평가 모델 적용)

  • Han, Sang-Bok;Pyo, Jung-Sun
    • Journal of Digital Convergence / v.13 no.11 / pp.337-348 / 2015
  • With touch input methods, unintended touch and input difficulties have increased compared to using a mouse with conventional menus. This study analyzed usability evaluations according to changes in the form and size of minimal button shapes in button menus. It attempted to verify the applicability of a quantitative performance prediction model to the interface design of touch input methods, through comparative analysis with Fitts' law, the representative regression-model formula and human performance evaluation method for interfaces. The study is significant in that it reflects, in the design, the user times associated with the button form and size changes that can be applied to touch screens.
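Fitts' law, the model referred to above, predicts movement time from target distance and width. In the widely used Shannon formulation:

```python
import math

def fitts_movement_time(a, b, distance, width):
    """Shannon formulation of Fitts' law:
    MT = a + b * log2(D / W + 1),
    where D is the distance to the target, W its width along the axis
    of motion, and a, b are empirically fitted regression constants."""
    return a + b * math.log2(distance / width + 1)
```

The term log2(D/W + 1) is the index of difficulty: enlarging a touch-screen button (bigger W) lowers it, so the predicted selection time drops, which is exactly the button form/size trade-off the study evaluates.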

A Study on Air Interface System (AIS) Using Infrared Ray (IR) Camera (적외선 카메라를 이용한 에어 인터페이스 시스템(AIS) 연구)

  • Kim, Hyo-Sung;Jung, Hyun-Ki;Kim, Byung-Gyu
    • The KIPS Transactions:PartB / v.18B no.3 / pp.109-116 / 2011
  • In this paper, we introduce a non-touch interface system technology without any touch-style controlling mechanism, called "Air-interface". To develop this system, we used the total reflection principle of infrared (IR) light, and the user's hand is separated from the background in the obtained image at every frame. The segmented hand region of each frame is used as input data for a hand-motion recognition module, which performs the control event mapped to the specified hand motion after verifying it. In this paper, we introduce the developed and suggested methods for image processing and hand-motion recognition. The developed air-touch technology will be very useful for advertisement panels, entertainment presentation systems, kiosk systems, and many other applications.
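The per-frame hand separation described above can be sketched as a global threshold over the IR image: with IR illumination and total reflection, the hand appears much brighter than the background. The threshold value is an assumption, and the paper's pipeline adds further image processing and hand-motion matching on top of this step.

```python
def segment_hand(frame, threshold):
    """Binary segmentation of an IR frame.

    frame is a 2-D list of pixel intensities; pixels at or above the
    threshold are marked 1 (hand), the rest 0 (background).
    """
    return [[1 if px >= threshold else 0 for px in row] for row in frame]
```

The resulting binary mask is what a hand-motion recognition module would consume frame by frame.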

Development of a Smartphone Interface using Infrared Approach (적외선 방식의 스마트 폰 인터페이스 개발)

  • Jang, Jae-Hyeok;Kim, Byung-Ki;Song, Chang-Geun;Ko, Young-Woong
    • The KIPS Transactions:PartA / v.18A no.2 / pp.53-60 / 2011
  • Touch screens are widely used as the basic input for mobile devices. However, on smartphones, touch screens have been utilized as simple devices that merely process text data. They have been considered inappropriate multimedia input devices for design applications such as painting, because the narrow touch screen space makes it inconvenient to draw pictures. In this study, we propose to address this weakness by using an infrared approach for a smartphone's input interface. The infrared approach allows quick and accurate operations without space constraints. In this paper, we provide a detailed description of the design and implementation of a smartphone user interface using an infrared pointing device. Additionally, experimental results show that our proposed approach is more convenient and efficient than the traditional touch screen approaches used in smartphones.