• Title/Summary/Keyword: Touch User Interface


A Study on Remote Multi-Touch System Based on Camera Interface (카메라 인터페이스 기반 원격 멀티터치 시스템 연구)

  • Nam, Byeong-Cheol; Bae, Ki-Tae
    • Proceedings of the Korea Contents Association Conference / 2011.05a / pp.505-506 / 2011
  • Until the early 21st century, multi-touch systems failed to gain popular appeal despite extensive research and performance improvements. However, with the arrival of the iPhone in 2007, multi-touch interfaces began to attract attention, and interest in NUI (Natural User Interface) grew explosively. In 2010, smartphones accounted for only about 20% of the overall mobile market, but their growth curve was rising steeply. Considering that the use of NUI has been concentrated on smartphones and multi-touch monitors, this paper proposes a remote multi-touch interface system.


Design of Contactless Gesture-based Rhythm Action Game Interface for Smart Mobile Devices

  • Ju, Da-Young
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.585-591 / 2012
  • Objective: The aim of this study is to propose a contactless gesture-based interface for smart mobile devices, especially for rhythm action games. Background: Most existing interactions in smart mobile games are taps on the touch screen. However, that approach is undesirable for some users and in some situations, for example for people with disabilities, or when having to touch or tap a specific device is inconvenient. More importantly, a new interaction can open new possibilities for a stagnant game genre. Method: In this paper, I present a smart mobile game with contactless gesture-based interaction, together with interfaces built on computer vision technology. As preliminary studies, gestures that are easy to recognize were identified, and interaction systems suited to games on smart mobile devices were investigated. A combination of augmented reality techniques and contactless gesture interaction was also attempted. Results: The rhythm game allows a user to interact with a smart mobile device using hand gestures, without touching or tapping the screen. Moreover, users can enjoy the game as much as other games. Conclusion: Evaluation results show that users make few errors, and the game recognizes gestures with high precision in real time. Therefore, contactless gesture-based interaction has potential for smart mobile games. Application: The results have been applied to a commercial game application.

A Study on Implementing Kinect-Based Control for LCD Display Contents (LCD Display 설비 Contents의 Kinect기반 동작제어 기술 구현에 관한 연구)

  • Rho, Jungkyu
    • The Transactions of The Korean Institute of Electrical Engineers / v.63 no.4 / pp.565-569 / 2014
  • Recently, various kinds of new computer-controlled devices have been introduced in a wide range of areas, and convenient user interfaces for controlling those devices are strongly needed. To implement natural user interfaces (NUIs) on top of such devices, new technologies such as touch screens, the Wii Remote, wearable interfaces, and Microsoft Kinect have been presented. This paper presents a natural and intuitive gesture-based model for controlling the contents of an LCD display. A Microsoft Kinect sensor and its SDK are used to recognize human gestures, and the gestures are interpreted into corresponding commands to be executed. A command dispatch model is also proposed in order to handle the commands more naturally. I expect the proposed interface can be used in various fields, including display content control.
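
A command dispatch model of the kind this abstract describes can be sketched roughly as follows. The gesture names and commands here are hypothetical placeholders; a real implementation would receive gesture events from the Kinect SDK's skeleton tracking rather than from direct calls.

```python
# Minimal sketch of a gesture-to-command dispatch model: recognized gestures
# are translated into commands and queued, then executed in order.
from collections import deque

class CommandDispatcher:
    def __init__(self):
        self.handlers = {}    # gesture name -> command callable
        self.queue = deque()  # pending commands, executed in arrival order

    def register(self, gesture, handler):
        self.handlers[gesture] = handler

    def on_gesture(self, gesture):
        # Translate a recognized gesture into a command and enqueue it;
        # unmapped gestures are silently ignored.
        if gesture in self.handlers:
            self.queue.append(self.handlers[gesture])

    def run_pending(self):
        results = []
        while self.queue:
            command = self.queue.popleft()
            results.append(command())
        return results

dispatcher = CommandDispatcher()
dispatcher.register("swipe_left", lambda: "next_slide")
dispatcher.register("swipe_right", lambda: "previous_slide")
dispatcher.on_gesture("swipe_left")
dispatcher.on_gesture("unknown")
print(dispatcher.run_pending())  # ['next_slide']
```

Decoupling recognition from execution through a queue is what lets the commands be handled "more naturally": gesture events can arrive at sensor rate while the display consumes commands at its own pace.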

Multi Spatial Interaction Interface in Large-scale Interactive Display Environment (대규모 인터랙티브 디스플레이 환경에서의 멀티 공간 인터랙션 인터페이스)

  • Yun, Chang-Ok;Park, Jung-Pil;Yun, Tae-Soo;Lee, Dong-Hoon
    • The Journal of the Korea Contents Association / v.10 no.2 / pp.43-53 / 2010
  • Interactive displays provide users with various interaction modes through ubiquitous computing technologies. Previous methods have been studied for their interaction techniques, but they are limited to a single user and constrained by device usability. In this paper, we propose a new type of spatial multi-interaction interface that provides various spatial touch interactions to multiple users in an ambient display environment. We generate an interaction surface using IR-LED array bars installed in the ceiling of the ambient display environment. Users can then experience various interactions through spatial touch on this interaction surface. Consequently, the system offers an interactive display and interface method with which users can interact through natural hand movements, without portable devices.

Performance Comparison of Manual and Touch Interface using Video-based Behavior Analysis

  • Lee, Chai-Woo;Bahn, Sang-Woo;Kim, Ga-Won;Yun, Myung-Hwan
    • Journal of the Ergonomics Society of Korea / v.29 no.4 / pp.655-659 / 2010
  • The objective of this study is to quantitatively incorporate user observation into the usability evaluation of mobile interfaces, using monitoring techniques from first- and third-person points of view. An experiment was conducted to monitor and record users' behavior with an Ergoneers Dikablis gaze-tracking device. The experiment used two mobile phones, one with a button keypad interface and one with a touchscreen interface, for comparative analysis. The subjects were 20 people with similar experience and proficiency in using mobile devices. Data from the video recordings were coded with Noldus Observer XT to find usage patterns and to gather quantitative data for analysis in terms of effectiveness, efficiency, and satisfaction. Results showed that the button keypad interface was generally better than the touchscreen interface: the movements of the fingers and gaze were much simpler when performing the given tasks on the button keypad. While previous studies have mostly evaluated usability with performance measures by looking only at task results, this study contributes a method in which the behavioral patterns of interaction are evaluated.
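
The three usability measures named in the abstract can be computed from coded observation data along these lines. The trial records and field names below are hypothetical; real data would come from an Observer XT export.

```python
# Summarize coded trials into effectiveness (task success rate),
# efficiency (mean time on completed tasks), and satisfaction (mean rating).
def summarize(trials):
    completed = [t for t in trials if t["success"]]
    effectiveness = len(completed) / len(trials)
    efficiency = (sum(t["seconds"] for t in completed) / len(completed)
                  if completed else float("nan"))
    satisfaction = sum(t["rating"] for t in trials) / len(trials)
    return effectiveness, efficiency, satisfaction

trials = [
    {"success": True,  "seconds": 12.0, "rating": 4},
    {"success": True,  "seconds": 18.0, "rating": 5},
    {"success": False, "seconds": 30.0, "rating": 2},
]
print(summarize(trials))
```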

The Design and fabrication of food waste system

  • Yeo, Seok-ki;Kim, Gye-Kuk;Seo, Chang Ok
    • Journal of the Korea Society of Computer and Information / v.21 no.1 / pp.101-105 / 2016
  • Since a weight-rate (pay-by-weight) disposal system for food waste has been implemented, food waste systems must be installed in all apartment complexes. In this paper, we supplied electric energy to the food waste system using solar panels. The weight of the food waste is displayed on the LCD panel, and its price is calculated from its weight. Since a touch-type card may not be recognized when contaminated by foreign material, we designed a recognition device with a contactless card reader to implement the food waste system. The food waste system was designed with a GUI (graphical user interface) so that users can easily understand it.
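
The weight-to-price step the abstract describes is a simple tariff calculation. The rate per kilogram below is a hypothetical value; real deployments use a locally set tariff.

```python
# Weight-based pricing for a pay-by-weight food waste system, plus the
# line of text an LCD panel might display (weight and computed price).
RATE_PER_KG = 50  # hypothetical price units per kilogram

def price_for(weight_kg, rate=RATE_PER_KG):
    if weight_kg < 0:
        raise ValueError("weight cannot be negative")
    return round(weight_kg * rate)

def lcd_line(weight_kg):
    return f"{weight_kg:.2f} kg  ->  {price_for(weight_kg)}"

print(lcd_line(2.0))  # "2.00 kg  ->  100"
```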

The Design of Efficient User Environment on the FTIR Multi Touch (FTIR 멀티터치 테이블에서 효율적인 사용자 환경 개발)

  • Park, Sang-Bong;Ahn, Jung-Seo
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.12 no.2 / pp.85-94 / 2012
  • In this paper, we develop new gestures for controlling the screen with fingers on an FTIR multi-touch table. We also describe recognition of mobile devices on the table using an infrared camera. The FTIR multi-touch table was inconvenient to use with the existing Bluetooth connection because it does not provide an HID (Human Input Device) interface. The proposed data transmission method using a mobile device relieves the inconvenience of data transfer and proceeds more efficiently. The new gestures and the data transmission method are verified on an FTIR multi-touch table implemented for testing.

Implementation of non-Wearable Air-Finger Mouse by Infrared Diffused Illumination (적외선 확산 투광에 의한 비장착형 공간 손가락 마우스 구현)

  • Lee, Woo-Beom
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.15 no.2 / pp.167-173 / 2015
  • Extraction of fingertip points is one of the most important processes for user multi-commands in hand-gesture interface technology. However, most previous works use geometric and morphological methods to extract fingertip points. This paper therefore proposes a method of extracting the user's fingertip points based on infrared diffused illumination, which is used for user commands in multi-touch display devices. The proposed air mouse operates according to the number and moving direction of the extracted fingertip points. Our system also provides basic mouse events, as well as a continuous command function for extending user multi-gestures. To evaluate the performance of the proposed method, we applied it to a web browser application as a command device. As a result, the proposed method showed an average 90% success rate across the various user commands.
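
Fingertip-point extraction from a diffused-IR image can be sketched as thresholding the intensity grid, labeling connected bright regions, and taking each region's centroid as a candidate fingertip. The tiny image and threshold below are hypothetical; a real system works on camera frames.

```python
# Threshold an intensity grid, flood-fill each bright blob, and return
# the (row, col) centroid of every blob as a candidate fingertip point.
def fingertip_points(image, threshold=128):
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    points = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy = sum(p[0] for p in blob) / len(blob)
                cx = sum(p[1] for p in blob) / len(blob)
                points.append((cy, cx))
    return points

frame = [
    [0,   0, 200, 0, 0],
    [0,   0, 220, 0, 0],
    [0,   0,   0, 0, 0],
    [0, 210,   0, 0, 0],
]
print(fingertip_points(frame))  # [(0.5, 2.0), (3.0, 1.0)]
```

Tracking these centroids across frames would give the count and moving direction that drive the air-mouse commands described above.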

A Photo Viewing Interface in a Ubiquitous Display Environment (유비쿼터스 디스플레이 환경에서의 사진 감상 인터페이스)

  • Ryu, Han-Sol;Choi, Soo-Mi
    • Journal of Korea Multimedia Society / v.12 no.10 / pp.1363-1373 / 2009
  • In a ubiquitous computing environment, people increasingly use multiple displays, and users need to interact with them seamlessly for a given content. In this paper, we propose a seamless interface that allows users to view photos across multiple displays. Many photos can be displayed on a screen at the same time because they are shown in 3D space, and a group of photos is represented using a cube metaphor. A user can manipulate photos directly on a touch screen, and can continue viewing them on a PDA when moving to another place. In addition, the user can interact with the photos seamlessly using another display. The developed system was designed to respond to the user's movements without drawing too much attention, acting as a kind of ambient display. Using RFID and ultrasonic sensors, it recognizes a user and the user's proximity to the display, and then automatically determines the screen image and the user interface accordingly.
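
The proximity-driven behavior this abstract describes can be sketched as mapping an ultrasonic distance reading to a view mode. The distance bands and view names below are hypothetical.

```python
# Select which view the ambient display shows, based on how close the
# (RFID-identified) user is according to an ultrasonic range reading.
def select_view(distance_cm):
    if distance_cm < 80:
        return "touch_ui"       # close enough to touch: interactive UI
    elif distance_cm < 250:
        return "photo_cube"     # mid-range: 3D photo-cube browsing view
    return "ambient_slideshow"  # far away: unobtrusive ambient display

print([select_view(d) for d in (50, 120, 400)])
# ['touch_ui', 'photo_cube', 'ambient_slideshow']
```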


Design of Handwriting-based Text Interface for Support of Mobile Platform Education Contents (모바일 플랫폼 교육 콘텐츠 지원을 위한 손 글씨 기반 텍스트 인터페이스 설계)

  • Cho, Yunsik;Cho, Sae-Hong;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society / v.27 no.5 / pp.81-89 / 2021
  • This study proposes a text interface to support language-based educational content in a mobile platform environment. The proposed interface uses deep learning in its input structure to write words by handwriting. Building on the GUI (Graphical User Interface) of mobile platform content, with its buttons, menus, and input methods such as screen touch, click, and drag, we design a text interface that can directly accept and process handwriting from the user. It uses the EMNIST (Extended Modified National Institute of Standards and Technology database) dataset and a trained CNN (Convolutional Neural Network) to classify handwritten letters and combine them into complete words. Finally, we conduct experiments by producing English word education content, analyzing the learning-support effect of the proposed interface, and comparing satisfaction. We compared the English-word learning of users who had experienced the existing keypad-type interface with that of users of the proposed handwriting-based text interface in the same educational environment, and we analyzed overall satisfaction in the process of writing words with each interface.
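
The classify-and-combine step described in this abstract can be sketched as follows. A real system runs a CNN trained on EMNIST over each handwritten glyph; here a stand-in classifier returns a probability vector over the 26 letters so the word-assembly logic can be shown on its own.

```python
# Combine per-glyph classifier outputs into a word by taking the most
# probable letter for each glyph. `fake_classifier` stands in for the CNN.
import string

ALPHABET = string.ascii_lowercase

def combine_word(probability_vectors):
    letters = []
    for probs in probability_vectors:
        best = max(range(len(ALPHABET)), key=lambda i: probs[i])
        letters.append(ALPHABET[best])
    return "".join(letters)

def fake_classifier(letter):
    # Near-one-hot distribution peaked at `letter` (stand-in for the CNN).
    probs = [0.001] * len(ALPHABET)
    probs[ALPHABET.index(letter)] = 0.9
    return probs

glyph_outputs = [fake_classifier(ch) for ch in "touch"]
print(combine_word(glyph_outputs))  # "touch"
```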