• Title/Summary/Keyword: Touch Interface

Search Results: 267

Mini-Teleoperation system with a Force-Feedback Haptic Interface within a Virtual Environment (가상환경에서 힘 반영 촉각장치를 이용한 소형 원격조정 시스템)

  • 김대현;김영동;이현의
    • Proceedings of the KIPE Conference
    • /
    • 1998.07a
    • /
    • pp.116-122
    • /
    • 1998
  • This paper presents some of the challenges of creating feedback force through manipulation of a master manipulator, allowing the user to feel objects within a virtual environment. A sense of touch for the virtual environment was generated by a virtual compliance control method. In this system, the TCP protocol was used for data communication between the master and slave. In the experiments, the position error between the master and slave arms was about $13.56^{\circ}$ when the arms did not reflect the compliance properties of the virtual object; when they did, the position error was reduced to $2.43^{\circ}$.

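The virtual compliance control mentioned in the abstract can be sketched as a simple spring-damper force law; the function name, gains, and one-dimensional setup below are hypothetical illustrations, not the paper's implementation.

```python
def compliance_force(x_master, x_surface, v_master=0.0, k=50.0, b=2.0):
    """Force reflected to the master arm when it penetrates a virtual
    object's surface (virtual compliance, i.e. a spring-damper law).
    k (stiffness) and b (damping) are hypothetical gains."""
    penetration = x_master - x_surface
    if penetration <= 0.0:
        return 0.0  # no contact with the virtual object: no reflected force
    # Push back proportionally to penetration depth and approach velocity
    return k * penetration + b * v_master
```

Free motion yields zero force; once the master crosses the virtual surface, the reflected force grows with penetration depth, which is what allows the slave to track the master more closely, consistent with the reduced position error reported above.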

A Study on Railway Vehicles Fire Detection using HMI Touch Screen (HMI 터치스크린을 이용한 철도차량용 복합화재수신기 개발 연구)

  • Park, In-Deok;Kim, Chang
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.30 no.1
    • /
    • pp.38-43
    • /
    • 2016
  • Social demand for traffic safety has recently increased, together with economic demand for public safety, raising the demand for environmentally friendly rail transport. In particular, securing the reliability of express trains such as the KTX (Korea Train eXpress) against potential disasters (fire) during train operation has become very important. Because detecting a fire quickly at an early stage, before it reaches the flash point, is more important than suppressing it afterward, research and development is required on an intelligent composite fire receiver that connects the fire receiver, fire-fighting equipment, and HMI (Human Machine Interface) through CAN (Controller Area Network) communication.

Multi Spatial Interaction Interface in Large-scale Interactive Display Environment (대규모 인터랙티브 디스플레이 환경에서의 멀티 공간 인터랙션 인터페이스)

  • Yun, Chang-Ok;Park, Jung-Pil;Yun, Tae-Soo;Lee, Dong-Hoon
    • The Journal of the Korea Contents Association
    • /
    • v.10 no.2
    • /
    • pp.43-53
    • /
    • 2010
  • Interactive displays provide various interaction modes to users through ubiquitous computing technologies. Previous methods have been studied for their interaction techniques, but they are limited to a single user and constrained by device usability. In this paper, we propose a new type of spatial multi-interaction interface that provides various spatial touch interactions to multiple users in an ambient display environment. We generate an interaction surface using IR-LED array bars installed in the ceiling of the ambient display environment, and users can experience various interactions through spatial touch on this surface. Consequently, this system offers an interactive display and interface method with which users can interact through natural hand movements without portable devices.

A study on user satisfaction in TUI environment (TUI 환경의 유저 사용 만족도 연구)

  • Choi, heungyeorl;Yang, seungyong
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.11 no.4
    • /
    • pp.113-127
    • /
    • 2015
  • The interface in the smart device environment is changing to a TUI (touch user interface) environment, in which the system is controlled by physical touch rather than through a conventional mouse and keyboard. What matters most in this TUI environment is to implement the interface in consideration of learnability and cognitive constructivism according to the user's experience. It is therefore time to carry out various studies on the smart content design process, going a step further in discussing the details of user experience factors. Hence, this study examined what effect a user's experiential traits have on the production of content, with the aim of improving TUI user satisfaction so that content can be realized effectively in the smart environment. Results were obtained through statistical empirical analysis, including cross-tabulation analysis by important variable and user, paired t-tests, multiple-response analysis, and preference-frequency analysis of user preference, on the basis of a survey. As a result, a system for implementing the DFSS (Design For Six Sigma) process was presented. TUI experience factors can be divided into direct habitual experience, direct learning experience, indirect habitual experience, and indirect learning experience. The results show that the important variables of this study had a positive effect on overall satisfaction with content, in line with the user convenience of smart content. This study is expected to contribute to efficient smart device-based content production by providing objective information from the empirical analysis to smart media developers and designers and by presenting a model for improving the usability of the changed TUI.
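The paired t-test used in the satisfaction analysis can be sketched in plain Python; the scores below are made-up illustrations, not the study's data.

```python
import math

def paired_t(before, after):
    """Paired t statistic and degrees of freedom for pre/post scores."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n), n - 1

# Hypothetical 5-point satisfaction scores before and after a TUI redesign
t, df = paired_t([3, 2, 4, 3, 3], [4, 4, 5, 3, 4])
```

The statistic is then compared against the t distribution with `df` degrees of freedom to decide whether the satisfaction change is significant.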

Prediction of Menu selection on Touch-screen Using A Cognitive Architecture: ACT-R (ACT-R을 이용한 터치스크린 메뉴 선택 수행 예측)

  • Min, Jung-Sang;Jo, Seong-Sik;Myung, Ro-Hae
    • Journal of the Ergonomics Society of Korea
    • /
    • v.29 no.6
    • /
    • pp.907-914
    • /
    • 2010
  • A cognitive model, or cognitive architecture, is a model expressed as a computer program that describes how humans solve a given problem; it is continuously investigated across fields such as cognitive engineering, computer engineering, and cognitive psychology. The broad applicability of cognitive models allows them to be used for quantitative prediction of human behavior and performance in many HCI areas, including user interface usability, artificial intelligence, natural programming languages, and robot engineering. When a system is designed, a usability test of the conceptual interface design is needed; in this case, analytical evaluation using a cognitive model such as GOMS or ACT-R is much more efficient than empirical evaluation, which requires products and subjects. In particular, given the recent trend of very short intervals between one technology development and the next, recruiting and training subjects for the usability tests repeated throughout system development takes time and effort, and ultimately delays the development of a new system. In this study, we quantitatively predicted the human behavior processes, including the cognitive processes, involved in menu selection on a touch screen interface using ACT-R. The results using the cognitive model agreed with the results of the existing empirical evaluation. We expect that cognitive models can serve not only as an effective methodology for evaluating HCI products and systems but also contribute to the activation of HCI cognitive modeling in Korea.

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk;Park, Se-Ho;Kim, Tae-Gon;Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.1
    • /
    • pp.378-390
    • /
    • 2015
  • Image projectors can turn any surface into a display. Integrating a surface projection with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems. Hand detection through color image processing is affected by the surrounding environment. The lack of illumination and color details greatly influences the detection process and drops the recognition success rate. In addition, there can be interference from the projection system itself due to image projection. In order to overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the area of the hand from the scene. A hand detection and finger tracking method based on depth images is proposed. Based on the proposed method, a touch interface for the projected surface is implemented and evaluated.
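The depth-image hand extraction described above can be sketched as background subtraction within a depth band; the array sizes, units, and thresholds below are hypothetical.

```python
def segment_hand(depth, background, min_diff=30, max_diff=300):
    """Mark pixels that sit between min_diff and max_diff millimetres
    above the background surface -- the band where a hand hovering over
    the projected surface is expected. depth and background are 2D lists
    of millimetre values; the thresholds are hypothetical."""
    return [[1 if min_diff <= (b - d) <= max_diff else 0
             for d, b in zip(row_d, row_b)]
            for row_d, row_b in zip(depth, background)]
```

Because the decision uses only depth differences, it is unaffected by the projected image or poor illumination, which is the motivation given above for preferring depth over color processing.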

Implementation of Paper Keyboard Piano with a Kinect (키넥트를 이용한 종이건반 피아노 구현 연구)

  • Lee, Jung-Chul;Kim, Min-Seong
    • Journal of the Korea Society of Computer and Information
    • /
    • v.17 no.12
    • /
    • pp.219-228
    • /
    • 2012
  • In this paper, we propose a paper keyboard piano implemented using finger movement detection with 3D image data from a Kinect. The keyboard pattern and keyboard depth information are extracted from the color image and depth image to detect touch events on the paper keyboard and to identify the touched key. Hand region detection errors are unavoidable when simply comparing the input depth image with the background depth image, and such errors are critical for key touch detection, so skin color is used to minimize them. Fingertips are then detected using contour detection with an area limit and a convex hull. Finally, the key touch decision is made using the keyboard pattern information at the fingertip position. Experimental results showed that the proposed method can detect key touches with high accuracy. The paper keyboard piano can serve as an easy and convenient interface for beginners learning to play the piano with PC-based learning software.
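The final key-touch decision, which combines the fingertip position with the extracted keyboard pattern and depth, can be sketched as follows; the key layout, coordinates, and touch threshold are hypothetical, not taken from the paper.

```python
def touched_key(finger_xy, finger_depth, surface_depth, key_layout, touch_mm=10):
    """Return the touched key name, or None if the fingertip is hovering.
    key_layout maps a key name to its (x0, y0, x1, y1) rectangle in image
    coordinates; a touch is declared when the fingertip depth is within
    touch_mm of the paper surface depth (the threshold is hypothetical)."""
    if abs(surface_depth - finger_depth) > touch_mm:
        return None  # finger above the surface: no touch event
    x, y = finger_xy
    for key, (x0, y0, x1, y1) in key_layout.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return key
    return None
```

A fingertip whose depth matches the paper surface at a point inside a key's rectangle triggers that key; a hovering fingertip triggers nothing, regardless of position.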

Design of Handwriting-based Text Interface for Support of Mobile Platform Education Contents (모바일 플랫폼 교육 콘텐츠 지원을 위한 손 글씨 기반 텍스트 인터페이스 설계)

  • Cho, Yunsik;Cho, Sae-Hong;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society
    • /
    • v.27 no.5
    • /
    • pp.81-89
    • /
    • 2021
  • This study proposes a text interface to support language-based educational content in a mobile platform environment. The proposed interface uses deep learning in its input structure to enter words through handwriting. Building on the GUI (Graphical User Interface) of mobile platform content, with buttons, menus, and input methods such as screen touch, click, and drag, we design a text interface that can directly accept and process handwriting from the user. It uses the EMNIST (Extended Modified National Institute of Standards and Technology database) dataset and a trained CNN (Convolutional Neural Network) to classify alphabetic characters and combine them into complete words. Finally, we conducted experiments to analyze the learning-support effect of the proposed interface by producing English word education content and comparing satisfaction. We compared the English word learning performance of users who experienced the existing keypad-type interface and the proposed handwriting-based text interface in the same educational environment, and analyzed overall satisfaction in the process of writing words with each interface.
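The step of combining per-character classifier outputs into a word can be sketched as an argmax over each character's class scores; the score lists here stand in for a trained EMNIST CNN's softmax outputs and are hypothetical.

```python
def combine_letters(char_scores, alphabet="abcdefghijklmnopqrstuvwxyz"):
    """Concatenate the most probable class of each handwritten character
    into a word. char_scores holds one score list per written character,
    indexed in the same order as the alphabet string."""
    return "".join(alphabet[max(range(len(scores)), key=scores.__getitem__)]
                   for scores in char_scores)
```

Each handwritten stroke image is classified independently; the word is simply the concatenation of the winning classes, so the classifier never needs to see the whole word at once.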

The Influence of Altering Mobile Phone Interface on the Generation of Mental Model (모바일 폰의 인터페이스 변경이 멘탈모델 형성에 미치는 영향)

  • Park, Ye-Jin;Kim, Bon-Han
    • Science of Emotion and Sensibility
    • /
    • v.11 no.4
    • /
    • pp.575-588
    • /
    • 2008
  • This study inquires into the patterns of mental models caused by erroneous usage that can occur when a user accustomed to a keypad-based mobile phone starts using a touch screen mobile phone, and identifies the features of the user's logical process of correcting such erroneous usage into a new mental model. Design improvements to ease the formation of mental models for touch screen mobile phones were also reviewed. We set up test tasks for the seven most frequently used touch screen phone functions and carried out subjective assessments together with interview surveys after a video observation experiment. The results show that test subjects accustomed to keypad-based mobile phones tend to draw on operational knowledge of the computer operating system (Windows) or the web browser, including navigation gestures such as Tap and Double Tap, in order to correct their mental model when an error occurs. In addition, comparison and analysis of the subjective assessments and the video observation data show that erroneous usage of touch screen mobile phones occurred mostly in the areas of 'information feedback' and 'navigation' among the mobile phone components.


A Study for the Accessibility of Camera-Based Mobile Applications on Touch Screen Devices for Blind People (스마트기기에서 시각장애인을 위한 카메라기반 인식 소프트웨어 인터페이스의 접근성 연구)

  • Choi, Yoonjung;Hong, Ki-Hyung
    • Journal of the HCI Society of Korea
    • /
    • v.7 no.2
    • /
    • pp.49-56
    • /
    • 2012
  • Camera-based mobile applications such as color, pattern, and object reading can improve the quality of life of blind people. However, currently available camera-based applications are inconvenient for the blind, since they do not reflect the accessibility requirements of the blind, especially on touch screens. We investigated accessibility requirements for the rapidly growing class of camera-based mobile applications on touch screen devices for the blind. To identify these requirements, we conducted usability testing of color reading applications with three different types of interfaces on Android OS. The results were as follows: (1) users preferred a shallow menu hierarchy, (2) initial audio help was more useful than just-in-time help, (3) users needed both manual and automatic camera shooting modes, although they preferred manual mode, (4) users wanted the OS-supported screen reader to be turned off while the color reading application was running, and (5) users required tactile feedback to identify the touch screen boundary. We designed a new user interface for blind people by applying the identified accessibility requirements. A usability test of the new interface with 10 blind people showed that the identified requirements serve as useful accessibility guidelines for camera-based mobile applications.
