• Title/Summary/Keyword: Hand-based User Interface

Search Result 126, Processing Time 0.025 seconds

Human factors guidelines for designing anchors in the moving pictures on multimedia systems (멀티미디어 시스템의 동영상 노드를 위한 앵커의 인간공학적 설계지침)

  • Han, Sung-H.;Kim, Mi-Jeong;Kwahk, Ji-Young
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.22 no.2
    • /
    • pp.265-276
    • /
    • 1996
  • Multimedia systems present information through various media, such as video, sound, music, animation, and movies, in addition to the text that has long been used to convey information. Among many multimedia applications, multimedia information retrieval systems, commercialized in the form of multimedia encyclopedia CD-ROMs, benefit from various media in presenting information efficiently and completely. Using several media, however, may confuse end users, and a poorly designed user interface often exacerbates the problem. In this study, multimedia systems were examined from the standpoint of usability. A conceptual framework for the user interface of multimedia systems was newly defined, and 100 initial variables for user interface design of general multimedia systems were derived from a literature survey and expert opinions based on this framework. Among various application areas, multimedia information retrieval systems were chosen for investigation, and 36 variables particularly relevant to their user interfaces were selected. Following a sequential research strategy, the variables considered most important were then selected through a screening stage. A subset of the selected variables was verified through a human factors experiment as the first step of the sequential research, and guidelines for user interface design were provided based on the experimental results. In future work, the remaining variables will be investigated and the study will be extended to other application areas.

  • PDF

Intuitive Spatial Drawing System based on Hand Interface (손 인터페이스 기반 직관적인 공간 드로잉 시스템)

  • Ko, Ginam;Kim, Serim;Kim, YoungEun;Nam, SangHun
    • Journal of Digital Contents Society
    • /
    • v.18 no.8
    • /
    • pp.1615-1620
    • /
    • 2017
  • The development of Virtual Reality (VR) technologies has improved the performance of VR devices and lowered their prices, giving many users easy access to VR technology. VR drawing applications are not complicated for users and are mature enough to be used for education, performances, and more. In controller-based spatial drawing interfaces, however, the user's drawing interface is constrained by the controller. This study proposes a hand-interaction-based spatial drawing system in which users who have never used a controller can intuitively operate the drawing application, with a Leap Motion sensor mounted on the front of the Head Mounted Display (HMD). The system tracks the motion of the user's hand in front of the HMD to draw curved surfaces in the virtual environment.
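The core of such a drawing system is turning a tracked fingertip path into renderable surface geometry. The sketch below, which uses hypothetical sample points rather than the Leap Motion SDK, expands a 3D stroke polyline into a triangle-strip ribbon; a real system would orient the ribbon using the tracked palm normal rather than a fixed axis.

```python
# Sketch: turning a tracked fingertip path into a drawable ribbon strip.
# The fingertip positions would come from a hand-tracking sensor such as
# Leap Motion; here they are hypothetical sample points.

def ribbon_from_stroke(points, width=0.02):
    """Expand a 3D polyline into pairs of vertices forming a triangle strip."""
    strip = []
    for (x, y, z) in points:
        # Offset each point sideways (along x here, for simplicity) to give
        # the stroke a constant width; a real system would offset along the
        # hand's palm normal to orient the ribbon.
        strip.append((x - width / 2, y, z))
        strip.append((x + width / 2, y, z))
    return strip

stroke = [(0.0, 0.0, 0.3), (0.0, 0.1, 0.31), (0.0, 0.2, 0.33)]
vertices = ribbon_from_stroke(stroke)
# Each input point yields two strip vertices (here, 6 vertices from 3 points).
```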

Functional Analysis and Design of Touch User Interface in Mobile Game (모바일게임 터치사용자인터페이스(TUI)의 기능적 분석 및 설계)

  • Kim, Mi-Jin;Yoon, Jin-Hong
    • The Journal of the Korea Contents Association
    • /
    • v.10 no.1
    • /
    • pp.138-146
    • /
    • 2010
  • Mobile phones now offer control interfaces that are easy, intuitive, and versatile, together with wide displays, and phones equipped with touch screens are being actively released thanks to these strengths. This is the mega trend in recent mobile game development. Mobile games designed for the old keypad input scheme have had to adapt to the new input environment, so research on the Touch User Interface (TUI) of mobile games built around touch-screen input is necessary. This study derived concrete methods for applying touch interaction to games through a comparative analysis with keypad-based games, and examined usability for ten touch mobile game titles released at home and abroad in order to adapt the touch interface to hand-held devices with touch capability. The results have two implications. First, touch input enhances the playability and diversity of game genres previously restricted by the limitations of the keypad input device. Second, the findings can serve as a basis for interface standards for touch mobile games by genre.

Intelligent interface using hand gestures recognition based on artificial intelligence (인공지능 기반 손 체스처 인식 정보를 활용한 지능형 인터페이스)

  • Hangjun Cho;Junwoo Yoo;Eun Soo Kim;Young Jae Lee
    • Journal of Platform Technology
    • /
    • v.11 no.1
    • /
    • pp.38-51
    • /
    • 2023
  • We propose an intelligent interface algorithm that uses hand gesture recognition information based on artificial intelligence. The interface recognizes various motions quickly and intelligently by tracking and recognizing user hand gestures with MediaPipe and artificial intelligence techniques such as KNN, LSTM, and CNN. To evaluate the performance of the proposed algorithm, it was applied to a self-made 2D top-view racing game and to robot control. In the game, various movements of the virtual object could be controlled precisely and robustly. When applied to robot control in the real world, the algorithm could command movement, stop, left turn, and right turn. In addition, by controlling the game's main character and the real-world robot at the same time, the optimized motions were implemented as an intelligent interface for controlling a space where the virtual and real worlds coexist. The proposed algorithm enables sophisticated control with natural and intuitive body movements and fine finger motions, can be mastered in a short time, and can therefore serve as basic data for developing intelligent user interfaces.

  • PDF
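Of the techniques the abstract names, KNN is the simplest to illustrate. MediaPipe reports 21 (x, y, z) landmarks per hand; a KNN classifier then labels a flattened landmark feature vector by majority vote among its nearest training samples. The sketch below uses made-up two-dimensional feature vectors and gesture labels purely for illustration.

```python
import math
from collections import Counter

# Minimal sketch of the KNN step in a MediaPipe-style gesture classifier.
# Real feature vectors would be flattened hand-landmark coordinates; the
# training data and gesture labels below are illustrative assumptions.

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label); returns the majority label
    among the k training samples nearest to query (Euclidean distance)."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

train = [
    ([0.10, 0.90], "open_palm"), ([0.15, 0.85], "open_palm"),
    ([0.80, 0.10], "fist"),      ([0.85, 0.20], "fist"),
]
print(knn_classify(train, [0.12, 0.88]))  # → open_palm
```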

Study on User Interface for a Capacitive-Sensor Based Smart Device

  • Jung, Sun-IL;Kim, Young-Chul
    • Smart Media Journal
    • /
    • v.8 no.3
    • /
    • pp.47-52
    • /
    • 2019
  • In this paper, we designed HW/SW interfaces for processing the signals of capacitive sensors, such as the Electric Potential Sensor (EPS), which detect disturbances in the surrounding electric field as feature signals for motion recognition systems, and implemented a smart light control system with those interfaces. In the system, the on/off switch and brightness adjustment are controlled by hand gestures through the designed and fabricated interface circuits. PWM (Pulse Width Modulation) signals from the controller, together with a driver IC, drive the LED and control its brightness and on/off operation. Using the hand-gesture signals obtained through EPS sensors and the interface HW/SW, we can not only construct a gesture-command system but also achieve faster recognition by developing dedicated interface hardware including control circuitry. Finally, using the proposed hand-gesture recognition and signal processing methods, the light control module was designed and implemented. Experimental results show that the smart light control system controls the LED module properly through accurate motion detection and gesture classification.
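The control logic in such a system reduces to mapping each recognized gesture to a new PWM duty cycle for the LED driver. The sketch below shows that step; the gesture names and the step size are illustrative assumptions, not details from the paper.

```python
# Sketch of the brightness-control step: mapping a recognized hand gesture
# to a PWM duty cycle (0-100 %) for the LED driver. Gesture names and the
# step size are illustrative assumptions.

DUTY_STEP = 10  # percent change per swipe gesture

def update_duty(duty, gesture):
    """Return the new PWM duty cycle after applying one gesture."""
    if gesture == "swipe_up":
        duty = min(100, duty + DUTY_STEP)
    elif gesture == "swipe_down":
        duty = max(0, duty - DUTY_STEP)
    elif gesture == "hold":  # toggle between off and full brightness
        duty = 0 if duty > 0 else 100
    return duty

duty = 50
for g in ["swipe_up", "swipe_up", "swipe_down"]:
    duty = update_duty(duty, g)
print(duty)  # → 60
```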

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun;Ahn, So-young;Shin, Dong-il;Shin, Dong-kyoo
    • Journal of Internet Computing and Services
    • /
    • v.16 no.6
    • /
    • pp.11-21
    • /
    • 2015
  • As interest in Human-Computer Interaction (HCI) grows, research on HCI has been actively conducted, including research on the Natural User Interface/Natural User eXperience (NUI/NUX), which uses a user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture or voice recognition, but these algorithms are complex to implement and require long training times because they must go through preprocessing, normalization, and feature extraction steps. Kinect, launched by Microsoft as an NUI/NUX development tool, has attracted attention, and studies using Kinect have been conducted. In a previous study, the authors implemented a highly intuitive hand-mouse interface using the physical features of the user, but it suffered from unnatural mouse movement and low accuracy of the mouse functions. In this study, we designed and implemented a hand-mouse interface that introduces a new concept called the 'virtual monitor', extracting the user's physical features through Kinect in real time. The virtual monitor is a virtual space that can be controlled by the hand mouse, with coordinates on the virtual monitor accurately mapped onto coordinates on the real monitor. The hand-mouse interface based on the virtual monitor concept keeps the outstanding intuitiveness of the previous study while improving the accuracy of the mouse functions. Furthermore, we increased accuracy by recognizing the user's unnecessary actions using a concentration indicator derived from electroencephalogram (EEG) data. To evaluate intuitiveness and accuracy, we tested the interface on 50 people in their teens to fifties. In the intuitiveness experiment, 84% of subjects learned how to use it within one minute, and in the accuracy experiment the mouse functions achieved accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). Having confirmed the intuitiveness and accuracy of the proposed hand-mouse interface through experiments, we expect it to serve as a good example of an interface for controlling systems by hand in the future.
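The virtual-to-real mapping the abstract describes is, at its core, a linear transform: the hand position within a virtual rectangle in front of the user (whose size and position the paper derives from the user's physical features via Kinect) is normalized and scaled to screen pixels. The sketch below shows that mapping with assumed dimensions; the clamping is a common practical addition, not a detail from the paper.

```python
# Sketch of the virtual-to-real monitor mapping: hand coordinates inside a
# virtual rectangle are mapped linearly onto screen pixels. All dimensions
# here are illustrative assumptions.

def map_to_screen(hand, virt_origin, virt_size, screen=(1920, 1080)):
    """Map an (x, y) hand position on the virtual monitor to screen pixels."""
    nx = (hand[0] - virt_origin[0]) / virt_size[0]
    ny = (hand[1] - virt_origin[1]) / virt_size[1]
    # Clamp so the cursor stays on screen even if the hand drifts outside
    # the virtual monitor.
    nx = min(1.0, max(0.0, nx))
    ny = min(1.0, max(0.0, ny))
    return int(nx * (screen[0] - 1)), int(ny * (screen[1] - 1))

print(map_to_screen((0.25, 0.125), virt_origin=(0.0, 0.0), virt_size=(0.5, 0.5)))
# → (959, 269)
```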

Development of educational software for beam loading analysis using pen-based user interfaces

  • Suh, Yong S.
    • Journal of Computational Design and Engineering
    • /
    • v.1 no.1
    • /
    • pp.67-77
    • /
    • 2014
  • Most engineering software tools use typical menu-based user interfaces, which may not be suitable for learning tools because the solution process is hidden and students can only see the results. An educational tool for simple beam analyses was developed using a pen-based user interface so that students can write and sketch by hand on the computer. The geometry of a beam section is sketched, and a shape-matching technique recognizes the sketch. Various beam loads are added by sketching gestures or writing singularity functions. Students sketch the load distributions as graphs, which are checked automatically, and the system provides aids in grading the graphs. Students receive interactive graphical feedback for a better learning experience while solving the problems.
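The singularity functions mentioned in the abstract are Macaulay brackets ⟨x − a⟩ⁿ: zero before the load's starting point a, (x − a)ⁿ after it. A minimal sketch of evaluating them, with illustrative load values not taken from the paper:

```python
# Sketch of evaluating a singularity (Macaulay) function <x - a>^n, the
# notation students write to describe beam loads. Load values below are
# illustrative, not from the paper.

def macaulay(x, a, n):
    """<x - a>^n: zero for x < a, (x - a)^n otherwise (n >= 0)."""
    return (x - a) ** n if x >= a else 0.0

# Shear from a 2 kN point load at x = 1 m plus a 1 kN/m distributed load
# starting at x = 3 m, with x in metres along the beam:
def shear(x):
    return 2.0 * macaulay(x, 1.0, 0) + 1.0 * macaulay(x, 3.0, 1)

print(shear(0.5))  # → 0.0  (before any load)
print(shear(2.0))  # → 2.0  (point load only)
print(shear(4.0))  # → 3.0  (2 + 1*(4-3))
```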

Touch User Interface of Relative Coordinate Style based on Drag and Diversion Operations (드래그 및 방향전환 동작 기반의 상대좌표형 터치 유저 인터페이스)

  • Paik, Jung-Hoon;Choi, Kyung-Soon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.15 no.1
    • /
    • pp.89-98
    • /
    • 2011
  • In this paper, a new touch user interface based on the hand operations of dragging and diversion is presented. It improves the convenience and speed of entering text and of searching and selecting items in multi-layered menus. The interface uses a relative coordinate scheme that displays text at touch positions corresponding to the movement of the touch location, and it accommodates more text codes than conventional fixed-coordinate schemes, which allocate text to fixed locations on the touch screen. The suggested interface was implemented on an IPTV remote control and set-top box to demonstrate its practicality and effectiveness.
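The relative-coordinate idea can be sketched simply: the cursor (or menu focus) moves by the drag's deltas rather than jumping to absolute touch positions, so a drag starting anywhere on the screen has the same effect. This sketch covers only the drag part; diversion (direction-change) detection is omitted.

```python
# Sketch of relative-coordinate touch input: the cursor moves by successive
# deltas of the touch path, not to absolute touch positions. Diversion
# (direction-change) handling is omitted for brevity.

def apply_drag(cursor, touch_path):
    """Move cursor by the accumulated deltas of the touch path."""
    x, y = cursor
    for (x0, y0), (x1, y1) in zip(touch_path, touch_path[1:]):
        x += x1 - x0
        y += y1 - y0
    return (x, y)

# A drag of (+20, +10) moves the cursor by (+20, +10) regardless of where
# on the screen the drag started: only relative motion matters.
print(apply_drag((100, 100), [(400, 400), (410, 405), (420, 410)]))  # → (120, 110)
```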

Comparative Study on the Interface and Interaction for Manipulating 3D Virtual Objects in a Virtual Reality Environment (가상현실 환경에서 3D 가상객체 조작을 위한 인터페이스와 인터랙션 비교 연구)

  • Park, Kyeong-Beom;Lee, Jae Yeol
    • Korean Journal of Computational Design and Engineering
    • /
    • v.21 no.1
    • /
    • pp.20-30
    • /
    • 2016
  • Immersive virtual reality (VR) has recently become popular thanks to advances in I/O interfaces and related software for constructing VR environments, yet natural and intuitive manipulation of 3D virtual objects remains one of the most important user interaction issues. This paper presents a comparative study, covering both quantitative and qualitative aspects, of manipulating and interacting with 3D virtual objects using different interfaces in three VR environments: 1) a typical desktop-based VR using mouse and keyboard, 2) a hand-gesture-supported desktop VR using a Leap Motion sensor, and 3) an immersive VR in which the user wears an HMD and interacts through hand gestures via a Leap Motion sensor. In the desktop VR with hand gestures, the Leap Motion sensor is placed on the desk; in the immersive VR, the sensor is mounted on the HMD so that the user can manipulate virtual objects in front of it. For the quantitative analysis, task completion time and success rate were measured on tasks requiring complex 3D transformations such as simultaneous 3D translation and rotation. For the qualitative analysis, user experience factors such as ease of use, naturalness of interaction, and stressfulness were evaluated. The analyses show that the immersive VR with natural hand gestures provides more intuitive and natural interaction and supports fast, effective task completion, but causes more stressful conditions.

A Gesture Interface based on Hologram and Haptics Environments for Interactive and Immersive Experiences (상호작용과 몰입 향상을 위한 홀로그램과 햅틱 환경 기반의 동작 인터페이스)

  • Pyun, Hae-Gul;An, Haeng-A;Yuk, Seongmin;Park, Jinho
    • Journal of Korea Game Society
    • /
    • v.15 no.1
    • /
    • pp.27-34
    • /
    • 2015
  • This paper proposes a user interface that enhances immersiveness and usability by combining a hologram and a haptic device with the common Leap Motion sensor. While Leap Motion conveys the physical motion of the user's hand to control a virtual environment, it is limited to manipulating virtual hands on a screen and interacts with the virtual environment only one way. In our system, a hologram is coupled with Leap Motion to improve immersion by placing the real and virtual hands in the same location. Moreover, we provide a prototype of tactile interaction by designing a haptic device that conveys the sense of touch in the virtual environment to the user's hand.