• Title/Summary/Keyword: Interaction interface

A Study on Developmental Direction of Interface Design for Gesture Recognition Technology

  • Lee, Dong-Min;Lee, Jeong-Ju
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.4
    • /
    • pp.499-505
    • /
    • 2012
  • Objective: Research on the transformation of interaction between mobile machines and users through an analysis of current gesture interface technology development trends. Background: For smooth interaction between machines and users, interface technology has evolved from the "command line" to the "mouse", and now "touch" and "gesture recognition" are being researched and used. In the future, the technology is expected to evolve into "multi-modal" interfaces, which fuse the visual and auditory senses, and "3D multi-modal" interfaces, which use a three-dimensional virtual world and brain waves. Method: Within the development of computer interfaces, which follows the evolution of mobile machines, the trends and development of actively researched gesture interfaces and related technologies are studied comprehensively. Based on how gesture information is gathered, the interfaces are separated into four categories: sensor, touch, visual, and multi-modal gesture interfaces. Each category is examined through technology trends and existing real-world examples. Through these methods, the transformation of the interaction between mobile machines and humans is studied. Conclusion: Gesture-based interface technology brings intelligent communication to the interaction between formerly static machines and users. Thus, it is an important element technology that will make the interaction between humans and machines more dynamic. Application: The results of this study may help develop the gesture interface designs currently in use.
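
As a rough illustration of the four-way taxonomy used in this survey (a minimal sketch; only the category names come from the abstract, while the device examples, keywords, and function names are assumptions):

    from enum import Enum

    class GestureInterface(Enum):
        SENSOR = "sensor"            # e.g., motion-sensor-based gestures (assumed example)
        TOUCH = "touch"              # touch-screen gestures
        VISUAL = "visual"            # camera/vision-based gesture recognition
        MULTI_MODAL = "multi-modal"  # fusion of several input channels

    def categorize(input_source: str) -> GestureInterface:
        """Map how gesture information is gathered to one of the four
        categories named in the abstract (keyword mapping is illustrative)."""
        mapping = {
            "accelerometer": GestureInterface.SENSOR,
            "touchscreen": GestureInterface.TOUCH,
            "depth-camera": GestureInterface.VISUAL,
            "camera+microphone": GestureInterface.MULTI_MODAL,
        }
        return mapping.get(input_source, GestureInterface.MULTI_MODAL)

    print(categorize("depth-camera").value)   # visual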

Interaction and mechanical effect of materials interface of contact zone composite samples: Uniaxial compression experimental and numerical studies

  • Wang, Weiqi;Ye, Yicheng;Wang, Qihu;Luo, Binyu;Wang, Jie;Liu, Yang
    • Geomechanics and Engineering
    • /
    • v.21 no.6
    • /
    • pp.571-582
    • /
    • 2020
  • Aiming at the mechanical and structural characteristics of contact zone composite rock, uniaxial compression tests and numerical studies were carried out. The interaction forms and formation mechanisms at the contact interfaces of different materials were analyzed to reveal the effect of interaction on the mechanical behavior of composite samples. The research demonstrated that there are three types of interaction between the two materials at the contact interface: constraint parallel to the interface, squeezing perpendicular to the interface, and shear stress on the interface. The interaction is mainly affected by the differences in Poisson's ratio and elastic modulus between the two materials, stronger interface adhesion, and larger interface inclination. The interaction weakens the strength and stiffness of the composite sample, and the magnitude of weakening is positively correlated with the degree of difference in the mechanical properties of the materials. The tensile-shear stress derived from the interaction results in axial tensile fracture perpendicular to the interface and interfacial shear fracture. Tensile cracks in the stronger material propagate into the weaker material through the bonded interface. A larger interface inclination angle enhances the effect of combined tensile/shear failure on the overall sample.
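
The role of the Poisson's ratio and elastic modulus mismatch noted above can be illustrated with a basic elasticity estimate (a sketch under simple uniaxial-loading assumptions, not a result from the paper): each material, if unconstrained, would develop a different lateral strain under the same axial stress, and the bonded interface must suppress that difference, producing the constraint, squeezing, and shear described in the abstract.

    \varepsilon_{\mathrm{lat},i} = -\frac{\nu_i\,\sigma_a}{E_i}, \qquad i = 1, 2
    \qquad\Longrightarrow\qquad
    \Delta\varepsilon_{\mathrm{lat}} = \sigma_a \left| \frac{\nu_1}{E_1} - \frac{\nu_2}{E_2} \right|

Here sigma_a is the common axial stress and nu_i, E_i are the Poisson's ratio and elastic modulus of material i. The mismatch vanishes only when nu_1/E_1 equals nu_2/E_2, which is consistent with the abstract's observation that the weakening grows with the degree of difference in the materials' mechanical properties.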

MPEG-U-based Advanced User Interaction Interface Using Hand Posture Recognition

  • Han, Gukhee;Choi, Haechul
    • IEIE Transactions on Smart Processing and Computing
    • /
    • v.5 no.4
    • /
    • pp.267-273
    • /
    • 2016
  • Hand posture recognition is an important technique for enabling a natural and familiar interface in the human-computer interaction (HCI) field. This paper introduces a hand posture recognition method using a depth camera. Moreover, the hand posture recognition method is incorporated with the Moving Picture Experts Group Rich Media User Interface (MPEG-U) Advanced User Interaction (AUI) Interface (MPEG-U part 2), which can provide a natural interface on a variety of devices. The proposed method initially detects the positions and lengths of all opened fingers, and then recognizes the hand posture from the pose of one or two hands, as well as the number of folded fingers, when a user presents a gesture representing a pattern in the AUI data format specified in MPEG-U part 2. The AUI interface represents a user's hand posture in the compliant MPEG-U schema structure. Experimental results demonstrate the performance of the hand posture recognition system and verify that the AUI interface is compatible with the MPEG-U standard.
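
As a rough illustration of the finger-counting step described above (a minimal sketch with hypothetical data structures and posture labels, not the paper's implementation or the normative MPEG-U AUI pattern names), the posture label could be derived from the number of opened fingers per hand:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Finger:
        length_mm: float        # length estimated from the depth image (assumed field)
        extended: bool          # True if the finger is opened rather than folded

    @dataclass
    class Hand:
        fingers: List[Finger]   # detected fingers of one hand

    def posture_label(hands: List[Hand]) -> str:
        """Map the number of opened fingers to a coarse posture label
        (placeholder labels, not MPEG-U part 2 pattern names)."""
        counts = [sum(f.extended for f in h.fingers) for h in hands]
        if len(counts) == 1:
            return {0: "fist", 5: "open-palm"}.get(counts[0], f"{counts[0]}-fingers")
        return f"two-hands:{counts[0]}+{counts[1]}"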

MPEG-U part 2 based Advanced User Interaction Interface System (MPEG-U part 2 기반 향상된 사용자 상호작용 인터페이스 시스템)

  • Han, Gukhee;Baek, A-Ram;Choi, Haechul
    • The Journal of the Korea Contents Association
    • /
    • v.12 no.12
    • /
    • pp.54-62
    • /
    • 2012
  • The AUI (Advanced User Interaction) interface aims to enhance interaction between various input/output devices and scene descriptions represented by video, audio, and graphics. Recently, MPEG-U part 2 standardization for the AUI interface has been under development by MPEG (Moving Picture Experts Group). This paper introduces the MPEG-U part 2 standard and presents an MPEG-U part 2 AUI interface system. The AUI interface system consists of user interface input/output modules and MPEG-U XML generation/interpretation modules, which handle UID data and XML data processing, respectively. The system can be used as a framework for MPEG-U standards-based input/output devices and for improving interaction with the user. Through an implementation of the proposed AUI interface system, an MPEG-U usage scenario is introduced, and it is verified that the AUI interface system conforms to the MPEG-U standard.
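
The XML generation/interpretation pair mentioned above can be sketched as a simple round trip (the element and attribute names below are illustrative placeholders, not the actual MPEG-U part 2 schema):

    import xml.etree.ElementTree as ET

    def generate_aui_xml(pattern: str, hand: str) -> str:
        """Serialize a recognized posture as XML (illustrative element names)."""
        root = ET.Element("AUIPattern", attrib={"type": pattern, "hand": hand})
        return ET.tostring(root, encoding="unicode")

    def interpret_aui_xml(xml_text: str) -> dict:
        """Parse the XML back into a plain dictionary for the scene/application side."""
        elem = ET.fromstring(xml_text)
        return {"type": elem.get("type"), "hand": elem.get("hand")}

    msg = generate_aui_xml("open-palm", "right")
    print(interpret_aui_xml(msg))   # {'type': 'open-palm', 'hand': 'right'}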

A Portable Mediate Interface 'Handybot' for the Rich Human-Robot Interaction (인간과 로봇의 다양한 상호작용을 위한 휴대 매개인터페이스 ‘핸디밧’)

  • Hwang, Jung-Hoon;Kwon, Dong-Soo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.8
    • /
    • pp.735-742
    • /
    • 2007
  • The importance of a robot's interaction capability increases as robot applications are extended into humans' daily lives. In this paper, a portable mediate interface, the Handybot, is developed with various interaction channels to be used with an intelligent home service robot. The Handybot has a task-oriented channel based on an icon language as well as a verbal interface. It also has an emotional interaction channel that recognizes a user's emotional state from facial expression and speech, transmits that state to the robot, and expresses the robot's emotional state to the user. It is expected that the Handybot will reduce spatial problems that may exist in human-robot interactions, offer a new interaction method, and help create rich and continuous interactions between human users and robots.
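
A minimal sketch of the emotional-channel idea described above (the label set and the fixed-weight fusion rule are assumptions for illustration, not the Handybot's actual method): per-emotion scores from facial expression and speech could be combined into one state that is then transmitted to the robot.

    from typing import Dict

    EMOTIONS = ("happy", "sad", "angry", "neutral")   # illustrative label set

    def fuse_emotion(face: Dict[str, float], speech: Dict[str, float],
                     w_face: float = 0.6) -> str:
        """Combine per-emotion scores from the two channels with a fixed weight
        (placeholder fusion rule) and return the dominant emotional state."""
        scores = {e: w_face * face.get(e, 0.0) + (1 - w_face) * speech.get(e, 0.0)
                  for e in EMOTIONS}
        return max(scores, key=scores.get)

    state = fuse_emotion({"happy": 0.7, "neutral": 0.3}, {"happy": 0.4, "sad": 0.6})
    print(state)   # the fused emotional state to transmit to the robot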

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan;Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology
    • /
    • v.18 no.4
    • /
    • pp.116-119
    • /
    • 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person shooting (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS games based on the user's gesture recognition and VR controllers. We focus on the conventional interface for VR FPS interaction and design the player interaction around a head-mounted display with two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy immersive FPS play and gives them a new gaming experience.
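
One way such a scheme could tie a recognized hand gesture and a tracked joint pose to game commands is sketched below (the gesture names, command mapping, and data structures are hypothetical; the abstract does not specify the paper's actual mapping between Leap Motion gestures, VIVE tracker data, and FPS actions):

    from dataclasses import dataclass

    @dataclass
    class TrackerPose:
        x: float   # tracked joint position reported by the motion controller
        y: float
        z: float

    def handle_frame(gesture: str, pose: TrackerPose) -> dict:
        """Map a recognized hand gesture plus a tracked joint pose to a game
        command for the FPS player (placeholder mapping)."""
        commands = {"pinch": "fire", "open-palm": "reload", "fist": "aim"}
        return {"action": commands.get(gesture, "idle"),
                "player_joint": (pose.x, pose.y, pose.z)}

    print(handle_frame("pinch", TrackerPose(0.1, 1.2, -0.4)))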

A Comparative Study for User Interface Design between TV and Mobile Phone (TV와 휴대폰의 사용자 인터페이스 디자인 비교 연구)

  • Pan, Young-Hwan
    • Journal of the Ergonomics Society of Korea
    • /
    • v.27 no.1
    • /
    • pp.29-35
    • /
    • 2008
  • An estimated 1 billion mobile phones were sold globally in 2006. In Korea, people watch television for 3.17 hours a day. Television is not what it used to be: digital TV provides both interactivity and high definition, and mobile phones have also moved from 2G to 3G or 3.5G. This means the complexity of TVs and mobile phones has increased, making user interface design more difficult. Unlike the personal computer industry, the TV and mobile phone industries have no standard user interface. A comparative study of the user interfaces of TV and mobile phone is conducted. Users, tasks, and systems are analyzed in the requirement analysis, and the user interface models and interactions of TV and mobile phone are also analyzed. This study provides several insights for user interface design. First, UI designers have to consider other products, because a user of one product often uses other products at the same time, and the experience of one product affects that of another. Second, TV and mobile phone show very similar patterns, especially in interaction tasks and input interaction. Third, the experience is sometimes not optimized between the service operator and the device manufacturer, so cooperative design between them is required.