• Title/Abstract/Keyword: Interaction interface

Search results: 1,286 items (processing time: 0.264 s)

A Study on Developmental Direction of Interface Design for Gesture Recognition Technology

  • Lee, Dong-Min; Lee, Jeong-Ju
    • 대한인간공학회지 / Vol. 31, No. 4 / pp.499-505 / 2012
  • Objective: To study the transformation of interaction between mobile devices and users through an analysis of current trends in gesture interface technology development. Background: For smooth interaction between machines and users, interface technology has evolved from the "command line" to the "mouse", and now "touch" and "gesture recognition" are being researched and used. In the future, the technology is expected to evolve into "multi-modal" interfaces, which fuse the visual and auditory senses, and "3D multi-modal" interfaces, which use three-dimensional virtual worlds and brain waves. Method: Within the development of computer interfaces, which follows the evolution of mobile devices, the trends and development of actively researched gesture interfaces and related technologies are studied comprehensively. Based on the techniques used to gather gesture information, gesture interfaces are divided into four categories: sensor, touch, visual, and multi-modal. Each category is examined through its technology trends and existing real-world examples. Through these methods, the transformation of interaction between mobile devices and humans is studied. Conclusion: Gesture-based interface technology realizes intelligent communication in the interaction relationship between existing static machines and users. Thus, it is an important element technology that will make the interaction between humans and machines more dynamic. Application: The results of this study may help in developing the gesture interface designs currently in use.
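A minimal Python sketch, not taken from the paper, of the four-category view described above: sensor, touch, and visual channels each emit gesture events, and a "multi-modal" layer fuses them. The class names, labels, and fusion rule are illustrative assumptions only.

```python
from enum import Enum, auto
from dataclasses import dataclass

class GestureChannel(Enum):
    SENSOR = auto()   # inertial / proximity sensors
    TOUCH = auto()    # touch-screen gestures
    VISUAL = auto()   # camera-based recognition

@dataclass
class GestureEvent:
    channel: GestureChannel
    label: str        # e.g. "swipe_left", "fist" (hypothetical labels)
    confidence: float

def fuse_multimodal(events):
    """Toy multi-modal fusion: prefer the label supported by the most
    channels, breaking ties by the highest single-channel confidence."""
    by_label = {}
    for e in events:
        by_label.setdefault(e.label, []).append(e)
    ranked = sorted(
        by_label.items(),
        key=lambda kv: (len({e.channel for e in kv[1]}),
                        max(e.confidence for e in kv[1])),
        reverse=True,
    )
    return ranked[0][0] if ranked else None

if __name__ == "__main__":
    print(fuse_multimodal([
        GestureEvent(GestureChannel.VISUAL, "swipe_left", 0.7),
        GestureEvent(GestureChannel.SENSOR, "swipe_left", 0.6),
        GestureEvent(GestureChannel.TOUCH, "tap", 0.9),
    ]))  # -> "swipe_left"
```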

Interaction and mechanical effect of materials interface of contact zone composite samples: Uniaxial compression experimental and numerical studies

  • Wang, Weiqi; Ye, Yicheng; Wang, Qihu; Luo, Binyu; Wang, Jie; Liu, Yang
    • Geomechanics and Engineering / Vol. 21, No. 6 / pp.571-582 / 2020
  • Aiming at the mechanical and structural characteristics of contact zone composite rock, uniaxial compression tests and numerical studies were carried out. The interaction forms and formation mechanisms at the contact interfaces of different materials were analyzed to reveal the effect of the interaction on the mechanical behavior of composite samples. The research demonstrated that there are three types of interactions between the two materials at the contact interface: constraint parallel to the interface, squeezing perpendicular to the interface, and shear stress on the interface. The interaction is mainly affected by the differences in Poisson's ratio and elastic modulus between the two materials, by stronger interface adhesion, and by a larger interface inclination. The interaction weakens the strength and stiffness of the composite sample, and the magnitude of the weakening is positively correlated with the degree of difference in the mechanical properties of the materials. The tensile-shear stress derived from the interaction results in axial tensile fractures perpendicular to the interface and in interfacial shear fracture. Tensile cracks in the stronger material propagate into the weaker material through the bonded interface. A larger interface inclination angle enhances the effect of combined tensile/shear failure on the overall sample.
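A back-of-the-envelope sketch, not from the paper, of why a mismatch in Poisson's ratio and elastic modulus drives the interface interaction: under a common axial stress each material would, if unbonded, expand laterally by a different amount, and the bonded interface must carry constraint and shear stresses to close that gap.

```latex
% Free lateral strain of material i under axial stress \sigma_a
\varepsilon_{\mathrm{lat}}^{(i)} = -\,\nu_i\,\frac{\sigma_a}{E_i}, \qquad i = 1, 2
% Mismatch that the bonded interface must accommodate
\Delta\varepsilon_{\mathrm{lat}} = \sigma_a\left(\frac{\nu_1}{E_1} - \frac{\nu_2}{E_2}\right)
% If \Delta\varepsilon_{\mathrm{lat}} \neq 0, constraint (parallel), squeezing
% (normal), and shear stresses arise at the interface; they vanish when
% \nu_1/E_1 = \nu_2/E_2.
```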

MPEG-U-based Advanced User Interaction Interface Using Hand Posture Recognition

  • Han, Gukhee; Choi, Haechul
    • IEIE Transactions on Smart Processing and Computing / Vol. 5, No. 4 / pp.267-273 / 2016
  • Hand posture recognition is an important technique for enabling a natural and familiar interface in the human-computer interaction (HCI) field. This paper introduces a hand posture recognition method using a depth camera. Moreover, the method is incorporated with the Moving Picture Experts Group Rich Media User Interface (MPEG-U) Advanced User Interaction (AUI) Interface (MPEG-U part 2), which can provide a natural interface on a variety of devices. The proposed method initially detects the positions and lengths of all open fingers, and then recognizes the hand posture from the pose of one or two hands and the number of folded fingers when a user presents a gesture representing a pattern in the AUI data format specified in MPEG-U part 2. The AUI interface represents the user's hand posture in the compliant MPEG-U schema structure. Experimental results demonstrate the performance of the hand posture recognition system and verify that the AUI interface is compatible with the MPEG-U standard.
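A simplified, hypothetical sketch of the kind of rule described above: counting open fingers from depth-camera keypoints and mapping the count to a posture label. The keypoint format, threshold, and label names are assumptions for illustration, not the paper's actual algorithm or the MPEG-U pattern vocabulary.

```python
import math

def count_open_fingers(palm, fingertips, open_threshold_mm=70.0):
    """Count fingers whose tip lies far enough from the palm center.
    palm and each fingertip are (x, y, z) points in millimetres."""
    return sum(1 for tip in fingertips if math.dist(palm, tip) > open_threshold_mm)

# Toy mapping from (hands shown, open-finger count) to a posture label.
POSTURES = {
    (1, 0): "fist",
    (1, 2): "victory",
    (1, 5): "open-palm",
    (2, 10): "both-open",
}

def recognize_posture(hands):
    """hands: list of (palm, fingertips) tuples, one per detected hand."""
    total_open = sum(count_open_fingers(p, f) for p, f in hands)
    return POSTURES.get((len(hands), total_open), "unknown")

if __name__ == "__main__":
    palm = (0.0, 0.0, 0.0)
    tips = [(80.0, 0, 0), (85.0, 10, 0), (20.0, 5, 0), (15.0, -5, 0), (10.0, 0, 0)]
    print(recognize_posture([(palm, tips)]))  # two open fingers -> "victory"
```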

MPEG-U part 2 based Advanced User Interaction Interface System

  • 한국희; 백아람; 최해철
    • 한국콘텐츠학회논문지 / Vol. 12, No. 12 / pp.54-62 / 2012
  • The purpose of the Advanced User Interaction (AUI) interface is to improve the interoperation of information between various input/output devices and the scene description, which is expressed as objects such as video, audio, and graphics. To this end, MPEG (Moving Picture Experts Group), an international standardization organization, is standardizing the AUI interface data format through the MPEG-U part 2: AUI Interface project. This paper introduces the MPEG-U part 2 standard and proposes an AUI interface system based on it. The proposed AUI interface system largely consists of a user interface input/output unit that handles User Interaction Device (UID) data and an MPEG-U XML generation/parsing unit that handles XML documents. The system can be used as a framework for improving interaction between users and MPEG-U standard-based input/output devices. In the experiments, we show that the proposed user interaction interface system conforms to the MPEG-U part 2 standard and thereby verify the validity of the standard.
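A minimal Python sketch of the XML generation/parsing role described above. The element and attribute names here are illustrative placeholders; the real MPEG-U part 2 schema defines its own namespaces and pattern types.

```python
import xml.etree.ElementTree as ET

def build_aui_message(pattern_type: str, value: str) -> str:
    """Serialize one user-interaction event as a small XML document
    (placeholder element names, not the actual MPEG-U schema)."""
    root = ET.Element("AUIMessage")
    pattern = ET.SubElement(root, "Pattern", {"type": pattern_type})
    ET.SubElement(pattern, "Value").text = value
    return ET.tostring(root, encoding="unicode")

def parse_aui_message(xml_text: str):
    """Recover (pattern type, value) from the generated XML."""
    root = ET.fromstring(xml_text)
    pattern = root.find("Pattern")
    return pattern.get("type"), pattern.findtext("Value")

if __name__ == "__main__":
    msg = build_aui_message("HandPosture", "open-palm")
    print(msg)
    print(parse_aui_message(msg))  # ('HandPosture', 'open-palm')
```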

A Portable Mediate Interface 'Handybot' for the Rich Human-Robot Interaction

  • 황정훈; 권동수
    • 제어로봇시스템학회논문지 / Vol. 13, No. 8 / pp.735-742 / 2007
  • The importance of a robot's interaction capability increases as robot applications extend into humans' daily lives. In this paper, a portable mediating interface, the Handybot, is developed with various interaction channels to be used with an intelligent home service robot. The Handybot has a task-oriented channel based on an icon language as well as a verbal interface. It also has an emotional interaction channel that recognizes a user's emotional state from facial expression and speech, transmits that state to the robot, and expresses the robot's emotional state to the user. It is expected that the Handybot will reduce the spatial problems that may exist in human-robot interaction, offer a new interaction method, and help create rich and continuous interactions between human users and robots.
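A structural sketch, not the authors' implementation, of the multi-channel mediator idea described above (icon-language task channel, verbal channel, and a bidirectional emotional channel). Class and method names are invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class EmotionalState:
    label: str        # e.g. "happy", "neutral", "sad"
    intensity: float  # 0.0 .. 1.0

@dataclass
class MediatorInterface:
    """Handheld mediator relaying several interaction channels between
    a user and a remote service robot (illustrative only)."""
    outbox: list = field(default_factory=list)

    def send_icon_command(self, icons):
        # Task-oriented channel: a sequence of icons forms one command.
        self.outbox.append({"channel": "icon", "command": list(icons)})

    def send_utterance(self, text):
        # Verbal channel: forward recognized speech to the robot.
        self.outbox.append({"channel": "speech", "text": text})

    def send_user_emotion(self, state: EmotionalState):
        # Emotional channel: user state estimated from face and voice.
        self.outbox.append({"channel": "emotion",
                            "label": state.label,
                            "intensity": state.intensity})

    def show_robot_emotion(self, state: EmotionalState):
        # Express the robot's state back to the user on the handheld device.
        print(f"[robot feels {state.label} ({state.intensity:.1f})]")

if __name__ == "__main__":
    m = MediatorInterface()
    m.send_icon_command(["fetch", "cup", "kitchen"])
    m.send_user_emotion(EmotionalState("happy", 0.8))
    m.show_robot_emotion(EmotionalState("curious", 0.5))
    print(m.outbox)
```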

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game

  • 이용환; 안효창
    • 반도체디스플레이기술학회지 / Vol. 18, No. 4 / pp.116-119 / 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person-shooter (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS games based on the user's gesture recognition and a VR controller. We examine the conventional interface for VR FPS interaction and design player interaction around a head-mounted display with two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE Tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy an immersive FPS and gives them a new gaming experience.
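A toy game-loop sketch, not the authors' system, showing one way per-frame hand-gesture events and tracker poses could be combined: the tracker pose drives body movement while gesture labels drive weapon actions. The device inputs are mocked, since the real Leap Motion / VIVE SDK calls are not given in the abstract.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float
    z: float

@dataclass
class PlayerState:
    position: Pose
    aiming: bool = False
    firing: bool = False

def update_player(state: PlayerState, gesture: str, tracker_pose: Pose) -> PlayerState:
    """One frame of input handling: the tracker moves the player,
    hand gestures trigger aiming/firing (hypothetical gesture labels)."""
    state.position = tracker_pose
    state.aiming = gesture in ("point", "grip")
    state.firing = gesture == "trigger-pinch"
    return state

if __name__ == "__main__":
    s = PlayerState(Pose(0.0, 0.0, 0.0))
    frames = [("point", Pose(0.1, 0.0, 0.2)),          # simulated frames:
              ("trigger-pinch", Pose(0.1, 0.0, 0.25))]  # (gesture, tracker pose)
    for g, p in frames:
        s = update_player(s, g, p)
        print(s)
```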

A Comparative Study for User Interface Design between TV and Mobile Phone

  • 반영환
    • 대한인간공학회지 / Vol. 27, No. 1 / pp.29-35 / 2008
  • An estimated 1 billion mobile phones were sold globally in 2006. In Korea, people watch television for 3.17 hours a day. Television is no longer what it used to be: digital TV provides both interactivity and high definition, and mobile phones have likewise moved from 2G to 3G or 3.5G. This means that the complexity of TVs and mobile phones has increased, making user interface design more difficult. Unlike the personal computer industry, the TV and mobile phone industries have no standard user interface. This paper presents a comparative study of the user interfaces of TVs and mobile phones. Users, tasks, and systems are analyzed in the requirement analysis, and the user interface models and interactions of the two devices are compared. The study provides several insights for user interface design. First, UI designers have to consider other products, because a user of one product is often using other products at the same time, and experience with one product affects experience with another. Second, TVs and mobile phones show very similar patterns, especially in interaction tasks and input interaction. Third, the experience is sometimes not optimized between the service operator and the device manufacturer, so cooperative design between them is required.