• Title/Summary/Keyword: Gesture Design

Usability Test Guidelines for Speech-Oriented Multimodal User Interface (음성기반 멀티모달 사용자 인터페이스의 사용성 평가 방법론)

  • Hong, Ki-Hyung
    • MALSORI / no.67 / pp.103-120 / 2008
  • The basic components of a multimodal interface, such as speech recognition, speech synthesis, gesture recognition, and multimodal fusion, each have technological limitations. For example, the accuracy of speech recognition decreases for large vocabularies and in noisy environments. Despite those limitations, there are many applications in which speech-oriented multimodal user interfaces are very helpful to users. However, to expand the application areas of speech-oriented multimodal interfaces, the interfaces must be developed with a focus on usability. In this paper, we introduce usability and user-centered design methodology in general. There has been much work on evaluating spoken dialogue systems; we summarize PARADISE (PARAdigm for DIalogue System Evaluation) and PROMISE (PROcedure for Multimodal Interactive System Evaluation), two generalized evaluation frameworks for voice and multimodal user interfaces. We then present usability components for speech-oriented multimodal user interfaces and usability testing guidelines that can be used in a user-centered multimodal interface design process.

Guidelines for Satisfactory Flick Performances in Touch Screen Mobile Phone (풀터치 휴대폰의 플릭(Flick) 성능에 대한 평가 및 가이드라인)

  • Kim, Huhn
    • Journal of the Ergonomics Society of Korea / v.29 no.4 / pp.541-546 / 2010
  • The 'flick' gesture is the most fundamental part of efficient interaction on the touch screens now being applied extensively to mobile phones. This study investigated users' satisfaction with the flick operation in representative touch phones and measured flick performance with three established measures: the gap between the finger and the initial cursor position, the number of lists moved per 0.2 seconds, and the number of lists moved after ten continuous flicks. The measurements were performed with a high-speed camera and motion analysis software. In phones with high user satisfaction, the flick movement showed a smaller gap between finger and cursor positions, and the scrolling speed peaked quickly, within 0.6 seconds, before slowing down drastically. In addition, the maximal and typical time intervals between continuous flicks were measured experimentally. Based on these evaluations and measurements, several design guidelines for efficient flick performance are suggested.
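The three flick measures above can be computed from sampled tracking data. The sketch below is a hypothetical illustration, assuming finger, cursor, and list-index positions sampled at a fixed interval by motion analysis software; the function name and sampling parameters are not from the paper.

```python
# Hypothetical sketch of two of the flick measures described above,
# assuming positions are sampled every dt seconds by motion analysis.

def flick_metrics(finger_y, cursor_y, list_index, dt=0.01, window=0.2):
    """Return (finger-cursor gap at flick start, lists moved per window)."""
    gap = abs(finger_y[0] - cursor_y[0])   # gap between finger and initial cursor
    step = int(window / dt)                # samples per 0.2-second window
    lists_per_window = [
        abs(list_index[min(i + step, len(list_index) - 1)] - list_index[i])
        for i in range(0, len(list_index), step)
    ]
    return gap, lists_per_window
```

The third measure, lists moved after ten continuous flicks, would simply difference the list index before the first flick and after the tenth.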

Design and Implementation of Immersive Media System Based on Dynamic Projection Mapping and Gesture Recognition (동적 프로젝션 맵핑과 제스처 인식 기반의 실감 미디어 시스템 설계 및 구현)

  • Kim, Sang Joon;Koh, You Jon;Choi, Yoo-Joo
    • KIPS Transactions on Software and Data Engineering / v.9 no.3 / pp.109-122 / 2020
  • Recently, projection mapping has attracted high attention in the field of realistic media and is regarded as a technology that increases users' immersion. However, most existing methods perform projection mapping onto static objects. In this paper, we developed a technology that tracks users' movements and dynamically maps media content onto their bodies. The projected media content is created through predefined gestures using only the user's bare hands, without special devices. An interactive immersive media system was implemented by integrating these dynamic projection mapping and gesture-based drawing technologies. The proposed system recognizes the movements and open/closed states of the user's hands and selects the functions needed to draw a picture. Users can freely draw, changing the brush color by sampling the colors of real objects. In addition, the drawing is dynamically projected onto the user's body, allowing the user to design and wear a T-shirt in real time.

A Study on the Implementation of Physical Interfaces (피지컬 인터페이스의 구현에 관한 연구)

  • 오병근
    • Archives of design research / v.16 no.2 / pp.131-140 / 2003
  • Input for computer interaction design is very limited when users can control the interface only with a keyboard and mouse. However, with basic electrical engineering, input design can depart from the existing method. Interactive art using computer technology, completed by people's participation, has recently emerged. The electric signals transmitted in digital and analog form from a people-controlled interface to the computer can be used in multimedia interaction design. Circuit design is necessary to transmit stable electric signals from the interface. Electric switches, sensors, and camera technologies can be applied to input interface design as alternative physical interaction without a computer keyboard and mouse. This kind of interaction design, using human body language and gesture, would convey the richness of humanity.

Design and Control of Wire-driven Flexible Robot Following Human Arm Gestures (팔 동작 움직임을 모사하는 와이어 구동 유연 로봇의 설계 및 제어)

  • Kim, Sanghyun;Kim, Minhyo;Kang, Junki;Son, SeungJe;Kim, Dong Hwan
    • The Journal of Korea Robotics Society / v.14 no.1 / pp.50-57 / 2019
  • This work presents a design and control method for a flexible robot arm operated by a wire drive that follows human arm gestures. To move the robot arm to a desired position, the required wire travel length is calculated and the motors are rotated according to that length. The robotic arm is composed of two modules that imitate real human motion. Two wires form a closed loop in each module, and universal joints attached to each disk create up, down, left, and right movements. To control the motors, an anti-windup PID was applied to limit the sudden changes usually caused by accumulated error in the integral control term. In addition, a master/slave communication protocol and an operation program linking the six motors to MYO and IMU sensor outputs were developed. This makes it possible to receive image information from the camera attached to the robot arm while simultaneously sending control commands to the robot at high speed.
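The anti-windup PID mentioned above can be illustrated by clamping the integral term, one common anti-windup scheme. This is a minimal sketch under that assumption; the gains, clamp limit, and class name are illustrative, not the paper's implementation.

```python
# Minimal anti-windup PID sketch: the integral term is clamped so
# accumulated error cannot wind up during sudden setpoint changes.
# Gains and the clamp limit are illustrative assumptions.

class AntiWindupPID:
    def __init__(self, kp, ki, kd, i_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.i_limit = i_limit      # bound on the accumulated integral
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        # anti-windup: clamp the integral term to [-i_limit, i_limit]
        self.integral = max(-self.i_limit, min(self.i_limit, self.integral))
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

With the clamp in place, a sustained large error saturates the integral contribution instead of growing it without bound.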

Interaction between BIM Model and Physical Model During Conceptual Design Stage (설계 초기 단계에서 BIM 모델과 물리적 모델의 상호작용 방안)

  • Yi, Ingeun;Kim, Sung-Ah
    • The Journal of Korean Institute of Communications and Information Sciences / v.38C no.4 / pp.344-350 / 2013
  • It is essential to consider geometry in the early design stage for rational design decisions. However, crucial decisions have tended to be made through conversation, physical models, and gestures rather than a BIM model, which can analyze geometry efficiently. This research proposes a framework for interaction between a BIM model and a physical model for real-time BIM analysis. Through this real-time framework linking the two models, architects can adopt BIM data at the early design stage and review analyses of the BIM model. This should facilitate dynamic design based on rich BIM information from the early stage through the final stage.

A Method of Immersive Media Display using Multi-Gesture Recognition (멀티 제스처 인식을 이용한 실감 미디어 표출 기법)

  • Yang, Ji-hee;Kim, Young-ae;Park, Goo-man
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2016.06a / pp.19-22 / 2016
  • Recently, various immersive media technologies and display methods have been proposed. In personal broadcasting in particular, it is necessary to provide convenience to producers and users and to maximize viewers' sense of immersion. Accordingly, this paper defines immersive media requiring multi-gestures in terms of multi-display, viewpoint selection, and 3D object reconstruction. It also analyzes the kinds of gestures needed for the convenience of users and producers, and builds a system that recognizes various user gestures and displays immersive content.

Scenario and Content Design System for Immersive Stage Direction (몰입형 무대 연출을 위한 시나리오 및 콘텐츠 설계 시스템)

  • Wen, Mingyun;Xi, Yulong;Kook, Yoonchang;Hong, Tony;Kim, Junoh;Cho, Kyungeun
    • Proceedings of the Korea Information Processing Society Conference / 2017.04a / pp.1079-1080 / 2017
  • Today, multimedia technologies play an increasingly important role in games, movies, and live performances. In this paper, we design a flexible interactive system that integrates gesture recognition, skeleton tracking, Internet communication, and content editing using multiple sensors to direct and control a stage performance. In this system, the performer can control the elements shown on stage through corresponding gestures and body movements during the performance. The system also provides an easy way for users to change the content of the performance if they intend to.

Cognitive and Emotional Structure of a Robotic Game Player in Turn-based Interaction

  • Yang, Jeong-Yean
    • International journal of advanced smart convergence / v.4 no.2 / pp.154-162 / 2015
  • This paper focuses on how cognitive and emotional structures affect humans during long-term interaction. We design an interaction around a turn-based game, the Chopstick Game, in which two agents play with numbers using their fingers. As the human user and the robot agent alternate turns, the user applies herself to playing the game and learning new winning skills from the robot agent. The conventional valence-arousal space is applied to design the emotional interaction. For the robotic system, we implement finger gesture recognition and emotional behaviors designed for a three-dimensional virtual robot. Experimental tests verify the soundness of the proposed schemes, and the effect of the emotional interaction is discussed.
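The Chopstick Game referenced above is commonly played with each hand holding a finger count and one hand "attacking" another. The sketch below encodes one widely used rule variant, in which a hand reaching five or more fingers is out; the paper's exact rule set may differ, so this is illustrative only.

```python
# Hedged sketch of one attack turn in a common Chopstick Game variant:
# the defender's hand gains the attacker's count, and a hand reaching
# five or more fingers is out (count 0). Rules are an assumption.

def attack(attacker_hand, defender_hand):
    """Return the defender's new finger count; 0 means the hand is out."""
    total = attacker_hand + defender_hand
    return 0 if total >= 5 else total

def has_lost(hands):
    """A player loses when every hand is out."""
    return all(h == 0 for h in hands)
```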

A 3D Parametric CAD System for Smart Devices (스마트 디바이스를 위한 3D 파라메트릭 CAD 시스템)

  • Kang, Yuna;Han, Soonhung
    • Korean Journal of Computational Design and Engineering / v.19 no.2 / pp.191-201 / 2014
  • A 3D CAD system that can be used on a smart device is proposed. Smart devices are now part of everyday life and are widely used across industry domains. The proposed 3D CAD system allows rapid, intuitive modeling on a smart device when an engineer builds a 3D model of a product while moving around an engineering site. Several obstacles stand in the way of developing such a system: the low computing power and small screens of smart devices, imprecise touch inputs, and problems transferring created 3D models between PCs and smart devices. The author discusses the system design of a 3D CAD system on a smart device: the selection of modeling operations, the assignment of touch gestures to these operations, and the construction of a geometric kernel that creates both meshes and a procedural CAD model. The proposed CAD system was implemented and validated through user tests and case studies with test examples. With the proposed system, an engineer can produce an editable 3D model quickly and simply in a smart device environment, reducing design time.
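The assignment of touch gestures to modeling operations described above can be pictured as a simple dispatch table over a reduced operation set. The gesture and operation names below are illustrative assumptions, not the paper's actual mapping.

```python
# Illustrative sketch of assigning touch gestures to a reduced set of
# modeling operations for a small touch screen. The names are assumptions.

GESTURE_TO_OPERATION = {
    "tap": "select_face",
    "double_tap": "sketch_on_face",
    "drag": "extrude",              # e.g. drag distance sets extrusion depth
    "pinch": "zoom",
    "two_finger_drag": "rotate_view",
}

def dispatch(gesture):
    """Map a recognized gesture to a modeling operation, or do nothing."""
    return GESTURE_TO_OPERATION.get(gesture, "no_op")
```

Keeping the table small is one way to cope with imprecise touch input: fewer gestures means each one can be made more distinct and easier to recognize reliably.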