• Title/Summary/Keyword: gesture proposal (제스처 제안)

Search results: 265

Development for Multi-modal Realistic Experience I/O Interaction System (멀티모달 실감 경험 I/O 인터랙션 시스템 개발)

  • Park, Jae-Un; Whang, Min-Cheol; Lee, Jung-Nyun; Heo, Hwan; Jeong, Yong-Mu
    • Science of Emotion and Sensibility / v.14 no.4 / pp.627-636 / 2011
  • The purpose of this study is to develop a multi-modal interaction system that provides a realistic and immersive experience. By recognizing user behavior, intention, and attention, the system overcomes the limitations of uni-modal interaction. It is built on gesture interaction methods, intuitive gesture interaction, and attention evaluation technology. The gesture interaction methods were based on sensors selected by analyzing the accuracy of 3-D gesture recognition technologies through meta-analysis; the elements of intuitive gesture interaction were derived from experimental results, and the attention evaluation technology was developed through physiological signal analysis. The system is divided into three modules: a motion cognitive system, an eye gaze detection system, and a bio-reaction sensing system. The motion cognitive system uses an accelerometer and flex sensors to recognize the user's hand and finger movements. The eye gaze detection system detects pupil movements and reactions. The bio-reaction sensing (attention evaluation) system tracks cardiovascular and skin-temperature reactions. This study is expected to contribute to the development of realistic digital entertainment technology.

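A minimal sketch of how readings from the three sensing modules described above (motion, eye gaze, and bio-reaction) might be fused into a single interaction event is shown below. The data fields, thresholds, and weights are illustrative assumptions made for this listing, not the model used in the paper.

```python
from dataclasses import dataclass

@dataclass
class MultiModalFrame:
    """One synchronized reading from the three sensing modules."""
    hand_accel: tuple[float, float, float]  # hand accelerometer reading (g)
    finger_flex: list[float]                # flex-sensor bend values, 0..1 per finger
    gaze_xy: tuple[float, float]            # normalized gaze position on screen
    heart_rate: float                       # beats per minute
    skin_temp: float                        # degrees Celsius

def estimate_attention(frame: MultiModalFrame,
                       resting_hr: float = 70.0,
                       resting_temp: float = 33.0) -> float:
    """Toy attention score in [0, 1] from cardiovascular and skin-temperature reactions.
    The weights and baselines are placeholders, not the paper's physiological model."""
    hr_arousal = min(max((frame.heart_rate - resting_hr) / 30.0, 0.0), 1.0)
    temp_arousal = min(max((frame.skin_temp - resting_temp) / 2.0, 0.0), 1.0)
    return 0.6 * hr_arousal + 0.4 * temp_arousal

def classify_gesture(frame: MultiModalFrame) -> str:
    """Very rough hand/finger gesture label from the motion module."""
    if sum(frame.finger_flex) / len(frame.finger_flex) > 0.7:
        return "grab"
    if abs(frame.hand_accel[0]) > 1.5:
        return "swipe"
    return "idle"

def fuse(frame: MultiModalFrame) -> dict:
    """Combine gesture, gaze, and attention into one multi-modal interaction event."""
    return {
        "gesture": classify_gesture(frame),
        "gaze_target": frame.gaze_xy,
        "attention": estimate_attention(frame),
    }
```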

Collaborative 3D Design Workspace for Geographically Distributed Designers - With the Emphasis on Augmented Reality Based Interaction Techniques Supporting Shared Manipulation and Telepresence - (지리적으로 분산된 디자이너들을 위한 3D 디자인 협업 환경 - 공유 조작과 원격 실재감을 지원하는 증강현실 기반 인터랙션 기법을 중심으로 -)

  • SaKong Kyung; Nam Tek-Jin
    • Archives of Design Research / v.19 no.4 s.66 / pp.71-80 / 2006
  • Collaboration has become essential in the product design process due to internationalized and specialized business environments. This study presents a real-time collaborative 3D design workspace for distributed designers, focusing on the development and evaluation of new interaction techniques that support nonverbal communication such as awareness of participants, shared manipulation, and telepresence. Requirements were identified in terms of shared objects, shared workspaces, and awareness through literature reviews and an observational study. An Augmented Reality based collaborative design workspace was developed, in which two main interaction techniques, Turn-table and Virtual Shadow, were incorporated to support shared manipulation and telepresence. Turn-table provides intuitive shared manipulation of 3D models and physical cues for awareness of remote participants. Virtual Shadow supports natural and continuous awareness of partners' locations, gestures, and pointing. A lab-based evaluation showed that the interaction techniques effectively supported awareness of general pointing and facilitated discussion in 3D model reviews. The workspace and the interaction techniques can facilitate more natural communication and increase the efficiency of collaboration on virtual 3D models between distributed participants (designer-designer, engineer, or modeler) in collaborative design environments.

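The Turn-table and Virtual Shadow techniques essentially synchronize a shared model orientation and each partner's pointer across sites. The sketch below shows one plausible update message for such synchronization; the message format and field names are assumptions for illustration, not the paper's implementation.

```python
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class TurntableState:
    """Shared rotation of the turn-table (degrees about the vertical axis)."""
    angle_deg: float

@dataclass
class VirtualShadow:
    """A participant's hand/pointer position projected onto the shared workspace plane."""
    user_id: str
    x: float  # normalized workspace coordinates, 0..1
    y: float

def encode_update(turntable: TurntableState, shadow: VirtualShadow) -> bytes:
    """Serialize one local state update for sending to remote participants."""
    msg = {"t": time.time(), "turntable": asdict(turntable), "shadow": asdict(shadow)}
    return json.dumps(msg).encode("utf-8")

def apply_update(payload: bytes, turntable: TurntableState, shadows: dict) -> None:
    """Apply a received update: rotate the shared model and move the partner's shadow."""
    msg = json.loads(payload.decode("utf-8"))
    turntable.angle_deg = msg["turntable"]["angle_deg"]   # shared manipulation
    s = msg["shadow"]
    shadows[s["user_id"]] = (s["x"], s["y"])              # telepresence cue
```

Each site would send such an update every frame and apply whatever it receives, so both sites see the same model orientation plus the partner's shadow.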

A Study of a Virtual Reality Interface of Person Search in Multimedia Database for the US Defense Industry (미국 방위산업체 상황실의 인물검색 활동을 돕는 가상현실 공간 인터페이스 환경에 관한 연구)

  • Kim, Na-Young; Lee, Chong-Ho
    • Journal of Korea Game Society / v.11 no.5 / pp.67-78 / 2011
  • This paper introduces an efficient and satisfactory search interface that enables users to browse and find the video data they want from the massive video databases widely used in multimedia environments. The target user group is information analysts in the US defense industry or government intelligence agencies whose job is to identify a particular person in large amounts of CCTV (closed-circuit television) footage. For the first user test, we hypothesized that a CAVE-like virtual reality interface would be the most suitable for the designed tasks, so we compared it with a desktop interface; the software and database developed and optimized for each task were used in this test. For the second user test, we investigated which input devices would be most effective for the search task in the CAVE-like virtual reality system. In particular, we measured the effectiveness and user satisfaction of three types of gestural input devices that support ergonomic control of the interface. We also measured task completion time to identify the most efficient of the tested input devices.
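Since the study ranks input devices by task completion time, the comparison reduces to timing each trial per device and averaging. A small sketch of that bookkeeping, with hypothetical device names and no claim about the study's actual logging tools:

```python
import time
from collections import defaultdict
from statistics import mean

# completion times in seconds, keyed by (hypothetical) input-device name
times = defaultdict(list)

def run_trial(device_name: str, perform_task) -> float:
    """Time one person-search task performed with the given input device."""
    start = time.perf_counter()
    perform_task()                     # participant carries out the search task
    elapsed = time.perf_counter() - start
    times[device_name].append(elapsed)
    return elapsed

def summarize() -> dict:
    """Mean completion time per device; the lowest mean marks the most efficient device."""
    return {device: mean(ts) for device, ts in times.items() if ts}
```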

Experience Design Guideline for Smart Car Interface (스마트카의 인터페이스를 위한 경험 디자인 가이드라인)

  • Yoo, Hoon Sik; Ju, Da Young
    • Design Convergence Study / v.15 no.1 / pp.135-150 / 2016
  • Due to the development of communication technology and the expansion of the Intelligent Transport System (ITS), the car is changing from a simple mechanical device into a second living space with comprehensive convenience functions, and it is evolving into a platform whose interface supports this role. As the interface area that provides information to passengers expands, research on smart car based user experience is growing in importance. The objective of this study is to propose guidelines for smart car user experience elements. These elements were defined as function, interaction, and surface, and through discussions with UX/UI experts, 8 representative techniques for function, 14 for interaction, and 8 glass-window locations for surface were specified. Next, users' priorities for these experience elements were analyzed through a questionnaire survey of 100 drivers. The analysis showed that users' priorities in applying the main techniques were, in order, safety, distance, and sensibility. For interaction methods, the priorities were, in order, voice recognition, touch, gesture, physical button, and eye tracking. Regarding the glass window locations, users prioritized the front of the driver's seat over the back. A demographic analysis by gender showed no significant differences except for two functions, indicating that the guidelines can be applied to both male and female users. By analyzing user requirements for the individual elements, this study provides prioritized guidelines for applying each element to commercial products.

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun; Ahn, So-young; Shin, Dong-il; Shin, Dong-kyoo
    • Journal of Internet Computing and Services / v.16 no.6 / pp.11-21 / 2015
  • As growing interest in Human-Computer Interaction(HCI), research on HCI has been actively conducted. Also with that, research on Natural User Interface/Natural User eXperience(NUI/NUX) that uses user's gesture and voice has been actively conducted. In case of NUI/NUX, it needs recognition algorithm such as gesture recognition or voice recognition. However these recognition algorithms have weakness because their implementation is complex and a lot of time are needed in training because they have to go through steps including preprocessing, normalization, feature extraction. Recently, Kinect is launched by Microsoft as NUI/NUX development tool which attracts people's attention, and studies using Kinect has been conducted. The authors of this paper implemented hand-mouse interface with outstanding intuitiveness using the physical features of a user in a previous study. However, there are weaknesses such as unnatural movement of mouse and low accuracy of mouse functions. In this study, we designed and implemented a hand mouse interface which introduce a new concept called 'Virtual monitor' extracting user's physical features through Kinect in real-time. Virtual monitor means virtual space that can be controlled by hand mouse. It is possible that the coordinate on virtual monitor is accurately mapped onto the coordinate on real monitor. Hand-mouse interface based on virtual monitor concept maintains outstanding intuitiveness that is strength of the previous study and enhance accuracy of mouse functions. Further, we increased accuracy of the interface by recognizing user's unnecessary actions using his concentration indicator from his encephalogram(EEG) data. In order to evaluate intuitiveness and accuracy of the interface, we experimented it for 50 people from 10s to 50s. As the result of intuitiveness experiment, 84% of subjects learned how to use it within 1 minute. Also, as the result of accuracy experiment, accuracy of mouse functions (drag(80.4%), click(80%), double-click(76.7%)) is shown. The intuitiveness and accuracy of the proposed hand-mouse interface is checked through experiment, this is expected to be a good example of the interface for controlling the system by hand in the future.