• Title/Summary/Keyword: intuitive interaction


Hand Haptic Interface for Intuitive 3D Interaction (직관적인 3D 인터랙션을 위한 핸드 햅틱 인터페이스)

  • Jang, Yong-Seok;Kim, Yong-Wan;Son, Wook-Ho;Kim, Kyung-Hwan
    • Journal of the HCI Society of Korea / v.2 no.2 / pp.53-59 / 2007
  • Several studies in 3D interaction have identified and extensively studied the four basic interaction tasks for 3D/VE applications: navigation, selection, manipulation, and system control. These interaction schemes in the real world or a VE are generally suitable for interacting with small graspable objects. In some applications, it is important to duplicate real-world behavior; for example, a training system for manual assembly tasks or a usability verification system benefits from realistic object grasping and manipulation. However, these interaction technologies cannot be applied to such applications as-is, because the quality of simulated grasping and manipulation has been limited. We therefore introduce an intuitive and natural 3D haptic interaction interface that supports high-precision hand operations and realistic haptic feedback.

Towards Establishing a Touchless Gesture Dictionary based on User Participatory Design

  • Song, Hae-Won;Kim, Huhn
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.515-523 / 2012
  • Objective: The aim of this study is to investigate users' intuitive stereotypes about touchless gestures and to establish a gesture dictionary that can be applied to gesture-based interaction designs. Background: Recently, interaction based on touchless gestures has been emerging as an alternative for natural interaction between humans and systems. However, for touchless gestures to become a universal interaction method, studies of which gestures are intuitive and effective are a prerequisite. Method: As applicable domains for touchless gestures, four devices (TV, audio, computer, car navigation) and sixteen basic operations (power on/off, previous/next page, volume up/down, list up/down, zoom in/out, play, cancel, delete, search, mute, save) were drawn from both a focus group interview and a survey. Then a user participatory design was performed: participants designed three gestures suitable for each operation on each device and evaluated the intuitiveness, memorability, convenience, and satisfaction of their gestures. Through the participatory design, agreement scores, frequencies, and planning times of each distinct gesture were measured. Results: The derived gestures did not differ across the four devices; however, diverse yet commonly shared gestures were derived across the kinds of operations. In particular, manipulative gestures were suitable for all kinds of operations, whereas semantic or descriptive gestures suited one-shot operations such as power on/off, play, cancel, or search. Conclusion: The touchless gesture dictionary was established by mapping intuitive and valuable gestures onto each operation. Application: The dictionary can be applied to interaction designs based on touchless gestures and can serve as a basic reference for standardizing them.
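The agreement scores this abstract mentions are a standard metric in gesture-elicitation studies (in the style of Wobbrock et al.): for each operation, sum the squared proportions of participants who proposed the same gesture. A minimal sketch, with hypothetical gesture names and tallies:

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one operation: the sum over groups of
    identical proposed gestures of (group size / total proposals)^2."""
    total = len(proposals)
    return sum((count / total) ** 2 for count in Counter(proposals).values())

# Hypothetical example: 30 participants propose gestures for "power on"
proposals = ["palm push"] * 18 + ["snap"] * 8 + ["wave"] * 4
score = agreement_score(proposals)  # higher score = stronger consensus
```

A score of 1.0 means every participant proposed the same gesture; scores near 1/n mean no consensus at all.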

Analysis on the Characteristics of Cognitive & Affective Learning Style of Engineering University Students (공과대학생의 인지적.정의적 학습양식 특성 분석)

  • Kim, Eun Jeong
    • Journal of Engineering Education Research / v.17 no.6 / pp.20-29 / 2014
  • The purpose of this study is to analyze the cognitive and affective learning-style traits of university students. The CALSIU (Cognitive & Affective Learning Style Inventory for University School Students) by Kim, E. J. was modified for university students and administered to 399 students from three universities in Daejeon and Chungnam. The statistical analyses were ANOVA and Scheffé's test. The findings are as follows: First, students with high academic achievement show the intuitive perception, whole processing, and deep storage & recall types. Second, students with low academic achievement show a strong non-academic learning type. Third, the interaction attitude among the affective learning styles is an important element in determining academic achievement: students with the independent type attain high academic achievement. Therefore, instructors should consider students' learning styles and use them to improve teaching and learning strategies for better academic achievement.

Intuitive Manipulation of Deformable Cloth Object Based on Augmented Reality for Mobile Game (모바일 게임을 위한 증강현실 기반 직관적 변형 직물객체 조작)

  • Kim, Sang-Joon;Hong, Min;Choi, Yoo-Joo
    • KIPS Transactions on Software and Data Engineering / v.7 no.4 / pp.159-168 / 2018
  • Mobile augmented reality games, which have recently attracted much attention, are considered a good approach to increasing immersion. Conventional augmented reality-based games, which recognize target objects with a mobile camera and show the matching game characters, mainly use touch-based interaction. In this paper, we propose an intuitive interaction method that manipulates a deformable game object by moving an augmented reality target image, in order to enhance the immersion of the game. In the proposed method, the deformable object is manipulated intuitively by calculating the distance and direction between the target images and using them to adjust the external force applied to the object. We focus on the deformable cloth object, which is widely used for natural object animation in game content, and implement a natural cloth simulation that interacts with game objects represented by wind and rigid bodies. In the experiments, we compare a previous commercial cloth model with the proposed method and show that the proposed method can represent cloth animation more realistically.
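The abstract's core mechanism, turning the distance and direction between tracked target images into an external force on the cloth model, might be sketched as follows; the function name, stiffness constant, and clamping are assumptions, not the paper's implementation:

```python
import math

def marker_force(anchor, target, stiffness=0.5, max_force=10.0):
    """Hypothetical sketch: derive an external force vector on a cloth
    object from the displacement between two AR target images, i.e.
    distance sets the magnitude and direction sets the force direction."""
    dx, dy = target[0] - anchor[0], target[1] - anchor[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return (0.0, 0.0)
    magnitude = min(stiffness * dist, max_force)  # clamp to keep the sim stable
    return (magnitude * dx / dist, magnitude * dy / dist)

fx, fy = marker_force((0.0, 0.0), (3.0, 4.0))  # pull toward the moved marker
```

Each simulation step, such a force would be added to the cloth particles' accumulated forces before integration.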

Comparative Study on the Interface and Interaction for Manipulating 3D Virtual Objects in a Virtual Reality Environment (가상현실 환경에서 3D 가상객체 조작을 위한 인터페이스와 인터랙션 비교 연구)

  • Park, Kyeong-Beom;Lee, Jae Yeol
    • Korean Journal of Computational Design and Engineering / v.21 no.1 / pp.20-30 / 2016
  • Recently, immersive virtual reality (VR) has become popular due to advances in I/O interfaces and related software for effectively constructing VR environments. In particular, natural and intuitive manipulation of 3D virtual objects is still considered one of the most important user interaction issues. This paper presents a comparative study on the manipulation of 3D virtual objects using different interfaces and interactions in three VR environments, covering both quantitative and qualitative aspects. The three experimental setups are 1) a typical desktop VR using mouse and keyboard, 2) a hand-gesture-supported desktop VR using a Leap Motion sensor, and 3) an immersive VR in which the user wears an HMD and interacts via hand gestures using a Leap Motion sensor. In the desktop VR with hand gestures, the Leap Motion sensor is placed on the desk; in the immersive VR, the sensor is mounted on the HMD so that the user can manipulate virtual objects in front of it. For the quantitative analysis, task completion time and success rate were measured; the experimental tasks require complex 3D transformations such as simultaneous 3D translation and 3D rotation. For the qualitative analysis, user-experience factors such as ease of use, natural interaction, and stressfulness were evaluated. The qualitative and quantitative analyses show that the immersive VR with natural hand gestures provides more intuitive and natural interaction and supports fast and effective task completion, but can cause stressful conditions.
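The quantitative measures described here, task completion time and success rate, can be summarized per interface condition as in this sketch (the trial data is illustrative, not from the study):

```python
def summarize_trials(trials):
    """Per-condition summary: success rate over all trials, and mean
    completion time over successful trials only."""
    successful_times = [t for t, succeeded in trials if succeeded]
    success_rate = len(successful_times) / len(trials)
    mean_time = (sum(successful_times) / len(successful_times)
                 if successful_times else float("nan"))
    return mean_time, success_rate

# Hypothetical (completion_seconds, succeeded) pairs for one condition
desktop_vr = [(12.4, True), (15.1, True), (20.0, False), (13.3, True)]
mean_time, rate = summarize_trials(desktop_vr)
```

Computing these per condition gives the numbers that the between-interface comparison would then test statistically.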

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan;Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology / v.18 no.4 / pp.116-119 / 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person shooter (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS games based on the user's gesture recognition and VR controllers. We focus on the conventional interface of VR FPS interaction and design the player interaction with a head-mounted display and two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy an immersive FPS and gives them a new gaming experience.

A Study on interaction factor in intuitive gesture (직관적 제스쳐 인터랙션 요소 추출에 관한 연구)

  • Kim, Yong-U;Hwang, Min-Cheol;Kim, Jong-Hwa;U, Jin-Cheol;Kim, Chi-Jung;Kim, Ji-Hye
    • Proceedings of the Korean Society for Emotion and Sensibility Conference / 2009.11a / pp.66-69 / 2009
  • This study analyzed the extraction of elements for intuitive gesture interaction, which provides users with a convenient and natural interface. Thirty university students were presented with eight basic interaction elements of the mouse and keyboard and surveyed to select the word whose meaning best fit each element. Next, the gestures expressing the meanings of the selected words were chosen by frequency and then verified. In the first survey, all 30 participants chose the same meanings for mouse up/down/left/right movement and ESC: up/down/left/right movement and cancel. For the left mouse click, 28 of 30 chose 'select'; for the right click, 26 chose 'settings/explore'; and for the keyboard Enter key, 25 chose 'execute'. In the verification of the finally selected gestures, the whole-hand up/down/left/right movements and the whole-hand X, presented for the movement and cancel elements, achieved high rates of 70-100%, while the stationary downward index finger, presented for the selection element, achieved 60%. Of the whole-hand rotation and the middle-finger click presented for settings/explore, whole-hand rotation achieved 60%, whereas the middle-finger click was matched to selection or downward movement by 73% of participants and was therefore judged unsuitable for settings/explore. The whole-hand double click, presented for the execute element, yielded a low verification rate of 27% and was likewise judged unsuitable.
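The frequency-based selection this study describes, tallying proposed meanings or gestures and keeping the most frequent per interaction element, can be sketched as follows; the commands and counts are hypothetical, not the study's data:

```python
from collections import Counter

def select_gestures(responses):
    """For each command, pick the most frequently proposed gesture and
    report its support rate among all participants' proposals."""
    selected = {}
    for command, gestures in responses.items():
        gesture, count = Counter(gestures).most_common(1)[0]
        selected[command] = (gesture, count / len(gestures))
    return selected

# Hypothetical tallies for 30 participants per command
responses = {
    "cancel": ["whole-hand X"] * 24 + ["wave"] * 6,
    "execute": ["double tap"] * 16 + ["fist"] * 14,
}
chosen = select_gestures(responses)
```

Low support rates (like the 27% seen for the execute gesture in this study) would flag a candidate gesture as unsuitable.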

Immersive Virtual Custom-made Model House (몰입감 있는 맞춤형 가상 모델하우스)

  • Hwang, Sun-Uk;Kim, Yeong-Mi;Seo, Yong-Won;Ko, Kwang-Hee;Ryu, Je-Ha;Lee, Kwan-Heng;Lee, Yong-Gu
    • Korean Journal of Computational Design and Engineering / v.13 no.1 / pp.8-17 / 2008
  • Putting a high value on individual preferences is a modern trend that more and more companies consider in their product design and development, and apartment design is no exception. Most apartments today are built to similar designs with no room for customization, yet people in general want their tastes reflected in the design of their apartment. However, delivering what customers like to the construction company may not be easy in practice. For this reason, an intuitive and effective medium for communication between the company and customers is needed to ameliorate this difficulty. In response, we developed a test platform for a virtual model house that provides users with customization of an apartment through haptic interactions. In our virtual environment, a user can explore an apartment and change the interior according to their taste and feel through intuitive haptic interactions.

Gadget Arms: Interactive Data Visualization using Hand Gesture in Extended Reality (가젯암: 확장현실을 위한 손 제스처 기반 대화형 데이터 시각화 시스템)

  • Choi, JunYoung;Jeong, HaeJin;Jeong, Won-Ki
    • Journal of the Korea Computer Graphics Society / v.25 no.2 / pp.31-41 / 2019
  • Extended reality (XR), such as virtual and augmented reality, has huge potential for immersive data visualization and analysis. In XR, users can interact with data and with other users realistically by navigating a shared virtual space, allowing more intuitive data analysis. However, creating a visualization in XR also poses a challenge, because complicated low-level programming is required, which hinders broad adoption in visual analytics. This paper proposes Gadget Arms, an interactive visualization authoring tool based on hand gestures for immersive data visualization. The proposed system provides a novel user interaction for creating and placing visualizations in the 3D virtual world. This simple but intuitive interaction enables the user to design the entire visualization space in XR without a host computer or low-level programming. Our user study confirmed that the proposed interaction significantly improves the usability of the visualization authoring tool.