• Title/Summary/Keyword: virtual interaction

Comparative Study on the Interface and Interaction for Manipulating 3D Virtual Objects in a Virtual Reality Environment (가상현실 환경에서 3D 가상객체 조작을 위한 인터페이스와 인터랙션 비교 연구)

  • Park, Kyeong-Beom; Lee, Jae Yeol
    • Korean Journal of Computational Design and Engineering, v.21 no.1, pp.20-30, 2016
  • Recently, immersive virtual reality (VR) has become popular due to the advanced development of I/O interfaces and related software for effectively constructing VR environments. In particular, natural and intuitive manipulation of 3D virtual objects is still considered one of the most important user interaction issues. This paper presents a comparative study on the manipulation and interaction of 3D virtual objects using different interfaces and interactions in three VR environments. The comparative study includes both quantitative and qualitative aspects. The three experimental setups are 1) a typical desktop-based VR using mouse and keyboard, 2) a hand gesture-supported desktop VR using a Leap Motion sensor, and 3) an immersive VR in which the user wears an HMD and interacts through hand gestures captured by a Leap Motion sensor. In the desktop VR with hand gestures, the Leap Motion sensor is placed on the desk. In the immersive VR, on the other hand, the sensor is mounted on the HMD so that the user can manipulate virtual objects in front of the HMD. For the quantitative analysis, task completion time and success rate were measured. The experimental tasks require complex 3D transformations such as simultaneous 3D translation and 3D rotation. For the qualitative analysis, various factors relating to user experience, such as ease of use, naturalness of interaction, and stressfulness, were evaluated. The qualitative and quantitative analyses show that immersive VR with natural hand gestures provides more intuitive and natural interaction and supports fast and effective task completion, but can cause more stressful conditions.
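
As a quantitative-analysis illustration only (not the paper's data), the sketch below summarizes task completion time and success rate per interface condition, the two measures named in the abstract; the condition labels and trial values are hypothetical.

```python
# Minimal sketch: summarizing task completion time and success rate per
# interface condition. Condition names and trial data are hypothetical.
from statistics import mean

trials = [
    # (condition, completion_time_sec, success)
    ("desktop_mouse_keyboard", 42.1, True),
    ("desktop_leap_motion",    35.4, True),
    ("hmd_leap_motion",        28.9, True),
    ("hmd_leap_motion",        31.2, False),
]

def summarize(trials):
    """Group trials by condition and report mean time and success rate."""
    by_condition = {}
    for condition, time_sec, success in trials:
        by_condition.setdefault(condition, []).append((time_sec, success))
    summary = {}
    for condition, rows in by_condition.items():
        times = [t for t, _ in rows]
        successes = [s for _, s in rows]
        summary[condition] = {
            "mean_time_sec": mean(times),
            "success_rate": sum(successes) / len(successes),
        }
    return summary

if __name__ == "__main__":
    for condition, stats in summarize(trials).items():
        print(condition, stats)
```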

Automatic Adaptation Based Metaverse Virtual Human Interaction (자동 적응 기반 메타버스 가상 휴먼 상호작용 기법)

  • Chung, Jin-Ho; Jo, Dongsik
    • KIPS Transactions on Software and Data Engineering, v.11 no.2, pp.101-106, 2022
  • Recently, virtual humans have been widely used in various fields such as education, training, and information guidance, and they are expected to be applied to services that interact with remote users in the metaverse. In this paper, we propose a novel method that allows a virtual human's interaction to reflect the user's surroundings. We use an editing authoring tool to specify the user's interaction and the virtual human's corresponding response. The virtual human can recognize the user's situation based on fuzzy logic and present an optimal response to the user. With the context-aware interaction method presented in this paper, the virtual human can provide interaction suitable for the surrounding environment based on automatic adaptation.
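
A minimal sketch of fuzzy-style, context-aware response selection in the spirit of the abstract; the context variables (distance, noise level), membership functions, and response names are hypothetical, not the authors' implementation.

```python
# Minimal sketch: score candidate responses of a virtual human by fuzzy
# membership of the user's surroundings and pick the best one.

def tri(x, a, b, c):
    """Triangular membership function over [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def choose_response(distance_m, noise_db):
    """Return the response with the highest fuzzy score for this context."""
    near = tri(distance_m, 0.0, 0.5, 1.5)
    far = tri(distance_m, 1.0, 3.0, 6.0)
    quiet = tri(noise_db, 20, 35, 55)
    loud = tri(noise_db, 45, 70, 90)
    candidates = {
        "greet_verbally": min(near, quiet),  # close and quiet: speak
        "wave_gesture":   min(far, loud),    # far and noisy: gesture instead
        "step_closer":    min(far, quiet),
        "raise_voice":    min(near, loud),
    }
    return max(candidates, key=candidates.get)

print(choose_response(distance_m=0.8, noise_db=30))  # -> greet_verbally
```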

Designing Effective Virtual Training: A Case Study in Maritime Safety

  • Jung, Jinki; Kim, Hongtae
    • Journal of the Ergonomics Society of Korea, v.36 no.5, pp.385-394, 2017
  • Objective: The aim of this study is to investigate how to design effective virtual reality-based training (i.e., virtual training) in maritime safety and to present methods for enhancing interface fidelity by employing immersive interaction and 3D user interface (UI) design. Background: Emerging virtual reality technologies and hardware make it possible to provide immersive experiences to individuals, and there is also a theory that improving fidelity can improve training efficiency. Such a sense of immersion can be utilized as an element for realizing effective training in the virtual space. Method: As an immersive interaction, we implemented gesture-based interaction using Leap Motion and Myo armband-type sensors. Hand gestures captured by both sensors are used to interact with the virtual appliance in the scenario. The proposed 3D UI design is employed to visualize appropriate information for tasks in training. Results: A usability study to evaluate the effectiveness of the proposed method was carried out. The usability test of satisfaction, intuitiveness of the UI, ease of procedure learning, and equipment understanding showed that virtual training-based exercise was superior to existing training. These improvements were also independent of the type of input device used for virtual training. Conclusion: We have shown through experiments that the proposed interaction design yields more efficient interaction than the existing training method. The improvement of interface fidelity through intuitive and immediate feedback on the input device and the training information improves user satisfaction with the system as well as training efficiency. Application: The proposed design methods for an effective virtual training system can be applied to other areas in which trainees are required to perform sophisticated tasks with their hands.
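
A minimal sketch of a device-agnostic gesture-to-action mapping of the kind implied by using both Leap Motion and Myo input to drive the same training scenario; the GestureEvent type, gesture names, and appliance actions are hypothetical.

```python
# Minimal sketch: route recognized hand gestures, from either sensor, to
# virtual-appliance actions in a training scenario.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class GestureEvent:
    source: str    # "leap_motion" or "myo"
    gesture: str   # e.g. "grab", "release", "turn_valve"

def grab_handle():    print("appliance: handle grabbed")
def release_handle(): print("appliance: handle released")
def turn_valve():     print("appliance: valve turned")

GESTURE_ACTIONS: Dict[str, Callable[[], None]] = {
    "grab": grab_handle,
    "release": release_handle,
    "turn_valve": turn_valve,
}

def dispatch(event: GestureEvent) -> None:
    """Trigger the scenario action mapped to a recognized gesture."""
    action = GESTURE_ACTIONS.get(event.gesture)
    if action is not None:
        action()

dispatch(GestureEvent(source="leap_motion", gesture="grab"))
dispatch(GestureEvent(source="myo", gesture="turn_valve"))
```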

A Design and Implementation of User Interaction-Oriented Integrated Virtual Education System (사용자간 상호작용 지향적 통합 가상교육시스템의 설계 및 구현)

  • 박경환; 문석원
    • Journal of Korea Multimedia Society, v.1 no.2, pp.215-223, 1998
  • This paper introduces a method for the design and implementation of WebClass, an integrated virtual education system that is based on the World Wide Web and maximizes interaction among users. Existing virtual education systems did not provide a flexible integration of their functions because they included various interaction functions without a user interaction model. We designed a user interaction model that supports various instructional models and implemented the user interfaces of WebClass based on this interaction model. Thus we developed an integrated virtual education system that is based on a user interaction model instead of an amalgam of various interaction functions. WebClass supports both synchronous and asynchronous sharing functions for user interaction. Our goal was to support efficient virtual education by maximizing user interaction.
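
A minimal sketch of combined synchronous/asynchronous sharing of the kind WebClass is said to support, assuming a hypothetical SharedBoard class; it is not the original system's code.

```python
# Minimal sketch: one shared-object model serving both a live (synchronous)
# session and later (asynchronous) retrieval.
from collections import defaultdict

class SharedBoard:
    def __init__(self):
        self.live_subscribers = []         # callbacks of users currently online
        self.archive = defaultdict(list)   # course_id -> posted items

    def subscribe(self, callback):
        """Join the live (synchronous) session."""
        self.live_subscribers.append(callback)

    def post(self, course_id, item):
        """Share an item: push to live users now, archive for offline users."""
        for callback in self.live_subscribers:
            callback(item)                    # synchronous path
        self.archive[course_id].append(item)  # asynchronous path

    def catch_up(self, course_id):
        """Retrieve everything shared while the user was away."""
        return list(self.archive[course_id])

board = SharedBoard()
board.subscribe(lambda item: print("live:", item))
board.post("hci101", "lecture-notes-week2")
print("later:", board.catch_up("hci101"))
```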

Using Spatial Ontology in the Semantic Integration of Multimodal Object Manipulation in Virtual Reality

  • Irawati, Sylvia; Calderon, Daniela; Ko, Hee-Dong
    • 한국HCI학회:학술대회논문집, 2006.02a, pp.884-892, 2006
  • This paper describes a framework for multimodal object manipulation in virtual environments. The gist of the proposed framework is the semantic integration of multimodal input using a spatial ontology and user context to integrate the interpretation results from the inputs into a single interpretation. The spatial ontology, which describes the spatial relationships between objects, is used together with the current user context to resolve ambiguities in the user's commands. These commands are used to reposition objects in the virtual environment. We discuss how the spatial ontology is defined and used to assist the user in placing objects in the virtual environment as they would be placed in the real world.
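
A minimal sketch of how stored spatial relations plus user context might resolve an ambiguous manipulation command, in the spirit of the framework above; the relation table, object names, and resolve() helper are hypothetical rather than the paper's ontology.

```python
# Minimal sketch: narrow down which object a command refers to by matching
# the spoken spatial constraint, falling back to the pointing gesture.

# object -> list of (relation, reference_object)
spatial_relations = {
    "box_1": [("on", "table"), ("near", "lamp")],
    "box_2": [("on", "floor"), ("near", "door")],
}

def resolve(candidates, relation, reference, user_context):
    """Pick the candidate whose spatial relations match the spoken constraint;
    on remaining ambiguity, defer to the object the user is pointing at."""
    matches = [
        obj for obj in candidates
        if (relation, reference) in spatial_relations.get(obj, [])
    ]
    if len(matches) == 1:
        return matches[0]
    return user_context.get("pointed_at")

# "Move the box on the table" while pointing at box_2:
print(resolve(["box_1", "box_2"], "on", "table", {"pointed_at": "box_2"}))
# -> box_1 (the speech constraint alone resolves the ambiguity)
```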

Tracking and Interaction Based on Hybrid Sensing for Virtual Environments

  • Jo, Dongsik; Kim, Yongwan; Cho, Eunji; Kim, Daehwan; Kim, Ki-Hong; Lee, Gil-Haeng
    • ETRI Journal, v.35 no.2, pp.356-359, 2013
  • We present a method for tracking and interaction based on hybrid sensing for virtual environments. The proposed method is applied to motion tracking of the whole area, including the user's occluded space, for high-precision interaction. For real-time motion tracking around a user, we estimate each joint position in the human body using a combination of a depth sensor and a wand-type physical user interface, which is needed to convert gyroscope and acceleration values into positional data. Additionally, we construct virtual content and evaluate the validity of the hybrid sensing-based whole-body motion tracking method used to compensate for the occluded areas.
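
A minimal sketch of the hybrid-sensing idea, assuming a hypothetical confidence threshold, a naive double integration of the wand's inertial data, and a simple fallback rule; the paper's actual estimation method is not reproduced here.

```python
# Minimal sketch: trust the depth-sensor joint estimate when it is visible,
# and fall back to a position derived from the wand's IMU when occluded.
import numpy as np

def integrate_imu(prev_pos, prev_vel, accel, dt):
    """Naive double integration of acceleration into a position estimate."""
    vel = prev_vel + accel * dt
    pos = prev_pos + vel * dt
    return pos, vel

def fuse_joint(depth_pos, depth_confidence, imu_pos, threshold=0.5):
    """Use the depth estimate when confident, otherwise the IMU-derived one."""
    return depth_pos if depth_confidence >= threshold else imu_pos

prev_pos, prev_vel = np.zeros(3), np.zeros(3)
imu_pos, _ = integrate_imu(prev_pos, prev_vel,
                           accel=np.array([0.0, 0.0, 0.2]), dt=0.016)
print(fuse_joint(depth_pos=np.array([0.1, 1.2, 0.8]),
                 depth_confidence=0.2, imu_pos=imu_pos))
```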

Evaluation of Global Force and Interaction Body Force Density in Permanent Magnet Employing Virtual Air-gap Concept (가상공극개념을 이용한 영구자석의 전체전자기력과 상호체적력밀도 계산)

  • Lee, Se-Hee
    • The Transactions of The Korean Institute of Electrical Engineers, v.58 no.2, pp.278-284, 2009
  • The global force and the interaction body force density in permanent magnets were evaluated by using the virtual air-gap scheme incorporated into the finite-element method. Until now, the virtual air-gap concept has been successfully applied to calculate contact forces and body force densities in soft magnetic materials. These force-calculating methods have been called generalized methods, such as the generalized magnetic charge force density method, the generalized magnetizing current force density method, and the generalized Kelvin force density method. For permanent magnets, however, there has been little research on contact forces and force density fields. Unlike the conventional force-calculating methods, which result in surface force densities, the generalized methods are novel methods for evaluating body force density. These generalized methods yield the actual total force, but their distributions have an irregularity that appears as a random distribution of body force density. Inside permanent magnets, however, a smooth pattern was obtained for the interaction body force density, which represents the interacting force field among magnetic materials. To evaluate the interaction body force density, the intrinsic force density should be subtracted from the total force density. Several analysis models with permanent magnets, in which the permanent magnet is in contact with a soft magnetic material, were tested to verify the proposed methods for evaluating the interaction body force density and the contact force.
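
For context on the quantities named in the abstract, the block below gives the textbook Kelvin body force density together with the global force integral and the total-minus-intrinsic decomposition stated above; the paper's generalized virtual air-gap formulation itself is not reproduced.

```latex
% Textbook Kelvin force density (not the paper's generalized virtual air-gap
% form) and the decomposition described in the abstract.
\begin{align}
  \mathbf{f}_{K} &= \mu_0\,(\mathbf{M}\cdot\nabla)\,\mathbf{H}
      && \text{(Kelvin body force density)} \\
  \mathbf{F} &= \int_{V} \mathbf{f}\,\mathrm{d}V
      && \text{(global force on the magnet)} \\
  \mathbf{f}_{\mathrm{int}} &= \mathbf{f}_{\mathrm{total}} - \mathbf{f}_{\mathrm{intrinsic}}
      && \text{(interaction body force density)}
\end{align}
```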

Context-Driven Framework for High Level Configuration of Virtual Businesses (가상기업의 형성을 위한 컨텍스트 기반 프레임워크)

  • Lee, Kyung-Huy; Oh, Sang-Bong
    • Journal of Information Technology Applications and Management, v.14 no.2, pp.11-26, 2007
  • In this paper we suggest a context-driven configuration model of virtual businesses that forms a business network model consisting of role-based, interaction-centered business partners. The model makes use of the subcontext concept, which explicitly represents actors and interactions in a virtual business (VB) context. We separate actors, who have capabilities for tasks in a specific kind of role, from actor subcontexts, which model requirements in a specific interaction subcontext. Three kinds of actors are defined in virtual service chains: service users, service providers, and external service supporters. An interaction subcontext models a service exchange process between two actor subcontexts, taking into account context dependencies such as task and quality dependencies. Each subcontext may be modeled in the form of a situation network, which consists of a finite set of situation nodes and transitions. A specific situation is given in a corresponding context network of actors and interactions. The approach is illustrated with a simple example.
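
A minimal sketch of a situation network as a finite set of situation nodes with labeled transitions, as described in the abstract; the node and event names for a service-user/provider exchange are hypothetical.

```python
# Minimal sketch: a situation network as nodes plus labeled transitions,
# advanced by firing events.

class SituationNetwork:
    def __init__(self, initial):
        self.current = initial
        self.transitions = {}   # (situation, event) -> next situation

    def add_transition(self, situation, event, next_situation):
        self.transitions[(situation, event)] = next_situation

    def fire(self, event):
        """Move to the next situation if a matching transition exists."""
        key = (self.current, event)
        if key in self.transitions:
            self.current = self.transitions[key]
        return self.current

net = SituationNetwork(initial="request_submitted")
net.add_transition("request_submitted", "provider_accepts", "service_agreed")
net.add_transition("service_agreed", "service_delivered", "quality_review")
print(net.fire("provider_accepts"))   # -> service_agreed
print(net.fire("service_delivered"))  # -> quality_review
```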

HUD Interface and VR content interaction: VR+HUD (HUD Interface와 VR 콘텐츠 인터렉션: VR+HUD)

  • Park, Keonhee; Chin, Seongah
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology, v.8 no.3, pp.925-932, 2018
  • Virtual reality appears to be at the center of the next-generation platform, built on various engines that make it easy to advance devices and create content. However, the interaction between virtual reality content and users is considered to still require technological advances. In this paper, we analyze the interaction problems arising in virtual reality content based on the case of Virtual Figure Model Crafting (VFMC) and propose a technique to improve the interaction. We introduce the concept of the Head-Up Display (HUD) to present a more natural interaction method. The HUD is the digital visual interface of an aircraft. The advantage of the HUD visual interface is that it minimizes the user's visual movement by displaying otherwise scattered information in the pilot's forward view. In other words, we can reduce unnecessary left and right head movements, which can be expected to reduce fatigue and increase immersion.
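
A minimal sketch of keeping a HUD panel in the user's forward view by placing it a fixed distance along the camera's forward vector each frame; the distance and the hud_panel_position() helper are illustrative, and no engine-specific API is assumed.

```python
# Minimal sketch: lock a HUD panel to the camera's forward direction so the
# displayed information stays in the user's forward view.
import numpy as np

def hud_panel_position(cam_position, cam_forward, distance=1.5):
    """Return the world-space position of a HUD panel locked to the view."""
    forward = cam_forward / np.linalg.norm(cam_forward)
    return cam_position + distance * forward

cam_position = np.array([0.0, 1.6, 0.0])   # eye height
cam_forward  = np.array([0.0, 0.0, -1.0])  # looking down -Z
print(hud_panel_position(cam_position, cam_forward))  # -> [ 0.   1.6 -1.5]
```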

Exploring the Effects of Passive Haptic Factors When Interacting with a Virtual Pet in Immersive VR Environment (몰입형 VR 환경에서 가상 반려동물과 상호작용에 관한 패시브 햅틱 요소의 영향 분석)

  • Kim, Donggeun; Jo, Dongsik
    • Journal of the Korea Computer Graphics Society, v.30 no.3, pp.125-132, 2024
  • Recently, immersive virtual reality (IVR) technologies have been applied to various services such as education, training, entertainment, industry, healthcare, and remote collaboration. In particular, research on visualizing and interacting with virtual humans is being actively conducted, and research on virtual pets in IVR is also emerging. For interaction with a virtual pet, as in real-world interaction scenarios, the most important element is to provide physical contact such as haptic feedback and non-verbal interaction (e.g., gestures). This paper investigates the effects of passive haptic factors (e.g., shape and texture) by mapping physical props to the virtual pet. Experimental results show significant differences in immersion, co-presence, realism, and friendliness depending on the level of the texture element when interacting with the virtual pet through passive haptic feedback. Additionally, as a main finding of this study, the statistical interaction between the two variables revealed an uncanny valley effect in terms of friendliness. We expect our results to provide guidelines for creating interactive content with virtual pets in immersive VR environments.
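
A minimal sketch of the kind of two-factor (shape x texture) interaction test the abstract refers to, using randomly generated placeholder ratings rather than the study's data; the factor levels and sample sizes are hypothetical.

```python
# Minimal sketch: two-way ANOVA on a friendliness rating with shape and
# texture factors, including their statistical interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for shape in ["matched", "mismatched"]:
    for texture in ["fur", "plastic"]:
        for _ in range(10):  # 10 hypothetical participants per cell
            rows.append({"shape": shape, "texture": texture,
                         "friendliness": rng.normal(5, 1)})
df = pd.DataFrame(rows)

model = smf.ols("friendliness ~ C(shape) * C(texture)", data=df).fit()
print(anova_lm(model, typ=2))  # main effects and the shape x texture interaction
```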