• Title/Summary/Keyword: 3D user interface


Augmented Book as an Interface Between Real Environment and Virtual Environment (현실과 가상환경의 인터페이스로써의 증강 책)

  • Kim, Dong-Hee;Yoon, Jong-Hyun;Park, Jong-Seung
    • 한국HCI학회:학술대회논문집
    • /
    • 2008.02a
    • /
    • pp.801-806
    • /
    • 2008
  • This paper proposes an augmented reality application called the augmented book. The augmented book provides an interface to virtual objects and virtual environments through a real book. With the augmented book, a user observes and manipulates 3D visualizations of virtual objects. Three different types of markers are introduced for manipulating the virtual objects; by moving the control marker, the user controls them. The augmented book can serve as an interface for various kinds of 3D content, such as the contents of real books or product catalogs.

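The abstract does not specify how the control marker drives the virtual objects; a minimal sketch of one common marker-based scheme, in which the object follows the frame-to-frame translation of the tracked marker (function and coordinate names are illustrative, not from the paper):

```python
def move_with_control_marker(obj_pos, marker_prev, marker_curr):
    """Translate a virtual object by the frame-to-frame motion of the
    control marker. Positions are (x, y, z) tuples in the marker
    tracker's coordinate frame -- an assumed convention, since the
    paper does not give implementation details."""
    delta = tuple(c - p for c, p in zip(marker_curr, marker_prev))
    return tuple(o + d for o, d in zip(obj_pos, delta))

# The marker moved 5 cm along x between frames; the object follows.
new_pos = move_with_control_marker((0.0, 0.0, 0.0),
                                   (0.10, 0.20, 0.0),
                                   (0.15, 0.20, 0.0))
```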

Multimodal Interaction on Automultiscopic Content with Mobile Surface Haptics

  • Kim, Jin Ryong;Shin, Seunghyup;Choi, Seungho;Yoo, Yeonwoo
    • ETRI Journal
    • /
    • v.38 no.6
    • /
    • pp.1085-1094
    • /
    • 2016
  • In this work, we present interactive automultiscopic content with mobile surface haptics for multimodal interaction. Our system consists of a 40-view automultiscopic display and a tablet supporting surface haptics in an immersive room. Animated graphics are projected onto the walls of the room. The 40-view automultiscopic display is placed at the center of the front wall. The haptic tablet is installed at the mobile station to enable the user to interact with the tablet. The 40-view real-time rendering and multiplexing technology is applied by establishing virtual cameras in the convergence layout. Surface haptics rendering is synchronized with three-dimensional (3D) objects on the display for real-time haptic interaction. We conduct an experiment to evaluate user experiences of the proposed system. The results demonstrate that the system's multimodal interaction provides positive user experiences of immersion, control, user interface intuitiveness, and 3D effects.

Cubical User Interface for Toy Block Composition in Augmented Reality (증강 현실에서의 장난감 블록 결합을 위한 큐브형 사용자 인터페이스)

  • Lee, Hyeong-Mook;Lee, Young-Ho;Woo, Woon-Tack
    • 한국HCI학회:학술대회논문집
    • /
    • 2009.02a
    • /
    • pp.363-367
    • /
    • 2009
  • We propose a Cubical User Interface (CUI) for toy block composition in augmented reality. Creating new objects by composing virtual objects makes it possible to construct various AR contents effectively. However, existing GUI methods require learning time or lack an intuitive mapping between the user's actions and the offered interface, and existing AR interfaces have mainly supported one-handed operation without considering composition properties. Therefore, the CUI provides a tangible cube as the manipulation tool for virtual toy block composition in AR. The tangible cube, fitted with multiple markers, magnets, and buttons, supports free rotation, combination, and button input. We also propose two kinds of two-handed composing interactions based on the CUI: the Screw Driving (SD) method, which allows free 3D positioning, and the Block Assembly (BA) method, which provides visual guidance and is fast and intuitive. We expect that the proposed interface can serve as an authoring system for contents such as education, entertainment, and Digilog Book.


Development of Motion based Training Contents: "3D Space Exploration" Case Study (동작 기반의 훈련콘텐츠 : "3D 우주탐험" 개발사례)

  • Lim, C.J.;Park, Seung Goo;Jeong, Yun Guen
    • Journal of Korea Game Society
    • /
    • v.13 no.5
    • /
    • pp.63-72
    • /
    • 2013
  • To enhance the effect of science education contents, we developed a motion-based training content, 3D Space Exploration. In this content, a 3D depth camera is used to recognize the user's motions. Learners must carry out a space station maintenance mission using the motion-based, natural, and intuitive interface. The result of this study is expected to provide an immersive training simulation for young science learners.

Implementation of interactive 3D floating image pointing device (인터렉티브 3D 플로팅 영상 포인팅 장치의 구현)

  • Shin, Dong-Hak;Kim, Eun-Soo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.12 no.8
    • /
    • pp.1481-1487
    • /
    • 2008
  • In this paper, we propose a novel interactive 3D floating image pointing device for use in 3D environments. The proposed system consists of a 3D floating image generation system based on a floating lens array and a user interface based on real-time finger detection. In the proposed system, a user selects a single image from among the floating images, and interaction functions are performed by triggering a button event through finger recognition with two cameras. To show the usefulness of the proposed system, we carried out experiments, and the preliminary results are presented.
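The abstract detects the fingertip with two cameras but does not describe the 3D recovery step. Assuming a rectified stereo pair, the fingertip position could be triangulated from disparity as sketched below (the focal length, baseline, and pixel coordinates are hypothetical values, not from the paper):

```python
def triangulate_fingertip(xl, xr, y, focal_px, baseline_m):
    """Recover a fingertip's 3D position from a rectified stereo pair.

    xl, xr : fingertip x-coordinates (pixels, relative to the image
             center) in the left and right images; y is the shared row.
    Standard disparity-based triangulation -- only a plausible sketch
    of the paper's unspecified two-camera finger recognition.
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("point at infinity or mismatched detection")
    z = focal_px * baseline_m / disparity          # depth from disparity
    x = xl * z / focal_px                          # back-project x
    y3d = y * z / focal_px                         # back-project y
    return x, y3d, z

# 800 px focal length, 6 cm baseline, 40 px disparity -> depth 1.2 m
x, y, z = triangulate_fingertip(100.0, 60.0, -20.0, 800.0, 0.06)
```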

Design and Implementation of a Real-time Region Pointing System using Arm-Pointing Gesture Interface in a 3D Environment

  • Han, Yun-Sang;Seo, Yung-Ho;Doo, Kyoung-Soo;Choi, Jong-Soo
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2009.01a
    • /
    • pp.290-293
    • /
    • 2009
  • In this paper, we propose a method to estimate the pointed-at region in the real world from camera images. In general, an arm-pointing gesture encodes a direction extending from the user's fingertip to a target point. In the proposed work, we assume that the pointing ray can be approximated by a straight line passing through the user's face and fingertip. The proposed method therefore extracts two end points for estimating the pointing direction: one from the user's face region and another from the user's fingertip region. The pointing direction and its target region are then estimated from the 2D-3D projective mapping between the camera images and the real-world scene. To demonstrate an application of the proposed method, we constructed an ICGS (interactive cinema guiding system) employing two CCD cameras and a monitor. The accuracy and robustness of the proposed method are verified by experimental results on several real video sequences.

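Once the face and fingertip points are known, the abstract's face-to-fingertip ray can be intersected with a planar target (e.g. a screen) to get the pointed-at location. A self-contained sketch of that geometric step, with an assumed world frame and made-up coordinates:

```python
def pointing_target(face, fingertip, plane_point, plane_normal):
    """Intersect the face->fingertip ray with a planar target region.

    The paper approximates the pointing ray as the straight line through
    the user's face and fingertip; intersecting it with a scene plane
    (given here by a point and a normal) yields the pointed-at location.
    All inputs are 3D tuples in the same world frame.
    """
    d = tuple(f - c for f, c in zip(fingertip, face))          # ray direction
    denom = sum(di * ni for di, ni in zip(d, plane_normal))
    if abs(denom) < 1e-9:
        return None                                            # ray parallel to plane
    t = sum((p - c) * n
            for p, c, n in zip(plane_point, face, plane_normal)) / denom
    return tuple(c + t * di for c, di in zip(face, d))

# Face at (0, 1.6, 0), fingertip 40 cm ahead and slightly lower,
# screen plane z = 2 with normal (0, 0, 1).
hit = pointing_target((0.0, 1.6, 0.0), (0.0, 1.5, 0.4),
                      (0.0, 0.0, 2.0), (0.0, 0.0, 1.0))
```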

Efficient Shadow-Test Algorithm for the Simulation of Dry Etching and Topographical Evolution (건식 식각 공정 시뮬레이션을 위한 효율적인 그림자 테스트 알고리즘과 토포그래피 진화에 대한 연구)

  • Kwon, Oh-Seop;Ban, Yong-Chan;Won, Tae-Young
    • Journal of the Korean Institute of Telematics and Electronics D
    • /
    • v.36D no.2
    • /
    • pp.41-47
    • /
    • 1999
  • In this paper, we report 3D simulations of a plasma etching process employing a cell-removal algorithm that takes into account the mask shadow effect as well as spillover errors. The developed simulator has an input interface that accepts not only an analytic form but also a Monte Carlo distribution of the ions. A graphical user interface (GUI) for the UNIX environment was also built into the simulator. To demonstrate the capability of 3D-SURFILER (SURface proFILER), we simulated a typical contact hole structure with 36,000 ($30{\times}40{\times}30$) cells, which takes about 20 minutes and 10 Mbytes of memory on a Sun Ultra SPARC 1. As an exemplary case, we calculated the etch profile during reactive ion etching (RIE) of a contact hole with an aspect ratio of 1.57. Furthermore, we simulated the dependence of a damage parameter and the evolution of the topography as a function of the chamber pressure and the incident ion flux.

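The core of a shadow test in a cell-removal simulator is a visibility march: from a surface cell, step toward the ion source and check whether any filled cell (e.g. mask material) blocks the flux. A simplified sketch of that idea on a sparse grid (the data layout and step scheme are assumptions, not the paper's actual code):

```python
def is_shadowed(filled, cell, step, n_steps):
    """Shadow test for a cell-removal etching simulator.

    filled  : set of (i, j, k) indices of occupied cells (e.g. mask)
    cell    : (i, j, k) surface cell under test
    step    : integer step toward the ion source, e.g. (0, 0, 1)
              for ions arriving vertically
    n_steps : how far to march before leaving the domain
    """
    i, j, k = cell
    for _ in range(n_steps):
        i, j, k = i + step[0], j + step[1], k + step[2]
        if (i, j, k) in filled:
            return True      # flux blocked: cell sits in shadow
    return False             # clear line of sight to the source

# A mask cell at (2, 2, 5) shadows the surface cell directly below it.
mask = {(2, 2, 5)}
print(is_shadowed(mask, (2, 2, 0), (0, 0, 1), 10))   # True
print(is_shadowed(mask, (3, 2, 0), (0, 0, 1), 10))   # False
```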

A Study on Production of iPhone-Based Augmented Reality 3D Fashion Fitting Contents (아이폰 기반의 증강현실 3D 패션피팅 콘텐츠 제작에 관한 연구)

  • Tak, Myung-Ja;Kim, Cheeyong
    • Journal of Korea Multimedia Society
    • /
    • v.16 no.6
    • /
    • pp.708-719
    • /
    • 2013
  • Purchases of clothes at mobile fashion shopping malls have recently picked up pace. One of their biggest weak points is that consumers cannot coordinate clothes before buying. Since individuals increasingly want to coordinate their outfits, a fashion coordination system that satisfies this demand should be developed. As digital clothing technology, which reproduces garments with computer graphics, has taken hold in the fashion industry, consumers' life patterns and their interest in fashion shopping malls are changing, and some consumers are now more interested in shopping on the Internet and smartphones than in offline stores. This study covers the production of iPhone-based augmented reality fashion fitting contents suited to Koreans' body shapes. The system designs and implements a UI (User Interface) for an augmented reality fitting system so that users can check on their smartphones whether fashion items suit them. Using the implemented system, a new fashion shopping method satisfying user convenience is suggested.

Development of Motion Recognition Platform Using Smart-Phone Tracking and Color Communication (스마트 폰 추적 및 색상 통신을 이용한 동작인식 플랫폼 개발)

  • Oh, Byung-Hun
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.17 no.5
    • /
    • pp.143-150
    • /
    • 2017
  • In this paper, we propose a novel motion recognition platform using smart-phone tracking and color communication. The interface requires only a camera and a personal smart-phone, rather than expensive equipment, to provide a motion control interface. The platform recognizes the user's gestures by tracking the 3D distance and rotation angle of the smart-phone, which essentially acts as a motion controller in the user's hand. A color-coded communication method using RGB color combinations is also included in the interface. Users can conveniently send or receive any text data through this function, and the data can be transferred continuously even while the user is performing gestures. We present the implementation of viable contents based on the proposed motion recognition platform.
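The abstract does not disclose the actual coding scheme behind the RGB color communication; one plausible sketch is a small palette where each displayed color carries a fixed number of bits. The 4-color, 2-bits-per-symbol mapping below is purely an illustrative assumption:

```python
# Illustrative palette: each color carries 2 bits, so one byte = 4 colors.
PALETTE = {0b00: (255, 0, 0), 0b01: (0, 255, 0),
           0b10: (0, 0, 255), 0b11: (255, 255, 255)}
REVERSE = {v: k for k, v in PALETTE.items()}

def encode(text):
    """Encode text as a sequence of screen colors, 2 bits per color."""
    colors = []
    for byte in text.encode("utf-8"):
        for shift in (6, 4, 2, 0):            # most significant pair first
            colors.append(PALETTE[(byte >> shift) & 0b11])
    return colors

def decode(colors):
    """Recover the text from a received color sequence."""
    data = bytearray()
    for i in range(0, len(colors), 4):
        byte = 0
        for c in colors[i:i + 4]:
            byte = (byte << 2) | REVERSE[c]
        data.append(byte)
    return data.decode("utf-8")

print(decode(encode("hi")))   # hi
```

A real screen-to-camera channel would also need synchronization markers and tolerance to color distortion, which this sketch omits.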

The Development of a Haptic Interface for Interacting with BIM Elements in Mixed Reality

  • Cho, Jaehong;Kim, Sehun;Kim, Namyoung;Kim, Sungpyo;Park, Chaehyeon;Lim, Jiseon;Kang, Sanghyeok
    • International conference on construction engineering and project management
    • /
    • 2022.06a
    • /
    • pp.1179-1186
    • /
    • 2022
  • Building Information Modeling (BIM) is widely used to efficiently share, utilize, and manage information generated in every phase of a construction project. Recently, mixed reality (MR) technologies have been introduced to utilize BIM elements more effectively. This study deals with haptic interactions between humans and BIM elements in MR to improve BIM usability. As a first step in interacting with virtual objects in mixed reality, we attempted to move a virtual object to a desired location using finger-pointing. This paper presents the development of a haptic interface system in which users can interact with a BIM object to move it to a desired location in MR. The interface system consists of an MR-based head-mounted display (HMD) and a mobile application developed using Unity 3D. This study defined two segments to compute the scale factor and rotation angle of the virtual object to be moved. In a test with a cuboid, users could successfully move it to the desired location. The developed MR-based haptic interface can be used to align BIM elements overlaid on the real world at a construction site.

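The abstract says two segments are used to compute the object's scale factor and rotation angle, but the construction of those segments is not given. A generic sketch under that reading, taking the scale as the length ratio of the segments and the rotation as the signed angle between them (the 2D representation is an assumption):

```python
import math

def scale_and_rotation(seg_a, seg_b):
    """Scale factor and rotation angle relating two segments.

    Each segment is a pair of 2D points ((x0, y0), (x1, y1)).
    Scale is the length ratio |seg_b| / |seg_a|; the angle is the
    signed rotation (radians) from seg_a's direction to seg_b's.
    """
    ax, ay = seg_a[1][0] - seg_a[0][0], seg_a[1][1] - seg_a[0][1]
    bx, by = seg_b[1][0] - seg_b[0][0], seg_b[1][1] - seg_b[0][1]
    scale = math.hypot(bx, by) / math.hypot(ax, ay)
    angle = math.atan2(by, bx) - math.atan2(ay, ax)
    return scale, angle

# A unit segment along x versus a segment of length 2 along y:
# scale 2.0, rotation +90 degrees.
s, a = scale_and_rotation(((0, 0), (1, 0)), ((0, 0), (0, 2)))
```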