• Title/Summary/Keyword: haptic interaction


Energy Bounding Algorithm for Stable Haptic Interaction

  • Kim, Jong-Phil; Ryu, Je-Ha
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2003.10a / pp.2765-2770 / 2003
  • This paper introduces a novel control algorithm, the energy bounding algorithm, for stable haptic interaction. The algorithm restricts the energy generated by the zero-order hold to within the energy that can be consumed by physical damping, the energy-dissipating element in the haptic interface, so the passivity condition is always guaranteed. Whereas the virtual coupling algorithm restricts the actuator force with respect to penetration depth and thereby limits the generated energy, the energy bounding algorithm restricts the change of actuator force over time and thereby limits the energy generated by the zero-order hold. As a result, much stiffer contact can be simulated. Moreover, the energy bounding algorithm is not computationally intensive and is very simple to implement.
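The force-rate restriction the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation; the dissipation-based bound `(2b/T)·|Δx|` is an assumed simplification used only to show the idea of clamping the force *change* rather than the force magnitude:

```python
def energy_bounded_force(f_desired, f_prev, x, x_prev, b, T):
    """Clamp the per-sample change in actuator force so that the energy
    generated by the zero-order hold stays within what the device's
    physical damping b can dissipate over one sample period T
    (illustrative bound, not the published condition verbatim)."""
    dx = abs(x - x_prev)
    max_df = (2.0 * b / T) * dx          # assumed dissipation-based bound
    df = f_desired - f_prev
    df = max(-max_df, min(max_df, df))   # restrict force *rate*, not magnitude
    return f_prev + df
```

Because only the force rate is limited, a stiff virtual wall can still build up a large steady-state force over several samples, which is the contrast with virtual coupling the abstract draws.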


An Efficient Virtual Teeth Modeling for Dental Training System

  • Kim, Lae-Hyun; Park, Se-Hyung
    • International Journal of CAD/CAM / v.8 no.1 / pp.41-44 / 2009
  • This paper describes an implementation of virtual teeth modeling for a haptic dental simulation. The system allows dental students to practice dental procedures with realistic tactile sensation, which requires fast, stable haptic rendering and volume-modeling techniques operating on the virtual tooth. In our implementation, a volumetric implicit surface is used for intuitive shape modification without topological constraints and for haptic rendering. The volumetric implicit surface is generated from an input geometric model using a closest point transformation algorithm, and for visual rendering an adaptive polygonization method converts the volumetric tooth model back to a geometric model. We improve on our previous system with a new octree design that reduces memory requirements while increasing performance and visual quality.
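The closest point transformation the abstract mentions can be sketched in its simplest brute-force form: for each volume sample, take the distance to the nearest point sampled on the input surface, yielding a volumetric (unsigned) distance field whose zero level set lies on the surface. This is an illustrative sketch, not the paper's optimized implementation:

```python
import numpy as np

def closest_point_transform(grid_points, surface_points):
    """Brute-force closest point transform.

    grid_points:    (N, 3) array of volume sample positions
    surface_points: (M, 3) array of points sampled on the input mesh
    Returns an (N,) array of distances to the nearest surface sample,
    usable as a volumetric implicit representation of the surface."""
    diff = grid_points[:, None, :] - surface_points[None, :, :]
    dist = np.linalg.norm(diff, axis=2)  # (N, M) pairwise distances
    return dist.min(axis=1)              # nearest-surface distance per sample
```

A production system would use a spatial structure (e.g. the octree the abstract mentions) instead of the O(N·M) pairwise computation shown here.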

A Study on Development of Technology Roadmap of Haptic Interfaces in Games (게임용 Haptic 인터페이스 기술 로드맵 개발에 관한 연구)

  • Lee, Seon-Gil; Kim, Sung-Yong
    • IE interfaces / v.17 no.2 / pp.158-168 / 2004
  • A technology roadmap was developed for haptic interface technologies to be applied to games. Although haptic interfaces have been studied less than other perception-related game technologies, they are expected to play an important role in providing gamers with interaction and immersive perception in the near future. Information on two types of haptic interfaces, portable and desktop, and their evolution was analyzed in terms of technological demands. Haptic feedback technologies to realize these demands were examined over a time frame and derived using a technology tree. The technology roadmap of haptic interfaces in games was then constructed by mapping the technological demands over time against game technology trends. The roadmap has implications for developing haptic interfaces for many applications, including virtual reality and games.

Multimodal Interaction on Automultiscopic Content with Mobile Surface Haptics

  • Kim, Jin Ryong; Shin, Seunghyup; Choi, Seungho; Yoo, Yeonwoo
    • ETRI Journal / v.38 no.6 / pp.1085-1094 / 2016
  • In this work, we present interactive automultiscopic content with mobile surface haptics for multimodal interaction. Our system consists of a 40-view automultiscopic display and a tablet supporting surface haptics in an immersive room. Animated graphics are projected onto the walls of the room. The 40-view automultiscopic display is placed at the center of the front wall. The haptic tablet is installed at the mobile station to enable the user to interact with the tablet. The 40-view real-time rendering and multiplexing technology is applied by establishing virtual cameras in the convergence layout. Surface haptics rendering is synchronized with three-dimensional (3D) objects on the display for real-time haptic interaction. We conduct an experiment to evaluate user experiences of the proposed system. The results demonstrate that the system's multimodal interaction provides positive user experiences of immersion, control, user interface intuitiveness, and 3D effects.

AI-based Automatic Spine CT Image Segmentation and Haptic Rendering for Spinal Needle Insertion Simulator (척추 바늘 삽입술 시뮬레이터 개발을 위한 인공지능 기반 척추 CT 이미지 자동분할 및 햅틱 렌더링)

  • Park, Ikjong; Kim, Keehoon; Choi, Gun; Chung, Wan Kyun
    • The Journal of Korea Robotics Society / v.15 no.4 / pp.316-322 / 2020
  • Endoscopic spine surgery is an advanced surgical technique because it minimizes skin incision, muscle damage, and blood loss compared to open surgery. It requires, however, accurate positioning of the endoscope to avoid the spinal nerves and to place it near the target disk; before insertion, a guide needle is inserted to guide it. The outcome also depends heavily on the surgeon's experience and the patient's CT or MRI images. For training, a number of haptic simulators for spinal needle insertion have therefore been developed, but they remain difficult to use in practice because previous studies require manual segmentation of vertebrae from CT images and have not carefully modeled the interaction force between the needle and soft tissue. This paper proposes AI-based automatic vertebra CT-image segmentation and a haptic rendering method using the proposed needle-tissue interaction model. For segmentation, a U-net structure was implemented, achieving 93% pixel accuracy and 88% IoU. The needle-tissue interaction model, including puncture force and friction force, was implemented for haptic rendering in the proposed spinal needle insertion simulator.
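A needle-tissue interaction model of the kind the abstract names (puncture force plus friction force) can be sketched minimally as follows. The two-phase form, the function name, and all parameter values are illustrative assumptions, not the paper's identified model:

```python
def needle_axial_force(depth, punctured, k_tissue, f_puncture, mu, insertion_speed):
    """Hypothetical axial needle force for a single tissue layer.

    Pre-puncture: the tissue surface resists elastically (stiffness
    k_tissue) up to a puncture threshold f_puncture.
    Post-puncture: a friction term acts along the inserted shaft,
    here scaled by inserted depth and insertion speed (illustrative)."""
    if not punctured:
        return min(k_tissue * depth, f_puncture)
    return mu * depth * insertion_speed
```

A full simulator would sum such terms over multiple tissue layers (skin, muscle, ligament, bone contact) and switch phase when the threshold is exceeded.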

Nonlinear Virtual Coupling for Stable Haptic Interaction (안정된 햅틱 인터페이스를 위한 비선형가상커플링)

  • 이문환; 이두용
    • Journal of Institute of Control, Robotics and Systems / v.9 no.8 / pp.610-615 / 2003
  • This paper proposes a nonlinear virtual coupling for haptic interfaces that offers better performance while maintaining system stability. The nonlinear virtual coupling is designed from a human response model that exploits the delay between human intention and the actual change of arm impedance. The proposed approach yields less conservative constraints for the design of a stable haptic interface than the traditional passivity condition, allowing increased performance, which is verified through experiments.
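For reference, a virtual coupling is a spring-damper inserted between the haptic device and its virtual proxy, and a "nonlinear" variant shapes or limits that coupling force. The sketch below uses simple output saturation as a stand-in nonlinearity; the paper's actual design, based on a human response model, is not reproduced here:

```python
def nonlinear_virtual_coupling(x_device, x_proxy, v_device, k, b, f_max):
    """Spring-damper coupling between device and virtual proxy with a
    saturation nonlinearity on the output force (illustrative stand-in
    for a human-response-based nonlinear design)."""
    f = k * (x_proxy - x_device) - b * v_device  # linear coupling force
    return max(-f_max, min(f_max, f))            # nonlinear saturation
```

The linear part alone is the classical virtual coupling; the saturation is one simple way a nonlinearity can trade off rendered stiffness against the energy the coupling can inject.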

Real-time Simulation Technique for Visual-Haptic Interaction between SPH-based Fluid Media and Soluble Solids (SPH 기반의 유체 및 용해성 강체에 대한 시각-촉각 융합 상호작용 시뮬레이션)

  • Kim, Seokyeol; Park, Jinah
    • Journal of the Korean Society of Visualization / v.15 no.1 / pp.32-40 / 2017
  • Interaction between fluid and a rigid object is frequently observed in everyday life, but it is difficult to simulate because the medium and the object have different representations. Handling deformation of the object visually while also rendering haptic feedback is especially challenging. In this paper, we propose a real-time simulation technique for multimodal interaction between particle-based fluids and soluble solids. We develop a dissolution behavior model for solids, discretized following the idea of smoothed particle hydrodynamics, and the changes in physical properties accompanying dissolution are immediately reflected in the object. The user can intervene in the simulation at any time by manipulating the solid object, with both visual and haptic feedback delivered on the fly. For immersive visualization, we also adopt a screen-space fluid rendering technique that balances realism and performance.
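The core of a particle-based dissolution model like the one described is a per-step mass update on solid particles in contact with fluid. The sketch below uses a fixed dissolution rate for clarity; a faithful model would scale the rate by local fluid properties (e.g. concentration or relative velocity), and all names here are illustrative:

```python
def dissolve_step(solid_particle_masses, in_contact, rate, dt):
    """One dissolution step: solid particles in contact with fluid lose
    mass at a fixed rate (illustrative); fully dissolved particles are
    removed, which is how the solid's shape and physical properties
    change over time."""
    remaining = []
    for m, contact in zip(solid_particle_masses, in_contact):
        if contact:
            m -= rate * dt          # mass transferred into the fluid
        if m > 0.0:
            remaining.append(m)     # drop fully dissolved particles
    return remaining
```

The removed mass would, in a full SPH simulation, be handed to the neighboring fluid particles so that total mass is conserved.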

Hand Haptic Interface for Intuitive 3D Interaction (직관적인 3D 인터랙션을 위한 핸드 햅틱 인터페이스)

  • Jang, Yong-Seok; Kim, Yong-Wan; Son, Wook-Ho; Kim, Kyung-Hwan
    • Journal of the HCI Society of Korea / v.2 no.2 / pp.53-59 / 2007
  • Research in 3D interaction has identified and extensively studied four basic interaction tasks for 3D/VE applications: navigation, selection, manipulation, and system control. These interaction schemes, in the real world or in a VE, are generally suited to interacting with small graspable objects. In some applications it is important to duplicate real-world behavior; for example, a training system for manual assembly tasks or a usability verification system benefits from realistic object grasping and manipulation. However, existing interaction technologies cannot simply be applied to such applications, because the quality of simulated grasping and manipulation has been limited. We therefore introduce an intuitive and natural 3D interaction haptic interface supporting high-precision hand operations and realistic haptic feedback.


A Study on Virtual Assembly Simulation Using Virtual Reality Technology (가상현실 기술을 이용한 가상 조립 시뮬레이션에 대한 연구)

  • Kim, Yong-Wan; Park, Jin-Ah
    • Journal of Korea Multimedia Society / v.13 no.11 / pp.1715-1727 / 2010
  • Although hand haptic interaction, which provides direct and natural sensation, is the most natural way of interacting with a VR environment, it is still limited by the complexity of the articulated hand and by hardware capabilities. Virtual assembly simulation, the verification of a digital mockup in the product development lifecycle, is one of the most challenging virtual reality applications, and hand haptic interaction is considered a major obstacle because difficulty in initial grasping and non-dexterous manipulation remain unsolved problems. In this paper, we propose that common hand haptic interaction involves two separate stages with different requirements, and we present a hand haptic interaction method that stably grasps a virtual object at initial grasping and delicately manipulates it during task operation according to the user's intention. The proposed method achieves robustness using a grasping-quality measure and dexterous manipulation using physically based simulation. We conducted experiments to evaluate its effectiveness under two display environments, monoscopic and stereoscopic; a two-way ANOVA shows that the proposed method satisfies both aspects of hand haptic interaction. Finally, we demonstrate an actual application to assembly simulation of relatively complex models.

Telepresence Robotic Technology for Individuals with Visual Impairments Through Real-time Haptic Rendering (실시간 햅틱 렌더링 기술을 통한 시각 장애인을 위한 원격현장감(Telepresence) 로봇 기술)

  • Park, Chung Hyuk; Howard, Ayanna M.
    • The Journal of Korea Robotics Society / v.8 no.3 / pp.197-205 / 2013
  • This paper presents a robotic system that provides telepresence to the visually impaired by combining real-time haptic rendering with multi-modal interaction. A virtual-proxy-based haptic rendering process using an RGB-D sensor is developed and integrated into a unified framework for control of and feedback from the telepresence robot. We discuss the challenging problem of presenting environmental perception to a user with visual impairments and our solution based on multi-modal interaction. We also describe the experimental design, protocols, and results with human subjects with and without visual impairments, and conclude with a discussion of system performance and future goals.
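The virtual-proxy idea underlying such RGB-D haptic rendering can be sketched for the simplest case, a height-field surface such as a single depth-map sample: the proxy stays on the surface while the device point may penetrate, and the feedback force is a spring pulling the device back toward the proxy. This is a generic illustration of the proxy method, not the paper's pipeline:

```python
def update_proxy_and_force(device_pos, surface_z, k):
    """Minimal virtual-proxy step against a height-field surface (e.g.
    one sample of a depth map from an RGB-D sensor). If the device
    point penetrates below the surface, the proxy is projected back
    onto it and a spring force proportional to the penetration is
    returned (illustrative)."""
    x, y, z = device_pos
    if z >= surface_z:                      # free space: proxy tracks device
        return (x, y, z), (0.0, 0.0, 0.0)
    proxy = (x, y, surface_z)               # project proxy onto the surface
    force = (0.0, 0.0, k * (surface_z - z)) # spring toward the proxy
    return proxy, force
```

A full implementation would track the proxy across neighboring depth samples so it slides along the sensed surface instead of snapping vertically.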