• Title/Abstract/Keywords: haptic interaction

Search results: 113 (processing time: 0.023 seconds)

Energy Bounding Algorithm for Stable Haptic Interaction

  • Kim, Jong-Phil;Ryu, Je-Ha
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • ICROS 2003 ICCAS (International Conference on Control, Automation and Systems)
    • /
    • pp.2765-2770
    • /
    • 2003
  • This paper introduces a novel control algorithm, the energy bounding algorithm, for stable haptic interaction. The algorithm restricts the energy generated by the zero-order hold to within the energy consumable by physical damping, the energy-dissipating element in the haptic interface, so the passivity condition is always guaranteed. The virtual coupling algorithm restricts the actuator force with respect to penetration depth and thereby bounds the generated energy; in contrast, the energy bounding algorithm restricts the change of actuator force with respect to time and thereby bounds the energy generated by the zero-order hold. Therefore, much stiffer contact simulation can be implemented with the energy bounding algorithm. Moreover, the energy bounding algorithm is not computationally intensive and its implementation is very simple.

  • PDF
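The core idea, bounding the per-sample change of actuator force so that the energy generated by the zero-order hold stays within what the device's physical damping can dissipate, can be sketched in a few lines. The 2b/T bound below is taken from the classical sampled-data passivity condition rather than from the paper itself, and all names are illustrative:

```python
def bound_force(f_desired, f_prev, dx, b, T):
    """Clamp the per-sample force change so the effective incremental
    stiffness |dF/dx| never exceeds 2*b/T, roughly the rate at which
    physical damping b can dissipate zero-order-hold energy at sample
    period T. Names and the exact bound are illustrative."""
    df = f_desired - f_prev
    df_max = (2.0 * b / T) * abs(dx)   # largest passivity-safe force step
    df = max(-df_max, min(df, df_max))
    return f_prev + df

# A stiff virtual wall commands a 10 N step; with b = 2 N*s/m at 1 kHz
# and 1 mm of motion, the step is clamped to 4 N.
f = bound_force(f_desired=10.0, f_prev=0.0, dx=0.001, b=2.0, T=0.001)
```

Because only the force increment is limited, the steady-state force (and hence the displayed stiffness) can still grow over several samples, which is why the approach permits stiffer walls than limiting the force itself.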

An Efficient Virtual Teeth Modeling for Dental Training System

  • Kim, Lae-Hyun;Park, Se-Hyung
    • International Journal of CAD/CAM
    • /
    • Vol. 8, No. 1
    • /
    • pp.41-44
    • /
    • 2009
  • This paper describes an implementation of virtual teeth modeling for a haptic dental simulation. The system allows dental students to practice dental procedures with realistic tactile feeling. It requires fast and stable haptic rendering and volume modeling techniques that operate on the virtual tooth. In our implementation, a volumetric implicit surface is used for intuitive shape modification without topological constraints and for haptic rendering. The volumetric implicit surface is generated from an input geometric model using a closest point transform algorithm. For visual rendering, we apply an adaptive polygonization method to convert the volumetric teeth model into a geometric model. We improve on our previous system with a new octree design that reduces memory requirements while increasing performance and visual quality.
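The closest point transform step can be illustrated with a brute-force sketch: sample the tooth surface, then store at every grid node the distance to the nearest surface sample. A real system would accelerate the query with a spatial structure such as the octree mentioned above; the names here are illustrative:

```python
import numpy as np

def distance_field(surface_points, grid_points):
    """Unsigned distance from each grid point to the nearest surface
    sample (brute-force closest point transform). A volumetric implicit
    surface is then, e.g., the zero level set of (distance - offset)."""
    surf = np.asarray(surface_points, dtype=float)   # (M, 3) samples
    grid = np.asarray(grid_points, dtype=float)      # (N, 3) nodes
    # Pairwise distances, shape (N, M): fine for small M,
    # use an octree / k-d tree for real tooth meshes.
    d = np.linalg.norm(grid[:, None, :] - surf[None, :, :], axis=2)
    return d.min(axis=1)
```

Carving or filling material then amounts to editing the stored volumetric values, which is why no topological constraints arise.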

A Study on Development of Technology Roadmap of Haptic Interfaces in Games

  • 이성일;김성용
    • IE Interfaces (산업공학)
    • /
    • Vol. 17, No. 2
    • /
    • pp.158-168
    • /
    • 2004
  • A technology roadmap was developed for haptic interface technologies to be applied to games. Haptic interface technologies are expected to play an important role in providing gamers with interaction and immersive perception in the near future, even though they have been studied less than other perception-related technologies with respect to games. Information on two types of haptic interfaces - portable and desktop - and their evolution was analyzed in terms of technological demands. Haptic feedback technologies to realize these demands were inspected over a time frame and derived using a technology tree. The technology roadmap of haptic interfaces in games was finally constructed by mapping the technological demands in time onto game technology trends. The roadmap has implications for developing haptic interfaces for many applications, including virtual reality and games.

Multimodal Interaction on Automultiscopic Content with Mobile Surface Haptics

  • Kim, Jin Ryong;Shin, Seunghyup;Choi, Seungho;Yoo, Yeonwoo
    • ETRI Journal
    • /
    • Vol. 38, No. 6
    • /
    • pp.1085-1094
    • /
    • 2016
  • In this work, we present interactive automultiscopic content with mobile surface haptics for multimodal interaction. Our system consists of a 40-view automultiscopic display and a tablet supporting surface haptics in an immersive room. Animated graphics are projected onto the walls of the room. The 40-view automultiscopic display is placed at the center of the front wall. The haptic tablet is installed at the mobile station to enable the user to interact with the tablet. The 40-view real-time rendering and multiplexing technology is applied by establishing virtual cameras in the convergence layout. Surface haptics rendering is synchronized with three-dimensional (3D) objects on the display for real-time haptic interaction. We conduct an experiment to evaluate user experiences of the proposed system. The results demonstrate that the system's multimodal interaction provides positive user experiences of immersion, control, user interface intuitiveness, and 3D effects.

AI-based Automatic Spine CT Image Segmentation and Haptic Rendering for Spinal Needle Insertion Simulator

  • 박익종;김기훈;최건;정완균
    • The Journal of Korea Robotics Society
    • /
    • Vol. 15, No. 4
    • /
    • pp.316-322
    • /
    • 2020
  • Endoscopic spine surgery is an advanced technique for spinal surgery because it minimizes skin incision, muscle damage, and blood loss compared to open surgery. It requires, however, accurate positioning of the endoscope to avoid spinal nerves and to place the endoscope near the target disk; before the endoscope is inserted, a guide needle is inserted to guide it. The result of the surgery also depends heavily on the surgeon's experience and the patient's CT or MRI images. For training, a number of haptic simulators for spinal needle insertion have been developed, but they remain difficult to use in the medical field because previous studies require manual segmentation of vertebrae from CT images, and the interaction force between the needle and soft tissue has not been modeled carefully. This paper proposes AI-based automatic vertebra CT-image segmentation and a haptic rendering method using the proposed needle-tissue interaction model. For segmentation, a U-net architecture was implemented, achieving 93% pixel accuracy and 88% IoU. A needle-tissue interaction model including puncture force and friction force was implemented for haptic rendering in the proposed spinal needle insertion simulator.
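A needle-tissue interaction model of the kind described, combining a pre-puncture deformation force with post-puncture friction, is commonly written as a piecewise force law. The sketch below only illustrates that structure; the parameters and exact force terms are assumptions, not the paper's identified model:

```python
def needle_axial_force(depth, punctured, k_pre, f_puncture, mu):
    """Piecewise axial force on the needle (illustrative).
    Before puncture, the tissue surface deforms like a spring until
    the puncture threshold is reached; afterwards, friction grows
    with the inserted length. Returns (force, punctured)."""
    if not punctured:
        f = k_pre * depth            # membrane deformation phase
        if f >= f_puncture:          # membrane ruptures
            return f_puncture, True
        return f, False
    return mu * depth, True          # friction along the needle shaft
```

In a simulator the state flag would be carried across haptic frames, and the sudden drop of resistance at puncture is precisely the cue trainees must learn to feel.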

Nonlinear Virtual Coupling for Stable Haptic Interaction

  • 이문환;이두용
    • Journal of Institute of Control, Robotics and Systems
    • /
    • Vol. 9, No. 8
    • /
    • pp.610-615
    • /
    • 2003
  • This paper proposes a nonlinear virtual coupling for haptic interfaces that offers better performance while maintaining system stability. The nonlinear virtual coupling is designed based on a human response model, which exploits the delay between human intention and the actual change of arm impedance. Compared with the traditional passivity condition, the proposed approach provides less conservative constraints for the design of a stable haptic interface, allowing increased performance, which is verified through experiments.
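A virtual coupling places a virtual spring-damper between the haptic device and the virtual tool. The sketch below shows one way a nonlinear, time-scheduled coupling could exploit a human reaction delay by permitting stiffer gains before the arm impedance has changed; the gain schedule is an assumption for illustration, not the paper's human response model:

```python
def coupling_force(x_device, x_tool, v_device, v_tool,
                   k_low, k_high, t_since_contact, t_react):
    """Spring-damper virtual coupling whose stiffness is scheduled on
    time since contact: during the assumed human reaction delay
    t_react the arm impedance has not yet changed, so a stiffer
    transient is tolerated, after which the gain falls back to a
    conservative value. (Illustrative schedule, not the paper's.)"""
    k = k_high if t_since_contact < t_react else k_low
    b = 0.1 * k                      # damping tied to stiffness (assumption)
    return k * (x_tool - x_device) + b * (v_tool - v_device)
```

The design point is that the coupling law, not the virtual environment, is what the stability analysis constrains, so relaxing it where the human model allows directly buys rendering performance.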

Real-time Simulation Technique for Visual-Haptic Interaction between SPH-based Fluid Media and Soluble Solids

  • 김석열;박진아
    • Journal of the Korean Society of Visualization
    • /
    • Vol. 15, No. 1
    • /
    • pp.32-40
    • /
    • 2017
  • Interaction between fluid and a rigid object is frequently observed in everyday life. However, it is difficult to simulate their interaction because the medium and the object have different representations. A particularly challenging issue is handling deformation of the object visually while also rendering haptic feedback. In this paper, we propose a real-time simulation technique for multimodal interaction between particle-based fluids and soluble solids. We have developed a dissolution behavior model for solids, discretized using the idea of smoothed particle hydrodynamics, and the changes in physical properties accompanying dissolution are immediately reflected in the object. The user can intervene in the simulation at any time by manipulating the solid object, with both visual and haptic feedback delivered on the fly. For immersive visualization, we also adopt a screen-space fluid rendering technique that balances realism and performance.
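The dissolution behavior can be pictured as per-step mass transfer from solid particles to the fluid particles inside their SPH smoothing radius. The rate law and data layout below are assumptions for illustration, not the paper's model:

```python
import math

def dissolve_step(solid, fluid, h, rate, dt):
    """One dissolution step (illustrative): each solid particle loses
    mass in proportion to the number of fluid particles inside its
    smoothing radius h, and the lost mass is redistributed to those
    neighbors so total mass is conserved.
    Particles are dicts {'pos': (x, y, z), 'mass': m}."""
    for s in solid:
        near = [f for f in fluid if math.dist(s['pos'], f['pos']) < h]
        if not near:
            continue                      # dry solid: nothing dissolves
        dm = min(s['mass'], rate * len(near) * dt)
        s['mass'] -= dm
        for f in near:                    # conserve total mass
            f['mass'] += dm / len(near)
```

Because the solid is itself particle-discretized, the same update that shrinks particle masses also drives the visible shape change and the weakening haptic response.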

Hand Haptic Interface for Intuitive 3D Interaction

  • 장용석;김용완;손욱호;김경환
    • Journal of the HCI Society of Korea
    • /
    • Vol. 2, No. 2
    • /
    • pp.53-59
    • /
    • 2007
  • Research on 3D interaction for 3D/virtual-environment applications has been defined in terms of four basic interaction types - navigation, selection, manipulation, and system control - and has been studied extensively, and these techniques have generally been considered adequate for interacting with even small objects in the real world or in virtual environments. However, such non-intuitive methods, in which the user manipulates objects indirectly through tools or devices, are not suitable for the high-quality, high-precision applications recently demanded by industry, such as virtual training and virtual design/usability evaluation systems, which require intuitive interaction in which users touch and manipulate objects directly with their own hands. This study therefore presents a hand haptic interface for intuitive and natural interaction, consisting of a glove-type hand interface device for high-precision hand manipulation, a haptic glove providing realistic force and tactile feedback, and a 6-DOF haptic device.

  • PDF

A Study on Virtual Assembly Simulation Using Virtual Reality Technology

  • 김용완;박진아
    • Journal of Korea Multimedia Society
    • /
    • Vol. 13, No. 11
    • /
    • pp.1715-1727
    • /
    • 2010
  • The human hand is the most direct and natural interface for interacting with real-world objects, but due to the high degrees of freedom of the hand's joint structure and the limitations of related interface devices, it has not been actively adopted in virtual reality applications. In particular, virtual assembly simulation, a process for verifying digital mock-ups during product development, is one of the more challenging topics among VR applications. Difficulties such as grasping virtual objects and the inability to manipulate them continuously and precisely are recognized as major barriers to applying hand haptic interaction to VR applications such as virtual assembly. In this paper, we analyze hand haptic interaction from two perspectives and propose a method that carries out a stepwise procedure for each stage: robustness, adjusting the balance of finger forces according to the object's shape so that the object does not slip out of the hand when grasped, and precision, enabling fine manipulation of the object within the hand that reflects the user's intent once it is grasped. The proposed algorithm secures a robust grasp through a grasp-quality measure to facilitate the initial grasp, and performs physics-based simulation to achieve precise manipulability. To evaluate the efficiency of the proposed method, experiments were conducted in different display environments (monoscopic and stereoscopic displays), and a two-way ANOVA test showed that the proposed algorithm, divided into a pre-contact grasp phase and a post-contact manipulation phase, satisfies both of the aforementioned criteria. Finally, we present a practical application of the proposed interaction method to virtual assembly simulation with complex graphic models.
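The grasp-quality idea in the pre-contact phase can be illustrated with a very simple balance measure over fingertip contact normals; this is a generic sketch, not the quality measure used in the paper:

```python
import numpy as np

def grasp_balance(contact_normals):
    """Residual of the summed unit contact normals (illustrative).
    A value of 0 means the contact forces can cancel, a necessary
    condition for a stable grasp; large values mean the fingers push
    the object out of the hand."""
    n = np.asarray(contact_normals, dtype=float)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)  # unit normals
    return float(np.linalg.norm(n.sum(axis=0)))

# Two opposing fingertip contacts balance; two parallel ones do not.
```

A full grasp-quality measure would also weigh friction cones and torque balance, but even this residual shows how a grasp can be scored before physics-based manipulation takes over.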

Telepresence Robotic Technology for Individuals with Visual Impairments Through Real-time Haptic Rendering

  • Park, Chung Hyuk;Howard, Ayanna
    • The Journal of Korea Robotics Society
    • /
    • Vol. 8, No. 3
    • /
    • pp.197-205
    • /
    • 2013
  • This paper presents a robotic system that provides telepresence to the visually impaired by combining real-time haptic rendering with multimodal interaction. A virtual-proxy-based haptic rendering process using an RGB-D sensor is developed and integrated into a unified framework for control of and feedback from the telepresence robot. We discuss the challenging problem of presenting environmental perception to a user with visual impairments and our solution based on multimodal interaction. We also explain the experimental design, protocols, and results with human subjects with and without visual impairments. A discussion of the system's performance and our future goals concludes the paper.
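The virtual-proxy rendering step over RGB-D data can be sketched as follows: snap a proxy to the nearest sensed surface point and render a spring force toward it while the device is in contact. This is a highly simplified stand-in for a true proxy constraint (which would keep the proxy sliding on the visible surface), and the nearest-neighbor search and gain are illustrative:

```python
import numpy as np

def proxy_force(device_pos, cloud, k, contact_dist):
    """Virtual-proxy haptic force over an RGB-D point cloud (sketch).
    The proxy snaps to the nearest sensed point; if the device is
    within contact range, a spring pulls it toward the proxy,
    letting the user 'feel' remote geometry."""
    cloud = np.asarray(cloud, dtype=float)        # (N, 3) sensed points
    p = np.asarray(device_pos, dtype=float)
    i = np.argmin(np.linalg.norm(cloud - p, axis=1))
    proxy = cloud[i]
    d = np.linalg.norm(proxy - p)
    if d > contact_dist:                          # free space: no force
        return np.zeros(3)
    return k * (proxy - p)                        # spring toward the proxy
```

In the real system this loop must run at haptic rates (about 1 kHz) while the point cloud refreshes at camera rate, which is what makes the rendering "real-time" a challenge.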