• Title/Summary/Keyword: Haptic Rendering

Telepresence Robotic Technology for Individuals with Visual Impairments Through Real-time Haptic Rendering (실시간 햅틱 렌더링 기술을 통한 시각 장애인을 위한 원격현장감(Telepresence) 로봇 기술)

  • Park, Chung Hyuk;Howard, Ayanna M.
    • The Journal of Korea Robotics Society
    • /
    • v.8 no.3
    • /
    • pp.197-205
    • /
    • 2013
  • This paper presents a robotic system that provides telepresence to the visually impaired by combining real-time haptic rendering with multi-modal interaction. A virtual-proxy based haptic rendering process using an RGB-D sensor is developed and integrated into a unified control-and-feedback framework for the telepresence robot. We discuss the challenging problem of presenting environmental perception to a user with visual impairments and our solution for multi-modal interaction. We also explain the experimental design, protocols, and results of studies with human subjects with and without visual impairments. A discussion of the system's performance and our future goals is presented toward the end.
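
The abstract does not give implementation details, but the core of virtual-proxy rendering against RGB-D data can be illustrated with a small sketch. The snippet below is a simplification under stated assumptions (each depth point is treated as a small sphere of radius POINT_RADIUS, and STIFFNESS is an illustrative gain); it is not the paper's actual pipeline.

```python
# Minimal sketch of virtual-proxy haptic rendering against an RGB-D point cloud.
# Simplifying assumption: each depth point is a small sphere and the proxy is pushed
# to its surface; real systems use fuller surface reconstruction and constraint solving.
import numpy as np

STIFFNESS = 500.0      # N/m, illustrative value
POINT_RADIUS = 0.01    # m, effective radius of each depth point

def proxy_force(device_pos, point_cloud):
    """Return (force, proxy_pos) for the current haptic device position."""
    diffs = point_cloud - device_pos            # vectors from device to points
    dists = np.linalg.norm(diffs, axis=1)
    i = np.argmin(dists)                        # nearest depth point
    if dists[i] >= POINT_RADIUS:                # no contact: proxy follows device
        return np.zeros(3), device_pos.copy()
    normal = -diffs[i] / (dists[i] + 1e-9)      # push device away from the point
    proxy = point_cloud[i] + POINT_RADIUS * normal
    force = STIFFNESS * (proxy - device_pos)    # spring between proxy and device
    return force, proxy
```

In a full system this force computation would run in the high-rate haptic loop while the RGB-D point cloud is refreshed at camera rate.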

Multimodal Curvature Discrimination of 3D Objects

  • Kim, Kwang-Taek;Lee, Hyuk-Soo
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.14 no.4
    • /
    • pp.212-216
    • /
    • 2013
  • As virtual reality technologies advance rapidly, how to render 3D objects across modalities is becoming an important issue. This study therefore investigates human discriminability of the curvature of 3D polygonal surfaces, focusing on vision and touch because they are the dominant senses when exploring 3D shapes. For the study, we designed a psychophysical experiment using signal detection theory to determine curvature discrimination under three conditions: haptic only, visual only, and combined haptic and visual. The results show no statistically significant difference among the conditions, although the threshold in the haptic condition is the lowest. The results also indicate that rendering through both visual and haptic channels can degrade discrimination of a 3D global shape. These findings should be considered when a multimodal rendering system is designed in the near future.
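
The abstract reports a signal-detection-theory analysis without formulas; the sketch below shows one common way to compute the sensitivity index d' from trial counts. The function name and example counts are hypothetical, and the 0.5/1.0 correction is the standard log-linear adjustment for extreme rates, not necessarily the one used in the paper.

```python
# Sketch of a signal-detection-theory analysis for a curvature discrimination task.
# Assumed setup: each trial presents a "signal" (larger curvature) or "noise"
# (reference curvature) stimulus and the participant answers "different"/"same".
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity index d' with a simple correction for 0/1 rates."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = (hits + 0.5) / (n_signal + 1.0)       # log-linear correction
    fa_rate = (false_alarms + 0.5) / (n_noise + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: compare sensitivity across haptic-only, visual-only, and bimodal blocks.
print(d_prime(hits=38, misses=12, false_alarms=9, correct_rejections=41))
```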

Remote Control of a Mobile Robot using Haptic Device (촉각 정보를 이용한 이동로봇의 원격제어)

  • 권용태;강희준;노영식
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2004.10a
    • /
    • pp.737-741
    • /
    • 2004
  • A mobile robot system is developed that is remotely controlled by a haptic master, the PHANTOM. The mobile robot has four ultrasonic sensors and a single CCD camera, which measure the distances from the robot to obstacles in the environment and send this information to the haptic master. For more convenient remote control, a haptic rendering process generates force cues such as viscosity and obstacle-avoidance forces. To show the effectiveness of the developed system, the mobile robot is driven through a maze and the completion time is measured with and without haptic information. Through these repeated experiments, haptic information proves to be useful for the remote control of a mobile robot.
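
As a rough illustration of the two force cues mentioned above (viscosity and obstacle avoidance), the sketch below combines a velocity-opposing damping term with repulsive forces derived from four range readings. The sensor directions, gains, and ranges are assumptions for illustration, not values from the paper.

```python
# Sketch of viscosity plus obstacle-avoidance force feedback for a teleoperated robot,
# assuming four ultrasonic range readings (metres) and a 3-DOF force output.
import numpy as np

B_VISCOUS = 2.0     # N*s/m, damping gain for the viscosity cue (assumed)
K_OBSTACLE = 1.5    # N, peak repulsive force per sensor (assumed)
D_MAX = 0.5         # m, range beyond which an obstacle exerts no force (assumed)

# Unit vectors (in the master's x-y plane) pointing from the robot toward each sensor
SENSOR_DIRS = {"front": np.array([0.0, 1.0, 0.0]),
               "back":  np.array([0.0, -1.0, 0.0]),
               "left":  np.array([-1.0, 0.0, 0.0]),
               "right": np.array([1.0, 0.0, 0.0])}

def feedback_force(master_velocity, ranges):
    """Combine a velocity-opposing viscous force with obstacle-avoidance forces."""
    force = -B_VISCOUS * master_velocity            # resist fast stylus motion
    for name, d in ranges.items():
        if d < D_MAX:                               # closer obstacle -> larger push away
            force += -SENSOR_DIRS[name] * K_OBSTACLE * (1.0 - d / D_MAX)
    return force
```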

Haptic Rendering Algorithm for Collision Situation of Two Objects (두 객체가 충돌하는 상황에서의 햅틱 렌더링 알고리즘)

  • Kim, Seonkyu;Kim, Hyebin;Ryu, Chul
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.18 no.3
    • /
    • pp.35-41
    • /
    • 2018
  • In this paper, we define a haptic rendering algorithm for situations in which a moving object collides with a static object. We classified video scenes into four categories that are commonly observed in video sequences. The proposed algorithm detects which frames are suitable for haptic rendering by detecting changes of direction using motion estimation and changes of shape using object tracking. As a result, a total of 13 frames were extracted from the sample video and their playback times were calculated. We confirmed that the haptic effect appears at the expected playback times by adding an appropriate haptic waveform through a haptic editing program.
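
The abstract describes frame selection from direction changes (motion estimation) and shape changes (object tracking); the sketch below shows that selection logic applied to precomputed per-frame motion vectors and bounding-box areas. The thresholds and input format are assumptions; the paper's own motion estimation and tracking steps are not reproduced here.

```python
# Sketch of selecting haptic-trigger frames from per-frame motion vectors and
# tracked object sizes (both assumed to be precomputed for the moving object).
import numpy as np

def candidate_frames(motion_vecs, box_areas, angle_thresh_deg=90.0, area_ratio=0.2):
    """Return frame indices where motion direction reverses or object shape changes."""
    frames = []
    for t in range(1, len(motion_vecs)):
        v0, v1 = motion_vecs[t - 1], motion_vecs[t]
        if np.linalg.norm(v0) > 0 and np.linalg.norm(v1) > 0:
            cos = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1))
            if np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))) > angle_thresh_deg:
                frames.append(t)                    # abrupt change of direction
                continue
        if abs(box_areas[t] - box_areas[t - 1]) > area_ratio * box_areas[t - 1]:
            frames.append(t)                        # abrupt change of tracked shape
    return frames
```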

An Efficient Virtual Teeth Modeling for Dental Training System

  • Kim, Lae-Hyun;Park, Se-Hyung
    • International Journal of CAD/CAM
    • /
    • v.8 no.1
    • /
    • pp.41-44
    • /
    • 2009
  • This paper describes an implementation of virtual teeth modeling for a haptic dental simulation. The system allows dental students to practice dental procedures with realistic tactile sensations, which requires fast, stable haptic rendering and volume modeling techniques for the virtual tooth. In our implementation, a volumetric implicit surface is used both for intuitive shape modification without topological constraints and for haptic rendering. The volumetric implicit surface is generated from the input geometric model using a closest point transformation algorithm. For visual rendering, we apply an adaptive polygonization method to convert the volumetric tooth model into a geometric model. We improve on our previous system with a new octree design that reduces memory requirements while increasing performance and visual quality.
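
As a rough sketch of the closest-point-transform idea, the code below builds a signed distance field on a regular grid from a point-sampled surface with normals. This is a brute-force simplification (the paper works on the full geometric model and uses an adaptive octree), and the grid resolution and sign convention are assumptions.

```python
# Rough sketch of building a volumetric signed distance field from a point-sampled
# surface with outward normals, in the spirit of a closest point transform.
import numpy as np

def distance_field(surface_pts, surface_normals, grid_min, grid_max, res):
    """Signed distance on a res^3 grid: negative inside, positive outside."""
    axes = [np.linspace(grid_min[i], grid_max[i], res) for i in range(3)]
    gx, gy, gz = np.meshgrid(*axes, indexing="ij")
    grid = np.stack([gx, gy, gz], axis=-1).reshape(-1, 3)
    field = np.empty(len(grid))
    for k, p in enumerate(grid):
        d = surface_pts - p
        i = np.argmin(np.linalg.norm(d, axis=1))            # closest surface sample
        dist = np.linalg.norm(d[i])
        sign = -1.0 if np.dot(surface_normals[i], p - surface_pts[i]) < 0 else 1.0
        field[k] = sign * dist
    return field.reshape(res, res, res)
```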

Interaction Metaphors for Modeling Virtual Hair using Haptic Interfaces

  • Bonanni, Ugo;Kmoch, Petr;Magnenat-Thalmann, Nadia
    • International Journal of CAD/CAM
    • /
    • v.9 no.1
    • /
    • pp.93-102
    • /
    • 2010
  • Shaping realistic hairstyles for digital characters is a difficult, long, and tedious task. The lack of appropriate interaction metaphors enabling efficient, simple, yet accurate hair modeling further aggravates the situation. This paper presents 3D interaction metaphors for modeling virtual hair using haptic interfaces. We discuss user tasks, ergonomic aspects, and haptics-based styling and fine-tuning tools on an experimental prototype. To achieve haptic rates faster than the hair simulation and obtain transparent rendering, we adapt our simulation models to the specific requirements of haptic hairstyling actions and decouple the simulation of hair strand dynamics from the haptic rendering, while relying on the same physicochemical hair constants. Besides the direct use of the discussed interaction metaphors in 3D modeling, the presented results have further application potential in hair modeling tools for the entertainment industry and the cosmetic sciences.
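
The decoupling described above, a fast haptic loop reading an intermediate representation that a slower strand simulation keeps up to date, can be sketched with two threads sharing a contact state. The rates, the dummy strand motion, and the spring coupling below are placeholders, not the authors' models.

```python
# Sketch of decoupling a slow hair-dynamics update from a fast haptic loop via a
# shared intermediate representation (here just a nearest strand point and a stiffness).
import threading, time
import numpy as np

shared = {"contact_point": np.zeros(3), "stiffness": 80.0}   # intermediate representation
lock = threading.Lock()

def hair_simulation_loop(duration=2.0):          # slow (~60 Hz) strand dynamics, dummy motion
    t = 0.0
    while t < duration:
        with lock:
            shared["contact_point"] = np.array([0.05 * np.sin(t), 0.0, 0.0])
        t += 1.0 / 60.0
        time.sleep(1.0 / 60.0)

def haptic_loop(device_pos, steps=2000):         # fast force update, independent of the above
    force = np.zeros(3)
    for _ in range(steps):
        with lock:
            p, k = shared["contact_point"].copy(), shared["stiffness"]
        force = k * (p - device_pos)             # spring toward the latest strand state
        time.sleep(0.001)                        # stand-in for sending the force to the device
    return force

threading.Thread(target=hair_simulation_loop).start()
print(haptic_loop(np.array([0.0, 0.0, 0.01])))
```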

Haptic Collaboration System over High Resolution Tiled-display with QoE (QoE 지원 Tiled-display 기반 촉감 협업 시스템)

  • Son, Seok-Ho;Lee, Seok-Hee;Kim, Jong-Won
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2008.02a
    • /
    • pp.65-71
    • /
    • 2008
  • This paper proposes a structure for a haptic collaboration system over a high-resolution tiled display, together with a QoE (quality of experience) enhancement scheme for the integrated system. Both the haptic system and the tiled-display system demand substantial computational power: a haptic device becomes unstable if the haptic rendering rate falls below 1 kHz, while the tiled display requires an update rate of 30 frames per second. If the two systems are used independently, each requirement can be satisfied. When they are integrated, however, the performance of the entire system drops significantly because of the lack of resources, and the users' QoE decreases as well. In this paper, we therefore propose a QoE guarantee scheme that selectively allocates CPU resources between display updates and haptic rendering. To increase QoE, a priority is set between the visual and haptic channels so that more resources are allocated to one or the other. If a user is more sensitive to touch, the proposed scheme increases the haptic rendering rate by allocating more resources to it; if the user is more sensitive to the visual channel, the scheme decreases the haptic rendering rate and increases the tiled-display update rate. By selectively allocating the limited CPU resources, the proposed scheme guarantees the QoE of both the haptic and visual channels.
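
The resource-allocation idea can be sketched as a simple split of a CPU budget driven by a user priority, with the haptic rate capped at 1 kHz and the display rate at 30 fps as stated above. The per-Hz cost constants are hypothetical.

```python
# Sketch of preference-driven rate allocation: a single CPU budget is split between
# the haptic loop and the tiled-display update according to the channel the user is
# more sensitive to. Costs and budget are illustrative numbers, not measured values.
CPU_BUDGET = 1.0            # normalized CPU capacity
COST_PER_HAPTIC_HZ = 4e-4   # CPU fraction per Hz of haptic rendering (assumed)
COST_PER_FRAME_HZ = 1.5e-2  # CPU fraction per Hz of tiled-display update (assumed)

def allocate_rates(haptic_priority):
    """haptic_priority in [0, 1]: 1 = touch-sensitive user, 0 = vision-sensitive."""
    haptic_share = CPU_BUDGET * haptic_priority
    visual_share = CPU_BUDGET - haptic_share
    haptic_rate = min(1000.0, haptic_share / COST_PER_HAPTIC_HZ)   # cap at 1 kHz
    frame_rate = min(30.0, visual_share / COST_PER_FRAME_HZ)       # cap at 30 fps
    return haptic_rate, frame_rate

print(allocate_rates(0.7))   # touch-oriented user: full haptic rate, reduced frame rate
```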

Development of PC-Based 6DOF Force Display System (PC기반의 6자유도 촉각장치의 개발)

  • Shin, Suck-Doo;Kang, Won-Chan;Kim, Dong-Ok;Kim, Won-Bae;Kim, Young-Dong
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.50 no.5
    • /
    • pp.211-217
    • /
    • 2001
  • In this paper, we have developed a 6-DOF force display system based on a single PC. The system is composed of a force display device, a force-reflecting rendering algorithm, and a high-speed controller. Previous systems had to adopt a high-performance workstation or a two-PC configuration to control the graphics quickly and stably. We address this problem by developing a dedicated controller and a new rendering algorithm. The proposed rendering algorithm is based on the proxy algorithm, which converts position, velocity, and haptic information into force data. In particular, using the proxy algorithm we can construct a dynamic virtual environment with elasticity, viscosity, mass, and friction forces. Experimental results show that our system compares favorably with other haptic interfaces: on a single-PC system it can handle a virtual object built from a 30,000-polygon model with a 1 kHz haptic interrupt cycle and a 20 Hz graphic interrupt cycle.
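
As a sketch of the force-computation side of a proxy-based renderer with elasticity, viscosity, and friction, the snippet below turns proxy/device state into a force; the proxy-placement step (constraining the proxy to the surface) is omitted and all gains are illustrative.

```python
# Minimal sketch of converting proxy/device state into a force with elastic, viscous,
# and Coulomb friction terms, in the spirit of a proxy-based renderer.
import numpy as np

K_ELASTIC = 800.0   # N/m (illustrative)
B_VISCOUS = 3.0     # N*s/m (illustrative)
MU_FRICTION = 0.4   # Coulomb friction coefficient (illustrative)

def proxy_force(proxy_pos, device_pos, device_vel, surface_normal):
    penetration = proxy_pos - device_pos
    normal_force = K_ELASTIC * penetration - B_VISCOUS * device_vel  # spring-damper
    fn = np.dot(normal_force, surface_normal)                        # normal component
    tangential_vel = device_vel - np.dot(device_vel, surface_normal) * surface_normal
    speed = np.linalg.norm(tangential_vel)
    friction = np.zeros(3)
    if fn > 0 and speed > 1e-6:
        friction = -MU_FRICTION * fn * tangential_vel / speed        # kinetic friction
    return normal_force + friction
```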

Real-time Simulation Technique for Visual-Haptic Interaction between SPH-based Fluid Media and Soluble Solids (SPH 기반의 유체 및 용해성 강체에 대한 시각-촉각 융합 상호작용 시뮬레이션)

  • Kim, Seokyeol;Park, Jinah
    • Journal of the Korean Society of Visualization
    • /
    • v.15 no.1
    • /
    • pp.32-40
    • /
    • 2017
  • Interaction between a fluid and a rigid object is frequently observed in everyday life. However, it is difficult to simulate their interaction because the medium and the object have different representations. A particularly challenging issue is handling deformation of the object visually while also rendering haptic feedback. In this paper, we propose a real-time simulation technique for multimodal interaction between particle-based fluids and soluble solids. We have developed a dissolution behavior model for solids, discretized following the idea of smoothed particle hydrodynamics, and the changes in physical properties accompanying dissolution are immediately reflected in the object. The user can intervene in the simulation environment at any time by manipulating the solid object, with both visual and haptic feedback delivered on the fly. For immersive visualization, we also adopt a screen-space fluid rendering technique that balances realism and performance.
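
A minimal sketch of the dissolution idea, assuming the solid is discretized into SPH-style particles that lose mass in proportion to the fluid mass inside their smoothing radius. The poly6 kernel is standard in SPH, but the rate constant, time step, and coupling below are illustrative assumptions rather than the paper's model.

```python
# Sketch of one dissolution step for an SPH-discretized solid: each solid particle
# loses mass in proportion to the local fluid density sampled with a poly6 kernel.
import numpy as np

H = 0.04          # smoothing length (m), assumed
RATE = 0.5        # dissolution rate constant (1/s), assumed
DT = 0.005        # time step (s), assumed

def poly6(r2, h=H):
    return (315.0 / (64.0 * np.pi * h**9)) * (h * h - r2) ** 3 if r2 < h * h else 0.0

def dissolve(solid_pos, solid_mass, fluid_pos, fluid_mass):
    """Return updated solid particle masses after one time step."""
    new_mass = solid_mass.copy()
    for i, p in enumerate(solid_pos):
        r2 = np.sum((fluid_pos - p) ** 2, axis=1)
        wet = sum(m * poly6(d2) for m, d2 in zip(fluid_mass, r2))  # local fluid density
        new_mass[i] = max(0.0, new_mass[i] - RATE * wet * new_mass[i] * DT)
    return new_mass
```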

AI-based Automatic Spine CT Image Segmentation and Haptic Rendering for Spinal Needle Insertion Simulator (척추 바늘 삽입술 시뮬레이터 개발을 위한 인공지능 기반 척추 CT 이미지 자동분할 및 햅틱 렌더링)

  • Park, Ikjong;Kim, Keehoon;Choi, Gun;Chung, Wan Kyun
    • The Journal of Korea Robotics Society
    • /
    • v.15 no.4
    • /
    • pp.316-322
    • /
    • 2020
  • Endoscopic spine surgery is an advanced surgical technique that minimizes skin incision, muscle damage, and blood loss compared to open surgery. It requires, however, accurate positioning of the endoscope to avoid spinal nerves and to place it near the target disk, and a guide needle is inserted beforehand to guide the endoscope. The outcome of the surgery also depends heavily on the surgeon's experience and the patient's CT or MRI images. For training purposes, a number of haptic simulators for spinal needle insertion have therefore been developed, but they remain difficult to use in practice because previous studies require manual segmentation of vertebrae from CT images and do not carefully consider the interaction force between the needle and soft tissue. This paper proposes an AI-based automatic vertebra segmentation method for CT images and a haptic rendering method using the proposed needle-tissue interaction model. For segmentation, a U-Net structure was implemented, achieving 93% pixel accuracy and 88% IoU. The needle-tissue interaction model, including puncture and friction forces, was implemented for haptic rendering in the proposed spinal needle insertion simulator.
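
The needle-tissue interaction model is not specified in the abstract beyond puncture and friction forces; the sketch below shows a common 1-D formulation with an elastic pre-puncture phase followed by a depth-proportional friction term plus a constant cutting force. All parameters are illustrative, not the paper's identified values.

```python
# Sketch of a 1-D needle insertion force profile: elastic deformation until a
# puncture threshold, then friction proportional to inserted depth plus cutting force.
K_MEMBRANE = 900.0    # N/m, tissue stiffness before puncture (assumed)
F_PUNCTURE = 2.5      # N, force at which the tissue membrane ruptures (assumed)
MU_PER_DEPTH = 15.0   # N/m, friction force per metre of inserted shaft (assumed)
F_CUTTING = 0.8       # N, constant cutting force after puncture (assumed)

def needle_force(depth, punctured):
    """depth: insertion depth (m); returns (resistive force in N, punctured flag)."""
    if not punctured:
        f = K_MEMBRANE * depth            # elastic deformation of the surface
        if f >= F_PUNCTURE:
            punctured = True              # membrane ruptures, force regime changes
        else:
            return f, punctured
    return F_CUTTING + MU_PER_DEPTH * depth, punctured
```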