• Title/Summary/Keyword: Haptic interfaces


A Study on Technique of Navigation with Power-Reflected of the Walker in the Indoor Environment

  • Kim, Min-Sik;Kwon, Hyouk-Gil;Ryu, Je-Goon;Shim, Hyeon-Min;Lee, Eung-Hyuk;Shim, Jea-Hong;Lee, Sang-Moo
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.957-962
    • /
    • 2005
  • The elderly population of the Republic of Korea is growing steadily, and this problem will become more serious in the near future; engineering support for aged people is therefore required. We are establishing a new field of healthcare engineering for elderly people, aiming to support aged and disabled people through adaptive control and instrumentation technology. The goal of this paper is to implement shared control of a robotic mobility aid for the elderly. For this type of assistive technology to be useful to its intended user community, it must help elderly and handicapped people live independently in their own homes. The interface transforms the force applied by the user into the robot's motion. Devices such as buttons, joysticks, and levers already exist for relaying user input; however, they require hand displacement that would loosen or otherwise release the user's hold, which makes operation difficult and potentially unsafe. We therefore propose a shared control system, which is safer than joysticks and buttons. Shared control registers the user's intention through physical interaction and is an important component in the development of robotic elderly assistants. The concept of shared control describes a system composed of two or more independent control systems. Our implementation consists of three components: a pressure sensor (flexible force sensor), a measurement circuit, and a transfer function. Experimental trials were conducted in an indoor environment; the direction intended by the user, conveyed through the haptic device, was logged along with the robot's force sensor readings.

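The force-to-motion mapping this abstract describes can be sketched as follows. This is a minimal illustration only: the gains, dead zone, and smoothing factor are assumptions, not values from the paper.

```python
# Hypothetical sketch of a shared-control walker interface: forces from
# flexible force sensors on the two handles are mapped to linear and
# angular velocity commands. Gains and thresholds are invented here.

def shared_control_step(f_left, f_right, v_prev, w_prev,
                        k_v=0.05, k_w=0.10, alpha=0.8, dead_zone=1.0):
    """Map left/right handle forces (N) to (linear, angular) velocity.

    A dead zone rejects sensor noise; exponential smoothing keeps the
    walker's response gentle for an elderly user.
    """
    push = f_left + f_right          # combined force -> forward intent
    turn = f_right - f_left          # differential force -> turning intent

    if abs(push) < dead_zone:
        push = 0.0
    if abs(turn) < dead_zone:
        turn = 0.0

    v = alpha * v_prev + (1 - alpha) * k_v * push   # m/s
    w = alpha * w_prev + (1 - alpha) * k_w * turn   # rad/s
    return v, w
```

Because the user's hands never leave the handles, intent is registered through the grip itself, unlike a joystick or button.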

Virtual Object Weight Information with Multi-modal Sensory Feedback during Remote Manipulation (다중 감각 피드백을 통한 원격 가상객체 조작 시 무게 정보 전달)

  • Changhyeon Park;Jaeyoung Park
    • Journal of Internet Computing and Services
    • /
    • v.25 no.1
    • /
    • pp.9-15
    • /
    • 2024
  • As virtual reality technology became popular, a high demand emerged for natural and efficient interaction with the virtual environment. Mid-air manipulation is one of the solutions to such needs, letting a user manipulate a virtual object in a 3D virtual space. In this paper, we focus on manipulating a remote virtual object while visually displaying the object and providing tactile information on the object's weight. We developed two types of wearable interfaces that can provide cutaneous or vibrotactile feedback on the virtual object weight to the user's fingertips. Human perception of the remote virtual object weight during manipulation was evaluated by conducting a psychophysics experiment. The results indicate a significant effect of haptic feedback on the perceived weight of the virtual object during manipulation.
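One way the object's weight could drive the fingertip feedback described above is a simple normalized mapping; the linear scale and its bounds below are assumptions for illustration, not the paper's method.

```python
# Hedged sketch: map a virtual object's weight to a normalized
# vibrotactile amplitude in [0, 1]. Bounds w_min/w_max are invented.

def weight_to_vibration(weight_g, w_min=50.0, w_max=500.0):
    """Map an object weight in grams to a vibration amplitude in [0, 1]."""
    clipped = max(w_min, min(w_max, weight_g))
    return (clipped - w_min) / (w_max - w_min)
```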

Usability Test on Haptic Interaction With Real Object in Virtual Reality (실제 사물을 이용한 VR 햅틱 인터랙션 사용성 테스트)

  • Yang, Han Ul;Park, Jun
    • Journal of the Korean Society for Computer Game
    • /
    • v.31 no.4
    • /
    • pp.197-203
    • /
    • 2018
  • As interest in virtual reality has grown, peripheral devices have also made significant progress. Much research spans VR environments built through room-level scanning and the various interface devices that can interact with objects in those environments. In current home VR research, multiple haptic interfaces are used to interact with objects in the VR environment; room scanning partly overcomes spatial constraints, and tracking equipment can be used to interact with real objects. Meanwhile, advances in 3D printing have made commercial and home 3D printers widely available, allowing users to easily fabricate models of their choice at home. Considering these two factors, we believe it is necessary to study how people perceive an easily fabricated physical model when interacting with it directly in a VR environment. In this paper, we therefore implement 3D-printed objects in VR space and, through user testing, study the differences between using real objects and conventional interaction equipment.

Authoring Tool for Augmented Reality based Product Design (증강현실 기반 제품 디자인을 위한 저작도구)

  • Ha, Tae-Jin;Billinghurst, Mark;Woo, Woon-Tack
    • Journal of the Korea Computer Graphics Society
    • /
    • v.13 no.2
    • /
    • pp.23-29
    • /
    • 2007
  • We propose an authoring tool for prototyping in an augmented reality based product design environment, intended for authors without an engineering background. The tool can adjust the properties of visual, sound, and haptic feedback at the same time for more practical prototyping. The proposed modular architecture also adapts flexibly to changes in platform or hardware, and user interfaces can be updated dynamically by changing only description files. Finally, the suggested authoring methods exploit the advantages of both graphical and tangible user interfaces: authors can intuitively adjust many parameters using the TUI, and then refine them precisely using the GVI. The proposed authoring methods can be used for exhibition and entertainment content with multi-sensory feedback in AR environments. As future work, qualitative and quantitative usability tests will be conducted.

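The description-file idea mentioned above, feedback behavior defined in data rather than code, might look something like this. The schema, property names, and file content are entirely invented for this sketch.

```python
# Hypothetical description file for multi-sensory AR feedback: an author
# edits the data, not the program. All field names here are assumptions.

import json

description = json.loads("""
{
  "object": "prototype_phone",
  "visual": { "scale": 1.0, "highlight_color": "#ffcc00" },
  "sound":  { "clip": "click.wav", "volume": 0.7 },
  "haptic": { "stiffness": 0.5, "vibration_hz": 150 }
}
""")

def apply_feedback(desc):
    """Return the multi-sensory parameters an AR renderer would consume."""
    return (desc["visual"]["scale"],
            desc["sound"]["volume"],
            desc["haptic"]["stiffness"])
```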

Human body learning system using multimodal and user-centric interfaces (멀티모달 사용자 중심 인터페이스를 적용한 인체 학습 시스템)

  • Kim, Ki-Min;Kim, Jae-Il;Park, Jin-Ah
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2008.02a
    • /
    • pp.85-90
    • /
    • 2008
  • This paper describes a human body learning system using a multi-modal user interface. Through our system, students can study human anatomy interactively. Existing learning methods rely on one-way materials such as images, text, and movies. We propose a new learning system that includes 3D organ surface models, a haptic interface, and a hierarchical data structure of human organs to provide enhanced learning that utilizes sensorimotor skills.

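The hierarchical organ structure mentioned in this abstract can be sketched as a simple tree that lets a learner drill down from body systems to individual organs. The class design and the example nodes below are illustrative assumptions, not the paper's data model.

```python
# Hedged sketch: organs as a tree, each node optionally carrying a 3D
# surface model that the haptic renderer would load. Names are invented.

class OrganNode:
    def __init__(self, name, mesh_file=None):
        self.name = name
        self.mesh_file = mesh_file   # path to the organ's 3D surface model
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def find(self, name):
        """Depth-first lookup of an organ by name."""
        if self.name == name:
            return self
        for c in self.children:
            hit = c.find(name)
            if hit:
                return hit
        return None

body = OrganNode("human body")
circ = body.add(OrganNode("circulatory system"))
circ.add(OrganNode("heart", mesh_file="heart.obj"))
```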

Virtual Science Lab - Sensible Human Body Learning System (가상 과학 실험실 - 체감형 인체 구조 학습 시스템)

  • Kim, Ki-Min;Kim, Jae-Il;Kim, Seok-Yeol;Park, Jin-Ah
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2009.02a
    • /
    • pp.2078-2079
    • /
    • 2009
  • This research suggests a framework for a human body learning system using various forms of bidirectional interfaces. Existing systems mostly use limited, unidirectional methods focused only on visual information. Our system provides more realistic visual information using 3D organ models derived from a real human body, and combines haptic and augmented reality techniques for a wider range of interaction. Through this research, we aim to overcome the limitations of existing science education systems and explore an effective scheme for fusing the real and virtual educational environments into one.


Vibration Stimulus Generation using Sound Detection Algorithm for Improved Sound Experience (사운드 실감성 증진을 위한 사운드 감지 알고리즘 기반 촉각진동자극 생성)

  • Ji, Dong-Ju;Oh, Sung-Jin;Jun, Kyung-Koo;Sung, Mee-Young
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2009.02a
    • /
    • pp.158-162
    • /
    • 2009
  • Sound effects accompanied by appropriate tactile stimuli can strengthen their realism; for example, gunfire in games and movies is more impressive when accompanied by vibration effects. On the same principle, adding vibration information to existing sound data and generating vibration effects through haptic interfaces during playback can augment the sound experience. In this paper, we propose a method to generate vibration information by analyzing the sound. The vibration information consists of vibration patterns and their timing within a sound file. Adding this information manually is labor-intensive, so we propose a sound detection algorithm that searches for the moments when specific sounds occur in a sound file, together with a method to create vibration effects at those moments. The algorithm compares the frequency characteristics of specific sounds and finds moments with similar characteristics within a sound file; its detection ratio was 98% for five different kinds of gunfire. We also developed a GUI-based vibration pattern editor to easily perform sound search and vibration generation.

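The frequency-comparison idea in this abstract can be sketched as a sliding-window spectral match: compare the magnitude spectrum of a reference sound (e.g. a gunshot) against windows of the target audio, and mark similar windows as vibration trigger points. The window size, hop, and cosine-similarity threshold are assumptions; the paper's exact algorithm may differ.

```python
# Hedged sketch of spectral event detection for vibration timing.
# Windows whose normalized magnitude spectrum is close (cosine
# similarity) to the reference spectrum become trigger points.

import numpy as np

def detect_events(audio, reference, hop=512, threshold=0.9):
    """Return sample indices where the windowed spectrum matches `reference`."""
    win = len(reference)
    ref_spec = np.abs(np.fft.rfft(reference))
    ref_spec /= (np.linalg.norm(ref_spec) + 1e-12)

    hits = []
    for start in range(0, len(audio) - win + 1, hop):
        frame = audio[start:start + win]
        spec = np.abs(np.fft.rfft(frame))
        spec /= (np.linalg.norm(spec) + 1e-12)
        if float(np.dot(ref_spec, spec)) >= threshold:
            hits.append(start)   # timing for a vibration pattern
    return hits
```

Each returned index marks where a vibration pattern would be scheduled alongside playback.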