• Title/Summary/Keyword: finger tracking

A Convergence Research for Development of VR Education Contents for Core Fundamental Nursing Skills (핵심기본간호술 VR 교육 콘텐츠 개발을 위한 융복합 연구)

  • Kim, Jungki;Yu, Hye-Yon;Lee, Young-Soo
    • The Journal of the Korea Contents Association / v.21 no.9 / pp.714-722 / 2021
  • This study proposes virtual reality (VR) education content for core fundamental nursing skills, with the aim of diversifying teaching methods in nursing education. Among the 20 core fundamental nursing skills, blood sugar testing and subcutaneous insulin injection are frequently performed and can also be used in self-management education for diabetic patients. The study designed immersive VR content for these skills by dividing the learner's experience into three stages: guide, mission, and feedback. To deepen immersion during training, the content tracks hand movement through finger-joint recognition rather than a controller. This work is intended to inform the development of VR nursing education content that can improve clinical practice competency and the effectiveness of nursing education.
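
The controller-free interaction described above reads gestures directly from tracked finger joints. Below is a minimal sketch of one such gesture, a thumb-index pinch standing in for a controller trigger; the MediaPipe-style joint indexing and the `get_hand_joints()` provider are assumptions for illustration, not details from the paper.

```python
import numpy as np

# Hypothetical provider: a hand-tracking SDK that returns 21 joint
# positions in meters, indexed like MediaPipe Hands (0 = wrist,
# 4 = thumb tip, 8 = index fingertip).
def get_hand_joints() -> np.ndarray:
    """Stub: replace with the runtime's hand-tracking query."""
    return np.zeros((21, 3))

PINCH_THRESHOLD_M = 0.02  # 2 cm between thumb and index tips (illustrative)

def is_pinching(joints: np.ndarray) -> bool:
    """Treat a close thumb-index distance as a 'select' gesture."""
    thumb_tip, index_tip = joints[4], joints[8]
    return np.linalg.norm(thumb_tip - index_tip) < PINCH_THRESHOLD_M

joints = get_hand_joints()
if is_pinching(joints):
    print("select")  # e.g., confirm the current training step
```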

A Study of VR Interaction for Non-contact Hair Styling (비대면 헤어 스타일링 재현을 위한 VR 인터렉션 연구)

  • Park, Sungjun;Yoo, Sangwook;Chin, Seongah
    • The Journal of the Convergence on Culture Technology / v.8 no.2 / pp.367-372 / 2022
  • With the recent advent of the New Normal era, immersive and non-contact technologies are attracting social attention. Research in hair styling, however, has concentrated on hair simulation itself: the direction of the hair, individual strand movement, and modeling. To create an improved practice environment that meets this demand, this study proposes a non-contact hair-styling VR system. The theoretical review surveys existing haircut research, which has mainly focused on force-based feedback; interactive haircutting in a virtual environment, as addressed in this paper, has not yet been studied. VR controllers capable of finger tracking enable selection, cutting, and rotation of styling tools, and a non-contact collaboration environment was built around them. Two experiments on interactive hair cutting in VR were conducted. The first is a haircut operation synchronized through finger tracking and a holding-hook animation, with position correction applied for accurate motion. The second is a real-time interactive cutting operation in a multi-user virtual collaboration environment, which lets instructors and learners communicate through the VR HMD's built-in microphone and Photon Voice in non-contact situations.
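
The paper synchronizes a holding-hook animation with finger tracking so that a tracked grip holds the styling tool. As a rough sketch of that idea under assumed MediaPipe-style joint indices, the fragment below treats curled fingertips as a grip and attaches a tool while the grip is held; the `tool` object and the animation name are hypothetical.

```python
import numpy as np

# Hypothetical joint indices (MediaPipe-style): 0 = wrist,
# fingertips at 8, 12, 16, 20 (index..pinky).
FINGERTIPS = (8, 12, 16, 20)
GRIP_THRESHOLD_M = 0.06  # fingertips within ~6 cm of the wrist (illustrative)

def is_gripping(joints: np.ndarray) -> bool:
    """A closed fist (fingertips curled toward the wrist) holds the tool."""
    wrist = joints[0]
    dists = [np.linalg.norm(joints[i] - wrist) for i in FINGERTIPS]
    return float(np.mean(dists)) < GRIP_THRESHOLD_M

def update_tool(joints: np.ndarray, tool) -> None:
    """Attach the scissors to the hand while the grip is held."""
    if is_gripping(joints):
        tool.position = joints[0]         # follow the wrist
        tool.play_animation("hold_hook")  # hypothetical animation name
    else:
        tool.release()
```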

RealBook: A Tangible Electronic Book Based on the Interface of TouchFace-V (RealBook: TouchFace-V 인터페이스 기반 실감형 전자책)

  • Song, Dae-Hyeon;Bae, Ki-Tae;Lee, Chil-Woo
    • The Journal of the Korea Contents Association / v.13 no.12 / pp.551-559 / 2013
  • In this paper, we propose RealBook, a tangible electronic book based on the TouchFace-V interface, which recognizes multi-touch and hand gestures. TouchFace-V applies projection technology to a flat surface, such as a table, without spatial constraints. The system's configuration addresses the installation, calibration, and portability issues of most existing front-projected vision-based tabletop displays. It supports hand touch and gestures through computer-vision tracking, without sensors or traditional input devices. RealBook combines the analog sensibility of printed text with the multimedia effects of an e-book, and it presents digitally created stories whose experiences and environments differ according to the choices users make on the book's interface. We propose RealBook as a new concept of electronic book that, together with the TouchFace-V interface, provides more direct viewing and natural, intuitive interaction through hand touch and gesture.
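
TouchFace-V senses touch with a camera rather than dedicated sensors. A common vision-based approach, sketched below with illustrative thresholds, subtracts a reference image of the empty surface and takes an extreme point of the hand contour as the fingertip candidate; this is a generic stand-in, not the paper's exact pipeline.

```python
import cv2
import numpy as np

def find_fingertip(frame: np.ndarray, background: np.ndarray):
    """Locate a fingertip candidate on a projected tabletop by
    differencing against an empty-surface reference image."""
    diff = cv2.absdiff(frame, background)
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 1000:  # ignore noise blobs
        return None
    # Topmost contour point ~ fingertip when the hand enters from below.
    tip = min(hand[:, 0, :], key=lambda p: p[1])
    return int(tip[0]), int(tip[1])
```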

Motion Plane Estimation for Real-Time Hand Motion Recognition (실시간 손동작 인식을 위한 동작 평면 추정)

  • Jeong, Seung-Dae;Jang, Kyung-Ho;Jung, Soon-Ki
    • The KIPS Transactions: Part B / v.16B no.5 / pp.347-358 / 2009
  • In this thesis, we develop a vision-based hand motion recognition system using a camera with two rotational motors. Existing systems were implemented with a range camera or multiple cameras and have a limited working area. In contrast, we use an uncalibrated camera and obtain a wider working area through pan-tilt motion. Given the image sequence from the pan-tilt camera, color and pattern information are integrated into a tracking system to find the 2D position and direction of the hand. From this pose information, we estimate the 3D motion plane on which the gesture trajectory approximately lies. The 3D trajectory of the moving fingertip is projected onto the motion plane, enhancing the resolving power for linear gesture patterns. We evaluated the proposed approach in terms of trace-angle accuracy and the size of the working volume.
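
A standard way to estimate such a motion plane is a total-least-squares fit: take the plane normal as the singular vector with the smallest singular value, then project the trajectory into a 2D basis within the plane. The sketch below shows that textbook approach with NumPy; the paper's exact estimator may differ.

```python
import numpy as np

def fit_motion_plane(points: np.ndarray):
    """Least-squares plane through 3D trajectory points (Nx3).
    Returns (centroid, unit normal); the normal is the right singular
    vector with the smallest singular value of the centered points."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def project_to_plane(points: np.ndarray, centroid, normal):
    """Flatten the 3D fingertip trajectory to 2D coordinates in the
    motion plane, so linear gesture patterns separate better."""
    d = (points - centroid) @ normal
    on_plane = points - np.outer(d, normal)
    # Build an orthonormal 2D basis (u, v) spanning the plane.
    u = np.cross(normal, [0.0, 0.0, 1.0])
    if np.linalg.norm(u) < 1e-8:          # normal parallel to z-axis
        u = np.cross(normal, [0.0, 1.0, 0.0])
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    rel = on_plane - centroid
    return np.stack([rel @ u, rel @ v], axis=1)
```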

Big data, how to balance privacy and social values (빅데이터, 프라이버시와 사회적 가치의 조화방안)

  • Hwang, Joo-Seong
    • Journal of Digital Convergence / v.11 no.11 / pp.143-153 / 2013
  • Big data is expected to bring enormous public good as well as economic opportunity. However, there is ongoing concern about privacy, from public authorities as well as private enterprises. Big data is suspected of aggravating the existing privacy battleground by introducing new types of privacy risk, such as the risk of behavioral-pattern profiling. On the other hand, big data is asserted to be a new way to bypass traditional behavioral tracking (cookies, DPI, fingerprinting, etc.), since it is not based on a targeted person. This paper examines whether big data can capture consumers' behavioral patterns without threatening or damaging their privacy, and discerns the difference between traditional behavioral tracking and big data analysis from the perspective of privacy.

NATURAL INTERACTION WITH VIRTUAL PET ON YOUR PALM

  • Choi, Jun-Yeong;Han, Jae-Hyek;Seo, Byung-Kuk;Park, Han-Hoon;Park, Jong-Il
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.341-345 / 2009
  • We present an augmented reality (AR) application for cell phones in which users place a virtual pet on their palm and play with it by moving their hands and fingers naturally. The application is fundamentally based on hand/palm pose recognition and finger motion estimation, which are the main concerns of this paper. We propose a fast and efficient hand/palm pose recognition method that uses natural features extracted from a hand image (e.g., direction, width, and contour shape of the hand region) together with prior knowledge of hand shape and geometry (e.g., the approximate shape of an open palm and the length ratio between palm width and palm height). We also propose a natural interaction method that recognizes finger motions, such as opening and closing the palm, based on fingertip tracking. Using the proposed methods, we developed and tested the AR application on an ultra-mobile PC (UMPC).
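
The open/closed-palm recognition described above can be approximated from the hand contour alone. One common stand-in, sketched below with illustrative thresholds, counts convexity defects (the valleys between extended fingers) with OpenCV; the paper's own feature set (direction, width, contour shape, plus geometric priors) is richer than this.

```python
import cv2
import numpy as np

def count_extended_fingers(hand_contour: np.ndarray) -> int:
    """Estimate extended fingers from convexity defects: deep valleys
    between fingers imply the palm is open. Thresholds illustrative."""
    hull = cv2.convexHull(hand_contour, returnPoints=False)
    defects = cv2.convexityDefects(hand_contour, hull)
    if defects is None:
        return 0
    valleys = sum(1 for i in range(defects.shape[0])
                  if defects[i, 0, 3] / 256.0 > 20)  # defect depth in px
    return valleys + 1 if valleys else 0

def palm_state(hand_contour: np.ndarray) -> str:
    """Map the finger count to the open/closed gesture the pet reacts to."""
    return "open" if count_extended_fingers(hand_contour) >= 4 else "closed"
```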

Gesture based Natural User Interface for e-Training

  • Lim, C.J.;Lee, Nam-Hee;Jeong, Yun-Guen;Heo, Seung-Il
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.577-583 / 2012
  • Objective: This paper describes the process and results of developing a gesture-recognition-based natural user interface (NUI) for a vehicle maintenance e-Training system. Background: E-Training refers to education and training that builds and improves the capabilities needed to perform tasks by using information and communication technologies (simulation, 3D virtual reality, and augmented reality), devices (PC, tablet, smartphone, and HMD), and environments (wired/wireless internet and cloud computing). Method: Palm movement captured by a depth camera is used as a pointing device, and finger movement, extracted with the OpenCV library, serves as the selection protocol. Results: The proposed NUI allows trainees to control objects such as cars and engines on a large screen through gesture recognition. It also includes a learning environment for understanding the procedures for assembling and disassembling certain parts. Conclusion: Future work concerns gesture recognition for multiple trainees at once. Application: The results of this interface can be applied not only to e-Training systems but also to other systems such as digital signage, tangible games, and 3D content control.
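
Using palm movement as a pointing device amounts to mapping a tracked palm position to screen coordinates, usually with smoothing so the cursor does not jitter on a large display. The sketch below shows that mapping with exponential smoothing; the normalized input range and the smoothing factor are assumptions, not values from the paper.

```python
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080  # illustrative display size

class PalmPointer:
    """Map a depth-camera palm position to screen coordinates with
    exponential smoothing to suppress cursor jitter."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha  # higher = more responsive, more jitter
        self.cursor = np.array([SCREEN_W / 2.0, SCREEN_H / 2.0])

    def update(self, palm_xy: np.ndarray) -> tuple:
        # palm_xy: normalized [0, 1] x/y from the depth camera (assumed).
        target = palm_xy * np.array([SCREEN_W, SCREEN_H])
        self.cursor = (1 - self.alpha) * self.cursor + self.alpha * target
        return int(self.cursor[0]), int(self.cursor[1])

pointer = PalmPointer()
x, y = pointer.update(np.array([0.5, 0.4]))  # palm near screen center
```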