• Title/Summary/Keyword: Hand user interface

CNN-Based Hand Gesture Recognition for Wearable Applications (웨어러블 응용을 위한 CNN 기반 손 제스처 인식)

  • Moon, Hyeon-Chul; Yang, Anna; Kim, Jae-Gon
    • Journal of Broadcast Engineering, v.23 no.2, pp.246-252, 2018
  • Hand gestures are attracting attention as a NUI (Natural User Interface) for wearable devices such as smart glasses. Recently, to support efficient media consumption in IoT (Internet of Things) and wearable environments, the standardization of IoMT (Internet of Media Things) has been in progress in MPEG. IoMT assumes that hand gesture detection and recognition are performed on separate devices, and thus provides an interoperable interface between these modules. Meanwhile, deep-learning-based hand gesture recognition techniques have recently been actively studied to improve recognition performance. In this paper, we propose a CNN (Convolutional Neural Network)-based hand gesture recognition method for various applications, such as media consumption on wearable devices, which is one of the use cases of IoMT. The proposed method detects the hand contour from stereo images acquired by smart glasses using depth and color information, constructs datasets to train the CNN, and then recognizes gestures from input hand-contour images. Experimental results show that the proposed method achieves an average hand gesture recognition rate of 95%.
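
As a rough illustration of the final stage of such a pipeline, the sketch below classifies a binary hand-contour image with a small CNN in PyTorch. The architecture, the 64x64 input size, and the five gesture classes are illustrative assumptions, not the network described in the paper.

```python
# A minimal sketch, assuming 64x64 single-channel contour images and
# five gesture classes; NOT the exact network from the paper.
import torch
import torch.nn as nn

class GestureCNN(nn.Module):
    def __init__(self, num_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

model = GestureCNN()
logits = model(torch.randn(1, 1, 64, 64))  # one dummy contour image
print(logits.argmax(dim=1))                # predicted gesture class
```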

A Study on the Need of Usable Security in the Correlation between IT Security and User Experience

  • Lee, Soowook
    • International Journal of Internet, Broadcasting and Communication, v.9 no.4, pp.14-18, 2017
  • In this paper, we contemplate the direction of usable security at the intersection of IT security and user experience. To evaluate how convenient a user interface is to use, we examine components such as learnability, memorability, error rates, and satisfaction level. Attending to security should bring positive effects on the user experience. By emphasizing usability and security at the same time, we should increase the satisfaction level of the user experience and thereby produce valuable experiences through participation, use, and observation. A positive user experience is an important goal in software engineering, business administration, and other fields, and it results in user satisfaction, brand trust, and success in the market. With a negative user experience, on the other hand, users cannot achieve their desired goals and are left unsatisfied due to emotional, rational, and economic inconvenience. For these reasons, we should try to maintain a certain level of usability and security of the system in both IT security and user experience.

Design of a 6-DOF Parallel Haptic Hand Controller Consisting of 5-Bar Linkages and Gimbal Mechanisms (5절링크와 짐벌기구로 구성된 병렬형 6자유도 햅틱 핸드컨트롤러의 설계)

  • Ryu, Dong-Seok; Sohn, Won-Sun; Song, Jae-Bok
    • Transactions of the Korean Society of Mechanical Engineers A, v.27 no.1, pp.18-25, 2003
  • A haptic hand controller (HHC) operated by the user's hand can receive information on the position and orientation of the hand and display the force and moment generated in the virtual environment to the hand. In this paper, a 3-DOF hand controller is first presented in which all the actuators are mounted on the fixed base by combining a 5-bar linkage and a gimbal mechanism. The 6-DOF HHC is then designed by connecting two of these 3-DOF devices through a handle consisting of a screw and nut. Analysis using a performance index is carried out to determine the dimensions of the device. The HHC control system consists of a high-level controller for kinematic and static analysis and a low-level controller for position sensing and motor control. As a simple application, the HHC is used as a user interface to control a mobile robot in a virtual environment.
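
For the planar part of such a device, the end-effector position of a 5-bar linkage follows from intersecting the two circles around the actuated "elbow" joints. The sketch below shows this forward kinematics under assumed link lengths; the gimbal mechanisms that provide the rotational DOFs in the paper's device are not modeled here.

```python
# A minimal sketch of planar 5-bar forward kinematics under assumed
# link lengths; the paper's rotational gimbal DOFs are not modeled.
import math

def five_bar_fk(theta1, theta2, a=0.10, b=0.15, d=0.08):
    """End-effector (x, y) for base joint angles theta1, theta2 [rad].

    a: proximal link length, b: distal link length, d: base spacing.
    """
    # Elbow joints driven directly by the two base actuators.
    e1 = (a * math.cos(theta1), a * math.sin(theta1))
    e2 = (d + a * math.cos(theta2), a * math.sin(theta2))
    # The end-effector lies on both circles of radius b around the elbows.
    ex, ey = e2[0] - e1[0], e2[1] - e1[1]
    q = math.hypot(ex, ey)                    # elbow-to-elbow distance
    if q == 0.0 or q > 2 * b:
        raise ValueError("pose unreachable for these link lengths")
    h = math.sqrt(b * b - (q / 2) ** 2)       # half-chord height
    mx, my = e1[0] + ex / 2, e1[1] + ey / 2   # midpoint between elbows
    # Pick the "elbow-up" intersection of the two circles.
    return (mx - h * ey / q, my + h * ex / q)

print(five_bar_fk(math.radians(100), math.radians(80)))
```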

User-centric Immersible and Interactive Electronic Book based on the Interface of Tabletop Display (테이블탑 디스플레이 기반 사용자 중심의 실감형 상호작용 전자책)

  • Song, Dae-Hyeon; Park, Jae-Wan; Lee, Chil-Woo
    • The Journal of the Korea Contents Association, v.9 no.6, pp.117-125, 2009
  • In this paper, we propose a user-centric, immersible, and interactive electronic book based on a tabletop display interface. An electronic book is typically used by readers who want a text book enriched with multimedia content such as video, audio, and animation. Because the proposed book is based on a tabletop display platform, conventional input devices such as a keyboard and mouse are not needed. Users interact with the content through finger-touch gestures defined for the tabletop display interface, which gives them a superior and effective way to use the electronic book with interest. The interface supports multiple users, enabling more diverse effects than conventional electronic content made for a single user. Our method offers a new approach to the conventional electronic book: it defines user-centric gestures and helps users interact with the book easily. We expect our method can be utilized for many edutainment contents.
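
As a hedged illustration of gesture-driven page control, the sketch below classifies a finger stroke as a page-turn command; the gesture names, thresholds, and per-touch bookkeeping are our assumptions, not the paper's gesture set.

```python
# A minimal sketch, assuming a simple swipe-to-turn-page mapping; the
# gesture names and thresholds are illustrative, not the paper's set.
def classify_swipe(start, end, min_dist=80):
    """Classify a finger stroke as 'page_forward', 'page_back', or None."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    if abs(dx) < min_dist or abs(dx) < abs(dy):
        return None                 # too short or mostly vertical
    return "page_back" if dx > 0 else "page_forward"

# Each concurrent touch id carries its own stroke, so the same dispatch
# serves several users on the tabletop at once.
strokes = {1: ((500, 300), (320, 310)), 2: ((100, 400), (260, 390))}
for touch_id, (start, end) in strokes.items():
    print(touch_id, classify_swipe(start, end))
```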

Use of a gesture user interface as a touchless image navigation system in dental surgery: Case series report

  • Rosa, Guillermo M.; Elizondo, Maria L.
    • Imaging Science in Dentistry, v.44 no.2, pp.155-160, 2014
  • Purpose: The purposes of this study were to develop a workstation computer that allowed intraoperative touchless control of diagnostic and surgical images by the surgeon, and to report preliminary experience with the system in a series of dental surgery cases. Materials and Methods: A custom workstation with a new motion-sensing input device (Leap Motion) was set up to use a natural user interface (NUI) to manipulate the imaging software by hand gestures. The system allowed intraoperative touchless control of the surgical images. Results: For the first time in the literature, an NUI system was used in a pilot study during 11 dental surgery procedures, including tooth extractions, dental implant placements, and guided bone regeneration. No complications were reported, and the system performed very well and proved very useful. Conclusion: The proposed system fulfilled the objective of providing touchless access to and control of the system of images and a three-dimensional surgical plan, thus allowing the maintenance of sterile conditions. The interaction between surgical staff under sterile conditions and computer equipment has been a key issue, and an NUI with touchless control of the images seems close to ideal. The cost of the sensor system is quite low, which could facilitate its incorporation into routine dental surgery practice. This technology has enormous potential in dental surgery and other healthcare specialties.
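
A sensor-agnostic sketch of such a touchless navigation loop is given below: palm displacement between frames is mapped to panning, and a fast sweep advances to the next image. The read_palms() stub is hypothetical; in practice it would wrap the Leap Motion SDK or any other hand tracker.

```python
# A sensor-agnostic sketch: read_palms() is a hypothetical stub that
# would wrap the Leap Motion SDK (or any hand tracker) and yield palm
# (x, y) positions in millimetres, frame by frame.
def read_palms():
    yield from [(0, 0), (12, 1), (25, 3), (80, 5)]  # canned test data

def navigate(palms, sweep_mm=50):
    """Map per-frame palm displacement to image navigation commands."""
    prev = None
    for x, y in palms:
        if prev is not None:
            dx, dy = x - prev[0], y - prev[1]
            if dx > sweep_mm:
                print("next image")             # fast right sweep
            else:
                print(f"pan by ({dx}, {dy})")   # small motion pans
        prev = (x, y)

navigate(read_palms())
```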

Development of Motion Recognition Platform Using Smart-Phone Tracking and Color Communication (스마트 폰 추적 및 색상 통신을 이용한 동작인식 플랫폼 개발)

  • Oh, Byung-Hun
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.17 no.5, pp.143-150, 2017
  • In this paper, we propose a novel motion recognition platform using smart-phone tracking and color communication. The interface requires only a camera and a personal smart-phone, rather than expensive equipment, to provide a motion control interface. The platform recognizes the user's gestures by tracking the 3D distance and rotation angle of the smart-phone, which essentially acts as a motion controller in the user's hand. A color-coded communication method using RGB color combinations is also included in the interface. Users can conveniently send or receive text data through this function, and the data can be transferred continuously even while the user is performing gestures. We present the implementation of viable contents based on the proposed motion recognition platform.
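
The color-communication idea can be illustrated by packing text bytes three at a time into RGB triples that a screen flashes and a camera reads back. The sketch below shows only this encoding; the framing, timing, and error handling of the actual platform are not described in the abstract and are not modeled.

```python
# A minimal sketch of the color-coding idea: pack text bytes three at a
# time into RGB triples. Framing, timing, and error handling of the
# actual platform are assumptions left out here.
def encode(text: str) -> list:
    data = text.encode("utf-8")
    data += b"\x00" * (-len(data) % 3)       # pad to a multiple of 3
    return [tuple(data[i:i + 3]) for i in range(0, len(data), 3)]

def decode(colors: list) -> str:
    raw = bytes(c for rgb in colors for c in rgb)
    return raw.rstrip(b"\x00").decode("utf-8")

colors = encode("hand UI")
print(colors)          # [(104, 97, 110), (100, 32, 85), (73, 0, 0)]
print(decode(colors))  # -> 'hand UI'
```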

Fast Tap-N-Drag (FTND): Enhancing Panning for Web Browsing on Small Screen Devices Considering Panning Ratio and Direction (작은 화면에서의 인터넷 작업을 위한 효율적인 화면이동방법 제안 및 사용성 평가)

  • Choi, Eun-Jung; Kwon, Sung-Hyuk; Chung, Min-K.
    • IE interfaces, v.22 no.4, pp.347-358, 2009
  • Panning tasks caused by both the small screens and the low resolution of handheld devices are known to decrease the usability of mobile internet services. To solve this problem, we propose FTND, an improved version of Tap-N-Drag, which is widely used in various mobile web browsers. Thirty participants performed panning tasks with FTND under combinations of two panning directions (the Push Background user interface and the Push Viewpoint user interface) and five panning ratios: 100% (the panning ratio of Tap-N-Drag), 300%, 500%, 700%, and 900%. The usability of FTND was assessed by objective performance and subjective preference. Objective performance was measured by task completion time, the number of clicks, and the number of pixels; subjective preference was measured by satisfaction, accuracy, and ease of use. The Push Viewpoint user interface at panning ratios of 300%, 500%, and 700% proved to be the most efficient way to perform panning tasks on small handheld devices when using the right thumb.
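
A minimal model of the two panning modes is sketched below: the drag vector is scaled by the panning ratio and applied with opposite signs for Push Background and Push Viewpoint. The variable names and the sign convention are our assumptions; the paper defines the modes only at the interaction level.

```python
# A minimal sketch, assuming the two modes differ only in the sign of
# the scaled drag vector; names and convention are ours, not the paper's.
def pan(viewport_xy, drag_xy, ratio=3.0, mode="push_viewpoint"):
    """Return the new viewport origin after one drag gesture."""
    sign = 1 if mode == "push_viewpoint" else -1  # push_background flips
    return (viewport_xy[0] + sign * ratio * drag_xy[0],
            viewport_xy[1] + sign * ratio * drag_xy[1])

# A 20-pixel drag at a 300% panning ratio moves the viewport 60 pixels.
print(pan((0, 0), (20, 0), mode="push_viewpoint"))   # (60.0, 0.0)
print(pan((0, 0), (20, 0), mode="push_background"))  # (-60.0, 0.0)
```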

Handwriting and Voice Input using Transparent Input Overlay (투명한 입력오버레이를 이용한 필기 및 음성 입력)

  • Kim, Dae-Hyun; Kim, Myoung-Jun; Lee, Zin-O
    • Journal of KIISE: Software and Applications, v.35 no.4, pp.245-254, 2008
  • This paper proposes a unified multi-modal input framework to interface recognition engines such as IBM ViaVoice and the Microsoft handwriting-recognition system with general window applications, particularly for pen-input displays. As soon as the user pushes a hardware button attached to the pen-input display with one hand, the current window of focus, such as an internet search window or a word processor, is overlaid with a transparent window covering the whole desktop, on which the user inputs handwriting with the other hand without losing focus on the working context. In addition to freeform handwriting on this transparent input overlay, used as a sketch pad, the user can dictate words and draw diagrams to communicate with the system.
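
A minimal Tkinter sketch of such a transparent, desktop-covering overlay is shown below (window transparency via the "-alpha" attribute works on Windows and macOS, and on X11 with a compositor). The original system forwards the strokes to handwriting and voice recognition engines; here the stroke points are only collected and drawn.

```python
# A minimal Tkinter sketch of a transparent, desktop-covering overlay;
# "-alpha" transparency works on Windows/macOS and on X11 compositors.
# The original system feeds the strokes to recognition engines; here
# the stroke points are only collected and drawn.
import tkinter as tk

root = tk.Tk()
root.attributes("-fullscreen", True)   # cover the whole desktop
root.attributes("-alpha", 0.3)         # keep the work context visible
canvas = tk.Canvas(root, highlightthickness=0)
canvas.pack(fill=tk.BOTH, expand=True)

stroke = []                            # points for a recognizer

def draw(event):
    stroke.append((event.x, event.y))
    canvas.create_oval(event.x - 2, event.y - 2,
                       event.x + 2, event.y + 2, fill="black")

canvas.bind("<B1-Motion>", draw)       # freeform pen/mouse handwriting
root.bind("<Escape>", lambda e: root.destroy())
root.mainloop()
```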

The User Interface of Button Type for Stereo Video-See-Through (Stereo Video-See-Through를 위한 버튼형 인터페이스)

  • Choi, Young-Ju; Seo, Young-Duek
    • Journal of the Korea Computer Graphics Society, v.13 no.2, pp.47-54, 2007
  • This paper proposes a user interface based on a video see-through environment that shows images from stereo cameras so that the user can easily control computer systems and other processes. We use AR technology to synthesize virtual buttons: the graphic images are overlaid in real time on the frames captured by the camera. We search for the hand position in the frames to judge whether the user has selected a button, and the result of this judgment is visualized by changing the button's color. The user can easily interact with the system by watching the screen and moving her fingers in the air to select the virtual on-screen button.
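
A rough OpenCV sketch of the virtual-button test follows: a button rectangle is overlaid on each camera frame, and a simple skin-color mask inside that region decides whether a hand is pressing it. The monocular capture, HSV skin range, and coverage threshold are illustrative assumptions; the paper works on stereo video-see-through imagery.

```python
# A rough monocular OpenCV sketch; the HSV skin range and the 30%
# coverage threshold are illustrative values, and the paper itself
# operates on stereo video-see-through imagery.
import cv2

BTN = (50, 50, 200, 130)                   # x1, y1, x2, y2 button region
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    x1, y1, x2, y2 = BTN
    roi = cv2.cvtColor(frame[y1:y2, x1:x2], cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(roi, (0, 40, 60), (25, 180, 255))
    pressed = cv2.countNonZero(skin) > 0.3 * skin.size
    color = (0, 0, 255) if pressed else (0, 255, 0)  # red when pressed
    cv2.rectangle(frame, (x1, y1), (x2, y2), color, 2)
    cv2.imshow("virtual button", frame)
    if cv2.waitKey(1) & 0xFF == 27:        # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```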

A Study on User Interface for Quiz Game Contents using Gesture Recognition (제스처인식을 이용한 퀴즈게임 콘텐츠의 사용자 인터페이스에 대한 연구)

  • Ahn, Jung-Ho
    • Journal of Digital Contents Society, v.13 no.1, pp.91-99, 2012
  • In this paper, we introduce a quiz application program that digitizes the analogue quiz game. We digitize quiz components such as quiz proceeding, participant recognition, problem presentation, recognition of the volunteer who raises his hand first, answer judgment, score addition, and winner decision, which are performed manually in a normal quiz game. For automation, we obtained depth images from the Kinect camera, which has recently come into the spotlight, located the quiz participants, and recognized user-friendly predefined gestures. Analyzing the depth distribution, we detected and segmented the upper body parts and located the hand areas. We also extracted hand features and designed a decision function that classifies the hand pose as palm, fist, or other, so that a participant can select the answer choice he wants among the presented examples. The implemented quiz application program was tested in real time and showed very satisfactory gesture recognition results.
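
The palm/fist decision can be illustrated with contour solidity (contour area divided by convex-hull area): an open palm with spread fingers has low solidity, a fist high. The sketch below applies this to a binary hand mask; the 0.85 threshold and the mask input are illustrative assumptions, as the paper's features are defined over Kinect depth segments.

```python
# A minimal sketch of a palm/fist decision via contour solidity
# (contour area / convex-hull area); the 0.85 threshold and the binary
# hand mask input are illustrative assumptions, not the paper's features.
import cv2
import numpy as np

def classify_hand(mask: np.ndarray) -> str:
    """mask: 8-bit binary image of the segmented hand region."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return "none"
    hand = max(contours, key=cv2.contourArea)      # largest blob = hand
    hull = cv2.convexHull(hand)
    solidity = cv2.contourArea(hand) / max(cv2.contourArea(hull), 1.0)
    return "fist" if solidity > 0.85 else "palm"   # palm has deep gaps

# Synthetic check: a filled circle is nearly convex, like a fist.
mask = np.zeros((120, 120), np.uint8)
cv2.circle(mask, (60, 60), 40, 255, -1)
print(classify_hand(mask))   # -> 'fist'
```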