• Title/Summary/Keyword: Hand Interface

Vision based Fast Hand Motion Recognition Method for an Untouchable User Interface of Smart Devices (스마트 기기의 비 접촉 사용자 인터페이스를 위한 비전 기반 고속 손동작 인식 기법)

  • Park, Jae Byung
    • Journal of the Institute of Electronics and Information Engineers / v.49 no.9 / pp.300-306 / 2012
  • In this paper, we propose a vision-based hand motion recognition method for an untouchable (non-contact) user interface on smart devices. First, the original color image is converted to grayscale and its spatial resolution is reduced, taking the small memory and low computational power of smart devices into consideration. For robust recognition of hand motions through the separation of horizontal and vertical motions, a horizontal principal area (HPA) and a vertical principal area (VPA) are defined. From the difference images of consecutively captured frames, the center of gravity (CoG) of the pixels significantly changed by hand motion is obtained, and the direction of the hand motion is detected by fitting a least-mean-squares line to the CoG trajectory over time. To verify the feasibility of the proposed method, experiments are carried out with a vision system.
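
The motion cue described above (frame differencing, CoG of the changed pixels, and a least-squares line over the CoG trajectory) can be illustrated with a minimal sketch. The difference threshold, frame format, and simple four-way classification below are assumptions for illustration, not the paper's HPA/VPA implementation.

```python
# Minimal sketch: difference image -> CoG of changed pixels -> direction
# from least-squares slopes of the CoG trajectory (illustrative values).
import numpy as np

def cog_of_change(prev_gray, cur_gray, diff_thresh=30):
    """Return the (x, y) centre of gravity of noticeably changed pixels."""
    diff = np.abs(cur_gray.astype(np.int16) - prev_gray.astype(np.int16))
    ys, xs = np.nonzero(diff > diff_thresh)
    if xs.size == 0:
        return None          # no significant motion in this frame pair
    return float(xs.mean()), float(ys.mean())

def motion_direction(cogs):
    """Classify a swipe from a short CoG trajectory via least-squares slopes."""
    pts = np.asarray(cogs, dtype=float)
    t = np.arange(len(pts))
    slope_x = np.polyfit(t, pts[:, 0], 1)[0]   # horizontal velocity estimate
    slope_y = np.polyfit(t, pts[:, 1], 1)[0]   # vertical velocity estimate
    if abs(slope_x) >= abs(slope_y):
        return "right" if slope_x > 0 else "left"
    return "down" if slope_y > 0 else "up"
```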

Human-Object Interaction Framework Using RGB-D Camera (RGB-D 카메라를 사용한 사용자-실사물 상호작용 프레임워크)

  • Baeka, Yong-Hwan;Lim, Changmin;Park, Jong-Il
    • Journal of Broadcast Engineering / v.21 no.1 / pp.11-23 / 2016
  • In recent years, the touch interface has become the most widely used way to interact with digital devices. Because of its usability, touch technology is applied almost everywhere, from watches to advertising boards, and its use keeps growing. However, this technology has a critical weakness: a touch input device normally needs a contact surface with touch sensors embedded in it, so touch interaction through ordinary objects such as books or documents is still unavailable. In this paper, a human-object interaction framework based on an RGB-D camera is proposed to overcome this limitation. The proposed framework can deal with occluded situations, such as a hand hovering over an object or an object being moved by hand, in which object recognition and hand gesture recognition algorithms may otherwise fail, and it handles these complicated circumstances without performance loss. The framework determines the status of each detected region with a fast and robust object recognition algorithm, deciding whether it is an object or a human hand, and a hand gesture recognition algorithm then controls the context of each object through gestures almost simultaneously.
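
One way such a hand-versus-object decision could be sketched is by combining a depth foreground with a skin-colour ratio. The depth margin, minimum pixel count, and HSV skin bounds below are illustrative assumptions, not the authors' recognition algorithm.

```python
# Rough sketch: pixels closer than the tabletop are foreground; a foreground
# blob with a high share of skin-coloured pixels is treated as the hand.
import numpy as np
import cv2

def classify_foreground(color_bgr, depth_mm, table_depth_mm,
                        margin_mm=20.0, skin_ratio_thresh=0.4):
    """Label the depth foreground as 'hand', 'object', or 'empty'."""
    depth_mm = np.asarray(depth_mm, dtype=float)
    # Pixels clearly above the tabletop count as foreground (0 = no reading).
    foreground = (depth_mm > 0) & (depth_mm < table_depth_mm - margin_mm)
    if foreground.sum() < 500:                 # too few pixels: nothing there
        return "empty"
    # Skin-colour test in HSV; the bounds are typical values, not tuned ones.
    hsv = cv2.cvtColor(color_bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255)) > 0
    ratio = (skin & foreground).sum() / foreground.sum()
    return "hand" if ratio > skin_ratio_thresh else "object"
```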

Design of Computer Vision Interface by Recognizing Hand Motion (손동작 인식에 의한 컴퓨터 비전 인터페이스 설계)

  • Yun, Jin-Hyun;Lee, Chong-Ho
    • Journal of the Institute of Electronics Engineers of Korea CI / v.47 no.3 / pp.1-10 / 2010
  • As various interfacing devices for computing machines are developed, a new HCI method using hand motion input is introduced. This interface is a vision-based approach that uses a single camera to detect and track hand movements. Previous research used only skin color for detecting and tracking the hand location; in our design, skin color and shape information are considered together, which improves hand detection. We propose a primary-orientation edge descriptor for obtaining edge information. The method uses only one hand model, so no training time is required. The system consists of a detection part and a tracking part for efficient processing, and the tracking part is quite robust to the orientation of the hand. The system is applied to recognizing numbers hand-written in script style using the DNAC algorithm. The proposed algorithm reaches an 82% recognition rate for detecting the hand region and 90% for recognizing a number written in script style.
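
A hedged sketch of combining a skin-colour mask with edge-orientation information, loosely in the spirit of the primary-orientation edge idea above. The HSV bounds, magnitude threshold, and histogram layout are assumptions rather than the paper's descriptor.

```python
# Sketch: dominant gradient orientation of strong edges inside the skin mask.
import numpy as np
import cv2

def primary_edge_orientation(frame_bgr, bins=8, mag_thresh=50.0):
    """Return the dominant edge-orientation bin within the skin-coloured region."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    skin = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255)) > 0
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)
    orientation = (np.arctan2(gy, gx) + np.pi) % np.pi   # edge angle in [0, pi)
    strong = skin & (magnitude > mag_thresh)
    hist, _ = np.histogram(orientation[strong], bins=bins, range=(0.0, np.pi))
    return int(hist.argmax())
```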

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun;Ahn, So-young;Shin, Dong-il;Shin, Dong-kyoo
    • Journal of Internet Computing and Services / v.16 no.6 / pp.11-21 / 2015
  • As interest in Human-Computer Interaction (HCI) grows, research on HCI has been actively conducted, including research on the Natural User Interface/Natural User eXperience (NUI/NUX), which uses a user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture or voice recognition, but these algorithms are complex to implement and need a lot of training time because they go through preprocessing, normalization, and feature-extraction steps. Recently, Microsoft launched Kinect as an NUI/NUX development tool, which has attracted attention, and many studies using Kinect have been conducted. In a previous study, the authors implemented a hand-mouse interface with outstanding intuitiveness using the user's physical features; however, it suffered from unnatural mouse movement and low accuracy of the mouse functions. In this study, we design and implement a hand-mouse interface that introduces a new concept called the 'virtual monitor', extracted from the user's physical features through Kinect in real time. The virtual monitor is a virtual space that can be controlled by the hand mouse, and coordinates on the virtual monitor are accurately mapped onto coordinates on the real monitor. The hand-mouse interface based on the virtual-monitor concept keeps the outstanding intuitiveness of the previous study and improves the accuracy of the mouse functions. Furthermore, we increase the accuracy of the interface by recognizing the user's unintended actions through a concentration indicator derived from electroencephalogram (EEG) data. To evaluate intuitiveness and accuracy, we tested the interface with 50 people ranging from their teens to their fifties. In the intuitiveness experiment, 84% of the subjects learned how to use it within one minute; in the accuracy experiment, the mouse functions reached accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). With the intuitiveness and accuracy of the proposed hand-mouse interface confirmed through these experiments, it is expected to be a good example of an interface for controlling systems by hand in the future.
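
The virtual-monitor mapping can be sketched as a linear map from a user-sized rectangle in front of the body onto screen pixels, gated by a concentration score. The rectangle definition, screen size, and threshold below are illustrative assumptions, not the authors' implementation.

```python
# Sketch: map a hand position inside a 'virtual monitor' rectangle onto the
# real screen, and suppress motion while EEG concentration is low.
import numpy as np

SCREEN_W, SCREEN_H = 1920, 1080              # real-monitor resolution (assumed)

def hand_to_screen(hand_xy, vm_origin, vm_size):
    """Map a hand position inside the virtual monitor onto screen pixels."""
    hand_xy = np.asarray(hand_xy, dtype=float)
    vm_origin = np.asarray(vm_origin, dtype=float)
    vm_size = np.asarray(vm_size, dtype=float)
    rel = (hand_xy - vm_origin) / vm_size    # normalised [0, 1] coordinates
    if np.any(rel < 0.0) or np.any(rel > 1.0):
        return None                          # hand is outside the plane
    x = int(rel[0] * (SCREEN_W - 1))
    y = int((1.0 - rel[1]) * (SCREEN_H - 1)) # sensor y grows upward here
    return x, y

def cursor_update(hand_xy, vm_origin, vm_size, concentration, thresh=0.5):
    """Ignore hand movement while the EEG concentration index is low."""
    if concentration < thresh:
        return None                          # treat as unintended movement
    return hand_to_screen(hand_xy, vm_origin, vm_size)
```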

Characterizations of Interface-state Density between Top Silicon and Buried Oxide on Nano-SOI Substrate by using Pseudo-MOSFETs

  • Cho, Won-Ju
    • JSTS: Journal of Semiconductor Technology and Science / v.5 no.2 / pp.83-88 / 2005
  • The interface states between the top silicon layer and the buried oxide layer of a nano-SOI substrate were characterized, and the effects of thermal treatment processes on the interface-state distributions were investigated for the first time using pseudo-MOSFETs. We found that the interface-state distributions were strongly influenced by the thermal treatment processes: interface states were generated by the rapid thermal annealing (RTA) process, and the interface-state density increased considerably when the RTA temperature was raised above 800°C. In particular, a peak in the interface-state distribution, which contributes to a hump in the subthreshold curve during inversion-mode operation of the pseudo-MOSFETs, was observed at the conduction-band side of the energy gap, but it was not observed in accumulation-mode operation. On the other hand, the interface-state density increased by the RTA process was effectively reduced by a relatively low-temperature conventional thermal annealing (CTA) process.

Design of Ball-based Mobile Haptic Interface (볼 기반의 모바일 햅틱 인터페이스 디자인)

  • Choi, Min-Woo;Kim, Joung-Hyun
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집) / 2009.02a / pp.122-128 / 2009
  • In this paper, we present the design and evaluation of a hand-held, ball-based haptic interface named "TouchBall." Using a trackball mechanism, the device provides flexibility in terms of directional degrees of freedom. It also has the advantage of transferring force feedback directly through frictional touch (with high sensitivity), thus requiring only a relatively small amount of inertia. This leads to a compact hand-held design appropriate for mobile and 3D interactive applications. The device is evaluated for the detection thresholds of force-feedback directions and for the perceived amount of directional force. The refined directional information can be combined with other modalities with less sensory conflict, enriching the user experience for a given application.
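
As a rough illustration only (not the TouchBall firmware), a desired planar force-feedback direction could be decomposed into drive commands for two orthogonal rollers turning the ball; the two-roller layout and the scaling are assumptions.

```python
# Sketch: split a planar force cue into x- and y-roller drive levels.
import math

def roller_commands(direction_deg, strength):
    """Decompose a desired force direction/strength into two roller drives."""
    rad = math.radians(direction_deg)
    return strength * math.cos(rad), strength * math.sin(rad)

# Example: a force cue pointing up-right at half strength.
x_drive, y_drive = roller_commands(45.0, 0.5)
```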

Implement of Hand Gesture Interface using Ratio and Size Variation of Gesture Clipping Region (제스쳐 클리핑 영역 비율과 크기 변화를 이용한 손-동작 인터페이스 구현)

  • Choi, Chang-Yur;Lee, Woo-Beom
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.13 no.1 / pp.121-127 / 2013
  • A vision-based hand-gesture interface method for substituting for a pointing device is proposed in this paper, which uses the ratio and size variation of the gesture region. The proposed method uses the skin hue and saturation of the hand region from the HSI color model to extract the hand region effectively; this removes non-hand regions and reduces noise caused by the light source. Also, because the method detects the ratio and size variation of the moving hand from the clipped hand region in real time, rather than recognizing static hand shapes, the amount of computation is reduced and a faster response is guaranteed. To evaluate its performance, the proposed method was applied as a pointing device in a computerized self visual-acuity testing system; it showed an average gesture recognition rate of 86% and a coordinate-movement recognition rate of 87%.
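
A sketch of tracking the clipped hand region's aspect ratio and area between frames, in the spirit of the method above. The HSV skin bounds (used here in place of the paper's HSI model) and the growth threshold are assumptions.

```python
# Sketch: bounding-box ratio/area of the largest skin blob, and a simple
# "push" detector based on a sudden growth of that area between frames.
import cv2

def hand_region_stats(frame_bgr):
    """Return (aspect_ratio, area) of the largest skin-coloured blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)   # OpenCV 4.x API
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(hand)
    return w / h, w * h

def detect_push(prev_stats, cur_stats, grow_thresh=1.3):
    """Read a sudden growth of the clipped region as a 'push' (click) gesture."""
    if prev_stats is None or cur_stats is None:
        return False
    return cur_stats[1] > grow_thresh * prev_stats[1]
```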

A new study on hand gesture recognition algorithm using leap motion system (Leap Motion 시스템을 이용한 손동작 인식기반 제어 인터페이스 기술 연구)

  • Nam, Jae-Hyun;Yang, Seung-Hun;Hu, Woong;Kim, Byung-Gyu
    • Journal of Korea Multimedia Society / v.17 no.11 / pp.1263-1269 / 2014
  • With the rapid development of new hardware control interface technology, new concepts have continually been proposed and have emerged. In this paper, a new approach based on the Leap Motion system is proposed: using position information from the sensor, hand gestures are recognized against pre-defined patterns. To do this, we design a recognition algorithm based on hand gesture and finger patterns. As a representative application, we apply the proposed scheme to a software tool for controlling and editing 3-dimensional avatars when making animations in cyberspace. The proposed algorithm can be used to control computer systems in medical treatment, games, education, and various other areas.
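
A minimal sketch of pattern-based gesture classification from 3-D hand data, assuming fingertip and palm positions have already been read from the sensor. The extension distance and the gesture table are illustrative, not the paper's patterns.

```python
# Sketch: classify a gesture from how many fingertips are extended away
# from the palm (positions assumed to come from the motion sensor, in mm).
import numpy as np

def count_extended_fingers(palm, fingertips, extend_dist=60.0):
    """Count fingertips farther than extend_dist from the palm centre."""
    palm = np.asarray(palm, dtype=float)
    return sum(np.linalg.norm(np.asarray(tip, dtype=float) - palm) > extend_dist
               for tip in fingertips)

def classify_gesture(palm, fingertips):
    """Map the extended-finger count onto a small set of control commands."""
    n = count_extended_fingers(palm, fingertips)
    return {0: "grab", 1: "point", 2: "rotate", 5: "release"}.get(n, "idle")
```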

Functional Analysis and Design of Touch User Interface in Mobile Game (모바일게임 터치사용자인터페이스(TUI)의 기능적 분석 및 설계)

  • Kim, Mi-Jin;Yoon, Jin-Hong
    • The Journal of the Korea Contents Association / v.10 no.1 / pp.138-146 / 2010
  • Mobile phones now offer new features, including control interfaces that are easy, intuitive, and varied, and displays that cover a wide area. Thanks to these strengths, phones equipped with touch screens are being released actively, and this is the megatrend in the development of recent mobile games. Mobile games built for the older keypad input system have had to change to adapt to the new input environment and to keep developing. Consequently, research on the Touch User Interface (TUI) of mobile games whose input environment is fixed to the touch screen is necessary. This study concretizes the application method of touch games through a comparative analysis with earlier games and examines the usability of ten touch mobile game titles released at home and abroad, in order to apply a touch interface suited to the game to handheld devices with touch functionality. The results of this study have two implications. First, they enhance the playability and genre diversity that were restricted by the limitations of the earlier keypad input devices. Second, they provide a basis for standardizing the interface of touch mobile games by genre.

A Study on Gesture Interface through User Experience (사용자 경험을 통한 제스처 인터페이스에 관한 연구)

  • Yoon, Ki Tae;Cho, Eel Hea;Lee, Jooyoup
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.7 no.6 / pp.839-849 / 2017
  • Recently, the role of the kitchen has evolved from a space for mere survival into a space that reflects present-day life and culture. Along with these changes, the use of IoT technology is spreading, leading to the development and diffusion of new smart devices in the kitchen. The user experience of these smart devices is also becoming important, and for natural interaction between a user and a computer, better interaction can be expected when it is based on context awareness. This paper examines a Natural User Interface (NUI) that does not require touching the device, based on the user interface (UI) of smart devices used in the kitchen. In this method, image-processing technology is used to recognize the user's hand gestures from the camera attached to the device, and the recognized hand shape is applied to the interface. The gestures used in this study are proposed according to the user's context and situation, and five kinds of gestures are classified and used in the interface.
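
The context-dependent use of a small gesture set can be sketched as a simple dispatch table. The contexts, gesture names, and actions below are assumptions for illustration, not the five gestures defined in the study.

```python
# Sketch: the same recognised hand shape triggers different actions
# depending on the device's current context.
GESTURE_ACTIONS = {
    "recipe_view": {"swipe_left": "next_step", "swipe_right": "prev_step",
                    "open_palm": "pause_video"},
    "timer":       {"open_palm": "stop_alarm", "swipe_left": "add_minute"},
}

def dispatch(context, gesture):
    """Return the action bound to a gesture in the current context, if any."""
    return GESTURE_ACTIONS.get(context, {}).get(gesture)

# Example: a 'swipe_left' while the recipe is shown advances to the next step.
assert dispatch("recipe_view", "swipe_left") == "next_step"
```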