• Title/Summary/Keyword: Hand and finger recognition

Smart Wrist Band Considering Wrist Skin Curvature Variation for Real-Time Hand Gesture Recognition (실시간 손 제스처 인식을 위하여 손목 피부 표면의 높낮이 변화를 고려한 스마트 손목 밴드)

  • Yun Kang;Joono Cheong
    • The Journal of Korea Robotics Society / v.18 no.1 / pp.18-28 / 2023
  • This study introduces a smart wristband system that recognizes finger motion from pressure measurements of wrist skin curvature variation. The band is easy to put on and take off and requires neither a pre-adaptation period nor surgery. By analyzing the depth variation of the wrist skin during each finger motion, we determined, with anatomical considerations, the most suitable location for each Force Sensitive Resistor (FSR) attached inside the wristband. A 3D depth camera was used to identify distinctive wrist locations, corresponding to the anatomically decoupled thumb, index, and middle fingers, where the skin curvature variations appear independently. Sensors in the wristband were then placed at those points to measure the pressure changes and, from them, the finger motions. The smart wristband was validated through two demonstrative applications: real-time control of a prosthetic robot hand and natural human-computer interfacing. We expect other human-centered applications to benefit from the proposed system as well.
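
As a rough illustration of the sensing idea, the sketch below reads hypothetical FSR channels placed at wrist points associated with the thumb, index, and middle fingers, subtracts a rest-pose baseline, and flags a finger as flexed when its channel rises above a threshold. The read_fsr() driver, the calibration procedure, and the threshold are all assumptions for illustration, not the paper's actual hardware interface or classifier.

    import random  # stand-in for a real ADC driver

    FINGERS = ("thumb", "index", "middle")

    def read_fsr() -> dict:
        """Return one pressure sample per FSR channel (hypothetical driver)."""
        return {f: random.uniform(0.0, 1.0) for f in FINGERS}

    def calibrate(n_samples: int = 50) -> dict:
        """Average a short rest-pose recording to get per-channel baselines."""
        acc = {f: 0.0 for f in FINGERS}
        for _ in range(n_samples):
            sample = read_fsr()
            for f in FINGERS:
                acc[f] += sample[f]
        return {f: acc[f] / n_samples for f in FINGERS}

    def detect_flexed(baseline: dict, threshold: float = 0.2) -> list:
        """Report fingers whose wrist-skin pressure rose clearly above baseline."""
        sample = read_fsr()
        return [f for f in FINGERS if sample[f] - baseline[f] > threshold]

    if __name__ == "__main__":
        base = calibrate()
        print("flexed fingers:", detect_flexed(base))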

A Finger Counting Method for Gesture Recognition (제스처 인식을 위한 손가락 개수 인식 방법)

  • Lee, DoYeob;Shin, DongKyoo;Shin, DongIl
    • Journal of Internet Computing and Services / v.17 no.2 / pp.29-37 / 2016
  • Humans develop and maintain relationships through communication, which is largely divided into verbal and non-verbal communication. Verbal communication uses language or written characters, while non-verbal communication uses body language. In everyday conversation we use gestures together with speech. Gestures belong to non-verbal communication and can convey an opinion through a variety of shapes and movements. For this reason, gestures have drawn attention as a means of implementing an NUI/NUX in the fields of HCI and HRI. In this paper, using Kinect and the geometric features of the hand, we propose a method for detecting the hand area and recognizing the number of extended fingers. The hand region is detected from the Kinect depth image, and the number of fingers is identified by comparing the distances between the hand outline and the center point of the hand. The average recognition rate of the proposed method for counting fingers is 98.5%. By widening the range of expressible gestures, the proposed method can help enhance human-computer interaction.
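
The outline-versus-center rule described in the abstract can be sketched roughly as follows: estimate the palm center from a binary hand mask (e.g., thresholded from the Kinect depth image), measure the distance from that center to every contour point, and count the groups of outline points lying well beyond the palm radius. The 1.6x radius factor and the grouping rule are assumptions; the paper's exact criterion may differ.

    import cv2
    import numpy as np

    def count_fingers(hand_mask: np.ndarray) -> int:
        """hand_mask: uint8 binary image, 255 inside the segmented hand."""
        contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        if not contours:
            return 0
        contour = max(contours, key=cv2.contourArea).reshape(-1, 2)

        # Palm center: the point deepest inside the mask (distance-transform maximum).
        dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
        _, palm_radius, _, center = cv2.minMaxLoc(dist)
        center = np.array(center, dtype=np.float32)

        # Distance profile from the palm center to every outline point.
        radii = np.linalg.norm(contour.astype(np.float32) - center, axis=1)

        # Count runs of consecutive outline points far from the center;
        # each run is treated as one extended finger (1.6x is an assumed factor).
        far = radii > 1.6 * palm_radius
        groups = np.count_nonzero(far[1:] & ~far[:-1]) + int(far[0] and not far[-1])
        return int(groups)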

Analysis of Face Direction and Hand Gestures for Recognition of Human Motion (인간의 행동 인식을 위한 얼굴 방향과 손 동작 해석)

  • Kim, Seong-Eun;Jo, Gang-Hyeon;Jeon, Hui-Seong;Choe, Won-Ho;Park, Gyeong-Seop
    • Journal of Institute of Control, Robotics and Systems / v.7 no.4 / pp.309-318 / 2001
  • In this paper, we describe methods for analyzing human gestures. A human interface (HI) system for gesture analysis extracts the head and hand regions from image sequences of an operator's continuous behavior captured with CCD cameras. Since gestures are performed with the operator's head and hand motions, we extract the head and hand regions and compute geometric information about the extracted skin regions. Head motion is analyzed by estimating the face direction: we model the head as an ellipsoid in 3D coordinates, locate facial features such as the eyes, nose, and mouth on its surface, and take the angle of the feature-point center on the ellipsoid as the face direction. The hand region obtained from preprocessing may include the arm as well as the hand, so we find the wrist line to separate the two. After isolating the hand region at the wrist line, we model it as an ellipse for analysis, with the fingers represented as long, narrow shapes, and extract hand information such as size, position, and shape. (A code sketch of this ellipse-fitting step follows the entry.)

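As referenced in the abstract above, here is a minimal sketch of the ellipse-fitting step, written with generic OpenCV calls rather than the authors' original implementation: fit an ellipse to the hand contour (after the arm has been removed at the wrist line) to summarize the hand's position, size, and orientation.

    import cv2
    import numpy as np

    def describe_hand(hand_mask: np.ndarray) -> dict:
        """hand_mask: uint8 binary image, 255 inside the segmented hand region."""
        contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return {}
        contour = max(contours, key=cv2.contourArea)
        if len(contour) < 5:          # fitEllipse needs at least 5 points
            return {}
        (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(contour)
        return {
            "position": (cx, cy),         # ellipse center
            "size": (ax1, ax2),           # full axis lengths in pixels
            "orientation_deg": angle,     # rotation of the ellipse
            "area": cv2.contourArea(contour),
        }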

Color-Based Real-Time Hand Region Detection with Robust Performance in Various Environments (다양한 환경에 강인한 컬러기반 실시간 손 영역 검출)

  • Hong, Dong-Gyun;Lee, Donghwa
    • IEMEK Journal of Embedded Systems and Applications / v.14 no.6 / pp.295-311 / 2019
  • The smart product market is growing year by year, and smart products are used in many areas. Users interact with them through various inputs such as voice recognition, touch, and finger movements. Detecting an accurate hand region is the essential first step in recognizing hand movement. In this paper, we propose a method for detecting an accurate hand region in real time in various environments. Conventional approaches include using depth information from a multi-sensor camera, detecting the hand through machine learning, and detecting the hand region with a color model. The first two require a large amount of computation and effectively demand a high-performance PC; such computational loads are unsuitable for embedded systems, and high-end hardware raises the price of smart products. The algorithm proposed in this paper detects the hand region using a color model, corrects the problems of existing color-based hand detection algorithms, and detects an accurate hand region across various experimental environments.
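
A minimal sketch of color-model hand detection in the spirit described above: threshold skin tones in YCrCb space, clean the mask with morphology, and keep the largest connected component as the hand candidate. The Cr/Cb bounds below are common textbook values, not the thresholds or corrections proposed in this paper.

    import cv2
    import numpy as np

    def detect_hand_region(frame_bgr: np.ndarray) -> np.ndarray:
        """Return a uint8 mask (255 = hand candidate) for a BGR frame."""
        ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
        mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))

        # Remove speckle noise and fill small holes.
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

        # Keep only the largest skin-colored blob as the hand candidate.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        clean = np.zeros_like(mask)
        if contours:
            largest = max(contours, key=cv2.contourArea)
            cv2.drawContours(clean, [largest], -1, 255, thickness=cv2.FILLED)
        return clean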

HAND GESTURE INTERFACE FOR WEARABLE PC

  • Nishihara, Isao;Nakano, Shizuo
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.664-667 / 2009
  • There is strong demand for wearable PC systems that can support the user outdoors. When we are outdoors, our movement makes it impossible to use traditional input devices such as keyboards and mice. We propose a hand gesture interface based on image processing to operate wearable PCs. A semi-transparent PC screen is displayed on a head-mounted display (HMD), and the user makes hand gestures to select icons on the screen. The user's hand is extracted from images captured by a color camera mounted above the HMD. Since skin color can vary widely under outdoor lighting, a key problem is accurately discriminating the hand from the background. The proposed method does not assume any fixed skin color space. First, the image is divided into blocks, and blocks with similar average color are linked. The resulting contiguous regions are then subjected to hand recognition, and blocks on the edges of the hand region are subdivided for more accurate finger discrimination. A change in hand shape is recognized as hand movement, and the current input interface maps a hand grasp to a mouse click. Tests on a prototype system confirm that the proposed method recognizes hand gestures accurately at high speed. We intend to develop a wider range of recognizable gestures. (A simplified sketch of the block-linking step follows this entry.)

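The block-linking step referenced in the abstract might look roughly like the sketch below: divide the frame into fixed-size blocks, compute each block's average color, and flood-fill the block grid so that neighboring blocks with similar averages end up in the same region. The block size and similarity threshold are assumptions, and the paper's edge-block subdivision step is omitted.

    import numpy as np

    def link_similar_blocks(image: np.ndarray, block: int = 16,
                            max_color_dist: float = 30.0) -> np.ndarray:
        """Return a (rows, cols) label map of linked blocks for an HxWx3 image."""
        h, w = image.shape[:2]
        rows, cols = h // block, w // block

        # Average color of every block.
        means = image[:rows * block, :cols * block].reshape(
            rows, block, cols, block, 3).mean(axis=(1, 3))

        # Flood-fill the block grid, joining 4-neighbors with similar averages.
        labels = -np.ones((rows, cols), dtype=int)
        current = 0
        for r in range(rows):
            for c in range(cols):
                if labels[r, c] != -1:
                    continue
                stack = [(r, c)]
                labels[r, c] = current
                while stack:
                    y, x = stack.pop()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < rows and 0 <= nx < cols and labels[ny, nx] == -1:
                            if np.linalg.norm(means[ny, nx] - means[y, x]) < max_color_dist:
                                labels[ny, nx] = current
                                stack.append((ny, nx))
                current += 1
        return labels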

Development of Apple Harvesting Robot(I) - Development of Robot Hand for Apple Harvesting - (사과 수확 로봇의 핸드 개발(I) - 사과 수확용 로봇의 핸드 개발 -)

  • 장익주;김태한;권기영
    • Journal of Biosystems Engineering / v.22 no.4 / pp.411-420 / 1997
  • Mechanization efficiency is high in paddy rice farming, where high-capacity machines such as tractors and combines are used, but mechanizing the harvest of fruits and vegetables is difficult because they are easily damaged. Advanced techniques for handling fruits and vegetables carefully are therefore necessary for automation and robotization. An apple harvesting robot must have a recognition device to detect the position of the fruit, manipulators that function like human arms, and a hand to pick the fruit. This study concerns the development of a robotic hand as the first stage in developing the apple harvesting robot. The results are summarized as follows. 1. A hand using eccentric rotary motion harvested more efficiently than a hand using semicircular up-and-down motion. 2. The hand was designed to regulate changes in grasp force using tape-type switch sensors attached to the inside of the fingers. 3. Initial finger positioning was set up for accurate harvesting by using a two-step finger position. 4. The study showed the feasibility of apple harvesting with the developed robot hand. (A schematic sketch of the switch-based grasp loop follows this entry.)

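The switch-based grasp loop referenced in the abstract could be sketched, purely illustratively, as below: close the fingers in small steps until every tape-type switch on the inner finger surfaces reports contact, then stop so the fruit is not crushed. The read_switches() and step_close() functions are hypothetical stand-ins for the robot's actual sensor and actuator interfaces.

    import time

    def read_switches() -> list:
        """Return contact states of the tape-type switches (hypothetical driver)."""
        raise NotImplementedError("replace with real sensor I/O")

    def step_close(increment_deg: float) -> None:
        """Close all fingers by a small angle (hypothetical actuator command)."""
        raise NotImplementedError("replace with real motor command")

    def grasp(increment_deg: float = 2.0, timeout_s: float = 5.0) -> bool:
        """Close until every finger switch reports contact, or give up."""
        deadline = time.monotonic() + timeout_s
        while time.monotonic() < deadline:
            if all(read_switches()):
                return True          # all fingers in contact: grasp complete
            step_close(increment_deg)
            time.sleep(0.05)
        return False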

The input device system with hand motion using hand tracking technique of CamShift algorithm (CamShift 알고리즘의 Hand Tracking 기법을 응용한 Hand Motion 입력 장치 시스템)

  • Jeon, Yu-Na;Kim, Soo-Ji;Lee, Chang-Hoon;Kim, Hyeong-Ryul;Lee, Sung-Koo
    • Journal of Digital Contents Society / v.16 no.1 / pp.157-164 / 2015
  • Existing input devices are largely limited to the keyboard and mouse, but new types of input devices have recently been developed in response to user demand. Reflecting this trend, we propose a new type of input device that issues commands by analyzing hand motion in camera images, without any special hardware. After binarizing the skin-color area and tracking it with the CamShift method, the system recognizes hand motion by labeling the separated finger regions, quantizing their angles from the palm center point into the four cardinal directions, and counting them. Without a controlled background and without gloves, the recognition rate remained at approximately 75 percent; with a controlled background and red gloves, it increased to 90.2 percent owing to the reduction in noise.
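
A condensed sketch of skin-color tracking with CamShift in the spirit of the system above: build a hue histogram from an initial hand region, back-project it onto each frame, and let cv2.CamShift follow the hand window. The finger- and angle-counting stage of the paper is not reproduced, and initial_window is an assumed, user-supplied bounding box.

    import cv2
    import numpy as np

    def track_hand(video_path: str, initial_window: tuple) -> None:
        cap = cv2.VideoCapture(video_path)
        ok, frame = cap.read()
        if not ok:
            return

        # Hue histogram of the initial hand region (reasonably saturated pixels only).
        x, y, w, h = initial_window
        roi_hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(roi_hsv, (0, 60, 32), (180, 255, 255))
        hist = cv2.calcHist([roi_hsv], [0], mask, [180], [0, 180])
        cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

        term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
        window = initial_window
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
            backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
            rotated_box, window = cv2.CamShift(backproj, window, term)

            # Draw the tracked hand window for visual inspection.
            pts = np.asarray(cv2.boxPoints(rotated_box), dtype=np.int32)
            cv2.polylines(frame, [pts], True, (0, 255, 0), 2)
            cv2.imshow("hand", frame)
            if cv2.waitKey(30) & 0xFF == 27:   # Esc to quit
                break
        cap.release()
        cv2.destroyAllWindows()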

A Development of Framework for Selecting Labor Attendance Management System Considering Condition of Construction Site (건설 현장 특성을 고려한 출역관리시스템 선정 프레임워크 개발)

  • Kim, Seong-Ah;Chin, Sang-Yoon;Jang, Moon-Seok;Jung, Choong-Won;Choi, Cheol-Ho
    • Korean Journal of Construction Engineering and Management / v.16 no.4 / pp.60-69 / 2015
  • Labor attendance management has traditionally been carried out by manually recording attendance in tables, which requires considerable time and effort. As electronic technologies such as barcodes, Quick Response codes, and Radio Frequency Identification (RFID) have developed, however, automated labor attendance management systems have appeared, and various labor recognition devices combined with biometrics (fingerprint, vein, face recognition, etc.) have recently been released. Although these devices can check attendance automatically, there is insufficient guidance for selecting the appropriate labor attendance management system for a construction site. Therefore, this study proposes a decision framework for determining which labor attendance management system suits a given construction site and for selecting the labor recognition device. We investigated different labor recognition devices, focusing on how they work, and tested their performance and usability for construction labor attendance management. The test results showed that RFID is most suitable for verifying the attendance of many laborers in a short period of time, while the hand-vein and fingerprint recognition devices did not function when foreign material such as cement or paint was on the laborer's hand, except in the case of a deformed finger. Reflecting these results, this study suggests a framework for selecting a labor attendance system and recognition device, which is expected to contribute to more efficient labor management.

Human Friendly Recognition and Editing Support System of Korean Language (인간에게 친밀한 한글 인식 및 편집 지원시스템)

  • Sohn, Young-Sun
    • Journal of the Korean Institute of Intelligent Systems / v.17 no.4 / pp.494-499 / 2007
  • In this paper we implement a system that, when a user reading a book or paper selects an area containing the important or to-be-arranged parts, edits, stores, and rearranges the characters in that area by outputting them to a word processor in sequence. When the user selects the desired area with a finger, the system detects the finger movement using a hand recognition algorithm and recognizes the selected area. It converts the width and height of the selected area into motor pulse counts and drives motors to move the camera to that position. After scaling the zoom so that the characters can be recognized and coarsely setting the focus for that zoom level, it fine-tunes the focus using the contrast between light and dark areas to obtain a sharper image. We thus realize a Korean-language recognition and editing support system that converts the captured images into a document with a character recognition algorithm and arranges the important parts.
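
The contrast-based fine-focusing step can be sketched as follows: sweep candidate focus settings, score each captured frame with a sharpness measure (variance of the Laplacian, i.e., the strength of light/dark transitions), and keep the setting with the highest score. The set_focus and capture_frame callables are hypothetical stand-ins for the camera interface; the paper's motorized zoom control is not reproduced.

    import cv2
    import numpy as np

    def sharpness(gray: np.ndarray) -> float:
        """Higher means stronger light/dark transitions, i.e., a sharper image."""
        return float(cv2.Laplacian(gray, cv2.CV_64F).var())

    def fine_focus(set_focus, capture_frame, candidates):
        """Try each candidate focus value and return the one giving the sharpest frame."""
        best_value, best_score = None, -1.0
        for value in candidates:
            set_focus(value)
            gray = cv2.cvtColor(capture_frame(), cv2.COLOR_BGR2GRAY)
            score = sharpness(gray)
            if score > best_score:
                best_value, best_score = value, score
        return best_value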

Implementation of Hand-Gesture-Based Augmented Reality Interface on Mobile Phone (휴대폰 상에서의 손동작 기반 증강현실 인터페이스 구현)

  • Choi, Jun-Yeong;Park, Han-Hoon;Park, Jung-Sik;Park, Jong-Il
    • Journal of Broadcast Engineering / v.16 no.6 / pp.941-950 / 2011
  • With recent advances in the performance of mobile phones, many effective interfaces for them have been proposed. This paper implements a hand-gesture- and vision-based interface on a mobile phone. It assumes a natural interaction scenario in which the user holds the phone in one hand and views the other hand's palm through the phone's camera. A virtual object is then rendered on the palm and reacts to hand and finger movements. Since the interface is based on the hand, which is familiar to everyone, and requires no additional sensors or markers, the user can interact freely with the virtual object anytime and anywhere without training. The implemented interface ran at 5 fps on a mobile phone (a Galaxy S2 with a dual-core processor).
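
A simplified sketch of the palm-anchoring idea: from a binary hand mask, take the distance-transform maximum as the palm center and its value as a rough palm radius, giving a 2D anchor and scale at which a virtual object could be rendered. The paper's full pose estimation and finger-interaction logic are not reproduced here.

    import cv2
    import numpy as np

    def palm_anchor(hand_mask: np.ndarray):
        """hand_mask: uint8, 255 inside the hand. Returns ((x, y), radius) or None."""
        if cv2.countNonZero(hand_mask) == 0:
            return None
        dist = cv2.distanceTransform(hand_mask, cv2.DIST_L2, 5)
        _, radius, _, center = cv2.minMaxLoc(dist)
        return center, float(radius)   # render the virtual object centered here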