• Title/Summary/Keyword: Gestures Mouse

Search Results: 28

Hand Tracking and Hand Gesture Recognition for Human Computer Interaction

  • Bai, Yu;Park, Sang-Yun;Kim, Yun-Sik;Jeong, In-Gab;Ok, Soo-Yol;Lee, Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.14 no.2
    • /
    • pp.182-193
    • /
    • 2011
  • The aim of this paper is to present a methodology for hand tracking and hand gesture recognition. The detected hand and recognized gestures can be used to implement a non-contact mouse; we developed an MP3 player controlled with this technology instead of a mouse. The algorithm first applies pre-processing to every frame, including lighting compensation and background filtering, to reduce the adverse impact on the correctness of hand tracking and gesture recognition. Second, a YCbCr skin-color likelihood algorithm is used to detect the hand area. The hand is then tracked with the Continuously Adaptive Mean Shift (CAMSHIFT) algorithm. Because the formula-based region of interest is square while the hand is closer to rectangular, we improved the search-window formula to obtain a window that better fits the hand. Support Vector Machine (SVM) classification is then used for hand gesture recognition; to train the system, we collected 1,500 pictures of 5 hand gestures. Finally, we performed extensive experiments on a Windows XP system to evaluate the efficiency of the proposed scheme. The hand tracking correct rate is 96% and the average hand gesture correct rate is 95%.
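The pipeline in this abstract (skin-color gating in YCbCr followed by a CAMSHIFT-style search-window update) can be sketched in NumPy as follows. This is a minimal illustration, not the authors' implementation: the fixed Cr/Cb thresholds and the window-sizing factor are assumptions standing in for the paper's likelihood model and its improved (rectangular) window formula.

```python
import numpy as np

def skin_mask(ycrcb, cr=(133, 173), cb=(77, 127)):
    # Binary skin map from a YCrCb image (H, W, 3). Fixed Cr/Cb ranges
    # are a common textbook choice standing in for the paper's
    # skin-color likelihood model.
    return ((ycrcb[..., 1] >= cr[0]) & (ycrcb[..., 1] <= cr[1]) &
            (ycrcb[..., 2] >= cb[0]) & (ycrcb[..., 2] <= cb[1]))

def camshift_step(mask, win, iters=10, scale=1.2):
    # Mean-shift the search window over a binary likelihood mask.
    # The window is re-sized from the pixel spread along each axis, so
    # it stays rectangular rather than square, echoing the adjustment
    # the abstract describes (exact formula assumed).
    x, y, w, h = win
    for _ in range(iters):
        roi = mask[y:y + h, x:x + w]
        if roi.sum() == 0:                      # target lost: keep last window
            break
        ys, xs = np.nonzero(roi)
        cx, cy = xs.mean() + x, ys.mean() + y   # centroid (first moments)
        w = max(4, int(scale * 4 * xs.std()))   # width from horizontal spread
        h = max(4, int(scale * 4 * ys.std()))   # height from vertical spread
        x, y = max(0, int(cx - w / 2)), max(0, int(cy - h / 2))
    return x, y, w, h
```

Iterating re-centers the window on the skin-pixel centroid and lets the window's aspect ratio follow the hand instead of staying square.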

Real-time Multi-device Control System Implementation for Natural User Interactive Platform

  • Kim, Myoung-Jin;Hwang, Tae-min;Chae, Sung-Hun;Kim, Min-Joon;Moon, Yeon-Kug;Kim, SeungJun
    • Journal of Internet Computing and Services
    • /
    • v.23 no.1
    • /
    • pp.19-29
    • /
    • 2022
  • A natural user interface (NUI) provides natural motion-based interaction without a specific device or tool such as a mouse, keyboard, or pen. Recently, as non-contact sensor-based interaction technologies for recognizing human motion, gestures, voice, and gaze have been actively studied, an environment has emerged that can provide more diverse content through various interaction methods than existing approaches. However, as the number of sensor devices rapidly increases, a system using many sensors can suffer from a lack of computational resources. To address this problem, we propose a real-time multi-device control system for a natural interactive platform. In the proposed system, devices are classified into two types: HC devices, such as high-end commercial sensors, and LC devices, such as traditional low-cost monitoring sensors. A dedicated device manager for each type controls the devices efficiently. We demonstrate that the proposed system works properly with user behaviors such as gestures, motions, gazes, and voices.
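The two-tier device-manager split described in the abstract can be sketched as follows. The class and device names here are hypothetical, since the paper does not publish its API; only the idea of one manager per device class (HC vs. LC) dispatching events is taken from the abstract.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceManager:
    # One manager per device class, mirroring the paper's split between
    # high-end commercial (HC) sensors and low-cost (LC) monitoring sensors.
    name: str
    devices: dict = field(default_factory=dict)

    def register(self, dev_id, handler):
        self.devices[dev_id] = handler

    def dispatch(self, dev_id, event):
        return self.devices[dev_id](event)

hc = DeviceManager("HC")   # e.g. depth/gaze sensors (hypothetical IDs below)
lc = DeviceManager("LC")   # e.g. simple monitoring sensors
hc.register("gesture_cam", lambda e: f"gesture:{e}")
lc.register("temp_sensor", lambda e: f"temp:{e}")
```

Splitting the registry this way lets each manager apply its own scheduling or resource policy, which is the resource-contention concern the abstract raises.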

Hand Gesture Classification Using Multiple Doppler Radar and Machine Learning (다중 도플러 레이다와 머신러닝을 이용한 손동작 인식)

  • Baik, Kyung-Jin;Jang, Byung-Jun
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.28 no.1
    • /
    • pp.33-41
    • /
    • 2017
  • This paper suggests a hand gesture recognition technology for controlling smart devices using multiple Doppler radars and a support vector machine (SVM), one of the machine learning algorithms. Whereas a single Doppler radar can recognize only simple hand gestures, multiple Doppler radars can recognize various complex hand gestures by exploiting the Doppler patterns as a function of time and of each device. In addition, machine learning can enhance recognition accuracy. To determine the feasibility of the suggested technology, we implemented a test-bed using two Doppler radars, an NI DAQ USB-6008, and MATLAB. Using this test-bed, we successfully classified four hand gestures: Push, Pull, Right Slide, and Left Slide. Applying the SVM machine learning algorithm, the high accuracy of the hand gesture recognition was confirmed.
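The core idea, that the pattern of Doppler shifts across the two radars distinguishes the four gestures, can be illustrated with a toy simulation. This is a sketch under stated assumptions: real radar returns are noisy, and the paper uses an SVM rather than the lookup table used here as a stand-in.

```python
import numpy as np

def doppler_sign(iq, fs=1000.0):
    # Mean instantaneous frequency of complex baseband I/Q samples:
    # positive -> target approaching the radar, negative -> receding.
    phase = np.unwrap(np.angle(iq))
    return int(np.sign(np.diff(phase).mean() * fs / (2 * np.pi)))

def classify(sig_a, sig_b):
    # Lookup table standing in for the paper's SVM: with two radars,
    # the sign pattern alone already separates the four gestures.
    pattern = (doppler_sign(sig_a), doppler_sign(sig_b))
    return {(1, 1): "Push", (-1, -1): "Pull",
            (1, -1): "Right Slide", (-1, 1): "Left Slide"}[pattern]

def tone(f_hz, fs=1000.0, n=512):
    # Ideal single-target return with Doppler shift f_hz (no noise).
    t = np.arange(n) / fs
    return np.exp(2j * np.pi * f_hz * t)
```

A push moves the hand toward both radars (both shifts positive), while a slide approaches one radar and recedes from the other, which is why a single radar cannot separate the lateral gestures.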

Efficient Fingertip Tracking and Mouse Pointer Control for Implementation of a Human Mouse (휴먼마우스 구현을 위한 효율적인 손끝좌표 추적 및 마우스 포인트 제어기법)

  • 박지영;이준호
    • Journal of KIISE:Software and Applications
    • /
    • v.29 no.11
    • /
    • pp.851-859
    • /
    • 2002
  • This paper discusses the design of a working system that visually recognizes hand gestures for the control of a window-based user interface. We present a method for tracking the fingertip of the index finger using a single camera. Our method is based on the CAMSHIFT algorithm and performs better than CAMSHIFT in that it tracks the particular hand poses used in the system well even in complex backgrounds. We describe how the location of the fingertip is mapped to a location on the monitor, and how it is both necessary and possible to smooth the path of the fingertip using a physical model of a mouse pointer. Our method tracks in real time without absorbing a major share of computational resources. The performance of our system shows great promise that this methodology can be used to control computers in the near future.
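The path-smoothing step can be sketched as follows. The abstract only says a "physical model of a mouse pointer" is used; the critically damped spring below, and its constants, are assumptions standing in for the authors' model.

```python
import numpy as np

def smooth_pointer(targets, k=60.0, dt=1 / 30.0):
    # Pull the on-screen pointer toward the raw fingertip coordinate
    # each frame with a critically damped spring: jittery fingertip
    # measurements are filtered without overshoot.
    c = 2.0 * np.sqrt(k)                  # critical damping coefficient
    pos = np.array(targets[0], float)
    vel = np.zeros_like(pos)
    path = [pos.copy()]
    for tgt in targets[1:]:
        acc = k * (np.asarray(tgt, float) - pos) - c * vel
        vel += acc * dt                   # semi-implicit Euler step
        pos += vel * dt
        path.append(pos.copy())
    return np.array(path)
```

Because the pointer has inertia, a one-frame detection glitch nudges it only slightly, while a sustained fingertip movement is followed after a short, smooth lag.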

Robot Gesture Recognition System based on PCA algorithm (PCA 알고리즘 기반의 로봇 제스처 인식 시스템)

  • Youk, Yui-Su;Kim, Seung-Young;Kim, Sung-Ho
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2008.04a
    • /
    • pp.400-402
    • /
    • 2008
  • Human-computer interaction (HCI) technology, which plays an important role in the exchange of information between human beings and computers, is a key field of information technology. Recently, studies in which robots and control devices are controlled by the movements of a person's body or hands, without conventional input devices such as a keyboard and mouse, have been pursued in diverse directions, and their importance has been steadily increasing. This study proposes a method for recognizing user gestures by applying measurements from an acceleration sensor to the PCA algorithm.
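The PCA-based recognition step can be sketched in NumPy as follows. The abstract does not specify the matcher applied in the reduced space, so nearest-centroid is used here as an assumed stand-in; the feature layout (one window of accelerometer readings per row) is likewise an assumption.

```python
import numpy as np

def pca_fit(X, k=2):
    # X: (n_windows, n_features) rows of accelerometer readings.
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)
    W = vecs[:, np.argsort(vals)[::-1][:k]]   # top-k principal axes
    return mu, W

def pca_project(X, mu, W):
    # Center, then project onto the principal axes.
    return (X - mu) @ W

def nearest_class(z, centroids):
    # Nearest-centroid matching in the reduced space (stand-in matcher).
    return min(centroids, key=lambda name: np.linalg.norm(z - centroids[name]))
```

Training amounts to fitting the axes on all gesture windows and storing one centroid per gesture; recognition projects a new window and picks the closest centroid.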

Efficient Hand Mouse Interface using Feature Points with Hand Gestures (손 모양 특징점 정보를 이용한 핸드마우스 인터페이스 구현)

  • Kin, Ji-Hyun;Kim, Min-Ha;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2011.10a
    • /
    • pp.223-226
    • /
    • 2011
  • This paper implements a hand mouse that can replace a mouse by extracting the hand region from images captured with a web camera. First, the hand region is extracted from the input image. To do so, candidate hand regions are obtained using the Hue value of the HSV color model, which is robust to illumination changes, together with the YCbCr color space, in which skin-color characteristics are well represented. A labeling algorithm is then applied to the candidate regions to extract the exact hand region. After computing the center of gravity of the extracted hand region, the region is segmented using distances from the center of gravity, and the final feature points of the hand are extracted from the segmented region using this distance information. The proposed method implements a convenient hand mouse by performing mouse events with the fingertip information of the extracted hand shape.
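The centroid-and-distance feature extraction described above can be sketched as follows: the hand pixel farthest from the center of gravity is a simple fingertip candidate. This is a minimal sketch; the paper additionally segments the region by distance from the centroid before choosing the final feature points, which is omitted here.

```python
import numpy as np

def fingertip(mask):
    # Center of gravity of the binary hand region, then the hand pixel
    # farthest from it as the fingertip candidate.
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    i = ((ys - cy) ** 2 + (xs - cx) ** 2).argmax()
    return int(xs[i]), int(ys[i])          # (x, y) image coordinates
```

On a typical open-hand silhouette the extended finger dominates the distance measure, so its tip is selected even though the palm contributes most of the mass.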

Implementation of the Multi-Gestures Air-Space Mouse using the Diffused Illumination Method (확산 투광방식을 이용한 멀티-제스처 공간 마우스 구현)

  • Lee, Sung-Jae;Lee, Woo-Beom
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2009.11a
    • /
    • pp.353-354
    • /
    • 2009
  • Multi-touch technology, which allows multiple simultaneous user inputs, has recently become one of the most prominent technologies in the HCI (Human-Computer Interaction) field. However, such touch technologies depend on the input device and therefore suffer from spatial constraints. This paper implements an air-space mouse that is free of spatial constraints on user input by using the diffused illumination (DI) method employed for user input processing in multi-touch display devices. The proposed air-space mouse handles not only basic mouse events but also presents a performance-improvement scheme for extended user multi-gestures. When applied to a Windows application environment, the implemented air-space mouse showed successful results.

Airtouch technology smart fusion DID system design (Airtouch 기술을 활용한 스마트융합 DID 시스템 설계)

  • Lee, Gwang-Yong;Hwang, Bu-Hyun
    • Journal of Advanced Navigation Technology
    • /
    • v.17 no.2
    • /
    • pp.240-246
    • /
    • 2013
  • This study develops a new way of delivering information by integrating Airtouch technology into DID (digital information display) touch-screen systems. We design and implement a system that can be used to view college campus announcements, educational information, and employment information, with remote operation, content sharing, and cloud services for synchronizing content, resulting in a smart-fusion DID system. Because the Kinect connects to information appliances through a USB interface, Airtouch technology can be implemented as a low-cost product using the Kinect sensor. In the developed system, users interact with information appliances through hand gestures alone: the system tracks the user's hand movements to manipulate the mouse pointer and issues commands to the device through the user's hand gestures. The Airtouch-based smart-fusion DID system is expected to have a ripple effect on other industries, such as online education, advertising, and the information industry, and by replacing existing interface devices with this versatile technology, its usability can be extended widely.

Virtual Block Game Interface based on the Hand Gesture Recognition (손 제스처 인식에 기반한 Virtual Block 게임 인터페이스)

  • Yoon, Min-Ho;Kim, Yoon-Jae;Kim, Tae-Young
    • Journal of Korea Game Society
    • /
    • v.17 no.6
    • /
    • pp.113-120
    • /
    • 2017
  • With the development of virtual reality technology, user-friendly hand gesture interfaces have been increasingly studied in recent years for natural interaction with virtual 3D objects. Most earlier studies on hand-gesture interfaces use relatively simple hand gestures. In this paper, we suggest an intuitive hand gesture interface for interacting with 3D objects in virtual reality applications. For hand gesture recognition, we first preprocess the various hand data and classify them through a binary decision tree. The classified data are re-sampled and converted to a chain code, and then constructed into hand feature data using histograms of the chain code. Finally, the input gesture is recognized from the feature data by MCSVM-based machine learning. To test the proposed hand gesture interface, we implemented a 'Virtual Block' game. Our experiments showed a recognition rate of about 99.2% for 16 kinds of command gestures, and the interface proved more intuitive and user-friendly than a conventional mouse interface.
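The chain-code histogram feature can be sketched as follows; this shows only the feature construction, with the binary decision tree, re-sampling, and MCSVM stages omitted. The direction numbering is the standard 8-neighbour Freeman code, an assumption since the abstract does not give the exact convention.

```python
import numpy as np

# 8-neighbour Freeman chain code: 0 = step east, numbered counter-clockwise.
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(points):
    # points: ordered (x, y) contour samples one unit step apart,
    # e.g. the re-sampled hand contour described in the abstract.
    return [DIRS[(x2 - x1, y2 - y1)]
            for (x1, y1), (x2, y2) in zip(points, points[1:])]

def chain_histogram(codes):
    # Normalized 8-bin direction histogram: the fixed-length shape
    # feature fed to the classifier (an MCSVM in the paper).
    h = np.bincount(codes, minlength=8).astype(float)
    return h / h.sum()
```

The histogram discards where the contour starts but keeps its distribution of edge directions, giving a fixed-length vector regardless of contour length.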

User-centric Immersible and Interactive Electronic Book based on the Interface of Tabletop Display (테이블탑 디스플레이 기반 사용자 중심의 실감형 상호작용 전자책)

  • Song, Dae-Hyeon;Park, Jae-Wan;Lee, Chil-Woo
    • The Journal of the Korea Contents Association
    • /
    • v.9 no.6
    • /
    • pp.117-125
    • /
    • 2009
  • In this paper, we propose a user-centric immersive and interactive electronic book based on a tabletop display interface. Electronic books are used by readers who want a text book combined with multimedia content such as video, audio, and animation. Because the proposed e-book is based on a tabletop display platform, conventional input devices such as a keyboard and mouse are not needed. Users interact with the contents through hand-finger touch gestures defined for the tabletop display interface, which provides a superior and effective way to use the electronic book with interest. The interface supports multiple users, enabling more diverse effects than conventional electronic contents made for a single user. Our method offers a new approach to the conventional electronic book: user-centric gestures can be defined, and users are helped to interact with the book easily. We expect our method can be utilized for many edutainment contents.