• Title/Summary/Keyword: Hand Mouse (핸드마우스)


Efficient Hand Mouse Interface using Feature Points with Hand Gestures (손 모양 특징점 정보를 이용한 핸드마우스 인터페이스 구현)

  • Kim, Ji-Hyun;Kim, Min-Ha;Cha, Eui-Young
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2011.10a / pp.223-226 / 2011
  • This paper implements a hand mouse that can replace the physical mouse by extracting the hand region from images captured by a web camera. First, the hand region is extracted from the input image. To do so, candidate hand regions are obtained using the Hue value of the HSV color model, which is robust to illumination changes, and the YCbCr color space, in which skin color is well characterized. A labeling algorithm is then applied to the candidate regions to extract the exact hand region. After computing the center of mass of the extracted hand region, the hand region is segmented using distances from the center of mass, and the final feature points of the hand are extracted from this distance information. The proposed method implements a hand mouse that is convenient for users by performing mouse events with the extracted fingertip information.
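
The color-gating and center-of-mass steps described above can be sketched as follows. This is a minimal numpy sketch, not the paper's implementation: the threshold ranges and function names are illustrative assumptions.

```python
import numpy as np

def skin_mask(hue, cb, cr,
              hue_range=(0, 20), cb_range=(77, 127), cr_range=(133, 173)):
    """Combine a Hue gate (robust to illumination) with a Cb/Cr gate
    (where skin tones cluster) into a hand-candidate mask.
    Threshold values are illustrative assumptions, not from the paper."""
    m_hue = (hue >= hue_range[0]) & (hue <= hue_range[1])
    m_cbcr = ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
              (cr >= cr_range[0]) & (cr <= cr_range[1]))
    return m_hue & m_cbcr

def centroid(mask):
    """Center of mass of a binary mask; None if the mask is empty."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()
```

In a full pipeline, a labeling (connected-components) pass would keep only the largest candidate blob before computing the centroid and fingertip distances.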


Developing User-friendly Hand Mouse Interface via Gesture Recognition (손 동작 인식을 통한 사용자에게 편리한 핸드마우스 인터페이스 구현)

  • Kang, Sung-Won;Kim, Chul-Joong;Sohn, Won
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.11a / pp.129-132 / 2009
  • With the miniaturization of computers, the need for interfacing methods free of portability and space constraints is growing, and research on gesture-based control for human-computer interaction (HCI) is being actively conducted. Existing hand-gesture interface implementations required prior training in how to operate the computer. This paper proposes a simple interface method that requires no prior learning, using only the shape of the user's hand and fingertip information. A single webcam and Intel's open-source image processing library OpenCV were used. The hand region was tracked in real time and binarized through image processing based on frame differences and pixel values. A central moment was set so that its value does not change with finger movement, and it was used to move the mouse cursor relatively. Depending on the situation, the fingertip was used as an absolute coordinate so that motion connects naturally when the hand leaves the webcam's field of view. Finally, a more user-friendly hand mouse interface was implemented by performing mouse click events with the movement of the index finger alone.
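
The frame-difference tracking and relative cursor movement described above can be sketched roughly as follows; the threshold, gain, and function names are assumptions for illustration, not the paper's values.

```python
import numpy as np

def frame_diff_centroid(prev, curr, thresh=30):
    """Centroid of pixels that changed between two grayscale frames.
    Returns None when nothing moved (no pixel exceeds the threshold)."""
    moved = np.abs(curr.astype(int) - prev.astype(int)) > thresh
    ys, xs = np.nonzero(moved)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def relative_move(prev_center, curr_center, gain=2.0):
    """Relative cursor delta from the motion of the tracked centroid."""
    dx = (curr_center[0] - prev_center[0]) * gain
    dy = (curr_center[1] - prev_center[1]) * gain
    return dx, dy
```

Driving the cursor with deltas rather than absolute positions is what lets the hand re-enter the frame without the pointer jumping.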


Hand Interface using Intelligent Recognition for Control of Mouse Pointer (마우스 포인터 제어를 위해 지능형 인식을 이용한 핸드 인터페이스)

  • Park, Il-Cheol;Kim, Kyung-Hun;Kwon, Goo-Rak
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.5 / pp.1060-1065 / 2011
  • In this paper, the proposed method recognizes the hand using color information from the camera's input image and controls the mouse pointer with the recognized hand; specific commands are also designed to be performed with the mouse pointer. Most users have found existing interactive multimedia systems uncomfortable because they depend on particular external input devices such as pens and mice. The proposed method compensates for these shortcomings by using the hand alone, without external input devices. In the experimental method, the hand region and the background are separated using color information from the camera image, and the coordinates of the mouse pointer are determined from the center coordinates of the separated hand. When the mouse pointer is placed in a predefined area using these coordinates, the robot moves and executes the corresponding command. Experimental results show that the recognition of the proposed method is more accurate, but it remains sensitive to changes in the color of the light.
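
The pointer-from-centroid and predefined command-area logic described above can be sketched as two small helpers; the names, the linear scaling, and the rectangle representation are illustrative assumptions.

```python
def pointer_from_hand(center, frame_size, screen_size):
    """Scale the hand's center coordinates from camera-frame space
    to screen space with a simple linear mapping (an assumption;
    the paper does not specify the mapping)."""
    fw, fh = frame_size
    sw, sh = screen_size
    return center[0] * sw / fw, center[1] * sh / fh

def in_command_area(pointer, area):
    """True when the pointer lies inside a predefined command region
    given as (x, y, width, height); entering it would trigger the
    associated command (e.g. a robot action)."""
    x, y, w, h = area
    return x <= pointer[0] < x + w and y <= pointer[1] < y + h
```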

NUI/NUX framework based on intuitive hand motion (직관적인 핸드 모션에 기반한 NUI/NUX 프레임워크)

  • Lee, Gwanghyung;Shin, Dongkyoo;Shin, Dongil
    • Journal of Internet Computing and Services / v.15 no.3 / pp.11-19 / 2014
  • The natural user interface/experience (NUI/NUX) enables natural motion interfaces without devices or tools such as mice, keyboards, pens, and markers. Until now, typical motion recognition methods have used markers, receiving each marker's coordinates as relative data and storing the coordinate values in a database. However, recognizing motion accurately requires more markers, and attaching them and processing the data takes considerable time. Moreover, because NUI/NUX frameworks have been developed without the most important quality, intuitiveness, usability problems arise and users are forced to learn the conventions of many different frameworks. To address this problem, this paper uses no markers and implements the system so that anyone can operate it. We also designed a multi-modal NUI/NUX framework that controls voice, body motion, and facial expression simultaneously, and proposed a new mouse-operation algorithm that recognizes intuitive hand gestures and maps them onto the monitor. The resulting "hand mouse" is implemented so that users can operate it easily and intuitively.

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun;Ahn, So-young;Shin, Dong-il;Shin, Dong-kyoo
    • Journal of Internet Computing and Services / v.16 no.6 / pp.11-21 / 2015
  • As interest in human-computer interaction (HCI) grows, research on HCI has been actively conducted, along with research on the Natural User Interface/Natural User eXperience (NUI/NUX), which uses the user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture or voice recognition, but these algorithms are complex to implement and require much training time, since they must go through preprocessing, normalization, and feature extraction. Recently, Kinect, launched by Microsoft as an NUI/NUX development tool, has attracted attention, and studies using Kinect have been conducted. In a previous study, the authors implemented a hand-mouse interface with outstanding intuitiveness using the user's physical features; however, it suffered from unnatural mouse movement and low accuracy of the mouse functions. In this study, we designed and implemented a hand-mouse interface that introduces a new concept called the "virtual monitor", extracting the user's physical features through Kinect in real time. A virtual monitor is a virtual space that can be controlled by the hand mouse, and coordinates on the virtual monitor can be accurately mapped onto coordinates on the real monitor. The hand-mouse interface based on the virtual monitor concept maintains the outstanding intuitiveness of the previous study while enhancing the accuracy of the mouse functions. Further, we increased accuracy by recognizing the user's unnecessary actions using a concentration indicator derived from electroencephalogram (EEG) data. To evaluate intuitiveness and accuracy, we tested the interface on 50 people in their 10s to 50s. In the intuitiveness experiment, 84% of subjects learned how to use it within 1 minute; in the accuracy experiment, the mouse functions achieved accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). Having verified its intuitiveness and accuracy through these experiments, the proposed hand-mouse interface is expected to be a good example of an interface for controlling systems by hand in the future.
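
Under the virtual-monitor concept, a point in the user-specific virtual space must be mapped onto real screen pixels. A minimal sketch of such a linear mapping follows; the rectangle representation and clamping are assumptions (the paper derives the virtual monitor from the user's physical features via Kinect, not from a fixed rectangle).

```python
def map_virtual_to_real(pt, virt_rect, real_size):
    """Map a hand point inside the user's 'virtual monitor' rectangle
    (x, y, width, height) onto real screen pixels, clamping to the
    screen bounds. All parameter names are illustrative."""
    vx, vy, vw, vh = virt_rect
    rw, rh = real_size
    x = (pt[0] - vx) / vw * rw
    y = (pt[1] - vy) / vh * rh
    return min(max(x, 0), rw - 1), min(max(y, 0), rh - 1)
```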

Development of Hand-Tracking Interaction System (핸드 트랙킹 인터랙션 시스템 개발)

  • Park, Seong-Su;Gue, Ja-Young;Hong, Jin-Ju;Rho, Young J.;Seo, Dae-Young
    • Proceedings of the Korea Information Processing Society Conference / 2010.11a / pp.824-826 / 2010
  • As computer use increases, the use of computer input devices is naturally growing as well. The Hand-Tracking Interaction System covered in this paper is a next-generation input device that recognizes a person's hand through a camera and assigns functions to hand motions. In this paper, the system was developed to overcome the spatial constraints of current input devices such as the mouse and keyboard. To address the weakness that light and shadows easily lower the hand recognition rate, an infrared camera was used instead of a webcam to improve recognition. A new application that makes efficient use of the Hand-Tracking Interaction System was also developed.

3D Pointing for Effective Hand Mouse in Depth Image (깊이영상에서 효율적인 핸드 마우스를 위한 3D 포인팅)

  • Joo, Sung-Il;Weon, Sun-Hee;Choi, Hyung-Il
    • Journal of the Korea Society of Computer and Information / v.19 no.8 / pp.35-44 / 2014
  • This paper proposes a 3D pointing interface designed for the efficient operation of a hand mouse. The proposed method uses depth images to secure high-quality results even under changes in lighting and environmental conditions, and performs 3D pointing with the normal vector of the palm. First, the hand region is detected and tracked using a conventional method; from this information, the palm region is predicted and the region of interest is obtained. The region of interest is then approximated by a plane equation and the normal vector is extracted. Next, to ensure stable control, interpolation is performed using the extracted normal vector and the intersection point is detected. For stability and efficiency, a dynamic weight based on the sigmoid function is applied to the detected intersection point, which is finally converted into the 2D coordinate system. This paper explains the methods of detecting the region of interest and the direction vector, proposes interpolation and dynamic weighting to stabilize control, and verifies the stability of the proposed 3D pointing method through qualitative and quantitative analyses.
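
The plane-approximation step can be sketched with an ordinary least-squares fit, and the sigmoid weight in its generic form; the gain `k` and midpoint `x0` are assumed parameters, not values from the paper.

```python
import numpy as np

def palm_normal(points):
    """Fit z = a*x + b*y + c to palm points (N x 3 array) by least
    squares; the plane's unit normal is (a, b, -1) normalized."""
    A = np.c_[points[:, 0], points[:, 1], np.ones(len(points))]
    coef, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    n = np.array([coef[0], coef[1], -1.0])
    return n / np.linalg.norm(n)

def dynamic_weight(speed, k=1.0, x0=5.0):
    """Sigmoid weight on pointer updates: slow motions (jitter) are
    damped, fast deliberate motions pass through almost unchanged."""
    return 1.0 / (1.0 + np.exp(-k * (speed - x0)))
```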

The input device system with hand motion using hand tracking technique of CamShift algorithm (CamShift 알고리즘의 Hand Tracking 기법을 응용한 Hand Motion 입력 장치 시스템)

  • Jeon, Yu-Na;Kim, Soo-Ji;Lee, Chang-Hoon;Kim, Hyeong-Ryul;Lee, Sung-Koo
    • Journal of Digital Contents Society / v.16 no.1 / pp.157-164 / 2015
  • Existing input devices are limited to the keyboard and mouse, but new types of input devices have recently been developed in response to user demand. Reflecting this trend, we propose a new type of input device that issues commands by analyzing the hand motion in an image, without any special hardware. After binarizing the skin-color area and tracking it with the CamShift method, the system recognizes hand motion by classifying into the four cardinal directions, and counting, the finger regions separated through labeling and their angles from the palm's center point. Without a controlled background and without gloves, the recognition rate remained at approximately 75 percent; with a controlled background and red gloves, it rose to 90.2 percent owing to reduced noise.
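
The core of CamShift is a mean-shift iteration that moves a search window to the centroid of the probability (here, binarized skin) mass inside it. This sketch shows only that mean-shift core under assumed parameters; full CamShift additionally adapts the window's size and orientation each frame.

```python
import numpy as np

def mean_shift(prob, window, n_iter=10):
    """Shift the (x, y, w, h) window to the centroid of the probability
    mass it contains, repeating until convergence."""
    x, y, w, h = window
    H, W = prob.shape
    for _ in range(n_iter):
        x0, x1 = max(x, 0), min(x + w, W)
        y0, y1 = max(y, 0), min(y + h, H)
        roi = prob[y0:y1, x0:x1]
        total = roi.sum()
        if total == 0:          # lost the target: keep the window put
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        cx = int((xs * roi).sum() / total)
        cy = int((ys * roi).sum() / total)
        nx, ny = cx - w // 2, cy - h // 2
        if (nx, ny) == (x, y):  # converged
            break
        x, y = nx, ny
    return x, y, w, h
```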

Survey of Staphylococcus epidermidis Contamination on the Hands of Dental Hygienists and Equipment Surface of Dental Clinics (치과의료기관 의료장비 표면 및 치과위생사 손의 Staphylococcus epidermidis 오염도 조사)

  • Kim, Seol-Hee
    • Journal of dental hygiene science / v.17 no.6 / pp.472-480 / 2017
  • The purpose of this study was to investigate Staphylococcus epidermidis contamination on the hands of 20 dental hygienists and on 140 equipment surfaces of 20 dental clinics in a local area, from July to August 2017. The degree of S. epidermidis contamination was measured using a hand plate and a Rodac plate, cultured at 35°C for 24 hours. Based on hand-plate criteria, hand contamination was classified into low, middle, and high groups, and the contamination levels of hand parts and of clinic equipment surfaces were summarized with descriptive statistics and analysis of variance (ANOVA) after colony counting. S. epidermidis contamination was moderate on 55% of the dental hygienists' hands. By area, contamination was 29.45 colony-forming units (CFU) on the palm, followed by the middle finger (7.8 CFU), ring finger (6.4 CFU), and thumb (6 CFU). Among equipment surfaces, the 3-way handle (4.45 CFU), computer mouse (3.37 CFU), and mirror handle (1.60 CFU) were more contaminated than other areas. The group with high hand contamination showed a high positive correlation with S. epidermidis contamination of the hand, and contamination was higher on hands than on equipment surfaces. Therefore, medical staff should recognize the importance of hand hygiene and practice it in the manner suggested by the World Health Organization. In addition, medical teams need to take responsibility for infection control tasks, implement infection management guidelines, and provide systematic education on infectious disease management.

Upper Limb Motion Detection Including Fingers Using Flex Sensors and Inertial Sensors (휘어짐센서와 관성센서를 이용한 손가락을 포함한 상지 운동 검출)

  • Kim, Yeon-Jun;Yoo, Jae-Ha;Kim, Dong-Yon;Kim, Soo-Chan
    • Journal of the Institute of Convergence Signal Processing / v.21 no.3 / pp.101-106 / 2020
  • The use of virtual reality is increasing not only in games but also in medical care such as rehabilitation. For convenience, upper-limb motion is usually detected with non-contact methods using video, or with handheld devices such as a mouse. In this paper, we implemented a glove that can measure finger and upper-limb movements using flex sensors, whose resistance changes with the degree of bending, and inertial sensors, which provide orientation information in space. We displayed upper-limb movements, including finger movements, from the glove's signals on the open software platform Processing. The sensitivity of each finger movement was 0.5°, and the sensitivity of the upper-limb motion was 0.6°.
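
Reading a flex sensor typically means mapping its resistance-dependent ADC value to a bend angle. A minimal linear-calibration sketch follows; the calibration endpoints and names are assumptions (a real glove needs per-finger calibration), not values from the paper.

```python
def flex_angle(adc, adc_straight=200, adc_bent=800, max_angle=90.0):
    """Linearly map a flex sensor's ADC reading to a finger bend angle
    in degrees, clamped to [0, max_angle]. The calibration endpoints
    adc_straight/adc_bent are assumed, illustrative values."""
    t = (adc - adc_straight) / (adc_bent - adc_straight)
    return max(0.0, min(max_angle, t * max_angle))
```

In practice the two endpoints are captured once per finger (hand flat, then fist) before use, and the inertial sensor supplies the arm's orientation separately.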