• Title/Summary/Keyword: Camera mouse

Implementation of Mouse Function Using Web Camera and Hand (웹 카메라와 손을 이용한 마우스 기능의 구현)

  • Kim, Seong-Hoon;Woo, Young-Woon;Lee, Kwang-Eui
    • Journal of the Korea Society of Computer and Information / v.15 no.5 / pp.33-38 / 2010
  • In this paper, we propose an algorithm implementing mouse functions using hand motion and the number of fingers extracted from an image sequence. The sequence is acquired through a web camera and processed with image processing algorithms. It is first converted from the RGB color model to the YCbCr color model to efficiently extract the skin area, and the extracted area is further processed with labeling, opening, and closing operations to determine the center of the hand. Based on the center position, the number of fingers is determined, which serves as the information used to select and perform a mouse function. Experimental results show that 94.0% of pointer moves and 96.0% of finger extractions are successful, which opens the possibility of further development into a commercial product.
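A minimal sketch of the skin-extraction pipeline described in this abstract, assuming an OpenCV/NumPy environment; the YCbCr thresholds and kernel size are illustrative guesses, not the paper's actual values.

```python
import cv2
import numpy as np

def hand_center(frame_bgr):
    """Return the centroid of the largest skin-colored region, or None."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    # Threshold the chrominance channels to isolate skin-like pixels
    # (Cr in [133, 173], Cb in [77, 127] are common illustrative bounds).
    skin = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))
    # Opening removes speckle noise; closing fills small holes.
    kernel = np.ones((5, 5), np.uint8)
    skin = cv2.morphologyEx(skin, cv2.MORPH_OPEN, kernel)
    skin = cv2.morphologyEx(skin, cv2.MORPH_CLOSE, kernel)
    # Labeling: keep the largest connected component as the hand.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(skin)
    if n < 2:
        return None
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    return tuple(centroids[largest])  # (x, y) hand center in image coordinates
```

Finger counting and the mapping from finger count to a concrete mouse function would follow from this center position, as the abstract describes.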

An Implementation of Presentation System using Image Processing (영상처리를 이용한 프리젠테이션 시스템의 구현)

  • 이후성;양훈기
    • Proceedings of the IEEK Conference / 2000.06d / pp.155-158 / 2000
  • In this paper, we propose a Windows-based presentation system using a laser-pointer mouse. The major characteristic of this system is that it synchronizes the laser pointing position with the PC cursor, so that the laser functions not only as a pointer but also as a PC mouse. We use a special calibration pattern to align the coordinates of the camera-captured image with those of the PC window. We finally show its feasibility through experiments with the implemented system.
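One common way to align camera and screen coordinates, as this abstract describes, is a homography estimated from a calibration pattern. The sketch below assumes the four pattern corners have already been detected in the camera image; the corner values and screen resolution are placeholders, not the paper's data.

```python
import cv2
import numpy as np

# Corners of the projected pattern as seen by the camera (placeholder pixels).
camera_pts = np.float32([[102, 87], [598, 95], [610, 442], [95, 430]])
# The same corners in screen coordinates (e.g., a 1024x768 desktop).
screen_pts = np.float32([[0, 0], [1023, 0], [1023, 767], [0, 767]])

# Homography mapping camera coordinates to screen (cursor) coordinates.
H, _ = cv2.findHomography(camera_pts, screen_pts)

def laser_to_cursor(laser_xy):
    """Map a detected laser spot (camera pixels) to a screen cursor position."""
    p = np.float32([[laser_xy]])  # shape (1, 1, 2) for perspectiveTransform
    return cv2.perspectiveTransform(p, H)[0, 0]

print(laser_to_cursor((350, 260)))
```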

Human Head Mouse System Based on Facial Gesture Recognition

  • Wei, Li;Lee, Eung-Joo
    • Journal of Korea Multimedia Society / v.10 no.12 / pp.1591-1600 / 2007
  • Camera position information derived from a 2D face image is very important for synchronizing a virtual 3D face model with the real face at the current viewpoint, and it is also important for other uses such as human-computer interfaces (e.g., a face mouse) and automatic camera control. We present an algorithm that detects the human face region and the mouth based on the characteristic colors of the face and mouth in the YCbCr color space. The algorithm constructs a mouth feature image from the Cb and Cr values and uses a pattern-matching method to detect the mouth position. We then use the geometric relationship between the mouth position and the face side boundaries to determine the camera position. Experimental results demonstrate the validity of the proposed algorithm, and the correct determination rate is high enough for practical application.
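A hedged sketch of building a mouth feature image from the Cb and Cr channels, in the spirit of common Cr/Cb-based mouth-map formulations; the paper's exact feature image and pattern-matching step may differ.

```python
import cv2
import numpy as np

def mouth_map(face_bgr):
    """Return an 8-bit map that emphasizes mouth-like pixels."""
    ycrcb = cv2.cvtColor(face_bgr, cv2.COLOR_BGR2YCrCb).astype(np.float64)
    cr, cb = ycrcb[:, :, 1], ycrcb[:, :, 2]
    cr2 = (cr / 255.0) ** 2
    ratio = cr / (cb + 1e-6)
    # Balance term so the two features contribute on a comparable scale.
    eta = 0.95 * cr2.mean() / ratio.mean()
    # Mouth pixels tend to have high Cr^2 but a relatively low Cr/Cb ratio.
    m = cr2 * (cr2 - eta * ratio) ** 2
    return cv2.normalize(m, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
```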

Bubble Popping Augmented Reality System Using a Vibro-Tactile Haptic Mouse (진동촉각 햅틱 마우스 기반 버블포핑 증강현실 시스템)

  • Jung, Da-Un;Lee, Woo-Keun;Jang, Seong-Eun;Kim, Man-Bae
    • Journal of Broadcast Engineering / v.15 no.6 / pp.715-722 / 2010
  • As one application of augmented reality, this paper presents a bubble-popping system utilizing a vibro-tactile haptic mouse. In this system, virtual bubbles float randomly in 3D space. Using the vibro-tactile mouse held by the user, bubbles are popped when the mouse touches them in the 3D space; a bubble-popping effect with additional mouse vibration is then delivered to the user's hand through the mouse. The proposed system is developed in the ARToolkit environment, so basic components such as a camera and a marker pattern are required. The system is composed of a vibro-haptic mouse, a webcam, a marker pattern, a graphic bubble object, and a graphic mouse. Mouse vibration as well as a bubble fade-out effect is delivered, so the combination of visual and tactile bubble-popping effects outperforms a single effect in the augmented-reality experience.
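A hedged sketch of the bubble-popping interaction loop: when the tracked 3D position of the haptic mouse touches a bubble, the bubble is marked for fade-out and a vibration command is issued. The ARToolkit tracking, the send_vibration() call, and the bubble data below are placeholders, not the authors' actual API.

```python
import math
import random

bubbles = [{"pos": [random.uniform(-1, 1) for _ in range(3)], "alive": True}
           for _ in range(10)]

def send_vibration(strength):
    # Placeholder for the vibro-tactile mouse driver call.
    print(f"vibrate({strength})")

def update(mouse_pos, touch_radius=0.1):
    """Pop any bubble whose center lies within touch_radius of the mouse."""
    for b in bubbles:
        if not b["alive"]:
            continue
        if math.dist(mouse_pos, b["pos"]) < touch_radius:
            b["alive"] = False          # renderer starts the fade-out effect
            send_vibration(strength=1.0)

update([0.0, 0.0, 0.0])
```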

A Mouse Control Method Using Hand Movement Recognition (손동작 인식을 이용한 마우스제어기법)

  • Kim, Jung-In
    • Journal of Korea Multimedia Society / v.15 no.11 / pp.1377-1383 / 2012
  • This paper proposes a human mouse system that replaces mouse input with human hand movement. As monitor resolutions increase, it becomes difficult, due to the resolution difference between web cameras and monitors, to place the cursor over the entire range of the monitor by simply moving a pointer that follows the hand position seen by the web camera. We therefore propose an effective method of placing the mouse position, without repeated returning hand movements, at the corners of the monitor where the user wants it. We also propose a method of recognizing finger movements using the thumb and index finger. Our measurements show a successful recognition rate of 97%, which corroborates the effectiveness of the method.
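A minimal sketch of one way to bridge the camera-to-monitor resolution gap mentioned above: a reduced active region of the camera frame is scaled to the whole screen so the corners can be reached without large hand movements. The margins and resolutions are assumptions, not the paper's parameters.

```python
CAM_W, CAM_H = 640, 480        # webcam resolution (assumed)
SCREEN_W, SCREEN_H = 1920, 1080  # monitor resolution (assumed)
MARGIN = 80                    # camera pixels ignored on each side

def hand_to_cursor(hx, hy):
    """Map a hand position (camera pixels) to a screen cursor position."""
    # Normalize within the reduced active region, then clamp to [0, 1].
    u = (hx - MARGIN) / (CAM_W - 2 * MARGIN)
    v = (hy - MARGIN) / (CAM_H - 2 * MARGIN)
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return int(u * (SCREEN_W - 1)), int(v * (SCREEN_H - 1))

print(hand_to_cursor(90, 90))    # near the top-left corner of the screen
print(hand_to_cursor(560, 400))  # near the bottom-right corner
```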

Conceptual Fuzzy Sets for Picture Reference System with Visual User Interface and Command Recognition System without Keyboard and Mouse

  • Saito, Maiji;Yamaguchi, Toru
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2003.09a / pp.138-141 / 2003
  • This paper proposes conceptual fuzzy sets for a picture reference system with a visual user interface and a command recognition system that works without a keyboard or mouse. The picture reference system consists of an associative picture database, the visual user interface, and the command recognition system. The associative picture database searches pictures by using conceptual fuzzy sets. To display pictures attractively, the visual user interface provides several visual-effect functions. The command recognition unit, which needs no keyboard or mouse, captures the user's hand with a camera and passes it to the system as a command. We implement and evaluate the picture reference system.
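A hedged illustration of associative picture search with fuzzy concept activations: a query concept activates related concepts with membership grades, and pictures are scored by their best-matching activated concept. The concept network and picture tags below are invented for illustration and are not the paper's conceptual fuzzy sets.

```python
concept_net = {
    "sea": {"sea": 1.0, "beach": 0.8, "boat": 0.6, "sky": 0.4},
    "mountain": {"mountain": 1.0, "forest": 0.7, "sky": 0.5},
}
pictures = {
    "img001.jpg": {"beach", "sky"},
    "img002.jpg": {"forest"},
    "img003.jpg": {"boat", "sea"},
}

def search(query):
    """Rank pictures by the strongest fuzzy activation of their tags."""
    activation = concept_net.get(query, {query: 1.0})
    scored = []
    for name, tags in pictures.items():
        score = max((activation.get(t, 0.0) for t in tags), default=0.0)
        if score > 0:
            scored.append((score, name))
    return sorted(scored, reverse=True)

print(search("sea"))  # [(1.0, 'img003.jpg'), (0.8, 'img001.jpg')]
```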

Implementation of eye-controlled mouse by real-time tracking of the three dimensional eye-gazing point (3차원 시선 추적에 의한 시각 제어 마우스 구현 연구)

  • Kim Jae-Han
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2006.05a / pp.209-212 / 2006
  • This paper presents the design and implementation of an eye-controlled mouse using real-time tracking of the three-dimensional gazing point. The proposed method is based on three-dimensional processing of eye images in 3D world coordinates. The system hardware consists of two conventional CCD cameras for acquiring a stereoscopic image and a computer for processing. The advantages of the proposed algorithm and test results are also described.
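A minimal sketch of recovering a 3D point from two calibrated cameras by triangulating corresponding eye-feature detections, one standard building block for the stereoscopic setup described above; the projection matrices and image points are placeholder values, not the paper's calibration.

```python
import cv2
import numpy as np

# 3x4 projection matrices of the left and right cameras (placeholders).
P_left = np.hstack([np.eye(3), np.zeros((3, 1))]).astype(np.float64)
P_right = np.hstack([np.eye(3), np.array([[-0.06], [0.0], [0.0]])]).astype(np.float64)

# Corresponding eye-feature image coordinates in each camera, shape (2, N).
pts_left = np.array([[320.0], [240.0]])
pts_right = np.array([[300.0], [240.0]])

# Triangulate to homogeneous 3D coordinates, then dehomogenize.
X_h = cv2.triangulatePoints(P_left, P_right, pts_left, pts_right)
X = (X_h[:3] / X_h[3]).ravel()
print("estimated 3D point:", X)
```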

A Development of the Next-generation Interface System Based on the Finger Gesture Recognizing in Use of Image Process Techniques (영상처리를 이용한 지화인식 기반의 차세대 인터페이스 시스템 개발)

  • Kim, Nam-Ho
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.4 / pp.935-942 / 2011
  • This study designs and implements a finger-gesture recognition system that automatically recognizes finger gestures input through a camera and controls the computer. Common CCD cameras were redesigned as infrared cameras to acquire the images. The recorded images are pre-processed to find hand features, the finger gestures are read accordingly, and an event is generated for subsequent mouse control and presentation, suggesting a way to control computers. The finger-gesture recognition system presented in this study has been verified as a next-generation interface that can replace the mouse and keyboard for future information devices.
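A hedged sketch of the event-generation step: a recognized gesture label is dispatched to a mouse or presentation action. The gesture labels and the use of pyautogui for the actual control are assumptions for illustration, not the paper's implementation.

```python
import pyautogui

GESTURE_ACTIONS = {
    "one_finger": lambda x, y: pyautogui.moveTo(x, y),      # move the cursor
    "two_fingers": lambda x, y: pyautogui.click(x, y),      # left click
    "five_fingers": lambda x, y: pyautogui.press("right"),  # next slide
}

def dispatch(gesture, x, y):
    """Fire the action associated with a recognized finger gesture."""
    action = GESTURE_ACTIONS.get(gesture)
    if action is not None:
        action(x, y)

dispatch("one_finger", 400, 300)
```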

Improvement of Smartphone Interface Using AR Marker (AR 마커를 이용한 스마트폰 인터페이스의 개선)

  • Kang, Yun-A;Han, Soon-Hung
    • Korean Journal of Computational Design and Engineering / v.16 no.5 / pp.361-369 / 2011
  • As smartphones have recently come into wide use, they have become increasingly popular not only among young people but among middle-aged people as well. Most smartphones use a capacitive full touch screen, so touch commands are made with fingers, unlike the PDAs of the past that used touch pens. In this case, a significant portion of the smartphone's screen is blocked by the finger, so the area around the touching finger cannot be seen, and precise control of small targets such as a qwerty keyboard is difficult. To solve this problem, this research proposes a method of using simple AR markers to improve the smartphone interface. A sticker-type marker is attached to the fingernail and placed in front of the smartphone camera. The camera image of the marker is then analyzed to determine the marker's orientation: depending on the marker's rotation angle, the input is interpreted as onRelease() or onPress() of the mouse, and the marker's position is used as the mouse cursor position. This method enables the click, double-click, and drag-and-drop used on PCs as well as the touch, slide, and long-touch inputs used on smartphones. Through this research, smartphone input can be made more precise and simple, showing the possibility of a new concept of smartphone interface.
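A minimal sketch of the press/release decision described above: the rotation angle of the fingernail marker, estimated from the camera image, selects between onPress() and onRelease(), while the marker position drives the cursor. The 30-degree threshold and method names are assumptions for illustration.

```python
PRESS_ANGLE_DEG = 30.0  # assumed tilt threshold for a "press"

class MarkerMouse:
    def __init__(self):
        self.pressed = False

    def on_press(self, x, y):
        print(f"onPress at ({x}, {y})")

    def on_release(self, x, y):
        print(f"onRelease at ({x}, {y})")

    def update(self, marker_x, marker_y, marker_angle_deg):
        """Use the marker position as the cursor and its tilt as the button state."""
        if marker_angle_deg >= PRESS_ANGLE_DEG and not self.pressed:
            self.pressed = True
            self.on_press(marker_x, marker_y)
        elif marker_angle_deg < PRESS_ANGLE_DEG and self.pressed:
            self.pressed = False
            self.on_release(marker_x, marker_y)

m = MarkerMouse()
m.update(100, 200, 45.0)  # tilting the fingertip triggers onPress
m.update(120, 210, 10.0)  # returning it flat triggers onRelease
```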

Real Time System Realization for Binocular Eyeball Tracking Mouse (실시간 쌍안구 추적 마우스 시스템 구현에 관한 연구)

  • Ryu Kwang-Ryol;Choi Duck-Hyun
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.9 / pp.1671-1678 / 2006
  • A real-time binocular eyeball-tracking mouse system, operating at a distance of 30-40 cm from the computer monitor, is presented in this paper. To locate the eyeballs and track the cursor, a facial image is acquired by a small CCD camera and converted into a binary image; the two eyes are found using a five-region mask method around the eye area, and each iris is located using a four-point diagonal positioning method on the eye sides. The cursor is tracked by measuring the movement of the iris center. Cursor control is achieved by comparing the maximum iris movement with the corresponding cursor movement to calculate the cursor displacement from the gazing position on the screen. The experimental results show that the binocular eyeball mouse system is simple and fast enough for real-time operation.
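A minimal sketch of the cursor-control step: the ratio between the maximum iris excursion and the screen size converts a measured iris displacement into a cursor position. All numeric values are placeholders, not the paper's calibration data.

```python
MAX_IRIS_SHIFT_PX = 12.0          # maximum iris movement seen during calibration (assumed)
SCREEN_W, SCREEN_H = 1280, 1024   # monitor resolution (assumed)

def iris_to_cursor(dx_iris, dy_iris, center=(SCREEN_W // 2, SCREEN_H // 2)):
    """Scale an iris displacement (camera pixels) to a cursor position."""
    gain_x = (SCREEN_W / 2) / MAX_IRIS_SHIFT_PX
    gain_y = (SCREEN_H / 2) / MAX_IRIS_SHIFT_PX
    x = int(center[0] + dx_iris * gain_x)
    y = int(center[1] + dy_iris * gain_y)
    # Clamp so extreme gaze shifts stay on the monitor.
    return min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1)

print(iris_to_cursor(6.0, -3.0))
```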