• Title/Summary/Keyword: Mouse tracking


Improving Eye-gaze Mouse System Using Mouth Open Detection and Pop Up Menu (입 벌림 인식과 팝업 메뉴를 이용한 시선추적 마우스 시스템 성능 개선)

  • Byeon, Ju Yeong;Jung, Keechul
    • Journal of Korea Multimedia Society / v.23 no.12 / pp.1454-1463 / 2020
  • An important factor in an eye-tracking PC interface for paralyzed patients is the implementation of the mouse interface for manipulating the GUI. With a successfully implemented mouse interface, users can generate mouse events exactly at the point of their choosing. However, it is difficult to define this interaction in an eye-tracking interface. This problem, known as the Midas touch problem, has been a major focus of eye-tracking research. There have been many attempts to solve it using blinking, voice input, etc., but these are not suitable for all paralyzed patients because some cannot wink or speak. In this paper, we propose a mouth-pop-up eye-tracking mouse interface that solves the Midas touch problem and is suitable for paralyzed patients in general, using a common RGB camera. The interface detects the opening and closing of the mouth to activate a pop-up menu from which the user can select a mouse event. After implementation, a performance experiment was conducted; both the number of malfunctions and the time to perform tasks were reduced compared to the existing method.
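The mouth-open trigger described above can be approximated with a mouth-aspect-ratio test on facial landmarks. This is an illustrative sketch only: the landmark points, threshold value, and function names are assumptions, not the paper's implementation.

```python
def mouth_aspect_ratio(top, bottom, left, right):
    """Ratio of vertical mouth opening to mouth width.

    Each argument is a hypothetical (x, y) landmark coordinate."""
    vertical = ((top[0] - bottom[0]) ** 2 + (top[1] - bottom[1]) ** 2) ** 0.5
    horizontal = ((left[0] - right[0]) ** 2 + (left[1] - right[1]) ** 2) ** 0.5
    return vertical / horizontal

MAR_THRESHOLD = 0.6  # assumed value; in practice tuned per user

def is_mouth_open(top, bottom, left, right, threshold=MAR_THRESHOLD):
    """True when the mouth-aspect ratio exceeds the open threshold."""
    return mouth_aspect_ratio(top, bottom, left, right) > threshold
```

In a full system, crossing the threshold for a few consecutive frames (rather than a single frame) would trigger the pop-up menu, which reduces spurious activations.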

Real Time System Realization for Binocular Eyeball Tracking Mouse (실시간 쌍안구 추적 마우스 시스템 구현에 관한 연구)

  • Ryu Kwang-Ryol;Choi Duck-Hyun
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.9 / pp.1671-1678 / 2006
  • A real-time binocular eyeball-tracking mouse system for a computer monitor 30-40 cm away is presented in this paper. To locate the eyes and track the cursor, a facial image is acquired by a small CCD camera and converted into a binary image; the two eyes are found using a five-region mask method around the eye area, and each iris is located by a four-point diagonal positioning method on its sides. Cursor tracking is matched by measuring the central moving position of the iris. Cursor control is achieved by comparing the two related distances between the maximum iris movement and the cursor movement, calculating the moving distance from the gazing position to the screen. The experimental results show that the binocular eyeball mouse system is simple and fast enough to run in real time.
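The distance-ratio cursor control described above can be sketched as a linear mapping from iris displacement to screen coordinates. The function name, calibration values, and clamping are illustrative assumptions, not the paper's exact formulation.

```python
def iris_to_cursor(iris_pos, iris_center, iris_max_offset, screen_size):
    """Map iris displacement from its rest position to a cursor position.

    iris_max_offset is the calibrated maximum iris travel (px) in x and y;
    the scale is the ratio of half the screen size to that travel."""
    sx = (screen_size[0] / 2) / iris_max_offset[0]
    sy = (screen_size[1] / 2) / iris_max_offset[1]
    dx = iris_pos[0] - iris_center[0]
    dy = iris_pos[1] - iris_center[1]
    cx = screen_size[0] / 2 + dx * sx
    cy = screen_size[1] / 2 + dy * sy
    # keep the cursor inside the screen
    cx = min(max(cx, 0), screen_size[0] - 1)
    cy = min(max(cy, 0), screen_size[1] - 1)
    return cx, cy
```

A short calibration phase (look at the screen center, then at each edge) would supply `iris_center` and `iris_max_offset`.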

Usability Test of Interface for Web Widget Using Work-Flow based on Mouse Tracking (마우스 트래킹 기반 작업흐름도를 이용한 웹 Widget 인터페이스 사용성 평가)

  • Han, Mi-Ran;Park, Peom
    • Journal of the Ergonomics Society of Korea / v.29 no.5 / pp.763-770 / 2010
  • The use of web widgets on desktop and mobile devices has been increasing rapidly. Web widgets provide access to activities and information from various sources across the web. As the number of supported widgets increases, managing widgets and finding relevant or interesting ones becomes more complex. In addition, interacting with widgets in web-service systems can be difficult, especially for novice users. Up to this point, there has been little research on web-widget usability. This paper presents an experimental study of the user interfaces of web widgets based on mouse tracking and work-flow analysis. In the experiment, four sites providing widget services were chosen: iGoogle, Netvibes, My Yahoo!, and Wizard. The participants performed three assigned tasks on the chosen sites, and their mouse operations were recorded using Camtasia, a screen-recording tool. Mouse-tracking analysis was performed on the recorded data to identify common user behaviors. In addition, work-flow diagrams representing the operational flows needed to carry out the given tasks on each site were constructed to visually and systematically analyze detailed usage patterns. The results presented in this paper can contribute to developing guidelines for highly usable and accessible web-widget interface design.

Implementation of eye-controlled mouse by real-time tracking of the three dimensional eye-gazing point (3차원 시선 추적에 의한 시각 제어 마우스 구현 연구)

  • Kim Jae-Han
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2006.05a / pp.209-212 / 2006
  • This paper presents the design and implementation of an eye-controlled mouse using real-time tracking of the three-dimensional gazing point. The proposed method is based on three-dimensional processing of eye images in 3D world coordinates. The system hardware consists of two conventional CCD cameras for stereoscopic image acquisition and a computer for processing. The advantages of the proposed algorithm and test results are also described.
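Recovering a 3D point from two cameras ultimately rests on stereo triangulation. As a hedged illustration (the abstract does not state the paper's exact geometry; rectified cameras with known focal length and baseline are assumptions here), depth follows from horizontal disparity:

```python
def stereo_depth(focal_px, baseline_cm, x_left, x_right):
    """Depth (cm) of a point seen by two rectified cameras.

    Standard pinhole stereo relation: Z = f * B / d, where d is the
    horizontal disparity between the left and right image coordinates."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_cm / disparity
```

Applying this to the matched pupil centers in both camera images would yield the 3D eye position; the gaze ray is then intersected with the monitor plane.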


Welfare Interface using Multiple Facial Features Tracking (다중 얼굴 특징 추적을 이용한 복지형 인터페이스)

  • Ju, Jin-Sun;Shin, Yun-Hee;Kim, Eun-Yi
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.1 / pp.75-83 / 2008
  • We propose a welfare interface using multiple facial-feature tracking, which can efficiently implement various mouse operations. The proposed system consists of five modules: face detection, eye detection, mouth detection, facial-feature tracking, and mouse control. The facial region is first obtained using a skin-color model and connected-component analysis (CCA). Thereafter the eye regions are localized using a neural network (NN)-based texture classifier that discriminates the facial region into eye and non-eye classes, and the mouth region is localized using an edge detector. Once the eye and mouth regions are localized, they are continuously and accurately tracked by the mean-shift algorithm and template matching, respectively. Based on the tracking results, mouse operations such as movement and click are implemented. To assess the validity of the proposed system, it was applied to a web-browser interface and tested on a group of 25 users. The results show that our system has an accuracy of 99% and processes more than 21 frames/sec on a PC for 320×240 input images, providing user-friendly and convenient access to a computer in real time.
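The first stage of the pipeline above, skin-color segmentation, is often implemented with an explicit per-pixel color rule. The rule below (Peer et al.'s RGB conditions) is a well-known stand-in, not necessarily the skin-color model this paper used:

```python
def is_skin_rgb(r, g, b):
    """Explicit RGB skin-color rule (Peer et al.) for daylight conditions.

    A pixel is classified as skin when it is bright enough, not gray,
    and red-dominant. Connected-component analysis would then group
    the skin pixels into candidate face regions."""
    return (r > 95 and g > 40 and b > 20
            and max(r, g, b) - min(r, g, b) > 15
            and abs(r - g) > 15 and r > g and r > b)
```

Applying this to every pixel yields a binary mask whose largest connected component is taken as the facial region.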

Extraction of user interest area using foreground image separation and mouse tracking program (전경 이미지 분리와 마우스 트랙킹 프로그램을 이용한 사용자 관심 영역 유도)

  • Lee, MyounJae
    • Journal of Korea Game Society / v.17 no.5 / pp.113-122 / 2017
  • The location of the objects that make up a game can be an element of immersion for players. If objects repeatedly appear at the same position, the fun may be reduced; as play time elapses, players find the game more fun when objects appear over a larger area than at the beginning of play. This paper studies the locations of objects over time and how players controlled these objects. First, foreground images are extracted and accumulated using the OpenCV library. The accumulated result is displayed as a heat-map image. Second, the mouse-movement area is detected using a mouse-tracking program and compared with the heat-map image, so that the screen area in which the player is interested can be identified.
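The foreground-accumulation heat map described above can be sketched with simple frame differencing. This stdlib-only version (lists of lists instead of OpenCV images, and an assumed difference threshold) only illustrates the idea:

```python
def accumulate_foreground(frames, threshold=30):
    """Accumulate per-pixel foreground counts by frame differencing.

    frames: list of 2-D grayscale images (lists of lists of ints).
    Returns a heat map where each cell counts how often that pixel
    changed by more than `threshold` between consecutive frames."""
    h, w = len(frames[0]), len(frames[0][0])
    heat = [[0] * w for _ in range(h)]
    for prev, cur in zip(frames, frames[1:]):
        for y in range(h):
            for x in range(w):
                if abs(cur[y][x] - prev[y][x]) > threshold:
                    heat[y][x] += 1
    return heat
```

Normalizing the counts and rendering them with a color map gives the heat-map image; overlaying the logged mouse positions then shows how the player's attention matches the object locations.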

A Computer Access System for the Physically Disabled Using Eye-Tracking and Speech Recognition (아이트래킹 및 음성인식 기술을 활용한 지체장애인 컴퓨터 접근 시스템)

  • Kwak, Seongeun;Kim, Isaac;Sim, Debora;Lee, Seung Hwan;Hwang, Sung Soo
    • Journal of the HCI Society of Korea / v.12 no.4 / pp.5-15 / 2017
  • Alternative computer-access devices are one way for the physically disabled to meet their desire to participate in social activities. Most of these devices provide access to computers through the feet or head. However, it is not easy for people with physical disabilities to control the mouse using their feet, head, etc. In this paper, we propose a computer-access system for the physically disabled. The proposed system moves the mouse using only the user's gaze via eye-tracking technology. The mouse can be clicked through an external button that is relatively easy to press, and characters can be entered easily and quickly through voice recognition. It also provides detailed functions such as right-click, double-click, drag, an on-screen keyboard, internet access, and scrolling.

Efficient Fingertip Tracking and Mouse Pointer Control for Implementation of a Human Mouse (휴먼마우스 구현을 위한 효율적인 손끝좌표 추적 및 마우스 포인트 제어기법)

  • 박지영;이준호
    • Journal of KIISE: Software and Applications / v.29 no.11 / pp.851-859 / 2002
  • This paper discusses the design of a working system that visually recognizes hand gestures for the control of a window-based user interface. We present a method for tracking the fingertip of the index finger using a single camera. Our method is based on the CAMSHIFT algorithm and outperforms it in that it reliably tracks the particular hand poses used in the system, even against complex backgrounds. We describe how the location of the fingertip is mapped to a location on the monitor, and how it is both necessary and possible to smooth the path of the fingertip location using a physical model of a mouse pointer. Our method tracks in real time without absorbing a major share of computational resources. The performance of our system shows great promise that this methodology can be used to control computers in the near future.
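The pointer smoothing via a "physical model" mentioned above can be illustrated with a spring-damper step: the on-screen pointer is pulled toward the raw fingertip coordinate by a spring force and slowed by damping, which filters out detection jitter. The constants and function name here are assumptions, not the paper's model.

```python
def smooth_pointer_step(pos, vel, target, k=0.3, damping=0.7, dt=1.0):
    """One integration step of a spring-damper pointer model.

    pos/vel: current pointer position and velocity (x, y) tuples.
    target:  raw fingertip coordinate for this frame.
    Returns the updated (position, velocity)."""
    ax = k * (target[0] - pos[0]) - damping * vel[0]
    ay = k * (target[1] - pos[1]) - damping * vel[1]
    vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)
```

Called once per frame, this makes the pointer glide toward each new fingertip measurement instead of jumping, at the cost of a small lag controlled by `k` and `damping`.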

Hand Tracking and Hand Gesture Recognition for Human Computer Interaction

  • Bai, Yu;Park, Sang-Yun;Kim, Yun-Sik;Jeong, In-Gab;Ok, Soo-Yol;Lee, Eung-Joo
    • Journal of Korea Multimedia Society / v.14 no.2 / pp.182-193 / 2011
  • The aim of this paper is to present a methodology for hand tracking and hand-gesture recognition. The detected hand and gesture can be used to implement a non-contact mouse; we developed an MP3 player using this technology, controlling the computer instead of a mouse. In this algorithm, we first apply pre-processing to every frame, including lighting compensation and background filtering, to reduce the adverse impact on the correctness of hand tracking and gesture recognition. Second, a YCbCr skin-color likelihood algorithm is used to detect the hand area. Then the Continuously Adaptive Mean Shift (CAMSHIFT) algorithm is used to track the hand. Because the formula-based region of interest is square while the hand is closer to rectangular, we improved the search-window formula to obtain a window better suited to the hand. A Support Vector Machine (SVM) is then used for hand-gesture recognition; to train the system, we collected 1500 pictures of 5 hand gestures. Finally, we performed extensive experiments on a Windows XP system to evaluate the efficiency of the proposed scheme: the hand-tracking correct rate is 96% and the average hand-gesture correct rate is 95%.
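The YCbCr skin-detection step above is typically built from the standard BT.601 color conversion plus fixed chrominance ranges. The conversion below is the standard formula; the Cb/Cr thresholds are commonly cited values, offered as a sketch rather than this paper's exact likelihood model:

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 RGB -> YCbCr conversion (full-range form)."""
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin_ycbcr(r, g, b, cb_range=(77, 127), cr_range=(133, 173)):
    """Skin test on the chrominance channels only, which makes the
    classifier largely insensitive to brightness (the Y channel)."""
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return cb_range[0] <= cb <= cb_range[1] and cr_range[0] <= cr <= cr_range[1]
```

The binary skin mask produced this way seeds the CAMSHIFT search window for tracking.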

"Least Gain or Wrist Pain": A comparative study about performance and usability of mouse, trackball, and touchpad

  • Yunsun Alice Hong;Kwanghee Han
    • International Journal of Advanced Culture Technology / v.11 no.2 / pp.298-309 / 2023
  • The mouse as an input device has undoubtedly brought convenience to users due to its intuitiveness and simplicity, but it has also brought unprecedented issues such as carpal tunnel syndrome (CTS). As a result, the need for alternative input devices that put less strain on the wrist, while still providing the convenience of a conventional mouse, has emerged. There have been several studies on alternative devices to replace the mouse, but they showed inconsistent results. This study suggests that those inconsistencies may stem from the type and difficulty of the tasks used in previous studies. Therefore, we designed this study to compare the performance and perceived workload of three input devices (mouse/trackball/touchpad) in each condition in terms of task type (targeting/tracking) and difficulty level (easy/hard). The results indicated significant performance differences but no significant workload differences among the three devices, and interactions were observed in some conditions. These results can provide users with practical guidelines for choosing the optimal input device according to their needs or purpose.
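Difficulty levels in pointing studies like the one above are conventionally quantified with Fitts' index of difficulty. The abstract does not say which formulation this study used, so the Shannon form below is offered only as the common convention:

```python
import math

def fitts_index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), where D is the distance to the target and
    W is the target width. Larger ID means a harder pointing task."""
    return math.log2(distance / width + 1)
```

Easy and hard targeting conditions would correspond to low and high ID values (e.g. large near targets vs. small far ones).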