• Title/Summary/Keyword: Mouse tracking

A Proposal of Eye-Voice Method based on the Comparative Analysis of Malfunctions on Pointer Click in Gaze Interface for the Upper Limb Disabled (상지장애인을 위한 시선 인터페이스에서 포인터 실행 방법의 오작동 비교 분석을 통한 Eye-Voice 방식의 제안)

  • Park, Joo Hyun;Park, Mi Hyun;Lim, Soon-Bum
    • Journal of Korea Multimedia Society / v.23 no.4 / pp.566-573 / 2020
  • Computers are the most common tool for using the Internet, and a mouse is usually used to select and execute objects. Eye tracking technology is welcomed as an alternative that helps users who cannot use their hands because of a disability to control a computer. However, the pointer execution methods of existing eye tracking techniques cause many malfunctions. Therefore, in this paper, we developed a gaze tracking interface combined with voice commands to solve the malfunction problem that occurs when people with upper limb disabilities use existing gaze tracking technology to execute computer menus and objects. Usability was verified through comparative experiments on the reduction of malfunctions. Users with upper limb disabilities who cannot use their hands move the pointer with eye tracking while browsing the computer screen and speak a voice command such as "okay" to click instantly. In comparative experiments against existing gaze interfaces on the reduction of pointer execution malfunctions, we verified that our system, Eye-Voice, reduces the malfunction rate of pointer execution and is effective for people with upper limb disabilities.
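
The abstract does not include an implementation, but the interaction it describes (the pointer follows the gaze continuously while a spoken keyword triggers the click) can be sketched roughly as below. This is only an illustrative outline, not the authors' code: `read_gaze_point()` and `wait_for_keyword()` are hypothetical stand-ins for an eye-tracker SDK and a speech recognizer, and `pyautogui` is used simply because it exposes real `moveTo()` and `click()` calls.

```python
# Illustrative sketch of a gaze-moved pointer with a voice-triggered click.
# The gaze and speech sources are placeholders, not the authors' components.
import threading
import pyautogui


def read_gaze_point():
    # Placeholder for an eye-tracker SDK call; re-uses the current pointer
    # position so the sketch runs without gaze hardware.
    pos = pyautogui.position()
    return pos.x, pos.y


def wait_for_keyword():
    # Placeholder for a speech recognizer listening for "okay";
    # pressing Enter simulates the spoken command here.
    input('Say "okay" (press Enter to simulate): ')


def click_loop():
    while True:
        wait_for_keyword()
        pyautogui.click()          # click wherever the gaze has put the pointer


threading.Thread(target=click_loop, daemon=True).start()

while True:                        # gaze loop: the pointer follows the eyes
    x, y = read_gaze_point()
    pyautogui.moveTo(x, y)
```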

A Study on Mouth Mouse

  • Han, Chan-Myung;Park, Joon-Ho;Kim, Hwi-Won;Yoon, Young-Woo
    • Korea Information Convergence Society: Conference Proceedings / 2008.06a / pp.173-176 / 2008
  • Among human body parts, the human face has been studied most actively for the interface between humans and computers, because the face has statistical consistency in color, shape, and texture. These characteristics allow computers to detect and track human faces in images robustly and accurately. The human face consists of eyes, nose, mouth, eyebrows, and other features, and detecting and tracking each of these features has been researched. Among them, the open mouth is the largest in size and the easiest to detect. In this study, we present a system that can move the mouse pointer using the position and state of the mouth.
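
The abstract describes the idea only at a high level. The following is a rough sketch, not the paper's implementation, of how such a mouth-driven pointer could be structured: OpenCV's standard frontal-face Haar cascade locates the face, the lower third of the face box stands in for the mouth region, the pointer follows the face position, and a large dark area in the mouth region (an open mouth) is treated as a click. The threshold values are arbitrary illustrative assumptions.

```python
# Rough sketch of a mouth-driven mouse: the face position moves the pointer,
# and a dark (open) mouth region triggers a click.  Not the paper's method.
import cv2
import pyautogui

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]
        # Map the face centre in the camera frame to screen coordinates.
        cx, cy = x + w // 2, y + h // 2
        pyautogui.moveTo(cx * screen_w // frame.shape[1],
                         cy * screen_h // frame.shape[0])
        # Crude "open mouth" test: proportion of dark pixels in the lower
        # third of the face box (illustrative threshold values).
        mouth = gray[y + 2 * h // 3 : y + h, x : x + w]
        _, dark = cv2.threshold(mouth, 60, 255, cv2.THRESH_BINARY_INV)
        if dark.mean() > 40:
            pyautogui.click()
    if cv2.waitKey(30) & 0xFF == 27:   # Esc to quit
        break
cap.release()
```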

A Study on Online Real-Time Strategy Game by using Hand Tracking in Augmented Reality

  • Jeon, Gwang-Ha;Um, Jang-Seok
    • Journal of Korea Multimedia Society / v.12 no.12 / pp.1761-1768 / 2009
  • In this paper, we implemented an online real-time strategy game in augmented reality that uses the hand as the mouse. We also introduce algorithms for detecting the hand direction, finding the fingertip of the index finger, and counting the number of fingers for interaction between users and the virtual objects. The proposed method increases the realism of the game by combining the real world with virtual objects. The Retinex algorithm is used to remove the effect of illumination changes. Implementing the virtual reality in an online environment extends the applicability of the proposed method to areas such as online education, remote medical treatment, and mobile interactive games.
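
As a point of reference only, a generic OpenCV routine for fingertip detection and finger counting is sketched below; it is not the authors' algorithm (which also detects hand direction and applies Retinex for illumination compensation). The skin-color range and defect-depth threshold are assumptions.

```python
# Generic fingertip-detection and finger-counting sketch (not the paper's code):
# skin-colour segmentation, largest contour taken as the hand, topmost contour
# point as the index fingertip, convexity defects to count the other fingers.
import cv2
import numpy as np


def analyze_hand(frame_bgr):
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, (0, 133, 77), (255, 173, 127))  # common skin range
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, 0
    hand = max(contours, key=cv2.contourArea)
    fingertip = tuple(hand[hand[:, :, 1].argmin()][0])     # topmost point
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    fingers = 1                                             # at least the index
    if defects is not None:
        for start, end, far, depth in defects[:, 0]:
            if depth > 10_000:      # deep valleys separate extended fingers
                fingers += 1
    return fingertip, fingers
```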

Dynamic Manipulation of a Virtual Object in Marker-less AR system Based on Both Human Hands

  • Chun, Jun-Chul;Lee, Byung-Sung
    • KSII Transactions on Internet and Information Systems (TIIS) / v.4 no.4 / pp.618-632 / 2010
  • This paper presents a novel approach to controlling augmented reality (AR) objects robustly in a marker-less AR system by fingertip tracking and hand pattern recognition. It is known that one of the promising ways to develop a marker-less AR system is to use parts of the human body, such as the hand or face, to replace traditional fiducial markers. This paper introduces a real-time method to manipulate the overlaid virtual objects dynamically in a marker-less AR system using both hands and a single camera. The bare left hand is treated as a virtual marker in the marker-less AR system and the right hand is used as a hand mouse. To build the marker-less system, we utilize a skin-color model for hand shape detection and curvature-based fingertip detection on the input video images. Using the detected fingertips, the camera pose is estimated so that virtual objects can be overlaid on the hand coordinate system. In order to manipulate the virtual objects rendered in the marker-less AR system dynamically, a vision-based hand control interface is developed, which exploits fingertip tracking for the movement of the objects and pattern matching for initiating hand commands. The experiments show that the proposed system can control the objects dynamically in a convenient fashion.
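
The curvature-based fingertip detection mentioned in the abstract is commonly realized as a k-curvature test along the hand contour. The sketch below shows that general idea only; the step size, angle threshold, and convexity check are illustrative assumptions rather than the paper's tuned values.

```python
# k-curvature fingertip detection sketch: a contour point is a fingertip
# candidate when the angle between the vectors to the points k steps before
# and after it is small (the contour bends sharply) and the bend is convex.
import numpy as np


def fingertips_by_curvature(contour, k=20, max_angle_deg=60.0):
    pts = contour.reshape(-1, 2).astype(np.float64)
    n = len(pts)
    tips = []
    for i in range(n):
        p, before, after = pts[i], pts[(i - k) % n], pts[(i + k) % n]
        v1, v2 = before - p, after - p
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
        angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
        # The sign of the cross product distinguishes fingertip peaks from the
        # valleys between fingers; it depends on the contour orientation.
        cross = v1[0] * v2[1] - v1[1] * v2[0]
        if angle < max_angle_deg and cross > 0:
            tips.append(tuple(pts[i].astype(int)))
    return tips
```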

Computer Interface Using Head-Gaze Tracking (응시 위치 추적 기술을 이용한 인터페이스 시스템 개발)

  • 이정준;박강령;김재희
    • Proceedings of the IEEK Conference / 1999.06a / pp.516-519 / 1999
  • Gaze detection is finding the position on a monitor screen where a user is looking, using image processing and computer vision technology. We developed a computer interface system using this gaze detection technology. The system enables a user to control the computer without using their hands, so it can help the handicapped use a computer and is also useful for people whose hands are busy with another task, especially in factory work. For practical use, a command signal corresponding to a mouse click is necessary, and we used eye winking to give this command to the system.
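
The paper predates today's facial landmark detectors, but the wink-as-click idea can be illustrated with the eye aspect ratio (EAR) computed from six eye landmarks supplied by any landmark detector; this is a modern stand-in for the original method, and the thresholds are illustrative assumptions.

```python
# Wink detection sketch via the eye aspect ratio (EAR): a wink is one eye
# staying closed while the other remains open.  Stand-in, not the 1999 method.
import numpy as np


def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks ordered corner, upper, upper, corner, lower, lower."""
    eye = np.asarray(eye, dtype=float)
    vertical = (np.linalg.norm(eye[1] - eye[5]) +
                np.linalg.norm(eye[2] - eye[4]))
    horizontal = np.linalg.norm(eye[0] - eye[3])
    return vertical / (2.0 * horizontal)


def is_wink(left_eye, right_eye, closed=0.20, opened=0.30):
    left, right = eye_aspect_ratio(left_eye), eye_aspect_ratio(right_eye)
    return (left < closed and right > opened) or (right < closed and left > opened)
```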

Internet Advertisement Technique using Mouse Cursor Tracking (마우스 커서 트래킹을 이용한 인터넷 광고 기법)

  • 김재경;김가영;김진수;최윤철
    • Proceedings of the Korea Multimedia Society Conference / 2003.05b / pp.659-662 / 2003
  • The market for and technology of Internet advertising are growing and advancing rapidly every year, and Internet advertising has become an indispensable part of our lives. Unlike advertising delivered through broadcast media such as TV or radio, Internet advertising must attract the user's attention and deliver its content effectively without interrupting the user's work on the Internet: advertisements forced on the user can interfere with whatever the user is doing and cause considerable annoyance, while advertisements that are too static to attract attention have very little effect. In this paper, we propose an Internet advertising technique that draws the user's attention on the web while interfering with the user's work as little as possible, and we verify the effectiveness of the proposed technique through comparative experiments with existing advertisements.

The input device system with hand motion using hand tracking technique of CamShift algorithm (CamShift 알고리즘의 Hand Tracking 기법을 응용한 Hand Motion 입력 장치 시스템)

  • Jeon, Yu-Na;Kim, Soo-Ji;Lee, Chang-Hoon;Kim, Hyeong-Ryul;Lee, Sung-Koo
    • Journal of Digital Contents Society / v.16 no.1 / pp.157-164 / 2015
  • Existing input devices are largely limited to the keyboard and mouse; however, new types of input devices have recently been developed in response to user demand. Reflecting this trend, we propose a new type of input device that issues commands by analyzing hand motions in images, without any special hardware. After binarizing the skin-color region and tracking it with the CamShift method, the system recognizes hand motions by taking the finger regions, separated through labeling, and their angles from the palm center point, classifying them into the four cardinal directions, and counting them. Without a controlled background and without gloves, the recognition rate remained at approximately 75 percent; with a controlled background and red gloves, the recognition rate increased to 90.2 percent owing to the reduction in noise.
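
For orientation, the sketch below shows a minimal CamShift hand-tracking loop using OpenCV's real `calcBackProject` and `CamShift` calls; the initial hand window and skin-hue range are assumptions, and the paper's additional labeling and direction-based finger counting are not reproduced.

```python
# Minimal CamShift hand-tracking loop (illustrative; the paper additionally
# labels finger regions and counts them by direction): the hue histogram of an
# initial skin region is back-projected on every frame and CamShift follows
# the densest skin-coloured blob.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
x, y, w, h = 200, 150, 100, 100                 # assumed initial hand window
hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
mask = cv2.inRange(hsv_roi, (0, 30, 60), (20, 150, 255))    # rough skin range
hist = cv2.calcHist([hsv_roi], [0], mask, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
track_window = (x, y, w, h)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, criteria)
    box = cv2.boxPoints(rot_rect).astype(np.int32)  # rotated box around the hand
    cv2.polylines(frame, [box], True, (0, 255, 0), 2)
    cv2.imshow("hand", frame)
    if cv2.waitKey(30) & 0xFF == 27:                # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```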

Technical Survey on the Real Time Eye-tracking Pointing Device as a Smart Medical Equipment (실시간 시선 추적기반 스마트 의료기기 고찰)

  • Park, Junghoon;Yim, Kangbin
    • Smart Media Journal / v.10 no.1 / pp.9-15 / 2021
  • The eye tracking system considered in this paper is an eye-based computer input device designed to give easy access to people with Lou Gehrig's disease (ALS) or various other muscle-related diseases. Its potential users amount to roughly 30,000 people, combining the approximately 1,700 ALS patients in Korea with those who are unable to move their bodies because of various accidents or diseases. Because these eye input devices are intended for a small group of users, the commercial devices available on the market are expensive and difficult to use, making them less accessible to the very users who need them. In addition, individuals differ in their economic situation and their experience with smart devices, which makes commercial eye tracking systems hard to adopt in terms of either cost or usability. Attempts to improve accessibility to IT devices through low-cost but easy-to-use technologies are therefore essential. This paper proposes an eye tracking system that complements the deficiencies of existing systems and offers better performance, so that it can be used conveniently by far more people and patients. Based on voluntary VoCs (Voice of Customers) from users who have tried different kinds of eye tracking systems, together with various usability tests, we propose a reduced system that cuts the amount of computation to 1/15 and keeps the eye-gaze tracking error below 0.5 to 1 degree.

In vivo Tracking of Transplanted Bone Marrow-Derived Mesenchymal Stem Cells in a Murine Model of Stroke by Bioluminescence Imaging

  • Jang, Kyung-Sool;Lee, Kwan-Sung;Yang, Seung-Ho;Jeun, Sin-Soo
    • Journal of Korean Neurosurgical Society / v.48 no.5 / pp.391-398 / 2010
  • Objective : This study was designed to validate the cell trafficking efficiency of in vivo bioluminescence imaging (BLI) in the setting of transplantation of luciferase-expressing bone marrow-derived mesenchymal stem cells (BMSC), delivered at different times after transient middle cerebral artery occlusion (MCAO) in a mouse model. Methods : Donor BMSC for transplantation were prepared by primary cell culture from a transgenic mouse expressing luciferase (LUC). Transient focal infarcts were induced in 4-6-week-old male nude mice. The experimental mice were divided into five groups by the time of MSC transplantation : 1) sham-operation group, 2) 2-h group, 3) 1-day group, 4) 3-day group, and 5) 1-week group. BLI for detecting the spatial distribution of the transplanted MSC was performed by detecting the emitted photons. Migration of the transplanted cells to the infarcted area was confirmed by histological examinations. Differences between groups were evaluated by paired t-test. Results : A focal spot of bioluminescence was observed at the injection site on the day after transplantation. After 4 weeks, the mean signal intensities of the 2-h, 1-day, 3-day, and 1-week groups were $2.6\times10^7\pm7.4\times10^6$, $6.1\times10^6\pm1.2\times10^6$, $1.7\times10^6\pm4.4\times10^5$, and $8.9\times10^6\pm9.5\times10^5$, respectively. The 2-h group showed significantly higher signal intensity (p<0.01). On immunohistochemical examination, the engrafted BMSC were found around the infarct border zones. The counts of LUC-positive cells were highest in the 2-h group, in agreement with the results of the BLI experiments (p<0.01). Conclusion : The results suggest that the transplanted BMSC migrated to the infarct border zone in the BLI study, and the signal intensity of LUC-positive cells was highest when MSC were transplanted 2 hours after MCAO in the mouse model. In addition, noninvasive real-time imaging is an ideal method for tracking stem cell transplantation, and this method can be widely applied to various research fields of cell transplantation therapy.

Development of Virtual Campus Information System using Interactive Virtual Reality Technology (상호작용 VR 기술을 이용한 가상 캠퍼스 안내 시스템 구현)

  • Kim, Jong-Nam;Na, Kil-Hang;Kim, Jong-Heon;Kim, Gyeong-Eop;Jung, Young-Kee
    • HCI Society of Korea: Conference Proceedings / 2008.02a / pp.779-784 / 2008
  • For a user to feel realistic immersion in a virtual reality system, both building the virtual environment with appropriate hardware and supporting interaction with the user are important. In this paper, we build a virtual environment that recognizes the user's movement, position, and gestures through various VR interfaces such as a motion tracking system, a Wand (3D mouse), and HoloPoint, and that presents stereoscopic images on a large multi-display system, and we implement a virtual campus information system that provides the information the user wants through interaction. To build the virtual campus, a 3D scanner was used to obtain the accurate shapes of the campus terrain, buildings, and structures, and the acquired data were turned into 3D models through a series of processing steps; modeling software was then used to rearrange and optimize the generated models. To link the constructed virtual campus with the motion tracking system and the Wand, VR programming was carried out so that the user's movements and gestures are applied directly to the content. In addition, a kiosk-type HoloPoint was used to build a guide system with which the user interacts through hand gestures. By presenting another method and application example of building a virtual reality system, the interactive virtual campus information system is expected to be applicable to virtual exhibition halls, virtual experience centers, and similar facilities.
