• Title/Summary/Keyword: Mouse Interface


Optimal Display-Control Gain of the Foot-Controlled Isotonic Mouse on a Target Acquisition Task (목표점 선택작업에서 등력성 발 마우스의 최적 반응 - 조종 이득)

  • Lee, Kyung-Tae;Jang, Phil-Sik;Lee, Dong-Hyun
    • IE interfaces
    • /
    • v.17 no.1
    • /
    • pp.113-120
    • /
    • 2004
  • The increased use of computers has introduced a wide variety of human-computer interfaces. The mouse is one of the most useful interface tools for placing the cursor at a desired position on the monitor. This paper proposes a foot-controlled isotonic mouse, similar to an ordinary hand-controlled mouse except that positioning is controlled by the right foot and clicking is performed by the left foot. Experimental results showed that both the index of difficulty (IOD) and the display-control gain (DC gain) affected the total movement time in a target acquisition task on the monitor. The authors also derived the optimal display-control gain of the foot-controlled isotonic mouse over indices of difficulty from 1.0 to 3.0. The optimal display-control gain, i.e., 0.256, can be used when designing a foot-controlled isotonic mouse.
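The relationship above can be sketched with Fitts' law: movement time grows with the index of difficulty, and the DC gain sets how far the foot must move to cover a given screen distance. The regression coefficients below are hypothetical placeholders, not the values fitted in the paper.

```python
import math

def index_of_difficulty(distance, width):
    # Shannon form of Fitts' index of difficulty, in bits
    return math.log2(distance / width + 1)

def control_distance(display_distance, dc_gain):
    # With display-control gain g, moving the cursor d units on screen
    # requires a foot movement of d / g control units
    return display_distance / dc_gain

def movement_time(distance, width, a=0.3, b=0.4):
    # Fitts' law MT = a + b * ID; a and b are hypothetical
    # coefficients, not the regression values from the paper
    return a + b * index_of_difficulty(distance, width)
```

With the paper's optimal gain of 0.256, a 25.6-pixel cursor move would correspond to a 100-unit foot movement in this sketch.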

Implementation of Real-time Vowel Recognition Mouse based on Smartphone (스마트폰 기반의 실시간 모음 인식 마우스 구현)

  • Jang, Taeung;Kim, Hyeonyong;Kim, Byeongman;Chung, Hae
    • KIISE Transactions on Computing Practices
    • /
    • v.21 no.8
    • /
    • pp.531-536
    • /
    • 2015
  • Speech recognition is an active research area in human-computer interfaces (HCI), with the objective of controlling digital devices by voice. The mouse, in turn, is a widely used computer peripheral provided in graphical user interface (GUI) computing environments. In this paper, we propose a method of controlling the mouse with the real-time speech recognition function of a smartphone. The processing steps are: extracting the core voice signal after receiving a voice input of suitable length in real time, performing quantization with a learned codebook after feature extraction with mel-frequency cepstral coefficients (MFCC), and finally recognizing the corresponding vowel using a hidden Markov model (HMM). A virtual mouse is then operated by mapping each vowel to a mouse command. Finally, we demonstrate various mouse operations on a desktop PC display with the implemented smartphone application.
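The back end of this pipeline can be sketched in two steps: vector quantization of an MFCC frame against a learned codebook, then a lookup from the recognized vowel to a mouse command. Both the codebook values and the vowel-to-command assignment below are illustrative, not those used in the paper.

```python
def quantize(frame, codebook):
    # Vector quantization: return the index of the nearest
    # codeword under squared Euclidean distance
    best, best_d = 0, float("inf")
    for i, code in enumerate(codebook):
        d = sum((f - c) ** 2 for f, c in zip(frame, code))
        if d < best_d:
            best, best_d = i, d
    return best

# Hypothetical vowel-to-command table (the paper's mapping
# is not given in the abstract)
VOWEL_COMMANDS = {"a": "left_click", "i": "right_click",
                  "o": "cursor_up", "u": "cursor_down"}

def dispatch(vowel):
    # Map an HMM-recognized vowel label to a virtual mouse command
    return VOWEL_COMMANDS.get(vowel, "ignore")
```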

Gyro-Mouse for the Disabled: 'Click' and 'Position' Control of the Mouse Cursor

  • Eom, Gwang-Moon;Kim, Kyeong-Seop;Kim, Chul-Seung;Lee, James;Chung, Soon-Cheol;Lee, Bong-Soo;Higa, Hiroki;Furuse, Norio;Futami, Ryoko;Watanabe, Takashi
    • International Journal of Control, Automation, and Systems
    • /
    • v.5 no.2
    • /
    • pp.147-154
    • /
    • 2007
  • This paper describes a 'gyro-mouse', a new human-computer interface (HCI) that provides the mouse-click and mouse-move functions for persons with disabilities of the upper extremities. We adopted an artificial neural network to recognize a quick-nodding pattern of the disabled person as the gyro-mouse click. The performance of the gyro-mouse was evaluated by three indices: 'click recognition rate', 'error in cursor position control', and 'click rate per minute' on a target box appearing at random positions. Although the average error in cursor position control was 1.4-1.5 times larger than that of an optical mouse and the average click rate per minute was 40% that of an optical mouse, the overall click recognition rate was 93%. Moreover, the click rate per minute increased from 35.2% to 44% with repeated trials. Hence, the suggested gyro-mouse system can provide a new user interface tool, especially for persons who do not have full use of their upper extremities.
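The paper trains a neural network to spot the quick-nod click pattern; as a rough stand-in, the heuristic below treats a nod as a positive swing of head angular velocity followed by a negative swing. The threshold and units are illustrative assumptions, not the paper's classifier.

```python
def detect_nod(gyro, threshold=1.5):
    # Heuristic stand-in for the paper's neural network: a quick
    # nod shows up as a positive angular-velocity swing followed
    # by a negative one (threshold is illustrative).
    swung_down = False
    for w in gyro:
        if w > threshold:
            swung_down = True
        elif swung_down and w < -threshold:
            return True
    return False
```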

Study about Windows System Control Using Gesture and Speech Recognition (제스처 및 음성 인식을 이용한 윈도우 시스템 제어에 관한 연구)

  • 김주홍;진성일;이남호;이용범
    • Proceedings of the IEEK Conference
    • /
    • 1998.10a
    • /
    • pp.1289-1292
    • /
    • 1998
  • HCI (human-computer interface) technologies have often been implemented using the mouse, keyboard, and joystick. Because the mouse and keyboard are usable only in limited situations, more natural HCI methods such as speech-based and gesture-based methods have recently attracted wide attention. In this paper, we present a multi-modal input system to control the Windows system for practical use of a multimedia computer. Our multi-modal input system consists of three parts. The first is a virtual-hand mouse, which replaces mouse control with a set of gestures. The second is Windows system control using speech recognition. The third is Windows system control using gesture recognition. We employ neural network and HMM methods to recognize speech and gestures. The outputs of the three parts interface directly with the CPU and with Windows.
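The three parts above amount to routing recognized labels from two modalities into system commands. The dispatcher below sketches that routing; the gesture set and speech vocabulary are hypothetical, since the abstract does not list them.

```python
# Hypothetical command tables; the actual gesture set and speech
# vocabulary are not listed in the abstract.
GESTURE_COMMANDS = {"point": "move_cursor", "grab": "drag",
                    "open_hand": "release"}
SPEECH_COMMANDS = {"open": "open_window", "close": "close_window"}

def handle_event(modality, label):
    # Route a recognized gesture or speech label to a Windows command
    table = GESTURE_COMMANDS if modality == "gesture" else SPEECH_COMMANDS
    return table.get(label, "no_op")
```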


Design of a novel haptic mouse system

  • Choi, Hee-Jin;Kwon, Dong-Soo;Kim, Mun-Sang
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2002.10a
    • /
    • pp.51.4-51
    • /
    • 2002
  • A novel haptic mouse system is developed for human-computer interfacing.
  • A five-bar mechanism is adopted for 2-DOF force feedback with a virtual environment.
  • A double prismatic joint mechanism is adopted to reflect 1-DOF grabbing force feedback.
  • A cable-driven mechanism is used for actuation to reduce backlash and provide backdrivability.
  • A virtual wall perception experiment is conducted to obtain the force specification for the haptic mouse.
  • The average mouse workspace is measured using a magnetic position tracker.
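The virtual wall experiment mentioned above is commonly rendered with a penalty-based force law: push back in proportion to how far the cursor has penetrated the wall. This is a minimal sketch of that standard technique; the stiffness value is illustrative, not the paper's measured specification.

```python
def virtual_wall_force(x, stiffness=500.0):
    # Penalty-based rendering of a virtual wall occupying x < 0:
    # the force is proportional to penetration depth and pushes the
    # user back out; zero force in free space (x >= 0).
    penetration = max(0.0, -x)
    return stiffness * penetration
```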


Inflatable Mouse: Volume-adjustable Mouse with Air-pressure-sensitive Input and Haptic Feedback (부풀어지는 마우스: 기압센서를 이용한 입력과 햅틱 피드백을 갖는 부피가 변하는 마우스)

  • Kim, Seok-Tae;Lee, Bo-Ram;Kim, Hyun-Jung;Nam, Tek-Jin;Lee, Woo-Hun
    • HCI Society of Korea: Conference Proceedings
    • /
    • 2008.02b
    • /
    • pp.323-328
    • /
    • 2008
  • Inflatable Mouse is a volume-adjustable user interface. It can be inflated to the volume of a familiar mouse, but deflated and stored flat in a PC card slot of a laptop computer when not in use. Inflatable Mouse functions just like a typical mouse; moreover, it provides new interaction techniques by sensing the air pressure in its balloon. It also addresses issues associated with pressure-sensing interactions, such as the lack of bi-directional control and the lack of effective feedback, and it can be used as both a control tool and a display tool. In this paper, the design of an Inflatable Mouse prototype is described, and potential application scenarios such as zooming in/out and fast scrolling using pressure control are explained. We also discuss the potential use of Inflatable Mouse as an emotional communication tool.
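The bi-directional pressure control described above can be sketched as a mapping from balloon pressure to a zoom rate: squeezing above the resting pressure zooms in, letting the mouse re-inflate below it zooms out, and a deadband keeps the view still at rest. All constants here are illustrative assumptions, not values from the paper.

```python
def zoom_rate(pressure, baseline=101.3, deadband=0.5, gain=2.0):
    # Hypothetical pressure-to-zoom mapping (pressure in kPa):
    # positive deviation from the resting baseline zooms in,
    # negative deviation zooms out, small deviations are ignored.
    delta = pressure - baseline
    if abs(delta) <= deadband:
        return 0.0
    return gain * (delta - deadband if delta > 0 else delta + deadband)
```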


Implementing Leap-Motion-Based Interface for Enhancing the Realism of Shooter Games (슈팅 게임의 현실감 개선을 위한 립모션 기반 인터페이스 구현)

  • Shin, Inho;Cheon, Donghun;Park, Hanhoon
    • Journal of the HCI Society of Korea
    • /
    • v.11 no.1
    • /
    • pp.5-10
    • /
    • 2016
  • This paper provides a shooter-game interface that enhances the game's realism by recognizing the user's hand gestures with the Leap Motion. We implemented the functions necessary in shooter games, such as shooting, moving, viewpoint change, and zoom in/out, and confirmed through a user test that a game interface using familiar and intuitive hand gestures is superior to the conventional mouse/keyboard in terms of ease of manipulation, interest, extensibility, and so on. Specifically, the user satisfaction index (1-5) averaged 3.02 with the mouse/keyboard interface and 3.57 with the proposed hand-gesture interface.
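A viewpoint-change function like the one above needs the palm position mapped from the sensor's tracking volume to screen coordinates. The sketch below assumes a Leap-Motion-style interaction box; its bounds and the screen size are illustrative, not taken from the paper.

```python
def to_screen(hand_x, hand_y, box=(-120.0, 120.0, 80.0, 320.0),
              screen=(1920, 1080)):
    # Normalize a palm position (mm, within an assumed interaction
    # box x0..x1, y0..y1) to screen pixels for aiming. The y axis is
    # flipped: sensor y grows upward, screen y grows downward.
    x0, x1, y0, y1 = box
    u = min(max((hand_x - x0) / (x1 - x0), 0.0), 1.0)
    v = min(max((hand_y - y0) / (y1 - y0), 0.0), 1.0)
    return int(u * (screen[0] - 1)), int((1.0 - v) * (screen[1] - 1))
```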

A Design and Implementation of Natural User Interface System Using Kinect (키넥트를 사용한 NUI 설계 및 구현)

  • Lee, Sae-Bom;Jung, Il-Hong
    • Journal of Digital Contents Society
    • /
    • v.15 no.4
    • /
    • pp.473-480
    • /
    • 2014
  • As the use of computers has become widespread, active research is in progress on interfaces that are more convenient and natural than existing ones such as the keyboard and mouse. For this reason, there is increasing interest in Microsoft's motion-sensing module, Kinect, which can recognize hand motions and speech to realize natural communication with computers. Kinect uses its built-in sensor to recognize the main joint movements and depth of the body, and it provides simple speech recognition through its built-in microphone. In this paper, the goal is to use Kinect's depth-value data, skeleton tracking, and a labeling algorithm to extract the hand and recognize its movement, replacing the existing peripherals with a virtual mouse, a virtual keyboard, and speech recognition.
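A common first step in such a pipeline is extracting the hand from the depth map by thresholding near the closest point, on the assumption that the user extends the hand toward the sensor. This is a simplified stand-in for the paper's depth plus labeling approach, not its exact algorithm.

```python
def segment_hand(depth_mm, band=150):
    # Depth-threshold hand extraction: keep pixels within `band` mm
    # of the closest valid point, assumed to be the extended hand.
    # Zeros are treated as invalid Kinect readings.
    nearest = min(d for row in depth_mm for d in row if d > 0)
    return [[1 if 0 < d <= nearest + band else 0 for d in row]
            for row in depth_mm]
```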

A Study on the Implementation of Physical Interfaces (피지컬 인터페이스의 구현에 관한 연구)

  • 오병근
    • Archives of design research
    • /
    • v.16 no.2
    • /
    • pp.131-140
    • /
    • 2003
  • Input for computer interaction design is very limited when users can control the interface only with a keyboard and mouse. With basic electrical engineering, however, input design can depart from the existing methods. Interactive art using computer technology, completed by people's participation, has recently emerged. The electric signals transmitted in digital and analogue form from a people-controlled interface to the computer can be used in multimedia interaction design, and electric circuit design is necessary to transmit stable electric signals from the interface. Electric switch, sensor, and camera technologies can be applied to input interface design as alternative physical interactions without a computer keyboard and mouse. This type of interaction design, using the human body's language and gestures, conveys the richness of humanity.
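Once an analogue sensor signal reaches the computer, software must turn its noisy readings into stable input events. The hysteresis thresholding below is a common way to do that for switch- or sensor-based interfaces like those described above; the ADC thresholds are illustrative assumptions.

```python
def sensor_events(samples, on=600, off=400):
    # Hysteresis thresholding of a noisy analogue sensor (ADC
    # counts): emit "press"/"release" events that software can then
    # treat like mouse input. Two thresholds prevent chattering when
    # the signal hovers near a single cutoff.
    events, active = [], False
    for s in samples:
        if not active and s >= on:
            active = True
            events.append("press")
        elif active and s <= off:
            active = False
            events.append("release")
    return events
```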
