• Title/Summary/Keyword: Mouse Interface


A Study on the Virtual Mouse Interface System (가상 마우스 인터페이스 시스템에 관한 연구)

  • Lee, Ki-Young;Lim, Myung-Jae;Kim, Kyu-Ho;Lee, Min-Ki;Kim, Jeong-Lae
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.11 no.2 / pp.57-62 / 2011
  • Recently, the rapid development and spread of portable devices has created demand for more varied forms of interaction, and the touch interfaces of tablets and smartphones impose many requirements on the user, including prior practice. This paper aims to realize a virtual mouse for portable devices that serves a variety of users and supports convenient interaction. The design mirrors the interaction style of an existing mouse and uses infrared imaging so that it does not depend on ambient illumination. By analyzing the fingertip positions of the user's index and middle fingers, a virtual mouse was designed that requires no prior practice and imposes no spatial constraints.
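The two-finger idea above can be sketched as a small mapping function. This is a minimal illustration, not the paper's actual algorithm: it assumes the cursor follows the index fingertip and that a "click" fires when the middle fingertip moves close to it; the threshold value is made up.

```python
import math

def virtual_mouse_state(index_tip, middle_tip, click_dist=30.0):
    """Map two tracked fingertip points (in pixels) to a cursor position
    and a click flag. The gesture mapping is an assumption for
    illustration: cursor follows the index fingertip, and a click fires
    when the middle fingertip comes within click_dist of it."""
    x, y = index_tip
    dist = math.dist(index_tip, middle_tip)
    return {"cursor": (x, y), "click": dist < click_dist}

# Fingers apart: movement only
state = virtual_mouse_state((120, 80), (160, 90))
# Fingers pinched together: click at the cursor position
clicked = virtual_mouse_state((120, 80), (125, 82))
```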

Evaluating the performance of direct manipulation input devices (직접조작방식 입력장치의 성능비교)

  • 박재희;이남식
    • Journal of the Ergonomics Society of Korea / v.11 no.1 / pp.103-109 / 1992
  • Direct manipulation is composed of pointing and dragging operations. To find the optimum design parameters (such as C/D ratio and movement direction) for direct manipulation in a GUI (Graphical User Interface), an ergonomic experiment (a 2*4*3*8 design) was devised to measure the performance of a mouse (Microsoft) and a trackball (Logitech). The results showed that the mouse was more suitable for direct manipulation (especially for dragging) than the trackball, and that the suitable C/D ratio was 10 for the mouse and 16 for the trackball. Movement direction was also a determinant of trackball performance.
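The role of the C/D ratio can be shown with a short sketch. Assuming the usual definition (units of control movement per unit of display movement), a higher ratio means the same hand movement produces less cursor travel; the units and values below are illustrative only.

```python
def display_delta(control_delta_mm, cd_ratio):
    """Convert physical device movement into on-screen cursor movement.
    The control/display (C/D) ratio is the amount of device movement per
    unit of cursor movement, so the screen delta is the device delta
    divided by the ratio. Units here are illustrative."""
    dx, dy = control_delta_mm
    return (dx / cd_ratio, dy / cd_ratio)

# With the paper's suggested ratios, a 50 mm movement travels further
# on screen with the mouse (C/D = 10) than with the trackball (C/D = 16).
mouse_move = display_delta((50, 0), 10)
trackball_move = display_delta((50, 0), 16)
```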


Development of Mobile Cloud Computing Client UI/UX based on Open Source SPICE (오픈소스 SPICE 기반의 모바일 클라우드 컴퓨팅 클라이언트 UI/UX 개발)

  • Jo, Seungwan;Oh, Hoon;Shim, Kyusung;Shim, Kyuhyun;Lee, Jongmyung;An, Beongku
    • Journal of the Institute of Electronics and Information Engineers / v.53 no.8 / pp.85-92 / 2016
  • Mobile cloud computing (MCC) is not just an extension of cloud concepts into mobile environments; it is a service technology that lets all mobile devices, including smartphones, use desired services through cloud technology without constraints of time and space. While mobile cloud computing has been actively researched, user interfaces have received comparatively little attention. The main features and contributions of this paper are as follows. First, we develop a UI that considers UX, unlike the conventional interfaces supported by SPICE. Second, we combine the two-button interface into a one-button interface when the keyboard is used in mobile cloud computing clients. Third, we develop a mouse interface suited to mobile cloud computing clients. Fourth, we solve the problem of selecting buttons, files, and folders at the corners of the screen. Finally, we remap the mouse scroll function from the volume buttons to a touch-screen scroll interface. Performance evaluation shows that users can provide input easily with the enlarged, fixed mouse interface. Since shortcut keys are provided in place of complex keyboard button sequences, inputs that took 3-6 steps are reduced to one step, which simply supports complex key and mouse input for users.
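The step-reduction idea can be sketched as a shortcut table that collapses a multi-step key sequence into one trigger. The shortcut names and sequences below are hypothetical examples, not the paper's actual bindings.

```python
# Hypothetical mapping from one-tap shortcuts to the multi-step key
# sequences they replace in a remote-desktop-style client.
KEY_SEQUENCES = {
    "copy":  ["open_keyboard", "press_ctrl", "press_c"],
    "paste": ["open_keyboard", "press_ctrl", "press_v"],
}

def expand_shortcut(name):
    """Resolve a one-tap shortcut into the full key sequence it replaces."""
    return KEY_SEQUENCES[name]

steps_before = len(expand_shortcut("copy"))  # steps without the shortcut
steps_after = 1                              # one tap with the shortcut
```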

Improvement of Smartphone Interface Using AR Marker (AR 마커를 이용한 스마트폰 인터페이스의 개선)

  • Kang, Yun-A;Han, Soon-Hung
    • Korean Journal of Computational Design and Engineering / v.16 no.5 / pp.361-369 / 2011
  • As smartphones have recently come into wide use, they have become popular not only among young people but among the middle-aged as well. Most smartphones use capacitive full touch screens, so touch commands are made with the fingers, unlike the PDAs of the past that used touch pens. As a result, a significant portion of the screen is blocked by the finger, making it impossible to see the area around the touch point, and precise control of small targets such as a QWERTY keyboard is difficult. To solve this problem, this research proposes using simple AR markers to improve the smartphone interface. A sticker-type marker is attached to the fingernail and placed in front of the smartphone camera. The camera image is then analyzed to determine the marker's orientation: depending on the marker's rotation angle, the input is interpreted as onPress() or onRelease() of a mouse, and the marker's position is used as the position of the mouse cursor. This method enables the click, double-click, and drag-and-drop used on PCs as well as the touch, slide, and long-touch input used on smartphones. This research makes smartphone input more precise and simple, and shows the possibility of a new concept of smartphone interface.
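The marker-to-mouse mapping can be sketched in a few lines. The 45-degree press threshold is an assumed value for illustration; the paper does not specify it here.

```python
def marker_to_mouse(marker_pos, rotation_deg, press_threshold=45.0):
    """Interpret a fingernail AR marker as a mouse: the marker's position
    in the camera image becomes the cursor position, and tilting the
    marker past a rotation threshold is read as a button press. The
    threshold value is an assumption for illustration."""
    pressed = abs(rotation_deg) >= press_threshold
    return {"cursor": marker_pos,
            "event": "onPress" if pressed else "onRelease"}

hover = marker_to_mouse((200, 150), 10.0)   # finger flat: no press
press = marker_to_mouse((200, 150), 60.0)   # finger tilted: press
```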

The Development of HeadZmouse for Computer Access Using Gyroscopic Technology and Macro-Interface for Computer Access (컴퓨터접근을 위한 매크로 인터페이스 및 자이로센서기술을 사용한 헤드마우스의 개발)

  • Rhee, K.M.;Woo, J.S.
    • Journal of rehabilitation welfare engineering & assistive technology / v.1 no.1 / pp.1-6 / 2007
  • Applying gyroscopic technology, the HeadZmouse has been developed to simulate left and right mouse clicks, double click, drag and drop, and even a wheel function for navigating the web. The device was designed to work in both PC and Macintosh environments over a USB cable. From first use, it offers considerable freedom to people who cannot use their hands freely: rather than being tied to the computer, simple actions such as blowing air into a sonic sensor can simulate every function of a standard mouse, including the wheel. A macro-interface device has also been developed: by storing repetitive tasks in memory, users can carry out those tasks with a single button click.
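The macro-interface idea (record once, replay with one trigger) can be sketched as a small recorder class. The event names are hypothetical stand-ins for real input events.

```python
class MacroRecorder:
    """Minimal sketch of a macro interface: store a sequence of input
    events once, then replay the whole task from a single trigger."""
    def __init__(self):
        self.events = []
        self.recording = False

    def start(self):
        self.recording = True
        self.events = []

    def capture(self, event):
        if self.recording:
            self.events.append(event)

    def stop(self):
        self.recording = False

    def replay(self):
        # One button press plays back every stored event in order.
        return list(self.events)

rec = MacroRecorder()
rec.start()
for e in ["open_menu", "select_print", "confirm"]:
    rec.capture(e)
rec.stop()
replayed = rec.replay()
```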


Smart HCI Based on the Informations Fusion of Biosignal and Vision (생체 신호와 비전 정보의 융합을 통한 스마트 휴먼-컴퓨터 인터페이스)

  • Kang, Hee-Su;Shin, Hyun-Chool
    • Journal of the Institute of Electronics Engineers of Korea SC / v.47 no.4 / pp.47-54 / 2010
  • We propose a smart human-computer interface to replace the conventional mouse. The interface controls the cursor and issues commands using hand motion alone, without any held object. Four finger motions (left click, right click, hold, drag) are enough to express all mouse functions, and cursor movement is implemented with image processing. The measures used for inference are the entropy of the EMG signal, Gaussian modeling, and maximum-likelihood estimation. For cursor control, color recognition extracts the center point of the fingertip from a marker and maps that point onto the cursor. The accuracy of finger-motion inference is over 95%, and cursor control works naturally without delay. The whole system was implemented to verify its performance and utility.
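The inference step can be sketched as maximum-likelihood classification over Gaussian models of an EMG feature. The model parameters below are made-up illustration values, not the paper's fitted models.

```python
import math

def gaussian_loglik(x, mean, std):
    """Log-likelihood of a sample under a 1-D Gaussian model."""
    return (-0.5 * math.log(2 * math.pi * std ** 2)
            - (x - mean) ** 2 / (2 * std ** 2))

def classify_motion(entropy_value, models):
    """Maximum-likelihood sketch: each finger motion has a Gaussian model
    of its EMG entropy; pick the motion whose model gives the observed
    entropy the highest likelihood."""
    return max(models, key=lambda m: gaussian_loglik(entropy_value, *models[m]))

MODELS = {                 # motion: (mean entropy, std) -- illustrative
    "left_click":  (1.0, 0.2),
    "right_click": (2.0, 0.3),
    "hold":        (3.0, 0.3),
    "drag":        (4.0, 0.4),
}

motion = classify_motion(2.1, MODELS)
```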

Evaluation of the Head Mouse System using Gyro-and Opto-Sensors (각속도 및 광센서를 이용한 헤드 마우스의 평가)

  • Park, Min-Je;Kim, Soo-Chan
    • Journal of the HCI Society of Korea / v.5 no.2 / pp.1-6 / 2010
  • In this research, we designed a head-mouse system for disabled users and gamers: a mouse controller operated by head movements and eye blinks alone, and compared its performance with other mouse controller systems. The cursor was driven by a gyro sensor measuring the angular rotation of head movements, and eye blinks were used as the system's click events. The accumulated integration error that plagued previous head-mouse systems was removed periodically and handled as a dead zone in a non-linear relative-pointing curve, enabling direct cursor control from the computed movement distance and acceleration. Active light sources were used to minimize the influence of ambient light changes, so the head mouse was unaffected by external lighting. Compared with a gaze-tracking mouse (Quick Glance), the proposed method scored about 21% higher on the "20 clicks" click-event experiment, about 25% higher on the Dasher experiment, and about 37% higher on an on-screen keyboard test, showing that the proposed head mouse performs better than the compared system.
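The dead-zone idea can be shown with a short sketch: small angular rates are ignored, which also keeps slow integration drift from moving the cursor, while deliberate head turns are scaled into cursor movement. The dead-zone width and gain below are assumed illustration values.

```python
def gyro_to_cursor_delta(angular_rate_dps, dt, dead_zone=2.0, gain=1.5):
    """Sketch of non-linear relative pointing from a gyro sensor:
    rates inside the dead zone (deg/s) produce no movement, suppressing
    drift; larger head rotations are integrated over dt and scaled by a
    gain into cursor movement. Parameter values are assumptions."""
    if abs(angular_rate_dps) < dead_zone:
        return 0.0
    return angular_rate_dps * dt * gain

still = gyro_to_cursor_delta(1.0, 0.02)   # inside dead zone: no movement
turn = gyro_to_cursor_delta(40.0, 0.02)   # deliberate head turn moves cursor
```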


Welfare Interface using Multiple Facial Features Tracking (다중 얼굴 특징 추적을 이용한 복지형 인터페이스)

  • Ju, Jin-Sun;Shin, Yun-Hee;Kim, Eun-Yi
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.1 / pp.75-83 / 2008
  • We propose a welfare interface using multiple facial-feature tracking, which can efficiently implement various mouse operations. The proposed system consists of five modules: face detection, eye detection, mouth detection, facial-feature tracking, and mouse control. The facial region is first obtained using a skin-color model and connected-component analysis (CCA). The eye regions are then localized with a neural-network (NN)-based texture classifier that discriminates the facial region into eye and non-eye classes, and the mouth region is localized with an edge detector. Once localized, the eye and mouth regions are continuously and correctly tracked by the mean-shift algorithm and template matching, respectively. Based on the tracking results, mouse operations such as movement and click are carried out. To assess its validity, the system was applied as an interface for a web browser and tested on a group of 25 users. The results show that the system achieves 99% accuracy and processes more than 21 frames/sec on a PC for 320×240 input images, providing user-friendly, convenient computer access in real time.
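The tracking-to-mouse step can be sketched as a mapping from feature movement to mouse operations. Which feature triggers the click is an assumption for illustration (here, an opened mouth); the gain value is also made up.

```python
def face_to_mouse(eye_center, prev_eye_center, mouth_open, gain=4.0):
    """Sketch of mapping tracked facial features to mouse operations:
    cursor movement follows the displacement of the tracked eye region,
    and an opened mouth is read as a click. The trigger choice and gain
    are assumptions for illustration."""
    dx = (eye_center[0] - prev_eye_center[0]) * gain
    dy = (eye_center[1] - prev_eye_center[1]) * gain
    return {"move": (dx, dy), "click": mouth_open}

op = face_to_mouse((102, 60), (100, 60), mouth_open=False)    # move right
click = face_to_mouse((102, 60), (102, 60), mouth_open=True)  # click in place
```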

A Graphic User Interface Tool Kit for Developing Small and Medium-Sized PC Software (PC용 중소형 S/W개발을 위한 graphic user interface tool kit)

  • 신하용;홍태화
    • Proceedings of the Korean Operations and Management Science Society Conference / 1992.04b / pp.524-531 / 1992
  • Software can be broadly divided into a computation part, which performs the actual functions, and a user-interface part, which exchanges information with the user. Most software fulfills its role only when solid functionality is paired with convenient usability, so building a convenient user interface is as important a goal for developers as providing good functionality. Building a convenient user interface, however, takes up a considerable share of total development time. This is especially true for software such as CAD/CAM applications that exchange information with the user graphically, where constructing the user interface is harder still and its share of development time grows even larger. GUIs have recently attracted much attention and many have been developed; among them, X-Window for Unix and MS-WINDOWS 3.0 for MS-DOS are widely used. Although these window systems provide a rich set of features, they are very large, and what a developer must learn is so complex that they are not well suited to developing small and medium-sized software. This study proposes completely separating the GUI (Graphic User Interface) part from the computation part and building it as a general-purpose development tool kit, so that developers need only write the computation part. GS-GUI is divided into a menu handler, a mouse/keyboard input handler, and a text/graphic output handler. The menu handler displays the menu tree given in a menu file as pop-ups, lets the user select menu items, and passes the action code of the selected item to the computation part. The input handler accepts input from either the mouse or the keyboard. The output handler contains various 2D/3D graphic routines, called from the computation part, for displaying computed results on screen according to the action code.
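The separation described above can be sketched as a menu tree whose selections yield action codes that the computation part dispatches on. The menu names, codes, and handlers below are hypothetical.

```python
# Hypothetical menu tree: the menu handler shows this as pop-ups and
# returns an action code for the selected item.
MENU_TREE = {
    "File": {"Open": 101, "Save": 102},
    "Draw": {"Line": 201, "Circle": 202},
}

def select(menu, item):
    """Return the action code for a chosen menu item."""
    return MENU_TREE[menu][item]

def dispatch(action_code):
    """The computation part sees only action codes, never UI details."""
    handlers = {101: "opening file", 201: "drawing line"}
    return handlers.get(action_code, "unhandled")

result = dispatch(select("File", "Open"))
```

Because the computation part receives only action codes, the menu layout can change (or move from mouse to keyboard input) without touching any computation code, which is the tool kit's central design choice.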


An Implementation of Web Image Collector using Drag&Drop Mechanism (Drag&Drop 메커니즘을 이용한 웹 이미지 수집기의 구현)

  • Lee, Seon-Ung;Moon, Il-Young
    • The Journal of Korean Institute for Practical Engineering Education / v.1 no.1 / pp.55-60 / 2009
  • The Drag&Drop mechanism originated with the Microsoft Windows clipboard: Drag&Drop means that the clipboard's copy and paste functions are carried out through mouse events. Touch interfaces have come into the spotlight on PCs and laptops, not to mention mobile phones, and mouse and touch interfaces make work environments easier and more intuitive through visible interactions. In this paper, we implement a web image collector that utilizes Drag&Drop, and propose how to apply it along with a plan for its use.
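The collector's core can be sketched as a drop-event handler that copies dragged image data into the collection, replacing explicit clipboard copy/paste. The event structure here is a hypothetical stand-in for a real GUI framework's drop event.

```python
def on_drop(drop_event, collected):
    """Sketch of a Drag&Drop image collector: a drop event carries the
    dragged data (here, an image URL), and the handler copies it into
    the collection. The event dict is a hypothetical stand-in for a
    real GUI toolkit's drop event."""
    if drop_event.get("type") == "image":
        collected.append(drop_event["url"])
        return True   # drop accepted
    return False      # non-image drops are ignored

gallery = []
accepted = on_drop({"type": "image", "url": "http://example.com/cat.png"}, gallery)
ignored = on_drop({"type": "text", "data": "hello"}, gallery)
```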
