• Title/Summary/Keyword: Intuitive interface

Visual Touchless User Interface for Window Manipulation (윈도우 제어를 위한 시각적 비접촉 사용자 인터페이스)

  • Kim, Jin-Woo; Jung, Kyung-Boo; Jeong, Seung-Do; Choi, Byung-Uk
    • Journal of KIISE: Software and Applications / v.36 no.6 / pp.471-478 / 2009
  • Recently, research on user interfaces has advanced remarkably due to the explosive growth of 3-dimensional contents and applications and the widening range of computer users. This paper proposes a novel method to manipulate windows efficiently using only intuitive hand motions. Previous methods have drawbacks such as the burden of expensive devices, the high complexity of gesture recognition, and the need for additional information such as markers. To remedy these defects, we propose a novel visual touchless interface. First, to control windows with the hand, we detect the hand region using the hue channel of the HSV color space. The distance transform is then applied to find the centroid of the hand, and the curvature of the hand contour is used to determine the positions of the fingertips. Finally, using the hand motion information, we recognize the hand gesture as one of seven predefined motions; the recognized gesture becomes a command to control the window. Because the method adopts a stereo camera, the user can manipulate windows with a sense of depth, as in the real environment. Intuitive manipulation is also available because the proposed method supports visual touch of the virtual object the user wants to manipulate, using only simple hand motions. Finally, the efficiency of the proposed method is verified via an application based on the proposed interface.
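
A minimal OpenCV sketch of the pipeline this abstract describes (hue-based hand segmentation, distance-transform centroid, fingertip localization). It is an illustration rather than the authors' code: the hue range is assumed, and the paper's contour-curvature analysis is approximated by a convex-hull distance heuristic.

```python
import cv2
import numpy as np

def analyze_hand(frame_bgr):
    """Return (palm centroid, fingertip candidates) for one BGR frame, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Segment skin-like pixels on the hue channel (range is illustrative).
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)

    # The maximum of the distance transform is a robust estimate of the palm centre.
    dist = cv2.distanceTransform(mask, cv2.DIST_L2, 5)
    _, palm_radius, _, centroid = cv2.minMaxLoc(dist)

    # Fingertip candidates: convex-hull points far from the palm centre
    # (a stand-in for the contour-curvature step used in the paper).
    hull = cv2.convexHull(hand)
    fingertips = [tuple(p[0]) for p in hull
                  if np.hypot(p[0][0] - centroid[0], p[0][1] - centroid[1]) > 1.5 * palm_radius]
    return centroid, fingertips
```

Gesture recognition would then operate on sequences of these centroids and fingertip counts over time to identify one of the seven predefined window-control motions.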

A Study on Vision Based Gesture Recognition Interface Design for Digital TV (동작인식기반 Digital TV인터페이스를 위한 지시동작에 관한 연구)

  • Kim, Hyun-Suk; Hwang, Sung-Won; Moon, Hyun-Jung
    • Archives of design research / v.20 no.3 s.71 / pp.257-268 / 2007
  • The development of human-computer interfaces has relied on the development of technology. Mice and keyboards are the most popular HCI devices for personal computing; however, such device-based interfaces are quite different from human-to-human interaction and feel very artificial. Developing more intuitive interfaces that mimic human-to-human interaction has been a major research topic among HCI researchers and engineers. At the same time, TV technology has developed rapidly, the market penetration of large-screen TVs has increased quickly, and HDTV and digital TV broadcasting are being tested. These changes in the TV environment call for changes in the human-to-TV interface. A gesture-recognition-based interface using a computer vision system can replace the remote-control-based interface because of its immediacy and intuitiveness. This research focuses on how people use their hands or arms for command gestures. A set of gestures for controlling a TV set was sampled through focus group interviews and surveys. The results of this paper can be used as a reference for designing a computer-vision-based TV interface.

Development and Research of SMT(Smart Monitor Target) Game Interface for Airsoft Gun Users (AirSoft Gun 사용자를 위한 SMT(Smart Monitor Target)게임 인터페이스 개발 연구)

  • Chung, Ju Youn; Kang, Yun Geuk
    • Journal of Information Technology Applications and Management / v.28 no.1 / pp.83-93 / 2021
  • The purpose of this study was to develop a personalized SMT (smart monitor target) game interface for users who enjoy airsoft sports, as individual purchases of SMTs have increased since the advent of the untact (contact-free) era. For this study, the UX (user experience) of the game interface was designed based on previous research. In particular, the personalized game service was reinforced by adding a CP (command post) to the SMT system that performs the home function of a console game, intended to help the user remain immersed in the game in the personalized SMT space. The major design elements of the SMT game interface were layout, color, graphics, buttons, and text, and the interface design proceeded on that basis. A grid was composed using a vertical three-segment layout with a tab function and a secured outer margin, and a military camouflage pattern and texture were applied to the color tone in the graphics work. Targets and thumbnails were produced as illustrations by experts to ensure the consistency of the interface, and the function buttons and text on each page were kept concise for intuitive information delivery. The design sources organized in this way were implemented using the Unity engine. In the future, we hope that game-user-centered personalized interfaces will continue to develop and provide differentiated services unique to SMT systems in the airsoft gun market.

Arduino-based Tangible User Interfaces Smart Puck Systems (아두이노 기반의 텐저블 유저 인터페이스 스마트퍽 시스템)

  • Bak, Seon Hui; Kim, Eung Soo; Lee, Jeong Bae; Lee, Heeman
    • Journal of Korea Multimedia Society / v.19 no.2 / pp.334-343 / 2016
  • In this paper, we developed a low-cost smart puck system that supports interaction through the intuitive operation of natural finger touches. The tangible smart puck, designed for a capacitive tabletop display, contains an embedded Arduino processor that communicates only with the MT server. The MT server communicates with both the smart puck and the display server. The display server displays relevant information at the locations of the smart pucks on the tabletop display and handles interactions with users. The experimental results show that the accuracy of identifying the smart puck ID was reliable enough for practical use, and that the information presentation processing time was excellent compared with traditional, expensive commercial products.
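
A minimal sketch of the relay role the MT server plays in this architecture, assuming a simple UDP text message from the Arduino puck; the ports and message layout are hypothetical and not taken from the paper.

```python
import socket

PUCK_PORT = 9100                      # assumed port the Arduino puck reports to
DISPLAY_ADDR = ("127.0.0.1", 9200)    # assumed address of the display server

def run_mt_server():
    """Forward puck ID/position reports to the display server."""
    display = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sink = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sink.bind(("0.0.0.0", PUCK_PORT))
    while True:
        data, _ = sink.recvfrom(256)          # e.g. b"puck_id=3;x=120;y=87" (assumed format)
        display.sendto(data, DISPLAY_ADDR)    # the display server renders info at (x, y)

if __name__ == "__main__":
    run_mt_server()
```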

A Study on Implementing Kinect-Based Control for LCD Display Contents (LCD Display 설비 Contents의 Kinect기반 동작제어 기술 구현에 관한 연구)

  • Rho, Jungkyu
    • The Transactions of The Korean Institute of Electrical Engineers / v.63 no.4 / pp.565-569 / 2014
  • Recently, various kinds of new computer-controlled devices have been introduced in a wide range of areas, and convenient user interfaces for controlling these devices are strongly needed. To implement natural user interfaces (NUIs) on top of such devices, new technologies such as touch screens, the Wii Remote, wearable interfaces, and Microsoft Kinect have been presented. This paper presents a natural and intuitive gesture-based model for controlling the contents of an LCD display. A Microsoft Kinect sensor and its SDK are used to recognize human gestures, and the gestures are interpreted into corresponding commands to be executed. A command dispatch model is also proposed in order to handle the commands more naturally. I expect the proposed interface can be used in various fields, including display content control.
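
A minimal sketch of a gesture-to-command dispatch table in the spirit of the command dispatch model mentioned above; the gesture names and commands are assumptions, and the Kinect-based recognition itself is not reproduced.

```python
from typing import Callable, Dict

def next_page():
    print("display: show next content page")

def previous_page():
    print("display: show previous content page")

def pause_playback():
    print("display: pause playback")

# Table binding recognized gesture labels to display-control commands.
DISPATCH: Dict[str, Callable[[], None]] = {
    "swipe_right": next_page,
    "swipe_left": previous_page,
    "both_hands_up": pause_playback,
}

def dispatch(gesture: str) -> None:
    """Execute the command bound to a recognized gesture, if any."""
    handler = DISPATCH.get(gesture)
    if handler is not None:
        handler()

# A recognizer would call this whenever it classifies a gesture.
dispatch("swipe_right")
```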

Development of 3-D viewer for indoor location tracking system using wireless sensor network

  • Yang, Chi-Shian; Chung, Wan-Young
    • Journal of Sensor Science and Technology / v.16 no.2 / pp.110-114 / 2007
  • In this paper we present 3-D Navigation View, a three-dimensional visualization of an indoor environment that serves as an intuitive and unified user interface for our indoor location tracking system, built with the Virtual Reality Modeling Language (VRML) in a web environment. The user's spatial information extracted from the indoor location tracking system is further processed to indicate his location in the virtual 3-D indoor environment according to his location in the physical world. The External Authoring Interface (EAI) provided by VRML enables the integration of interactive 3-D graphics into the web and direct communication with the encapsulated Java applet, which periodically updates the user's position and viewpoint in the 3-D indoor environment. Because any web browser with a VRML viewer plug-in can run the platform-independent 3-D Navigation View, specialized or expensive hardware and software can be dispensed with.
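
A conceptual sketch (not the paper's VRML/EAI Java applet) of the periodic update the abstract describes: a tracked physical position is mapped into virtual-scene coordinates and pushed to the viewer at a fixed interval. The scale factor, eye height, and callbacks are assumptions.

```python
import time

SCALE = 0.05        # assumed mapping from physical metres to virtual-scene units
EYE_HEIGHT = 1.6    # assumed viewpoint height in the virtual scene

def physical_to_virtual(x_m, y_m):
    """Map a 2-D physical position onto a 3-D virtual-scene position."""
    return (x_m * SCALE, EYE_HEIGHT, y_m * SCALE)

def update_loop(get_tracked_position, set_viewpoint, period_s=1.0):
    """Periodically push the tracked user position into the 3-D scene."""
    while True:
        x, y = get_tracked_position()             # from the sensor-network tracker
        set_viewpoint(physical_to_virtual(x, y))  # e.g. update a Viewpoint node via EAI
        time.sleep(period_s)
```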

A Development of Intuitive Single-Hand Gesture Interface For writing Hangul by Leap motion (립모션 기반의 직관적 한글 입력 핸드 제스처 인터페이스 개발)

  • Kim, Seonghyeon; Kim, Daecheon; Park, Yechan; Yeom, Sanggil; Choo, Hyunseung
    • Proceedings of the Korea Information Processing Society Conference / 2016.10a / pp.768-770 / 2016
  • Natural user interfaces (NUIs) are currently attracting attention as a next-generation input method. Various NUIs for Hangul input have already been researched and developed, but existing Hangul-input NUIs suffer from limitations such as insufficient intuitiveness and accuracy and imperfect recognition rates. In this study, a Leap Motion device is used to recognize the user's hand gestures, and the gestures for consonant and vowel input are separated, based on the composition principle of Hangul characters, to improve recognition accuracy. In addition, we study hand gestures that improve the intuitiveness of Hangul input by exploiting the directionality of the vowels. This helps users operate devices in an NUI environment more accurately and quickly.
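
A minimal sketch of how vowel directionality could drive input, as the abstract suggests: a swipe vector is classified into one of four basic vowel jamo. The mapping, threshold, and coordinate convention are assumptions, and the Leap Motion API itself is not used here.

```python
import math
from typing import Optional

# Hypothetical direction-to-vowel mapping motivated by the strokes of basic vowels.
VOWEL_BY_DIRECTION = {"right": "ㅏ", "left": "ㅓ", "up": "ㅗ", "down": "ㅜ"}

def classify_vowel(dx: float, dy: float) -> Optional[str]:
    """Map a swipe vector (screen coordinates, y pointing down) to a vowel jamo."""
    if math.hypot(dx, dy) < 30:          # ignore small, unintentional motions
        return None
    if abs(dx) >= abs(dy):
        return VOWEL_BY_DIRECTION["right" if dx > 0 else "left"]
    return VOWEL_BY_DIRECTION["down" if dy > 0 else "up"]

print(classify_vowel(80, 5))    # ㅏ
print(classify_vowel(3, -70))   # ㅗ
```

Consonant gestures would be handled by a separate gesture set, matching the abstract's separation of consonant and vowel input.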

A Study of Incremental and Multiple Entry Support Parser for Multi View Editing Environment (다중 뷰 편집환경을 위한 점진적 다중진입 지원 파서에 대한 연구)

  • Yeom, Saehun; Bang, Hyeja
    • Journal of Korea Society of Digital Industry and Information Management / v.14 no.3 / pp.21-28 / 2018
  • As computer performance and demands for user convenience increase, computer user interfaces are also changing, and these changes have had a great effect on software development environments. In the past, text editors such as vi or emacs on the UNIX operating system were the main development environment. These editors are very powerful for editing source code, but they are difficult and unintuitive compared to GUI (graphical user interface) based environments and were used only by some experts. As the software development environment shifted from the command line to the GUI, GUI editors provided usability and efficiency, and the usage of text-based editors decreased. However, GUI-based editors consume a lot of computer resources, so performance and efficiency decline: the more content there is, the more time it takes to verify and display it. In this paper, we provide a new parser that supports multi-view editing, incremental parsing, and multiple entry points into the abstract syntax tree.
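
A minimal sketch of the incremental idea behind such a parser: cache the parse result of each block and reparse only blocks whose text changed, rather than the whole document. The block granularity and the stand-in parse_block function are illustrative assumptions, not the paper's parser.

```python
import hashlib

def parse_block(text: str) -> dict:
    """Stand-in for real parsing of one editable block."""
    return {"tokens": text.split()}

class IncrementalParser:
    def __init__(self):
        self._cache = {}                   # block hash -> cached parse result

    def parse(self, blocks: list[str]) -> list[dict]:
        results = []
        for block in blocks:
            key = hashlib.sha1(block.encode()).hexdigest()
            if key not in self._cache:     # only changed or new blocks are reparsed
                self._cache[key] = parse_block(block)
            results.append(self._cache[key])
        return results

p = IncrementalParser()
p.parse(["int x = 1;", "int y = 2;"])
p.parse(["int x = 1;", "int y = 3;"])      # only the second block is reparsed
```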

Design of OpenCV based Finger Recognition System using binary processing and histogram graph

  • Baek, Yeong-Tae; Lee, Se-Hoon; Kim, Ji-Seong
    • Journal of the Korea Society of Computer and Information / v.21 no.2 / pp.17-23 / 2016
  • An NUI is a motion-based interface: it uses the user's body to control a device, without HID devices such as a mouse and keyboard. In this paper, we use a Pi Camera and sensors connected to a small embedded board, the Raspberry Pi. Using OpenCV algorithms optimized for image recognition and computer vision, we implement an NUI device that is more human-friendly and intuitive than traditional HID equipment. Motion is detected by frame-comparison operations, and we propose a more advanced motion-sensing and recognition system that fuses the camera with the sensors connected to the Raspberry Pi.
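
A minimal sketch in the spirit of this abstract's binary-image and histogram approach: binarize a frame and count raised fingers from a column-projection histogram. The thresholds, and the assumption that fingers occupy the upper half of the frame, are illustrative rather than the authors' exact pipeline.

```python
import cv2
import numpy as np

def count_fingers(frame_bgr, gray_thresh=60):
    """Estimate the number of raised fingers in a BGR frame (rough heuristic)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, gray_thresh, 255, cv2.THRESH_BINARY)

    top = binary[: binary.shape[0] // 2, :]    # assume fingers lie in the upper half
    histogram = (top > 0).sum(axis=0)          # white-pixel count per column

    tall = histogram > 0.25 * top.shape[0]     # columns tall enough to be a finger
    # Count runs of consecutive "tall" columns; each run is treated as one finger.
    edges = np.diff(np.concatenate(([0], tall.astype(int))))
    return int(np.count_nonzero(edges == 1))
```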

Nanolithography Using Haptic Interface in a Nanoscale Virtual Surface (햅틱인터페이스를 이용한 나노스케일 가상표면에서의 나노리소그래피)

  • Kim Sung-Gaun
    • Journal of the Korean institute of surface engineering / v.39 no.2 / pp.64-69 / 2006
  • Nanoscale tasks such as nanolithography and nanoindenting are challenging because they lie beyond the capabilities of human sensing and precision. Since surface and intermolecular forces dominate over gravity and the other, more intuitive forces of the macro world at the nanoscale, users are unfamiliar with these nanoforce effects. To overcome this scaling barrier, haptic interfaces providing visual and force feedback at the macro scale have been used with an Atomic Force Microscope (AFM) as the manipulator at the nanoscale. In this paper, a nanoscale virtual coupling (NSVC) concept is introduced, and the relationship between performance and the impedance scaling factors for velocity (or position) and force is represented explicitly. Experiments on nanoindenting and nanolithography with different materials have been performed on a nanoscale virtual surface. The interaction forces (non-contact and contact nanoforces) between the AFM tip and the nano sample are transmitted to the operator through the haptic interface.
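
A minimal sketch of the bilateral scaling idea the abstract describes: operator motions are scaled down to nanoscale tip motions, and measured nanoforces are scaled up to perceivable feedback forces. The scaling factors are illustrative assumptions, not the paper's NSVC parameters.

```python
K_POSITION = 1e-6   # assumed: metres of hand motion -> metres of AFM tip motion
K_FORCE = 1e9       # assumed: amplification of the measured tip force for the operator

def to_tip_position(haptic_position_m: float) -> float:
    """Scale the operator's hand displacement down to a tip displacement."""
    return K_POSITION * haptic_position_m

def to_haptic_force(tip_force_nN: float) -> float:
    """Scale a measured tip force (in nN) up to a feedback force (in N)."""
    return K_FORCE * tip_force_nN * 1e-9    # nN -> N, then amplify for perception

# Example: a 2 nN contact force at the tip is rendered as a 2 N feedback force.
print(to_haptic_force(2.0))
```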