• Title/Summary/Keyword: Intuitive interface


A Study of Hand Gesture Recognition for Human Computer Interface (컴퓨터 인터페이스를 위한 Hand Gesture 인식에 관한 연구)

  • Chang, Ho-Jung;Baek, Han-Wook;Chung, Chin-Hyun
    • Proceedings of the KIEE Conference / 2000.07d / pp.3041-3043 / 2000
  • The GUI (graphical user interface) has been the dominant platform for HCI (human-computer interaction). The GUI-based style of interaction has made computers simpler and easier to use. However, the GUI does not easily support the range of interaction needed to meet users' needs in a natural, intuitive, and adaptive way. In this paper we study an approach that tracks a hand in an image sequence and recognizes it in each video frame, replacing the mouse as a pointing device for virtual reality. An algorithm for real-time processing is proposed that estimates the hand position and segments the hand region, considering the orientation of motion and the color distribution of the hand region (see the sketch after this entry).

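A minimal sketch of the kind of per-frame hand localization the abstract describes, combining skin-color segmentation with a centroid estimate. It assumes OpenCV 4 with Python bindings, a webcam as the image source, and fixed HSV skin thresholds; these are illustrative choices, not the authors' actual algorithm, which also uses motion orientation.

```python
# Hypothetical sketch: per-frame hand localization by skin-color segmentation.
# The HSV thresholds and the webcam source are illustrative assumptions.
import cv2
import numpy as np

SKIN_LOW = np.array([0, 40, 60], dtype=np.uint8)      # assumed lower HSV bound
SKIN_HIGH = np.array([25, 255, 255], dtype=np.uint8)  # assumed upper HSV bound

def hand_centroid(frame_bgr):
    """Return (x, y) of the largest skin-colored blob, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    mask = cv2.medianBlur(mask, 5)                     # suppress speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)          # assume hand = largest blob
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        pos = hand_centroid(frame)
        if pos is not None:
            cv2.circle(frame, pos, 8, (0, 255, 0), -1)  # mark estimated position
        cv2.imshow("hand", frame)
        if cv2.waitKey(1) & 0xFF == 27:                 # ESC quits
            break
    cap.release()
    cv2.destroyAllWindows()
```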

A Design and Implementation of DB System for Providing Comprehensive Information Retrieval Service about Social Economic Information

  • Lee, Cheol-won;Jeon, Heung Seok;Noh, Younghee
    • Journal of the Korea Society of Computer and Information / v.22 no.5 / pp.73-79 / 2017
  • In this paper, we propose the design and implementation of a DB system that provides a comprehensive information retrieval service for social economic information. We classify social economic organizations into 6 major categories and 35 small categories. The DB contains 25,938 social economic organizations in total. The DB system provides a simple and intuitive interface for searching information, offering keyword search, category search, and initial search (see the sketch after this entry). The fully loaded webpage of the implemented DB system achieved a load time of 2.3 s from a server in Vancouver, Canada.
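
A small sketch of the three lookup modes named in the abstract (keyword, category, and initial search) over an in-memory list; the record fields, sample organizations, and the interpretation of "initial search" as first-character matching are assumptions, not details from the paper.

```python
# Hypothetical sketch of keyword, category, and initial-character search.
from dataclasses import dataclass

@dataclass
class Organization:
    name: str
    major_category: str   # one of the 6 major categories (assumed field)
    minor_category: str   # one of the 35 small categories (assumed field)

ORGS = [
    Organization("Green Coop", "cooperative", "consumer"),
    Organization("Hope Village Enterprise", "social enterprise", "employment"),
]

def keyword_search(orgs, keyword):
    return [o for o in orgs if keyword.lower() in o.name.lower()]

def category_search(orgs, major, minor=None):
    return [o for o in orgs
            if o.major_category == major and (minor is None or o.minor_category == minor)]

def initial_search(orgs, initial):
    return [o for o in orgs if o.name and o.name[0].lower() == initial.lower()]

print([o.name for o in keyword_search(ORGS, "coop")])             # ['Green Coop']
print([o.name for o in category_search(ORGS, "social enterprise")])
print([o.name for o in initial_search(ORGS, "H")])
```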

A Light-weight ANN-based Hand Motion Recognition Using a Wearable Sensor (웨어러블 센서를 활용한 경량 인공신경망 기반 손동작 인식기술)

  • Lee, Hyung Gyu
    • IEMEK Journal of Embedded Systems and Applications / v.17 no.4 / pp.229-237 / 2022
  • Motion recognition is very useful for implementing an intuitive HMI (Human-Machine Interface). In particular, the hands are the body parts that can move most precisely with a relatively small amount of energy, so hand motion has been used as an efficient communication interface with other people and machines. In this paper, we design and implement a light-weight ANN (Artificial Neural Network)-based hand motion recognition system using a state-of-the-art flex sensor. The proposed design consists of data collection from a wearable flex sensor, preprocessing filters, and a light-weight NN (Neural Network) classifier (see the sketch after this entry). To verify the performance and functionality of the proposed design, we implement it on a low-end embedded device. Finally, our experiments and prototype implementation demonstrate that the accuracy of the proposed hand motion recognition reaches up to 98.7%.
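
A minimal sketch of the described pipeline (sensor stream, preprocessing filter, light-weight neural-network classifier) using NumPy; the layer sizes, moving-average window, and untrained random weights are placeholders, not the paper's trained model.

```python
# Sketch: flex-sensor samples -> smoothing filter -> tiny MLP -> motion class.
import numpy as np

WINDOW = 5          # moving-average window (assumed)
N_INPUT = 16        # flattened window of sensor readings (assumed)
N_HIDDEN = 8
N_CLASSES = 4       # number of hand motions to distinguish (assumed)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(N_INPUT, N_HIDDEN)) * 0.1   # placeholder weights
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(size=(N_HIDDEN, N_CLASSES)) * 0.1
b2 = np.zeros(N_CLASSES)

def preprocess(raw):
    """Simple moving-average filter over a 1-D stream of flex readings."""
    kernel = np.ones(WINDOW) / WINDOW
    return np.convolve(raw, kernel, mode="valid")

def classify(features):
    """Forward pass of a tiny two-layer network (ReLU + argmax)."""
    h = np.maximum(0.0, features @ W1 + b1)
    logits = h @ W2 + b2
    return int(np.argmax(logits))

raw_stream = rng.uniform(0.0, 3.3, size=N_INPUT + WINDOW - 1)  # fake sensor voltages
features = preprocess(raw_stream)          # length == N_INPUT after filtering
print("predicted motion class:", classify(features))
```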

NUI/NUX framework based on intuitive hand motion (직관적인 핸드 모션에 기반한 NUI/NUX 프레임워크)

  • Lee, Gwanghyung;Shin, Dongkyoo;Shin, Dongil
    • Journal of Internet Computing and Services / v.15 no.3 / pp.11-19 / 2014
  • The natural user interface/experience (NUI/NUX) provides a natural motion interface without devices or tools such as mice, keyboards, pens, and markers. Until now, typical motion recognition methods have used markers, taking the coordinates of each marker as relative input data and storing each coordinate value in a database. However, recognizing motion accurately requires more markers, and much time is spent attaching markers and processing the data. In addition, because NUI/NUX frameworks have been developed without the most important property, intuitiveness, usability problems arise and users are forced to learn the usage of many different NUI/NUX frameworks. To address this problem, in this paper we avoid markers and implement the framework so that anyone can use it. We also design a multi-modal NUI/NUX framework that controls voice, body motion, and facial expression simultaneously, and propose a new mouse-operation algorithm that recognizes intuitive hand gestures and maps them onto the monitor (see the sketch after this entry). We implement it so that users can perform the "hand mouse" operation easily and intuitively.
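
A small sketch of the hand-mouse mapping idea: a hand position in normalized camera coordinates is mapped to monitor pixels. The monitor resolution and dead-zone margin are assumptions for illustration, not values from the framework.

```python
# Hypothetical "hand mouse" mapping: normalized hand position -> monitor pixel.
MONITOR_W, MONITOR_H = 1920, 1080
MARGIN = 0.1   # ignore the outer 10% of the camera frame so screen edges stay reachable

def hand_to_cursor(nx, ny):
    """Map a normalized hand position (0..1, 0..1) to a monitor pixel."""
    # Rescale the usable central region [MARGIN, 1-MARGIN] to the full screen.
    sx = min(max((nx - MARGIN) / (1.0 - 2.0 * MARGIN), 0.0), 1.0)
    sy = min(max((ny - MARGIN) / (1.0 - 2.0 * MARGIN), 0.0), 1.0)
    return int(sx * (MONITOR_W - 1)), int(sy * (MONITOR_H - 1))

print(hand_to_cursor(0.5, 0.5))    # -> (959, 539), roughly the screen center
print(hand_to_cursor(0.05, 0.95))  # clamped: x to the left edge, y to the bottom edge
```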

Comparative Study on the Interface and Interaction for Manipulating 3D Virtual Objects in a Virtual Reality Environment (가상현실 환경에서 3D 가상객체 조작을 위한 인터페이스와 인터랙션 비교 연구)

  • Park, Kyeong-Beom;Lee, Jae Yeol
    • Korean Journal of Computational Design and Engineering / v.21 no.1 / pp.20-30 / 2016
  • Recently, immersive virtual reality (VR) has become popular due to advances in I/O interfaces and related software for effectively constructing VR environments. In particular, natural and intuitive manipulation of 3D virtual objects is still considered one of the most important user interaction issues. This paper presents a comparative study on the manipulation and interaction of 3D virtual objects using different interfaces and interactions in three VR environments. The comparative study includes both quantitative and qualitative aspects. The three experimental setups are 1) a typical desktop-based VR using mouse and keyboard, 2) a hand gesture-supported desktop VR using a Leap Motion sensor, and 3) an immersive VR in which the user wears an HMD and interacts through hand gestures using a Leap Motion sensor. In the desktop VR with hand gestures, the Leap Motion sensor is placed on the desk. In the immersive VR, the sensor is mounted on the HMD so that the user can manipulate virtual objects in front of the HMD. For the quantitative analysis, task completion time and success rate were measured (see the sketch after this entry). The experimental tasks require complex 3D transformations such as simultaneous 3D translation and 3D rotation. For the qualitative analysis, various factors relating to user experience, such as ease of use, natural interaction, and stressfulness, were evaluated. The qualitative and quantitative analyses show that the immersive VR with natural hand gestures provides more intuitive and natural interaction and supports fast and effective task completion, but causes more stressful conditions.
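
A tiny sketch of the quantitative part of such a comparison, computing mean task completion time and success rate per interface condition; the trial values are fabricated placeholders for illustration only, not results from this study.

```python
# Hypothetical per-condition summary of (completion_time_s, success) trials.
from statistics import mean

trials = {
    "desktop mouse/keyboard": [(12.4, True), (15.1, True), (18.0, False)],
    "desktop + Leap Motion":  [(11.2, True), (13.5, True), (14.8, True)],
    "HMD + Leap Motion":      [(9.8, True), (10.4, True), (12.1, True)],
}

for condition, results in trials.items():
    times = [t for t, _ in results]
    successes = [ok for _, ok in results]
    print(f"{condition}: mean time {mean(times):.1f}s, "
          f"success rate {100.0 * sum(successes) / len(successes):.0f}%")
```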

ARtalet for Digilog Book Authoring Tool - Authoring 3D Objects Properties (디지로그 북 저작도구 ARtalet - 3 차원 객체 속성 저작)

  • Ha, Tae-Jin;Lee, Youg-Ho;Woo, Woon-Tack
    • Proceedings of the HCI Society of Korea Conference / 2008.02a / pp.314-318 / 2008
  • This paper presents an authoring interface for augmented/mixed reality based books, specifically for authoring the 3D object properties of a Digilog book. We aim to let ordinary users without professional programming knowledge create Digilog books easily. The authoring interface for 3D object properties includes a manipulator as an input device and 3D content authoring parts. As an interface design metaphor, the existing GUI, already familiar to computer users, is referenced. The manipulator generates the continuous/discrete input signals necessary for the authoring interface. The content authoring part performs selection, positioning, scaling, coloring, and copying of virtual objects using the input signals of the manipulator (see the sketch after this entry). Users can also exploit existing GUI metaphors, including pointing, clicking, drag and drop, and copying, with the manipulator. Therefore, we believe our AR authoring system can support rapid and intuitive modification of virtual object properties.

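A hypothetical sketch of the authoring operations listed above (selection, positioning, scaling, coloring, copying) driven by a manipulator's discrete and continuous signals; the class and method names are illustrative, not the ARtalet API.

```python
# Illustrative authoring operations on virtual-object properties.
from dataclasses import dataclass, replace

@dataclass
class VirtualObject:
    name: str
    position: tuple = (0.0, 0.0, 0.0)
    scale: float = 1.0
    color: tuple = (255, 255, 255)

class AuthoringSession:
    def __init__(self, objects):
        self.objects = list(objects)
        self.selected = None

    def select(self, name):          # discrete manipulator signal: "click"
        self.selected = next(o for o in self.objects if o.name == name)

    def move(self, dx, dy, dz):      # continuous manipulator signal: "drag"
        x, y, z = self.selected.position
        self.selected.position = (x + dx, y + dy, z + dz)

    def rescale(self, factor):       # continuous manipulator signal
        self.selected.scale *= factor

    def recolor(self, rgb):          # discrete manipulator signal
        self.selected.color = rgb

    def copy(self):                  # discrete manipulator signal: "copy"
        clone = replace(self.selected, name=self.selected.name + "_copy")
        self.objects.append(clone)
        return clone

session = AuthoringSession([VirtualObject("tree")])
session.select("tree")
session.move(0.1, 0.0, 0.2)
session.rescale(1.5)
session.recolor((0, 128, 0))
print(session.copy())
```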

Mobile Shooting Game with Intuitive UI and Recommendation function (직관적 UI와 추천 기능을 가진 모바일 슈팅 게임)

  • Junsu Kim;Kuil Jung;Seokjun Yoon;In-Hwan Jung;Jae-Moon Lee;Kitae Hwang
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.23 no.5 / pp.191-197 / 2023
  • Mobile shooting games are a representative example of PC games being ported to mobile devices as they are. In most mobile shooting games, the joystick-like UI used in PC games has been mapped to touch buttons, but because the display is small, the user's fingers cover the game screen, which is inconvenient. To overcome the limitations of the small display and increase the immersion of mobile shooting games, this paper introduces a user interface that integrates character movement and aiming, along with intuitive UI elements such as display rotation, shaking, and vibration. In addition, to add fun to the game, the match process of each round is analyzed to identify the character's insufficient abilities, and synergies that supplement those abilities are recommended (see the sketch after this entry). This paper shows that the proposed goals are achieved by actually designing and implementing a mobile shooting game with the proposed functions on an Android smartphone.
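
A small sketch of the recommendation idea: from one round's statistics, find the character's weakest ability and suggest a synergy that supplements it. The ability names and the synergy table are assumptions, not the game's actual data.

```python
# Hypothetical per-round recommendation: weakest ability -> suggested synergy.
SYNERGY_FOR_ABILITY = {
    "attack":   "damage-boost synergy",
    "defense":  "shield synergy",
    "mobility": "speed synergy",
    "accuracy": "aim-assist synergy",
}

def recommend_synergy(round_stats):
    """round_stats maps ability name -> normalized score (0..1) for one round."""
    weakest = min(round_stats, key=round_stats.get)
    return weakest, SYNERGY_FOR_ABILITY[weakest]

last_round = {"attack": 0.7, "defense": 0.3, "mobility": 0.6, "accuracy": 0.8}
ability, synergy = recommend_synergy(last_round)
print(f"weakest ability: {ability} -> recommend {synergy}")
```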

An Expert System for the Real-Time Computer Control of the Large-Scale System (대규모 시스템의 실시간 컴퓨터 제어를 위한 전문가 시스템)

  • Ko, Yun-Seok
    • The Transactions of the Korean Institute of Electrical Engineers A / v.48 no.6 / pp.781-788 / 1999
  • In this paper, an expert system is proposed that can be effectively applied to large-scale systems with diverse time constraints, diverse objectives, and an unfixed system structure. The inference scheme of the expert system has an integrated structure composed of an intuitive inference module and a logical inference module in order to support the operating constraints of the system effectively. The intuitive inference module is designed using pattern matching or pattern recognition to find an identical or similar pattern under a fixed system structure. The logical inference module, on the other hand, is designed as a structure with multiple inference modes based on heuristic search, both to determine optimal or near-optimal control strategies that satisfy the time constraints for system events under an unfixed system structure and to serve as a knowledge generator. The inference modes consist of the best-first, local-minimum tree, breadth-iterative, and limited search width/time methods (see the sketch after this entry). Finally, application results for a large-scale distribution SCADA system show that the inference scheme of the expert system is very effective for large-scale systems. The expert system is implemented in the C language with dynamic memory allocation, a database interface, and compatibility in mind.

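A minimal sketch of one of the listed inference modes, greedy best-first search, over a toy restoration graph; the states, costs, and heuristic values are illustrative assumptions, not the paper's knowledge base, and the paper's own implementation is in C rather than Python.

```python
# Greedy best-first search over an assumed toy state graph.
import heapq

GRAPH = {  # state -> [(neighbor, cost), ...]  (illustrative only)
    "fault":     [("isolate", 2), ("reroute_a", 5)],
    "isolate":   [("reroute_a", 2), ("reroute_b", 4)],
    "reroute_a": [("restored", 3)],
    "reroute_b": [("restored", 1)],
}
HEURISTIC = {"fault": 5, "isolate": 3, "reroute_a": 3, "reroute_b": 1, "restored": 0}

def best_first(start, goal):
    """Always expand the frontier node with the lowest heuristic estimate."""
    frontier = [(HEURISTIC[start], start, [start])]
    visited = set()
    while frontier:
        _, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, _cost in GRAPH.get(node, []):
            heapq.heappush(frontier, (HEURISTIC[neighbor], neighbor, path + [neighbor]))
    return None

print(best_first("fault", "restored"))  # -> ['fault', 'isolate', 'reroute_b', 'restored']
```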

The Study of Design Thinking as Foundation of Multidisciplinary Education (다학제 교육의 근간으로서 '디자인 사고'에 대한 연구)

  • Park, Sung-Mi;Kim, Sue-Hwa
    • Journal of Fisheries and Marine Sciences Education / v.25 no.1 / pp.260-273 / 2013
  • This study aims to reflect experts' opinions in analyzing design thinking as a foundation of multidisciplinary education. For this purpose, a delphi survey was conducted with 20 experts in three sessions from May 1 to June 25, 2012. To analyze the collected data, descriptive statistics, including frequency, percentage, mean, and standard deviation, were computed, and an internal reliability test on the survey instrument was carried out. The main results are as follows. First, the delphi analysis on the intuitive thinking of design thinking suggested 7 items (pursuing outside possibilities, pursuing the possibility of applying new forms of technology, content planning, facing complex real-world phenomena, etc.). Second, the delphi analysis on the logical thinking of design thinking suggested 7 items (repeated execution, reasoning and verification, artificial intelligence, a decision support system, etc.). Third, the delphi analysis on the subjective thinking of design thinking suggested 9 items (user experience measurement, user satisfaction ratings, user requirements analysis, user interface design, behavioral responses of humans, etc.). Fourth, the delphi analysis on the objective information of design thinking suggested 8 items (information management system, simulation, production process, information exchange and sharing, etc.). According to the results of the delphi analysis, design thinking can be seen as the foundation of multidisciplinary education. Suggestions were made for discussion of the main results and further research.

Implementation of a Smartphone Interface for a Personal Mobility System Using a Magnetic Compass Sensor and Wireless Communication (지자기 센서와 무선통신을 이용한 PMS의 스마트폰 인터페이스 구현)

  • Kim, Yeongyun;Kim, Dong Hun
    • Journal of the Korean Institute of Intelligent Systems / v.25 no.1 / pp.48-56 / 2015
  • In this paper, a smartphone-controlled personal mobility system (PMS) based on a compass sensor is developed. The use of a magnetic compass sensor makes the PMS move according to the heading direction of the smartphone held by the rider. The proposed smartphone-controlled PMS allows a more intuitive interface than a PMS controlled by pushing buttons. The magnetic compass sensor also plays a role in compensating for the mechanical characteristics of the motors mounted on the PMS. For adequate control of the robot, two methods based on the magnetic compass sensor and wireless communication are presented: an absolute direction method and a relative direction method (see the sketch after this entry). Experimental results show that the PMS is conveniently and effectively controlled by the two proposed methods.
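
A small sketch of the two direction methods named above: the absolute method steers toward the smartphone's current compass heading, while the relative method steers by the change in heading since a reference was set. The function names and the turn-rate gain are assumptions for illustration.

```python
# Hypothetical absolute vs. relative heading control from compass readings.
def angle_diff(target_deg, current_deg):
    """Smallest signed difference between two headings, in (-180, 180]."""
    d = (target_deg - current_deg + 180.0) % 360.0 - 180.0
    return d if d != -180.0 else 180.0

def absolute_command(phone_heading, pms_heading, gain=0.5):
    """Turn rate proportional to the error between phone and PMS headings."""
    return gain * angle_diff(phone_heading, pms_heading)

def relative_command(phone_heading, phone_heading_at_start, gain=0.5):
    """Turn rate proportional to how far the phone has been rotated."""
    return gain * angle_diff(phone_heading, phone_heading_at_start)

print(absolute_command(90.0, 70.0))    # 10.0  -> turn right
print(relative_command(350.0, 10.0))   # -10.0 -> turn left (wraps through 0/360)
```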