• Title/Summary/Keyword: Intuitive interface


Multi-modal Sense based Interface for Augmented Reality in Table Top Display (테이블 탑 디스플레이 기반 증강현실 구현을 위한 다중 감각 지원 인터페이스)

  • Jeong, Jong-Mun;Yang, Hyung-Jeong;Kim, Sun-Hee
    • Journal of Korea Multimedia Society
    • /
    • v.12 no.5
    • /
    • pp.708-716
    • /
    • 2009
  • Applications implemented on a Table Top Display are controlled by hand, so they offer users an intuitive interface. Users experience a sense of realism when they interact with the virtual scene on a Table Top Display. However, most conventional augmented reality applications on Table Top Displays address only the visual sense. In this paper, we propose an interface that supports multi-modal sensing, in which the tactile sense is exploited for augmented reality by vibrating a physical control unit when it collides with virtual objects. Users thus feel the collision in addition to seeing the visual scene. The proposed system demonstrates tactile augmented reality through an air hockey game: a physical control unit vibrates when it receives virtual collision data over wireless communication. Since the tabletop display environment is extended with a tactile physical unit beyond the hand alone, it provides a more intuitive interface.

  • PDF
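The collision-to-vibration loop described in this abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the message format, the puck's address, and the UDP transport are all assumptions, since the abstract only says that collision data is sent to the control unit over wireless communication.

```python
import socket
import struct

# Assumed address of the physical puck's wireless receiver (illustrative).
PUCK_ADDR = ("192.168.0.42", 9000)

def circles_collide(p1, r1, p2, r2):
    """2D circle-circle test, as a tabletop air-hockey scene might use."""
    dx, dy = p1[0] - p2[0], p1[1] - p2[1]
    return dx * dx + dy * dy <= (r1 + r2) ** 2

def send_vibration(sock, intensity):
    """Send a one-byte vibration command (0-255) to the puck."""
    sock.sendto(struct.pack("B", intensity), PUCK_ADDR)

def game_step(sock, puck_pos, puck_r, virtual_objects):
    """One frame of the loop: vibrate when the puck hits a virtual object."""
    for obj_pos, obj_r in virtual_objects:
        if circles_collide(puck_pos, puck_r, obj_pos, obj_r):
            send_vibration(sock, 255)  # full-strength pulse on contact
            return True
    return False
```

Each display frame runs `game_step` with the tracked puck position; the vibration command reaches the puck's actuator only on frames where a virtual collision occurs.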

The development of Intuitive User Interface and Control Software for Audio Mixer in Digital PA System (디지털전관방송시스템을 위한 오디오믹서의 직관적인 사용자 인터페이스 및 제어 소프트웨어 개발)

  • Kim, Kwan Woong;Cho, Juphil
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.11 no.3
    • /
    • pp.307-312
    • /
    • 2018
  • In this paper, we present the implementation of intuitive interface software for operating a digital PA (Public Address) controller and evaluate the performance of the audio mixer control part. The developed user interface software provides maintenance, management, and control functions for a digital hybrid mixer. Loaded on the integrated control server, this software controls the sound status of the TAD-168M audio mixer and checks device status for the integrated public address system; it also enables integrated control and continuous upgrades. The software communicates with the TAD-168M over Ethernet: a LAN cable links the operating PC's LAN port to the 4-port switch on the back of the TAD-168M. With this newly developed system, integrated control of the broadcasting system becomes available, including system management, audio control, and uplink broadcasting control.

SmartPuck System : Tangible Interface for Physical Manipulation of Digital Information (스마트 퍽 시스템 : 디지털 정보의 물리적인 조작을 제공하는 실감 인터페이스 기술)

  • Kim, Lae-Hyun;Cho, Hyun-Chul;Park, Se-Hyung
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.13 no.4
    • /
    • pp.226-230
    • /
    • 2007
  • In the conventional desktop PC environment, the keyboard and mouse process user input while the monitor displays visual information as the output device. To manipulate digital information, we move a virtual cursor to select the desired graphical icon on the monitor; the cursor represents the relative motion of the physical mouse on the desk. This desktop metaphor does not provide an intuitive interface through human sensation. In this paper, we introduce a novel tangible interface that allows the user to interact with computers using a physical tool called "SmartPuck". The SmartPuck system bridges the gap between human analog perception and response and digital information on the computer. The system consists of a PDP-based table display, the SmartPuck itself, equipped with a rotational part and a button for intuitive, tactile input, and a sensing system that tracks the SmartPuck's position. Finally, we show examples of working with the system.

A Comparison of the Characteristics between Single and Double Finger Gestures for Web Browsers

  • Park, Jae-Kyu;Lim, Young-Jae;Jung, Eui-S.
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.5
    • /
    • pp.629-636
    • /
    • 2012
  • Objective: The purpose of this study is to compare the characteristics of single- and double-finger gestures for web browsers and to extract appropriate finger gestures. Background: As electronic equipment is miniaturized to improve portability, various interfaces are being developed as input devices. Because devices are made smaller, gesture recognition with a touch-based interface is favored for easy editing. In addition, users primarily value the simplicity of intuitive interfaces, which propels further research on gesture-based interfaces. Finger gestures in particular are simple, fast, and user friendly. Recently, single- and double-finger gestures have become more popular, and more applications for them are being developed. However, the systems and software that employ such finger gestures lack consistency, and clear standards and guidelines are missing. Method: To learn how these gestures are applied, we used the sketch map method, a technique for memory elicitation, and the MIMA (Meaning in Mediated Action) method to evaluate the gesture interface. Results: This study created gestures suited to intuitive judgment. We conducted a usability test covering single- and double-finger gestures. The results showed that double-finger gestures had shorter performance times than single-finger gestures. Single-finger gestures showed a wide satisfaction gap between similar and different types: they can be judged intuitively for a similar type, but it is difficult to associate functions for a different type. Conclusion: This study found that double-finger gestures were effective for associating functions for web navigation, especially for complex forms such as curve-shaped gestures.
Application: This study aims to facilitate the design of products that utilize finger and hand gestures.

Interface Mapping and Generation Methods for Intuitive User Interface and Consistency Provision (사용자 인터페이스의 직관적인 인식 및 일관성 부여를 위한 인터페이스 매핑 및 생성 기법)

  • Yoon, Hyo-Seok;Woo, Woon-Tack
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2009.02a
    • /
    • pp.135-139
    • /
    • 2009
  • In this paper we present INCUI, a user interface based on a natural view of the physical user interface of target devices and services in a pervasive computing environment. We present the concept of an Intuitively Natural and Consistent User Interface (INCUI), consisting of an image of the physical user interface and a descriptive XML file. We then elaborate how an INCUI template can be used to map user interface components consistently, both structurally and visually. We describe the INCUI mapping process and a novel mapping-method selection architecture based on the domain size and the types of the source and target INCUI. In particular, we developed and applied an extended LCS-based algorithm using prefix/postfix/synonym matching for similarity calculation.

  • PDF
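The similarity calculation this abstract mentions can be illustrated with a classic longest-common-subsequence (LCS) score over normalized UI component labels. This is a sketch under stated assumptions: the synonym table and the normalization step are hypothetical stand-ins for the paper's prefix/postfix/synonym extension, whose exact details the abstract does not give.

```python
# Assumed synonym map; the paper's actual table is not published here.
SYNONYMS = {"volume": "vol", "btn": "button"}

def normalize(token):
    """Map a UI label to a canonical lowercase form before comparison."""
    return SYNONYMS.get(token.lower(), token.lower())

def lcs_length(a, b):
    """Classic dynamic-programming longest common subsequence length."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if x == y else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

def similarity(src_labels, dst_labels):
    """LCS length over normalized label sequences, scaled to [0, 1]."""
    a = [normalize(t) for t in src_labels]
    b = [normalize(t) for t in dst_labels]
    longest = max(len(a), len(b))
    return lcs_length(a, b) / longest if longest else 1.0
```

With such a score, a mapping architecture could pick, for each source component, the target component whose label sequence scores highest, so that structurally similar interfaces map onto each other consistently.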

A Graphical User Interface Design for Surveillance and Security Robot (감시경계 로봇의 그래픽 사용자 인터페이스 설계)

  • Choi, Duck-Kyu;Lee, Chun-Woo;Lee, Choonjoo
    • The Journal of Korea Robotics Society
    • /
    • v.10 no.1
    • /
    • pp.24-32
    • /
    • 2015
  • This paper introduces a graphical user interface design intended for the surveillance and security robot, the pilot program for the army's unmanned light combat vehicle. To design an efficient graphical user interface between user and robot that can accomplish the designated mission, it is essential to consider the activities of robot users under a changing security environment. The proposed approach first identifies the user activities needed to accomplish the mission in standardized scenarios of military surveillance and security operations, and then develops a hierarchy of the interface elements required to execute the tasks in those scenarios. The developed graphical user interface includes input control, navigation, information display, and accordion components, and was verified by potential users of various skill levels with military backgrounds. The assessment showed that the new user interface includes all the critical elements needed to execute the mission and is simpler and more intuitive than the legacy design, which focused on technical and functional information and was more informative to system development engineers than to field users.

Cubic Tangible User Interface Development for Mobile Environment (모바일 환경을 위한 큐빅형 텐저블 사용자 인터페이스 개발)

  • Ok, Soo-Yol
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.26 no.10
    • /
    • pp.32-39
    • /
    • 2009
  • Most mobile devices provide limited input interfaces in order to maximize mobility and portability. In this paper, the author proposes a small cubic-shaped tangible input interface that tracks location, direction, and velocity using MEMS sensor technology to overcome the physical limitations of the poor input devices in mobile computing environments. As the preliminary phase of implementing the proposed tangible input interface, its prototype design and implementation methods are described. Various experiments, such as menu manipulation, 3-dimensional content control, and sensor data visualization, have been performed to verify the validity of the proposed interface. The proposed tangible device enables direct and intuitive manipulation. Mobile computing will become more widespread, and various kinds of new content will emerge in the near future. The proposed interface can be successfully employed for new content services that cannot easily be implemented because of the limitations of current input devices, and this kind of interface will be a critical component of future mobile communication environments. The proposed tangible interface will be further improved for application to various content manipulation tasks, including 2D/3D games.

Data model of Multimodal Visual Interface (멀티모달 비주얼 인터페이스의 테이터형)

  • Malyanov, Ilya;d'Auriol, Brian J.;Lee, Sung-Young;Lee, Young-Koo
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2011.06b
    • /
    • pp.240-241
    • /
    • 2011
  • Contemporary electronic healthcare systems are becoming more and more complex, providing users with broad functionality, but they often fail to offer accessible interfaces. Yet a good interface is nearly as important as the rest of the system. The goal of our research is the development of an intuitive multimodal interface for a healthcare system; this paper discusses the data model of the interface.

Human Indicator and Information Display using Space Human Interface in Networked Intelligent Space

  • Jin Tae-Seok;Niitsuma Mihoko;Hashimoto Hideki
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.15 no.5
    • /
    • pp.632-638
    • /
    • 2005
  • This paper describes a new data-handling approach, based on a Spatial Human Interface acting as a human indicator, for Spatial-Knowledge-Tags (SKT) in spatial memory. The Spatial Human Interface (SHI) is a new system that facilitates human activity in a working environment. The SHI stores human activity data, as knowledge and activity history, in the Spatial Memory of the working environment, the three-dimensional space where one acts, and loads it via Spatial-Knowledge-Tags (SKT) to support the enhancement of human activity. The purpose of the SHI is to construct a new relationship between humans and distributed networked computers and sensors based on intuitive and simultaneous interactions. This paper explains the specific functions of SKT and the method of realizing them. The utility of SKT is demonstrated in designing a robot motion control.

Design of Gaming Interaction Control using Gesture Recognition and VR Control in FPS Game (FPS 게임에서 제스처 인식과 VR 컨트롤러를 이용한 게임 상호 작용 제어 설계)

  • Lee, Yong-Hwan;Ahn, Hyochang
    • Journal of the Semiconductor & Display Technology
    • /
    • v.18 no.4
    • /
    • pp.116-119
    • /
    • 2019
  • User interface/experience and realistic game manipulation play an important role in virtual reality first-person-shooter (FPS) games. This paper presents an intuitive, hands-free gaming interaction scheme for FPS games based on the user's gesture recognition and VR controllers. We focus on the conventional interface for VR FPS interaction, and design the player interaction with a head-mounted display and two motion controllers: a Leap Motion to handle low-level physics interaction and a VIVE tracker to control the movement of the player's joints in the VR world. The FPS prototype system shows that the designed interface helps players enjoy an immersive FPS and gives them a new gaming experience.