• Title/Summary/Keyword: Simple user interface


Teleoperation of Field Mobile Manipulator with Wearable Haptic-based Multi-Modal User Interface and Its Application to Explosive Ordnance Disposal

  • Ryu Dongseok;Hwang Chang-Soon;Kang Sungchul;Kim Munsang;Song Jae-Bok
    • Journal of Mechanical Science and Technology / v.19 no.10 / pp.1864-1874 / 2005
  • This paper describes the design and implementation of a wearable multi-modal user interface for a teleoperated field robot system. Teleoperated field robots have recently been employed for hazardous-environment applications (e.g., rescue, explosive ordnance disposal, and security). To complete such missions outdoors, the robot system must provide appropriate functions, accuracy, and reliability. However, the more functions it has, the more difficult those functions become to operate, so an effective user interface is needed; moreover, the interface should be wearable for portability and prompt action. This research starts from the question of how to teleoperate a complicated slave robot easily. The main challenge is to build a simple and intuitive user interface in a wearable shape and size. The interface provides multiple modalities (visual, auditory, and haptic), enabling an operator to control every function of the field robot more intuitively. Finally, an EOD (explosive ordnance disposal) demonstration is conducted to verify the validity of the proposed wearable multi-modal user interface.

Controlling Position of Virtual Reality Contents with Mouth-Wind and Acceleration Sensor

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information / v.24 no.4 / pp.57-63 / 2019
  • In this paper, we propose a new framework to control VR (virtual reality) content in real time using the user's mouth-wind and the acceleration sensor of a mobile device. User interaction technology is important in VR, but user interface methods are still limited; most interaction techniques rely on hand touches, screen touches, or motion recognition. We propose a new interface technology that interacts with VR content in real time by combining the user's mouth-wind with the acceleration sensor. The direction of the mouth-wind is determined from the angle and position between the user and the mobile device, and the control position is adjusted using the device's acceleration sensor. Noise in the measured mouth-wind intensity is smoothed with a simple average filter. To demonstrate the superiority of the proposed technology, we show results of interacting with game and simulation content in real time by applying the control position and the mouth-wind external force to the game.
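
The abstract mentions smoothing the mouth-wind intensity with a simple average filter and steering a control position with the device's acceleration sensor. The paper's exact formulation is not given here, so the following Python sketch uses invented class and parameter names (window size, gain) purely for illustration:

```python
from collections import deque

class MouthWindController:
    """Hypothetical controller: smooths mouth-wind intensity with a moving
    average and nudges a 2D control position along the device-tilt direction."""

    def __init__(self, window: int = 5, gain: float = 0.1):
        self.samples = deque(maxlen=window)   # recent mouth-wind intensities
        self.gain = gain
        self.position = [0.0, 0.0]            # control position in the VR scene

    def update(self, wind_intensity: float, accel_xy: tuple[float, float]):
        # Simple average filter over the last `window` intensity samples.
        self.samples.append(wind_intensity)
        smoothed = sum(self.samples) / len(self.samples)

        # Direction from the acceleration sensor (device tilt),
        # magnitude from the smoothed mouth-wind intensity.
        ax, ay = accel_xy
        norm = (ax * ax + ay * ay) ** 0.5 or 1.0
        self.position[0] += self.gain * smoothed * ax / norm
        self.position[1] += self.gain * smoothed * ay / norm
        return tuple(self.position)

# Example: one sensor update (blow strength 0.8, device tilted to the right)
controller = MouthWindController()
print(controller.update(0.8, (0.9, 0.1)))
```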

A Vision System for the Inspection of Shaft Worm (비전 시스템을 이용한 샤프트 웜 외관검사기 개발)

  • Bark, Jun-Sung;Kim, Tae-Ken;Kim, Han-Su;Yang, Woo-Suck
    • Proceedings of the KIEE Conference / 2004.11c / pp.184-186 / 2004
  • This paper describes a vision system for the automatic inspection of shaft worms. The system is composed of three parts: image acquisition, the vision algorithm, and the user interface. The image acquisition part consists of motor control, illumination, and optics. The vision algorithm inspects the parts by applying a labeling algorithm to the shaft image. The user interface is divided into two parts: one for feature registration with control-value settings and one for inspection operation. The automatic inspection system developed in this research is a tool for the final inspection of shaft worms and can be used in production lines with simple adjustments.
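
The abstract says the vision algorithm examines the parts with a labeling algorithm applied to the shaft image, but gives no details. As a rough illustration of that kind of blob screening, here is a minimal sketch using connected-component labeling with OpenCV; the thresholding choice, area limit, and pass/fail rule are assumptions, not the paper's actual parameters:

```python
import cv2
import numpy as np

def inspect_shaft_image(gray: np.ndarray,
                        min_area: int = 50,
                        max_defects: int = 0) -> bool:
    """Return True if the shaft image passes inspection.

    min_area and max_defects are hypothetical thresholds; the paper does
    not specify its labeling parameters.
    """
    # Binarize: candidate defects assumed to appear bright after Otsu thresholding.
    _, binary = cv2.threshold(gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Connected-component labeling of candidate defect blobs.
    num_labels, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)

    # Count blobs large enough to be treated as defects (label 0 is background).
    defects = sum(1 for i in range(1, num_labels)
                  if stats[i, cv2.CC_STAT_AREA] >= min_area)
    return defects <= max_defects
```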


Design and Implementation of Animated Simulation System (Animated Simulation 시스템 설계 및 구현)

  • 김상필;배영환
    • Proceedings of the IEEK Conference / 2000.11b / pp.128-131 / 2000
  • In this paper, an animated simulation system (Anisim) is proposed as an efficient tool for functional system verification. It displays the simulation results of the designed system as graphic animation using various models of the target system. From simple interface definitions given by the user, Anisim generates interface code automatically, and users can describe and model the target system with the generated interface code. Since the simulation engine is implemented in the C language, modeling is simple and simulation can be performed in real time.
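
The abstract states that Anisim generates interface code automatically from simple user-supplied interface definitions, without showing the definition format. A minimal sketch of this kind of stub generation, with an invented definition format and C-flavored output (not Anisim's actual scheme), might look like:

```python
# Hypothetical interface definition: signal name -> (direction, bit width).
INTERFACE = {
    "clk":   ("in", 1),
    "reset": ("in", 1),
    "data":  ("out", 8),
}

def generate_interface_stub(name: str, interface: dict) -> str:
    """Emit a C struct for the signals plus an empty evaluation function
    that the user would fill in with the target-system model."""
    fields = "\n".join(
        f"    unsigned int {sig} : {width}; /* {direction} */"
        for sig, (direction, width) in interface.items()
    )
    return (
        f"typedef struct {{\n{fields}\n}} {name}_io;\n\n"
        f"void {name}_eval({name}_io *io) {{\n"
        f"    /* user model code goes here */\n"
        f"}}\n"
    )

print(generate_interface_stub("counter", INTERFACE))
```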


VR User Interface using Multipurpose Visual Language System (다목적시각언어를 이용한 가상현실 사용자 인터페이스)

  • Kim, Youngjong
    • Journal of Korea Society of Digital Industry and Information Management / v.12 no.2 / pp.31-39 / 2016
  • This paper designs a virtual reality (VR) user interface, a topic of recent interest, using MVLS (Multipurpose Visual Language System). The proposed system is intended to let users approach this new kind of environment more easily; its main point is to reduce the number of actions needed to obtain a desired result, so that even users with no prior VR experience can easily build and use a VR environment. Because the system is based on MVLS, it can be applied not only to simple VR-specific applications but also to existing applications, which allows a wide range of content, such as TV and video, to be used within the VR environment. With the proposed system, the general public can experience virtual reality easily and quickly, which is expected to bring VR and 3D one step closer to the needs of both the public and industry.

English Input Keypad Method Using Picker-Based Interface

  • Kwon, Soon-Kak;Kim, Heung-Jun
    • Journal of Korea Multimedia Society / v.18 no.11 / pp.1383-1390 / 2015
  • With the development of mobile devices, touch screens provide a variety of character-input methods and flexible user interfaces. Currently, simple physical touches are widely used for English input, but they do little to increase the variety of inputs or the flexibility of the user interface. In this paper, we propose a new method to input English characters continuously by recognizing gestures instead of simple touches. The proposed method places rotational pickers on the screen, instead of keys, to change the alphabetical sequence, and inputs English characters through flick gestures and touches. Simulation results show that the proposed keypad method performs better than conventional keypads.
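
The abstract describes rotational pickers whose alphabetical sequence is rotated by flick gestures and whose centered character is committed by a touch. A minimal sketch of that interaction model (the class and the flick-to-step mapping are assumptions, not the paper's implementation) could be:

```python
import string

class RotaryPicker:
    """Hypothetical rotational picker: flicks rotate the alphabet window,
    a touch commits the character currently centered in the picker."""

    def __init__(self, letters: str = string.ascii_lowercase):
        self.letters = letters
        self.index = 0          # letter currently centered in the picker

    def flick(self, steps: int) -> None:
        # A flick rotates the alphabetical sequence by |steps| letters,
        # forward or backward depending on the flick direction.
        self.index = (self.index + steps) % len(self.letters)

    def touch(self) -> str:
        # A touch selects (inputs) the centered character.
        return self.letters[self.index]

# Example: flick forward 4 letters and touch -> 'e', flick back 3 -> 'b'
picker = RotaryPicker()
picker.flick(4)
print(picker.touch())   # 'e'
picker.flick(-3)
print(picker.touch())   # 'b'
```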

A Structured Method of User Data for User Interface Design in Home Network (홈 네트워크에서 UI 디자인을 위한 사용자 데이터 구조화에 관한 연구)

  • Jung, Ji-Hong;Kim, R.Young-Chul;Pan, Young-Hwan
    • Journal of the Ergonomics Society of Korea / v.26 no.2 / pp.61-66 / 2007
  • A networked home is connected to the external world through a high-speed network, and the devices inside the house are connected through wired and wireless networks. Acquiring user data is an essential step for designing the user interface in user-centered design. In a networked home, the number of use cases grows exponentially because connected use cases must be considered. Since the user data for a networked home are so complex, they should be acquired and analyzed with a structured methodology. We surveyed 40 people to acquire home context data and analyzed the data by 5W1H (Who, Where, What, When, Why, How). We then established a framework for the user data based on tasks, user, time, space, objects, and environment, and structured the home context data with this framework. The framework simplifies the home context and is helpful for user interface design in the home network.
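
The abstract describes a framework that structures home-context data by tasks, user, time, space, objects, and environment, aligned with 5W1H. A minimal sketch of one such record (the field names and the example entry are assumptions for illustration, not the paper's schema) might be:

```python
from dataclasses import dataclass, field

@dataclass
class HomeContextRecord:
    """Hypothetical record following the framework: task, user, time,
    space, objects, environment (mirroring Who/Where/What/When/Why/How)."""
    task: str                                           # what the user is doing
    user: str                                           # who
    time: str                                           # when
    space: str                                          # where in the house
    objects: list[str] = field(default_factory=list)    # devices involved
    environment: dict = field(default_factory=dict)     # ambient conditions

# Example: one surveyed home-context entry
record = HomeContextRecord(
    task="watch a movie",
    user="parent",
    time="evening",
    space="living room",
    objects=["TV", "speaker", "lighting"],
    environment={"lights": "dimmed"},
)
print(record)
```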

Development of Pen-type Haptic User Interface and Haptic Effect Design for Digilog Book Authoring (디지로그 북 저작을 위한 펜형 햅틱 사용자인터페이스의 개발)

  • Lee, Jun-Hun;Ha, Tae-Jin;Ryu, Je-Ha;Woo, Woon-Tak
    • Proceedings of the Korea HCI Society Conference / 2009.02a / pp.402-405 / 2009
  • The Digilog Book, a next-generation publication, supplies digitalized content on an analog book by integrating digital content into existing analog books. There have been studies on authoring tools for authoring and publishing books that provide digital content using VR or AR techniques. In this paper, a pen-type haptic user interface for the Digilog Book authoring tool is introduced. The haptic interface is developed to make authoring tasks more realistic and effective, providing haptic effects for tasks including translation, rotation, scaling, and menu selection. In this research, we designed the body, control circuits, and vibration haptic patterns of the haptic user interface, as well as a protocol between the haptic user interface and the Digilog Book main control system. A simple user study was also conducted with the developed haptic user interface.
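
The abstract mentions vibration haptic patterns for the authoring tasks and a protocol between the haptic user interface and the Digilog Book main control system, but publishes neither. The following sketch invents both a pattern table and a byte-level message layout purely to illustrate the idea:

```python
import struct

# Hypothetical vibration patterns (motor on/off step durations in ms) per
# authoring task; the paper's actual patterns and message format are not given.
HAPTIC_PATTERNS = {
    "translate": [30, 20, 30],
    "rotate":    [60],
    "scale":     [20, 20, 20, 20],
    "menu":      [100, 50, 100],
}

TASK_IDS = {name: i for i, name in enumerate(HAPTIC_PATTERNS)}

def encode_haptic_command(task: str, intensity: int) -> bytes:
    """Pack a haptic command as <task_id, intensity, n_steps, steps...>."""
    pattern = HAPTIC_PATTERNS[task]
    return struct.pack(f"<BBB{len(pattern)}H",
                       TASK_IDS[task], intensity, len(pattern), *pattern)

# Example: trigger the 'rotate' effect at 80% intensity
message = encode_haptic_command("rotate", 80)
print(message.hex())
```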


Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal / v.37 no.4 / pp.793-803 / 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
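
The abstract notes that the HMI UI framework is manipulated with a simple method based on four directions and one selection motion, regardless of which modality (speech, manipulating device, or hand gesture) produces the motion. A minimal sketch of such direction-plus-select navigation (the grid menu and item names are assumptions, not the paper's design) might look like:

```python
from enum import Enum, auto

class Motion(Enum):
    UP = auto(); DOWN = auto(); LEFT = auto(); RIGHT = auto(); SELECT = auto()

class MenuGrid:
    """Hypothetical grid menu driven by four directions plus one selection
    motion, independent of the modality that produced the motion."""

    def __init__(self, items, columns=3):
        self.items, self.columns, self.cursor = items, columns, 0

    def handle(self, motion: Motion):
        rows = -(-len(self.items) // self.columns)   # ceiling division
        r, c = divmod(self.cursor, self.columns)
        if motion is Motion.LEFT:
            c = (c - 1) % self.columns
        elif motion is Motion.RIGHT:
            c = (c + 1) % self.columns
        elif motion is Motion.UP:
            r = (r - 1) % rows
        elif motion is Motion.DOWN:
            r = (r + 1) % rows
        elif motion is Motion.SELECT:
            return self.items[self.cursor]
        self.cursor = min(r * self.columns + c, len(self.items) - 1)
        return None

# Example: move right once, then select -> 'Radio'
menu = MenuGrid(["Navigation", "Radio", "Phone", "Media", "Settings", "Climate"])
menu.handle(Motion.RIGHT)
print(menu.handle(Motion.SELECT))
```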

Glanceable and Informative WearOS User Interface for Kids and Parents

  • Kim, Siyeon;Yoon, Hyoseok
    • Journal of Multimedia Information System / v.8 no.1 / pp.17-22 / 2021
  • This paper proposes a wearable user interface intended for kids and parents using WearOS smartwatches. We first review what constitutes a kids' smartwatch and then design UI components for watchfaces to be used by kids and parents. UI components covering activity, education, voice search, app usage, video, location, health, and quick dial are described. These components are implemented either as complications or directly on watchfaces, and may require standalone on-device functions, cross-device communication, or an external database. We introduce an amusing theme-based UI for kids, whereas simple and easily accessible components are recommended for the parents' watchface. To illustrate use cases, we present three scenarios for enhancing communication between parents and children. To show the feasibility and potential of our approach, we implement a proof of concept using commercial smartwatches, smartphones, and an external cloud database. Furthermore, the performance of checking app usage on different devices is presented, followed by a discussion of limitations and future work.