• Title/Abstract/Keyword: Human Interface

1,445 search results

Comprehensive architecture for intelligent adaptive interface in the field of single-human multiple-robot interaction

  • Ilbeygi, Mahdi;Kangavari, Mohammad Reza
    • ETRI Journal
    • /
    • Vol. 40, No. 4
    • /
    • pp.483-498
    • /
    • 2018
  • Nowadays, with progress in robotic science, the design and implementation of a mechanism for human-robot interaction with a low workload is inevitable. One notable challenge in this field is the interaction between a single human and a group of robots. Therefore, we propose a new comprehensive framework for single-human multiple-robot remote interaction that can form an efficient intelligent adaptive interaction (IAI). Our interaction system can thoroughly adapt itself to changes in the interaction context and user states. Among the advantages of the devised IAI framework are a lower workload, a higher level of situation awareness, and efficient interaction. In this paper, we introduce a new IAI architecture as our comprehensive mechanism. To examine the architecture in practice, we implemented our proposed IAI to control a group of unmanned aerial vehicles (UAVs) under different scenarios. The results show that our devised IAI framework can effectively reduce human workload, raise the level of situation awareness, and concurrently increase the mission completion percentage of the UAVs.

음성인식용 인터페이스의 사용편의성 평가 방법론 (A Usability Evaluation Method for Speech Recognition Interfaces)

  • 한성호;김범수
    • 대한인간공학회지
    • /
    • Vol. 18, No. 3
    • /
    • pp.105-125
    • /
    • 1999
  • As speech is the human being's most natural communication medium, using it offers many advantages. Currently, most computer user interfaces use a mouse and keyboard, but interfaces based on speech recognition are expected to replace them or at least to supplement them. Despite its advantages, the speech recognition interface is not yet popular because of technical difficulties such as limited recognition accuracy and slow response time. Nevertheless, it is important to optimize human-computer system performance by improving usability. This paper presents a set of guidelines for designing speech recognition interfaces and provides a method for evaluating their usability. A total of 113 guidelines are suggested to improve the usability of speech recognition interfaces. The evaluation method consists of four major procedures: user interface evaluation, function evaluation, vocabulary estimation, and recognition speed/accuracy evaluation. Each procedure is described along with appropriate techniques for efficient evaluation.


스포츠화에 대한 소비자의 감성 DB 및 Interface 구축에 관한 연구 (A Study on the Customers Emotional DB and Interface Design for Sports Shoes)

  • 윤훈용;임기용
    • 산업경영시스템학회지
    • /
    • Vol. 25, No. 3
    • /
    • pp.34-40
    • /
    • 2002
  • In the past, the main factors in customers' purchases of sports shoes were price and comfort. These days, however, sports shoes are considered part of fashion and may not satisfy customers, because their emotional preferences have not been properly considered in the design phase. Customers' desire and expectation for unique designs are growing. Thus, sports shoes need to be developed that not only account for anthropometric foot characteristics but also satisfy customers' emotional preferences. In this study, basic data on customers' emotional preferences for sports shoe designs were obtained using a human sensibility ergonomics approach and compiled into a database. We also developed an interface that customers can use to select emotionally preferred sports shoes.

Human-Computer Interaction Based Only on Auditory and Visual Information

  • Sha, Hui;Agah, Arvin
    • Transactions on Control, Automation and Systems Engineering
    • /
    • Vol. 2, No. 4
    • /
    • pp.285-297
    • /
    • 2000
  • One of the research objectives in the area of multimedia human-computer interaction is the application of artificial intelligence and robotics technologies to the development of computer interfaces. This involves utilizing many forms of media, integrating speech input, natural language, graphics, hand-pointing gestures, and other methods for interactive dialogues. Although current human-computer communication methods include computer keyboards, mice, and other traditional devices, the two basic ways by which people communicate with each other are voice and gesture. This paper reports on research focusing on the development of an intelligent multimedia interface system modeled on the manner in which people communicate. This work explores interaction between humans and computers based only on the processing of speech (words uttered by the person) and the processing of images (hand-pointing gestures). The purpose of the interface is to control a pan/tilt camera so that it points at a location specified by the user through spoken words and hand pointing. The system utilizes another, stationary camera to capture images of the user's hand and a microphone to capture the user's words. Upon processing the images and sounds, the system responds by pointing the camera. Initially, the interface uses hand pointing to locate the general position to which the user is referring; it then uses voice commands provided by the user to fine-tune the location and change the camera's zoom, if requested. The image of the location is captured by the pan/tilt camera and sent to a color TV monitor to be displayed. This type of system has applications in tele-conferencing and other remote operations, where the system must respond to the user's commands in a manner similar to how the user would communicate with another person. The advantage of this approach is the elimination of the traditional input devices that the user must utilize in order to control a pan/tilt camera, replacing them with more "natural" means of interaction. A number of experiments were performed to evaluate the interface system with respect to its accuracy, efficiency, reliability, and limitations.
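The two-stage interaction the abstract describes (coarse hand pointing first, then spoken refinements of pan, tilt, and zoom) can be sketched as a simple control loop. This is an illustrative sketch only; the paper's actual implementation is not given here, and all class, method, and command names below are hypothetical.

```python
# Hypothetical sketch of the two-stage camera-pointing loop: a pointing
# gesture gives a coarse target, then voice commands refine pan/tilt/zoom.

class PanTiltCamera:
    def __init__(self):
        self.pan, self.tilt, self.zoom = 0.0, 0.0, 1.0

    def point_at(self, pan, tilt):
        # Coarse positioning derived from the hand-pointing estimate.
        self.pan, self.tilt = pan, tilt

    def apply_voice_command(self, command):
        # Fine adjustment from one recognized spoken command.
        step = 2.0  # degrees per command; an assumed tuning value
        if command == "left":
            self.pan -= step
        elif command == "right":
            self.pan += step
        elif command == "up":
            self.tilt += step
        elif command == "down":
            self.tilt -= step
        elif command == "zoom in":
            self.zoom *= 1.5
        elif command == "zoom out":
            self.zoom /= 1.5

def interact(camera, pointing_estimate, voice_commands):
    """Apply the coarse gesture first, then a sequence of voice refinements."""
    camera.point_at(*pointing_estimate)
    for word in voice_commands:
        camera.apply_voice_command(word)
    return camera.pan, camera.tilt, camera.zoom
```

For example, a pointing estimate of (30°, 10°) followed by "left" and "zoom in" would settle at pan 28°, tilt 10°, zoom 1.5x.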


다인승 차량용 멀티미디어 서비스 지원을 위한 HMI기반 네트워크 시스템 설계에 관한 연구 (A Study on Network System Design for the Support of Multi-Passengers' Multimedia Service Based on HMI (Human Machine Interface))

  • 이상엽;이재규;조현중
    • 한국통신학회논문지
    • /
    • Vol. 42, No. 4
    • /
    • pp.899-903
    • /
    • 2017
  • This paper concerns a network architecture, and its implementation, that supports an HMI (Human Machine Interface) in multi-passenger vehicles offering multimedia services. Unlike the multimedia services of conventional passenger cars, a multi-passenger vehicle must take traffic load into account, must deliver content updated on a server to many users simultaneously, and must be extensible so that users can connect to the network and use a variety of content. Accordingly, this paper modifies the structure of the existing in-vehicle optical network MOST (Media Oriented System Transport), develops an Ethernet function that interworks with MOST, designs a user-accessible interface, and proposes a design method for the data transmission/reception modules required for multi-passenger vehicle multimedia services.

역/촉감 제시 "K-Touch" 햅틱 API 개발 (Development of K-Touch haptic API(Application Programming Interface))

  • 이범찬;김종필;류제하
    • 한국HCI학회:학술대회논문집
    • /
    • 한국HCI학회 2006 Conference Proceedings, Part 1
    • /
    • pp.266-274
    • /
    • 2006
  • This paper presents the development of a new haptic API, "K-Touch." Built on a core force-rendering algorithm based on graphics hardware, the K-Touch API enables haptic interaction with the various data formats that make up a virtual environment (3D polygon models, volume data, and 2.5D depth images), provides the software extensibility needed to develop new haptic algorithms and devices, and is designed so that users can develop haptic applications quickly and easily. In addition, for force and tactile interaction, which are the key elements of haptic sensation, an algorithm was developed that, unlike existing haptic SDKs and APIs, can render kinesthetic and tactile feedback simultaneously. To verify the usefulness of the proposed API, examples from various application areas were implemented. The new haptic API, K-Touch, is expected to serve as an important tool that helps users and researchers conduct haptic research more efficiently.
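The abstract describes an API that renders forces over several scene data formats (polygon models, volume data, 2.5D depth images) while remaining extensible to new algorithms. A minimal sketch of that kind of dispatch structure, in Python for readability, might look as follows; this is not the actual K-Touch API, and every name and the penalty-force formula are illustrative assumptions.

```python
# Illustrative sketch (not the real K-Touch API) of a haptic scene that
# dispatches force rendering over multiple data formats via one interface.

from abc import ABC, abstractmethod

class HapticObject(ABC):
    """Any renderable object must report a force for a probe position."""
    @abstractmethod
    def force_at(self, position):
        ...

class PolygonModel(HapticObject):
    def __init__(self, stiffness=1.0):
        self.stiffness = stiffness

    def force_at(self, position):
        # Penalty-based force: push back along z when the probe penetrates
        # the z=0 plane (a stand-in for real collision detection).
        x, y, z = position
        depth = max(0.0, -z)
        return (0.0, 0.0, self.stiffness * depth)

class DepthImage(HapticObject):
    """2.5D depth-image surface treated as a height field sampled at (x, y)."""
    def __init__(self, height_fn, stiffness=1.0):
        self.height_fn = height_fn
        self.stiffness = stiffness

    def force_at(self, position):
        x, y, z = position
        depth = max(0.0, self.height_fn(x, y) - z)
        return (0.0, 0.0, self.stiffness * depth)

class HapticScene:
    """Sums forces from all objects; new formats plug in via HapticObject."""
    def __init__(self):
        self.objects = []

    def add(self, obj):
        self.objects.append(obj)

    def total_force(self, position):
        fx = fy = fz = 0.0
        for obj in self.objects:
            dx, dy, dz = obj.force_at(position)
            fx, fy, fz = fx + dx, fy + dy, fz + dz
        return (fx, fy, fz)
```

The point of the abstract-base-class design is the extensibility the abstract emphasizes: supporting a new data format (or a tactile-rendering variant) only requires adding another `HapticObject` subclass, without touching the scene or the rendering loop.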


인간-기계 인터페이스 및 증강현실 기술의 항공운항 분야 적용 (Application of Human Machine Interface and Augmented Reality Technology to Flight Operation)

  • 박형욱;정준;장조원;주성현;황영하
    • 한국항공운항학회지
    • /
    • Vol. 27, No. 2
    • /
    • pp.54-69
    • /
    • 2019
  • The primary objective of this paper is to introduce the application of Human-Machine Interface (HMI) and Augmented Reality (AR) technologies in flight operations, including self-check-in, baggage handling, airport security and surveillance, airport operations monitoring, In-Flight Entertainment and Connectivity (IFEC), cockpit design, and cabin crew support. This paper investigates the application status and development trends of HMI and AR technologies for airports and aircraft; these technologies can provide more efficient in-flight passenger service and experience through AR devices. The paper also discusses developments such as the Integrated Control Application (ICA) for the IFEC interface, an AR flight simulation training program using a fixed-base simulator, and an AR aircraft cabin interior concept test program. These applications demonstrate how HMI and AR techniques can be utilized in actual flight operations. The programs developed in this paper can be applied within aircraft interiors and services to enhance efficiency, comfort, and experience.

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol. 7, No. 4
    • /
    • pp.834-848
    • /
    • 2013
  • To use a smartphone's many functions effectively, several kinds of human-phone interfaces are used, such as touch, voice, and gesture. However, the most important of these, the touch interface, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of front cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, an appropriate camera specification and image resolution are analyzed for a smartphone-based gaze tracking method. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, on the basis of the 2D geometric relation between the reflection rectangle and the screen. Fifth, a prototype mock-up design module was made to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800 and that the average hit ratio on a 5×4 icon grid was 94.6%.
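The fourth novelty above, mapping gaze from only two LED reflections via the 2D geometric relation between the reflection rectangle and the screen, can be illustrated with a simple interpolation: the two corneal glints act as opposite corners of a rectangle corresponding to the screen, and the pupil center's relative position inside that rectangle scales to screen pixels. This is a hedged sketch, not the paper's calibration procedure; the function name and the clamping behavior are assumptions.

```python
# Hypothetical gaze mapping from two corneal glints: the glints are taken
# as opposite corners of the rectangle the screen projects onto the eye,
# and the pupil center is interpolated inside it, then scaled to pixels.

def gaze_from_glints(pupil, glint_a, glint_b, screen_w=480, screen_h=800):
    """Map a pupil-center image position to a screen pixel.

    pupil, glint_a, glint_b: (x, y) positions in the eye image, where
    glint_a and glint_b are opposite corners of the reflection rectangle.
    """
    (px, py), (ax, ay), (bx, by) = pupil, glint_a, glint_b
    # Normalized position of the pupil inside the glint rectangle.
    u = (px - ax) / (bx - ax)
    v = (py - ay) / (by - ay)
    # Clamp to the rectangle and scale to screen pixel coordinates.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    return (round(u * (screen_w - 1)), round(v * (screen_h - 1)))
```

For instance, a pupil exactly midway between the two glints maps to the center of the 480×800 screen.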

캐비닛운전원모듈을 위한 사용자인터페이스 스타일가이드 (A User Interface Style Guide for the Cabinet Operator Module)

  • 이현철;이동영;이정운
    • 대한전기학회:학술대회논문집
    • /
    • 대한전기학회 2005 Symposium Proceedings, Information and Control Section
    • /
    • pp.203-205
    • /
    • 2005
  • A reactor protection system (RPS) generates the reactor trip signal and the engineered safety features (ESF) actuation signal when the monitored plant processes reach predefined limits. A Korean project group is developing a new digitalized RPS, and the Cabinet Operator Module (COM) of the RPS is used by an equipment operator for RPS integrity testing and monitoring. A flat panel display (FPD) with touch screen capability is provided as the main user interface for RPS operation. To support the RPS COM user interface design (specifically, the FPD screen design), we developed a user interface style guide, because the system designers could not properly deal with the many general human factors design guidelines. To develop the style guide, we gathered various design guidelines, conducted a walk-through with a video recorder, selected guidelines with respect to user interface design elements, determined the properties of the design elements, held discussions with the system designers, and converted the properties into a screen design. This paper describes the process in detail and the findings made in the course of the style guide development.


서비스 로봇을 위한 감성인터페이스 기술 (Emotional Interface Technologies for Service Robot)

  • 양현승;서용호;정일웅;한태우;노동현
    • 로봇학회논문지
    • /
    • Vol. 1, No. 1
    • /
    • pp.58-65
    • /
    • 2006
  • An emotional interface is essential technology for a robot to provide proper service to the user. In this research, we developed emotional components for a service robot, such as a neural-network-based facial expression recognizer; emotion expression technologies based on 3D graphical facial expressions and joint movements that consider the user's reaction; and behavior selection technology for emotion expression. We used our humanoid robots, AMI and AMIET, as the test-beds of our emotional interface, and researched the emotional interaction between a service robot and a user by integrating the developed technologies. Emotional interface technology can enhance the performance of friendly interaction with a service robot. It can also increase the diversity of service and the value-added of the robot for humans, and it can accelerate market growth and contribute to the popularization of robots.
