• Title/Abstract/Keyword: Human-computer interaction

Search results: 624 items (processing time: 0.03 s)

인간-컴퓨터 인터페이스에서 사용편의성에 관한 고찰 (The Usability of The Human-Computer Interface)

  • 곽효연;이상도
    • 산업경영시스템학회지 / Vol.18 No.36 / pp.13-22 / 1995
  • The phenomenal rate of growth in the design, implementation, and use of interactive computer-based systems has been paralleled by an appreciation of how critical the human factor is to successful system operation. As the pace of technological innovation quickens and the design of user interfaces involves more complex interaction techniques, user frustration, confusion, degraded human performance, and an unwillingness on the part of users to perform interaction tasks become potential outcomes. Consequently, the importance of user-centered interface design and use is increasing. Usability-based systems improve user acceptance of, and satisfaction with, the systems.

  • PDF

Stereo-Vision-Based Human-Computer Interaction with Tactile Stimulation

  • Yong, Ho-Joong;Back, Jong-Won;Jang, Tae-Jeong
    • ETRI Journal / Vol.29 No.3 / pp.305-310 / 2007
  • If a virtual object in a virtual environment represented by a stereo vision system could be touched by a user with some tactile feeling on his/her fingertip, the sense of reality would be heightened. To create a visual impression as if the user were directly pointing to a desired point on a virtual object with his/her own finger, we need to align virtual space coordinates and physical space coordinates. Also, if there is no tactile feeling when the user touches a virtual object, the virtual object would seem to be a ghost. Therefore, a haptic interface device is required to give some tactile sensation to the user. We have constructed such a human-computer interaction system in the form of a simple virtual reality game using a stereo vision system, a vibro-tactile device module, and two position/orientation sensors.

  • PDF
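The coordinate alignment the abstract above calls for — mapping physical sensor coordinates onto virtual-space coordinates so the user's real fingertip lands on the intended virtual point — can be illustrated as a calibration step. The sketch below is a minimal, hypothetical illustration, not the paper's actual procedure: it assumes the two spaces differ only by a uniform scale and a translation, estimated from a few calibration point pairs.

```python
def fit_scale_translation(phys, virt):
    """Estimate a uniform scale s and translation t such that
    virt ≈ s * phys + t, from paired (x, y, z) calibration points."""
    n = len(phys)
    cp = [sum(p[i] for p in phys) / n for i in range(3)]  # physical centroid
    cv = [sum(v[i] for v in virt) / n for i in range(3)]  # virtual centroid
    # uniform scale from the RMS spread of each point set about its centroid
    num = sum(sum((v[i] - cv[i]) ** 2 for i in range(3)) for v in virt)
    den = sum(sum((p[i] - cp[i]) ** 2 for i in range(3)) for p in phys)
    s = (num / den) ** 0.5
    t = [cv[i] - s * cp[i] for i in range(3)]
    return s, t

def to_virtual(point, s, t):
    """Map a physical-space point into virtual-space coordinates."""
    return tuple(s * point[i] + t[i] for i in range(3))
```

If the two coordinate frames are also rotated relative to each other, a full similarity or affine fit (e.g. least squares over the calibration pairs) would be needed instead.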

얼굴 주시방향 인식을 이용한 장애자용 의사 전달 시스템 (Human-Computer Interaction System for the disabled using Recognition of Face Direction)

  • 정상현;문인혁
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 2001년도 하계종합학술대회 논문집(4) / pp.175-178 / 2001
  • This paper proposes a novel human-computer interaction system for the disabled using recognition of face direction. Face direction is recognized by comparing the positions of the centers of gravity of the face region and of facial features such as the eyes and eyebrows. The face region is first selected using color information, and the facial features are then extracted by applying a separation filter to the face region. The processing speed for recognition of face direction is 6.57 frames/sec, with a success rate of 92.9%, without any special hardware for image processing. We implemented the human-computer interaction system using a screen menu, and the experimental results show the validity of the proposed method.

  • PDF
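The core idea in the abstract above — comparing the center of gravity of the face region with that of the facial features — can be sketched as follows. This is a hypothetical simplification (a 1-D horizontal comparison only, with a made-up threshold), not the authors' implementation:

```python
def centroid(pixels):
    """Center of gravity of a set of (x, y) pixel coordinates."""
    xs = [x for x, y in pixels]
    ys = [y for x, y in pixels]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def face_direction(face_pixels, feature_pixels, thresh=5.0):
    """Classify face direction from the horizontal offset between the
    feature centroid (eyes/eyebrows) and the face-region centroid."""
    fx, fy = centroid(face_pixels)
    ex, ey = centroid(feature_pixels)
    dx = ex - fx
    if dx < -thresh:
        return "left"
    if dx > thresh:
        return "right"
    return "center"
```

A real system would add a vertical comparison for up/down gaze and calibrate the threshold to the detected face size.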

손동작 인식을 통한 Human-Computer Interaction 구현 (Recognition of Hand gesture to Human-Computer Interaction)

  • 이래경;김성신
    • 한국지능시스템학회논문지 / Vol.11 No.1 / pp.28-32 / 2001
  • Hand gestures are a means of communication that has long served as a language. As modern society becomes an information society, faster and more accurate communication and information transfer are needed; to overcome the shortcomings of existing devices for human-computer interaction and human expression, much recent research has sought to use the free gestures expressed with the two human hands. In this paper, we propose a new recognition algorithm that uses hand features extracted from 2-D input images, without relying on dynamic hand motion, and implement hand-gesture recognition using a Radial Basis Function Network and additional feature points to achieve a higher recognition rate and real-time processing. Based on the meaning of the recognized gestures, we also conducted experiments applying them to robot control in order to evaluate the recognition rate and the semantic accuracy of the gesture expressions.

  • PDF
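A Radial Basis Function network, as used in the paper above, scores an input feature vector by its Gaussian similarity to stored prototype vectors. The sketch below is a bare-bones nearest-prototype variant with hypothetical gesture labels; the actual network would have a trained linear output layer rather than a simple arg-max over activations:

```python
import math

def rbf_activations(x, centers, sigma=1.0):
    """Gaussian RBF activation of feature vector x for each prototype center."""
    acts = []
    for c in centers:
        d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
        acts.append(math.exp(-d2 / (2 * sigma ** 2)))
    return acts

def classify(x, centers, labels, sigma=1.0):
    """Return the label of the prototype with the strongest activation."""
    acts = rbf_activations(x, centers, sigma)
    best = max(range(len(acts)), key=lambda i: acts[i])
    return labels[best]
```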

Intelligent Emotional Interface for Personal Robot and Its Application to a Humanoid Robot, AMIET

  • Seo, Yong-Ho;Jeong, Il-Woong;Jung, Hye-Won;Yang, Hyun-S.
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2004년도 ICCAS / pp.1764-1768 / 2004
  • In the near future, robots will be used for personal purposes. To provide useful services to humans, it will be necessary for robots to understand human intentions. Consequently, the development of emotional interfaces for robots is an important extension of human-robot interaction. We designed and developed an intelligent emotional interface for the robot and applied it to our humanoid robot, AMIET. Subsequent human-robot interaction demonstrated that our intelligent emotional interface is very intuitive and friendly.

  • PDF

Human-Computer Interaction Based Only on Auditory and Visual Information

  • Sha, Hui;Agah, Arvin
    • Transactions on Control, Automation and Systems Engineering / Vol.2 No.4 / pp.285-297 / 2000
  • One of the research objectives in the area of multimedia human-computer interaction is the application of artificial intelligence and robotics technologies to the development of computer interfaces. This involves utilizing many forms of media, integrating speech input, natural language, graphics, hand-pointing gestures, and other methods for interactive dialogues. Although current human-computer communication methods include computer keyboards, mice, and other traditional devices, the two basic ways by which people communicate with each other are voice and gesture. This paper reports on research focusing on the development of an intelligent multimedia interface system modeled on the manner in which people communicate. The work explores interaction between humans and computers based only on the processing of speech (words uttered by the person) and the processing of images (hand-pointing gestures). The purpose of the interface is to control a pan/tilt camera, pointing it to a location specified by the user through the utterance of words and the pointing of a hand. The system utilizes another, stationary camera to capture images of the user's hand and a microphone to capture the user's words. Upon processing the images and sounds, the system responds by pointing the camera. Initially, the interface uses hand pointing to locate the general position the user is referring to; it then uses voice commands provided by the user to fine-tune the location and to change the zoom of the camera, if requested. The image of the location is captured by the pan/tilt camera and sent to a color TV monitor to be displayed. This type of system has applications in tele-conferencing and other remote operations, where the system must respond to the user's commands in a manner similar to how the user would communicate with another person. The advantage of this approach is the elimination of the traditional input devices that the user must otherwise utilize to control a pan/tilt camera, replacing them with more "natural" means of interaction. A number of experiments were performed to evaluate the interface system with respect to its accuracy, efficiency, reliability, and limitations.

  • PDF
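The two-stage control loop described above — coarse positioning from the hand-pointing estimate, then fine adjustment via recognized words — can be sketched as simple state updates. The class and the voice vocabulary below are hypothetical illustrations, not the authors' system:

```python
class PanTiltCamera:
    """Toy pan/tilt/zoom state driven by pointing and voice commands."""

    def __init__(self):
        self.pan, self.tilt, self.zoom = 0.0, 0.0, 1.0

    def point_to(self, pan, tilt):
        # coarse positioning from the hand-pointing direction estimate
        self.pan, self.tilt = pan, tilt

    def voice_command(self, word, step=2.0):
        # fine adjustment from a recognized word (hypothetical vocabulary)
        if word == "left":
            self.pan -= step
        elif word == "right":
            self.pan += step
        elif word == "up":
            self.tilt += step
        elif word == "down":
            self.tilt -= step
        elif word == "zoom":
            self.zoom *= 2.0
```

In the real system these updates would be sent to the pan/tilt unit's motor controller rather than stored as plain numbers.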

Emotional Model Focused on Robot's Familiarity to Human

  • Choi, Tae-Yong;Kim, Chang-Hyun;Lee, Ju-Jang
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2005년도 ICCAS / pp.1025-1030 / 2005
  • This paper deals with the emotional model of a software robot. A software robot requires several capabilities, such as sensing, perceiving, acting, communicating, and surviving. There are already many studies on emotional models, such as KISMET and AIBO. This paper proposes a new emotional model using a modified friendship scheme. Quite often, the available emotional models have time-invariant human-response architectures. Conventional emotional models make a sociable robot get along with humans and obey human commands during operation; this behavior makes the robot very different from real pets. Similar to real pets, the proposed emotional model with the modified friendship capability has a time-varying property that depends on the interaction between human and robot.

  • PDF
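The time-varying property described above can be illustrated with a simple update rule: familiarity rises toward saturation while the human interacts with the robot and decays while the robot is left alone. The rule and its rates are a hypothetical sketch, not taken from the paper:

```python
def update_familiarity(f, interacted, dt, gain=0.1, decay=0.01):
    """One time step of a pet-like familiarity variable in [0, 1].

    f          current familiarity
    interacted True if the human interacted with the robot this step
    dt         length of the time step
    """
    if interacted:
        f += gain * (1.0 - f) * dt   # rises toward 1.0, saturating
    else:
        f -= decay * f * dt          # decays toward 0 when idle
    return min(1.0, max(0.0, f))
```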

Human-Computer Natural User Interface Based on Hand Motion Detection and Tracking

  • Xu, Wenkai;Lee, Eung-Joo
    • 한국멀티미디어학회논문지 / Vol.15 No.4 / pp.501-507 / 2012
  • Human body motion is a non-verbal form of interaction that can be used to bridge the real world and the virtual world. In this paper, we present a study on a natural user interface (NUI) for human hand-motion recognition using the RGB color information and depth information provided by the Kinect camera from Microsoft Corporation. To achieve this goal, we used libraries for natural interaction together with the Kinect device, which provides RGB images of the environment and a depth map of the scene; as a result, hand tracking and gesture recognition have no major dependency on the work environment, the lighting, or the user's skin color. An improved Camshift tracking algorithm is used to track hand motion; the experimental results show that it outperforms the original Camshift algorithm, with higher stability and accuracy.
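Camshift tracking, mentioned above, extends the mean-shift procedure, which iteratively moves a search window to the centroid of a probability map (e.g. a skin-color or depth back-projection). The plain mean-shift core — without Camshift's adaptive window resizing, and not the paper's improved variant — can be sketched as:

```python
def mean_shift(prob, window, iters=10):
    """prob: 2-D list of per-pixel probabilities (e.g. a back-projection);
    window: (x, y, w, h) search window. Returns the converged window."""
    x, y, w, h = window
    rows, cols = len(prob), len(prob[0])
    for _ in range(iters):
        m00 = m10 = m01 = 0.0
        for j in range(y, y + h):
            for i in range(x, x + w):
                p = prob[j][i]
                m00 += p
                m10 += i * p
                m01 += j * p
        if m00 == 0.0:
            break  # no probability mass inside the window
        # shift the window so it is centered on the mass centroid
        nx = int(round(m10 / m00 - w / 2))
        ny = int(round(m01 / m00 - h / 2))
        nx = max(0, min(nx, cols - w))  # keep the window inside the map
        ny = max(0, min(ny, rows - h))
        if (nx, ny) == (x, y):
            break  # converged
        x, y = nx, ny
    return x, y, w, h
```

Camshift additionally resizes (and reorients) the window from the zeroth and second image moments after each convergence; in practice this is what OpenCV's `cv2.CamShift` implements.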

비 HCI 전공자들을 대상으로 한 Nielsen의 Usability Heuristics에 대한 이해 정도 평가 (Evaluating the Effectiveness of Nielsen's Usability Heuristics for Computer Engineers and Designers without Human Computer Interaction Background)

  • 정영주;심인숙;정구철
    • 한국실천공학교육학회논문지 / Vol.2 No.2 / pp.165-171 / 2010
  • Usability heuristics are general principles used for usability evaluation during user-interface design. Usability evaluation methods are normally applied by Human-Computer Interaction (HCI) experts; the ultimate goal of this study is to extend their use beyond HCI experts to the much wider population that builds user interfaces (user-interface designers and engineers). We therefore surveyed how well Jakob Nielsen's ten usability heuristics, the most widely used set for usability evaluation, are understood by design and computer-engineering professors and students without an HCI background. Based on the responses, we analyzed which heuristics were easy to understand, why the difficult ones were hard to understand, and the differences in comprehension and response patterns between professors and students. The most significant outcome of this study is that the respondents came to view applications more from the user's perspective; its results should help make usability heuristics easier for more people to understand, and thus aid in designing and implementing interfaces.

  • PDF

Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Nataliia, Konkina
    • International Journal of Computer Science & Network Security / Vol.22 No.1 / pp.129-138 / 2022
  • With the increasing reliance on computing systems in our everyday life, there is a constant need to improve the ways users can interact with such systems in a more natural, effective, and convenient way. In the initial computing revolution, interaction between humans and machines was limited, and machines were not necessarily meant to be intelligent. This created the need to develop systems that can automatically identify and interpret our actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures; it covers various kinds of tracking, including the whole body, hands, head, and face. We also touch upon a different line of work, including the Brain-Computer Interface (BCI) and Electromyography (EMG), as potential additions to the gesture-recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.