• Title/Abstract/Keyword: Human Interaction

Soar (State Operator and Result)와 ROS 연계를 통해 거절가능 HRI 태스크의 휴머노이드로봇 구현 (Implementation of a Refusable Human-Robot Interaction Task with Humanoid Robot by Connecting Soar and ROS)

  • 당반치엔;트란트렁틴;팜쑤언쭝;길기종;신용빈;김종욱
    • The Journal of Korea Robotics Society / Vol. 12, No. 1 / pp.55-64 / 2017
  • This paper proposes a combination of the cognitive agent architecture Soar (State, operator, and result) and ROS (Robot Operating System), which can serve as a basic framework for a robot agent to interact with and cope with its environment more intelligently and appropriately. The proposed Soar-ROS human-robot interaction (HRI) agent, implemented on a humanoid robot, understands a set of human commands by voice recognition and chooses how to react to each command according to the symbol detected by image recognition. The robotic agent is allowed to refuse an inappropriate command such as "go" after it has seen the symbol 'X', which indicates that an abnormal or immoral situation has occurred. This simple but meaningful HRI task is successfully demonstrated on the proposed Soar-ROS platform with a small humanoid robot, which implies that extending the present hybrid platform toward an artificial moral agent is possible.
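
The ROS side of such a refusable-command pipeline can be pictured with a short sketch. The code below is only an illustration under assumed topic names (/voice_command, /detected_symbol, /robot_action) and plain string messages; the paper's actual Soar production rules and interfaces are not reproduced here.

```python
# Minimal sketch (not the paper's code): a ROS node that forwards a voice
# command only if the last symbol seen by image recognition allows it.
# Topic names and message types are assumptions.
import rospy
from std_msgs.msg import String

class RefusableCommandGate:
    def __init__(self):
        self.last_symbol = None  # e.g. 'X' marks an abnormal/immoral situation
        rospy.Subscriber("/detected_symbol", String, self.on_symbol)
        rospy.Subscriber("/voice_command", String, self.on_command)
        self.action_pub = rospy.Publisher("/robot_action", String, queue_size=1)

    def on_symbol(self, msg):
        self.last_symbol = msg.data

    def on_command(self, msg):
        # Refuse the "go" command when the forbidding symbol 'X' was seen;
        # in the paper this decision is made by Soar production rules.
        if msg.data == "go" and self.last_symbol == "X":
            self.action_pub.publish(String(data="refuse"))
        else:
            self.action_pub.publish(String(data=msg.data))

if __name__ == "__main__":
    rospy.init_node("refusable_command_gate")
    RefusableCommandGate()
    rospy.spin()
```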

직접 교시 작업을 위한 로봇 작업 정보 편집 및 재생산 기법 (Techniques of Editing and Reproducing Robot Operation Data for Direct Teaching)

  • 김한준;왕영진;김진오;백주훈
    • Journal of the Korean Society for Precision Engineering / Vol. 30, No. 1 / pp.96-104 / 2013
  • The study of human-robot interaction is attracting more and more attention as a way to expand robot applications to tasks that are difficult for a robot to perform alone. Developed countries are preparing for a new market by introducing the concept of the 'Co-Robot', a model of human-robot interaction. Our research on direct teaching is a way to teach a robot's trajectory by having a human guide its end device. This method is more intuitive than other existing methods; its benefits include easy and fast teaching, even by non-professional workers, and it can enhance the utilization of robots in small and medium-sized enterprises for small-quantity batch production. In this study, we developed algorithms for creating an accurate trajectory from repeated, inaccurate direct teaching, along with a GUI for direct teaching. We also propose a basic framework for direct teaching.
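
The abstract does not spell out the trajectory-editing algorithm, so the following is only a generic sketch of the underlying idea: several inaccurate, hand-guided demonstrations are resampled onto a common time base and averaged pointwise to obtain a single smoother reference trajectory. Function and variable names are illustrative.

```python
# Illustrative sketch only: combine several noisy, hand-guided demonstrations
# of the same motion into one reference trajectory by resampling each to a
# common number of samples and averaging pointwise.
import numpy as np

def average_demonstrations(demos, n_samples=200):
    """demos: list of (T_i x D) arrays of joint or Cartesian positions."""
    resampled = []
    for demo in demos:
        demo = np.asarray(demo, dtype=float)
        t_old = np.linspace(0.0, 1.0, len(demo))
        t_new = np.linspace(0.0, 1.0, n_samples)
        # Interpolate each coordinate onto the common time base.
        cols = [np.interp(t_new, t_old, demo[:, d]) for d in range(demo.shape[1])]
        resampled.append(np.stack(cols, axis=1))
    return np.mean(resampled, axis=0)  # pointwise mean over demonstrations

# Example: three jittery 2-D demonstrations of the same path.
rng = np.random.default_rng(0)
base = np.stack([np.linspace(0, 1, 150), np.sin(np.linspace(0, np.pi, 150))], axis=1)
demos = [base + 0.01 * rng.standard_normal(base.shape) for _ in range(3)]
reference = average_demonstrations(demos)
```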

스마트홈 대화형 인터페이스의 의인화 효과 음성-채팅 인터랙션 유형에 따른 실험 연구 (Effects of Anthropomorphic Conversational Interface for Smart Home: An Experimental Study on the Voice and Chatting Interactions)

  • 홍은지;조광수;최준호
    • Journal of the HCI Society of Korea / Vol. 12, No. 1 / pp.15-23 / 2017
  • This study applies the concept of humanness and its constituent factors to a conversational agent in the smart home context, and examines how the level of anthropomorphism and the interaction type affect users' emotional experience and intention to use. An experimental study showed that the level of anthropomorphism (high vs. low) and the interaction type (voice vs. chatting) influenced intimacy, likability, and intention to use: when the anthropomorphism level was high and the interaction was voice-based, participants rated the conversational agent as more intimate and likable and as something they would want to keep using. In addition, regardless of the anthropomorphism level, voice interaction produced lower psychological resistance than chatting. For intention to use, an interaction effect between the anthropomorphism level and the interaction type was found: the anthropomorphism effect did not appear with chatting interaction, whereas it did with voice interaction. Therefore, a voice-based conversational smart home agent should be designed with a high level of anthropomorphism in order to encourage continued use.

A Study on Infra-Technology of RCP Interaction System

  • Kim, Seung-Woo;Choe, Jae-Il
    • Institute of Control, Robotics and Systems: Conference Proceedings / ICCAS 2004 / pp.1121-1125 / 2004
  • Robot technology (RT) has been developed as a next-generation technology. According to a 2002 technical report from the Mitsubishi R&D center, the fusion of IT (Information Technology) and RT (Robotic Technology) will grow to five times the size of the current IT market by the year 2015. Moreover, a recent IEEE report predicts that most people will have a robot within the next ten years. The RCP (Robotic Cellular Phone), a CP (Cellular Phone) offering personal robot services, will be an intermediate hi-tech personal machine between the one-CP-per-person and one-robot-per-person generations. The RCP infrastructure consists of RCP^Mobility, RCP^Interaction, and RCP^Integration technologies. For RCP^Mobility, human-friendly motion automation and personal services with walking and arm-motion abilities are developed. RCP^Interaction is achieved by modeling an emotion-generating engine, and RCP^Integration, which recognizes environmental and self conditions, is also developed. By joining intelligent algorithms and the CP communication network with these three base modules, an RCP system is constructed. This paper focuses in particular on the RCP interaction system. RCP^Interaction (Robotic Cellular Phone for Interaction) is to be developed as an emotional-model CP, as shown in figure 1. RCP^Interaction refers to sensitivity expression and the link technology of CP communication; it is an interface technology between human and CP through various emotional models. The interactive emotion functions are designed through differing patterns of vibrator beat frequencies and a feeling system created by smell-injection switching control. Just as music influences a person, one can feel a variety of emotions from the vibrator's beats by converting musical chord frequencies into vibrator beat frequencies. This paper presents the definition, basic theory, and experimental results of the RCP interaction system, and the experimental results confirm its good performance.
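
As a rough, hedged illustration of the chord-to-vibration mapping mentioned in the abstract, the sketch below shifts each note frequency by octaves until it falls inside an assumed vibrotactile band (20-250 Hz); the band limits and the octave rule are assumptions, not the paper's method.

```python
# Hedged sketch: map musical chord frequencies (Hz) into an assumed
# vibrotactile band by halving or doubling each frequency (octave shifts)
# until it fits. The 20-250 Hz band is an illustrative assumption.
def to_vibrator_beat(freq_hz, low=20.0, high=250.0):
    while freq_hz > high:
        freq_hz /= 2.0   # drop one octave
    while freq_hz < low:
        freq_hz *= 2.0   # raise one octave
    return freq_hz

# C major chord (C4, E4, G4) mapped to vibrator beat frequencies.
chord = [261.63, 329.63, 392.00]
beats = [round(to_vibrator_beat(f), 1) for f in chord]
print(beats)  # [130.8, 164.8, 196.0]
```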

Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • Institute of Control, Robotics and Systems: Conference Proceedings / ICCAS 2005 / pp.145-150 / 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory systems for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To realize the sense of touch, an electrotactile display produces a controlled, localized touch sensation on the skin by passing a small electric current. In electrotactile stimulation, the mechanoreceptors in the skin may be stimulated individually in order to display the sense of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm, or any other suitable location of the body by using appropriate electrodes and waveforms. We developed an ETCS and investigated the effectiveness of the proposed system in terms of the perception of surface roughness, by stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the forearm or finger of the user with an electrode-embedded armband, in order to investigate how subjects recognize displayed patterns and directions of stimulation.
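
A minimal sketch of how the positive/negative pulse trains described above might be parameterized in software is given below; the amplitude, pulse width, frequency, and timing values are placeholders, and real electrotactile stimulation requires hardware-level current limiting and safety checks that this illustration omits.

```python
# Illustrative sketch: build a biphasic (positive/negative) pulse train as a
# sampled current waveform. Amplitude, pulse width, and frequency values are
# placeholders; real electrotactile hardware needs strict safety limits.
import numpy as np

def pulse_train(amplitude_ma, pulse_width_us, freq_hz, duration_s, fs=100_000):
    t = np.arange(int(duration_s * fs)) / fs
    phase = t % (1.0 / freq_hz)
    width = pulse_width_us * 1e-6
    wave = np.zeros_like(t)
    wave[phase < width] = amplitude_ma                            # positive phase
    wave[(phase >= width) & (phase < 2 * width)] = -amplitude_ma  # negative phase
    return t, wave

# 2 mA, 200 us pulses at 100 Hz for 50 ms, e.g. one electrode's time slot
# before switching to the next electrode in the armband.
t, i_out = pulse_train(amplitude_ma=2.0, pulse_width_us=200, freq_hz=100, duration_s=0.05)
```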

원자력발전소에서의 인간공학적 실험평가를 위한 종합 실험설비 개발 (Development of integrated test facility for human factors experiments in nuclear power plant)

  • 오인석;이현철;천세우;박근옥;심봉식
    • Journal of the Ergonomics Society of Korea / Vol. 16, No. 1 / pp.107-117 / 1997
  • It is necessary to evaluate the HMI in terms of human factors at the design stage of the MMIS (man-machine interface system) and to feed the evaluation results back into the design, because operator performance is mainly influenced by the HMI. Therefore, the MMIS design should reflect operators' psychological, behavioral, and physiological characteristics in their interaction with the human-machine interface (HMI) in order to improve the safety and availability of the MMIS of a nuclear power plant (NPP) by reducing human error. The development of human factors experimental evaluation techniques and an integrated test facility (ITF) for human factors evaluation has become an important research field for resolving human factors issues in the design of an advanced control room (ACR). We developed an ITF aimed at experimenting with the design of the ACR and the human-machine interaction as it relates to the control of an NPP. This paper presents the development of an ITF that consists of three rooms: a main test room (MTR), a supporting test room (STR), and an experiment control room (ECR). The ITF also has various facilities such as a human-machine simulator (HMS), experimental measurement systems, and a data analysis and experiment evaluation supporting system (DAEXESS). The HMS consists of a full-scope simulation model of a Korean standard NPP and an advanced HMI based on visual display units (VDUs) such as touch color CRTs, a large-scale display panel (LSDP), flat panel display units, and so on.

Intelligent Emotional Interface for Personal Robot and Its Application to a Humanoid Robot, AMIET

  • Seo, Yong-Ho;Jeong, Il-Woong;Jung, Hye-Won;Yang, Hyun-S.
    • Institute of Control, Robotics and Systems: Conference Proceedings / ICCAS 2004 / pp.1764-1768 / 2004
  • In the near future, robots will be used for personal purposes. To provide useful services to humans, it will be necessary for robots to understand human intentions. Consequently, the development of emotional interfaces for robots is an important extension of human-robot interaction. We designed and developed an intelligent emotional interface for robots and applied the interface to our humanoid robot, AMIET. Subsequent human-robot interaction demonstrated that our intelligent emotional interface is very intuitive and friendly.

호르몬 모델에 기반한 안드로이드의 감정모델 (Emotional Model for an Android based on Hormone Model)

  • 이동욱;이태근;정준영;소병록;손웅희;백문홍;김홍석;이호길
    • The Journal of Korea Robotics Society / Vol. 2, No. 4 / pp.341-345 / 2007
  • This paper proposes an emotional interaction model between a human and a robot using an android. An android is a kind of humanoid robot whose outward appearance is almost the same as that of a human, and it serves as a robot platform for implementing and testing emotional expressions and human interaction. For the android to behave like a human, the structure of its internal emotion system is very important. In our research, we propose a novel emotional model for the android based on biological hormones and an emotion space. The proposed emotion model has the advantage that it can represent emotional change over time through hormone dynamics.
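
The abstract notes that hormone dynamics let the modeled emotion change over time. As a hedged sketch of that idea (not the paper's actual model), the code below lets a hormone level rise under a stimulus and decay exponentially afterwards, and maps the level onto a simple arousal value.

```python
# Hedged sketch (not the paper's model): first-order hormone dynamics where a
# stimulus raises a hormone level that then decays over time, so the derived
# emotional state keeps changing even after the stimulus ends.
import math

class Hormone:
    def __init__(self, decay_rate=0.5, level=0.0):
        self.decay_rate = decay_rate  # 1/s, speed of return to baseline
        self.level = level

    def step(self, stimulus, dt):
        # dL/dt = -decay_rate * L + stimulus, integrated with a simple Euler step
        self.level += dt * (-self.decay_rate * self.level + stimulus)
        return self.level

adrenaline = Hormone(decay_rate=0.5)
for t in range(10):                      # ten one-second steps
    stim = 1.0 if t < 3 else 0.0         # stimulus present for the first 3 s
    arousal = math.tanh(adrenaline.step(stim, dt=1.0))  # map level into [0, 1)
    print(f"t={t}s  hormone={adrenaline.level:.2f}  arousal={arousal:.2f}")
```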

지능형 로봇 구동을 위한 제스처 인식 기술 동향 (Survey: Gesture Recognition Techniques for Intelligent Robot)

  • 오재용;이칠우
    • Journal of Institute of Control, Robotics and Systems / Vol. 10, No. 9 / pp.771-778 / 2004
  • Recently, various applications of robot systems have become more popular in accordance with the rapid development of computer hardware and software, artificial intelligence, and automatic control technology. Formerly, robots were mainly used in industrial fields; nowadays, however, the robot is expected to play an important role in home service applications. To make robots more useful, further research is required on natural communication methods between humans and robot systems and on autonomous behavior generation. Gesture recognition is one of the most convenient methods for natural human-robot interaction, so it must be addressed for the implementation of intelligent robot systems. In this paper, we describe the state of the art of advanced gesture recognition technologies for intelligent robots according to the following methods: sensor-based, feature-based, appearance-based, and 3D model-based methods. We also discuss some open problems and real applications in this research field.