• Title/Abstract/Keyword: Human and Robot Interaction

Search results: 320 items (processing time: 0.026 seconds)

Intelligent Emotional Interface for Personal Robot and Its Application to a Humanoid Robot, AMIET

  • Seo, Yong-Ho;Jeong, Il-Woong;Jung, Hye-Won;Yang, Hyun-S.
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2004년도 ICCAS / pp.1764-1768 / 2004
  • In the near future, robots will be used for personal purposes. To provide useful services to humans, robots will need to understand human intentions. Consequently, the development of emotional interfaces for robots is an important extension of human-robot interaction. We designed and developed an intelligent emotional interface for a personal robot and applied it to our humanoid robot, AMIET. Subsequent human-robot interaction demonstrated that our intelligent emotional interface is intuitive and friendly.


지능형 로봇 구동을 위한 제스처 인식 기술 동향 (Survey: Gesture Recognition Techniques for Intelligent Robot)

  • 오재용;이칠우
    • 제어로봇시스템학회논문지 / Vol. 10, No. 9 / pp.771-778 / 2004
  • Recently, various applications of robot systems have become more popular, in line with the rapid development of computer hardware/software, artificial intelligence, and automatic control technology. Robots were formerly used mainly in industrial fields; nowadays, however, they are expected to play an important role in home service applications. To make robots more useful, further research is required on natural communication methods between humans and robot systems and on autonomous behavior generation. Gesture recognition is one of the most convenient methods for natural human-robot interaction, so it must be addressed for the implementation of intelligent robot systems. In this paper, we describe the state of the art of gesture recognition technologies for intelligent robots according to four approaches: sensor-based, feature-based, appearance-based, and 3D model-based methods. We also discuss some open problems and real applications in this research field.
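As an illustration of the appearance-based category described in this survey, here is a minimal sketch (not taken from the paper) of template matching over grayscale hand images using normalized cross-correlation; the gesture labels and template set are hypothetical.

```python
import numpy as np

def normalized_xcorr(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation between two equally sized grayscale images."""
    a = (a - a.mean()) / (a.std() + 1e-8)
    b = (b - b.mean()) / (b.std() + 1e-8)
    return float((a * b).mean())

def classify_gesture(frame: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Appearance-based recognition: pick the template most similar to the frame."""
    scores = {label: normalized_xcorr(frame, tpl) for label, tpl in templates.items()}
    return max(scores, key=scores.get)

# Hypothetical usage: 64x64 grayscale templates for two gestures.
templates = {"wave": np.random.rand(64, 64), "point": np.random.rand(64, 64)}
frame = np.random.rand(64, 64)
print(classify_gesture(frame, templates))
```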

안내 로봇을 향한 관람객의 행위 인식 기반 관심도 추정 (Estimating Interest Levels based on Visitor Behavior Recognition Towards a Guide Robot)

  • 이예준;김주현;정의정;김민규
    • 로봇학회논문지 / Vol. 18, No. 4 / pp.463-471 / 2023
  • This paper proposes a method to estimate the level of interest that visitors show towards a specific target, a guide robot, in spaces such as exhibition halls and museums where large numbers of visitors may show interest in a particular subject. To accomplish this, we apply deep learning-based behavior recognition and object tracking to multiple visitors and, based on the results, derive a behavior analysis and an interest level for each visitor. To implement this research, a customized dataset tailored to the characteristics of exhibition hall and museum environments was created, and a deep learning model was built on it. Four scenarios that visitors can exhibit were defined, and predicted and experimental values were obtained for each, validating the interest estimation method proposed in this paper.
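A rough sketch of the idea, not the authors' implementation: each tracked visitor accumulates an interest score from recognized behaviors. The track IDs, behavior labels, and weights below are hypothetical.

```python
from collections import defaultdict

# Hypothetical weights mapping recognized behaviors to interest contributions.
BEHAVIOR_WEIGHTS = {"approach": 0.4, "gaze_at_robot": 0.3, "stop_in_front": 0.2, "walk_past": -0.3}

def update_interest(scores: dict[int, float], detections: list[tuple[int, str]]) -> dict[int, float]:
    """Accumulate an interest score per tracked visitor from (track_id, behavior) pairs."""
    for track_id, behavior in detections:
        scores[track_id] += BEHAVIOR_WEIGHTS.get(behavior, 0.0)
    return scores

scores = defaultdict(float)
# One frame's worth of (tracker ID, recognized behavior) results.
update_interest(scores, [(1, "approach"), (1, "gaze_at_robot"), (2, "walk_past")])
interested = [tid for tid, s in scores.items() if s > 0.5]
print(dict(scores), interested)
```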

이륜 구동 로봇의 균형 각도 조절을 통한 사람과의 상호 제어의 실험적 연구 (Experimental Studies of Balancing Control of a Two-wheel Mobile Robot for Human Interaction by Angle Modification)

  • 이승준;정슬
    • 로봇학회논문지 / Vol. 8, No. 2 / pp.67-74 / 2013
  • This paper presents interaction force control between a balancing robot and a human operator. The balancing robot has two wheels that generate movements on the plane. Since the balancing robot is based on position control, it tries to keep the desired balancing angle at zero even when an external force is applied, which leads to instability of the system. Thus, a hybrid force control method is employed to react to the external force from the operator and allow the operator to guide the balancing robot to a desired position. When an operator applies a force to the robot, the desired balancing angle should therefore be modified to maintain stable balance. Suitable desired balancing angles for various force magnitudes applied by the operator were determined through experimental studies, which confirm the functionality of the proposed method.
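A minimal sketch of the angle-modification idea, assuming a simple proportional mapping from the sensed interaction force to the desired balancing angle; the gain, saturation limit, and interface below are hypothetical, not the values determined in the paper's experiments.

```python
import math

def desired_balance_angle(force_n: float, gain_rad_per_n: float = 0.01, max_angle_rad: float = 0.1) -> float:
    """Map the measured human interaction force (N) to a desired lean angle (rad).

    A push from the operator shifts the angle reference so the robot leans into the
    push; the angle is saturated to keep the balancing controller in a stable range.
    """
    angle = gain_rad_per_n * force_n
    return max(-max_angle_rad, min(max_angle_rad, angle))

# Example: a 5 N push from the operator yields a small forward lean reference.
print(math.degrees(desired_balance_angle(5.0)))
```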

Soar (State Operator and Result)와 ROS 연계를 통해 거절가능 HRI 태스크의 휴머노이드로봇 구현 (Implementation of a Refusable Human-Robot Interaction Task with Humanoid Robot by Connecting Soar and ROS)

  • 당반치엔;트란트렁틴;팜쑤언쭝;길기종;신용빈;김종욱
    • 로봇학회논문지 / Vol. 12, No. 1 / pp.55-64 / 2017
  • This paper proposes a combination of the cognitive agent architecture Soar (State, Operator, and Result) and ROS (Robot Operating System), which can serve as a basic framework for a robot agent to interact with and cope with its environment more intelligently and appropriately. The proposed Soar-ROS human-robot interaction (HRI) agent, implemented on a humanoid robot, understands a set of human commands via voice recognition and chooses how to react to each command according to the symbol detected by image recognition. The robotic agent is allowed to refuse to follow an inappropriate command such as "go" after it has seen the symbol 'X', which indicates that an abnormal or immoral situation has occurred. This simple but meaningful HRI task is successfully demonstrated on the proposed Soar-ROS platform with a small humanoid robot, which implies that extending the present hybrid platform to an artificial moral agent is possible.
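The refusal logic can be pictured as a simple rule of the kind a Soar production might encode. The sketch below is plain Python with hypothetical command and symbol names; it omits the actual Soar productions and ROS message plumbing.

```python
class RefusableAgent:
    """Toy agent that refuses the 'go' command after seeing the 'X' symbol."""

    def __init__(self) -> None:
        self.abnormal_seen = False  # set when the 'X' symbol has been recognized

    def observe_symbol(self, symbol: str) -> None:
        """Update internal state from the image-recognition result."""
        if symbol == "X":
            self.abnormal_seen = True

    def decide(self, command: str) -> str:
        """Choose a reaction to a voice command, refusing 'go' after seeing 'X'."""
        if command == "go" and self.abnormal_seen:
            return "refuse"
        return "execute:" + command

agent = RefusableAgent()
agent.observe_symbol("X")
print(agent.decide("go"))    # refuse
print(agent.decide("stop"))  # execute:stop
```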

감성 상호작용을 갖는 교육용 휴머노이드 로봇 D2 개발 (Design and implement of the Educational Humanoid Robot D2 for Emotional Interaction System)

  • 김도우;정기철;박원성
    • 대한전기학회:학술대회논문집 / 대한전기학회 2007년도 제38회 하계학술대회 / pp.1777-1778 / 2007
  • In this paper, we design and implement a humanoid robot for educational purposes that can collaborate and communicate with humans. We present an affective human-robot communication system for the humanoid robot D2, which we designed to communicate with a human through dialogue. D2 communicates with humans by understanding and expressing emotion using facial expressions, voice, gestures, and posture. Interaction between a human and the robot is made possible through our affective communication framework, which enables the robot to catch the emotional status of the user and respond appropriately; as a result, the robot can engage in natural dialogue with a human. To support interaction through voice, gestures, and posture, the developed educational humanoid robot consists of an upper body, two arms, a wheeled mobile platform, and control hardware, including vision and speech capability and various control boards such as motion control boards and a signal processing board handling several types of sensors. Using the educational humanoid robot D2, we present successful demonstrations consisting of manipulation tasks with two arms, object tracking using the vision system, and communication with humans through the emotional interface, synthesized speech, and the recognition of speech commands.
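As a rough illustration of the kind of emotion-to-response mapping such an affective framework performs (not D2's actual implementation), the following sketch maps a recognized user emotion to hypothetical expression and utterance choices.

```python
# Hypothetical response table: recognized user emotion -> (facial expression, utterance).
RESPONSES = {
    "happy":   ("smile",   "I'm glad you're happy!"),
    "sad":     ("concern", "Is something wrong? Can I help?"),
    "angry":   ("calm",    "I'm sorry. Let's take a moment."),
    "neutral": ("neutral", "What would you like to do next?"),
}

def respond(user_emotion: str) -> tuple[str, str]:
    """Pick a facial expression and utterance for the recognized user emotion."""
    return RESPONSES.get(user_emotion, RESPONSES["neutral"])

expression, utterance = respond("sad")
print(expression, "-", utterance)
```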


사람과 로봇의 상호작용을 통한 청소 로봇 알고리즘 (Cleaning Robot Algorithm through Human-Robot Interaction)

  • 김승용;김태형
    • 한국정보과학회논문지:소프트웨어및응용 / Vol. 35, No. 5 / pp.297-305 / 2008
  • Cleaning robots can be classified by their cleaning strategy, based on map building and localization, into random and mapping approaches. The random approach does not build a map and is therefore price-competitive, but its efficiency is low. In contrast, the mapping approach builds a map and cleans efficiently, but is relatively less price-competitive. To address the shortcomings of both approaches, this paper proposes a cleaning robot that does not rely on expensive sensor data: a human instead gives the robot information about the space to be cleaned, and the robot uses this information to clean more efficiently and at lower cost than existing cleaning robots. A performance comparison between existing cleaning robots and the proposed one demonstrates the efficiency of the proposed approach.
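A minimal sketch of the human-in-the-loop idea, assuming the user describes the cleaning area as a small occupancy grid (1 = cleanable, 0 = blocked) and the robot sweeps the free cells row by row; the grid encoding and traversal below are hypothetical, not the paper's algorithm.

```python
# User-provided description of the room: 1 = cleanable cell, 0 = obstacle/excluded.
room = [
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
]

def coverage_path(grid: list[list[int]]) -> list[tuple[int, int]]:
    """Boustrophedon (zig-zag) sweep over the free cells of a user-supplied grid."""
    path = []
    for r, row in enumerate(grid):
        cols = range(len(row)) if r % 2 == 0 else reversed(range(len(row)))
        path.extend((r, c) for c in cols if row[c] == 1)
    return path

print(coverage_path(room))
```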

인간의 비언어적 행동 특징을 이용한 다중 사용자의 상호작용 의도 분석 (Interaction Intent Analysis of Multiple Persons using Nonverbal Behavior Features)

  • 윤상석;김문상;최문택;송재복
    • 제어로봇시스템학회논문지 / Vol. 19, No. 8 / pp.738-744 / 2013
  • According to cognitive science research, the interaction intent of humans can be estimated through an analysis of their exhibited behaviors. This paper proposes a novel methodology for reliable intention analysis of humans based on this approach. To identify intention, eight behavioral features are extracted from four characteristics of human-human interaction, and we outline a set of core components of human nonverbal behavior. These nonverbal behaviors are captured by various recognition modules based on multimodal sensors: localizing the speaker's sound source in the audition part; recognizing the frontal face and facial expression in the vision part; and estimating human trajectories, body pose and leaning, and hand gestures in the spatial part. As a post-processing step, temporal confidence reasoning is used to improve recognition performance, and an integrated human model quantitatively classifies intention from the multi-dimensional cues by applying weight factors. Interactive robots can thus make informed engagement decisions to interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction.
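A simplified sketch of the weighted-cue integration step, assuming each recognition module reports a confidence in [0, 1] per person; the cue names, weights, and threshold are hypothetical.

```python
# Hypothetical weights for nonverbal cues reported by the recognition modules.
CUE_WEIGHTS = {
    "frontal_face": 0.3,
    "facing_trajectory": 0.2,
    "body_leaning": 0.2,
    "hand_gesture": 0.2,
    "speaking_towards_robot": 0.1,
}

def interaction_intent(cues: dict[str, float], threshold: float = 0.5) -> tuple[float, bool]:
    """Weighted sum of per-cue confidences; returns (score, engage?)."""
    score = sum(CUE_WEIGHTS[name] * cues.get(name, 0.0) for name in CUE_WEIGHTS)
    return score, score >= threshold

cues_person_1 = {"frontal_face": 0.9, "facing_trajectory": 0.8, "hand_gesture": 0.7}
print(interaction_intent(cues_person_1))
```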

Agent Mobility in Human Robot Interaction

  • Nguyen, To Dong;Oh, Sang-Rok;You, Bum-Jae
    • 대한전기학회:학술대회논문집 / 대한전기학회 2005년도 제36회 하계학술대회 논문집 D / pp.2771-2773 / 2005
  • In network human-robot interaction, a human can access the services of a robot system through the network. Communication is done by interacting with distributed sensors via voice and gestures, or by using a network access device such as a computer or PDA. Service organization and exploration are very important for this distributed system. In this paper, we propose a new agent-based framework that integrates the partners of this distributed system and helps users explore services effectively without complicated configuration. Our system consists of several robots, users, and distributed sensors. These partners are connected in a decentralized yet coordinated control system using agent-based technology. Several experiments were conducted successfully using our framework. They show that the framework increases the availability of the system and reduces the time that users and robots need to be connected to the network at the same time. The framework also provides coordination methods for the human-robot interaction system.


Human-Robot Interaction in Real Environments by Audio-Visual Integration

  • Kim, Hyun-Don;Choi, Jong-Suk;Kim, Mun-Sang
    • International Journal of Control, Automation, and Systems / Vol. 5, No. 1 / pp.61-69 / 2007
  • In this paper, we developed not only a reliable sound localization system, including a VAD (Voice Activity Detection) component, using three microphones, but also a face tracking system using a vision camera. Moreover, we propose a way to integrate these three systems in human-robot interaction to compensate for errors in localizing a speaker and to effectively reject unnecessary speech or noise signals entering from undesired directions. To verify the system's performance, we installed the proposed audio-visual system on a prototype robot called IROBAA (Intelligent ROBot for Active Audition) and demonstrated how to integrate the audio-visual system.
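A minimal sketch of the audio-visual fusion idea, assuming the sound localizer reports a bearing in degrees and the face tracker reports bearings of detected faces: a speech event is accepted only when it is voice-active and roughly aligned with a detected face. The thresholds and interfaces are hypothetical.

```python
def accept_speech(voice_active: bool, sound_bearing_deg: float,
                  face_bearings_deg: list[float], tol_deg: float = 15.0) -> bool:
    """Accept speech only if VAD fires and the sound direction matches a detected face."""
    if not voice_active:
        return False
    return any(abs(sound_bearing_deg - fb) <= tol_deg for fb in face_bearings_deg)

# Example: speech from 20 deg with a face detected near 25 deg is accepted;
# a signal from 90 deg with no nearby face is rejected as noise.
print(accept_speech(True, 20.0, [25.0, -40.0]))  # True
print(accept_speech(True, 90.0, [25.0, -40.0]))  # False
```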