• Title/Summary/Keyword: Human-robot Interaction


Human-Computer Interaction Survey for Intelligent Robot (지능형 로봇을 위한 인간-컴퓨터 상호작용(HCI) 연구동향)

  • Hong, Seok-Ju;Lee, Chil-Woo
    • Proceedings of the Korea Contents Association Conference / 2006.11a / pp.507-511 / 2006
  • An intelligent robot is defined as a system that, analogously to a human, makes autonomous judgments based on sensory organs such as sight and hearing. Humans communicate using nonverbal means such as gesture in addition to language, and if a robot understands such nonverbal communication, it may become more familiar to humans. HCI (Human-Computer Interaction) technologies such as face recognition and gesture recognition are being studied vigorously, but many problems remain to be solved under real-world conditions. In this paper, we discuss the importance of this area and give application examples of the technology, focusing on recent research results in gesture recognition as one of the most natural ways of communicating with humans.


Functions and Driving Mechanisms for Face Robot Buddy (얼굴로봇 Buddy의 기능 및 구동 메커니즘)

  • Oh, Kyung-Geune;Jang, Myong-Soo;Kim, Seung-Jong;Park, Shin-Suk
    • The Journal of Korea Robotics Society / v.3 no.4 / pp.270-277 / 2008
  • The development of a face robot basically targets very natural human-robot interaction (HRI), especially emotional interaction, and the face robot introduced in this paper, named Buddy, is no exception. Since Buddy was developed for a mobile service robot, it does not have a lifelike face such as a human's or an animal's, but a typically robot-like face with hard skin, which may be suitable for mass production. In addition, its structure and mechanism had to be simple and its production cost low. This paper introduces the mechanisms and functions of the mobile face robot Buddy, which can take on natural and precise facial expressions and make dynamic gestures, all driven by a single laptop PC. Buddy can also perform lip-sync, eye contact, and face tracking for lifelike interaction. By adopting a customized emotional-reaction decision model, Buddy can form its own personality, emotions, and motives from various sensor inputs. Based on this model, Buddy can interact appropriately with users and perform real-time learning using personality factors. The interaction performance of Buddy is successfully demonstrated by experiments and simulations.


Robot-Human Task Sharing System for Assembly Process (조립 공정을 위한 로봇-사람 간 작업 공유 시스템)

  • Minwoo Na;Tae Hwa Hong;Junwan Yun;Jae-Bok Song
    • The Journal of Korea Robotics Society / v.18 no.4 / pp.419-426 / 2023
  • Assembly tasks are difficult to fully automate due to the uncertain errors that occur in unstructured environments. When assembling parts such as electrical connectors, advances in grasping and assembly technology have made it possible for a robot to assemble the connectors without human aid. However, some parts with tight assembly tolerances must still be assembled by humans, so task sharing through human-robot interaction is emerging as an alternative. The goal of this concept is to achieve shared autonomy, which reduces human effort in repetitive tasks. In this study, a task-sharing robotic system for the assembly process is proposed to achieve shared autonomy. The system consists of two parts: one for robotic grasping and assembly, and the other for monitoring the process for robot-human task sharing. Experimental results show that the robot and the human share tasks efficiently while completing assembly tasks successfully.

A Human Robot Interactive System 'RoJi'

  • Yoon, Joongsun
    • Journal of Mechanical Science and Technology / v.18 no.11 / pp.1900-1908 / 2004
  • A human-friendly interactive system based on the harmonious symbiotic coexistence of humans and robots is explored. Based on the interactive technology paradigm, a robotic cane is proposed that allows blind or visually impaired travelers to navigate safely and quickly through obstacles and other hazards faced by blind pedestrians. Robotic aids such as robotic canes require cooperation between humans and robots, and various methods for implementing the appropriate cooperative recognition, planning, and acting have been investigated. The issues discussed include the interaction between humans and robots, design issues for an interactive robotic cane, and behavior arbitration methodologies for navigation planning.

Emotional Interface Technologies for Service Robot (서비스 로봇을 위한 감성인터페이스 기술)

  • Yang, Hyun-Seung;Seo, Yong-Ho;Jeong, Il-Woong;Han, Tae-Woo;Rho, Dong-Hyun
    • The Journal of Korea Robotics Society / v.1 no.1 / pp.58-65 / 2006
  • An emotional interface is essential technology for a robot to provide proper service to the user. In this research, we developed emotional components for a service robot: a neural-network-based facial expression recognizer; emotion expression technologies based on 3D graphical facial expressions and joint movements that take the user's reaction into account; and behavior selection technology for emotion expression. We used our humanoid robots AMI and AMIET as test-beds for the emotional interface, and studied emotional interaction between a service robot and a user by integrating the developed technologies. Emotional interface technology can enhance the friendliness of interaction with a service robot and increase the diversity and added value of its services; in doing so, it can drive market growth and contribute to the popularization of robots.


Human emotional elements and external stimulus information-based Artificial Emotion Expression System for HRI (HRI를 위한 사람의 내적 요소 기반의 인공 정서 표현 시스템)

  • Oh, Seung-Won;Hahn, Min-Soo
    • Proceedings of the HCI Society of Korea Conference / 2008.02a / pp.7-12 / 2008
  • In human-robot interaction, the role of emotion is becoming more important; therefore, robots need an emotion expression mechanism similar to that of humans. In this paper, we propose a new emotion expression system based on psychological studies. It consists of five affective elements: emotion, mood, personality, tendency, and machine rhythm. Each element influences the emotion expression pattern in its own way according to its characteristics. As a result, even when exposed to the same external stimuli, each robot can show a different emotion expression pattern. The proposed system may contribute to more natural and human-friendly human-robot interaction and promote more intimate relationships between people and robots.


Effects of LED on Emotion-Like Feedback of a Single-Eyed Spherical Robot

  • Onchi, Eiji;Cornet, Natanya;Lee, SeungHee
    • Science of Emotion and Sensibility / v.24 no.3 / pp.115-124 / 2021
  • Non-verbal communication is important in human interaction: it provides a layer of information that complements the message being transmitted, and this type of information is not limited to human speakers. In human-robot communication, increasing the animacy of the robotic agent, for example by using non-verbal cues, can aid the expression of abstract concepts such as emotions. Given the physical limitations of artificial agents, robots can use light and movement to express equivalent emotional feedback. This study analyzes the effects of the LED and motion animations of a spherical robot on the emotion expressed by the robot. A within-subjects experiment was conducted at the University of Tsukuba in which participants rated 28 video samples of a robot interacting with a person, with the robot displaying different motions with and without light animations. The results indicated that adding LED animations changes the emotional impression of the robot along the valence, arousal, and dominance dimensions. Furthermore, people associated various situations with the robot's behavior. These stimuli can be used to modulate the intensity of the expressed emotion and enhance the interaction experience. This paper opens the possibility of designing more affective robots in the future using simple feedback.

Analysis of User's Eye Gaze Distribution while Interacting with a Robotic Character (로봇 캐릭터와의 상호작용에서 사용자의 시선 배분 분석)

  • Jang, Seyun;Cho, Hye-Kyung
    • The Journal of Korea Robotics Society / v.14 no.1 / pp.74-79 / 2019
  • In this paper, we develop a virtual experimental environment to investigate users' eye gaze in human-robot social interaction, and verify its potential for further studies. The system consists of a 3D robot character capable of hosting simple interactions with a user, and a gaze processing module that records which body part of the robot character the user is looking at, such as the eyes, mouth, or arms, regardless of whether the robot is stationary or moving. To verify that results acquired in this virtual environment align with those obtained from physically present robots, we conducted robot-guided quiz sessions with 120 participants and compared their gaze patterns with those reported in previous works. The results were as follows. First, when interacting with the robot character, users' gaze patterns showed statistics similar to those of human-human conversation. Second, an animated mouth on the robot character received longer attention than a stationary one. Third, nonverbal interactions such as leakage cues were also effective with the robot character, and the correct-answer ratios of the cued groups were higher. Finally, gender differences in users' gaze were observed, especially in the frequency of mutual gaze.

Is Robot Alive? : Young Children's Perception of a Teacher Assistant Robot in a Classroom (로봇은 살아 있을까? : 우리 반 교사보조로봇에 대한 유아의 인식)

  • Hyun, Eun-Ja;Son, Soo-Ryun
    • Korean Journal of Child Studies / v.32 no.4 / pp.1-14 / 2011
  • The purpose of this study was to investigate young children's perceptions of a teacher assistant robot, IrobiQ, in a kindergarten classroom. The subjects were 23 six-year-olds attending G kindergarten in E city, Korea, where the teacher assistant robot had been in operation since October 2008. Each child responded to questions assessing their perception of IrobiQ's identity across four domains: its biological, intellectual, emotional, and social identity. Some questions asked the child to affirm or deny certain characteristics of the robot, and others asked for the reasons behind the answer given. The results indicated that while the majority of children considered IrobiQ not a biological entity but a machine, they thought it could have emotions and be their playmate. The implications of these results are twofold: first, they force us to reconsider the traditional ontological categories applied to intelligent service robots in order to understand human-robot interaction; second, they open up an ecological perspective on the design of teacher assistant robots for young children in early childhood education settings.

Multi-modal Sensor System and Database for Human Detection and Activity Learning of Robot in Outdoor (실외에서 로봇의 인간 탐지 및 행위 학습을 위한 멀티모달센서 시스템 및 데이터베이스 구축)

  • Uhm, Taeyoung;Park, Jeong-Woo;Lee, Jong-Deuk;Bae, Gi-Deok;Choi, Young-Ho
    • Journal of Korea Multimedia Society / v.21 no.12 / pp.1459-1466 / 2018
  • Robots that detect humans and recognize their actions are important for human interaction, and much research has been conducted on them. Recently, deep learning technology has advanced, and learning-based robotics has become a major research area. Such studies require a database for training and evaluating intelligent human perception. In this paper, we propose conditions for a multi-modal sensor-based image database suited to security tasks, based on an analysis of image databases for detecting people and recognizing their behavior in outdoor environments while the robot is in operation.