• Title/Abstract/Keyword: Robot Interaction

Search results: 481 items (processing time: 0.027 s)

로봇활용수업이 학생의 상호작용 촉진에 미치는 효과 (The Effects of the Robot Based Instruction for Promotion of Students' Interaction)

  • 김경현
    • 공학교육연구 / Vol. 13, No. 6 / pp. 164-170 / 2010
  • This study examines the effects of robot-based instruction on promoting students' interaction in elementary school classrooms. The results show that robot-based instruction produced a balanced distribution of cognitive, affective, and metacognitive interaction, with a relatively high proportion of metacognitive interaction in particular. An analysis of the learners' detailed activities revealed that the robot medium acted as a mediator that actively facilitated communication among students, and that metacognitive interaction related to clarification and elaboration was actively fostered. Therefore, robot-based instruction was found to effectively promote students' interaction.


Safe and Reliable Intelligent Wheelchair Robot with Human Robot Interaction

  • Hyuk, Moon-In;Hyun, Joung-Sang;Kwang, Kum-Young
    • 제어로봇시스템학회: 학술대회논문집 / 제어로봇시스템학회 2001 ICCAS / pp. 120.1-120 / 2001
  • This paper proposes a prototype of a safe and reliable wheelchair robot with Human Robot Interaction (HRI). Since wheelchair users are usually handicapped, the wheelchair robot must guarantee safe and reliable motion while considering the user's intention. A single color CCD camera is mounted to input the user's commands based on human-friendly gestures, and an ultrasonic sensor array is used to sense the external motion environment. Face and hand directional gestures are used as the user's commands. By combining the user's command with the sensed environment configuration, the planner of the wheelchair robot selects an optimal motion (a minimal sketch of this fusion appears after this entry). We implement a prototype wheelchair robot, MR. HURI (Mobile Robot with Human Robot Interaction) ...

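The abstract above describes a planner that fuses the user's gesture command with ultrasonic readings to choose a motion, but gives no implementation details. The following Python sketch is therefore only an assumed illustration: the gesture names, the GESTURE_TO_MOTION table, the 0.5 m safety threshold, and select_motion are all hypothetical, not from the paper.

```python
# Minimal sketch of fusing a gesture command with ultrasonic range data
# to select a wheelchair motion. All names, thresholds, and the sensor
# layout are illustrative assumptions, not taken from the paper.

SAFE_DISTANCE_M = 0.5  # assumed safety threshold in meters

# Assumed mapping from recognized gestures to candidate motions.
GESTURE_TO_MOTION = {
    "face_left": "turn_left",
    "face_right": "turn_right",
    "hand_forward": "go_forward",
    "hand_stop": "stop",
}

def select_motion(gesture, ultrasonic_ranges):
    """Pick a motion from the user's gesture, overridden by obstacle safety."""
    desired = GESTURE_TO_MOTION.get(gesture, "stop")

    # Safety check: block any motion that would move toward a close obstacle.
    direction_of = {"go_forward": "front", "turn_left": "left", "turn_right": "right"}
    direction = direction_of.get(desired)
    if direction is not None and ultrasonic_ranges.get(direction, float("inf")) < SAFE_DISTANCE_M:
        return "stop"
    return desired

if __name__ == "__main__":
    ranges = {"front": 0.3, "left": 1.2, "right": 2.0}
    print(select_motion("hand_forward", ranges))  # -> "stop" (obstacle ahead)
    print(select_motion("face_left", ranges))     # -> "turn_left"
```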

지능형 서비스 로봇을 위한 문맥독립 화자인식 시스템 (Context-Independent Speaker Recognition in URC Environment)

  • 지미경;김성탁;김회린
    • 로봇학회논문지 / Vol. 1, No. 2 / pp. 158-162 / 2006
  • This paper presents a speaker recognition system intended for use in human-robot interaction. The proposed system achieves significantly high performance in the Ubiquitous Robot Companion (URC) environment. The URC concept is a scenario in which a robot is connected to a server through a broadband connection, allowing functions to be performed on the server side, thereby greatly reducing the robot's stand-alone functions and the cost of the robot client. Instead of giving the robot (client) on-board cognitive capabilities, the sensing and processing work is outsourced to a central computer (server) connected to the high-speed Internet, with only the moving capability provided by the robot (a minimal sketch of this client-server split appears after this entry). Our aim is to enhance human-robot interaction by increasing the performance of speaker recognition with multiple microphones on the robot side in adverse distant-talking environments. Our speaker recognizer provides the URC project with a basic interface for human-robot interaction.

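The URC split described above (the robot only captures and forwards audio; the server does all recognition) can be illustrated with a small sketch. Nothing below comes from the paper: the class names, the hashing "score", and the enrolled-speaker table are placeholders, and the real system would stream audio over a broadband network rather than call a local object.

```python
# Minimal sketch of the URC-style split described in the abstract: the robot
# client only captures audio and forwards it; all recognition runs on a server.
# The class names, the scoring stub, and the enrolled-speaker table are
# illustrative assumptions, not the paper's implementation.

import hashlib

class SpeakerRecognitionServer:
    """Server side: holds enrolled speaker models and does all heavy processing."""

    def __init__(self, enrolled_speakers):
        self.enrolled = enrolled_speakers  # name -> reference "model" (stubbed)

    def identify(self, audio_bytes):
        # Stand-in for real feature extraction and speaker scoring.
        fingerprint = hashlib.sha256(audio_bytes).hexdigest()
        return max(self.enrolled, key=lambda name: self._score(fingerprint, name))

    def _score(self, fingerprint, name):
        # Toy similarity: matching characters between two hashes (placeholder only).
        ref = hashlib.sha256(name.encode()).hexdigest()
        return sum(a == b for a, b in zip(fingerprint, ref))


class RobotClient:
    """Robot side: no on-board recognition, just capture-and-forward."""

    def __init__(self, server):
        self.server = server  # in the real URC setup this is a broadband link

    def on_microphone_frames(self, frames):
        audio = b"".join(frames)            # combine the multi-microphone frames
        return self.server.identify(audio)  # outsource recognition to the server


if __name__ == "__main__":
    server = SpeakerRecognitionServer({"alice": None, "bob": None})
    robot = RobotClient(server)
    print(robot.on_microphone_frames([b"\x01\x02", b"\x03\x04"]))
```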

서비스 로봇을 위한 리액티브 감정 생성 모델 (Design of Reactive Emotion Process for the Service Robot)

  • 김형록;김영민;박종찬;박경숙;강태운;권동수
    • 로봇학회논문지 / Vol. 2, No. 2 / pp. 119-128 / 2007
  • Emotional interaction between humans and robots is an important element of natural interaction, especially for service robots. We propose a hybrid emotion generation architecture and a detailed design of the reactive process within that architecture, based on insights into the human emotion system. Reactive emotion generation aims to increase the task performance and believability of the service robot (a minimal sketch of such a reactive rule appears after this entry). Experimental results show that the reactive process can plausibly serve these purposes, and that reciprocal interaction between the different layers is important for the proper functioning of the robot's emotion generation system.

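The abstract describes a reactive process that maps incoming stimuli directly to quick emotional reactions, separate from a slower deliberative layer. The paper's actual design is not reproduced here; the following is only a minimal Python sketch with assumed stimulus names, reaction labels, intensities, and decay behavior.

```python
# Minimal sketch of a reactive emotion rule: immediate stimulus -> quick
# emotional reaction, with a simple decay back to neutral. Stimulus names,
# reaction labels, intensities, and decay rate are illustrative assumptions.

REACTIVE_RULES = {
    "loud_noise":   ("surprise", 0.9),
    "user_smile":   ("joy",      0.6),
    "sudden_touch": ("surprise", 0.7),
    "obstacle_hit": ("distress", 0.8),
}

class ReactiveEmotionProcess:
    def __init__(self, decay=0.1):
        self.decay = decay
        self.state = ("neutral", 0.0)  # (emotion label, intensity)

    def on_stimulus(self, stimulus):
        """React immediately if a rule matches; otherwise keep the current state."""
        if stimulus in REACTIVE_RULES:
            label, intensity = REACTIVE_RULES[stimulus]
            # Only override if the new reaction is at least as strong as the current one.
            if intensity >= self.state[1]:
                self.state = (label, intensity)
        return self.state

    def tick(self):
        """Per-cycle decay toward neutral, so reactions fade without new input."""
        label, intensity = self.state
        intensity = max(0.0, intensity - self.decay)
        self.state = ("neutral", 0.0) if intensity == 0.0 else (label, intensity)
        return self.state

if __name__ == "__main__":
    reactive = ReactiveEmotionProcess()
    print(reactive.on_stimulus("loud_noise"))  # ('surprise', 0.9)
    for _ in range(3):
        print(reactive.tick())                 # intensity decays each cycle
```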

A Robot Motion Authoring Using Finger-Robot Interaction

  • Kim, Yoon-Sang;Seok, Kwang-Ho;Lee, Chang-Mug;Kwon, Oh-Young
    • Journal of information and communication convergence engineering / Vol. 8, No. 2 / pp. 180-184 / 2010
  • This paper proposes a robot motion authoring method using finger-robot interaction. The proposed method is a user-friendly way to author (create and control) robot motion according to the number of fingers (a minimal sketch of such a mapping appears after this entry). The effectiveness of the proposed motion authoring method was verified through a motion authoring simulation of an industrial robot.
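
The abstract only states that motions are authored according to the number of fingers; the mapping below is purely an assumed illustration, and the finger counts, primitive names, and authoring loop are not from the paper.

```python
# Minimal sketch of finger-count-driven motion authoring: each detected finger
# count appends an assumed motion primitive to a motion script. The counts,
# primitive names, and sentinel value are illustrative, not from the paper.

FINGER_TO_PRIMITIVE = {
    1: "move_joint_1",
    2: "move_joint_2",
    3: "open_gripper",
    4: "close_gripper",
    5: "go_home",
}

def author_motion(finger_counts):
    """Turn a sequence of detected finger counts into a motion script."""
    script = []
    for count in finger_counts:
        if count == 0:            # assumed sentinel: a fist ends authoring
            break
        primitive = FINGER_TO_PRIMITIVE.get(count)
        if primitive is not None:
            script.append(primitive)
    return script

if __name__ == "__main__":
    # e.g. counts reported by a hand tracker: 1, 3, 4, then a fist (0).
    print(author_motion([1, 3, 4, 0, 5]))  # ['move_joint_1', 'open_gripper', 'close_gripper']
```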

사람과 로봇의 사회적 상호작용을 위한 로봇의 가치효용성 기반 동기-감정 생성 모델 (Robot's Motivational Emotion Model with Value Effectiveness for Social Human and Robot Interaction)

  • 이원형;박정우;김우현;이희승;정명진
    • 제어로봇시스템학회논문지 / Vol. 20, No. 5 / pp. 503-512 / 2014
  • People would like to be socially engaged not only with humans but also with robots. One of the most common ways in the robotics field to enhance human-robot interaction is to use emotion and integrate emotional concepts into robots. Many researchers have focused on developing a robot's emotional expressions. However, it is first necessary to establish the psychological background of a robot's emotion generation model in order to implement the whole process of a robot's emotional behavior. Therefore, this article proposes a robot's motivational emotion model with value effectiveness, drawing on Higgins' definition of motivation, regulatory focus theory, and the Circumplex model. For the test, a best-two-out-of-three game is introduced, and each step of the game was evaluated by the proposed model (a minimal sketch of such a per-step evaluation appears after this entry). The results imply that the proposed model generated psychologically appropriate emotions for the robot in the given situation. An empirical survey remains as future work to confirm that this research improves social human-robot interaction.
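
The paper derives the robot's emotions from motivation and value effectiveness; that model is not reproduced here. Below is only an assumed Circumplex-style sketch in which each round of a best-two-out-of-three game nudges a (valence, arousal) state that is then mapped to an emotion label; the update amounts, thresholds, and labels are illustrative.

```python
# Minimal Circumplex-style sketch: each round of a best-two-out-of-three game
# nudges a (valence, arousal) state, which is then mapped to an emotion label.
# The update amounts, thresholds, and labels are illustrative assumptions and
# are not the paper's motivation/value-effectiveness model.

def update_state(valence, arousal, robot_won_round, stakes):
    """Nudge valence by win/loss and arousal by how much is at stake (0..1)."""
    valence += 0.5 if robot_won_round else -0.5
    arousal = min(1.0, arousal + 0.3 * stakes)
    return max(-1.0, min(1.0, valence)), arousal

def circumplex_label(valence, arousal):
    """Map the 2-D state to a coarse emotion quadrant (assumed labels)."""
    if valence >= 0:
        return "excited" if arousal >= 0.5 else "content"
    return "distressed" if arousal >= 0.5 else "bored"

if __name__ == "__main__":
    valence, arousal = 0.0, 0.2
    # Three rounds: lose, win, win; the deciding round carries higher stakes.
    for won, stakes in [(False, 0.3), (True, 0.5), (True, 1.0)]:
        valence, arousal = update_state(valence, arousal, won, stakes)
        print(circumplex_label(valence, arousal), round(valence, 2), round(arousal, 2))
```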

얼굴로봇 Buddy의 기능 및 구동 메커니즘 (Functions and Driving Mechanisms for Face Robot Buddy)

  • 오경균;장명수;김승종;박신석
    • 로봇학회논문지 / Vol. 3, No. 4 / pp. 270-277 / 2008
  • The development of a face robot basically targets very natural human-robot interaction (HRI), especially emotional interaction. So does the face robot introduced in this paper, named Buddy. Since Buddy was developed for a mobile service robot, it does not have a lifelike face such as a human's or an animal's, but a typical robot-like face with hard skin, which may be suitable for mass production. In addition, its structure and mechanism had to be simple and its production cost low enough. This paper introduces the mechanisms and functions of the mobile face robot Buddy, which can take on natural and precise facial expressions and make dynamic gestures, all driven by a single laptop PC. Buddy can also perform lip-sync, eye contact, and face tracking for lifelike interaction. By adopting a customized emotional reaction decision model, Buddy can create its own personality, emotion, and motive from various sensor inputs (a minimal sketch of such a personality-weighted reaction decision appears after this entry). Based on this model, Buddy can interact properly with users and perform real-time learning using personality factors. The interaction performance of Buddy is successfully demonstrated by experiments and simulations.

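The abstract mentions a customized emotional reaction decision model that turns sensor input into emotion and motive, modulated by personality, but the model itself is not given in the abstract. The following is only an assumed Python sketch: the sensor events, personality factors, impact table, and threshold are illustrative.

```python
# Minimal sketch of a personality-weighted emotional reaction decision:
# sensor events carry a base emotional impact, and personality factors scale
# that impact before a reaction is chosen. Event names, personality factors,
# and thresholds are illustrative assumptions, not Buddy's actual model.

PERSONALITY = {"extraversion": 0.8, "neuroticism": 0.3}  # assumed 0..1 factors

# Assumed base impact of each sensor event: (emotion, intensity, trait that scales it).
EVENT_IMPACT = {
    "face_detected": ("joy",      0.5, "extraversion"),
    "loud_noise":    ("fear",     0.6, "neuroticism"),
    "user_speech":   ("interest", 0.4, "extraversion"),
}

def decide_reaction(event, personality=PERSONALITY, threshold=0.3):
    """Scale an event's base emotional impact by personality and pick a reaction."""
    if event not in EVENT_IMPACT:
        return ("neutral", 0.0)
    emotion, base, trait = EVENT_IMPACT[event]
    intensity = base * personality.get(trait, 0.5)
    return (emotion, intensity) if intensity >= threshold else ("neutral", intensity)

if __name__ == "__main__":
    print(decide_reaction("face_detected"))  # ('joy', 0.4) with the assumed factors
    print(decide_reaction("loud_noise"))     # below threshold -> ('neutral', 0.18)
```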