• Title/Abstract/Keyword: Robot Interaction


Adaptive Particle Filter와 Active Appearance Model을 이용한 얼굴 특징 추적 (Facial Feature Tracking Using Adaptive Particle Filter and Active Appearance Model)

  • 조덕현;이상훈;서일홍
    • The Journal of Korea Robotics Society / Vol. 8 No. 2 / pp. 104-115 / 2013
  • For natural human-robot interaction, we need to know the location and shape of facial features in real environments. Facial features can be tracked robustly by combining a particle filter with an active appearance model; however, the processing speed of this combined method is too slow. In this paper, we propose two ideas to improve its efficiency: first, changing the number of particles according to the situation, and second, switching the prediction model according to the situation. Experimental results show that the proposed method is about three times faster than the plain combination of particle filter and active appearance model, while its tracking performance is maintained.
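The paper's first idea, varying the particle count with the tracking situation, can be sketched as a simple heuristic. The rule below (scaling the count with the spread of the particle cloud) and all thresholds are illustrative assumptions for the sketch, not the authors' actual scheme:

```python
import statistics

def adapt_particle_count(positions, n_min=50, n_max=500, spread_ref=20.0):
    """Use fewer particles when the particle cloud is tight (stable
    tracking) and more when it spreads out (uncertain tracking).
    `spread_ref` is the spread at which the full budget is used."""
    spread = statistics.pstdev(positions)   # spread of the particle cloud
    frac = min(spread / spread_ref, 1.0)    # 0 = confident, 1 = lost
    return n_min + round((n_max - n_min) * frac)
```

A tightly clustered cloud (e.g., all particles near one facial-feature hypothesis) drops the budget to `n_min`, cutting per-frame cost in exactly the situations where extra particles add nothing.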

멀티모달 상호작용 중심의 로봇기반교육 콘텐츠를 활용한 r-러닝 시스템 사용의도 분석 (A Study on the Intention to Use a Robot-based Learning System with Multi-Modal Interaction)

  • 오준석;조혜경
    • Journal of Institute of Control, Robotics and Systems / Vol. 20 No. 6 / pp. 619-624 / 2014
  • This paper introduces a robot-based learning system designed to teach multiplication to children. In addition to a small humanoid and a smart device delivering educational content, we employ a type of mixed-initiative operation that provides enhanced multi-modal cognition to the r-learning system through human intervention. To investigate the major factors that influence people's intention to use the r-learning system, and to see how multi-modality affects these factors, we performed a user study based on the Technology Acceptance Model (TAM). The results indicate that system quality and natural interaction are key factors in adoption of the r-learning system, and they also reveal interesting implications about human behavior.

모바일 기기를 통한 지능형 로봇의 인간-로봇 상호작용 (Human-Robot Interaction by Mobile Device for Intelligence Robot)

  • 최병기;곽별샘;박춘성;이재호
    • Proceedings of the 2010 Korea Computer Congress (KIISE), Vol. 37 No. 1(B) / pp. 43-46 / 2010
  • Human-robot interaction for intelligent robots is currently one of the most actively studied research areas worldwide. As a way to improve task completion by intelligent robots, this paper proposes a form of human-robot interaction in which the robot discloses its task status to the user and invites the user to intervene when a problem occurs. This kind of service allows an intelligent robot to respond flexibly to unexpected situations through the user's choices when needed during task execution, while also providing the user with a means to monitor the robot's current status. To this end, we implement robot-user information exchange between the Android platform and the robot, and further suggest research directions toward general-purpose human-robot interaction.


Biosign Recognition based on the Soft Computing Techniques with Application to a Rehab-type Robot

  • Lee, Ju-Jang
    • Proceedings of ICCAS 2001 (ICROS) / pp. 29.2-29 / 2001
  • For the design of human-centered systems in which a human and a machine such as a robot form a human-in-the-loop system, a human-friendly interaction/interface is essential. Human-friendly interaction is possible when the system can recognize human biosigns such as EMG signals, hand gestures, and facial expressions, so that human intention and/or emotion can be inferred and used as a proper feedback signal. In this talk, we report our experiences applying soft computing techniques, including fuzzy logic, ANN, GA, and rough set theory, to efficiently recognize various biosigns and to perform effective inference. More specifically, we first observe the characteristics of various forms of biosigns and propose a new way of extracting feature sets for such signals. Then we show a standardized procedure for inferring intention or emotion from the signals. Finally, we present application examples for our rehabilitation robot.


Deep Level Situation Understanding for Casual Communication in Humans-Robots Interaction

  • Tang, Yongkang;Dong, Fangyan;Yoichi, Yamazaki;Shibata, Takanori;Hirota, Kaoru
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 15 No. 1 / pp. 1-11 / 2015
  • A concept of deep level situation understanding is proposed to realize human-like natural communication (called casual communication) among multiple agents (e.g., humans and robots/machines). Deep level situation understanding consists of surface level understanding (such as gesture/posture understanding, facial expression understanding, and speech/voice understanding), emotion understanding, intention understanding, and atmosphere understanding, achieved by applying the customized knowledge of each agent and by taking thoughtfulness into consideration. The proposal aims to reduce the burden on humans in human-robot interaction, to realize harmonious communication by excluding unnecessary troubles and misunderstandings among agents, and ultimately to help create a peaceful, happy, and prosperous human-robot society. A simulated experiment validates the deep level situation understanding system on a scenario in which a meeting-room reservation is made between a human employee and a secretary robot. The system is intended to be applied in service robot systems for smoothing communication and avoiding misunderstanding among agents.
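The four understanding layers described above can be read as a pipeline in which each stage refines the output of the previous one using per-agent knowledge. The sketch below is only a schematic illustration of that layering; the rule tables and labels are invented for the example and are not from the paper:

```python
def deep_level_understanding(observation, knowledge):
    """Chain surface -> emotion -> intention -> atmosphere understanding,
    each layer using the agent's customized knowledge tables."""
    # Surface level: raw gesture / facial expression / speech cues.
    surface = {k: observation.get(k) for k in ("gesture", "expression", "speech")}
    # Emotion and intention are looked up from per-agent knowledge.
    emotion = knowledge["emotions"].get(surface["expression"], "neutral")
    intention = knowledge["intentions"].get(surface["speech"], "unknown")
    # Atmosphere summarizes the emotional tone of the exchange.
    atmosphere = "tense" if emotion in ("angry", "anxious") else "relaxed"
    return {"surface": surface, "emotion": emotion,
            "intention": intention, "atmosphere": atmosphere}
```

In the meeting-room scenario, a secretary robot with `{"book a room": "reserve"}` in its intention table would map a smiling employee's request to a reservation intent in a relaxed atmosphere.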

강인한 손가락 끝 추출과 확장된 CAMSHIFT 알고리즘을 이용한 자연스러운 Human-Robot Interaction을 위한 손동작 인식 (A Robust Fingertip Extraction and Extended CAMSHIFT based Hand Gesture Recognition for Natural Human-like Human-Robot Interaction)

  • 이래경;안수용;오세영
    • Journal of Institute of Control, Robotics and Systems / Vol. 18 No. 4 / pp. 328-336 / 2012
  • In this paper, we propose robust fingertip extraction and extended Continuously Adaptive Mean Shift (CAMSHIFT) based hand gesture recognition for natural, human-like HRI (Human-Robot Interaction). First, for efficient and rapid hand detection, hand candidate regions are segmented by combining a robust $YC_bC_r$ skin color model with Haar-like feature based AdaBoost. Using the extracted hand candidate regions, we estimate the palm region and fingertip position from distance-transform-based voting and the geometrical features of hands. From the hand orientation and palm center position, we find the optimal fingertip position and its orientation. Then, using extended CAMSHIFT, we reliably track the 2D hand gesture trajectory with the extracted fingertip. Finally, we apply conditional density propagation (CONDENSATION) to recognize pre-defined temporal motion trajectories. Experimental results show that the proposed algorithm not only rapidly extracts the hand region, with an accurately extracted fingertip and angle, but also robustly tracks the hand under different illumination, size, and rotation conditions. Using these results, we successfully recognize multiple hand gestures.
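CAMSHIFT repeatedly applies a mean-shift step, moving a search window to the weighted centroid of a probability map, while also adapting the window size each frame. A minimal pure-Python sketch of just the mean-shift core (fixed window and a toy probability map; a real tracker would use the backprojection of the skin-color histogram):

```python
def mean_shift(prob, cx, cy, half_win=2, iters=20):
    """Move window center (cx, cy) to the weighted centroid of the
    probability map `prob` (a list of equal-length rows) until it stops."""
    h, w = len(prob), len(prob[0])
    for _ in range(iters):
        x0, x1 = max(0, cx - half_win), min(w - 1, cx + half_win)
        y0, y1 = max(0, cy - half_win), min(h - 1, cy + half_win)
        m00 = m10 = m01 = 0.0            # zeroth and first image moments
        for y in range(y0, y1 + 1):
            for x in range(x0, x1 + 1):
                p = prob[y][x]
                m00 += p
                m10 += p * x
                m01 += p * y
        if m00 == 0:                     # window sees no probability mass
            break
        nx, ny = round(m10 / m00), round(m01 / m00)
        if (nx, ny) == (cx, cy):         # converged on the local mode
            break
        cx, cy = nx, ny
    return cx, cy
```

The "extended" variant in the paper additionally carries the fingertip estimate along with the window; the moment computation above is the shared core.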

동물로봇과의 상호작용에 따른 치매노인의 인지기능, 기분상태, 문제행동 및 반응의 변화 (Cognitive Function, Mood, Problematic Behavior and Response to Interaction with Robot Pet by Elders with Dementia)

  • 임난영;강현숙;박영숙;안동현;오진환;송정희
    • Journal of Korean Academy of Fundamentals of Nursing / Vol. 16 No. 2 / pp. 223-231 / 2009
  • Purpose: The purpose of this study was to identify the effects of interaction with a robot pet on the cognitive function, mood, problematic behaviors, and responses of elderly people with dementia. Method: A methodological triangulation design with quantitative and qualitative methods was used. The participants were 9 elderly people with dementia. The intervention was conducted twice a week for 4 weeks. Qualitative data were collected by interviews and video recording for analysis of the participants' responses. Results: 1) Cognitive function, mood, and problematic behaviors did not show any significant differences after the program. 2) Analysis of the responses showed increases in verbal communication and positive actions. Conclusion: The robot pet program had positive effects such as increased communication and interaction. Therefore, this program could be considered an effective program for emotional support of elderly people with dementia. However, further repeated studies are needed to validate the results.


Ontological Robot System for Communication

  • Yamaguchi, Toru;Sato, Eri;Higuchi, Katsutaka
    • Proceedings of ISIS 2003 (Korea Fuzzy Logic and Intelligent Systems Society) / pp. 130-133 / 2003
  • The robot has recently emerged as a factor in the daily lives of humans, taking the form of a mechanical pet or similar source of entertainment. A robot system designed to co-exist with humans, i.e., a coexistence-type robot system, must be able to exist in various environments with people and to value physical, informational, and emotional interaction with them. When studying the impact of intimacy in the human/robot relationship, we have to examine the problems that can arise from physical intimacy (safety coordination on both the hardware and software sides). We should also consider the informational aspects of intimacy (recognition technology, and information transport and sharing). This paper reports the interim results of research on a system configuration that enhances the physical intimacy relationship in the symbiosis of humans and robots.


외부 환경 감지 센서 모듈을 이용한 소프트웨어 로봇의 감정 모델 구현 (Implementation of Emotional Model of Software Robot Using the Sensor Modules for External Environments)

  • 이준용;김창현;이주장
    • Proceedings of the 2005 KFIS Autumn Conference, Vol. 15 No. 2 / pp. 179-184 / 2005
  • Recently, modeling the emotion of a robot has become an active research issue in the fields of humanoid robots and human-robot interaction. In particular, modeling a robot's motivation, emotion, and behavior is difficult and requires original approaches. In this paper, a new model using mathematical formulations to represent emotion and behavior selection is proposed for a software robot with virtual sensor modules. The various factors that affect six emotional states, such as happiness or sadness, are formulated as simple exponential equations with various parameters. Several experiments with seven external sensor inputs from a virtual environment and a human were conducted to evaluate the model.
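The "simple exponential equations" for the six emotional states can be illustrated with a toy update rule: each emotion decays exponentially and is excited by a matching sensor stimulus. The specific form and the decay/gain constants below are assumptions for the sketch, not the paper's actual formulation:

```python
import math

EMOTIONS = ("happiness", "sadness", "anger", "fear", "surprise", "disgust")

def update_emotions(state, stimuli, decay=0.1, gain=0.5):
    """One time step: each emotion decays exponentially toward zero and
    is excited by its matching (virtual) sensor stimulus s >= 0:
        e' = e * exp(-decay) + gain * (1 - exp(-s))."""
    new_state = {}
    for emo in EMOTIONS:
        s = stimuli.get(emo, 0.0)
        e = state.get(emo, 0.0) * math.exp(-decay) + gain * (1.0 - math.exp(-s))
        new_state[emo] = min(1.0, e)     # keep each state in [0, 1]
    return new_state
```

Behavior selection could then simply pick the emotion with the highest value, which is one common way such models drive a software robot's visible behavior.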


외부 환경 감지 센서 모듈을 이용한 소프트웨어 로봇의 감정 모델 구현 (Implementation of Emotional Model of Software Robot Using the Sensor Modules for External Environments)

  • 이준용;김창현;이주장
    • Journal of Korean Institute of Intelligent Systems / Vol. 16 No. 1 / pp. 37-42 / 2006
  • Recently, modeling the emotion of a robot has become an active research issue in the fields of humanoid robots and human-robot interaction. In particular, modeling a robot's motivation, emotion, and behavior is difficult and requires original approaches. In this paper, a new model using mathematical formulations to represent emotion and behavior selection is proposed for a software robot with virtual sensor modules. The various factors that affect six emotional states, such as happiness or sadness, are formulated as simple exponential equations with various parameters. Several experiments with seven external sensor inputs from a virtual environment and a human were conducted to evaluate the model.