• Title/Summary/Keyword: Multi-modal Interaction

Intelligent Emotional Interface for Personal Robot and Its Application to a Humanoid Robot, AMIET

  • Seo, Yong-Ho;Jeong, Il-Woong;Jung, Hye-Won;Yang, Hyun-S.
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2004.08a / pp.1764-1768 / 2004
  • In the near future, robots will be used for personal purposes. To provide useful services to humans, robots will need to understand human intentions. Consequently, the development of emotional interfaces for robots is an important expansion of human-robot interaction. We designed and developed an intelligent emotional interface for personal robots and applied it to our humanoid robot, AMIET. Subsequent human-robot interaction demonstrated that the interface is highly intuitive and friendly.

The Effect of AI Agent's Multi Modal Interaction on the Driver Experience in the Semi-autonomous Driving Context : With a Focus on the Existence of Visual Character (반자율주행 맥락에서 AI 에이전트의 멀티모달 인터랙션이 운전자 경험에 미치는 효과 : 시각적 캐릭터 유무를 중심으로)

  • Suh, Min-soo;Hong, Seung-Hye;Lee, Jeong-Myeong
    • The Journal of the Korea Contents Association / v.18 no.8 / pp.92-101 / 2018
  • As interactive AI speakers become popular, voice recognition is regarded as an important vehicle-driver interaction method in autonomous driving situations. The purpose of this study is to determine whether multimodal interaction, in which feedback is delivered both aurally and through a visual AI character on screen, optimizes the user experience better than the auditory mode alone. Participants performed music selection and adjustment tasks through the AI speaker while driving, and we measured information and system quality, presence, perceived usefulness and ease of use, and continuance intention. The analysis showed no multimodal effect of the visual character on most user experience factors, nor on continuance intention. Rather, the auditory-only mode was more effective than the multimodal mode for the information quality factor. In the semi-autonomous driving stage, which demands the driver's cognitive effort, multimodal interaction is therefore not more effective than single-mode interaction in optimizing the user experience.

Design of the emotion expression in multimodal conversation interaction of companion robot (컴패니언 로봇의 멀티 모달 대화 인터랙션에서의 감정 표현 디자인 연구)

  • Lee, Seul Bi;Yoo, Seung Hun
    • Design Convergence Study / v.16 no.6 / pp.137-152 / 2017
  • This research aims to develop a companion robot experience design for the elderly in Korea based on a needs-function deployment matrix and on research into robot emotion expression in multimodal interaction. First, elderly users' main needs were categorized into four groups based on ethnographic research. Second, the functional elements and physical actuators of the robot were mapped to user needs in a function-needs deployment matrix. The final UX design prototype was implemented as a robot with a verbal, non-touch multimodal interface and emotional facial expressions based on Ekman's Facial Action Coding System (FACS). The prototype was validated through a user test session that analyzed the influence of the robot interaction on users' cognition and emotion with a story recall test and facial emotion analysis software (Emotion API), under conditions in which the robot's facial expression matched the emotion of the information it delivered and in which the robot initiated the interaction cycle voluntarily. The group with the emotional robot showed a relatively high recall rate in the delayed recall test, and the facial expression analysis showed that the robot's facial expression and interaction initiation affected the emotion and preference of the elderly participants.
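A minimal sketch of how FACS-based emotion expression is commonly encoded, assuming prototypical Action Unit (AU) combinations and made-up actuator names; it illustrates the general technique only, not the implementation of the paper above:

```python
# Hypothetical sketch: mapping basic emotions to FACS Action Units (AUs) and then
# to robot face actuators. AU sets follow commonly cited prototypes; actuator
# names and the intensity model are made up for illustration.

EMOTION_TO_AUS = {
    "happiness": [6, 12],        # cheek raiser, lip corner puller
    "sadness":   [1, 4, 15],     # inner brow raiser, brow lowerer, lip corner depressor
    "surprise":  [1, 2, 5, 26],  # brow raisers, upper lid raiser, jaw drop
    "anger":     [4, 5, 7, 23],  # brow lowerer, lid raiser/tightener, lip tightener
}

AU_TO_ACTUATOR = {
    1: "brow_inner", 2: "brow_outer", 4: "brow_lower", 5: "lid_upper",
    6: "cheek", 7: "lid_tight", 12: "lip_corner_up", 15: "lip_corner_down",
    23: "lip_tight", 26: "jaw",
}

def expression_command(emotion: str, intensity: float = 1.0) -> dict:
    """Return actuator targets (0..1) for a requested basic emotion."""
    aus = EMOTION_TO_AUS.get(emotion, [])
    return {AU_TO_ACTUATOR[au]: intensity for au in aus}

if __name__ == "__main__":
    print(expression_command("happiness", 0.8))  # {'cheek': 0.8, 'lip_corner_up': 0.8}
```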

Emotion Recognition and Expression System of User using Multi-Modal Sensor Fusion Algorithm (다중 센서 융합 알고리즘을 이용한 사용자의 감정 인식 및 표현 시스템)

  • Yeom, Hong-Gi;Joo, Jong-Tae;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.18 no.1 / pp.20-26 / 2008
  • As intelligent robots and computers become more common, interaction between intelligent robots (computers) and humans is becoming increasingly important, and emotion recognition and expression are indispensable to that interaction. In this paper, we first extract emotional features from the speech signal and the facial image. We then apply Bayesian Learning (BL) and Principal Component Analysis (PCA) to classify five emotion patterns (normal, happy, anger, surprise, and sad), and we experiment with decision fusion and feature fusion to improve the recognition rate. In decision fusion, the output values of each recognition system are combined through fuzzy membership functions; in feature fusion, the best features are selected with the Sequential Forward Selection (SFS) method and fed to a Multi-Layer Perceptron (MLP) neural network that classifies the five emotion patterns. The recognized emotion is then applied to a 2D facial shape to express it.
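A minimal sketch of the decision-level fusion step described in the abstract above, assuming each modality outputs a membership value per emotion in [0, 1]; the weights, combination rule, and example scores are illustrative, not the authors' algorithm:

```python
import numpy as np

# Hypothetical decision-level fusion of speech- and face-based emotion scores.
# Labels match the abstract; the weighted-sum rule and the weights are assumptions.
EMOTIONS = ["normal", "happy", "anger", "surprise", "sad"]

def fuse_decisions(speech_scores, face_scores, w_speech=0.4, w_face=0.6):
    """Combine per-emotion membership values from two modalities and pick the winner."""
    fused = w_speech * np.asarray(speech_scores, float) + w_face * np.asarray(face_scores, float)
    return EMOTIONS[int(np.argmax(fused))], fused

if __name__ == "__main__":
    speech = [0.1, 0.6, 0.1, 0.1, 0.1]  # made-up membership values
    face   = [0.2, 0.5, 0.1, 0.1, 0.1]
    label, fused = fuse_decisions(speech, face)
    print(label, fused.round(2))        # -> happy
```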

The new paths of user interface #1 - The non-verbal communication for the interactive media - (사용자 인터페이스의 새로운 길 #1 - 인터렉티브 미디어를 위한 비언어적 의사소통 방법 -)

  • 류제성
    • Archives of Design Research / v.13 no.3 / pp.49-58 / 2000
  • We commonly use the computer interface in a generalized form. However, user requirements vary, and some users cannot adapt to these general circumstances. To address these requirements, this research proposes non-verbal communication: applying movements of the mouth in human behavior to interaction with the computer. This was offered in three forms: first, a drawing application; second, an arcade game; third, an interactive book. In conclusion, we confirmed that the proposal of this research could be used effectively in the development of human-computer interfaces.

Immersive Virtual Custom-made Model House (몰입감 있는 맞춤형 가상 모델하우스)

  • Hwang, Sun-Uk;Kim, Yeong-Mi;Seo, Yong-Won;Ko, Kwang-Hee;Ryu, Je-Ha;Lee, Kwan-Heng;Lee, Yong-Gu
    • Korean Journal of Computational Design and Engineering / v.13 no.1 / pp.8-17 / 2008
  • Placing a high value on individual preferences is a modern trend that more and more companies are considering in product design and development, and apartment design is no exception. Most apartments today are built with similar designs and little room for customization, while people in general want their tastes to be reflected in the design of their apartment. However, conveying what customers like to the construction company is not an easy task in practice. For this reason, an intuitive and effective medium for communication between the company and customers is needed, and in response to this necessity we developed a test platform for a virtual model house that provides users with apartment customization through haptic interactions. In our virtual environment, a user can explore an apartment, change the interior to their taste, and feel it through intuitive haptic interactions.

Multimodal Interface Control Module for Immersive Virtual Education (몰입형 가상교육을 위한 멀티모달 인터페이스 제어모듈)

  • Lee, Jaehyub;Im, SungMin
    • The Journal of Korean Institute for Practical Engineering Education / v.5 no.1 / pp.40-44 / 2013
  • This paper proposes a multimodal interface control module that allows a student to interact naturally with educational content in a virtual environment. The module recognizes the user's motion during interaction with the virtual environment and conveys that motion to it via wireless communication. Furthermore, a haptic actuator is incorporated into the module to generate haptic feedback. With the proposed module, a user can haptically sense a virtual object as if it existed in the real world.
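A minimal sketch of the data flow described above (motion in, haptic command out), with the wireless link mocked out; the message format, threshold, and function names are assumptions, not the authors' module:

```python
import json
import random

def read_motion_sample():
    """Pretend to read a 3-axis hand position from a motion sensor."""
    return {"x": random.random(), "y": random.random(), "z": random.random()}

def send_to_virtual_env(sample):
    """Stand-in for the wireless link; a real module would transmit over BT/Wi-Fi."""
    payload = json.dumps(sample)      # serialized message (assumed format)
    touching = sample["z"] < 0.2      # mocked contact test in the virtual scene
    return {"contact": touching, "echo": payload}

def drive_haptic_actuator(contact: bool) -> str:
    """Issue a vibration command when the virtual object is touched."""
    return "VIBRATE 0.5s" if contact else "IDLE"

if __name__ == "__main__":
    state = send_to_virtual_env(read_motion_sample())
    print(drive_haptic_actuator(state["contact"]))
```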

The influence of different support movements and heights of piers on the dynamic behavior of bridges -Part I: Earthquake acting transversely to the deck

  • Michaltsos, George T.;Raftoyiannis, Ioannis G.
    • Interaction and multiscale mechanics / v.2 no.4 / pp.431-454 / 2009
  • This paper presents a simple model for studying the dynamic response of multi-span bridges resting on piers of different heights and subjected to earthquake forces acting transversely to the bridge but varying spatially along its length. The analysis is carried out using the modal superposition technique, while the resulting integro-differential equations are solved via the Laplace transformation. It was found that the piers' heights and the quality of the foundation soil can significantly affect the dynamic behavior of such bridges. Typical examples showing the effectiveness of the method are presented, together with useful results.
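For reference, the modal superposition technique mentioned in the abstract takes the standard textbook form below; this is a generic summary, not the paper's specific formulation:

```latex
% Generic modal superposition (textbook form, not the paper's equations).
% The transverse deflection is expanded in the bridge's mode shapes,
% and each modal coordinate satisfies a decoupled single-DOF equation
% driven by the modal projection f_n(t) of the spatially varying excitation.
\[
  w(x,t) = \sum_{n=1}^{\infty} \Phi_n(x)\, q_n(t), \qquad
  \ddot{q}_n(t) + 2\zeta_n\omega_n\dot{q}_n(t) + \omega_n^2 q_n(t) = f_n(t)
\]
```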

The influence of different support movements and heights of piers on the dynamic behavior of bridges -Part II: earthquake acting along the bridge axis

  • Raftoyiannis, I.G.;Konstantakopoulos, T.G.;Michaltsos, G.T.
    • Interaction and multiscale mechanics / v.3 no.1 / pp.39-54 / 2010
  • In this paper, a simple approach is presented for studying the dynamic response of multi-span steel bridges supported by pylons of different heights and subjected to earthquake motions acting along the axis of the bridge with spatial variations. The analysis is carried out using the modal analysis technique, while the derived integro-differential equations are solved using the successive approximations technique. It was found that the height of the piers and the quality of the foundation soil can significantly affect the dynamic behavior of the bridges studied. Illustrative examples are presented to highlight the points of concern and to gather useful conclusions.

An Analysis of Quality Attributes and Service Satisfaction for Artificial Intelligence-based Guide Robot (인공지능 안내 로봇 서비스 만족도와 품질 속성 분석)

  • Miyoung Cho;Jaehong Kim;Daeha Lee;Minsu Jang
    • The Journal of Korea Robotics Society / v.18 no.2 / pp.216-224 / 2023
  • Guide robots that provide services in public places have recently emerged as a contactless solution amid the spread of COVID-19, and their use is growing. However, most guide robots provide the same level of intelligence and the same interaction across different and changing environments, so their usefulness is limited and customers' interest is quickly lost. To solve this problem, it is necessary to develop social intelligence that improves the robot's environmental and situational awareness and to maintain customer interest by providing personalized, situation-aware services. In this study, we developed guide robot services based on social HRI components that provide multi-modal context awareness. We evaluated service usefulness by measuring user satisfaction and frequency of service use through a survey, and we analyzed service quality attributes to identify the differentiating factors of guide robots based on social HRI components.