• Title/Summary/Keyword: Emotional Expressions

A Study on Visual Perception based Emotion Recognition using Body-Activity Posture (사용자 행동 자세를 이용한 시각계 기반의 감정 인식 연구)

  • Kim, Jin-Ok
    • The KIPS Transactions:PartB
    • /
    • v.18B no.5
    • /
    • pp.305-314
    • /
    • 2011
  • Research into the visual perception of human emotion for recognizing intention has traditionally focused on facial expressions. Researchers have recently turned to the more challenging problem of recognizing emotion from body posture or activity. The proposed work recognizes basic emotional categories from body postures using a neural model inspired by the visual perception mechanisms of neurophysiology. In keeping with information-processing models of the visual cortex, this work constructs a biologically plausible hierarchy of neural detectors that can discriminate six basic emotional states from static views of the associated activity postures. The proposed model is tolerant to parameter variations, and its feasibility is demonstrated by evaluating it against human test subjects on a set of activity postures.
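
The paper's biologically inspired detector hierarchy is not reproduced here, but as a rough, hypothetical illustration of the task it addresses (mapping static body-posture descriptors to six basic emotions), the sketch below trains a small feed-forward classifier on made-up pose features; the feature layout, data, and layer sizes are assumptions, not the authors' model.

```python
# Illustrative only: a generic feed-forward classifier over pose features,
# standing in for a hierarchy of emotion detectors. All data are synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

rng = np.random.default_rng(0)
# Hypothetical data: each row is a flattened static body-posture descriptor
# (e.g., normalized joint coordinates); labels index the six basic emotions.
X = rng.normal(size=(600, 34))
y = rng.integers(0, len(EMOTIONS), size=600)

# Two hidden layers loosely mimic a coarse hierarchy of feature detectors.
clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X, y)

posture = rng.normal(size=(1, 34))
print("predicted emotion:", EMOTIONS[int(clf.predict(posture)[0])])
```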

A Study on Emotion Recognition Systems based on the Probabilistic Relational Model Between Facial Expressions and Physiological Responses (생리적 내재반응 및 얼굴표정 간 확률 관계 모델 기반의 감정인식 시스템에 관한 연구)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.19 no.6
    • /
    • pp.513-519
    • /
    • 2013
  • Current vision-based approaches to emotion recognition, such as facial expression analysis, have many technical limitations in real circumstances and are not suitable for practical applications when used on their own. In this paper, we propose an approach to emotion recognition that combines extrinsic representations and intrinsic activities among the natural responses of humans given specific stimuli for inducing emotional states. The intrinsic activities can be used to compensate for the uncertainty of the extrinsic representations of emotional states. This combination is performed using PRMs (Probabilistic Relational Models), which are extended versions of Bayesian networks and are learned with greedy-search and expectation-maximization algorithms. Facial-expression-based extrinsic emotion features and physiological-signal-based intrinsic emotion features from previous research are combined into the attributes of the PRMs in the emotion recognition domain. Maximum likelihood estimation with the given dependency structure and the estimated parameter set is used to classify the label of the target emotional state.
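
As a loose illustration of fusing an extrinsic (facial) cue with an intrinsic (physiological) cue and classifying by maximum likelihood, the sketch below uses a count-based, naive-Bayes-style model; this is a deliberate simplification, not the paper's PRM, and all variable names and data are invented.

```python
# Simplified stand-in for probabilistic cue fusion: maximum-likelihood counts
# over one facial feature and one physiological feature, with a naive
# independence assumption. Toy data only.
from collections import Counter, defaultdict

# (facial_expression, physiological_level, emotion) training triples.
train = [
    ("smile", "low_arousal", "happy"),
    ("smile", "high_arousal", "happy"),
    ("frown", "high_arousal", "angry"),
    ("frown", "low_arousal", "sad"),
    ("neutral", "high_arousal", "angry"),
    ("neutral", "low_arousal", "sad"),
]

prior = Counter(emo for _, _, emo in train)
cond = defaultdict(Counter)              # cond[emotion][(feature, value)] counts
for face, phys, emo in train:
    cond[emo][("face", face)] += 1
    cond[emo][("phys", phys)] += 1

def classify(face, phys):
    """Return the emotion with the highest (smoothed) likelihood given both cues."""
    best, best_score = None, 0.0
    for emo, n in prior.items():
        score = (n / len(train)) \
            * ((cond[emo][("face", face)] + 1) / (n + 1)) \
            * ((cond[emo][("phys", phys)] + 1) / (n + 1))   # +1 avoids zeroing out
        if score > best_score:
            best, best_score = emo, score
    return best

# A neutral face is ambiguous; the physiological cue resolves it to "angry".
print(classify("neutral", "high_arousal"))
```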

The Accuracy of Recognizing Emotion From Korean Standard Facial Expression (한국인 표준 얼굴 표정 이미지의 감성 인식 정확률)

  • Lee, Woo-Ri;Whang, Min-Cheol
    • The Journal of the Korea Contents Association
    • /
    • v.14 no.9
    • /
    • pp.476-483
    • /
    • 2014
  • The purpose of this study was to produce suitable images of Korean emotional expressions. KSFI (Korean Standard Facial Image) AUs were produced from standard Korean facial appearance and FACS (Facial Action Coding System) AUs. To establish the objectivity of the KSFI, a survey examined the emotion recognition rate and the contribution of each facial element to emotion recognition for the six basic emotional expression images (sadness, happiness, disgust, fear, anger, and surprise). The images of happiness, surprise, sadness, and anger showed higher accuracy, and the emotion recognition rate was determined mainly by the eyes and the mouth. Based on these results, KSFI content that can be composed from AU images is proposed; in the future, the KSFI should be helpful content for improving emotion recognition rates.
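
A hypothetical illustration of tallying per-emotion recognition rates from survey responses of this kind is shown below; the counts are invented and do not come from the study.

```python
# Invented survey tallies: (correct identifications, total responses) per emotion.
responses = {
    "happiness": (95, 100),
    "surprise":  (88, 100),
    "sadness":   (84, 100),
    "anger":     (80, 100),
    "disgust":   (55, 100),
    "fear":      (48, 100),
}

for emotion, (correct, total) in responses.items():
    print(f"{emotion:<10} recognition rate: {100.0 * correct / total:5.1f}%")
```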

Design and implement of the Educational Humanoid Robot D2 for Emotional Interaction System (감성 상호작용을 갖는 교육용 휴머노이드 로봇 D2 개발)

  • Kim, Do-Woo;Chung, Ki-Chull;Park, Won-Sung
    • Proceedings of the KIEE Conference
    • /
    • 2007.07a
    • /
    • pp.1777-1778
    • /
    • 2007
  • In this paper, we design and implement a humanoid robot for educational purposes that can collaborate and communicate with humans. We present an affective human-robot communication system for a humanoid robot, D2, which we designed to communicate with a human through dialogue. D2 communicates with humans by understanding and expressing emotion using facial expressions, voice, gestures, and posture. Interaction between a human and the robot is made possible through our affective communication framework, which enables the robot to estimate the emotional state of the user and to respond appropriately; as a result, the robot can engage in a natural dialogue with a human. To support interaction with humans through voice, gestures, and posture, the educational humanoid robot consists of an upper body, two arms, a wheeled mobile platform, and control hardware, including vision and speech capabilities and various control boards such as motion control boards and a signal processing board handling several types of sensors. Using the educational humanoid robot D2, we present successful demonstrations consisting of manipulation tasks with the two arms, object tracking using the vision system, and communication with humans through the emotional interface, synthesized speech, and the recognition of speech commands.
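
The abstract describes a framework in which the robot estimates the user's emotional state and responds appropriately. The sketch below is a minimal, hypothetical mapping from an estimated emotion to a spoken response and a gesture; the labels, utterances, and gesture names are invented and are not part of D2's actual framework.

```python
# Toy affective-response lookup: estimated user emotion -> speech + gesture.
from dataclasses import dataclass

@dataclass
class Response:
    speech: str
    gesture: str

RESPONSES = {
    "happy":   Response("That sounds wonderful!", "raise_both_arms"),
    "sad":     Response("I'm sorry to hear that.", "tilt_head"),
    "angry":   Response("Let's take a moment together.", "open_palms"),
    "neutral": Response("Tell me more.", "nod"),
}

def respond(estimated_emotion: str) -> Response:
    # Fall back to a neutral response when the estimate is unrecognized.
    return RESPONSES.get(estimated_emotion, RESPONSES["neutral"])

print(respond("sad"))
```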

A Comic Facial Expression Using Cheeks and Jaws Movements for Intelligent Avatar Communications (지적 아바타 통신에서 볼과 턱 움직임을 사용한 코믹한 얼굴 표정)

  • ;;Yoshinao Aoki
    • Proceedings of the IEEK Conference
    • /
    • 2001.06c
    • /
    • pp.121-124
    • /
    • 2001
  • In this paper, a method of generating facial gesture CG animation on different avatar models is provided. First, to edit emotional expressions efficiently, the comic expression is regenerated on different polygonal mesh models, where the movements of the cheeks and jaws are reproduced using numerical methods. Experimental results show that the method could be used for intelligent avatar communications between Korea and Japan.
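
As a rough illustration of regenerating an expression numerically on a polygonal mesh, the sketch below blends a neutral mesh toward hypothetical "cheek" and "jaw" target shapes using two weights; the meshes, weights, and the blendshape-style approach are assumptions for illustration, not the paper's method.

```python
# Blendshape-style interpolation: add weighted displacements of target shapes
# to a neutral mesh. All vertex data here are randomly generated placeholders.
import numpy as np

rng = np.random.default_rng(1)
neutral = rng.normal(size=(500, 3))                  # base vertex positions (x, y, z)
cheek_target = neutral + rng.normal(scale=0.02, size=neutral.shape)
jaw_target = neutral + rng.normal(scale=0.05, size=neutral.shape)

def blend(base, targets, weights):
    """Offset the base mesh by the weighted displacement of each target shape."""
    out = base.copy()
    for target, w in zip(targets, weights):
        out += w * (target - base)
    return out

# 70% cheek movement plus 30% jaw opening -> one frame of a comic expression.
frame = blend(neutral, [cheek_target, jaw_target], [0.7, 0.3])
print(frame.shape)
```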

A study on the vocal characteristics of spoken emotional expressions (구어체 정서표현에 있어서의 음성 특성 연구)

  • 이수정;김명재;김정수
    • Science of Emotion and Sensibility
    • /
    • v.2 no.2
    • /
    • pp.53-66
    • /
    • 1999
  • This study attempted to identify the vocal parameters of conversational emotional expressions in order to collect basic data for speech synthesis. To this end, the most frequently used conversational emotional expressions were collected, and the vocal features to which speakers pay the most attention when uttering these expressions were explored. To build a valid database of spoken emotional expressions, data were collected and analyzed separately for speakers in their twenties and thirties. The results showed that sound intensity, intensity change, and timbre serve as the important criteria underlying the utterance characteristics of the various emotional expressions. The maps of the vocal expressions of the two age groups produced by multidimensional analysis showed that the individual emotions have quite consistent characteristics in the latent dimensions of the voice.
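
As a small, hypothetical illustration of measuring the intensity-related parameters the study highlights (intensity and intensity change), the snippet below computes frame-wise RMS energy and its variation over a synthetic signal; the signal, sample rate, and frame sizes are assumptions, not the study's procedure.

```python
# Frame-wise intensity (RMS energy) and its variation for a synthetic "utterance".
import numpy as np

sr = 16000                                    # assumed sample rate (Hz)
t = np.linspace(0, 1.0, sr, endpoint=False)
signal = np.sin(2 * np.pi * 220 * t) * (0.5 + 0.5 * np.sin(2 * np.pi * 3 * t))

frame, hop = 400, 160                         # 25 ms frames, 10 ms hop
rms = np.array([
    np.sqrt(np.mean(signal[i:i + frame] ** 2))
    for i in range(0, len(signal) - frame, hop)
])

print(f"mean intensity: {rms.mean():.3f}, intensity variation (std): {rms.std():.3f}")
```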

A study on the vocal characteristics of spoken emotional expressions (구어체 정서표현에 있어서의 음성 특성 연구)

  • 이수정
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 1999.11a
    • /
    • pp.277-291
    • /
    • 1999
  • This study attempted to identify the vocal parameters of conversational emotional expressions in order to collect basic data for speech synthesis. To this end, the most frequently used conversational emotional expressions were collected, and the vocal features to which speakers pay the most attention when uttering these expressions were explored. To build a valid database of spoken emotional expressions, data were collected and analyzed separately for speakers in their twenties and thirties. The results showed that sound intensity, intensity change, and timbre serve as the important criteria underlying the utterance characteristics of the various emotional expressions. The maps of the vocal expressions of the two age groups produced by multidimensional analysis showed that the individual emotions have quite consistent characteristics in the latent dimensions of the voice.

Socio-Emotional Cues Can Help 10-Month-Olds Understand the Relationship Between Others' Words and Goals (타인의 단어와 행동 목표의 관계성에 대한 10개월 영아의 이해에 있어서 사회정서 단서의 영향)

  • Lee, Youn Mi Cathy;Kim, Min Ju;Song, Hyun-joo
    • Korean Journal of Child Studies
    • /
    • v.38 no.1
    • /
    • pp.205-215
    • /
    • 2017
  • Objective: The current study examined whether providing both an actor's eye gaze and emotional expressions can help 10-month-olds interpret a change in the actor's words as a signal of a change in the actor's goal object. Methods: Sixteen 10-month-olds participated in an experiment using the violation-of-expectation paradigm and were compared to 16 10-month-olds in a control condition. The infants in the experimental condition were familiarized to an event in which an actor looks at one of two novel objects, excitedly utters a sentence, "Wow, here's a modi!", and grasps the object. The procedure in the control condition was identical to that of the experimental condition except that the infants heard the sentence without any emotional excitement and the eye gaze of the agent was hidden by a visor. In the following test trial, the infants in both conditions heard the agent change her word (from modi to papu) and watched her grasp either the same object as before (old-goal event) or the new object (new-goal event). Results: The infants in the experimental condition looked at the old-goal event longer than at the new-goal event, suggesting that they expected the agent to change her goal object when she changed her word. However, the infants in the control condition looked at the two events about equally. Conclusion: When both eye gaze and emotional cues were provided, 10-month-olds were able to exploit the agent's verbal information when reasoning about whether the agent would pursue the same goal object as before.
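
As a hypothetical illustration of the looking-time comparison underlying a violation-of-expectation analysis, the snippet below compares invented looking times for the old-goal and new-goal events with an independent-samples t-test; the numbers and the choice of test are assumptions, not the study's reported analysis.

```python
# Toy looking-time comparison: do infants look longer at the old-goal event?
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
old_goal_looking = rng.normal(loc=18.0, scale=4.0, size=16)   # seconds, invented
new_goal_looking = rng.normal(loc=12.0, scale=4.0, size=16)

t_stat, p_value = stats.ttest_ind(old_goal_looking, new_goal_looking)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```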

A Study on the Message Orientation of Newspaper Advertisements - With a focus on apartment sales ads in Busan (신문광고의 메시지 지향성에 관한 연구 -부산시 아파트 분양광고를 중심으로)

  • Choi, Hyang
    • The Journal of the Convergence on Culture Technology
    • /
    • v.4 no.3
    • /
    • pp.127-132
    • /
    • 2018
  • This study set out to examine message orientation in advertising expression through a content analysis of apartment sales advertisements in Busan. For this purpose, the study analyzed newspaper ads for apartment sales in the Busan area in terms of advertising appeal methods and message types. Advertising appeal methods (rational and emotional appeal) and message orientation (environmental, emotional, investment, and functional orientation) were categorized by brand type (nationwide and local brands). The findings show that most apartment sales ads in Busan used a rational appeal method. While nationwide brands used rational and emotional appeals in similar proportions, local brands relied heavily on rational appeal. Among the types of advertising message orientation, investment and emotional orientation recorded the highest percentages: messages of investment orientation, emphasizing economic profit or premium, were used most, followed by messages of emotional orientation, emphasizing pride and happiness. These findings were interpreted as reflecting differences in advertising expression strategies among brands and are expected to offer useful practical implications for message strategies in apartment sales ads.

Exploring Older Adults' Experienced Barriers and Emotional Changes in Seeking Health Information (건강정보검색에서 노인이 경험하는 어려움과 감정변화)

  • Na, Kyoungsik;Jeong, Yongsun
    • Journal of Korean Library and Information Science Society
    • /
    • v.48 no.1
    • /
    • pp.227-243
    • /
    • 2017
  • This study aims to explore older adults' experiences of cognitive and physical barriers and the emotional changes that arise during their health information seeking. A total of 10 older adults aged 65 or over were interviewed individually. The results show that older adults experience considerable difficulty in terms of both cognitive and physical barriers. The cognitive barriers are: 1) knowing information resources and information search skills, 2) choosing relevant information, and 3) knowing information search tools. The physical barriers involve the eyes, hands, legs, and the whole body when accessing health information. In terms of emotion, the older adults express curiosity and negative emotions at the beginning of the search, show more varied emotional expressions in the middle, and express positive emotions at the end of the search. The results suggest that information professionals should consider the library as a connection that can help older adults reduce these barriers and stabilize their emotional changes.