• Title/Summary/Keyword: Expressed emotion


A Study of the Effects on Premarital Adult Children Aged Thirties Psychological Depression by Parents-Children Differentiation and Expressed Emotion (30대 미혼성인자녀가 지각한 부모-자녀분화, 표현된 정서가 자녀의 심리적 우울에 미치는 영향)

  • 권미애;김태현
    • Journal of Families and Better Life / v.22 no.5 / pp.197-210 / 2004
  • The purpose of this study was to explore the effects of parent-child differentiation, expressed emotional over-involvement, and criticism between middle- or old-aged parents and their adult children, through the relation of the emotional system, on the children's psychological depression. The subjects of this study were premarital adult children over 30 years old. The major findings were as follows. First, mother-child differentiation was perceived as higher than father-child differentiation. Psychological depression, expressed emotion within the family, and criticism all showed average scores below the scale midpoint. Second, among demographic characteristics, there were significant differences by the premarital adult children's sex, education, income, family type, father's education, and parents' marital status. Third, regression analysis showed that psychological depression was higher when parent-child differentiation was lower and when expressed emotional over-involvement and criticism within the family were higher. Based on these findings, the relation of the emotional system is very important; therapeutic interventions and relationship-improvement programs should therefore be considered in individual and family counseling on parent-child relations.
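
    The reported regression can be sketched as below; a minimal illustration in which the file and column names are hypothetical stand-ins, since the study's data and exact model specification are not given here.

    ```python
    # Minimal sketch of the reported regression: depression regressed on
    # parent-child differentiation, expressed over-involvement, and criticism.
    # File and column names are hypothetical; the study's data are not public.
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("survey.csv")  # hypothetical survey export
    X = sm.add_constant(df[["differentiation", "over_involvement", "criticism"]])
    model = sm.OLS(df["depression"], X).fit()
    # The abstract implies a negative coefficient on differentiation and
    # positive coefficients on over-involvement and criticism.
    print(model.summary())
    ```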

Design of Intelligent Emotion Recognition Model

  • Kim, Yi-gon
    • Journal of the Korean Institute of Intelligent Systems / v.11 no.7 / pp.611-614 / 2001
  • Voice is one of the most efficient communication media, and it carries several kinds of information about the speaker, the context, emotion, and so on. Human emotion is expressed in speech, gestures, and physiological phenomena (breathing, pulse rate, etc.). In this paper, an emotion recognition model that uses a neuro-fuzzy method to recognize emotion from the voice signal is presented and simulated.
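
    A neuro-fuzzy recognizer fuzzifies acoustic features with membership functions and maps the memberships to emotion scores. The sketch below is a minimal hand-set illustration of that idea, not the paper's actual model; the feature names, membership centers, and weights are all assumptions.

    ```python
    # Minimal neuro-fuzzy sketch: fuzzify two voice features, then map
    # the fuzzy memberships to emotion scores with a linear layer.
    import numpy as np

    def gauss(x, c, s):
        """Gaussian fuzzy membership of x around center c with width s."""
        return np.exp(-((x - c) ** 2) / (2 * s ** 2))

    def fuzzify(pitch_hz, energy):
        # Fuzzy sets: low/high pitch, low/high energy (illustrative centers).
        return np.array([
            gauss(pitch_hz, 120.0, 40.0),  # low pitch
            gauss(pitch_hz, 280.0, 60.0),  # high pitch
            gauss(energy, 0.2, 0.1),       # low energy
            gauss(energy, 0.8, 0.2),       # high energy
        ])

    # In the paper this mapping would be learned; here weights are hand-set.
    EMOTIONS = ["neutral", "joy", "sadness", "anger"]
    W = np.array([
        [0.6, 0.1, 0.4, 0.1],   # neutral: some of everything, low extremes
        [0.1, 0.8, 0.0, 0.7],   # joy: high pitch, high energy
        [0.7, 0.0, 0.8, 0.0],   # sadness: low pitch, low energy
        [0.0, 0.7, 0.0, 0.9],   # anger: high pitch, highest energy
    ])

    scores = W @ fuzzify(pitch_hz=300.0, energy=0.9)
    print(EMOTIONS[int(np.argmax(scores))])  # -> "anger" for this input
    ```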


Emotion Expressiveness and Knowledge in Preschool-Age Children: Age-Related Changes

  • Shin, Nana;Krzysik, Lisa;Vaughn, Brian E.
    • Child Studies in Asia-Pacific Contexts / v.4 no.1 / pp.1-12 / 2014
  • Emotion is a central feature of social interactions. In this study, we examined age-related changes in emotion expressiveness and emotion knowledge and how young children's emotion expressiveness and knowledge were related. A total of 300 children attending a daycare center contributed data for the study. Observation and interview data relevant to measures of emotion expressiveness and knowledge were collected and analyzed. Both emotion knowledge and expressed positive affect increased with age. Older preschool children expressed positive affect more frequently than did younger preschoolers. Older preschool children also labeled and recognized emotions, and provided plausible causes for them, more accurately than did younger preschool children. In addition, we tested whether children's errors on the free-labeling component conformed to the structural model previously suggested by Bullock and Russell (1986) and found that preschool children were using systematic strategies for labeling emotion states. Relations between emotion expressiveness and emotion knowledge generally were not significant, suggesting that emotional competence is only gradually constructed by the child over the preschool years.

Emotion Recognition and Expression System of Robot Based on 2D Facial Image (2D 얼굴 영상을 이용한 로봇의 감정인식 및 표현시스템)

  • Lee, Dong-Hoon;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems / v.13 no.4 / pp.371-376 / 2007
  • This paper presents an emotion recognition and expression system for an intelligent robot such as a home or service robot. The robot recognizes emotion from a facial image, using the motion and position of many facial features. A tracking algorithm recognizes a moving user from the mobile robot, and a facial-region detection algorithm removes the skin color of the hands and the background outside the facial region from the captured user image. After normalization operations, which enlarge or reduce the image according to the distance of the detected facial region and rotate it according to the angle of the face, the mobile robot obtains a facial image of fixed size. A multi-feature selection algorithm is implemented to enable the robot to recognize the user's emotion. A multilayer perceptron, a form of artificial neural network (ANN), is used for pattern recognition, with the back-propagation (BP) algorithm for learning. The emotion recognized by the robot is expressed on a graphic LCD: two coordinates change according to the emotion output by the ANN, and the parameters of the facial elements (eyes, eyebrows, mouth) change with those two coordinates. Through this system, the complex emotions of a human are expressed as an avatar on the LCD.
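
    The recognition stage (a multilayer perceptron trained by back-propagation) can be sketched as below; the facial feature vectors and labels are random stand-ins, since the abstract describes only the architecture, not the data.

    ```python
    # Sketch of the recognition stage: an MLP trained with back-propagation
    # on facial-feature vectors. The upstream steps (tracking, face detection,
    # normalization) are assumed to have produced fixed-size vectors already.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 12))      # stand-in facial feature vectors
    y = rng.integers(0, 4, size=200)    # stand-in labels: 4 emotion classes

    # solver="sgd" gives plain gradient-descent back-propagation
    clf = MLPClassifier(hidden_layer_sizes=(16,), solver="sgd",
                        learning_rate_init=0.01, max_iter=2000)
    clf.fit(X, y)
    print(clf.predict(X[:1]))  # emotion index driving the LCD avatar parameters
    ```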

Dynamic Emotion Classification through Facial Recognition (얼굴 인식을 통한 동적 감정 분류)

  • Han, Wuri;Lee, Yong-Hwan;Park, Jeho;Kim, Youngseop
    • Journal of the Semiconductor & Display Technology / v.12 no.3 / pp.53-57 / 2013
  • Human emotions are expressed in various ways: through language, facial expressions, and gestures. In particular, the facial expression carries much information about human emotion. These vague human emotions appear not as a single emotion but as a combination of various emotions. This paper proposes an emotion classification algorithm using an Active Appearance Model (AAM) and a fuzzy k-Nearest Neighbor (k-NN) classifier, which labels facial expressions in a way that reflects such vague human emotions. Applying the Mahalanobis distance to the class centers, the algorithm determines the degree of membership between each class and the center class; the membership degree in turn gives the intensity of the emotion. Our emotion recognition system can recognize complex emotions using the fuzzy k-NN classifier.
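
    The membership computation the abstract describes can be sketched as below; a minimal illustration assuming stand-in features, class centers, and a shared covariance, none of which come from the paper.

    ```python
    # Sketch: Mahalanobis distance from a feature vector to each class
    # center, converted into fuzzy memberships (emotion intensities).
    # The AAM feature extraction step is assumed; data are stand-ins.
    import numpy as np

    def mahalanobis(x, center, cov_inv):
        d = x - center
        return float(np.sqrt(d @ cov_inv @ d))

    def memberships(x, centers, cov_inv, m=2.0):
        # Fuzzy k-NN style weighting: closer centers get higher membership.
        dists = np.array([mahalanobis(x, c, cov_inv) for c in centers])
        w = 1.0 / (dists ** (2.0 / (m - 1.0)) + 1e-9)
        return w / w.sum()  # memberships sum to 1 across emotion classes

    centers = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])  # e.g. happy/sad/angry
    sample = np.random.default_rng(1).normal(size=(50, 2))
    cov_inv = np.linalg.inv(np.cov(sample.T))  # shared covariance estimate
    u = memberships(np.array([0.4, 0.1]), centers, cov_inv)
    print(u)  # mostly the first class, with smaller components on the others
    ```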

The Analysis of Emotion Adjective for LED Light Colors by using Kobayashi scale and I.R.I scale (Kobayashi 스케일과 I.R.I 스케일을 사용한 LED 광색의 형용사 이미지 분석)

  • Baek, Chang-Hwan;Park, Seung-Ok;Kim, Hong-Suk
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers / v.25 no.10 / pp.1-13 / 2011
  • The aim of this study is to analyze the emotion adjectives for light-emitting diode (LED) light colors using two adjective image scales, from Kobayashi and I.R.I. A set of psychophysical experiments using category judgment was conducted in an LED light-color simulation system in order to evaluate the emotion-scale coordinates of the test light colors on both adjective image scales. In total, 49 test light colors from a combination of 6 color series were assessed by 15 human observers. As a result, the Kobayashi adjective image scale clearly distinguished the emotion adjectives 'Dynamic', 'Casual', 'Chic', 'Cool-casual', 'Modern', and 'Natural' across different hues. In contrast, the I.R.I adjective image scale expressed only two adjectives, 'dynamic' and 'luxurious', across all hues.
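
    Reducing category judgments to image-scale coordinates can be sketched as below; the axis names and rating range are assumptions for illustration, not the study's protocol.

    ```python
    # Sketch: each observer rates one LED light color on the two axes of
    # the Kobayashi image scale (warm-cool, soft-hard), and the ratings
    # are averaged into a single scale coordinate for that color.
    import numpy as np

    # ratings[observer] = (warm_cool, soft_hard) on a -3..+3 category scale
    ratings = np.array([[2, 1], [3, 1], [2, 2], [1, 0], [2, 1]], dtype=float)
    coordinate = ratings.mean(axis=0)
    print(coordinate)  # mean (warm-cool, soft-hard) position for this color
    ```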

Impact Analysis of nonverbal multimodals for recognition of emotion expressed virtual humans (가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석)

  • Kim, Jin Ok
    • Journal of Internet Computing and Services / v.13 no.5 / pp.9-19 / 2012
  • A virtual human used for HCI in digital content expresses various emotions across modalities such as facial expression and body posture. However, few studies have considered combinations of such nonverbal modalities in emotion perception. To implement an emotional virtual human, computational engine models must consider how a combination of nonverbal modalities such as facial expression and body posture will be perceived by users. This paper analyzes the impact of nonverbal multimodality in the design of an emotion-expressing virtual human. First, the relative impacts of the different modalities are analyzed by exploring emotion recognition across modalities for the virtual human. An experiment then evaluates the contribution of congruent facial and postural expressions to the recognition of basic emotion categories, as well as of the valence and activation dimensions. Measurements are also carried out on the impact of incongruent multimodal expressions on the recognition of superposed emotions, which are known to be frequent in everyday life. Experimental results show that congruence of the virtual human's facial and postural expressions facilitates the perception of emotion categories; categorical recognition is influenced mainly by the facial expression modality, while the postural modality is preferred for judging the level of the activation dimension. These results will be used in the implementation of an animation engine system and behavior synchronization for the emotion-expressing virtual human.
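
    The congruence analysis can be sketched as below; the trial tuples and the face-dominance readout are illustrative assumptions, not the paper's data.

    ```python
    # Sketch: compare recognition of congruent face/posture trials with
    # which modality participants follow on incongruent trials.
    import numpy as np

    # each trial: (face_emotion, posture_emotion, participant_response)
    trials = [("joy", "joy", "joy"), ("joy", "anger", "anger"),
              ("anger", "anger", "anger"), ("sadness", "joy", "joy")]

    congruent = [t for t in trials if t[0] == t[1]]
    incongruent = [t for t in trials if t[0] != t[1]]
    acc_congruent = np.mean([t[2] == t[0] for t in congruent])
    # on incongruent trials, score how often the response followed the face
    follow_face = np.mean([t[2] == t[0] for t in incongruent])
    print(acc_congruent, follow_face)  # face dominance would give follow_face > 0.5
    ```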

Happy Applicants Achieve More: Expressed Positive Emotions Captured Using an AI Interview Predict Performances

  • Shin, Ji-eun;Lee, Hyeonju
    • Science of Emotion and Sensibility / v.24 no.2 / pp.75-80 / 2021
  • Do happy applicants achieve more? Although it is well established that happiness predicts desirable work-related outcomes, previous findings were primarily obtained in social settings. In this study, we extended the scope of the "happiness premium" effect to the artificial intelligence (AI) context. Specifically, we examined whether an applicant's happiness signal captured by an AI system effectively predicts his or her objective performance. Data from 3,609 job applicants showed that verbally expressed happiness (frequency of positive words) during an AI interview predicts cognitive task scores, and this tendency was more pronounced among women than men. However, facially expressed happiness (frequency of smiling) recorded by the AI did not predict performance. Thus, when AI is involved in a hiring process, verbal rather than facial cues of happiness provide a more valid marker of applicants' hiring chances.
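
    The verbal measure (frequency of positive words) can be sketched as below; the tiny word list is an illustrative stand-in for whatever validated dictionary the study actually used.

    ```python
    # Sketch: rate of positive words in an interview transcript.
    import re

    POSITIVE = {"happy", "glad", "enjoy", "love", "excited", "great"}

    def positive_word_rate(transcript: str) -> float:
        tokens = re.findall(r"[a-z']+", transcript.lower())
        return sum(t in POSITIVE for t in tokens) / max(len(tokens), 1)

    print(positive_word_rate("I love solving problems and I'm excited to learn."))
    ```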

Kinetic Analysis of Gam-ki in the Korean Traditional Dance during Expressing Different Emotions (한국무용 감기 동작 시 표현하고자 하는 감정에 따른 운동역학적 차이)

  • Cho, Nam-Gyu;Oh, Seong-Geun
    • Korean Journal of Applied Biomechanics / v.25 no.2 / pp.207-218 / 2015
  • Objective: The purpose of this study was to investigate the characteristics of Gam-ki (double-arm winding), one of the basic movements of Korean traditional dance, depending on the emotion being expressed. Method: We selected three Korean traditional dancers who belong to the National Dance Company of Korea. They were asked to express four different emotions (anger, joy, sadness, and neutral) while performing Gam-ki. We analyzed elapsed time and time ratio, size of movement, ground reaction forces, and ground impulses. Results: During Gam-ki, the elapsed time for each phase, as well as for one cycle, was longest when "sadness" was expressed, followed by "neutral," then "anger" and "joy." Except for ankle in/eversion, the ranges of motion (ROMs) of the lower-limb joints do not appear to be emotion-characteristic factors. The ROMs of the upper-limb joints were largest when "anger" was expressed. Neck rotation was associated with expressing the negative emotions ("anger" and "sadness"). The medial-lateral GRF followed the order "anger" > "joy" > "neutral" > "sadness," and can therefore be regarded as a factor indicating the activation level of the expressed emotion.
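
    Ground impulse is the time integral of the ground reaction force; a minimal sketch with a synthetic force-plate signal (the sampling rate and waveform are stand-ins) is below.

    ```python
    # Sketch of the ground-impulse measure: impulse = integral of GRF over
    # time, approximated with the trapezoidal rule on a sampled signal.
    import numpy as np

    fs = 1000.0                              # force-plate sampling rate (Hz)
    t = np.arange(0, 1.0, 1 / fs)
    grf_ml = 30.0 * np.sin(2 * np.pi * t)    # synthetic medial-lateral GRF (N)

    f = np.abs(grf_ml)
    impulse = float(np.sum((f[:-1] + f[1:]) / 2) / fs)  # trapezoidal rule, N*s
    print(impulse)  # larger values would accompany higher-activation emotions
    ```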

The effects of the usability of products on user's emotions - with emphasis on suggestion of methods for measuring user's emotions expressed while using a product -

  • Jeong, Sang-Hoon
    • Archives of design research / v.20 no.2 s.70 / pp.5-16 / 2007
  • The main objective of our research is to analyze users' emotional changes while using a product, in order to reveal the influence of usability on human emotions. In this study we extracted, through three methods, emotional words that can come up during user interaction with a product and that reveal emotional changes. We ultimately extracted 88 emotional words for measuring users' emotions expressed while using products, and we categorized the 88 words into 6 groups using factor analysis. The 6 categories extracted in this study were found to be users' representative emotions expressed while using products. We expect the emotional words and representative emotions extracted here to serve as the subjective evaluation data required for measuring users' emotional changes while using a product. We also propose effective methods for measuring a user's emotions while using a product, in an environment that is natural and accessible for the field of design, by using an emotion mouse and the Eyegaze. An examinee performs several tasks with the emotion mouse on a mobile-phone simulator shown on a computer monitor connected to the Eyegaze. During testing, the emotion mouse senses the user's electrodermal activity (EDA) and photoplethysmogram (PPG) and transmits the data to the computer, the Eyegaze tracks changes in pupil size, and a video camera records the user's facial expressions. After each test, the user performs a subjective evaluation of his or her emotional changes using the emotional words extracted in the study above. We aim to evaluate the usability satisfaction level of the product and compare it with the actual experimental results. Through continued studies building on this research, we hope to supply a basic framework for the development of interfaces that take the user's emotions into account.
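
    The word-grouping step (factor analysis of word ratings into six groups) can be sketched as below; the participant count and ratings are random stand-ins, and only the 88-word, six-factor structure follows the abstract.

    ```python
    # Sketch: reduce ratings of 88 emotional words to 6 factors, then
    # assign each word to its highest-loading factor (its emotion group).
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    rng = np.random.default_rng(0)
    ratings = rng.normal(size=(120, 88))   # stand-in: 120 raters x 88 words

    fa = FactorAnalysis(n_components=6, random_state=0)
    fa.fit(ratings)
    loadings = fa.components_.T            # 88 words x 6 factors
    groups = np.argmax(np.abs(loadings), axis=1)
    print(np.bincount(groups))             # words per representative-emotion group
    ```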
