• Title/Abstract/Keyword: Expressed Emotion

Search results: 230 items (processing time: 0.019 s)

30대 미혼성인자녀가 지각한 부모-자녀분화, 표현된 정서가 자녀의 심리적 우울에 미치는 영향 (A Study of the Effects on Premarital Adult Children Aged Thirties Psychological Depression by Parents-Children Differentiation and Expressed Emotion)

  • 권미애;김태현
    • 가정과삶의질연구 / Vol.22 No.5 / pp.197-210 / 2004
  • The purpose of this study was to explore the effects of parent-child differentiation, expressed emotion (emotional over-involvement), and criticism between middle- or old-aged parents and their children on the children's psychological depression, viewed through the relations of the family emotional system. The subjects were premarital adult children over 30 years old. The major findings were as follows. First, mother-child differentiation was perceived as higher than father-child differentiation, while psychological depression, expressed emotion within the family, and criticism all scored below the scale midpoint on average. Second, among demographic characteristics, there were significant differences by the premarital adult children's sex, education, income, family type, father's education, and parents' marital status. Third, regression analysis showed that psychological depression was higher when parent-child differentiation was lower and when expressed emotional over-involvement and criticism within the family were higher. These findings indicate that the relations of the emotional system are very important; therapeutic intervention and relationship-improvement programs should therefore be considered in individual and family counseling concerning parent-child relations.
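The regression reported above can be sketched as an ordinary least-squares fit on the normal equations. The predictor names and every number below are invented for illustration; they are not the study's data or measures.

```python
# Minimal OLS sketch for a model of the form:
#   depression ~ b0 + b1*differentiation + b2*expressed_emotion
# All data below are made up for illustration.

def ols(X, y):
    """Fit b minimizing ||Xb - y||^2 via (X^T X) b = X^T y."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
           for i in range(p)]
    Xty = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    A = [row[:] + [v] for row, v in zip(XtX, Xty)]   # augmented matrix
    for col in range(p):                             # forward elimination
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p + 1):
                A[r][c] -= f * A[col][c]
    b = [0.0] * p                                    # back substitution
    for i in reversed(range(p)):
        b[i] = (A[i][p] - sum(A[i][j] * b[j] for j in range(i + 1, p))) / A[i][i]
    return b

# Columns: intercept, parent-child differentiation, expressed emotion/criticism.
X = [[1, 4.0, 1.0], [1, 3.5, 2.0], [1, 2.0, 3.0],
     [1, 1.5, 3.5], [1, 3.0, 1.5], [1, 2.5, 2.5]]
y = [1.6, 2.45, 3.8, 4.35, 2.4, 3.25]  # illustrative depression scores
b0, b1, b2 = ols(X, y)
print([round(v, 3) for v in (b0, b1, b2)])  # -> [3.0, -0.5, 0.6]
```

In this toy fit the negative coefficient on differentiation and the positive coefficient on expressed emotion mirror the direction of the study's finding: lower differentiation and higher expressed emotion go with higher depression.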

Design of Intelligent Emotion Recognition Model

  • Kim, Yi-gon
    • 한국지능시스템학회논문지 / Vol.11 No.7 / pp.611-614 / 2001
  • Voice is one of the most efficient communication media, and it carries several kinds of information about the speaker, the context, emotion, and so on. Human emotion is expressed in speech, gestures, and physiological phenomena (breathing, pulse rate, etc.). In this paper, an emotion recognition model that uses a neuro-fuzzy method to recognize emotion from the voice signal is presented and simulated.
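The fuzzy-inference half of such a neuro-fuzzy recognizer can be sketched as follows. This is not the paper's actual model: the two voice features, the triangular membership thresholds, and the rule base are all assumptions; in a neuro-fuzzy system the membership parameters would be tuned by neural learning rather than fixed by hand.

```python
# Toy fuzzy inference over two (normalized, 0..1) voice features.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def classify(energy, pitch_var):
    # Fuzzify each feature into "low" / "high" grades (thresholds assumed).
    low_e, high_e = tri(energy, -0.5, 0.0, 0.6), tri(energy, 0.4, 1.0, 1.5)
    low_p, high_p = tri(pitch_var, -0.5, 0.0, 0.6), tri(pitch_var, 0.4, 1.0, 1.5)
    # Rule base: fire each rule with min() and keep the strongest per emotion.
    scores = {
        "angry":   min(high_e, high_p),
        "happy":   min(high_e, low_p),
        "sad":     min(low_e, low_p),
        "neutral": min(low_e, high_p),
    }
    return max(scores, key=scores.get)

print(classify(0.9, 0.9))  # high energy + high pitch variance -> "angry"
```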


Emotion Expressiveness and Knowledge in Preschool-Age Children: Age-Related Changes

  • Shin, Nana;Krzysik, Lisa;Vaughn, Brian E.
    • Child Studies in Asia-Pacific Contexts / Vol.4 No.1 / pp.1-12 / 2014
  • Emotion is a central feature of social interactions. In this study, we examined age-related changes in emotion expressiveness and emotion knowledge and how young children's emotion expressiveness and knowledge were related. A total of 300 children attending a daycare center contributed data for the study. Observation and interview data relevant to measures of emotion expressiveness and knowledge were collected and analyzed. Both emotion knowledge and expressed positive affect increased with age. Older preschool children expressed positive affect more frequently than did younger preschoolers. Older preschool children also labeled, recognized, and provided plausible causes for emotions more accurately than did younger preschool children. In addition, we tested whether children's errors on the free-labeling component conformed to the structural model previously suggested by Bullock and Russell (1986) and found that preschool children were using systematic strategies for labeling emotion states. Relations between emotion expressiveness and emotion knowledge generally were not significant, suggesting that emotional competence is only gradually constructed by the child over the preschool years.

2D 얼굴 영상을 이용한 로봇의 감정인식 및 표현시스템 (Emotion Recognition and Expression System of Robot Based on 2D Facial Image)

  • 이동훈;심귀보
    • 제어로봇시스템학회논문지 / Vol.13 No.4 / pp.371-376 / 2007
  • This paper presents an emotion recognition and expression system for an intelligent robot such as a home or service robot. The robot recognizes emotion from a facial image, using the motion and position of many facial features. A tracking algorithm is applied so that the mobile robot can recognize a moving user, and the skin color of the hands and the background outside the facial region are eliminated by a facial-region detection algorithm applied to the user image. After normalization operations, which enlarge or reduce the image according to the distance to the detected facial region and rotate it according to the angle of the face, the mobile robot obtains a facial image of fixed size. A multi-feature selection algorithm is implemented to enable the robot to recognize the user's emotion. A multilayer perceptron, a form of Artificial Neural Network (ANN), is used as the pattern recognizer, with the Back Propagation (BP) algorithm for learning. The emotion the robot recognizes is expressed on a graphic LCD: two coordinates are changed according to the emotion output by the ANN, and the parameters of the facial elements (eyes, eyebrows, mouth) change with those coordinates. With this system, the robot expresses complex human emotions through an avatar on the LCD.
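The ANN/BP pattern recognizer described above can be sketched generically as a small multilayer perceptron trained with backpropagation. This is not the paper's network: real inputs would be facial-feature vectors and the outputs emotion classes, whereas the toy data below is XOR, and the layer sizes, learning rate, and iteration count are assumptions that may need tuning for convergence.

```python
# A minimal MLP (2 inputs -> 2 hidden -> 1 output) trained with backpropagation.
import math, random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Each hidden unit holds [w_x0, w_x1, bias]; the output unit likewise.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(3)]

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR stand-in

for _ in range(20000):
    x, t = random.choice(data)
    # Forward pass
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in W1]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + W2[2])
    # Backward pass: gradient of squared error, learning rate 0.5
    dy = (y - t) * y * (1 - y)
    for j in range(2):
        dh = dy * W2[j] * h[j] * (1 - h[j])   # use W2[j] before updating it
        W2[j] -= 0.5 * dy * h[j]
        W1[j][0] -= 0.5 * dh * x[0]
        W1[j][1] -= 0.5 * dh * x[1]
        W1[j][2] -= 0.5 * dh
    W2[2] -= 0.5 * dy

def predict(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in W1]
    return round(sigmoid(W2[0] * h[0] + W2[1] * h[1] + W2[2]))

preds = [predict(x) for x, _ in data]
print(preds)
```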

얼굴 인식을 통한 동적 감정 분류 (Dynamic Emotion Classification through Facial Recognition)

  • 한우리;이용환;박제호;김영섭
    • 반도체디스플레이기술학회지 / Vol.12 No.3 / pp.53-57 / 2013
  • Human emotions are expressed in various ways: through language, facial expression, and gestures. The facial expression in particular contains much information about human emotion. These vague human emotions appear not as a single emotion but as a combination of various emotions. This paper proposes an emotion classification algorithm using an Active Appearance Model (AAM) and a Fuzzy k-Nearest Neighbor (k-NN) classifier, which labels facial expressions in a way that reflects this vagueness. Applying the Mahalanobis distance to the class centers, the algorithm determines the degree of membership between each class center and the input, and this membership level expresses the intensity of the emotion. The resulting system can recognize complex emotions using the Fuzzy k-NN classifier.
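The fuzzy k-NN step can be sketched as below, using inverse-distance memberships and a Mahalanobis distance simplified to a diagonal covariance. The 2-D "AAM feature" points, the variances, and the two emotion classes are invented for illustration; they are not the paper's features or data.

```python
# Hedged sketch: fuzzy k-NN memberships with a diagonal-covariance
# Mahalanobis distance.

def mahalanobis_diag(x, y, var):
    """Mahalanobis distance assuming a diagonal covariance (variances var)."""
    return sum((a - b) ** 2 / v for a, b, v in zip(x, y, var)) ** 0.5

def fuzzy_knn(train, var, query, k=3, m=2):
    """Return per-class memberships in [0,1] from the k nearest neighbors."""
    dists = sorted((mahalanobis_diag(query, x, var), label) for x, label in train)
    nearest = dists[:k]
    # Inverse-distance weights; m is the fuzzifier exponent.
    weights = [(1.0 / (d ** (2 / (m - 1)) + 1e-9), label) for d, label in nearest]
    total = sum(w for w, _ in weights)
    member = {}
    for w, label in weights:
        member[label] = member.get(label, 0.0) + w / total
    return member

# Invented 2-D feature points for two emotion classes.
train = [((0.0, 0.0), "neutral"), ((0.2, 0.1), "neutral"),
         ((1.0, 1.0), "happy"), ((0.9, 1.2), "happy")]
var = (1.0, 1.0)  # assumed per-dimension variances

member = fuzzy_knn(train, var, query=(0.8, 0.9))
print(max(member, key=member.get))  # -> "happy"
```

Because the memberships are graded rather than crisp, the value for the winning class can double as an intensity estimate, which is the property the paper exploits for vague, blended emotions.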

Kobayashi 스케일과 I.R.I 스케일을 사용한 LED 광색의 형용사 이미지 분석 (The Analysis of Emotion Adjective for LED Light Colors by using Kobayashi scale and I.R.I scale)

  • 백창환;박승옥;김홍석
    • 조명전기설비학회논문지 / Vol.25 No.10 / pp.1-13 / 2011
  • The aim of this study is to analyze the emotion adjectives for light-emitting diode (LED) light colors using two adjective image scales, from Kobayashi and I.R.I. A set of psychophysical experiments using category judgment was conducted in an LED light-color simulation system in order to evaluate the emotion-scale coordinates of the test light colors on both adjective image scales. In total, 49 test light colors from a combination of 6 color series were assessed by 15 human observers. As a result, the Kobayashi adjective image scale clearly distinguished the emotion adjectives 'Dynamic', 'Casual', 'Chic', 'Cool-casual', 'Modern', and 'Natural' for different hues, whereas the I.R.I adjective image scale expressed only two adjectives, 'dynamic' and 'luxurious', for all hues.

가상 인간의 감정 표현 인식을 위한 비언어적 다중모달 영향 분석 (Impact Analysis of nonverbal multimodals for recognition of emotion expressed virtual humans)

  • 김진옥
    • 인터넷정보학회논문지 / Vol.13 No.5 / pp.9-19 / 2012
  • Virtual humans used as HCI agents in digital content express various emotions through modalities such as facial expression and body posture, but studies on combinations of nonverbal multimodal cues are rare. Because the computational engine behind an emotion-expressing virtual human must take into account how users perceive combinations of nonverbal modalities such as facial expression and body posture, this study analyzes and presents the impact of nonverbal multimodal cues for the design of a virtual human's emotional expression. First, emotion recognition was evaluated for each modality of the virtual human to analyze the relative influence among modalities. Then, the effect of congruent face and posture modalities on the recognition of basic emotions, valence, and arousal was evaluated, and for multimodal combinations with incongruent emotions the recognition of the blended emotions that frequently appear in daily life was observed. The experimental results confirmed that emotion recognition is easier when the virtual human's facial expression and body posture are congruent, that the emotion category is judged from the facial expression, and that the posture modality is preferred for judging the arousal dimension of the emotion. These results can be used to implement behavior synchronization and animation engine systems for emotion-expressing virtual humans.

Happy Applicants Achieve More: Expressed Positive Emotions Captured Using an AI Interview Predict Performances

  • Shin, Ji-eun;Lee, Hyeonju
    • 감성과학 / Vol.24 No.2 / pp.75-80 / 2021
  • Do happy applicants achieve more? Although it is well established that happiness predicts desirable work-related outcomes, previous findings were primarily obtained in social settings. In this study, we extended the scope of the "happiness premium" effect to the artificial intelligence (AI) context. Specifically, we examined whether an applicant's happiness signal captured by an AI system effectively predicts his or her objective performance. Data from 3,609 job applicants showed that verbally expressed happiness (frequency of positive words) during an AI interview predicts cognitive task scores, and this tendency was more pronounced among women than men. However, facially expressed happiness (frequency of smiling) recorded by the AI did not predict performance. Thus, when AI is involved in a hiring process, verbal rather than facial cues of happiness provide a more valid marker of applicants' hiring chances.

한국무용 감기 동작 시 표현하고자 하는 감정에 따른 운동역학적 차이 (Kinetic Analysis of Gam-ki in the Korean Traditional Dance during Expressing Different Emotions)

  • 조남규;오성근
    • 한국운동역학회지 / Vol.25 No.2 / pp.207-218 / 2015
  • Objective: The purpose of this study was to investigate the characteristics of Gam-ki (double-arm winding), one of the basic movements of Korean traditional dance, depending on the emotion being expressed. Method: We selected three Korean traditional dancers who belong to the National Dance Company of Korea. They were asked to express four different emotions (anger, joy, sadness, and neutral) while performing Gam-ki. We analyzed elapsed time and time ratio, size of movement, ground reaction forces (GRF), and ground impulses. Results: During Gam-ki, the elapsed time for each phase, as well as for one full cycle, was longest when "sadness" was expressed, followed by "neutral", "anger", and "joy". Except for ankle in/eversion, the ranges of motion (ROMs) of the lower-limb joints did not appear to be emotion-characteristic factors. The ROMs of the upper-limb joints were largest when "anger" was expressed. Neck rotation was associated with expressing the negative emotions ("anger" and "sadness"). Medial-lateral GRF decreased in the order "anger" > "joy" > "neutral" > "sadness", so it can be regarded as a factor indicating the activity level of the emotion.

The effects of the usability of products on user's emotions - with emphasis on suggestion of methods for measuring user's emotions expressed while using a product -

  • Jeong, Sang-Hoon
    • 디자인학연구 / Vol.20 No.2 / pp.5-16 / 2007
  • The main objective of our research is to analyze users' emotional changes while using a product, in order to reveal the influence of usability on human emotions. In this study we extracted, through three methods, emotional words that can arise during user interaction with a product and reveal emotional changes. We finally obtained 88 emotional words for measuring users' emotions expressed while using products, and categorized the 88 words into 6 groups using factor analysis. These 6 categories were found to be users' representative emotions expressed while using products. The emotional words and representative emotions extracted in this study are expected to serve as subjective evaluation data for measuring users' emotional changes while using a product. We also propose effective methods for measuring users' emotions in an environment that is natural and accessible for the design field, using an emotion mouse and the Eyegaze. A participant performs several tasks with the emotion mouse through a mobile-phone simulator on a computer monitor connected to the Eyegaze. During testing, the emotion mouse senses the user's EDA and PPG and transmits the data to the computer, the Eyegaze tracks changes in pupil size, and a video camera records the user's facial expression. After each test, the user performs a subjective evaluation of his or her own emotional changes using the emotional words extracted above. We aim to evaluate satisfaction with the usability of the product and compare it with the actual experimental results. Through continued studies based on this research, we hope to supply a basic framework for the development of interfaces that take users' emotions into consideration.
