• Title/Summary/Keyword: Emotional Expressions

Characteristics of Interactions between Fan and Celebrities on Twitter (유명인과의 트위터 매개 상호작용 특성 탐색)

  • Hwang, Yoosun
    • The Journal of the Korea Contents Association
    • /
    • v.13 no.8
    • /
    • pp.72-82
    • /
    • 2013
  • The present study explored types of Twitter-mediated communication and Twitter users' emotional responses toward celebrities. Three communication types on Twitter were proposed: para-social interaction, information hub, and fandom. Celebrities were classified as entertainers, politicians, specialists, and bloggers, and communication patterns for each category were analyzed, along with patterns of emotional response as reflected in the use of emoticons and emotional expressions. The results show that the para-social interaction type was frequently adopted in interactions with politicians and specialists, while the fandom style was salient for entertainers. For power bloggers, users tended to adopt the information-hub type of interaction. The use of emoticons and emotional expressions was most frequent in fandom-style communication and in messages to entertainers. Implications are discussed.

The Effects of the Emotion Regulation Strategy to the Disgust Stimulus on Facial Expression and Emotional Experience (혐오자극에 대한 정서조절전략이 얼굴표정 및 정서경험에 미치는 영향)

  • Jang, Sung-Lee;Lee, Jang-Han
    • Korean Journal of Health Psychology
    • /
    • v.15 no.3
    • /
    • pp.483-498
    • /
    • 2010
  • This study examines the effects of emotion regulation strategies on facial expressions and emotional experiences, comparing groups using antecedent-focused and response-focused regulation. Fifty female undergraduate students were instructed to use different emotion regulation strategies while viewing a disgust-inducing film, during which their facial expressions and emotional experiences were measured. Participants showed the highest frequency of disgust-related action units in the expression group (EG), followed by the expressive dissonance group (DG), the cognitive reappraisal group (CG), and the expressive suppression group (SG). The upper region of the face reflected genuine emotion: in this region, the frequency of disgust-related action units was lower in the CG than in the EG or DG. PANAS results indicated the largest decrease in positive emotion in the DG, but an increase in positive emotion in the CG. These findings suggest that, with respect to facial expression and emotional experience, cognitive reappraisal of an event is a more functional emotion regulation strategy than the other strategies.

Variables Related to Gender Differences in Structural Analysis of Children's Emotional Competence (성별에 따른 유아의 정서능력과 관련변인간 구조 분석)

  • Woo, Soo Kyeong;Choi, Kee Young
    • Korean Journal of Child Studies
    • /
    • v.23 no.6
    • /
    • pp.15-32
    • /
    • 2002
  • Child temperament, cognitive ability, social competence, mother's affective child rearing and positive expression, father's positive expression, and teacher's positive expression were investigated in relation to the structure of children's emotional competence (EC). Subjects were 20 teachers and 236 five-year-old children and their parents. Data were analyzed with LISREL (Linear Structural Relations), a statistical program for structural equation modeling. Results showed that boys' social competence and their mothers' affective rearing behavior directly influenced boys' EC, while boys' temperament and cognitive ability and their teachers' positive expressions influenced it indirectly. Girls' temperament and social competence directly influenced girls' EC; their cognitive ability, their mothers' affective child-rearing behavior, and the positive expressions of mothers and fathers influenced it indirectly.

A study about the aspect of translation on 'Hu(怖)' in novel 『Kokoro』 - Focusing on novels translated in Korean and English - (소설 『こころ』에 나타난 감정표현 '포(怖)'에 관한 번역 양상 - 한국어 번역 작품과 영어 번역 작품을 중심으로 -)

  • Yang, Jung-soon
    • Cross-Cultural Studies
    • /
    • v.53
    • /
    • pp.131-161
    • /
    • 2018
  • Emotional expressions reveal the internal state of mind or consciousness. They include vocabulary that describes emotion; sentence constructions that express emotion, such as exclamations and rhetorical questions; interjections; appellation; causative and passive forms; adverbs of attitude; and style of writing. This study focuses on emotion-describing vocabulary and analyzes how expressions of 'Hu(怖)' (fear) in "Kokoro" are translated. The translations were analyzed in three respects: part of speech, handling of subjects, and classification of meanings. The results show that expressions of 'Hu(怖)' were sometimes, but not always, translated with the vocabulary suggested in the dictionary. Japanese words describing the emotion of 'Hu(怖)' were mostly translated into the corresponding parts of speech in Korean; some adverbs required an added verb in translation, and different vocabulary was sometimes added or substituted to intensify the emotion. In English, however, the correspondence of parts of speech differed from Korean. Japanese sentences expressing 'Hu(怖)' with verbs were often translated with passive participial forms of verbs such as 'fear', 'dread', 'worry', and 'terrify', and idioms were translated with attention to the function of the sentence rather than its form. Adverbial expressions of 'Hu(怖)' that did not accompany a verb were likewise often rendered with passive participles and adjectives such as 'dread', 'worry', and 'terrify'. The main agents of emotion appeared in the first and third person in simple sentences.
    When the main agent of emotion was the first person, the fundamental Japanese word order was preserved in the Korean translation, though adverbs of time and degree tended to be added. In English, the first-person agent of emotion was placed in the subject position, although in some cases things or the causes of events were made the subject to convey the degree of 'Hu(怖)' the agent experienced. When the main agent of emotion was the third person, expressions of conjecture and supposition, or of a visual or auditory basis, were added in translation. In simple sentences without an explicit agent of emotion, the subject could be omitted in Korean, even though it is an essential component, because it is recoverable from context; in English these omitted subjects were recovered and translated. Such subjects were not necessarily the human agents of emotion; they could be things or causes of events that specified the emotional expression.

Accurate Visual Working Memory under a Positive Emotional Expression in Face (얼굴표정의 긍정적 정서에 의한 시각작업기억 향상 효과)

  • Han, Ji-Eun;Hyun, Joo-Seok
    • Science of Emotion and Sensibility
    • /
    • v.14 no.4
    • /
    • pp.605-616
    • /
    • 2011
  • The present study examined memory accuracy for faces with positive, negative, and neutral emotional expressions to test whether emotional content affects visual working memory (VWM) performance. Participants remembered a set of face pictures whose expressions were randomly drawn from pleasant, unpleasant, and neutral emotional categories, and reported the presence or absence of an emotion change by comparing the remembered set against a set of test faces displayed after a short delay. Change-detection accuracy for the pleasant, unpleasant, and neutral face conditions was compared under two memory exposure durations, 500 ms and 1000 ms. At 500 ms, accuracy in the pleasant condition was higher than in both the unpleasant and neutral conditions; however, the difference disappeared when the duration was extended to 1000 ms. The results indicate that a positive facial expression can improve VWM accuracy relative to negative or neutral expressions, especially when there is not enough time to form durable VWM representations.

An emotional speech synthesis markup language processor for multi-speaker and emotional text-to-speech applications (다음색 감정 음성합성 응용을 위한 감정 SSML 처리기)

  • Ryu, Se-Hui;Cho, Hee;Lee, Ju-Hyun;Hong, Ki-Hyung
    • The Journal of the Acoustical Society of Korea
    • /
    • v.40 no.5
    • /
    • pp.523-529
    • /
    • 2021
  • In this paper, we designed and developed an Emotional Speech Synthesis Markup Language (SSML) processor. Multi-speaker emotional speech synthesis technology that can express multiple voice colors and emotions has been developed, and we designed Emotional SSML by extending SSML to cover multiple voice colors and emotional expressions. The Emotional SSML processor has a graphical user interface and consists of four components: first, a multi-speaker emotional text editor that easily marks specific voice colors and emotions at desired positions; second, an Emotional SSML document generator that automatically creates an Emotional SSML document from the editor's output; third, an Emotional SSML parser that parses the document; and last, a sequencer that controls a multi-speaker emotional Text-to-Speech (TTS) engine based on the parser's output. Because it is based on SSML, a programming-language- and platform-independent open standard, the Emotional SSML processor integrates easily with various speech synthesis engines and facilitates the development of multi-speaker emotional text-to-speech applications.
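  • The abstract does not show the Emotional SSML schema itself. As a minimal sketch, assuming a hypothetical `emotion` element carrying `speaker` and `type` attributes layered on standard SSML (the element name and attributes are illustrative assumptions, not the paper's actual markup), the parser-to-sequencer step might look like:

```python
import xml.etree.ElementTree as ET

# Hypothetical Emotional SSML document: <emotion> and its speaker/type
# attributes are illustrative assumptions, not the paper's schema.
EMO_SSML = """<speak version="1.0" xmlns="http://www.w3.org/2001/10/synthesis">
  <emotion speaker="narrator" type="neutral">Once upon a time,</emotion>
  <emotion speaker="child" type="happy">let's go outside!</emotion>
</speak>"""

def parse_emotional_ssml(doc: str):
    """Extract (speaker, emotion, text) segments for a TTS sequencer."""
    ns = "{http://www.w3.org/2001/10/synthesis}"
    root = ET.fromstring(doc)
    segments = []
    for elem in root.iter(ns + "emotion"):
        segments.append((elem.get("speaker"), elem.get("type"),
                         elem.text.strip()))
    return segments

segments = parse_emotional_ssml(EMO_SSML)
```

  A sequencer would then dispatch each `(speaker, emotion, text)` triple to the matching voice of the multi-speaker emotional TTS engine.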

Brain Activation to Facial Expressions Among Alcoholics (알코올 중독자의 얼굴 표정 인식과 관련된 뇌 활성화 특성)

  • Park, Mi-Sook;Lee, Bae Hwan;Sohn, Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.20 no.4
    • /
    • pp.1-14
    • /
    • 2017
  • The purpose of this study was to investigate the neural substrates of facial expression recognition among alcoholics using functional magnetic resonance imaging (fMRI). Abstinent inpatient alcoholics (n = 18, male) and demographically similar social drinkers (n = 16, male) participated in the study. Participants viewed pictures from the Japanese Female Facial Expression Database (JAFFE) and rated the intensity of the facial expressions. While recognizing emotional facial expressions, the alcoholics showed reduced activation in limbic areas, including the amygdala and hippocampus, compared to the nonalcoholic controls. On the other hand, the alcoholics showed greater activation than the controls in the left lingual (BA 19)/fusiform gyrus, the left middle frontal gyrus (BA 8/9/46), and the right superior parietal lobule (BA 7) during the viewing of emotional faces. In sum, specific brain regions associated with the recognition of facial expressions among alcoholics were identified. These findings could inform the development of interventions for alcoholism.

Mood Suggestion Framework Using Emotional Relaxation Matching Based on Emotion Meshes

  • Kim, Jong-Hyun
    • Journal of the Korea Society of Computer and Information
    • /
    • v.23 no.8
    • /
    • pp.37-43
    • /
    • 2018
  • In this paper, we propose a framework that automatically suggests emotion using emotion analysis method based on facial expression change. We use Microsoft's Emotion API to calculate and analyze emotion values in facial expressions to recognize emotions that change over time. In this step, we use standard deviations based on peak analysis to measure and classify emotional changes. The difference between the classified emotion and the normal emotion is calculated, and the difference is used to recognize the emotion abnormality. We match user's emotions to relatively relaxed emotions using histograms and emotional meshes. As a result, we provide relaxed emotions to users through images. The proposed framework helps users to recognize emotional changes easily and to train their emotions through emotional relaxation.