• Title/Summary/Keyword: Video Emotion

Search results: 146

Korean Emotional Speech and Facial Expression Database for Emotional Audio-Visual Speech Generation (대화 영상 생성을 위한 한국어 감정음성 및 얼굴 표정 데이터베이스)

  • Baek, Ji-Young; Kim, Sera; Lee, Seok-Pil
    • Journal of Internet Computing and Services, v.23 no.2, pp.71-77, 2022
  • In this paper, a database is collected for extending a speech synthesis model to one that synthesizes speech according to emotion and generates facial expressions. The database is divided into male and female data and consists of emotional speech and facial expressions. Two professional actors of different genders speak sentences in Korean. The sentences are divided into four emotions: happiness, sadness, anger, and neutrality. Each actor performs about 3,300 sentences per emotion. The 26,468 sentences collected by filming do not overlap, and each is accompanied by a facial expression matching the corresponding emotion. Since a high-quality database is important for the performance of future research, the database is assessed for emotional category, intensity, and genuineness. To examine recognition accuracy by modality, the database is divided into audio-video, audio-only, and video-only data.
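As a concrete illustration of how such a corpus might be organized for modality-wise experiments, here is a minimal Python sketch; the directory layout, field names, and file paths are hypothetical, not the authors' actual database schema.

```python
# Hypothetical sketch of indexing an emotional A/V corpus by emotion and
# modality; field names and paths are assumptions, not the paper's schema.
from dataclasses import dataclass
from collections import Counter

EMOTIONS = ("happiness", "sadness", "anger", "neutrality")

@dataclass
class Utterance:
    sentence_id: int
    actor: str          # "male" or "female"
    emotion: str        # one of EMOTIONS
    text: str
    video_path: str
    audio_path: str

def split_by_modality(utterances):
    """Return per-modality views of the corpus for recognition experiments."""
    return {
        "audio-video": [(u.audio_path, u.video_path) for u in utterances],
        "audio": [u.audio_path for u in utterances],
        "video": [u.video_path for u in utterances],
    }

corpus = [
    Utterance(1, "female", "happiness", "예시 문장", "v/0001.mp4", "a/0001.wav"),
    Utterance(2, "male", "sadness", "예시 문장", "v/0002.mp4", "a/0002.wav"),
]
print(Counter(u.emotion for u in corpus))
print({k: len(v) for k, v in split_by_modality(corpus).items()})
```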

Study on Impact of GUI Design Elements of Mobile Phone on Brand Preference With focus on senior citizens (모바일 폰의 GUI 디자인의 구성요소가 브랜드 선호도에 미치는 영향 실버세대를 중심으로)

  • Kim, Young Seok
    • Science of Emotion and Sensibility, v.16 no.4, pp.545-556, 2013
  • This study defines the GUI design elements of mobile phones as color, text, layout, graphic icon, and video, with the aim of exploring the relationship between these elements and brand preference among senior citizens. To accomplish this objective, a model and hypotheses were established and tested through multiple regression analysis. The findings are as follows. First, when statistical significance was examined for each GUI design element, color, text, layout, graphic icon, and video were all significant at the given significance level, indicating that each element affects brand preference. Second, the relative influence of the GUI design elements on brand preference was, in descending order: text, color, video, graphic icon, layout. This indicates that boosting senior citizens' brand preference for mobile phones requires considering 'text' and 'color' before any other element. In addition, the greater the influence of 'text' and 'color', the higher the brand preference.
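The analysis described above is a standard multiple regression of brand preference on the five GUI elements. The sketch below reproduces its shape with synthetic data (the coefficients are chosen only to mirror the reported ordering of influence); it is not the study's actual survey data or model.

```python
# Illustrative multiple regression of brand preference on five GUI design
# elements; the data are synthetic, not the study's survey responses.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
# Hypothetical 5-point ratings of each GUI element.
X = rng.integers(1, 6, size=(n, 5)).astype(float)
cols = ["color", "text", "layout", "graphic_icon", "video"]
# Synthetic preference with 'text' and 'color' weighted most heavily,
# matching the reported ordering of relative influence.
y = (0.5 * X[:, 1] + 0.4 * X[:, 0] + 0.3 * X[:, 4]
     + 0.2 * X[:, 3] + 0.1 * X[:, 2] + rng.normal(0, 0.5, n))

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary(xname=["const"] + cols))
```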

Differential effects of the valenced content and the interaction with pacing on information processing while watching video clips (영상물 시청에 발현된 감성 유인가의 차별적 영향과 편집속도와의 상호작용)

  • Lee, Seung-Jo
    • Science of Emotion and Sensibility, v.12 no.1, pp.33-44, 2009
  • This study investigates the differential impacts of positive and negative content, and their interaction with pacing as a structural feature, on information processing while watching televised video clips with a moderately intense emotional tone. College participants watched six positive and six negative video clips lasting approximately 60 seconds each. Heart rate was used to index attention, and skin conductance was used to measure arousal. After all stimuli were shown, participants completed a free recall questionnaire. The results demonstrate, first, positivity superiority in attention: participants' heart rates were slower (indicating greater attention) during positive content than during negative content. Second, negativity superiority was shown in free recall: participants remembered negative content better than positive content. The results also show an interaction between emotional valence and pacing, with the effects of pacing smaller for negative content than for positive content. It is suggested that future studies further examine the differential and independent effects of positive and negative content on information processing and their potential interaction with formal features.
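For readers unfamiliar with this style of analysis, the following sketch shows the kind of condition-wise comparison the abstract describes, using simulated heart-rate means and a paired t-test; the numbers are invented, and the original study may have used a different statistical procedure.

```python
# Sketch of comparing mean heart rate across valence conditions; simulated
# values, not the study's physiological recordings.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_participants = 30
# Simulated mean heart rate (bpm) per participant for each valence condition;
# slower heart rate is taken to index greater attention.
hr_positive = rng.normal(68, 5, n_participants)  # assumed slower (more attention)
hr_negative = rng.normal(71, 5, n_participants)

t, p = stats.ttest_rel(hr_positive, hr_negative)
print(f"positive M={hr_positive.mean():.1f}, negative M={hr_negative.mean():.1f}")
print(f"paired t={t:.2f}, p={p:.4f}")
```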


A study on the user's emotional change when they are using a product by using emotional word logging software (감성어휘 로깅 소프트웨어를 이용한 제품 사용중 사용자의 감성변화 연구)

  • Jeong, Sang-Hoon; Lee, Kun-Pyo
    • Science of Emotion and Sensibility, v.9 no.spc3, pp.167-177, 2006
  • In this study, we developed a tool for measuring the emotions users express while using a product, in an environment that is natural and accessible for the design field. Using the emotional word logging software VideoTAME, we then measured users' emotions during product use. In the testing module of VideoTAME, participants evaluate their emotional changes while playing back and watching video clips of themselves performing tasks in the experiment room. In the analyzing module, researchers replay the results created by participants during the experiment and analyze them using Microsoft Excel. In this research, we asked users to examine their emotional changes while watching recorded video of themselves performing a series of tasks with a cellular phone in the experiment room. In this experiment, there were no large differences in the representative emotions expressed across task characteristics. This is presumably because emotional changes arise from the specific situations encountered while performing a task rather than from the task itself. With more data and rigorous statistical analysis, we expect to be able to clarify what effect a product's usability has on users' emotions.
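A minimal sketch of timestamped emotional-word logging in the spirit of VideoTAME appears below; the word list, class design, and CSV output format are assumptions for illustration, not the actual software's design.

```python
# Hypothetical emotional-word logger: records (timestamp, word) pairs aligned
# to video playback and saves them as CSV for later analysis.
import csv
import time

EMOTION_WORDS = ["satisfied", "confused", "annoyed", "pleased", "frustrated"]

class EmotionLogger:
    def __init__(self, path):
        self.path = path
        self.start = time.time()
        self.rows = []

    def log(self, word):
        if word not in EMOTION_WORDS:
            raise ValueError(f"unknown emotional word: {word}")
        # Store seconds since playback start along with the selected word.
        self.rows.append((round(time.time() - self.start, 2), word))

    def save(self):
        with open(self.path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["t_sec", "word"])
            writer.writerows(self.rows)

logger = EmotionLogger("session01.csv")
logger.log("confused")
logger.log("satisfied")
logger.save()
```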


Effects of Videos about Good and Evil on Moral Judgments Regarding Self and Others (인간의 선악을 보여주는 영상은 자신과 타인에 대한 도덕적 판단에 어떤 영향을 미치는가?)

  • Kim, ShinWoo; Lee, WonSeob; Li, Hyung-Chul O.
    • Science of Emotion and Sensibility, v.22 no.2, pp.29-36, 2019
  • Previous research demonstrated that moral judgment is not an outcome of rational reasoning but a variable determined by diverse factors; the effect of disgust on moral harshness and the audience effect on moralistic punishment are examples that support this view. The variability of moral judgment raises the question of what effects video stimuli might have on moral judgments. Although a few studies (Schnall, Roper, & Fessler, 2010) have shown that watching a prosocial video clip promotes moral behavior, no research has simultaneously tested the effects of both positive and negative video clips on moral judgments (rather than behavior). Hence, this research tested the effects of viewing videos about good and evil on moral judgments regarding the self and others. To this end, participants viewed a video clip depicting either positive or negative human behavior and then made moral judgments about conduct described in a scenario, assuming the person committing the act was either themselves or another person. The results showed significant effects of both video content (positive, negative) and actor (self, other) on moral judgments, but these were qualified by an interaction between the two. In particular, participants who watched the evil deeds of others made harsher judgments of others' moral transgressions. These results demonstrate that video content influences moral judgments and that the effect depends on the actor of the immoral behavior. In the general discussion, we interpret the results in terms of moral disgust, the framing effect, and the fundamental attribution error.
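The design implies a 2 (video content) x 2 (actor) factorial analysis with an interaction term. The sketch below shows one conventional way to run such an analysis in Python on simulated judgment scores; it is not the authors' analysis pipeline.

```python
# Two-way ANOVA on simulated moral-judgment scores for the 2x2 design
# (content: positive/negative x actor: self/other); data are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(2)
rows = []
for content in ("positive", "negative"):
    for actor in ("self", "other"):
        base = 5.0
        if content == "negative" and actor == "other":
            base += 1.5  # harsher judgments of others after watching evil deeds
        for _ in range(25):
            rows.append({"content": content, "actor": actor,
                         "judgment": base + rng.normal(0, 1)})
df = pd.DataFrame(rows)

model = ols("judgment ~ C(content) * C(actor)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```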

Motion based Autonomous Emotion Recognition System: A Preliminary Study on Bodily Map according to Type of Emotional Stimuli (동작 기반 Autonomous Emotion Recognition 시스템: 감정 유도 자극에 따른 신체 맵 형성을 중심으로)

  • Jungeun Bae; Myeongul Jung; Youngwug Cho; Hyungsook Kim; Kwanguk (Kenny) Kim
    • Journal of the Korea Computer Graphics Society, v.29 no.3, pp.33-43, 2023
  • Emotions affect not only physical sensations but also physical movements, and responses to emotions vary depending on the type of emotional stimulus. However, the effects of emotional stimuli on the activation of bodily movements have not been rigorously examined, nor have they been investigated in Autonomous Emotion Recognition (AER) systems. In this study, we compared the emotional responses of 20 participants to three types of emotional stimuli (words, pictures, and videos) and investigated bodily activation and deactivation for an AER system. Our dependent measures included emotional responses collected with computer-based self-report methods and bodily movements recorded with motion capture devices. The results suggest that video stimuli elicited higher levels of emotional movement, and that emotional movement patterns were similar across stimulus types for happiness, sadness, anger, and neutrality. Additionally, the bodily changes observed during video stimuli yielded the highest classification accuracy. These findings have implications for future research on the bodily changes elicited by emotional stimuli.
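An AER system of this kind ultimately reduces to classifying emotion labels from movement-derived feature vectors. The sketch below shows the general shape of such a classifier with random stand-in features; the feature set, classifier choice, and evaluation scheme are assumptions, not the paper's method.

```python
# Sketch of an AER-style classifier over motion-capture features; the feature
# vectors are random stand-ins for real joint-movement statistics.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
EMOTIONS = ["happiness", "sadness", "anger", "neutrality"]

# 200 samples x 30 hypothetical movement features (e.g., per-joint velocity
# and acceleration statistics aggregated over a trial).
X = rng.normal(size=(200, 30))
y = rng.integers(0, len(EMOTIONS), size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")  # ~chance on random data
```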

Emotion-based Gesture Stylization For Animated SMS (모바일 SMS용 캐릭터 애니메이션을 위한 감정 기반 제스처 스타일화)

  • Byun, Hae-Won; Lee, Jung-Suk
    • Journal of Korea Multimedia Society, v.13 no.5, pp.802-816, 2010
  • Creating gestures from new text input is an important problem in computer games and virtual reality. Recently, there has been increasing interest in gesture stylization that imitates the gestures of celebrities, such as announcers. However, no attempt has been made so far to stylize gestures using emotions such as happiness and sadness, and previous research has not focused on real-time algorithms. In this paper, we present a system that automatically generates gesture animation from SMS text and stylizes the gestures by emotion. A key feature of this system is a real-time algorithm for combining gestures with emotion. Because the platform is a mobile phone, the workload is distributed between server and client, and the system guarantees real-time performance of 15 or more frames per second. First, we extract emotion-expressing words and their corresponding gestures from Disney videos and model the gestures statistically. We then apply Laban Movement Analysis to combine gesture and emotion. To evaluate the system, we analyze user survey responses.
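Laban Movement Analysis describes movement along effort dimensions such as weight, time, and space, which can parameterize gesture stylization. The toy mapping below illustrates the idea; the emotions covered, parameter values, and speed formula are illustrative guesses, not those derived in the paper.

```python
# Toy emotion -> Laban-style effort mapping for gesture stylization;
# values are illustrative, not the paper's learned parameters.
from dataclasses import dataclass

@dataclass
class Effort:
    weight: float   # light (0) .. strong (1)
    time: float     # sustained (0) .. sudden (1)
    space: float    # indirect (0) .. direct (1)

EMOTION_EFFORT = {
    "happiness": Effort(weight=0.4, time=0.8, space=0.6),
    "sadness":   Effort(weight=0.2, time=0.2, space=0.3),
}

def stylize(base_speed: float, emotion: str) -> float:
    """Scale a gesture's playback speed by the emotion's 'time' effort."""
    e = EMOTION_EFFORT[emotion]
    return base_speed * (0.5 + e.time)  # sudden emotions play back faster

print(stylize(1.0, "happiness"))  # -> 1.3
print(stylize(1.0, "sadness"))    # -> 0.7
```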

Analysis of Emotions in Broadcast News Using Convolutional Neural Networks (CNN을 활용한 방송 뉴스의 감정 분석)

  • Nam, Youngja
    • Journal of the Korea Institute of Information and Communication Engineering, v.24 no.8, pp.1064-1070, 2020
  • In Korea, video-based news broadcasters are primarily classified into terrestrial broadcasters, general programming cable broadcasters, and YouTube broadcasters. Recently, news broadcasters have become more subjective while targeting specific desired audiences. This violates audiences' normative expectations of journalistic impartiality and neutrality, and the phenomenon may negatively affect audience perceptions of issues. This study examined whether broadcast news reporting conveys emotions and, if so, how news broadcasters differ by emotion type. Emotions were classified into neutrality, happiness, sadness, and anger using a convolutional neural network, a class of deep neural networks. Results showed that news anchors and reporters tend to express emotions during TV broadcasts regardless of broadcaster type. This study provides the first quantitative investigation of emotions in broadcast news, and it is the first deep learning-based approach to emotion analysis of broadcast news.
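As a rough picture of the model class involved, here is a minimal PyTorch CNN for four-way emotion classification; the input size, architecture, and framework are assumptions, not the network the paper used.

```python
# Minimal CNN for four-way emotion classification (neutrality, happiness,
# sadness, anger); architecture and 64x64 face-crop input are assumptions.
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    def __init__(self, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):          # x: (batch, 3, 64, 64) face crops
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = EmotionCNN()
logits = model(torch.randn(8, 3, 64, 64))
print(logits.shape)  # torch.Size([8, 4])
```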

Development of Scent Display and Its Authoring Tool

  • Kim, Jeong Do; Choi, Ji Hoon; Lim, Seung Ju; Park, Sung Dae; Kim, Jung Ju; Ahn, Chung Hyun
    • ETRI Journal, v.37 no.1, pp.88-96, 2015
  • The purpose of this study is to design an authoring tool and a corresponding device for an olfactory display that can augment immersion and realism in broadcasting services. The developed authoring tool allows an olfactory display to be properly synchronized with an existing video service by applying the standardized format of ISO/IEC 23005 (MPEG-V) to the developed scent display device. To propose a proper data format for the olfactory display, we analyzed both the multimodal combination and the cross-modality effects related to olfaction. From the results of this analysis, we derived a set of emotion-related olfactory parameters: synchronization, scent intensity, scent persistence, and hedonic tone. These parameters should be controlled so that the olfactory display harmonizes with the existing media and augments emotion. In addition, we developed a scent display device that can generate many kinds of scents and satisfies the design conditions for these olfactory parameters in broadcasting services.
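A scent-effect track of this kind can be thought of as timestamped records aligned to the video timeline. The sketch below illustrates the idea; the field names are illustrative and do not reproduce the exact MPEG-V schema.

```python
# Hypothetical scent-effect records in the spirit of an MPEG-V sensory-effect
# track; field names are illustrative, not the standard's exact schema.
from dataclasses import dataclass

@dataclass
class ScentEffect:
    start_ms: int        # synchronization point on the video timeline
    duration_ms: int     # how long the scent is emitted (persistence)
    scent_id: str        # which cartridge/scent to release
    intensity: float     # 0.0 .. 1.0

timeline = [
    ScentEffect(start_ms=12_000, duration_ms=3_000, scent_id="pine", intensity=0.6),
    ScentEffect(start_ms=45_000, duration_ms=2_000, scent_id="citrus", intensity=0.8),
]

def effects_at(t_ms, effects):
    """Return the effects active at a given playback time."""
    return [e for e in effects if e.start_ms <= t_ms < e.start_ms + e.duration_ms]

print(effects_at(13_500, timeline))
```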

Affective Computing in Education: Platform Analysis and Academic Emotion Classification

  • So, Hyo-Jeong; Lee, Ji-Hyang; Park, Hyun-Jin
    • International journal of advanced smart convergence, v.8 no.2, pp.8-17, 2019
  • The main purpose of this study is to explore the potential of affective computing (AC) platforms in education through two phases of research: Phase I, platform analysis, and Phase II, classification of academic emotions. In Phase I, the results indicate that existing affective analysis platforms can be largely classified into four types according to their emotion detection methods: (a) facial expression-based platforms, (b) biometric-based platforms, (c) text/verbal tone-based platforms, and (d) mixed-methods platforms. In Phase II, we conducted an in-depth analysis of the emotional experiences a learner encounters in online video-based learning in order to establish the basis for a new classification system of online learners' emotions. Overall, positive emotions appeared more frequently and lasted longer than negative emotions. We categorized positive emotions into three groups based on the facial expression data: (a) confidence; (b) excitement, enjoyment, and pleasure; and (c) aspiration, enthusiasm, and expectation. The same method was used to categorize negative emotions into four groups: (a) fear and anxiety, (b) embarrassment and shame, (c) frustration and alienation, and (d) boredom. Drawing on these results, we propose a new classification scheme for measuring and analyzing how learners in online learning environments experience various positive and negative emotions, with facial expressions as indicators.
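The proposed grouping can be expressed as a simple lookup that collapses fine-grained facial-expression labels into the paper's categories, as sketched below; the label strings and group names are paraphrases for illustration.

```python
# Collapse fine-grained emotion labels into the paper's proposed groups;
# label strings and group names are illustrative paraphrases.
POSITIVE_GROUPS = {
    "confidence": {"confidence"},
    "enjoyment": {"excitement", "enjoyment", "pleasure"},
    "aspiration": {"aspiration", "enthusiasm", "expectation"},
}
NEGATIVE_GROUPS = {
    "anxiety": {"fear", "anxiety"},
    "shame": {"embarrassment", "shame"},
    "frustration": {"frustration", "alienation"},
    "boredom": {"boredom"},
}

def categorize(label: str) -> str:
    for groups in (POSITIVE_GROUPS, NEGATIVE_GROUPS):
        for name, members in groups.items():
            if label in members:
                return name
    return "unknown"

print(categorize("pleasure"))    # -> enjoyment
print(categorize("alienation"))  # -> frustration
```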