• Title/Summary/Keyword: Video Emotion


A Study on ERP and Behavior Responses in Emotion Regulation (정서조절에 관한 Event related potentials 및 행동학적 반응 연구)

  • Seo, Ssang-Hee
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.14 no.10
    • /
    • pp.5003-5011
    • /
    • 2013
  • This paper examined whether neural and behavioral responses to an attention-emotion task reflect emotion regulation capacity. Nineteen healthy right-handed graduate students performed the attention-emotion task three times over three days, before and after viewing negative and positive video clips, while EEG and response times were recorded. There was a positive correlation between the ERP P100 and P300 components: the larger the P100 amplitude at specific electrode positions, the longer the P300 latency at those same positions, and the longer the P300 latency, the longer the delay in response time. There were also individual differences in ERP components and response times during the attention-emotion integration task. Individuals with lower ERP amplitudes and shorter latencies showed faster response times regardless of the type of video clip, a pattern interpreted as weaker emotional control due to premature responses during target identification.

Statistical Model for Emotional Video Shot Characterization (비디오 셧의 감정 관련 특징에 대한 통계적 모델링)

  • Park, Hyun-Jae;Kang, Hang-Bong
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.12C
    • /
    • pp.1200-1208
    • /
    • 2003
  • Affective computing plays an important role in intelligent human-computer interaction (HCI). To detect emotional events, it is desirable to construct a computing model for extracting emotion-related features from video. This paper proposes a statistical model based on the probabilistic distribution of low-level features in video shots. The proposed method extracts low-level features from video shots and then forms a GMM (Gaussian Mixture Model) over them to detect emotional shots. The low-level features are color, camera motion, and the sequence of shot lengths. These features are modeled as a GMM using the EM (Expectation-Maximization) algorithm, and the relations between time and emotion are estimated by MLE (Maximum Likelihood Estimation). Finally, the two statistical models are combined in a Bayesian framework to detect emotional events in video (see the sketch below).
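
As a rough illustration of the pipeline this abstract describes, the following minimal Python sketch fits per-class GMMs with EM (via scikit-learn) over per-shot feature vectors and combines them with a simple Bayesian decision rule. The feature vectors, labels, priors, and dimensions are illustrative assumptions, not the authors' data or implementation.

```python
# Minimal sketch of GMM-based emotional shot detection (illustrative only).
# Assumes each shot is already summarized by a low-level feature vector
# (e.g., color statistics, camera-motion magnitude, shot length).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Hypothetical training data: feature vectors for shots hand-labeled as
# emotional vs. neutral (placeholder random numbers here).
emotional_shots = rng.normal(loc=1.0, scale=0.5, size=(200, 5))
neutral_shots = rng.normal(loc=0.0, scale=0.5, size=(300, 5))

# fit() runs the EM algorithm to estimate each mixture's parameters.
gmm_emotional = GaussianMixture(n_components=3, random_state=0).fit(emotional_shots)
gmm_neutral = GaussianMixture(n_components=3, random_state=0).fit(neutral_shots)

def is_emotional(shot_vec, prior_emotional=0.3):
    """Bayes decision between the two class-conditional GMMs."""
    log_e = gmm_emotional.score_samples(shot_vec[None, :])[0] + np.log(prior_emotional)
    log_n = gmm_neutral.score_samples(shot_vec[None, :])[0] + np.log(1.0 - prior_emotional)
    return log_e > log_n

new_shot = rng.normal(loc=0.9, scale=0.5, size=5)
print(is_emotional(new_shot))
```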

Sensibility Evaluation of Internet Shoppers with the Sportswear Rustling Sounds (스포츠의류 마찰음 정보 제공에 따른 인터넷 구매자의 감성평가)

  • Baek, Gyeong-Rang;Jo, Gil-Su
    • Proceedings of the Korean Society for Emotion and Sensibility Conference
    • /
    • 2009.05a
    • /
    • pp.177-180
    • /
    • 2009
  • This study investigates how consumers perceive different fabrics when given a video clip containing the fabric's rustling sound. We used sportswear products currently on the market and evaluated the emotional responses of internet shoppers through physiological and psychological measures. Three kinds of vapor-permeable, water-repellent fabric were selected, and for each a video clip was generated containing the fabric's rustling sound and images of exercise in sportswear made of that fabric. An experimental website containing the video clips was compared with the original website, which served as a control. Thirty subjects with experience buying clothing online took part, and their physiological and psychological responses to the video clips were recorded. Electroencephalography (EEG) measured the physiological response, while the psychological evaluation covered accuracy of fabric perception, satisfaction, and consumer interest. When the website offered video clips with the fabric's rustling sound, subjects reported that they could obtain more accurate information, and decide whether to purchase more quickly, than when shopping without such information. However, the rustling sounds somewhat annoyed customers, as both the psychological and physiological responses showed. This study is a step toward evaluating consumers' emotional responses to sportswear fabrics, which can promote sales, reduce return rates, and aid the development of new sportswear fabrics and the further evolution of the industry.


Audio and Video Bimodal Emotion Recognition in Social Networks Based on Improved AlexNet Network and Attention Mechanism

  • Liu, Min;Tang, Jun
    • Journal of Information Processing Systems
    • /
    • v.17 no.4
    • /
    • pp.754-771
    • /
    • 2021
  • In continuous dimensional emotion recognition, the parts that best convey emotional expression differ across modalities, and different modalities influence the estimated emotional state to different degrees. This paper therefore studies the fusion of the two most important modalities in emotion recognition, voice and facial expression, and proposes a bimodal emotion recognition method that combines an improved AlexNet network with an attention mechanism. After simple preprocessing of the audio and video signals, audio features are first extracted using prior knowledge. Facial expression features are then extracted by the improved AlexNet network. Finally, a multimodal attention mechanism fuses the facial expression and audio features (a sketch of this fusion step follows below), and an improved loss function mitigates the missing-modality problem, improving the robustness of the model and its recognition performance. In experiments, the concordance correlation coefficients of the proposed model in the arousal and valence dimensions were 0.729 and 0.718, respectively, outperforming several comparison algorithms.
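
The following minimal PyTorch sketch shows one common form of attention-based audio-visual fusion: both modalities are projected into a shared space, softmax-normalized attention weights are computed per modality, and a weighted sum feeds a two-output arousal/valence head. It assumes precomputed per-clip feature vectors and is an illustration of the general idea, not the paper's improved-AlexNet model; all layer sizes and names are hypothetical.

```python
# Minimal sketch of attention-weighted fusion of audio and facial features.
import torch
import torch.nn as nn

class AttentionFusion(nn.Module):
    def __init__(self, audio_dim=128, visual_dim=256, fused_dim=128):
        super().__init__()
        # Project both modalities into a shared space.
        self.audio_proj = nn.Linear(audio_dim, fused_dim)
        self.visual_proj = nn.Linear(visual_dim, fused_dim)
        # One scalar attention score per modality, softmax-normalized.
        self.score = nn.Linear(fused_dim, 1)
        # Regression head for continuous arousal and valence.
        self.head = nn.Linear(fused_dim, 2)

    def forward(self, audio_feat, visual_feat):
        a = torch.tanh(self.audio_proj(audio_feat))    # (B, fused_dim)
        v = torch.tanh(self.visual_proj(visual_feat))  # (B, fused_dim)
        stacked = torch.stack([a, v], dim=1)           # (B, 2, fused_dim)
        weights = torch.softmax(self.score(stacked), dim=1)  # (B, 2, 1)
        fused = (weights * stacked).sum(dim=1)         # attention-weighted sum
        return self.head(fused)                        # (B, 2): arousal, valence

model = AttentionFusion()
audio = torch.randn(4, 128)   # hypothetical audio feature batch
visual = torch.randn(4, 256)  # hypothetical facial-expression feature batch
print(model(audio, visual).shape)  # torch.Size([4, 2])
```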

A Design of real sound recommendation service based-on User's preference, emotion and circumstance (사용자 취향, 감성 및 상황인지 기반 음원 추천 서비스 구현)

  • Jung, Jong-Jin;Lim, Tae-Beom;Lee, Seok-Pil
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2011.04a
    • /
    • pp.689-691
    • /
    • 2011
  • With the rapid development of information and communication technology, multimedia presentation is evolving from passive consumption into services that users can actively and realistically enjoy, tailored to their preferences and tastes. In particular, realistic multimedia services that target human emotion through the properties of human hearing are expected to form a high-value-added premium market. Audio technology influences human emotion and is more sensitive to the surrounding listening environment than video technology; compared with video, it is a research area that appeals to human emotion and emphasizes psychological aspects. From this viewpoint, developing intelligent and realistic audio technology requires a high degree of specialization. This study introduces an "intelligent real-sound presentation technology" that supports high-quality, realistic audio, along with the core technologies that compose it.

Video image analysis algorithms with happy emotion tree (영상 이미지 행복 감성 트리를 이용한 분석 알고리즘)

  • Lee, Yean-Ran;Lim, Young-Hwan
    • Cartoon and Animation Studies
    • /
    • s.33
    • /
    • pp.403-423
    • /
    • 2013
  • This paper evaluates the emotion of video images by assigning weights in a tree structure that divides emotion into happiness versus unhappiness and stress versus tranquility. At the first level of the tree, the brightness contrast of a representative video image separates happy from unhappy; at the second level, a dependent sensitivity separates stress from tranquility. The four recognized emotions are measured as numerical data against brightness contrast, and an OpenCV implementation graphs how the four values (stress, tranquility, happiness, unhappiness) change with contrast. The emotion computation maps input brightness-contrast values onto emotional changes from 'unhappy' to 'happy' and from 'stress' to 'calm', and a localized computing system regularizes the image so that the recognized emotion can be controlled through changes in the brightness-contrast value (a minimal scoring sketch follows below). Such emotion recognition is expected to play a positive role in future industrial applications.
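
The abstract describes mapping brightness and contrast statistics onto two emotion axes. The Python sketch below illustrates only that general idea with OpenCV; the linear mappings, thresholds, and file name are illustrative assumptions and do not reproduce the paper's tree model.

```python
# Minimal sketch: score a video frame on two emotion axes from its
# brightness and contrast statistics (illustrative assumptions only).
import cv2
import numpy as np

def emotion_scores(frame_bgr):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    brightness = float(gray.mean())  # mean intensity, 0..255
    contrast = float(gray.std())     # spread of intensities
    # Illustrative linear mappings onto [-1, 1] axes.
    happy_axis = (brightness - 127.5) / 127.5      # brighter -> "happy"
    calm_axis = 1.0 - min(contrast / 64.0, 2.0)    # lower contrast -> "calm"
    return happy_axis, calm_axis

cap = cv2.VideoCapture("sample.mp4")  # hypothetical input video
ok, frame = cap.read()
if ok:
    print(emotion_scores(frame))
cap.release()
```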

An Analysis on the Change in ERP caused from watching Fear of Crime Video contents (범죄관련 공포 영상 콘텐츠 시청 시 발생하는 뇌파의 ERP 변화 분석)

  • Kim, Yong-Woo;Kang, Hang-Bong
    • Journal of Korea Multimedia Society
    • /
    • v.20 no.6
    • /
    • pp.950-959
    • /
    • 2017
  • Although there are many studies on emotion recognition using EEG, there is little research on recognizing specific emotions in detail. In this paper, we construct test videos and conduct a congruent-incongruent test to analyze fear of crime. Using this test, we compared subjects' event-related potentials before and after watching the video and analyzed the fear the subjects perceived. Our results show that subjects had lower amplitudes for unexpected stimuli at the PZ, POZ, and OZ electrodes over the occipital lobe, which processes visual stimuli, indicating that subjects perceive unexpected objects more slowly.

Exploration of the Emotion for Daily Conversation on Facebook (페이스북 일상담화의 감정 탐색)

  • Hwang, Yoosun
    • The Journal of the Korea Contents Association
    • /
    • v.16 no.2
    • /
    • pp.1-13
    • /
    • 2016
  • The purpose of this study is to explore the emotions expressed on Facebook. Various types of emotions are exchanged on Facebook, and these emotional reactions distinguish it from earlier electronic bulletin boards. According to previous research, computer-mediated communication can deliver visual symbols and non-verbal cues that enrich meaning. Data were collected from 205 Facebook users, comprising a total of 10,308 posts, and a content analysis was conducted to explore the emotions in those posts. The results showed that the most frequent emotion was pleasure. The distribution of emotions differed by content type (text, video, photo, and link): curiosity was prominent in text posts, love was more frequent than other emotions in photo posts, and surprise was salient in video posts. Analysis of shared content likewise revealed that pleasure and hope were more frequent than other emotions.

Emotion-based Video Scene Retrieval using Interactive Genetic Algorithm (대화형 유전자 알고리즘을 이용한 감성기반 비디오 장면 검색)

  • Yoo Hun-Woo;Cho Sung-Bae
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.10 no.6
    • /
    • pp.514-528
    • /
    • 2004
  • An emotion-based video scene retrieval algorithm is proposed in this paper. First, abrupt and gradual shot boundaries are detected in a video clip representing a specific story. Then five video features ('average color histogram', 'average brightness', 'average edge histogram', 'average shot duration', and 'gradual change rate') are extracted from each video, and the mapping between these features and the emotional space the user has in mind is achieved by an interactive genetic algorithm. Once the user has selected videos containing the target emotion from an initial population, the feature vectors of the selected videos are treated as chromosomes and genetic crossover is applied to them. The new chromosomes are then compared with the feature vectors of the database videos using a similarity function, and the most similar videos become the next generation. By iterating this procedure, a population of videos matching the emotion the user has in mind is retrieved (see the sketch below). To validate the method, six emotion categories ('action', 'excitement', 'suspense', 'quietness', 'relaxation', and 'happiness') were used in experiments over 300 commercial videos; retrieval results show about 70% effectiveness on average.
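
The Python sketch below illustrates one generation of this interactive loop: user-selected videos act as parents, uniform crossover produces offspring chromosomes, and each offspring is matched back to the database by a similarity function to form the next population. The 5-D feature vectors, uniform crossover scheme, and Euclidean similarity are illustrative assumptions, not the paper's exact operators.

```python
# Minimal sketch of one generation of interactive genetic video retrieval.
# Assumes each video is a 5-D feature vector (color histogram summary,
# brightness, edge histogram summary, shot duration, gradual-change rate).
import numpy as np

rng = np.random.default_rng(1)
database = rng.random((300, 5))  # hypothetical feature vectors for 300 videos

def crossover(parent_a, parent_b):
    """Uniform crossover: each gene is taken from either parent."""
    mask = rng.random(parent_a.shape) < 0.5
    return np.where(mask, parent_a, parent_b)

def next_generation(selected_idx, population_size=8):
    """User-selected videos act as parents; offspring chromosomes are
    matched back to the database by Euclidean distance."""
    parents = database[selected_idx]
    children = [
        crossover(parents[rng.integers(len(parents))],
                  parents[rng.integers(len(parents))])
        for _ in range(population_size)
    ]
    next_idx = []
    for child in children:
        dists = np.linalg.norm(database - child, axis=1)
        next_idx.append(int(dists.argmin()))  # most similar database video
    return sorted(set(next_idx))

# One iteration: suppose the user marked videos 3 and 42 as matching the
# emotion they have in mind.
print(next_generation([3, 42]))
```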

Effects of Induced Emotional Changes on Bicep Brachii Muscle Activity (유도된 감정변화가 위팔두갈래근의 근활성도에 미치는 영향)

  • Yang, Sangwon;Shin, Yumi;Kim, Sujin
    • Physical Therapy Korea
    • /
    • v.28 no.2
    • /
    • pp.101-107
    • /
    • 2021
  • Background: Studies suggest that induced emotional changes can affect the sensory-motor system involved in muscle activity and movement. Previous studies have focused on the effects of feedback on emotion-related muscle activity, but rarely on the effect of induced emotional change on gross motor function such as muscle activity. Objects: The purpose of this study was to compare biceps activity and emotion before and after viewing a video that induced positive or negative emotion. Methods: The study enrolled 34 healthy males and females who scored in the normal range on the Center for Epidemiologic Studies Depression Scale. Measurements were taken over two weeks, with subjects shown a pleasant video one week and a sad video the other. Biceps brachii activity during maximal voluntary isometric contraction (MVIC) and visual analog mood scale (VAMS) scores were measured before and after each viewing. The significance level was set to α = 0.05. Results: There was no significant difference in biceps brachii muscle activity before and after viewing either video (p > 0.05). However, VAMS scores increased after viewing each video (p < 0.05). Conclusion: Induced emotional changes produced actual mood changes but no difference in muscle activity; in this study, the short viewing time appears to have been insufficient to change muscle activity. Results might differ if a variety of muscles were measured with longer viewing times, and further study is needed.