• Title/Summary/Keyword: Emotion processing

Effects of Emotional Information on Visual Perception and Working Memory in Biological Motion (정서 정보가 생물형운동자극의 시지각 및 작업기억에 미치는 영향)

  • Lee, Hannah;Kim, Jejoong
    • Science of Emotion and Sensibility
    • /
    • v.21 no.3
    • /
    • pp.151-164
    • /
    • 2018
  • The appropriate interpretation of social cues is a crucial ability in everyday life. When socially relevant information is processed, emotional information, beyond the low-level physical features of the stimuli, is known to influence human cognition at various stages, from early perception to later high-level cognition such as working memory (WM). However, it remains unclear how the influence of each type of emotional information on cognitive processes changes across processing stages. Past studies have largely adopted face stimuli to address this type of research question, but we used a unique class of socially relevant motion stimuli, called biological motion (BM), which depicts various human actions and emotions with moving dots, to examine the effects of anger, happiness, and neutral emotion on task performance in perception and working memory. In this study, participants determined whether two BM stimuli, presented sequentially either with a delay between them (WM task) or one immediately after the other (perceptual task), were identical. The perceptual task showed that discrimination accuracy for emotional stimuli (i.e., angry and happy) was lower than that for neutral stimuli, implying that emotional information has a negative impact on early perceptual processes. In contrast, the WM task showed that the drop in accuracy as the interstimulus interval increased was smaller in the emotional BM conditions than in the neutral condition, suggesting that emotional information benefited maintenance. Moreover, anger and happiness had distinct impacts on perceptual and WM performance. Our findings are significant in that they provide evidence for an interaction between emotion type and information-processing stage.
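
The design described in this abstract (emotion condition × task × interstimulus interval, with accuracy as the dependent measure) lends itself to a simple tabulation of discrimination accuracy per cell. The sketch below is not from the paper; the trial records, field names, and ISI values are illustrative assumptions about how such data might be organized.

```python
# Hypothetical trial-level data for a same/different BM discrimination task.
# 'isi' is the delay (s) between the two stimuli; 0.0 approximates the
# perceptual task, longer delays the working-memory task. Values are invented.
from collections import defaultdict

trials = [
    {"emotion": "neutral", "isi": 0.0, "correct": 1},
    {"emotion": "angry",   "isi": 0.0, "correct": 0},
    {"emotion": "happy",   "isi": 1.0, "correct": 1},
    {"emotion": "neutral", "isi": 3.0, "correct": 0},
    # ... more trials per participant would go here
]

# Mean discrimination accuracy per (emotion, ISI) cell.
sums = defaultdict(lambda: [0, 0])          # cell -> [num correct, num trials]
for t in trials:
    cell = (t["emotion"], t["isi"])
    sums[cell][0] += t["correct"]
    sums[cell][1] += 1

for (emotion, isi), (n_correct, n_total) in sorted(sums.items()):
    print(f"{emotion:8s} ISI={isi:.1f}s  accuracy={n_correct / n_total:.2f}")
```

Comparing how accuracy declines with ISI across the emotion conditions, as in the abstract, amounts to contrasting these cell means.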

An Implementation of a Classification and Recommendation Method for a Music Player Using Customized Emotion (맞춤형 감성 뮤직 플레이어를 위한 음악 분류 및 추천 기법 구현)

  • Song, Yu-Jeong;Kang, Su-Yeon;Ihm, Sun-Young;Park, Young-Ho
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.4 no.4
    • /
    • pp.195-200
    • /
    • 2015
  • Recently, most people use Android-based smartphones, and a music player can be found on every smartphone. However, it is hard to find a personalized music player that reflects the user's preferences. In this paper, we propose an emotion-based music player that analyzes and classifies music according to the user's emotion, recommends music, reflects the user's preferences, and visualizes the music by color. With the proposed music player, users can select music easily and use an optimized application.
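
The abstract does not detail the classification or visualization scheme, so the following is only a minimal sketch of how emotion-tagged tracks might be filtered by the user's current emotion, weighted by listening history, and mapped to colors. The track names, emotion labels, and the emotion-to-color mapping are illustrative assumptions, not the paper's actual method.

```python
# Minimal sketch: recommend tracks matching the user's current emotion,
# weighted by play counts, and visualize each emotion as a color.
# All labels, tracks, and the emotion-to-color mapping are placeholders.

EMOTION_COLOR = {"happy": "#FFD700", "calm": "#87CEEB", "sad": "#4B0082"}

library = [
    {"title": "Track A", "emotion": "happy", "plays": 12},
    {"title": "Track B", "emotion": "calm",  "plays": 30},
    {"title": "Track C", "emotion": "happy", "plays": 3},
]

def recommend(current_emotion, library, top_n=2):
    """Return the most-played tracks tagged with the requested emotion."""
    candidates = [t for t in library if t["emotion"] == current_emotion]
    return sorted(candidates, key=lambda t: t["plays"], reverse=True)[:top_n]

for track in recommend("happy", library):
    print(track["title"], "->", EMOTION_COLOR[track["emotion"]])
```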

Development of a Stretchable Wearable Device Using Emotion Information (감성 정보를 이용한 스트레처블 웨어러블 디바이스 개발)

  • Kim, Bonam;Do, Hyun-Ku;Lee, Seong-Min;Lee, Soo-Uk
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2016.05a
    • /
    • pp.515-517
    • /
    • 2016
  • In this paper, we develop a stretchable wearable device that provides services for processing physiological signals to extract emotion information. The emotion extraction algorithm recognizes emotion from EDR, SKT, and HRV signals measured with fabric sensors. In addition, the proposed wearable device addresses problems faced by many of today's wearable devices: 1) limited battery life, 2) the lack of compatibility and expandability caused by internal components designed only for smartphones, and 3) design, which has always been a crucial factor in determining the success of mainstream consumer wearable devices.
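
The paper's emotion extraction algorithm is not specified in this abstract, so the sketch below is a generic placeholder: it derives simple features from EDR, SKT, and RR-interval data and maps them to an arousal/valence label using assumed thresholds. The thresholds, the RMSSD choice for HRV, and the mapping rule are all assumptions for illustration only.

```python
# Generic sketch (not the paper's algorithm): derive simple features from
# EDR, SKT, and RR-interval data and map them to an arousal/valence label
# using assumed thresholds.
import numpy as np

def extract_features(edr, skt, rr_intervals_ms):
    """Return (mean EDR, mean skin temperature, RMSSD as an HRV index)."""
    rmssd = np.sqrt(np.mean(np.diff(rr_intervals_ms) ** 2))
    return np.mean(edr), np.mean(skt), rmssd

def classify(mean_edr, mean_skt, rmssd,
             edr_thresh=2.0, rmssd_thresh=40.0):
    # Higher electrodermal response ~ higher arousal; higher RMSSD ~ calmer.
    arousal = "high" if mean_edr > edr_thresh else "low"
    valence = "positive" if rmssd > rmssd_thresh else "negative"
    return arousal, valence

edr = np.array([1.8, 2.4, 2.6, 2.2])          # microsiemens (synthetic)
skt = np.array([33.1, 33.0, 32.8, 32.9])      # degrees Celsius (synthetic)
rr  = np.array([820, 790, 845, 805, 830])     # ms between heartbeats (synthetic)

print(classify(*extract_features(edr, skt, rr)))
```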

The Effects of Components of Social Information Processing and Emotional Factors on Preschoolers' Overt and Relational Aggression (사회정보처리 구성요소와 정서요인이 유아의 외현적 공격성과 관계적 공격성에 미치는 영향)

  • Choi, In-Suk;Lee, Kang-Yi
    • Korean Journal of Child Studies
    • /
    • v.31 no.6
    • /
    • pp.15-34
    • /
    • 2010
  • The present study examines sex differences in 5-year-old preschoolers' aggression according to the type of aggression (overt, relational) and the effects of components of social information processing (SIP: interpretation, goal clarification, response generation, response evaluation) and emotional factors (emotionality, emotional knowledge, emotion regulation) on their aggression. The subjects were 112 5-year-olds (56 boys, 56 girls) and their 11 teachers, recruited from 9 day-care centers in Seoul and Kyung-Ki province. Each child's SIP and emotional knowledge were individually assessed with pictorial tasks, and teachers reported on children's aggression, emotionality, and emotion regulation by questionnaire. Results indicated a significant sex difference only in the preschoolers' overt aggression. Overtly aggressive response generation in SIP was the strongest predictor of preschoolers' overt aggression, while anger, among the negative emotionality factors, was the strongest predictor of preschoolers' relational aggression.
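
"Strongest predictor" claims of this kind typically come from a regression with standardized predictors, where coefficient magnitudes are compared. The sketch below shows that general approach on synthetic data; the variable names follow the abstract, but the data, coefficients, and model choice are assumptions rather than the authors' analysis.

```python
# Sketch of the kind of regression behind a "strongest predictor" claim:
# standardize predictors, fit a linear model, and compare coefficient
# magnitudes. Data are synthetic; variable names follow the abstract.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 112
# Predictors: aggressive response generation (SIP) and anger (emotionality).
response_generation = rng.normal(size=n)
anger = rng.normal(size=n)
overt_aggression = (0.6 * response_generation + 0.2 * anger
                    + rng.normal(scale=0.5, size=n))

X = StandardScaler().fit_transform(np.column_stack([response_generation, anger]))
model = LinearRegression().fit(X, overt_aggression)

for name, beta in zip(["response_generation", "anger"], model.coef_):
    print(f"standardized beta for {name}: {beta:.2f}")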

Representation and Detection of Video Shot's Features for Emotional Events (감정에 관련된 비디오 셧의 특징 표현 및 검출)

  • Kang, Hang-Bong;Park, Hyun-Jae
    • The KIPS Transactions:PartB
    • /
    • v.11B no.1
    • /
    • pp.53-62
    • /
    • 2004
  • The processing of emotional information is very important in Human-Computer Interaction (HCI). In particular, dealing with a user's affect is very important in video information processing. To handle emotional information, it is necessary to represent meaningful features and detect them efficiently. Even though it is not easy to detect emotional events from low-level features such as color and motion, it is possible to detect them using statistical analysis such as Linear Discriminant Analysis (LDA). In this paper, we propose a representation scheme for emotion-related features and a detection method. We experiment with features extracted from video to detect emotional events and obtain desirable results.
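
Since the abstract names LDA over low-level color and motion features, a minimal sketch with scikit-learn's LinearDiscriminantAnalysis is shown below. The two-dimensional feature vectors and their composition (mean saturation, motion magnitude) are assumptions for illustration; the paper's actual feature representation may differ.

```python
# Minimal LDA sketch on synthetic shot-level features; the actual feature
# set and labeling in the paper may differ.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
# Synthetic features: [mean saturation, motion magnitude] per shot.
neutral   = rng.normal(loc=[0.3, 0.2], scale=0.1, size=(100, 2))
emotional = rng.normal(loc=[0.6, 0.5], scale=0.1, size=(100, 2))

X = np.vstack([neutral, emotional])
y = np.array([0] * 100 + [1] * 100)   # 0 = neutral shot, 1 = emotional event

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
print("predicted label for a new shot:", lda.predict([[0.55, 0.45]])[0])
```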

Interactive Roles of Local versus Global Primed Identity and Advertisement Framing on Brand Evaluation (브랜드평가에 대한 아이덴티티의 점화와 광고프레임의 상호작용효과)

  • Choi, Nak Hwan;Liu, Cong
    • Science of Emotion and Sensibility
    • /
    • v.16 no.1
    • /
    • pp.11-28
    • /
    • 2013
  • This article explores the interactive roles of the type of primed identity (local versus global) and the type of ad framing in brand evaluations. The authors designed two experiments, each following a 2 × 2 between-subjects design. The empirical results showed that a gain-framed ad induced more positive emotional responses than a loss-framed ad, and the positive affective responses led to more favorable brand evaluations. Furthermore, the results showed interactive effects of primed identity and advertisement frame type on brand evaluation. In an additional analysis, the results showed that when people with a local identity were exposed to the gain-framed ad, they engaged in a higher level of integration processing than those in the control group, which in turn induced a more favorable evaluation of the local brand. That is, the integration processing mode played a mediating role between the interaction (local identity priming × ad frame) and local brand evaluation. However, in the case of global brand evaluation, the integration processing mode did not play such a mediating role.
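
The mediation claim (integration processing carrying the effect of the identity × frame interaction onto local brand evaluation) is commonly checked by comparing the total effect of the predictor with its direct effect after controlling for the mediator. The sketch below shows that generic logic on synthetic data; it is not the authors' analysis, and the variable names and effect sizes are assumptions.

```python
# Rough sketch of a simple mediation check (predictor -> mediator -> outcome)
# with synthetic data; not the authors' statistical analysis.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 200
condition = rng.integers(0, 2, size=n).astype(float)   # treatment cell vs. control
integration = 0.5 * condition + rng.normal(scale=0.5, size=n)           # mediator
evaluation = 0.7 * integration + 0.1 * condition + rng.normal(scale=0.5, size=n)

total  = LinearRegression().fit(condition.reshape(-1, 1), evaluation).coef_[0]
direct = LinearRegression().fit(np.column_stack([condition, integration]),
                                evaluation).coef_[0]

print(f"total effect of condition: {total:.2f}")
print(f"direct effect controlling for the mediator: {direct:.2f}")
# A direct effect much smaller than the total effect is consistent with mediation.
```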

Emotion Recognition Using The Color Image Scale in Clothing Images (의류 영상에서 컬러 영상 척도를 이용한 감성 인식)

  • Lee, Seul-Gi;Woo, Hyo-Jeong;Ryu, Sung-Pil;Kim, Dong-Woo;Ahn, Jae-Hyeong
    • The Journal of the Korea Contents Association
    • /
    • v.14 no.11
    • /
    • pp.1-6
    • /
    • 2014
  • Emotion recognition refers to machines automatically recognizing human emotions. Because human emotion is highly subjective, it cannot be measured objectively; the goal of emotion recognition is therefore to obtain a measure that as many people as possible agree on. Emotion recognition in an image is implemented by matching human emotions to various features of the image. In this paper, we propose an emotion recognition system that uses color features of clothing images based on Kobayashi's color image scale. The proposed system stores the colors of the image scale in a database, and the major colors extracted from an input clothing image are compared with those in the database. The system returns at most three emotions. To evaluate system performance, 70 observers were tested. The test results show that the emotions recognized by the proposed system closely match the observers' emotions.
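
The pipeline described above (extract the major colors of a clothing image, match them against a color-emotion database, return up to three emotions) can be sketched with k-means color quantization and a nearest-neighbor lookup. The color-emotion entries below are invented placeholders, not values from Kobayashi's actual scale, and the image pixels are synthetic.

```python
# Sketch of the described pipeline: quantize an image's pixels to major
# colors, then match each major color to its nearest entry in a
# color-to-emotion table. The table values are placeholders.
import numpy as np
from sklearn.cluster import KMeans

COLOR_EMOTION_DB = {          # RGB -> emotion word (placeholder values)
    (220, 200, 120): "natural",
    (60, 80, 160):   "cool",
    (200, 60, 60):   "dynamic",
}

def recognize_emotions(pixels_rgb, n_major_colors=3):
    """Return up to three emotions for the image's dominant colors."""
    km = KMeans(n_clusters=n_major_colors, n_init=10, random_state=0).fit(pixels_rgb)
    keys = np.array(list(COLOR_EMOTION_DB.keys()), dtype=float)
    emotions = []
    for center in km.cluster_centers_:
        nearest = keys[np.argmin(np.linalg.norm(keys - center, axis=1))]
        emotions.append(COLOR_EMOTION_DB[tuple(int(v) for v in nearest)])
    return list(dict.fromkeys(emotions))[:3]   # deduplicate, keep at most three

# Synthetic "clothing image" pixels (N x 3 RGB array) for demonstration.
pixels = np.vstack([
    np.tile([210, 195, 125], (500, 1)),
    np.tile([65, 85, 150], (300, 1)),
    np.tile([195, 70, 65], (200, 1)),
])
print(recognize_emotions(pixels))
```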

Differences in Large-scale and Sliding-window-based Functional Networks of Reappraisal and Suppression

  • Jun, Suhnyoung;Lee, Seung-Koo;Han, Sanghoon
    • Science of Emotion and Sensibility
    • /
    • v.21 no.3
    • /
    • pp.83-102
    • /
    • 2018
  • The process model of emotion regulation suggests that cognitive reappraisal and expressive suppression engage at different time points in the regulation process. Although multiple brain regions and networks have been identified for each strategy, no articles have explored changes in network characteristics or network connectivity over time. The present study examined (a) the whole-brain network and six other resting-state networks, (b) their modularity and global efficiency, an index of the efficiency of information exchange across the network, (c) the degree and betweenness centrality of 160 brain regions, to identify the hub nodes with the most control over the entire network, and (d) intra-network and inter-network functional connectivity (FC). These investigations were performed using both a traditional large-scale FC analysis and a relatively recent sliding-window correlation analysis. The results showed that the right inferior orbitofrontal cortex was the hub region of the whole-brain network for both strategies. The findings of temporally varying functional network activity revealed that, during reappraisal, the default mode network (DMN) activated at the early stage, followed by the task-positive networks (cingulo-opercular and fronto-parietal networks) and emotion-processing networks (the cerebellar network and DMN), whereas during suppression, the sensorimotor network (SMN) activated at the early stage, followed by greater recruitment of task-positive networks and their functional connections with the emotional response-related networks (SMN and occipital network). This is the first study to provide neuroimaging evidence supporting the process model of emotion regulation by revealing the temporally varying network efficiency and the intra- and inter-network functional connections of reappraisal and suppression.
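
The sliding-window correlation analysis mentioned above has a simple core: slide a fixed-length window over the ROI time series and compute a correlation matrix per window. The sketch below illustrates that core on synthetic data; the window length, step size, connection threshold, and time-series values are arbitrary choices, not the study's parameters.

```python
# Sketch of sliding-window functional connectivity: correlate ROI time
# series within each window. Window length, step, and threshold are
# arbitrary illustration choices.
import numpy as np

rng = np.random.default_rng(3)
n_timepoints, n_rois = 300, 160                # 160 regions, as in the abstract
ts = rng.normal(size=(n_timepoints, n_rois))   # synthetic BOLD time series

def sliding_window_fc(ts, window=50, step=10):
    """Return an array of correlation matrices, one per window."""
    mats = []
    for start in range(0, ts.shape[0] - window + 1, step):
        mats.append(np.corrcoef(ts[start:start + window].T))
    return np.stack(mats)

fc = sliding_window_fc(ts)
print("windows x ROIs x ROIs:", fc.shape)

# A simple binary degree centrality per window: count strong connections.
degree = (np.abs(fc) > 0.3).sum(axis=2) - 1    # subtract the self-connection
print("mean degree of ROI 0 across windows:", degree[:, 0].mean())
```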

Emotion Classification Method Using Various Ocular Features (다양한 눈의 특징 분석을 통한 감성 분류 방법)

  • Kim, Yoonkyoung;Won, Myoung Ju;Lee, Eui Chul
    • The Journal of the Korea Contents Association
    • /
    • v.14 no.10
    • /
    • pp.463-471
    • /
    • 2014
  • In this paper, emotion classification was performed using four ocular features extracted from near-infrared camera images. Compared with previous work, the proposed method uses more ocular features, and each feature was validated as significant for emotion classification. To minimize side effects on the ocular features caused by visual stimuli, auditory stimuli eliciting two opposing emotion pairs, "positive-negative" and "arousal-relaxation," were used. The four features adopted for emotion classification were pupil size, pupil accommodation rate, blink frequency, and eye closed duration, all of which could be extracted automatically with lab-made image processing software. As a result, pupil accommodation rate and blink frequency were statistically significant features for classifying arousal versus relaxation, and eye closed duration was the most significant feature for classifying positive versus negative.
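
The per-feature significance claims in this abstract correspond to simple two-sample comparisons of each ocular feature between conditions. The sketch below runs such a comparison with scipy's independent-samples t-test; the feature values, group sizes, and effect sizes are synthetic assumptions, not the paper's data.

```python
# Sketch of the per-feature significance testing implied by the abstract:
# compare each ocular feature between two conditions with a two-sample
# t-test. All values are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
features = {
    "pupil_size":          (rng.normal(4.0, 0.4, 30),  rng.normal(3.9, 0.4, 30)),
    "pupil_accommodation": (rng.normal(0.9, 0.1, 30),  rng.normal(0.7, 0.1, 30)),
    "blink_frequency":     (rng.normal(18, 3, 30),     rng.normal(14, 3, 30)),
    "eye_closed_duration": (rng.normal(0.30, 0.05, 30), rng.normal(0.28, 0.05, 30)),
}

for name, (arousal, relaxation) in features.items():
    t, p = stats.ttest_ind(arousal, relaxation)
    print(f"{name:22s} t={t:6.2f}  p={p:.4f}")
```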

Emotion from Color Images and Its Application to Content-based Image Retrievals (칼라영상의 감성평가와 이를 이용한 내용기반 영상검색)

  • Park, Joong-Soo;Eum, Kyoung-Bae;Shin, Kyung-Hae;Lee, Joon-Whoan;Park, Dong-Sun
    • The KIPS Transactions:PartB
    • /
    • v.10B no.2
    • /
    • pp.179-188
    • /
    • 2003
  • In content-based image retrieval, the query is an image itself, and the retrieval process seeks images similar to the given query image. In this kind of retrieval, the user has to know the basic physical features of the target images, which is restrictive because the user must think in terms of low-level feature spaces such as color, texture, shape, and spatial relationships. In this paper, we propose an emotion-based retrieval system that uses the emotions evoked by color images. It differs from previous emotion-based image retrieval in that it uses relevance feedback to estimate the user's intent, and it is easily combined with existing content-based image retrieval systems. To test the performance of the proposed system, we use MPEG-7 color descriptors and emotion words such as "warm", "clean", "bright", and "delight". We test about 1,500 wallpaper images and obtain successful results.
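
Relevance feedback over descriptor vectors is commonly implemented Rocchio-style: the query vector is moved toward images the user marks as relevant (e.g., matching the intended emotion word) and away from those marked non-relevant. The sketch below shows that generic mechanism; the 16-dimensional descriptors, weights, and random database are illustrative assumptions, not the paper's MPEG-7 configuration.

```python
# Illustrative Rocchio-style relevance feedback over color-descriptor
# vectors; descriptor layout and weights are assumptions.
import numpy as np

rng = np.random.default_rng(5)
database = rng.random((1500, 16))            # 1,500 images x 16-dim color descriptor
query = rng.random(16)

def retrieve(query, database, top_k=5):
    """Rank images by Euclidean distance to the query descriptor."""
    dists = np.linalg.norm(database - query, axis=1)
    return np.argsort(dists)[:top_k]

def rocchio(query, relevant, non_relevant, alpha=1.0, beta=0.75, gamma=0.25):
    """Shift the query toward relevant examples and away from non-relevant ones."""
    return alpha * query + beta * relevant.mean(axis=0) - gamma * non_relevant.mean(axis=0)

first_pass = retrieve(query, database)
# Suppose the user marks the first two results "warm" (relevant), the rest not.
relevant, non_relevant = database[first_pass[:2]], database[first_pass[2:]]
refined_query = rocchio(query, relevant, non_relevant)
print("refined ranking:", retrieve(refined_query, database))
```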