• Title/Summary/Keyword: Emotional Expressions


Study of Emotion Recognition based on Facial Image for Emotional Rehabilitation Biofeedback (정서재활 바이오피드백을 위한 얼굴 영상 기반 정서인식 연구)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of Institute of Control, Robotics and Systems / v.16 no.10 / pp.957-962 / 2010
  • Recognizing human emotion from facial images requires two steps: extracting emotional features from the face with a feature extraction algorithm, and classifying the emotional state with a pattern classification method. The AAM (Active Appearance Model) is a well-known method for representing non-rigid objects such as faces and facial expressions, and the Bayesian network is a probabilistic classifier that can represent the relationships among a set of facial features. In this paper, facial features are extracted with a proposed method that combines AAM with FACS (Facial Action Coding System) to automatically model and extract emotional facial features. To recognize the facial emotion, DBNs (Dynamic Bayesian Networks) are used to model the temporal phases of facial expressions in image sequences. The recognition results can support biofeedback-based rehabilitation for people with emotional disabilities.
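
The pipeline described above (AAM/FACS feature extraction followed by temporal probabilistic classification) can be pictured with a minimal sketch. The sketch below assumes the facial features have already been extracted and discretized into per-frame symbols, and it uses one small HMM per emotion as a simplified stand-in for the paper's Dynamic Bayesian Networks; the two-emotion setup and all model parameters are purely illustrative.

```python
import numpy as np

class EmotionHMM:
    """A per-emotion HMM: a simple special case of a Dynamic Bayesian Network."""

    def __init__(self, start, trans, emit):
        self.start = np.asarray(start)  # P(state at t=0)
        self.trans = np.asarray(trans)  # P(state_t | state_{t-1})
        self.emit = np.asarray(emit)    # P(observed symbol | state)

    def log_likelihood(self, obs):
        """Scaled forward algorithm over a sequence of discrete observations."""
        alpha = self.start * self.emit[:, obs[0]]
        log_lik = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ self.trans) * self.emit[:, o]
            log_lik += np.log(alpha.sum())
            alpha /= alpha.sum()
        return log_lik

def classify(obs, models):
    """Pick the emotion whose temporal model best explains the frame sequence."""
    return max(models, key=lambda name: models[name].log_likelihood(obs))

# Toy example: observations are discretized facial-feature symbols per frame
# (e.g., quantized AAM/FACS activations); parameters are invented.
models = {
    "happiness": EmotionHMM([0.8, 0.2],
                            [[0.7, 0.3], [0.2, 0.8]],
                            [[0.6, 0.3, 0.1], [0.1, 0.3, 0.6]]),
    "neutral":   EmotionHMM([0.5, 0.5],
                            [[0.9, 0.1], [0.1, 0.9]],
                            [[0.5, 0.4, 0.1], [0.4, 0.5, 0.1]]),
}
print(classify([0, 1, 2, 2, 2], models))  # sequence drifting toward symbol 2
```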

Application and Analysis of Emotional Attributes using Crowdsourced Method for Hangul Font Recommendation System (한글 글꼴 추천시스템을 위한 크라우드 방식의 감성 속성 적용 및 분석)

  • Kim, Hyun-Young;Lim, Soon-Bum
    • Journal of Korea Multimedia Society / v.20 no.4 / pp.704-712 / 2017
  • With the growth of digital content, research on content sensibility is active, and the emotional qualities of fonts are being studied in various fields. Text is most effective when the emotion of the font harmonizes with the emotional tone of the sentence it renders, yet with more than 6,000 Hangul fonts, each carrying its own emotional character, it is difficult to select a font that matches a desired emotion. In this paper, we analysed emotional classification attributes and built a Hangul font recommendation system, verifying the reliability and validity of the attributes before applying them to Hangul fonts. We then tested whether general users could find an appropriate font within a commercial font set through this emotional recommendation system. As a result, when users want to express the emotion of a sentence more visually, they can receive a recommendation for a Hangul font with the desired emotion, based on font emotion attribute values collected through a crowdsourced method.
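
One way to read the recommendation step is as nearest-neighbour matching in an emotion-attribute space: each font carries a vector of crowd-aggregated attribute scores, and the system returns the fonts closest to the user's desired emotion profile. The sketch below is only an assumption about how such matching could look; the attribute names, scores, and cosine-similarity ranking are illustrative, not the paper's actual data or method.

```python
import numpy as np

# Hypothetical emotion attributes and crowd-aggregated scores (0..1) per font.
ATTRIBUTES = ["warm", "playful", "formal", "calm"]
FONT_PROFILES = {
    "FontA": np.array([0.9, 0.7, 0.1, 0.4]),
    "FontB": np.array([0.2, 0.1, 0.9, 0.6]),
    "FontC": np.array([0.6, 0.8, 0.2, 0.3]),
}

def recommend(desired, profiles, top_k=2):
    """Rank fonts by cosine similarity between the desired emotion vector
    and each font's crowd-sourced attribute vector."""
    desired = np.asarray(desired, dtype=float)
    def cosine(v):
        return float(v @ desired / (np.linalg.norm(v) * np.linalg.norm(desired)))
    ranked = sorted(profiles, key=lambda name: cosine(profiles[name]), reverse=True)
    return ranked[:top_k]

# A user who wants a warm, playful feel for their sentence:
print(recommend([1.0, 0.8, 0.0, 0.2], FONT_PROFILES))
```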

GA-optimized Support Vector Regression for an Improved Emotional State Estimation Model

  • Ahn, Hyunchul;Kim, Seongjin;Kim, Jae Kyeong
    • KSII Transactions on Internet and Information Systems (TIIS) / v.8 no.6 / pp.2056-2069 / 2014
  • In order to implement interactive and personalized Web services properly, it is necessary to understand the tangible and intangible responses of the users and to recognize their emotional states. Recently, some studies have attempted to build emotional state estimation models based on facial expressions. Most of these studies have applied multiple regression analysis (MRA), artificial neural networks (ANN), and support vector regression (SVR) as the prediction algorithm, but the prediction accuracies have been relatively low. In order to improve the prediction performance of the emotion prediction model, we propose a novel SVR model that is optimized using a genetic algorithm (GA). Our proposed algorithm, GASVR, is designed to optimize the kernel parameters and the feature subsets of SVRs in order to predict the levels of two aspects of the users' emotions: valence and arousal. In order to validate the usefulness of GASVR, we collected a real-world data set of facial responses and emotional states via a survey. We applied GASVR and other algorithms, including MRA, ANN, and conventional SVR, to the data set. Finally, we found that GASVR outperformed all of the comparative algorithms in the prediction of the valence and arousal levels.
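
The core idea of GASVR, evolving SVR hyperparameters and a feature subset jointly with a genetic algorithm, can be sketched as below. This is a generic GA-wrapped SVR using scikit-learn, with an invented chromosome encoding, truncation selection, one-point crossover, and Gaussian mutation; it is not the authors' exact encoding, operators, or data.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)

def fitness(chrom, X, y):
    """Chromosome = [log2 C, log2 gamma, log2 epsilon, one gene per feature]."""
    C, gamma, eps = 2.0 ** chrom[0], 2.0 ** chrom[1], 2.0 ** chrom[2]
    mask = chrom[3:] > 0.5                       # feature-subset genes
    if not mask.any():
        return -np.inf
    scores = cross_val_score(SVR(C=C, gamma=gamma, epsilon=eps),
                             X[:, mask], y, cv=3,
                             scoring="neg_mean_squared_error")
    return scores.mean()

def gasvr(X, y, pop_size=20, generations=15):
    n_genes = 3 + X.shape[1]
    pop = np.column_stack([rng.uniform(-5, 10, pop_size),     # log2 C
                           rng.uniform(-10, 3, pop_size),     # log2 gamma
                           rng.uniform(-8, 0, pop_size),      # log2 epsilon
                           rng.random((pop_size, X.shape[1]))])
    for _ in range(generations):
        fit = np.array([fitness(ind, X, y) for ind in pop])
        parents = pop[np.argsort(fit)[::-1][: pop_size // 2]]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_genes)                      # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child += rng.normal(0.0, 0.1, n_genes)              # Gaussian mutation
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    fit = np.array([fitness(ind, X, y) for ind in pop])
    return pop[int(np.argmax(fit))]

# Toy stand-in for the facial-response features and a valence rating:
X = rng.random((80, 12))
y = 2 * X[:, 0] - X[:, 3] + rng.normal(0, 0.1, 80)
best = gasvr(X, y)
print("best log2(C, gamma, epsilon):", best[:3])
print("selected feature indices:", np.where(best[3:] > 0.5)[0])
```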

Development of Face Robot Actuated by Artificial Muscle

  • Choi, H.R.;Kwak, J.W.;Chi, H.J.;Jung, K.M.;Hwang, S.H.
    • Institute of Control, Robotics and Systems Conference Proceedings (제어로봇시스템학회 학술대회논문집) / 2004.08a / pp.1229-1234 / 2004
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between the human and the machine. In this paper, we present a face robot actuated with artificial muscle based on dielectric elastomer. By exploiting the properties of polymers, it is possible to actuate the covering skin and provide human-like expressivity without employing complicated mechanisms. The robot is driven by seven types of actuator modules, namely eye, eyebrow, eyelid, brow, cheek, jaw, and neck modules, corresponding to movements of facial muscles. Although these cover only part of the whole set of facial motions, our approach is sufficient to generate six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. Each module communicates with the others via the CAN communication protocol and, according to the desired emotional expression, the facial motions are generated by combining the motions of each actuator module. A prototype of the robot has been developed and several experiments have been conducted to validate its feasibility.
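
The composition step, generating an expression by combining the motions of the individual actuator modules, can be pictured with the small sketch below. The module names follow the abstract, but the normalized target positions per expression and the command interface are hypothetical placeholders, not the robot's real control values.

```python
MODULES = ["eye", "eyebrow", "eyelid", "brow", "cheek", "jaw", "neck"]

# Hypothetical normalized targets (0.0 = rest, 1.0 = full stroke) per expression.
EXPRESSIONS = {
    "surprise":  {"eyebrow": 1.0, "eyelid": 1.0, "jaw": 0.8},
    "happiness": {"cheek": 0.9, "eyelid": 0.3, "jaw": 0.2},
    "sadness":   {"eyebrow": 0.4, "brow": 0.7, "neck": 0.2},
}

def expression_commands(name):
    """Expand an expression into one target command per actuator module;
    modules not listed for the expression stay at their rest position."""
    targets = EXPRESSIONS[name]
    return {module: targets.get(module, 0.0) for module in MODULES}

for module, position in expression_commands("surprise").items():
    # In the real robot, each command would be sent to its module over CAN.
    print(f"{module:8s} -> {position:.1f}")
```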

Relations between Parenting-Related Variables and Mother-Infant Interactive Behaviors (양육관련변인과 어머니-영아 상호작용행동간의 관계)

  • Yang, Ha-Young;ParkChoi, Hye-Won
    • Journal of the Korean Home Economics Association / v.49 no.3 / pp.99-111 / 2011
  • Relations between mother-infant interactive behaviors and parenting-related variables were analyzed in a sample of 72 infants (35 boys and 37 girls, average age = 31 months) and their mothers in Ulsan, Korea. Parents' views on children, parenting stress, fathers' participation in parenting, and social support were measured with questionnaires, and mother-infant interactive behaviors were observed using the 3-bags test. Among the parenting-related variables, parents' views on children were significantly related to mothers' emotional expressions and to infants' engagement with their mothers. Social support was correlated with mothers' behaviors, including positive rewards and emotional expressions. Mother-infant interactive behaviors were closely correlated with one another: mothers' positive behaviors, such as overall reactivity and cognitive stimulation, were positively correlated with infants' positive behaviors, including sustained attention and positive affect. Future studies will provide greater insight into the mechanisms underlying the effects of these parenting-related variables on infant behavior and development.

A Face Robot Actuated With Artificial Muscle Based on Dielectric Elastomer

  • Kwak Jong Won;Chi Ho June;Jung Kwang Mok;Koo Ja Choon;Jeon Jae Wook;Lee Youngkwan;Nam Jae-do;Ryew Youngsun;Choi Hyouk Ryeol
    • Journal of Mechanical Science and Technology / v.19 no.2 / pp.578-588 / 2005
  • Face robots capable of expressing their emotional status can be adopted as an efficient tool for friendly communication between the human and the machine. In this paper, we present a face robot actuated with artificial muscle based on dielectric elastomer. By exploiting the properties of dielectric elastomer, it is possible to actuate the covering skin and eyes as well as to provide human-like expressivity without employing complicated mechanisms. The robot is driven by seven actuator modules, namely eye, eyebrow, eyelid, brow, cheek, jaw, and neck modules, corresponding to movements of facial muscles. Although these cover only part of the whole set of facial motions, our approach is sufficient to generate six fundamental facial expressions: surprise, fear, anger, disgust, sadness, and happiness. In the robot, each module communicates with the others via the CAN communication protocol and, according to the desired emotional expression, the facial motions are generated by combining the motions of each actuator module. A prototype of the robot has been developed and several experiments have been conducted to validate its feasibility.
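
Since the modules coordinate over CAN, a command to one module would travel as a short data frame. The sketch below packs a hypothetical module ID and target position into an 8-byte CAN payload; the ID assignments, scaling, and byte layout are assumptions for illustration, not the robot's actual protocol.

```python
import struct

# Hypothetical CAN arbitration IDs for the seven actuator modules.
MODULE_IDS = {"eye": 0x10, "eyebrow": 0x11, "eyelid": 0x12, "brow": 0x13,
              "cheek": 0x14, "jaw": 0x15, "neck": 0x16}

def encode_command(module, position, speed=1.0):
    """Build (arbitration_id, data) for a module command.
    position and speed are normalized to 0..1 and scaled to 16-bit integers."""
    pos = int(max(0.0, min(1.0, position)) * 0xFFFF)
    spd = int(max(0.0, min(1.0, speed)) * 0xFFFF)
    data = struct.pack(">HHxxxx", pos, spd)  # 8-byte payload, big-endian
    return MODULE_IDS[module], data

arb_id, payload = encode_command("jaw", 0.8, speed=0.5)
print(hex(arb_id), payload.hex())
```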

A Study on the Creative Expression of Fashion illustration - Focusing on The Expression of Elements - (패션일러스트레이션의 창의적 표현 방법 연구 - 표현 요소를 중심으로 -)

  • 성유정;유영선
    • Journal of the Korean Society of Costume / v.52 no.7 / pp.13-25 / 2002
  • The purpose of this study is to develop a creative expression technique in fashion illustration by analyzing how the elements related to creative expression are applied in fashion illustrations. The elements were investigated in terms of line, form, texture, color, and space, and the results are summarized as follows. In creative expression using line, the emotional effects of lines created by the specific character of the medium and by duplicated lines have been applied to visualize the movement of objects. In shape, figures transformed by destructing, covering, or eliminating part of the figure or the dress have been adopted. In texture, invented textures reconstructed from actual textures have been applied, with collage techniques and computer graphics used to give varied impressions of texture. Creative expression using color has been achieved through shading and spreading effects and the symbolic meaning of color, creating spatiality in the picture plane and providing emotional effects and visual concentration. In space, color perspective together with detailed description, the combination of various points of view, and linear perspective have been used to create depth and illusory space in the pictorial plane.

A Study on Characteristics of Experiencing Sensibility Space in The Perception of Kineticism 'Movement' (키네티시즘의 '움직임' 지각을 통한 체험적 감성 공간 특성에 관한 연구)

  • Kim, Jun-Young;Yoon, Jae-Eun
    • Korean Institute of Interior Design Journal / v.19 no.3 / pp.67-76 / 2010
  • Recently, in the area of space design, there has been a strong trend toward placing human emotions, rather than function alone, at the center. Against this background, this study examines the characteristics of experiential emotional space through an approach to the space appearing in kinetic art, noting that the new spatial expressions and forms of Kineticism lie in the introduction of 'movement' as a phenomenon that humans, the agents of experience, can perceive. As for method, characteristics for creating space through the 'movement' of Kineticism were extracted and spatial expressions were proposed for each, and the features of experiential emotional space were derived by analyzing the sensory elements and perceptual characteristics that stimulate human sensibility in the expressions of 'movement' appearing in the cases. As a result, the characteristics found include immersion through non-everyday stimulation, empathy through visual and perceptual stimulation, synaesthetic experience through the stimulation of thinking senses, and perceptual activation through physical movement. The significance of this study lies in seeking new directions and possibilities for emotional space that activates experiencers' diverse perceptions and senses, by analyzing the characteristics of experiential emotional space through the 'movement' of Kineticism, one of the modern plastic arts.

A Comparison of Effective Feature Vectors for Speech Emotion Recognition (음성신호기반의 감정인식의 특징 벡터 비교)

  • Shin, Bo-Ra;Lee, Soek-Pil
    • The Transactions of The Korean Institute of Electrical Engineers / v.67 no.10 / pp.1364-1369 / 2018
  • Speech emotion recognition, which aims to classify a speaker's emotional state from speech signals, is one of the essential tasks for making human-machine interaction (HMI) more natural and realistic. Voice expression is one of the main information channels in interpersonal communication. However, existing speech emotion recognition technology has not achieved satisfactory performance, probably because of the lack of effective emotion-related features. This paper surveys the various features used for speech emotion recognition and discusses which features, or which combinations of features, are valuable and meaningful for emotion classification. The main aim of this paper is to discuss and compare the approaches used for feature extraction and to propose a basis for extracting useful features in order to improve SER (speech emotion recognition) performance.
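
A typical feature vector of the kind such surveys compare combines spectral, prosodic, and energy descriptors pooled over an utterance. The sketch below extracts MFCC, pitch, and RMS-energy statistics with librosa as one plausible combination; the exact feature set, pooling, and file path are assumptions, not the paper's recommended configuration.

```python
import numpy as np
import librosa

def utterance_features(path, n_mfcc=13):
    """Build a fixed-length feature vector for one utterance by pooling
    frame-level MFCC, pitch (YIN), and RMS energy with mean and std."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)   # (n_mfcc, frames)
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)            # frame-level pitch
    rms = librosa.feature.rms(y=y)[0]                        # frame-level energy
    parts = [mfcc.mean(axis=1), mfcc.std(axis=1),
             [np.nanmean(f0), np.nanstd(f0)],
             [rms.mean(), rms.std()]]
    return np.concatenate([np.atleast_1d(p) for p in parts])

# Usage (path is a placeholder): features = utterance_features("utterance.wav")
```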

The Effects of Emotional Contexts on Infant Smiling (정서 유발 맥락이 영아의 미소 얼굴 표정에 미치는 영향)

  • Hong, Hee Young;Lee, Young
    • Korean Journal of Child Studies / v.24 no.6 / pp.15-31 / 2003
  • This study examined the effects of emotion-inducing contexts on the types of infant smiling. Facial expressions of forty-five 11- to 15-month-old infants were videotaped in an experimental lab under positive and negative emotional contexts. Infants' smiles were identified as Duchenne or non-Duchenne smiles based on FACS (Facial Action Coding System; Ekman & Friesen, 1978), and the duration of each smile type was analyzed. Overall, infants smiled more in the positive than in the negative emotional context. Duchenne smiling was more likely in the positive than in the negative context, and more likely in the peek-a-boo than in the melody-toy condition within the same positive context. Non-Duchenne smiling did not differ by context.
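
In FACS terms, the Duchenne/non-Duchenne distinction used above comes down to whether the lip-corner puller (AU12) is accompanied by the cheek raiser (AU6). The sketch below codes frame-level action-unit activations accordingly and sums smile durations per clip; the frame format, frame rate, and example data are invented for illustration.

```python
FRAME_SECONDS = 1 / 30  # assumed video frame rate

def smile_type(active_aus):
    """Classify one frame: Duchenne = AU12 + AU6, non-Duchenne = AU12 alone."""
    if 12 not in active_aus:
        return None
    return "duchenne" if 6 in active_aus else "non_duchenne"

def smile_durations(frames):
    """Sum the duration of each smile type over a sequence of coded frames."""
    totals = {"duchenne": 0.0, "non_duchenne": 0.0}
    for aus in frames:
        kind = smile_type(aus)
        if kind:
            totals[kind] += FRAME_SECONDS
    return totals

# Invented coding of a short positive-context clip (sets of active AUs per frame):
peekaboo_frames = [{6, 12}, {6, 12}, {12}, set(), {6, 12}]
print(smile_durations(peekaboo_frames))
```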
