• Title/Summary/Keyword: Facial emotion


Implementation of Multi Channel Network Platform based Augmented Reality Facial Emotion Sticker using Deep Learning (딥러닝을 이용한 증강현실 얼굴감정스티커 기반의 다중채널네트워크 플랫폼 구현)

  • Kim, Dae-Jin
    • Journal of Digital Contents Society
    • /
    • v.19 no.7
    • /
    • pp.1349-1355
    • /
    • 2018
  • Recently, a variety of content services over the Internet have become popular; among them, MCN (Multi Channel Network) platform services have spread with the generalization of smartphones. The MCN platform is based on streaming, and various features are added to improve the service; among these, augmented-reality sticker services using face recognition are widely used. In this paper, we implemented an MCN platform that overlays an augmented-reality sticker on the face through facial emotion recognition in order to further increase user interest. We classified seven facial emotions using deep learning and applied the corresponding emotion sticker to the face. To implement the proposed MCN platform, emotion stickers were applied on the client side, and the various servers required for streaming were designed.
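The sticker masking described above can be sketched as a simple alpha blend of a sticker image onto the detected face region. The function below is an illustrative stand-in: the face box and the emotion label that selects the sticker are assumed to come from separate detection and classification models, which are not shown.

```python
import numpy as np

def overlay_sticker(frame, sticker, alpha, top, left):
    """Alpha-blend `sticker` (H x W x 3) onto `frame` at (top, left).

    `alpha` is an H x W matte in [0, 1]; 1 means fully sticker.
    Returns a new frame; the input frame is not modified.
    """
    out = frame.astype(np.float32).copy()
    h, w = sticker.shape[:2]
    region = out[top:top + h, left:left + w]
    a = alpha[..., None]  # broadcast the matte over the color channels
    out[top:top + h, left:left + w] = a * sticker + (1.0 - a) * region
    return out.astype(frame.dtype)
```

On the client, a blend like this would run per frame before the stream is encoded.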

Emotion Recognition of Korean and Japanese using Facial Images (얼굴영상을 이용한 한국인과 일본인의 감정 인식 비교)

  • Lee, Dae-Jong;Ahn, Ui-Sook;Park, Jang-Hwan;Chun, Myung-Geun
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.15 no.2
    • /
    • pp.197-203
    • /
    • 2005
  • In this paper, we propose an emotion recognition method using facial images to effectively design a human interface. The facial database consists of six basic human emotions (happiness, sadness, anger, surprise, fear, and dislike), which are known to be common regardless of nation and culture. Emotion recognition is performed after applying a discrete wavelet transform to the facial images; feature vectors are then extracted by PCA and LDA. Experimental results show that happiness, sadness, and anger are recognized better than surprise, fear, and dislike. In particular, Japanese subjects show lower performance for the dislike emotion. Overall, recognition rates for Korean subjects are higher than those for Japanese subjects.
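The wavelet-plus-PCA/LDA pipeline outlined above can be sketched on synthetic data. In this sketch a one-level Haar approximation (2x2 block averaging) stands in for the paper's discrete wavelet step, and the two toy "emotion" classes are purely illustrative:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def haar_approx(img):
    """One-level Haar approximation subband: average each 2x2 block."""
    return 0.25 * (img[0::2, 0::2] + img[0::2, 1::2]
                   + img[1::2, 0::2] + img[1::2, 1::2])

rng = np.random.default_rng(0)
n_per_class, size = 20, 16
# Two toy classes: darker vs. brighter synthetic face patches.
X = np.vstack([
    np.stack([haar_approx(rng.random((size, size)) + c).ravel()
              for _ in range(n_per_class)])
    for c in (0.0, 0.5)
])
y = np.repeat([0, 1], n_per_class)

# PCA reduces dimensionality; LDA then separates the classes.
features = PCA(n_components=10).fit_transform(X)
lda = LinearDiscriminantAnalysis().fit(features, y)
accuracy = lda.score(features, y)  # training accuracy on the toy data
```

A real pipeline would use a full multi-level DWT and a held-out test set; this only shows how the stages compose.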

Facial Image Analysis Algorithm for Emotion Recognition (감정 인식을 위한 얼굴 영상 분석 알고리즘)

  • Joo, Y.H.;Jeong, K.H.;Kim, M.H.;Park, J.B.;Lee, J.;Cho, Y.J.
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.14 no.7
    • /
    • pp.801-806
    • /
    • 2004
  • Although emotion recognition is an important technology demanded in various fields, it remains an unsolved problem; in particular, algorithms based on human facial images still need development. In this paper, we propose a facial image analysis algorithm for emotion recognition. The proposed algorithm is composed of a facial region extraction algorithm and a facial component extraction algorithm. To achieve robust performance under various illumination conditions, a fuzzy color filter is proposed for facial region extraction. In facial component extraction, a virtual face model is used to provide information for high-accuracy analysis. Finally, simulations are given to check and evaluate the performance.
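A fuzzy color filter of the kind proposed can be illustrated with triangular membership functions over the chrominance planes, which are largely insensitive to illumination intensity. The membership centres and widths below are assumptions for the sketch, not the paper's tuned values:

```python
import numpy as np

def triangular(x, center, width):
    """Triangular fuzzy membership function with peak 1 at `center`."""
    return np.clip(1.0 - np.abs(x - center) / width, 0.0, 1.0)

def skin_membership(rgb):
    """Fuzzy skin degree in [0, 1] for an H x W x 3 float RGB image."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # ITU-R BT.601 chrominance (zero-centered form, 128 offset omitted).
    cb = -0.169 * r - 0.331 * g + 0.500 * b
    cr = 0.500 * r - 0.419 * g - 0.081 * b
    # Fuzzy AND (minimum) of the two chrominance memberships.
    return np.minimum(triangular(cb, -30.0, 25.0),
                      triangular(cr, 25.0, 20.0))
```

Thresholding the membership map (e.g. `skin_membership(img) > 0.5`) yields a candidate facial region mask.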

Recognition of Facial Emotion Using Multi-scale LBP (멀티스케일 LBP를 이용한 얼굴 감정 인식)

  • Won, Chulho
    • Journal of Korea Multimedia Society
    • /
    • v.17 no.12
    • /
    • pp.1383-1392
    • /
    • 2014
  • In this paper, we propose a method that automatically determines the optimal radius for facial emotion recognition through a multi-scale LBP operation, which generalizes the radius of the operator, combined with boosting-based learning. Examining the distribution of selected feature vectors, $LBP_{8,1}$ was the most common at 31%, and $LBP_{8,1}$ and $LBP_{8,2}$ together accounted for 57.5%; $LBP_{8,3}$, $LBP_{8,4}$, and $LBP_{8,5}$ accounted for 18.5%, 12.0%, and 12.0%, respectively. Patterns with relatively large radii were found to express the characteristics of the face well. For the normal (neutral) and anger expressions, $LBP_{8,1}$ and $LBP_{8,2}$ were dominant, whereas for laughter and surprise the proportion of $LBP_{8,3}$ was greater than or equal to that of $LBP_{8,1}$. Thus, radii greater than 1 or 2 are useful for recognizing specific emotions. The facial expression recognition rate of the proposed multi-scale LBP method was 97.5%; its superiority was confirmed through various experiments.
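The multi-scale $LBP_{8,R}$ features can be sketched as follows. This dependency-free version approximates the 8 circular sampling points at radius R with the axial and diagonal grid neighbours; a real implementation would bilinearly interpolate the off-grid points:

```python
import numpy as np

def lbp_codes(img, radius):
    """8-neighbour LBP codes for the interior pixels of a 2-D array."""
    offsets = [(-radius, 0), (-radius, radius), (0, radius),
               (radius, radius), (radius, 0), (radius, -radius),
               (0, -radius), (-radius, -radius)]
    r = radius
    center = img[r:-r, r:-r]
    codes = np.zeros_like(center, dtype=np.int32)
    for bit, (dy, dx) in enumerate(offsets):
        # Shifted view of the image: the neighbour at (dy, dx) of each
        # interior pixel, same shape as `center`.
        neighbour = img[r + dy: img.shape[0] - r + dy,
                        r + dx: img.shape[1] - r + dx]
        codes |= (neighbour >= center).astype(np.int32) << bit
    return codes

def multi_scale_lbp_hist(img, radii=(1, 2, 3, 4, 5)):
    """Concatenated 256-bin LBP histograms over several radii."""
    hists = [np.bincount(lbp_codes(img, r).ravel(), minlength=256)
             for r in radii]
    return np.concatenate(hists)
```

In the paper's setting, boosting would then select the most discriminative bins (and hence radii) from this concatenated descriptor.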

A Study on the System of Facial Expression Recognition for Emotional Information and Communication Technology Teaching (감성ICT 교육을 위한 얼굴감성 인식 시스템에 관한 연구)

  • Song, Eun Jee
    • The Journal of Korean Institute for Practical Engineering Education
    • /
    • v.4 no.2
    • /
    • pp.171-175
    • /
    • 2012
  • Recently, research on emotional ICT (Information and Communication Technology), which recognizes and communicates human emotion through information technology, is increasing. For instance, there are studies on phones and services that perceive users' emotions by detecting their voices, facial expressions, and biometric data. In short, emotions that used to be predicted only by humans are now predicted by digital equipment. Among these approaches, emotion recognition from the face is expected to be the most effective and natural human interface. This paper studies emotional ICT and examines the mechanism of a facial expression recognition system as an example.


Development of Emotional Feature Extraction Method based on Advanced AAM (Advanced AAM 기반 정서특징 검출 기법 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.19 no.6
    • /
    • pp.834-839
    • /
    • 2009
  • Extracting emotional features from facial images is a key element in recognizing a human's emotional state. In this paper, we propose an Advanced AAM, an improved version of previously proposed facial expression recognition systems based on Bayesian networks using FACS and AAM. This is a study of the most efficient way to select the optimal facial feature areas for emotion recognition of an arbitrary user in a generalized HCI system environment. To perform these processes, we apply statistical shape analysis to input images normalized with the Advanced AAM, using FACS as the facial expression and emotion analysis framework, and we study automatic emotional feature extraction for arbitrary users.
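The statistical shape analysis underlying an AAM starts by similarity-aligning landmark shapes so that only shape variation remains. The function below is a simplified Procrustes step (reflection handling and the subsequent PCA shape model are omitted):

```python
import numpy as np

def align(shape, reference):
    """Similarity-align an (N, 2) landmark shape to `reference`.

    Removes translation and scale, then finds the rotation that best
    matches the reference (Kabsch solution via SVD). Returns the
    aligned, unit-norm, zero-centred shape.
    """
    a = shape - shape.mean(axis=0)          # remove translation
    b = reference - reference.mean(axis=0)
    a = a / np.linalg.norm(a)               # remove scale
    b = b / np.linalg.norm(b)
    # Optimal rotation from the SVD of the cross-covariance matrix.
    u, _, vt = np.linalg.svd(a.T @ b)
    return a @ (u @ vt)
```

Stacking aligned shapes and running PCA on them would give the shape modes of the model; the appearance (texture) model is a separate component.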

Difference of Facial Skin Temperature Responses between Fear and Joy (공포와 기쁨 정서 간 안면온도 반응의 차이)

  • Eum, Yeong-Ji;Eom, Jin-Sup;Sohn, Jin-Hun
    • Science of Emotion and Sensibility
    • /
    • v.15 no.1
    • /
    • pp.1-8
    • /
    • 2012
  • Many emotion studies have investigated physiological responses to specific emotions using parameters such as heart rate, blood volume flow, and skin conductance, but very few have used facial skin temperature. The purpose of the present study was to observe differences in facial skin temperature, measured with a thermal camera, while participants watched on-screen stimuli that evoke fear or joy. There were 98 participants in total: undergraduate students (adults) and middle and high school students (adolescents). We measured facial temperature before and after presenting the emotional stimulus to observe changes between the two times. Temperature values were extracted from the forehead, the inner corners of the eyes, the bridge of the nose, the tip of the nose, and the cheeks. In the fear condition, temperatures at the bridge and the tip of the nose decreased significantly. In the joy condition, temperatures at the forehead and the inner corners of the eyes increased significantly, while the temperature at the tip of the nose decreased; thus the nose tip cooled under both fear and joy. These results can be explained as follows: as arousal rises, sympathetic nervous activity increases, which in turn reduces blood flow in the peripheral vessels beneath the skin of the nose. The facial temperature changes induced by fear or joy in this study match previous studies that measured fingertip temperature while participants experienced emotions. Our results may help develop emotion-measurement techniques and establish computer systems that detect human emotions.
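The measurement step described above reduces to averaging temperature inside facial regions of interest (ROIs) and differencing baseline and stimulus frames. The ROI coordinates below are illustrative placements, not the study's actual regions:

```python
import numpy as np

ROIS = {  # name: (top, bottom, left, right) in pixel coordinates
    "forehead":    (10, 30, 40, 80),
    "nose_bridge": (40, 55, 55, 65),
    "nose_end":    (55, 65, 52, 68),
}

def roi_means(frame, rois=ROIS):
    """Mean value of each ROI in a 2-D temperature array (deg C)."""
    return {name: float(frame[t:b, l:r].mean())
            for name, (t, b, l, r) in rois.items()}

def roi_deltas(baseline, stimulus, rois=ROIS):
    """Per-ROI temperature change: stimulus minus baseline."""
    base, stim = roi_means(baseline, rois), roi_means(stimulus, rois)
    return {name: stim[name] - base[name] for name in rois}
```

With real thermal frames, a negative `nose_end` delta would correspond to the nose-tip cooling reported for both fear and joy.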


The Effects of Emotional Contexts on Infant Smiling (정서 유발 맥락이 영아의 미소 얼굴 표정에 미치는 영향)

  • Hong, Hee Young;Lee, Young
    • Korean Journal of Child Studies
    • /
    • v.24 no.6
    • /
    • pp.15-31
    • /
    • 2003
  • This study examined the effects of emotion-inducing contexts on the types of infant smiling. Facial expressions of forty-five 11- to 15-month-old infants were videotaped in an experimental lab under positive and negative emotional contexts. Infants' smiles were identified as Duchenne or non-Duchenne smiles based on FACS (Facial Action Coding System; Ekman & Friesen, 1978), and the duration of each smile type was analyzed. Overall, infants smiled more in the positive than in the negative emotional context. Duchenne smiling was more likely in the positive than in the negative context, and more likely in the peek-a-boo than in the melody-toy condition within the same positive context. Non-Duchenne smiling did not differ by context.


Facial expression recognition based on pleasure and arousal dimensions (쾌 및 각성차원 기반 얼굴 표정인식)

  • Shin, Young-Suk;Choi, Kwang-Nam
    • Korean Journal of Cognitive Science
    • /
    • v.14 no.4
    • /
    • pp.33-42
    • /
    • 2003
  • This paper presents a new system for facial expression recognition based on a dimension model of internal states. Facial expression information is extracted in three steps. First, a Gabor wavelet representation extracts the edges of the face components. Second, sparse features of the facial expressions are extracted by applying the fuzzy C-means (FCM) clustering algorithm to neutral faces, and third, by applying the Dynamic Model (DM) to the expression images. Finally, facial expressions are recognized on the dimension model of internal states using a multi-layer perceptron. The two-dimensional structure of emotion makes it possible to recognize not only facial expressions related to basic emotions but also expressions of various other emotions.
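The final stage, a multi-layer perceptron over the two-dimensional pleasure-arousal space, can be sketched on synthetic clusters. The cluster centres below are illustrative placements, not values from the paper, and the Gabor and FCM feature-extraction stages are omitted:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

CENTRES = {            # (pleasure, arousal) in [-1, 1]^2, assumed
    "happiness": ( 0.8,  0.5),
    "sadness":   (-0.7, -0.4),
    "anger":     (-0.8,  0.7),
    "calm":      ( 0.5, -0.6),
}

rng = np.random.default_rng(1)
labels = list(CENTRES)
# 50 noisy samples per emotion around each centre.
X = np.vstack([rng.normal(CENTRES[e], 0.1, size=(50, 2)) for e in labels])
y = np.repeat(np.arange(len(labels)), 50)

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)
```

Because the classifier operates on coordinates rather than discrete categories, intermediate expressions map to nearby points in the same plane, which is the advantage the dimension model claims over basic-emotion classifiers.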


A Comparative Analysis on Facial Expression in Advertisements -By Utilising Facial Action Coding System(FACS) (광고 속의 얼굴 표정에 따른 비교 연구 -FACS를 활용하여)

  • An, Kyoung Hee
    • The Journal of the Korea Contents Association
    • /
    • v.19 no.3
    • /
    • pp.61-71
    • /
    • 2019
  • Because advertisements are limited in length, facial expressions, among the types of nonverbal communication, are especially expressive and convincing in appealing to customers. The purpose of this paper is to investigate both how facial expressions are portrayed and how they convey emotion in TV advertisements. The research subjects are two TV advertisements that achieved wide popularity among customers and are known as among the most touching commercials. The research method is the Facial Action Coding System (FACS), which is grounded in the theoretical perspective of discrete emotions and designed to measure specific facial muscle movements. This study analyzes the implications of the facial expressions in both TV ads using FACS, drawing on psychology as well as anatomy. The results show that facial expressions portraying conflicting emotional states, together with the heroine's dramatic emotional relief, can move customers' emotions.