http://dx.doi.org/10.13088/jiis.2013.19.2.073

The Audience Behavior-based Emotion Prediction Model for Personalized Service  

Ryoo, Eun Chung (Graduate School of Business Administration, Kyung Hee University)
Ahn, Hyunchul (School of Management Information Systems, Kookmin University)
Kim, Jae Kyeong (Graduate School of Business Administration, Kyung Hee University)
Publication Information
Journal of Intelligence and Information Systems, v.19, no.2, 2013, pp. 73-85
Abstract
In today's information society, knowledge services that create value from information are becoming more important every day. Advances in IT have made it easy to collect and use information, and companies in many industries actively use customer information for marketing. Since the start of the 21st century, companies have also made active use of culture and the arts to manage their corporate image and to support marketing that is closely tied to their commercial interests, because it is difficult to attract or retain consumers' interest through technology alone; cultural activities have therefore become a tool of differentiation among firms. Many firms now use the customer's experience as a new marketing strategy in order to respond effectively to competitive markets. Accordingly, the need for personalized services that provide people with new experiences, based on personal profile information that captures the characteristics of the individual, is growing rapidly. Personalized services that draw on individual profile information such as language, symbols, behavior, and emotions make it possible to assess the interaction between people and content and to maximize customer experience and satisfaction. Various related studies have addressed customer-centered services, and emotion recognition research in particular has emerged recently. Most existing studies recognize emotion from bio-signals, or from voice and facial expressions, which show large emotional changes; however, limitations of the required equipment and of real service environments make it difficult to predict people's emotions in practice. In this paper, we therefore develop an emotion prediction model based on a vision-based interface to overcome these limitations. Emotion recognition based on body gesture and posture has been studied by several researchers. We develop a model that recognizes people's emotional states from body gesture and posture using the difference image method, and we identify the best-validated model for predicting four emotions. The proposed model aims to automatically determine and predict four human emotions: Sadness, Surprise, Joy, and Disgust. To build the model, an event booth was installed in the lobby of KOCCA, and appropriate stimulus movies were shown to collect participants' body gestures and postures as their emotions changed. Body movements were then extracted using the difference image method, and the data were preprocessed to build the proposed model with a neural network. Three time-frame settings (20, 30, and 40 frames) were used to construct candidate models, and the model with the best performance was adopted. Before the three models were built, the entire set of 97 samples was divided into learning, test, and validation sets. The prediction model was constructed as an artificial neural network trained with the back-propagation algorithm, with the learning rate set to 10% and the momentum rate to 10%; the sigmoid function was used as the transfer function, and the network was designed as a three-layer perceptron with one hidden layer and four output nodes. Based on the test set, training was stopped at 50,000 iterations after the minimum error had been reached, in order to locate the appropriate stopping point for learning.
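To make the movement-extraction step concrete, the sketch below (Python with NumPy) shows one common way to implement the difference image method over a sequence of grayscale frames and to aggregate the resulting motion into fixed time windows such as 20, 30, or 40 frames. The binarization threshold and the windowed averaging are illustrative assumptions, not the exact feature definition used in the paper.

```python
import numpy as np

def difference_image(prev_frame: np.ndarray, curr_frame: np.ndarray,
                     threshold: int = 30) -> np.ndarray:
    """Binary map of pixels that changed between two grayscale frames.
    The threshold value is an illustrative assumption."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

def movement_features(frames: list, window: int = 20) -> np.ndarray:
    """Aggregate per-frame movement into fixed-size time windows
    (e.g., 20, 30, or 40 frames), one feature value per window."""
    amounts = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        mask = difference_image(prev, curr)
        amounts.append(mask.mean())           # fraction of pixels that moved
    amounts = np.asarray(amounts)
    n_windows = len(amounts) // window
    trimmed = amounts[:n_windows * window].reshape(n_windows, window)
    return trimmed.mean(axis=1)               # average movement per window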
We finally evaluated each model's accuracy and identified the best model for predicting each emotion. The results showed prediction accuracies of 100% for sadness and 96% for joy with the 20-frame model, and 88% for surprise and 98% for disgust with the 30-frame model. The findings of this research are expected to provide an effective algorithm for personalized services in various industries such as advertising, exhibitions, and performances.
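For readers who want a concrete picture of the modeling step, the following is a minimal sketch of the network configuration described in the abstract: a three-layer perceptron with one hidden layer and four sigmoid output nodes (Sadness, Surprise, Joy, Disgust), trained by back-propagation with a learning rate of 0.1 and a momentum rate of 0.1. The hidden-layer size, the weight initialization, and the omission of bias terms are simplifying assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ThreeLayerPerceptron:
    """One hidden layer, four sigmoid outputs (Sadness, Surprise, Joy, Disgust)."""

    def __init__(self, n_inputs, n_hidden=10, n_outputs=4, lr=0.1, momentum=0.1):
        rng = np.random.default_rng(0)
        # Hidden-layer size and weight scale are illustrative assumptions.
        self.W1 = rng.normal(scale=0.1, size=(n_inputs, n_hidden))
        self.W2 = rng.normal(scale=0.1, size=(n_hidden, n_outputs))
        self.lr, self.momentum = lr, momentum
        self.dW1_prev = np.zeros_like(self.W1)
        self.dW2_prev = np.zeros_like(self.W2)

    def forward(self, X):
        self.h = sigmoid(X @ self.W1)     # hidden-layer activations
        self.y = sigmoid(self.h @ self.W2)  # four emotion outputs
        return self.y

    def train_step(self, X, T):
        """One back-propagation update with momentum; returns mean squared error."""
        Y = self.forward(X)
        err = T - Y
        delta_out = err * Y * (1 - Y)                          # sigmoid derivative
        delta_hid = (delta_out @ self.W2.T) * self.h * (1 - self.h)
        dW2 = self.lr * (self.h.T @ delta_out) + self.momentum * self.dW2_prev
        dW1 = self.lr * (X.T @ delta_hid) + self.momentum * self.dW1_prev
        self.W2 += dW2
        self.W1 += dW1
        self.dW2_prev, self.dW1_prev = dW2, dW1
        return float((err ** 2).mean())
```

In training, the update step would be repeated for up to 50,000 iterations while monitoring the error on the test set and keeping the weights at the minimum test error, matching the stopping rule described in the abstract.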
Keywords
Emotion Prediction Model; Audience Experience; Gesture Recognition; Neural Network; Difference Image Method;