
An Evaluation System for Preference based on Multi-Emotion

  • Lee, Ki-Young (Dept. of Medical IT & Marketing, Eulji University) ;
  • Lim, Myung-Jae (Dept. of Medical IT & Marketing, Eulji University) ;
  • Kim, Kyu-Ho (Dept. of Medical IT & Marketing, Eulji University) ;
  • Lee, Yong-Whan (Dept. of Broadcasting & Visual Media, Woosong Information College)
  • Received : 2011.02.15
  • Accepted : 2011.02.27
  • Published : 2011.05.31

Abstract

In modern society, customers play an ever-growing role in corporate decision-making, and advances in information and communication technology have prompted research into techniques for effectively measuring the preferences of key customers by computer. Such preferences, however, strongly reflect individual dispositions, so they are difficult to quantify precisely, and different measurement criteria can yield ambiguous results. This paper therefore proposes a system that evaluates customer preference based on a multi-emotion model constructed from measured biometric information. Because the system uses an emotion model structured by learning multi-dimensional vectors composed of several kinds of biometric information, it can evaluate customer preference under a single, consistent criterion. Accuracy can be improved further by training an emotion model specialized for a particular subject, and experiments demonstrated this improvement.
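
The abstract sketches the pipeline (learn an emotion model from multi-dimensional biometric vectors, then score preference under one shared criterion) without naming a specific learning algorithm. The following is a minimal sketch of such a pipeline, assuming a small scikit-learn neural-network classifier; the biometric features, emotion labels, and scoring rule are all illustrative assumptions, not the authors' method.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Hypothetical biometric feature vectors: [heart rate (bpm),
    # skin conductance (uS), skin temperature (degrees C)].
    X_train = rng.normal(loc=[75.0, 5.0, 33.0], scale=[10.0, 2.0, 1.0], size=(300, 3))

    # Illustrative emotion labels (0 = negative, 1 = neutral, 2 = positive),
    # derived from a toy rule only so the example is self-contained.
    y_train = (X_train[:, 0] > 75.0).astype(int) + (X_train[:, 1] > 5.0).astype(int)

    # Learn a structured emotion model from the multi-dimensional vectors.
    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X_train, y_train)

    # Evaluate one customer's preference by averaging the predicted probability
    # of the "positive" emotion over repeated measurements, so the same
    # criterion applies to every customer.
    samples = rng.normal(loc=[82.0, 6.5, 33.5], scale=[8.0, 1.5, 0.8], size=(20, 3))
    positive = list(model.classes_).index(2)
    preference_score = model.predict_proba(samples)[:, positive].mean()
    print(f"preference score: {preference_score:.2f}")

In the paper's setting, the toy labels and the averaged "positive" probability would be replaced by the emotion classes and preference criterion defined by the trained multi-emotion model.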

Keywords
