Music Emotion Classification Based On Three-Level Structure

  • Published : 2007.06.30

Abstract

This paper presents automatic music emotion classification on acoustic data. A three-level structure is developed: the low level extracts timbre and rhythm features; the middle level estimates indication functions that represent the emotion probability of a single analysis unit; and the high level predicts the emotion of a piece from the indication function values. Experiments are carried out on 695 homogeneous music pieces labeled with four emotions: pleasant, calm, sad, and excited. Three machine learning methods, GMM, MLP, and SVM, are compared at the high level, and the best result of 90.16% is obtained with the MLP method.
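As a rough illustration of the pipeline described in the abstract, the minimal Python sketch below (a hypothetical example built on librosa and scikit-learn, not the authors' implementation) mirrors the three levels: the low level computes MFCC statistics (timbre) and tempo (rhythm) for fixed-length analysis units, the middle level produces per-unit class probabilities that play the role of the indication functions, and the high level classifies a piece from its aggregated indication-function values. The unit length, the logistic-regression middle level, the feature set, and the assumption that each unit inherits its piece's label are all illustrative choices, not details from the paper.

"""Minimal three-level sketch (hypothetical; not the paper's code).

Assumes numpy, librosa, and scikit-learn; unit length, feature set, and
model sizes below are illustrative choices only.
"""
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

UNIT_SEC = 5.0  # assumed length of one analysis unit, in seconds


def low_level_features(unit, sr):
    """Low level: timbre (MFCC statistics) and rhythm (tempo) of one unit."""
    mfcc = librosa.feature.mfcc(y=unit, sr=sr, n_mfcc=13)
    tempo = np.atleast_1d(librosa.beat.beat_track(y=unit, sr=sr)[0])[0]
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1), [tempo]])


def piece_units(path):
    """Split one music piece into fixed-length units and featurize each."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    step = int(UNIT_SEC * sr)
    return np.vstack([low_level_features(y[i:i + step], sr)
                      for i in range(0, len(y) - step + 1, step)])


def train_middle_level(unit_feats, unit_labels):
    """Middle level: per-unit model whose class probabilities serve as the
    indication functions (each unit inherits its piece's label here)."""
    mid = LogisticRegression(max_iter=1000)
    mid.fit(unit_feats, unit_labels)
    return mid


def piece_indication_vector(mid, path):
    """Average the per-unit indication functions over a whole piece."""
    return mid.predict_proba(piece_units(path)).mean(axis=0)


def train_high_level(mid, piece_paths, piece_labels):
    """High level: classify a piece from its indication-function values
    (an MLP here, standing in for the GMM/MLP/SVM comparison in the paper)."""
    X = np.vstack([piece_indication_vector(mid, p) for p in piece_paths])
    high = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
    high.fit(X, piece_labels)
    return high

A new piece would then be classified with high.predict([piece_indication_vector(mid, path)]); replacing the MLP at the high level with an SVM (probability-enabled SVC) or a per-class GMM scorer corresponds to the comparison summarized in the abstract, in which the MLP variant performed best.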
