
Music Emotion Classification Based On Three-Level Structure  

Kim, Hyoung-Gook (Intelligent Multimedia Signal Processing, Kwangwoon University)
Jeong, Jin-Guk (Samsung Advanced Institute of Technology)
Abstract
This paper presents an automatic music emotion classification system for acoustic music data. A three-level structure is developed. The low level extracts timbre and rhythm features. The middle level estimates indication functions that represent the emotion probability of a single analysis unit. The high level predicts the final emotion from the indication-function values. Experiments are carried out on 695 homogeneous music pieces labeled with four emotions: pleasant, calm, sad, and excited. Three machine learning methods (GMM, MLP, and SVM) are compared at the high level. The best accuracy, 90.16%, is obtained with the MLP.
Keywords
automatic music emotion classification; timbre and rhythm features; machine learning methods
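The three-level structure described in the abstract can be sketched as a simple pipeline. This is a minimal illustration, not the authors' implementation: the feature extractor, the linear-plus-softmax per-unit model standing in for the middle-level indication functions, and the mean-then-argmax aggregation at the high level are all assumptions made for the sake of the example.

```python
import numpy as np

EMOTIONS = ["pleasant", "calm", "sad", "excited"]

def low_level_features(units):
    # Low level: one feature vector per analysis unit.
    # Placeholder only; the paper extracts timbre and rhythm features here.
    return [np.asarray(u, dtype=float) for u in units]

def middle_level_indication(features, weights):
    # Middle level: per-unit emotion probabilities ("indication functions").
    # A linear score followed by a softmax stands in for the per-unit model.
    probs = []
    for f in features:
        scores = weights @ f
        e = np.exp(scores - scores.max())
        probs.append(e / e.sum())
    return np.vstack(probs)  # shape: (n_units, n_emotions)

def high_level_decision(indication):
    # High level: aggregate the indication values over all units and
    # pick the emotion with the highest mean probability. The paper
    # instead trains a classifier (GMM, MLP, or SVM) on these values.
    return EMOTIONS[int(indication.mean(axis=0).argmax())]

if __name__ == "__main__":
    units = [[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]]   # two toy analysis units
    weights = np.eye(4, 3) * 2.0                  # hypothetical model weights
    ind = middle_level_indication(low_level_features(units), weights)
    print(high_level_decision(ind))
```

The per-unit probabilities sum to one by construction, so the high-level stage receives a well-formed distribution over the four emotions for every analysis unit.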