• Title/Summary/Keyword: Emotion-based music visualization

3 search results

Emotion-based music visualization using LED lighting control system (LED조명 시스템을 이용한 음악 감성 시각화에 대한 연구)

  • Nguyen, Van Loi; Kim, Donglim; Lim, Younghwan
    • Journal of Korea Game Society, v.17 no.3, pp.45-52, 2017
  • This paper proposes a new strategy for emotion-based music visualization. An emotional LED lighting control system is suggested to help audiences enhance their musical experience. In the system, the emotion in music is recognized by a proposed algorithm using a dimensional approach. The algorithm uses music-emotion variation detection to overcome weaknesses of Thayer's model in detecting emotion within a one-second music segment. In addition, the IRI color model is combined with Thayer's model to determine the LED light colors corresponding to 36 different music emotions, which are rendered on the LED lighting control system as colors and animations. The accuracy of the music emotion visualization exceeded 60%.
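The abstract above combines Thayer's valence-arousal model with the IRI color model to pick LED colors for 36 music emotions. A minimal sketch of the general idea, assuming a hypothetical hue-based mapping (the paper's actual 36-color IRI palette is not reproduced here):

```python
import colorsys
import math

def emotion_to_rgb(valence, arousal):
    """Map a point in Thayer's valence-arousal plane to an RGB color.

    valence and arousal are in [-1, 1]. The angle of the point selects a
    hue and the distance from the origin selects saturation, so stronger
    emotions yield more vivid colors. This mapping is illustrative only;
    it is not the IRI-based palette used in the paper.
    """
    angle = math.atan2(arousal, valence)           # -pi .. pi
    hue = (angle + math.pi) / (2 * math.pi)        # normalized to 0 .. 1
    sat = min(1.0, math.hypot(valence, arousal))   # emotion intensity
    r, g, b = colorsys.hls_to_rgb(hue, 0.5, sat)
    return tuple(round(c * 255) for c in (r, g, b))
```

A neutral point (0, 0) maps to gray, since its saturation is zero; quantizing the plane into an angular/radial grid would yield a fixed set of discrete emotion colors comparable to the paper's 36 classes.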

Sound Visualization based on Emotional Analysis of Musical Parameters (음악 구성요소의 감정 구조 분석에 기반 한 시각화 연구)

  • Kim, Hey-Ran; Song, Eun-Sung
    • The Journal of the Korea Contents Association, v.21 no.6, pp.104-112, 2021
  • In this study, emotional analysis was conducted based on the basic attribute data of music and an emotion model from psychology, and the results were applied to visualization rules in the formative arts. Existing studies using musical parameters mostly had practical aims: classifying, searching, and recommending music. This study instead focuses on enabling sound data to serve as material for creating artworks and for aesthetic expression. To study music visualization as an art form, a method must be designed that can incorporate human emotion, a defining characteristic of the arts. Therefore, a well-structured basic classification of musical attributes and a classification system for emotions are provided. The musical elements are then visualized through the shape, color, and animation of visual elements, reflecting subdivided input parameters based on emotion. This study can serve as basic data for artists exploring music visualization, and its analysis method and results for matching emotion-based musical components to visualizations can form a basis for future automated visualization by artificial intelligence.
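The abstract above describes rules that route musical attributes through an emotion classification to visual parameters (shape, color, animation). A toy sketch of such a rule table, with hypothetical emotion labels, thresholds, and visual values that are not taken from the paper:

```python
# Illustrative rule table: emotion label -> visual parameters.
# Labels, colors, and speeds are invented for this sketch.
VISUAL_RULES = {
    "joy":     {"shape": "circle",   "color": (255, 200, 0),   "speed": 1.5},
    "sadness": {"shape": "ellipse",  "color": (60, 90, 180),   "speed": 0.4},
    "tension": {"shape": "triangle", "color": (200, 30, 30),   "speed": 2.0},
    "calm":    {"shape": "blob",     "color": (120, 200, 160), "speed": 0.6},
}

def classify_emotion(tempo_bpm, mode):
    """Toy classifier: fast tempo raises arousal, major mode raises valence."""
    if tempo_bpm >= 110:
        return "joy" if mode == "major" else "tension"
    return "calm" if mode == "major" else "sadness"

def visual_params(tempo_bpm, mode):
    """Look up the visual parameters for a music segment's attributes."""
    return VISUAL_RULES[classify_emotion(tempo_bpm, mode)]
```

In the study's terms, the classifier stands in for the emotion-structure analysis of musical parameters, and the rule table stands in for the emotion-to-form mapping an artist or an automated system would refine.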

Research of real-time image which is responding to the strings sound in art performance (무대 공연에서 현악기 소리에 반응하는 실시간 영상에 관한 연구)

  • Jang, Eun-Sun; Hong, Sung-Dae; Park, Jin-Wan
    • Proceedings of the Korea Contents Association Conference, 2009.05a, pp.185-190, 2009
  • Recent performing arts tend toward new cultural-content styles that mix various genres rather than following only traditional forms. In stage performance in particular, unique works are staged using advanced technology and imagery. In sound performance, a new kind of experiment re-analyzes the sound and combines the result with images. In public performance, however, there is a technical difficulty in visualizing sound in real time: because visualization cannot be generated from the instantaneous sound of performers and audience, smooth interaction between performer and audience is difficult. To resolve this restriction, this paper proposes real-time sound visualization, using a stringed instrument as the sound source. A MIDI-based image control system was built with Max/MSP/Jitter, and the imagery was tested and controlled with a Korg nanoKONTROL. These experiments confirm that the performer's emotions, feelings, and rhythm vary with the performance environment, and verify real-time interactive imagery that changes from moment to moment with the performer's actions.
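The abstract above drives imagery from live instrument sound. The core of any such pipeline is reducing each incoming audio frame to a control value; a minimal sketch, assuming frames arrive as lists of samples in [-1, 1] (the paper's actual Max/MSP/Jitter patch is not reproduced):

```python
import math

def rms(frame):
    """Root-mean-square level of one audio frame (samples in [-1, 1])."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def brightness(frame, floor=0.02, ceil=0.5):
    """Map frame loudness to a 0..1 visual brightness.

    Levels below `floor` (background noise) map to 0 and levels above
    `ceil` saturate at 1, so quiet stage noise does not flicker the image.
    The threshold values are illustrative.
    """
    level = rms(frame)
    return max(0.0, min(1.0, (level - floor) / (ceil - floor)))
```

In a real-time system this function would run once per audio buffer, and the resulting value would modulate an image parameter (brightness, scale, particle count) each video frame, which is the role MIDI control data plays in the paper's setup.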
