• Title/Summary/Keyword: The Visualization of Music


Sound Visualization based on Emotional Analysis of Musical Parameters (음악 구성요소의 감정 구조 분석에 기반 한 시각화 연구)

  • Kim, Hey-Ran; Song, Eun-Sung
    • The Journal of the Korea Contents Association, v.21 no.6, pp.104-112, 2021
  • In this study, emotional analysis was conducted based on the basic attribute data of music and an emotional model from psychology, and the results were applied to visualization rules drawn from the formative arts. Most existing studies that use musical parameters have a practical purpose: classifying, searching, and recommending music. This study instead focuses on enabling sound data to serve as material for creating artworks and for aesthetic expression. To study music visualization as an art form, a method that can incorporate human emotion, the defining characteristic of art itself, must be designed. Therefore, a well-structured basic classification of musical attributes and a classification system for emotions are provided, and the musical elements are visualized through the shape, color, and animation of visual elements, reflecting input parameters subdivided on the basis of emotion. This study can serve as basic data for artists exploring music visualization, and the analysis method and resulting works for matching emotion-based musical components to visualizations can form a basis for future automated visualization by artificial intelligence.
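
A minimal Python sketch of the kind of pipeline this abstract describes: basic musical attributes are mapped through a simple valence-arousal emotion estimate and then to color, shape, and animation parameters. The attribute names, thresholds, and mapping rules here are illustrative assumptions, not the authors' actual rules.

```python
# Hypothetical mapping: musical attributes -> (valence, arousal) -> visual parameters.
# All numeric rules below are assumed for illustration only.
import colorsys

def emotion_from_music(tempo_bpm: float, mode: str, loudness: float) -> tuple[float, float]:
    """Map basic musical attributes to a (valence, arousal) pair in [0, 1]."""
    arousal = min(max((tempo_bpm - 40) / 160, 0.0), 1.0)   # faster tempo -> higher arousal
    valence = 0.75 if mode == "major" else 0.35             # major mode -> more positive
    arousal = min(1.0, arousal * 0.7 + loudness * 0.3)      # loudness nudges arousal upward
    return valence, arousal

def visual_params(valence: float, arousal: float) -> dict:
    """Translate the emotion estimate into color, shape, and animation parameters."""
    hue = 0.33 * valence                                    # negative -> red-ish, positive -> green-ish
    r, g, b = colorsys.hsv_to_rgb(hue, 0.4 + 0.6 * arousal, 0.9)
    return {
        "color_rgb": (round(r, 2), round(g, 2), round(b, 2)),
        "shape_roundness": 1.0 - arousal,                   # calm -> round, excited -> angular
        "animation_speed": 0.2 + 1.8 * arousal,
    }

print(visual_params(*emotion_from_music(tempo_bpm=128, mode="major", loudness=0.8)))
```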

Creating the Idea of Textile Print Pattern Design Using the Visual Expression of Popular Music (대중음악의 시각화를 통한 텍스타일 프린트 패턴디자인 발상)

  • Kim, Ji Yeon; Oh, Kyung Wha; Jung, Hye Jung
    • Fashion & Textile Research Journal, v.17 no.4, pp.524-540, 2015
  • This study develops textile pattern design ideas created through the visualization of music. Auditory and synesthetic methods were employed to analyze the attributes of popular music genres, to assign language, shape, and color images, and to examine their interrelationships; the study thereby provides data for expressing emotional images in textile print pattern designs. Different genres of popular music were used as stimuli. The language image was extracted and introduced into the overall color scheme, and the color image was verified by applying the I.R.I color image scale and then examining the color image of each target genre through a color tone system. Preferences for shape image were realized as visual images based on the basic compositional principles of points, lines, and planes, followed by an analysis of the emotional image of each genre. The examination of the emotional images of different popular music genres showed that language image, color image, and shape image share a common emotional image, and that similarity and interrelationships exist among the language, color, and shape images experienced when listening to popular music.

Implementation of the System Converting Image into Music Signals based on Intentional Synesthesia (의도적인 공감각 기반 영상-음악 변환 시스템 구현)

  • Bae, Myung-Jin; Kim, Sung-Ill
    • Journal of IKEEE, v.24 no.1, pp.254-259, 2020
  • This paper implements a system that converts images into music based on intentional synesthesia. The color, texture, and shape of an input image are converted into the melody, harmony, and rhythm of a piece of music, respectively. Melody notes are selected probabilistically according to the color histogram. Texture is mapped to harmony and to the choice of a minor key using seven characteristics of the GLCM, a statistical texture feature extraction method. Finally, the shape of the image is extracted from its edge image, and the Hough transform is used to detect line components; the rhythm is then selected according to the distribution of line angles.
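
A rough sketch, not the authors' implementation, of the three mappings the abstract describes, using scikit-image and NumPy: color histogram to probabilistic melody notes, GLCM texture features to major/minor harmony, and Hough line angles to a rhythm choice. The scale, thresholds, and rhythm labels are assumptions.

```python
# Illustrative image-to-music mappings; musical decisions are assumed, library calls are real.
import numpy as np
from skimage import color, feature, transform
from skimage.feature import graycomatrix, graycoprops  # 'greycomatrix' in older scikit-image

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major MIDI notes (illustrative)

def melody_from_colors(rgb_img, n_notes=16, rng=np.random.default_rng(0)):
    hue = color.rgb2hsv(rgb_img)[..., 0].ravel()
    hist, _ = np.histogram(hue, bins=len(SCALE), range=(0, 1))
    p = hist / hist.sum()                               # hue distribution -> note probabilities
    return rng.choice(SCALE, size=n_notes, p=p)

def harmony_from_texture(gray_u8):
    glcm = graycomatrix(gray_u8, distances=[1], angles=[0], levels=256, normed=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    homogeneity = graycoprops(glcm, "homogeneity")[0, 0]
    return "minor" if contrast > 100 and homogeneity < 0.5 else "major"  # assumed rule

def rhythm_from_lines(gray_float):
    edges = feature.canny(gray_float, sigma=2.0)
    h, theta, d = transform.hough_line(edges)
    _, angles, _ = transform.hough_line_peaks(h, theta, d)
    spread = np.std(np.degrees(angles)) if len(angles) else 0.0
    return "syncopated" if spread > 30 else "straight"  # assumed rule
```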

A study of sound graphic equalizer configuration using photo image (이미지를 이용한 사운드 그래픽 이퀄라이저의 구성에 대한 연구)

  • Seo, June-Seok; Hong, Sung-Dae; Park, Jin-Wan
    • Proceedings of the HCI Society of Korea Conference (한국HCI학회 학술대회논문집), 2008.02b, pp.430-435, 2008
  • Thanks to the development of IT technology, a variety of portable music players have been developed, along with GUIs (Graphical User Interfaces) that deliver more information to the user. As the GUI has grown in importance, music players are increasingly expected not merely to output sound but to analyze the information the sound contains and display its characteristics visually; visualizing sound information has become essential. Sound graphic equalizers were developed as part of this process. The goal of this study is to produce a new sound graphic equalizer that expresses the visual image of sound in forms other than bar graphs and that allows user feedback. The study devises a new form of sound visualization that expresses sound information by analyzing its characteristics, and provides a graphic equalizer in which the user can select the images used to display the sounds being played. It proposes a new alternative GUI that lets the user change the form of the output images in real time while interacting with the player.
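
A small sketch of the signal side of such an equalizer: split an audio frame into frequency bands with an FFT and return one level per band, which a GUI could then render with user-chosen images instead of bars. The band edges and frame handling are assumptions for illustration.

```python
# Per-band level estimation for a graphic equalizer display (illustrative).
import numpy as np

BAND_EDGES_HZ = [20, 60, 150, 400, 1000, 2500, 6000, 16000]  # 7 bands (assumed)

def band_levels(frame: np.ndarray, sample_rate: int = 44100) -> list[float]:
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    levels = []
    for lo, hi in zip(BAND_EDGES_HZ[:-1], BAND_EDGES_HZ[1:]):
        band = spectrum[(freqs >= lo) & (freqs < hi)]
        magnitude = band.mean() if band.size else 0.0
        levels.append(float(20 * np.log10(magnitude + 1e-9)))  # rough dB level per band
    return levels

# e.g. levels = band_levels(next_audio_frame); each level scales one user-selected image
```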


Development of the Artwork using Music Visualization based on Sentiment Analysis of Lyrics (가사 텍스트의 감성분석에 기반 한 음악 시각화 콘텐츠 개발)

  • Kim, Hye-Ran
    • The Journal of the Korea Contents Association, v.20 no.10, pp.89-99, 2020
  • In this study, we produce moving-image works through sentiment analysis of music. First, the Google Natural Language API was used for sentiment analysis of lyrics, and the results were applied to image visualization rules. In prior engineering research, text-based sentiment analysis has been used to understand users' emotions and attitudes by analyzing comments and reviews on social media. In this study, the same kind of data is used as material for creating artworks, so that it can serve aesthetic expression. From the machine's point of view, emotions are reduced to numbers, which limits normalization and standardization; we try to overcome these limitations by linking the results of sentiment analysis of lyric data with the rules governing formative elements in the visual arts. The study aims to transform traditional art forms such as literature, music, painting, and dance into a new form of art seen from the machine's viewpoint, reflecting an era in which artificial intelligence attempts to create artworks, the advanced mental products of human beings. It is also expected to expand into an educational platform that supports creative activity, psychological analysis, and communication for people with developmental disabilities who have difficulty expressing emotions.
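
A hedged sketch of the lyric-analysis step using the Google Cloud Natural Language API the abstract mentions; the mapping from sentiment score to visual rules below is an illustrative assumption, not the author's rule set.

```python
# Lyric sentiment via Google Cloud Natural Language, then an assumed visual mapping.
from google.cloud import language_v1

def lyric_sentiment(lyrics: str) -> tuple[float, float]:
    client = language_v1.LanguageServiceClient()
    doc = language_v1.Document(content=lyrics, type_=language_v1.Document.Type.PLAIN_TEXT)
    s = client.analyze_sentiment(request={"document": doc}).document_sentiment
    return s.score, s.magnitude      # score in [-1, 1], magnitude >= 0

def visual_rule(score: float, magnitude: float) -> dict:
    # Assumed mapping: negative lyrics -> cooler hue, stronger emotion -> faster motion.
    return {
        "hue_deg": 220 if score < 0 else 40,
        "saturation": min(1.0, 0.3 + 0.2 * magnitude),
        "motion_speed": 0.5 + min(magnitude, 5.0) / 5.0,
    }
```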

Research of real-time image which is responding to the strings sound in art performance (무대 공연에서 현악기 소리에 반응하는 실시간 영상에 관한 연구)

  • Jang, Eun-Sun; Hong, Sung-Dae; Park, Jin-Wan
    • Proceedings of the Korea Contents Association Conference, 2009.05a, pp.185-190, 2009
  • Recent performing arts tend toward new forms of cultural content that mix genres rather than follow only traditional formats. In stage performance in particular, distinctive productions combine advanced technology and imagery, and in sound performance a new kind of experiment re-analyzes sound and merges the result with images. In live performance, however, visualizing sound in real time is technically difficult: because visuals cannot be generated instantly from the sound produced by performers and audience, smooth interaction between them is hard to achieve. To address this limitation, this paper proposes real-time sound visualization, using a string instrument as the sound source. An image control system was built with MIDI-based Max/MSP/Jitter, and the images were tested and controlled with a Korg Nano Kontrol. Through these experiments we verified the performer's varied emotions, feelings, and rhythm in the performance environment, as well as real-time interactive images that change moment to moment with the performer's actions.
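
The paper builds its pipeline in Max/MSP/Jitter with MIDI control; as a rough language-agnostic illustration only, this Python sketch estimates loudness and a dominant frequency from a live string signal and exposes them as parameters a visual layer could poll each frame. The scaling factors and block size are assumptions.

```python
# Real-time audio analysis driving visual parameters (illustrative analogue, not the paper's system).
import numpy as np
import sounddevice as sd

visual_state = {"brightness": 0.0, "dominant_hz": 0.0}

def audio_callback(indata, frames, time_info, status):
    mono = indata[:, 0]
    rms = float(np.sqrt(np.mean(mono ** 2)))
    spectrum = np.abs(np.fft.rfft(mono * np.hanning(len(mono))))
    freqs = np.fft.rfftfreq(len(mono), d=1.0 / 44100)
    visual_state["brightness"] = min(1.0, rms * 20)                   # louder bowing -> brighter image
    visual_state["dominant_hz"] = float(freqs[np.argmax(spectrum)])   # crude pitch estimate

with sd.InputStream(samplerate=44100, channels=1, blocksize=2048, callback=audio_callback):
    sd.sleep(10_000)  # a rendering loop would poll visual_state ~60 times per second
```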


Visualized recommender system based on Freebase (Freebase 기반의 추천 시스템 시각화)

  • Hong, Myung-Duk; Ha, Inay; Jo, Geun-Sik
    • Journal of the Korea Society of Computer and Information, v.18 no.10, pp.23-37, 2013
  • In this paper, the proposed movie recommender system constructs a trust network, similar to a social network, from trust information that users state explicitly. Recommendations are computed from the degree of relation between users, and information about recommended items is presented through visualization; hidden relationships are discovered via the constructed trust network. To provide visualized recommendation information, we employ Freebase, a large knowledge base that supplies structured information about movies, music, people, and more. Three visualization methods are provided: i) visualization based on movie posters, limited to the number of movies the user requests; ii) visualization of additional information such as director, actors, and genre when the user selects a movie from the recommendation list; and iii) visualization based on posters of movies recommended by neighbors the user selects from the trust network. The proposed system considers the user's social relations and provides visualizations that reflect the user's requirements, helping the user reach sound decisions about items. Furthermore, it reflects the user's opinions through these visualization methods and can provide rich information drawn from the LOD (Linked Open Data) cloud, including Freebase, LinkedMDB, and Wikipedia.
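
A minimal sketch of the trust-weighted prediction idea behind such a recommender: a user's predicted rating for a movie is the trust-weighted average of the ratings given by neighbors in the trust network. The data layout and fallback value are assumptions for illustration.

```python
# Trust-weighted rating prediction over an explicit trust network (illustrative).
def predict_rating(user, movie, trust, ratings, default=3.0):
    """trust[user] -> {neighbor: weight}; ratings[neighbor] -> {movie: rating}."""
    num = den = 0.0
    for neighbor, weight in trust.get(user, {}).items():
        if movie in ratings.get(neighbor, {}):
            num += weight * ratings[neighbor][movie]
            den += weight
    return num / den if den else default

trust = {"alice": {"bob": 0.9, "carol": 0.4}}
ratings = {"bob": {"Inception": 5.0}, "carol": {"Inception": 3.0}}
print(predict_rating("alice", "Inception", trust, ratings))  # ~4.38, dominated by the more trusted neighbor
```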

The Possibility of Being an Alternative as Uncontact Concert Format for BTS's Recent Online Concert Called "Bang Bang Con The Live" (BTS '방방콘 The Live'의 비접촉 콘서트로서의 대안 포맷 가능성)

  • Yu, An-Na; Lee, Jong-Oh
    • Journal of Korea Entertainment Industry Association, v.14 no.5, pp.27-35, 2020
  • In 2020, as the pop music industry shrank due to the unprecedented COVID-19 crisis, musicians one after another began holding contact-free online concerts while seeking new channels and alternatives. Against this background, this study surveyed BTS's "Bang Bang Con The Live", held in June 2020, to assess whether it qualified as an alternative format for quarantine-era contact-free concerts. The survey combined content analysis of the online concert video by music experts with evaluations and suggestions collected from audience members. The results showed that the performance video delivered the feel of an offline concert (venue visualization, face-to-face interaction, and responsive behavior) and offered satisfaction and convenience as an emergency alternative format, but that empathy and a sense of realism shared with fans were seriously lacking. The study therefore finds that "Bang Bang Con The Live" can function as an alternative format for contact-free concerts, but suggests that measures such as strengthening digital communication systems and technical infrastructure are needed for the format to become established in the global music community as well as in Korea.

Personalized Book Curation System based on Integrated Mining of Book Details and Body Texts (도서 정보 및 본문 텍스트 통합 마이닝 기반 사용자 맞춤형 도서 큐레이션 시스템)

  • Ahn, Hee-Jeong; Kim, Kee-Won; Kim, Seung-Hoon
    • Journal of Information Technology Applications and Management, v.24 no.1, pp.33-43, 2017
  • Content curation services based on big data analysis are receiving great attention in various content fields such as film, games, music, and books; such services recommend personalized content based on a user's preferences. Existing book curation systems recommend books using bibliographic citations, user profiles, or user logs, but they have difficulty recommending books related to character names or to spatio-temporal information found in the text itself. This paper therefore proposes a personalized book curation system based on integrated mining of a book. The proposed system consists of a mining system, a recommendation system, and a visualization system. The mining system analyzes book text, user information or profiles, and SNS data; the recommendation system recommends personalized books based on the data analyzed by the mining system, and can recommend related books from book keywords alone even when no user information exists, as for a new customer. The visualization system visualizes bibliographic information, mined data such as keywords, characters, and character relations, and the recommendation results. The paper also covers the design and implementation of the proposed mining and recommendation modules. The system is expected to broaden users' book choices and encourage balanced consumption of book content.
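
An illustrative sketch, not the paper's system, of the keyword-based fallback the abstract mentions: with no user history, recommend the books whose body-text keywords are closest to a query book, using TF-IDF and cosine similarity from scikit-learn. The sample titles and keyword strings are made up.

```python
# Keyword-based book-to-book recommendation via TF-IDF cosine similarity (illustrative).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

books = {
    "Book A": "seoul detective winter river mystery",
    "Book B": "space colony engineer rescue",
    "Book C": "detective harbor night chase mystery",
}
titles = list(books)
tfidf = TfidfVectorizer().fit_transform(books.values())

def similar_books(title, top_n=2):
    idx = titles.index(title)
    scores = cosine_similarity(tfidf[idx], tfidf).ravel()
    ranked = scores.argsort()[::-1]
    return [(titles[i], round(float(scores[i]), 2)) for i in ranked if i != idx][:top_n]

print(similar_books("Book A"))  # Book C should rank first on shared detective/mystery keywords
```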