• Title/Summary/Keyword: Intelligent Musical Fountain


Development of Intelligent Green Fountain Culture System for Healthy Emotional Self-Concept (건강한 정서 자아를 위한 지능형 녹색경관 제어시스템 개발)

  • Park, Seung-Min; Lee, Young-Hwan; Kim, Jun-Yeup; Ko, Kwang-Eun; Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.22 no.3 / pp.281-286 / 2012
  • With rising living standards comes a growing desire to create eco-friendly water spaces, part of the so-called Green Technology now in the limelight. Cultural content is introduced into these green spaces, using water, music, and nature as tools of emotional expression. At present, water-landscape scenarios are written by a director, an approach that suffers from high cost and limited scope for direction, so interest is growing in integrated control systems based on PCs and the Internet. This paper concerns such a fountain control system. Whereas previous work relied solely on programmable logic controllers or industrial PCs, we propose the development of an intelligent green fountain culture system for a healthy emotional self-concept, together with an automatic weather-sensing subsystem, designed as part of the intelligent green fountain culture system, to estimate the time-variant environment.
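The integrated-control idea in the abstract above (automatic weather sensing feeding the fountain controller) can be sketched as a minimal decision loop. This is purely illustrative: the thresholds, scenario names, and the idea of gating on wind speed are assumptions, not details from the paper.

```python
# Minimal sketch of weather-aware fountain control: a sensed wind
# speed selects a water-landscape scenario, so the directed show
# adapts to the time-varying environment. All thresholds and
# scenario names here are hypothetical.

def choose_scenario(wind_speed_mps):
    """Pick a scenario that is safe for the current wind speed (m/s)."""
    if wind_speed_mps >= 10.0:
        return "off"        # too windy: spray would drift onto visitors
    if wind_speed_mps >= 5.0:
        return "low-jets"   # moderate wind: keep jets short
    return "full-show"      # calm: run the full directed scenario

for wind in (2.0, 6.5, 12.0):
    print(wind, "->", choose_scenario(wind))
```

A real system would replace the loop with periodic sensor polling and drive the nozzle and lighting hardware accordingly.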

Study of Music Classification Optimized Environment and Atmosphere for Intelligent Musical Fountain System (지능형 음악분수 시스템을 위한 환경 및 분위기에 최적화된 음악분류에 관한 연구)

  • Park, Jun-Heong; Park, Seung-Min; Lee, Young-Hwan; Ko, Kwang-Eun; Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.21 no.2 / pp.218-223 / 2011
  • Various studies have explored classifying music by genre. Because sound professionals each define their categorization criteria differently, such classification rarely yields clear results, and whenever a new genre appears the criteria must be redefined. Music is therefore classified here by emotional adjectives rather than by genre. In a previous study we classified music as light or shade. In this paper we propose a music classification system based on emotional adjectives, suited to searching by atmosphere, with three classification axes: light vs. shade (from the previous study), intense vs. placid, and grandeur vs. trivial. Variance Considered Machines (VCM), an improved algorithm based on the Support Vector Machine, was used as the classifier, achieving 85% classification accuracy on a set of 525 songs.
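The classification described above, a binary decision per emotional-adjective axis over audio features, can be sketched as follows. The paper uses VCM (a variant of SVM); here a plain perceptron stands in as a minimal linear classifier, and the toy features (normalized tempo, spectral brightness) are hypothetical stand-ins for the paper's beat/timbre/note features.

```python
# Sketch: classify songs along one emotional-adjective axis
# (light = +1 vs. shade = -1) with a linear classifier. The
# features and data are synthetic, for illustration only.

def train_perceptron(samples, labels, epochs=50, lr=0.1):
    """Learn w, b so that w.x + b > 0 means 'light', else 'shade'."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):          # y is +1 or -1
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:                     # misclassified: update
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1

# Toy feature vectors: (normalized tempo, spectral brightness).
light = [(0.8, 0.9), (0.7, 0.8), (0.9, 0.7)]       # label +1
shade = [(0.2, 0.1), (0.3, 0.2), (0.1, 0.3)]       # label -1
X = light + shade
y = [1] * 3 + [-1] * 3

w, b = train_perceptron(X, y)
accuracy = sum(predict(w, b, x) == t for x, t in zip(X, y)) / len(X)
print(f"training accuracy: {accuracy:.0%}")
```

Running one such binary classifier per axis (light/shade, intense/placid, grandeur/trivial) yields the three-way adjective profile the paper describes.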

Development of Music Classification of Light and Shade using VCM and Beat Tracking (VCM과 Beat Tracking을 이용한 음악의 명암 분류 기법 개발)

  • Park, Seung-Min; Park, Jun-Heong; Lee, Young-Hwan; Ko, Kwang-Eun; Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.20 no.6 / pp.884-889 / 2010
  • Music genre classification has been studied recently, but because experts use different classification criteria, accurate results are difficult to obtain, and whenever a new genre emerges the set of genres must be redefined. For search purposes, music should therefore be classified by emotional words rather than by genre. In this paper we categorize music on the basis of the human perception of brightness and darkness (light and shade). The proposed system classifies the shade of music by applying VCM (Variance Considered Machines). Three kinds of musical attributes are used: beat, timbre, and note, selected on the basis of surveys and used to train the VCM; the VCM's classifications were then compared with and analyzed against the survey results. Notes were extracted using MATLAB: the music was sampled at regular intervals, each segment was frequency-analyzed via the FFT, and the per-band average was taken as the representative element; the extracted notes were quantified by the height of the overall distribution. Timbre was quantified using differences in the cumulative frequency distribution over the entire frequency range. Applying VCM to these three attributes and comparing the experimental results with the survey results confirmed that the two classes of light and shade were separated with a probability of 95.4%.
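The note-extraction step described above (sample the music at regular intervals, FFT each segment, take the dominant frequency as the note height) can be sketched with NumPy in place of MATLAB. The 440 Hz test tone, frame size, and Hann window are illustrative choices, not values from the paper.

```python
# Sketch of FFT-based note extraction: for one frame of audio,
# find the frequency of the strongest spectral bin. Framing the
# whole song at regular intervals would yield a note sequence.
import numpy as np

def dominant_frequency(frame, sample_rate):
    """Return the frequency (Hz) of the strongest FFT bin in one frame."""
    windowed = frame * np.hanning(len(frame))      # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    peak_bin = int(np.argmax(spectrum[1:])) + 1    # skip the DC bin
    return peak_bin * sample_rate / len(frame)

sr = 8000
t = np.arange(1024) / sr
tone = np.sin(2 * np.pi * 440.0 * t)               # A4 test tone

f = dominant_frequency(tone, sr)
print(f"estimated pitch: {f:.1f} Hz")              # within one FFT bin (~7.8 Hz) of 440
```

The estimate is quantized to the FFT bin spacing (sample_rate / frame_length); quantifying the heights of such estimates across all frames gives a distribution like the one the paper uses for the note attribute.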