• Title/Summary/Keyword: Flow-based

Development of Rapid-cycling Brassica rapa Plant Program based on Cognitive Apprenticeship Model and its Application Effects (인지적 도제 모델 기반의 Rapid-cycling Brassica rapa 식물 프로그램의 개발 및 적용 효과)

  • Jae Kwon Kim;Sung-Ha Kim
    • Journal of Science Education
    • /
    • v.47 no.2
    • /
    • pp.192-210
    • /
    • 2023
  • This study aimed to develop a plant molecular biology experimental program using Rapid-cycling Brassica rapa (RcBr), based on the teaching steps and teaching methods of the cognitive apprenticeship model, and to determine its application effects. To improve participants' cognitive functions and expertise in molecular biology experiments, two themes spanning a total of eight class sessions were selected: 'Identification of the DFR gene in purple RcBr and non-purple RcBr' and 'Identification of RcBr's genetic polymorphism sites using the DNA profiling method'. The participants were 18 pre-service teachers majoring in biology education at H University in Chungbuk, Korea. The effectiveness of the developed program was verified by analyzing the enhancement of 'cognitive functions' related to the use of molecular biology knowledge and technology, and the enhancement of 'domain-general metacognitive abilities'. The effect of the program was also determined by analyzing the task flow diagrams provided. The developed program was effective in improving the pre-service teachers' cognitive functions in using the knowledge and technology of molecular biology experiments; it was especially effective in improving the higher-order cognitive functions of pre-service teachers without prior experience. The program also produced significant improvement in the task component of metacognitive knowledge and in the planning, checking, and evaluating components of metacognitive regulation, which are sub-elements of domain-general metacognitive ability. The program's self-test activity was found to help the pre-service teachers improve their metacognitive regulation. This program therefore proved helpful for pre-service teachers in developing the core competencies needed for molecular biology experimental classes. If its teaching and learning materials were reconstructed and applied to in-service teachers or high school students, their metacognitive abilities would also be expected to improve.

Management Planning of Wind Corridor based on Mountain for Improving Urban Climate Environment - A Case Study of the Nakdong Jeongmaek - (도시환경개선을 위한 산림 기반 바람길 관리 계획 - 낙동정맥을 사례로 -)

  • Uk-Je SUNG;Jeong-Min SON;Jeong-Hee EUM;Jin-Kyu MIN
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.26 no.1
    • /
    • pp.21-40
    • /
    • 2023
  • This study analyzed the cold air characteristics of the Nakdong Jeongmaek, whose terrain favors the formation of cold air that can flow into nearby cities, in order to suggest wind ventilation corridor plans, which have recently attracted increasing interest as a way to improve the urban thermal environment. In addition, based on watershed analysis, specific cold-air watershed areas were delineated and management plans were suggested to expand the cold air function of the Nakdong Jeongmaek. The analysis showed that cold air was strongly generated in the northern forest of the Jeongmaek and flowed into nearby cities along the valley topography. On average, the speed of cold air was high in cities located to the east of the Jeongmaek, while the height of the cold air layer was greater in cities located to the west. By synthesizing these cold air characteristics and the watershed analysis results, the cold-air watershed area was classified into eight zones, and plans were proposed to preserve and strengthen the temperature-reducing function of the Jeongmaek by designating the zones as 'Conservation area of Cold-air', 'Management area of Cold-air', and 'Intensive management area of Cold-air'. In addition, to verify the temperature reduction by cold air, the night temperature reduction effect was compared with the cold air analysis using weather observation data; the temperature reduction by cold air was confirmed because the night temperature reduction was large at observation stations with strong cold air characteristics. This study is expected to serve as basic data for establishing a systematic preservation and management plan to expand the cold air function of the Nakdong Jeongmaek.
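
The verification step described above can be illustrated with a small sketch. The snippet below (a minimal illustration with made-up station names, temperatures, and column labels, not the authors' data or code) computes the nocturnal temperature drop at each weather station and compares stations grouped by their modeled cold-air strength; if cold-air drainage is effective, stations in the 'strong' class should show a larger mean drop.

```python
# Minimal sketch of the night-temperature verification described above.
# All station names, values, and column labels are hypothetical.
import pandas as pd

# One row per station-night: evening (21:00) and pre-dawn (05:00) temperatures,
# plus the cold-air class assigned by the drainage analysis.
obs = pd.DataFrame({
    "station":        ["A", "A", "B", "B"],
    "cold_air_class": ["strong", "strong", "weak", "weak"],
    "t_evening_c":    [24.1, 23.8, 24.3, 24.0],
    "t_dawn_c":       [17.2, 16.9, 20.5, 20.1],
})

# Night temperature reduction: evening temperature minus pre-dawn temperature.
obs["night_drop_c"] = obs["t_evening_c"] - obs["t_dawn_c"]

# A larger mean drop in the "strong" class supports the cold-air effect.
print(obs.groupby("cold_air_class")["night_drop_c"].mean())
```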

Yeomjae Song Tae-hoe Origin and art world of calligraphy and painting (염재(念齋) 송태회(宋太會) 서화의 연원과 예술세계)

  • Kim Doyoung
    • The Journal of the Convergence on Culture Technology
    • /
    • v.9 no.5
    • /
    • pp.255-262
    • /
    • 2023
  • In the early 20th century, Yeomjae Song Tae-hoe (念齋 宋泰會, 1872-1941), a disciple and onetime adopted son of the teacher Song Su-myeon (宋修勉, 1847-1916), moved to Gochang and laid the foundation for Gochang calligraphy and painting, marking the beginning of its full-fledged development. Yeomjae Song Tae-hoe was a scholar and calligrapher of the late Joseon Dynasty and modern period from Hwasun, Jeollanam-do. He created the foundation of Gochang calligraphy and painting while working as an educator in Chinese literature, calligraphy, and painting, mainly in his hometown of Hwasun and in Gochang, all while continuing his own creative activities. Intelligent from a young age, he showed extraordinary talent for calligraphy. At the age of 16, he passed the Jinsa exam (童蒙進士) and became the youngest student to study at Sungkyunkwan. He was active in holding exhibitions nationwide, based in Gochang and Jeonju, and was also an educator who fostered younger students by establishing Gochang High School (currently Gochang Middle and High School) to cultivate national spirit and history. Yeomjae drew strong and healthy landscape paintings under the absolute influence of the painting style of Saho Song Su-myeon, and dealt with various subjects of southern-school literati painting, such as flowers and birds and the Four Gracious Plants. In particular, he is a representative calligrapher-painter bridging the early modern and modern eras, in that he expressed interest in new cultural artifacts as well as realizing a modern-oriented realistic landscape based on Korean natural beauty, and he laid the foundation for modern and contemporary calligraphy and painting. Goam Lee Eung-no (顧菴 李應魯, 1904-1989), a world-renowned painter, learned the basics of ink painting from Yeomjae in his late teens. However, compared to his various artistic and social activities, it is regrettable that his evaluation remains limited to that of a local artist.

Contrast Media in Abdominal Computed Tomography: Optimization of Delivery Methods

  • Joon Koo Han;Byung Ihn Choi;Ah Young Kim;Soo Jung Kim
    • Korean Journal of Radiology
    • /
    • v.2 no.1
    • /
    • pp.28-36
    • /
    • 2001
  • Objective: To provide a systematic overview of the effects of various parameters on contrast enhancement within the same population, an animal experiment as well as a computer-aided simulation study was performed. Materials and Methods: In the animal experiment, single-level dynamic CT through the liver was performed at 5-second intervals just after the injection of contrast medium for 3 minutes. Combinations of three different volumes (1, 2, 3 mL/kg), concentrations (150, 200, 300 mgI/mL), and injection rates (0.5, 1, 2 mL/sec) were used. The CT number of the aorta (A), portal vein (P) and liver (L) was measured in each image, and time-attenuation curves for A, P and L were thus obtained. The degree of maximum enhancement (Imax) and time to reach peak enhancement (Tmax) of A, P and L were determined, and times to equilibrium (Teq) were analyzed. In the computer-aided simulation model, a program based on the amount, flow, and diffusion coefficient of body fluid in various compartments of the human body was designed. The input variables were the concentrations, volumes and injection rates of the contrast media used. The program generated the time-attenuation curves of A, P and L, as well as liver-to-hepatocellular carcinoma (HCC) contrast curves. On each curve, we calculated and plotted the optimal temporal window (the time period above the lower threshold, which in this experiment was 10 Hounsfield units), the total area under the curve above the lower threshold, and the area within the optimal range. Results: A. Animal Experiment: At a given concentration and injection rate, an increased volume of contrast medium led to increases in Imax A, P and L. In addition, Tmax A, P, L and Teq were prolonged in parallel with increases in injection time; the time-attenuation curve shifted upward and to the right. For a given volume and injection rate, an increased concentration of contrast medium increased the degree of aortic, portal and hepatic enhancement, though Tmax A, P and L remained the same; the time-attenuation curve shifted upward. For a given volume and concentration of contrast medium, changes in the injection rate had a prominent effect on aortic enhancement, while portal venous and hepatic parenchymal enhancement also showed some increase, though the effect was less prominent. An increase in the rate of contrast injection shifted the time-attenuation curve to the left and upward. B. Computer Simulation: At a faster injection rate, there was minimal change in the degree of hepatic attenuation, though the duration of the optimal temporal window decreased. The area between 10 and 30 HU was greatest when contrast medium was delivered at a rate of 2-3 mL/sec. Although the total area under the curve increased in proportion to the injection rate, most of this increase was above the upper threshold, and thus the temporal window narrowed and the optimal area decreased. Conclusion: Increases in volume, concentration and injection rate all resulted in improved arterial enhancement. If cost is disregarded, increasing the injection volume is the most reliable way of obtaining good-quality enhancement. The optimal way of delivering a given amount of contrast medium can be calculated using a computer-based mathematical model.
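
The curve metrics used in the simulation lend themselves to a short numeric sketch. The code below (an illustration under stated assumptions, not the authors' compartment model; the curve shape and values are invented) generates a synthetic time-attenuation curve and computes the three quantities named above: the optimal temporal window (time above the 10 HU lower threshold), the total area above that threshold, and the area within the 10-30 HU optimal range.

```python
# Minimal sketch of the curve metrics described in the abstract.
# The gamma-variate-shaped curve and its parameters are hypothetical.
import numpy as np

t = np.arange(0.0, 180.0, 1.0)            # seconds after injection start
# Hypothetical liver-to-HCC contrast curve in Hounsfield units (HU).
contrast = 40.0 * (t / 35.0) ** 2 * np.exp(-(t - 35.0) / 25.0)

LOWER, UPPER = 10.0, 30.0                 # HU thresholds named in the text
dt = 1.0                                  # sampling interval in seconds

# Optimal temporal window: total time the curve stays above the lower threshold.
window_s = (contrast >= LOWER).sum() * dt

# Total area above the lower threshold, and area inside the 10-30 HU range.
area_above_lower = np.trapz(np.clip(contrast - LOWER, 0.0, None), t)
area_optimal = np.trapz(np.clip(np.minimum(contrast, UPPER) - LOWER, 0.0, None), t)

print(f"temporal window: {window_s:.0f} s")
print(f"area above {LOWER:.0f} HU: {area_above_lower:.0f} HU*s")
print(f"area in optimal range: {area_optimal:.0f} HU*s")
```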

Betweenness Centrality-based Evacuation Vulnerability Analysis for Subway Stations: Case Study on Gwanggyo Central Station (매개 중심성 기반 지하철 역사 재난 대피 취약성 분석: 광교중앙역 사례연구)

  • Jeong, Ji Won;Ahn, Seungjun;Yoo, Min-Taek
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.44 no.3
    • /
    • pp.407-416
    • /
    • 2024
  • Over the past 20 years, there has been a rapid increase in the number and size of subway stations and underground structures worldwide, and the importance of safety for subway users has also continuously grown. Due to their structural characteristics, subway stations have limited visibility and escape routes in disaster situations, posing a high risk of human casualties and economic losses. Therefore, an analysis of disaster vulnerabilities is essential not only for existing subway systems but also for deep underground facilities like GTX. This paper presents a case study applying a betweenness centrality-based disaster vulnerability analysis framework to Gwanggyo Central Station. The analysis of the station's base model and various disaster scenarios revealed that the betweenness centrality distribution is symmetrical, following the symmetrical spatial structure of the station, with high centrality concentrated in the central areas of basement levels one and two. These areas exhibited values more than 220% above the average, indicating a high likelihood of bottleneck phenomena during evacuation in disaster situations. To mitigate this vulnerability, scenarios were proposed to distribute the evacuation flows concentrated in the central areas, enhancing the usability of peripheral areas as evacuation routes by connecting staircases into continuous routes. When this modification was considered, centrality concentration decreased, confirming that the proposed addition of evacuation paths could effectively disperse the flow of evacuation in Gwanggyo Central Station. This case study demonstrates the effectiveness of the proposed evacuation vulnerability assessment framework in enhancing subway station user safety; the framework can be effectively applied in disaster response and management plans for major underground facilities.
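
The centrality screening at the heart of the framework can be sketched briefly. The snippet below (a toy graph with hypothetical node names, not Gwanggyo Central Station's actual layout) models station spaces as a graph, computes betweenness centrality with networkx, and flags nodes exceeding 220% of the average, the same threshold the study uses to identify likely bottlenecks.

```python
# Minimal sketch of betweenness-centrality bottleneck screening.
# The station graph below is a hypothetical simplification.
import networkx as nx

# Toy station: platforms on B2, concourses on B1, exits at street level.
G = nx.Graph()
G.add_edges_from([
    ("B2-platform-W", "B2-center"), ("B2-platform-E", "B2-center"),
    ("B2-center", "B1-center"),
    ("B1-center", "B1-concourse-W"), ("B1-center", "B1-concourse-E"),
    ("B1-concourse-W", "exit-1"), ("B1-concourse-E", "exit-2"),
])

bc = nx.betweenness_centrality(G)
mean_bc = sum(bc.values()) / len(bc)

# Flag likely bottlenecks: nodes at more than 220% of the average centrality.
for node, value in sorted(bc.items(), key=lambda kv: -kv[1]):
    if value > 2.2 * mean_bc:
        print(f"{node}: {value:.3f} (> 220% of mean {mean_bc:.3f})")
```

On this toy graph the central nodes of B1 and B2 are the ones flagged, mirroring the concentration the study reports for the central areas of basement levels one and two.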

Hypoxemia In Liver Cirrhosis And Intrapulmonary Shunt Determination Using Tc-99m-MAA Whole Body Scan (간경화 환자에서의 저산소혈증과 Tc-99m-MAA 주사를 이용한 폐내단락 측정)

  • Lee, Kye-Young;Kim, Young-Whan;Han, Sung-Koo;Shim, Young-Soo;Kim, Keun-Youl;Han, Yong-Chol
    • Tuberculosis and Respiratory Diseases
    • /
    • v.41 no.5
    • /
    • pp.504-512
    • /
    • 1994
  • Background: It is well known that severe hypoxemia is often associated with liver cirrhosis in the absence of preexisting cardiac or pulmonary disease. Pulmonary vascular impairment, more specifically intrapulmonary shunting, has been considered the major mechanism. Intrapulmonary shunting arises from pulmonary vascular dilatation at the precapillary level or from direct arteriovenous communication, and is associated with the characteristic skin finding of spider angioma. However, these results come mainly from Western countries, where alcoholic and primary biliary cirrhosis are the dominant causes of cirrhosis. It is uncertain whether the same is true in viral hepatitis-associated liver cirrhosis, the dominant cause of liver cirrhosis in Korea. We investigated the incidences of hypoxemia and orthodeoxia in Korean cirrhotic patients, predominantly with postnecrotic cirrhosis, and the significance of intrapulmonary shunting as the suggested mechanism of hypoxemia. Method: We performed arterial blood gas analysis separately in the supine and erect positions in 48 stable cirrhotic patients without evidence of severe complications such as ascites, variceal bleeding, or hepatic coma. According to the results of arterial blood gas analysis, patients were divided into hypoxemic and normoxemic groups. In each group, pulmonary function testing and Tc-99m-MAA whole body scanning were performed. The shunt fraction was calculated based on the fact that the sum of cerebral and bilateral renal blood flow is 32% of systemic blood flow. Results: Hypoxemia with $PaO_2$ less than 80 mmHg was observed in 9 patients (18.8%), and orthodeoxia of more than 10 mmHg was observed in 8 patients (16.7%), but no patient had significant hypoxemia with $PaO_2$ less than 60 mmHg. $PaO_2$ was significantly lower in patients with spider angioma than in those without, and showed no correlation with serologic type or the severity of liver function test findings. No pulmonary function test parameter differed between the normoxemic and hypoxemic groups, but the hypoxemic group showed a significantly higher shunt fraction ($11.4{\pm}4.1%$) than the normoxemic group ($4.1{\pm}2.0%$) (p<0.05). Conclusions: Hypoxemia is a not infrequently observed complication of liver cirrhosis, and intrapulmonary shunting is suggested to play a major role in its development. However, clinically significant hypoxemia was unlikely in our domestic cirrhotic patients, who predominantly had cirrhosis of the postnecrotic type.
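
The shunt calculation stated above reduces to a one-line formula: Tc-99m-MAA particles that escape the pulmonary capillary bed lodge in systemic organs, and since the brain plus both kidneys are taken to receive 32% of systemic blood flow, their combined counts can be scaled up to estimate total shunted activity. The sketch below uses invented scan counts for illustration.

```python
# Minimal sketch of the Tc-99m-MAA shunt-fraction calculation described above.
# The count values are hypothetical, chosen only to illustrate the arithmetic.

def shunt_fraction(brain_counts: float, kidney_counts: float,
                   whole_body_counts: float) -> float:
    """Estimated intrapulmonary shunt as a fraction of injected activity."""
    # Brain + both kidneys receive 32% of systemic flow (per the abstract),
    # so total systemic (shunted) activity is their counts divided by 0.32.
    systemic_counts = (brain_counts + kidney_counts) / 0.32
    return systemic_counts / whole_body_counts

# Hypothetical scan counts for a hypoxemic patient.
print(f"{shunt_fraction(9_000, 20_000, 800_000):.1%}")  # ≈ 11.3%
```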

The Experimental Study for Myocardial Preservation Effect of Ischemic Preconditioning (허혈성 전조건화 유발이 심근보호에 미치는 영향에 관한 실험적 연구)

  • 이종국;박일환;이상헌
    • Journal of Chest Surgery
    • /
    • v.37 no.2
    • /
    • pp.119-130
    • /
    • 2004
  • Decreased cardiac function after open heart surgery results from ischemia-induced myocardial damage during surgery. Ischemic preconditioning, in which damage does not accumulate after repeated brief episodes of ischemia but the myocytes instead become tolerant of ischemia and the heart protects itself from damage during subsequent prolonged ischemia, is known to diminish myocardial damage, aid the recovery of the myocardium after reperfusion, and decrease the incidence of arrhythmia. Our study was performed to induce ischemic preconditioning and to demonstrate its myocardial protective effect when cardioplegic solution is applied to isolated rat hearts. Material and Method: Male Sprague-Dawley rats were used; after cannulation, the hearts were mounted on a modified isolated working heart model. Reperfusion followed non-working and working heart methods, and the working method was executed for 20 minutes, during which heart rate, aortic pressure, aortic flow and coronary flow were measured and recorded. In the control group, the extracted heart was fixed on the isolated working heart model, recovered by reperfusion 60 minutes after infusion, and preserved in cardioplegic solution 20 minutes after working heart perfusion and aortic cross-clamping. The experimental groups were as follows: in group I, ischemic hearts subjected to hypoxia were perfused with cardioplegic solution and preserved for 60 minutes; in group II, cardioplegic solution was infused 45 seconds (II-1), 1 minute (II-2), or 3 minutes (II-3) after ischemia induction, 20 minutes after working heart perfusion and aortic cross-clamping; in group III, hearts underwent working heart perfusion for 20 minutes, aortic cross-clamping for 45 seconds (III-1), 1 minute (III-2), or 3 minutes (III-3), reperfusion for 2 minutes to allow recovery, and then repeated aortic cross-clamping before reperfusion. All groups were compared on hemodynamic performance after reperfusion of hearts preserved for 60 minutes. Result: The recovery time to spontaneous heart beat was longer in groups I, II-3, III-2 and III-3 than in the control group (p<0.01). Group III-1 recovered heart rate better than the control group (p<0.05) and better than group II-1 (p<0.05). Recovery of aortic blood pressure favored group III-1 over the control group (p<0.05) and over group II-1 (p<0.01). Group III-1 also showed the best results in cardiac output (p<0.05), and group III-2 was better than group II-2 (p<0.05). Groups I (p<0.01) and II-3 (p<0.05) showed more cardiac edema than the control group. Conclusion: When the effects of other organs are excluded, preconditioning the heart with a short period of ischemia before the onset of abnormal heart beats, followed by infusion of cardioplegic solution, produces better recovery than cardioplegic solution alone. We believe that further study is needed to find a more effective method of preconditioning.

A Study on the Development Trend of Artificial Intelligence Using Text Mining Technique: Focused on Open Source Software Projects on Github (텍스트 마이닝 기법을 활용한 인공지능 기술개발 동향 분석 연구: 깃허브 상의 오픈 소스 소프트웨어 프로젝트를 대상으로)

  • Chong, JiSeon;Kim, Dongsung;Lee, Hong Joo;Kim, Jong Woo
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.1-19
    • /
    • 2019
  • Artificial intelligence (AI) is one of the main driving forces leading the Fourth Industrial Revolution. Technologies associated with AI have already shown abilities equal to or better than humans in many fields, including image and speech recognition. In particular, many efforts have been made to identify current technology trends and analyze the direction of their development, because AI technologies can be utilized in a wide range of fields including medicine, finance, manufacturing, service, and education. Major platforms on which complex AI algorithms for learning, reasoning, and recognition can be developed have been opened to the public as open source projects. As a result, technologies and services that utilize them have increased rapidly, which has been confirmed as one of the major reasons for the fast development of AI technologies. Additionally, the spread of the technology is greatly indebted to open source software, developed by major global companies, supporting natural language recognition, speech recognition, and image recognition. Therefore, this study aimed to identify the practical trend of AI technology development by analyzing OSS projects associated with AI, which have been developed through the online collaboration of many parties. This study searched and collected a list of major projects related to AI generated from 2000 to July 2018 on Github, and confirmed the development trends of major technologies in detail by applying text mining techniques to topic information, which indicates the characteristics of the collected projects and their technical fields. The analysis showed that the number of software development projects per year was less than 100 until 2013, but increased to 229 projects in 2014 and 597 projects in 2015. The number of open source projects related to AI then increased rapidly in 2016 (2,559 OSS projects). The number of projects initiated in 2017 was 14,213, almost four times the total number of projects generated from 2009 to 2016 (3,555 projects), and 8,737 projects were initiated from January to July 2018. The development trend of AI-related technologies was evaluated by dividing the study period into three phases, with the appearance frequency of topics indicating the technology trends of AI-related OSS projects. The results showed that natural language processing technology remained at the top in all years, implying that such OSS has been developed continuously. Until 2015, the programming languages Python, C++, and Java were listed among the top ten most frequent topics. After 2016, however, programming languages other than Python disappeared from the top ten; in their place, platforms supporting the development of AI algorithms, such as TensorFlow and Keras, showed high appearance frequency. Additionally, reinforcement learning algorithms and convolutional neural networks, which have been used in various fields, appeared frequently as topics. Topic network analysis showed that the most important topics by degree centrality were similar to those by appearance frequency. The main difference was that visualization and medical imaging topics were found at the top of the list, although they had not been from 2009 to 2012, indicating that OSS was being developed in the medical field in order to utilize AI technology.
Moreover, although computer vision was in the top 10 by appearance frequency from 2013 to 2015, it was not in the top 10 by degree centrality; otherwise, the topics at the top of the degree centrality list were similar to those at the top of the appearance frequency list, with the ranks of convolutional neural networks and reinforcement learning changing only slightly. Examining the trend of technology development using topic appearance frequency and degree centrality, machine learning showed the highest frequency and the highest degree centrality in all years. It is also noteworthy that, although the deep learning topic showed low frequency and low degree centrality between 2009 and 2012, its rank rose abruptly between 2013 and 2015; in recent years both measures have been high for both technologies. TensorFlow first appeared during the 2013-2015 phase, and its appearance frequency and degree centrality soared between 2016 and 2018, ranking at the top of the lists after deep learning and Python. Computer vision and reinforcement learning showed no abrupt increase or decrease, with relatively low appearance frequency and degree centrality compared with the above-mentioned topics. Based on these results, it is possible to identify the fields in which AI technologies are actively developed, and the results of this study can serve as a baseline dataset for more empirical analysis of future technology trends and their convergence.
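
The two measures driving this analysis, topic appearance frequency and degree centrality in a topic co-occurrence network, can be sketched compactly. The snippet below (toy project-topic data, not the study's Github corpus) counts how many projects each topic appears in, links topics that co-occur on a project, and ranks topics by degree centrality with networkx.

```python
# Minimal sketch of topic frequency and degree-centrality ranking.
# The project-topic lists below are hypothetical.
from collections import Counter
from itertools import combinations
import networkx as nx

# Hypothetical Github projects, each tagged with a list of topics.
projects = [
    ["machine-learning", "python", "tensorflow"],
    ["deep-learning", "tensorflow", "keras"],
    ["machine-learning", "natural-language-processing", "python"],
    ["deep-learning", "computer-vision", "python"],
]

# Appearance frequency: how many projects each topic occurs in.
frequency = Counter(topic for topics in projects for topic in set(topics))
print(frequency.most_common(3))

# Co-occurrence network: an edge between every pair of topics on one project.
G = nx.Graph()
for topics in projects:
    G.add_edges_from(combinations(sorted(set(topics)), 2))

# Degree centrality identifies topics connected to many other topics.
centrality = nx.degree_centrality(G)
for topic, dc in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{topic}: {dc:.2f}")
```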

Deriving adoption strategies of deep learning open source framework through case studies (딥러닝 오픈소스 프레임워크의 사례연구를 통한 도입 전략 도출)

  • Choi, Eunjoo;Lee, Junyeong;Han, Ingoo
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.27-65
    • /
    • 2020
  • Many information and communication technology companies have made their in-house AI technologies public, for example Google's TensorFlow, Facebook's PyTorch, and Microsoft's CNTK. By releasing deep learning open source software to the public, a company can strengthen its relationship with the developer community and the artificial intelligence (AI) ecosystem, and users can experiment with, implement, and improve the software. Accordingly, the field of machine learning is growing rapidly, and developers are using and reproducing various learning algorithms in each field. Although open source software has been analyzed in various ways, there is a lack of studies that help industry develop or use deep learning open source software. This study therefore attempts to derive a strategy for adopting such a framework through case studies of deep learning open source frameworks. Based on the technology-organization-environment (TOE) framework and a literature review on the adoption of open source software, we employed a case study framework that includes technological factors (perceived relative advantage, perceived compatibility, perceived complexity, and perceived trialability), organizational factors (management support and knowledge & expertise), and environmental factors (availability of technology skills and services, and platform long-term viability). We conducted a case study analysis of three companies' adoption cases (two successes and one failure) and revealed that seven of the eight TOE factors, along with several factors regarding company, team, and resources, are significant for the adoption of a deep learning open source framework. By organizing the case study results, we identified five important success factors for adopting a deep learning framework: the knowledge and expertise of developers in the team, the hardware (GPU) environment, a data enterprise cooperation system, a deep learning framework platform, and a deep learning framework tool service. For an organization to successfully adopt a deep learning open source framework, at the stage of using the framework, first, the hardware (GPU) environment for the AI R&D group must support the knowledge and expertise of the developers in the team. Second, the use of deep learning frameworks by research developers should be supported by collecting and managing data inside and outside the company through a data enterprise cooperation system. Third, deep learning research expertise must be supplemented through cooperation with researchers from academic institutions such as universities and research institutes. By satisfying these three procedures in the usage stage, companies will increase the number of deep learning research developers, their ability to use the deep learning framework, and the support of GPU resources. In the proliferation stage of the deep learning framework, fourth, the company builds a deep learning framework platform that improves the research efficiency and effectiveness of the developers, for example by automatically optimizing the hardware (GPU) environment. Fifth, a deep learning framework tool service team complements the developers' expertise by sharing information from the external deep learning open source framework community with the in-house community and by activating developer retraining and seminars.
To implement the identified five success factors, a step-by-step enterprise procedure for adopting a deep learning framework was proposed: defining the project problem, confirming that the deep learning methodology is the right method, confirming that the deep learning framework is the right tool, enterprise use of the deep learning framework, and enterprise-wide proliferation of the framework. The first three steps are pre-considerations for adopting a deep learning open source framework; once they are cleared, the next two steps can proceed. In the fourth step, the knowledge and expertise of developers in the team are important, in addition to the hardware (GPU) environment and the data enterprise cooperation system. In the final step, the five important factors are realized for a successful adoption of the deep learning open source framework. This study provides strategic implications for companies adopting or using deep learning frameworks according to the needs of each industry and business.

Derivation of Digital Music's Ranking Change Through Time Series Clustering (시계열 군집분석을 통한 디지털 음원의 순위 변화 패턴 분류)

  • Yoo, In-Jin;Park, Do-Hyung
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.3
    • /
    • pp.171-191
    • /
    • 2020
  • This study focused on digital music, one of the most valuable cultural assets of modern society, which occupies a particularly important position in the flow of the Korean Wave. Digital music data were collected based on the Gaon Chart, a well-established music chart in Korea, yielding the ranking changes of songs that entered the chart over 73 weeks. Patterns with similar characteristics were then derived through time series cluster analysis, and a descriptive analysis was performed on the notable features of each pattern. The research process suggested by this study is as follows. First, in the data collection process, time series data were collected to track the ranking changes of digital music. In the data processing stage, the collected data were matched with the rankings over time, and the music titles and artist names were processed. The analysis then proceeds sequentially in two stages: exploratory analysis and explanatory analysis. The data collection period was limited to the period before the 'music bulk buying' phenomenon, a reliability issue related to music rankings in Korea: specifically, 73 weeks beginning with the week of December 31, 2017 to January 6, 2018 and ending with the week of May 19 to May 25, 2019. The analysis targets were limited to digital music released in Korea, collected from the Gaon Chart. Unlike the private music charts operated in Korea, the Gaon Chart is approved by government agencies and has basic reliability, so it can be considered to carry more public confidence than the ranking information provided by other services. The collected data comprise the period and ranking, music title, artist name, album name, Gaon index, production company, and distribution company for every song that entered the top 100 of the chart within the collection period. Through data collection, 7,300 chart entries in the top 100 were identified over the 73 weeks. Because songs frequently remain on the chart for two or more weeks, duplicates were removed during pre-processing: duplicated songs were identified and located through a duplicate check and then deleted, yielding a list of 742 unique songs for analysis from the 7,300 collected entries. A total of 16 patterns were derived through time series cluster analysis of the ranking changes. From these, two representative patterns were identified: 'Steady Seller' and 'One-Hit Wonder'. These two patterns were further subdivided into five patterns in consideration of each song's survival period and ranking. The important characteristics of each pattern are as follows. First, the artist's superstar effect and the bandwagon effect were strong in the one-hit wonder pattern; when consumers choose digital music, they are strongly influenced by these effects.
Second, the Steady Seller pattern identified songs that consumers have chosen over a very long time, and we examined which patterns received the most consumer choices. Contrary to popular belief, the steady seller (mid-term) pattern, not the one-hit wonder pattern, received the most choices from consumers. Particularly noteworthy is that the 'Climbing the Chart' phenomenon, which runs contrary to the usual pattern, was confirmed within the steady-seller pattern. This study focuses on the change in music rankings over time, a relatively neglected aspect of digital music research, and attempts a new approach to music research by subdividing the patterns of ranking change rather than predicting the success and ranking of music.
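
The clustering step can be illustrated with a compact sketch. The code below (synthetic rank trajectories and plain k-means rather than the study's exact time-series clustering procedure) groups weekly chart-rank curves and then inspects each cluster's mean trajectory, the kind of output from which labels such as 'Steady Seller' or 'One-Hit Wonder' are assigned.

```python
# Minimal sketch of clustering weekly chart-rank trajectories.
# Trajectories are synthetic; the method is simplified to standard k-means.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
weeks = 20

# Hypothetical rank trajectories (1 = best, 100 = chart floor; songs that
# leave the chart are padded at rank 100 so every series has equal length).
steady = np.clip(10 + rng.normal(0, 3, (30, weeks)).cumsum(axis=1), 1, 100)
onehit = np.clip(5 + 5 * np.arange(weeks) + rng.normal(0, 5, (30, weeks)), 1, 100)
X = np.vstack([steady, onehit])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Inspect each cluster's mean trajectory to label it descriptively:
# a flat curve suggests a steady seller; a steep rise in rank number
# (i.e., falling down the chart) suggests a one-hit wonder.
for k in range(2):
    mean_traj = X[labels == k].mean(axis=0)
    print(f"cluster {k}: start≈{mean_traj[0]:.0f}, end≈{mean_traj[-1]:.0f}")
```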