Characteristics and Meanings of the Hwanghae-do Gutchum (황해도굿춤의 특성과 의미)
The Research of the Performance Art and Culture, no. 42, pp. 233-256, 2021
The purpose of this article is to understand the characteristics and meanings of the Hwanghae-do Gutchum, or shamanic ritual dance. First, the characteristics of the Hwanghae-do Gutchum can be summarized as follows. The regular dances that appear in every gutgeori, or tune of the gut, open with geosangchum, followed by domu and heojeonmu in that order. The accompanying rhythms are the geosang, chum, and yeonpung rhythms. The dance performed with mugu, the shaman implements held in the shaman's hand that symbolize the attributes of the deities, is the same as the domu aligned with the chum rhythm and the whirling dance aligned with the yeonpung rhythm. The names of mugu, mubok (shaman clothing), or deities may be used as names of gutchum, but such dances have no originality of their own. By contrast, the Beokgu chum and Samhyeon chum, which use the Beokgu jangdan and Samhyeon jangdan, deserve to have their originality acknowledged. The Hwanghae-do Gutchum is closely tied to rhythm. Harmony between the janggu player and the female shaman is essential in performing the Hwanghae-do gut. If the janggu player fails to properly support the shaman's performance, she cannot proceed smoothly and the ritual falls into confusion; conversely, if the shaman's performance fails to keep up with the janggu, the gut becomes plain and simple at best. The janggu is thus the single most important element determining the success or failure of the Hwanghae-do Gutchum. A female shaman takes this harmony and collaboration so seriously that she will reschedule a gut if its schedule does not match the janggu player's. The Hwanghae-do Gutchum is also largely dependent on gyeolrye.
However, the differences in chum and rhythm across gyeolrye have disappeared under the intangible cultural property system: designating the dance as an intangible cultural asset has effectively eliminated the distinctive characteristics of the Hwanghae-do Gutchum. As the distinctions between gyeolrye have become vague, shamans have lost interest in the genealogy of the gut they learned. It is no longer gyeolrye but the intangible cultural property system that serves as the principal means of distinguishing chums.
Lean startup is a concept that combines the words "lean," denoting an efficient way of running a business, and "startup," denoting a new business. It is often cited as a strategy for minimizing failure in early-stage businesses, especially software-based startups. By scrutinizing the case of a startup L, this study suggests that the lean startup methodology (LSM) can also be useful for hardware and manufacturing companies, and identifies ways for early-stage startups to implement LSM successfully. To this end, the study explains the core of LSM, including the hypothesis-driven approach, the build-measure-learn (BML) feedback loop, the minimum viable product (MVP), and the pivot. Five criteria for evaluating the successful implementation of LSM were derived from these core concepts and applied to the case of startup L. The early startup L pivoted its main business model from a defecation alert system for patients with limited mobility, to one for infants and toddlers, and finally to a smart bottle for infants. Analyzed from LSM's perspective, in developing the former two products company L neither established a specific customer value proposition for its startup idea nor verified it through MVP experiments, and thus failed to create a BML feedback loop. However, through two rounds of pivots, startup L discovered new target customers and customer needs, and was able to establish a successful business model by repeatedly experimenting with MVPs with minimal effort and time. In other words, company L's case shows that it is essential to go through the customer-market validation stage at the beginning of the business, and that this should be done through an MVP approach that does not waste the startup's time and resources. It also shows that it is necessary to abandon and pivot away from a product or service that customers do not want, even if it is technically superior and functionally complete.
Lastly, the study shows that the lean startup methodology is not limited to the software industry but can also be applied to the technology-based hardware industry. The findings can serve as guidelines and methodologies for early-stage companies seeking to minimize failures and to accelerate the process of establishing a business model, scaling up, and going global.
To survive in the global competitive environment, enterprises must be able to solve various problems and find optimal solutions effectively. Big data is perceived as a tool for solving enterprise problems and improving competitiveness through its varied problem-solving and advanced predictive capabilities. Owing to this remarkable potential, big data systems have been implemented by many enterprises around the world. Big data is now called the 'crude oil' of the 21st century and is expected to confer competitive superiority. Big data is in the limelight because, whereas conventional IT technology has lagged in what it makes possible, big data goes beyond mere technical feasibility and can be used to create new value, such as business optimization and new business creation, through analysis. However, because big data has often been introduced hastily, without considering the strategic value and outcomes to be obtained from it, many firms have had difficulty deriving strategic value and utilizing their data. According to a survey of 1,800 IT professionals from 18 countries worldwide, only 28% of corporations were utilizing big data well, and many respondents reported difficulty deriving strategic value and operating through big data. Before introducing big data, the strategic value to be obtained should be clearly identified and environmental conditions, such as internal and external regulations and systems, should be considered; however, these factors have not been well reflected. The cause of failure turned out to be that big data was introduced in response to IT trends and the surrounding environment, hastily, before the conditions for introduction were in place.
For a successful introduction, the strategic value obtainable through big data should be clearly understood and a systematic analysis of the environment and applicability is essential; but because corporations consider only partial outcomes and technological aspects, successful introductions have not been common. Previous work shows that most big data research has focused on concepts, cases, and practical suggestions without empirical study. The purpose of this study is to provide a theoretically and practically useful implementation framework and strategies for big data systems by conducting a comprehensive literature review, identifying factors that influence successful big data systems implementation, and analyzing empirical models. To do this, the factors that can affect the intention to introduce big data were derived by reviewing information systems success factors, strategic value perception factors, environmental considerations for information systems introduction, and the big data literature, and a structured questionnaire was developed. The questionnaire was then administered to, and statistical analysis performed on, the people in charge of big data inside corporations. The statistical analysis showed that strategic value perception factors and industry-internal environmental factors positively affected the intention to introduce big data. The theoretical, practical, and policy implications derived from the results are as follows.
The first theoretical implication is that this study proposed factors affecting the intention to introduce big data by reviewing strategic value perception, environmental factors, and prior big data studies, and proposed variables and measurement items that were empirically analyzed and verified. The study is meaningful in that it measured the influence of each variable on introduction intention by verifying the relationships between the independent and dependent variables through a structural equation model. Second, the study defined the independent variables (strategic value perception, environment), the dependent variable (introduction intention), and the moderating variables (type of business and corporate size) for big data introduction intention, and laid a theoretical foundation for subsequent empirical research by developing measurement items with established reliability and validity. Third, by verifying the significance of the strategic value perception and environmental factors proposed in prior studies, this study can aid future empirical research on the factors affecting big data introduction. The operational implications are as follows. First, this study established an empirical research base for the big data field by investigating the causal relationships between strategic value perception factors, environmental factors, and introduction intention, and by proposing measurement items with established reliability and validity. Second, the study showed that strategic value perception positively affects big data introduction intention, underscoring the importance of strategic value perception.
Third, the study proposed that a corporation introducing big data should do so on the basis of a precise analysis of its industry's internal environment. Fourth, by showing that the factors affecting big data introduction differ with the size and type of business, the study proposed that a corporation's size and business type should be considered when introducing big data. The policy implications are as follows. First, more varied utilization of big data is needed. The strategic value of big data can be approached in many ways, in products and services, productivity, decision making, and so on, and can be utilized across all business areas; yet the areas that major domestic corporations consider are limited to parts of the product and service fields. Accordingly, when introducing big data, it will be necessary to review utilization in detail and to design the big data system in a form that maximizes utilization. Second, the study points to the burden of system introduction costs, difficulty in utilizing the systems, and the lack of credibility of supplier corporations in the introduction phase. Since global IT corporations dominate the big data market, domestic corporations' big data introductions are inevitably dependent on foreign firms. Considering that Korea, despite being a world IT power, has no global IT corporations of this kind, big data can be seen as a chance to foster world-class companies; accordingly, the government needs to nurture leading firms through active policy support. Third, corporations lack internal and external professional manpower for big data introduction and operation.
Big data is a field in which how valuable insights can be derived from the data matters more than the system construction itself. This requires talent equipped with academic knowledge and experience across fields such as IT, statistics, strategy, and management, and such talent should be trained through systematic education. By identifying and verifying the main variables that affect big data introduction intention, this study has laid a theoretical base for empirical studies in big data-related fields and is expected to offer useful guidelines for corporations and policy makers considering big data implementation.
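The study's structural-model finding, that strategic value perception and environmental factors positively influence introduction intention, can be illustrated with a minimal regression sketch. The data below are synthetic assumptions for demonstration only, not the study's survey data, and ordinary least squares stands in for the full structural equation model:

```python
import numpy as np

# Synthetic illustration (NOT the study's survey data): generate scores for
# strategic value perception (sv) and environmental factors (env), then an
# introduction-intention score that depends positively on both.
rng = np.random.default_rng(42)
n = 300
sv = rng.normal(size=n)    # strategic value perception
env = rng.normal(size=n)   # industry-internal environmental factors
intention = 0.6 * sv + 0.4 * env + rng.normal(scale=0.5, size=n)

# Ordinary least squares fit of intention on an intercept, sv, and env.
X = np.column_stack([np.ones(n), sv, env])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)

# Both estimated path coefficients come out positive, mirroring the
# direction (though not the magnitudes) of the study's reported result.
print(f"b_sv = {beta[1]:.2f}, b_env = {beta[2]:.2f}")
```

A full replication would instead fit latent constructs with multiple indicators, as the structural equation model in the study does.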
The wall shear stress in the vicinity of end-to-end anastomoses under steady flow conditions was measured using a flush-mounted hot-film anemometer (FMHFA) probe. The experimental measurements were in good agreement with numerical results except in flows with low Reynolds numbers. The wall shear stress increased proximal to the anastomosis in flow from the Penrose tubing (simulating an artery) to the PTFE graft. In flow from the PTFE graft to the Penrose tubing, low wall shear stress was observed distal to the anastomosis. Abnormal distributions of wall shear stress in the vicinity of the anastomosis, resulting from the compliance mismatch between the graft and the host artery, might be an important factor in anastomotic neointimal fibrous hyperplasia (ANFH) formation and graft failure. The present study suggests a correlation between regions of low wall shear stress and the development of ANFH in end-to-end anastomoses.
Air pressure decay (APD) rate and ultrafiltration rate (UFR) tests were performed on new and saline-rinsed dialyzers as well as those reused in patients several times. C-DAK 4000 (Cordis Dow) and CF 15-11 (Baxter Travenol) reused dialyzers obtained from the dialysis clinic were used in the present study. The new dialyzers exhibited a relatively flat APD, whereas saline-rinsed and reused dialyzers showed a considerable amount of decay. C-DAK dialyzers had a larger APD (11.70
Background: Percutaneous cardiopulmonary support (PCPS) has the potential to rescue patients in cardiogenic shock who might otherwise die. PCPS has been a therapeutic option in a variety of clinical settings, such as for patients with myocardial infarction, high-risk coronary intervention, and postcardiotomy cardiogenic shock, and the PCPS device is easy to install. We report our early experience with PCPS as a life-saving procedure in patients in cardiogenic shock due to acute myocardial infarction. Material and Method: From January 2005 to December 2006, eight patients in cardiogenic shock with acute myocardial infarction underwent PCPS using the CAPIOX emergency bypass system(
An extracorporeal life support (ECLS) system is a device for treating respiratory and/or heart failure, and there have been many trials of its development and clinical application worldwide. Currently, a non-pulsatile blood pump is the standard for ECLS systems. Although a pulsatile blood pump is advantageous in physiologic terms, the high pressure generated in the circuits and the resultant blood cell trauma remain major concerns that make one reluctant to use a pulsatile blood pump in artificial lung circuits containing a membrane oxygenator. This study was designed to evaluate the hypothesis that placing a pressure-relieving compliance chamber between a pulsatile pump and a membrane oxygenator might reduce these side effects while providing physiologic pulsatile blood flow. The study was performed in a canine model of oleic acid-induced acute lung injury (N=16). The animals were divided into three groups according to the type of pump used and the presence of the compliance chamber. In group 1, a non-pulsatile centrifugal pump was used as a control (n=6). In group 2 (n=4), a single-pulsatile pump was used. In group 3 (n=6), a single-pulsatile pump equipped with a compliance chamber was used. The experimental model was a partial bypass between the right atrium and the aorta at a pump flow of 1.8∼2 L/min for 2 hours. The observed parameters focused on hemodynamic changes, intra-circuit pressure, laboratory studies of blood profile, and blood cell trauma. In hemodynamics, the pulsatile groups 2 and 3 generated higher arterial pulse pressure (47
Background: The committee for tuberculosis (TB) survey planning for the year 2000 decided to construct the Korean Tuberculosis Surveillance System (KTBS), based on doctors' routine reporting. The success of the KTBS relies on the precision of the recorded TB notification forms. The purpose of this study was to determine the accuracy of the TB notification forms written at a private general hospital and submitted to the corresponding health center, and to improve the comprehensiveness of this reporting system. Materials and Methods: 291 adult TB patients who had been diagnosed from August 2000 to January 2001 were enrolled in this study. The TB notification forms were compared with the medical records and the various laboratory results: case characteristics, history of previous treatment, examinations for diagnosis, site of the TB by the international classification of disease, and treatment. Results: Among the examinations for diagnosis in 222 pulmonary TB patients, the concordance rate for the 'sputum smear exam' was 76%, but that for the 'sputum culture exam' was only 23%. Among the 198 sputum culture exams labeled 'not examined', 43 (21.7%) proved to be truly 'not examined', 70 (35.4%) proved to be 'culture positive', and 85 (43.0%) proved to be 'culture negative'. Among the examinations for diagnosis in 69 extrapulmonary TB patients, the concordance rate for the 'smear exam other than sputum' was 54%. For treatment, the overall concordance rate of the 'type of registration' on the TB notification form was 85%. Among the 246 'new' cases on the TB notification forms, 217 (88%) were true 'new' cases, while 13 proved to be 'relapse', 2 'treatment after failure', one 'treatment after default', 12 'transferred-in', and one 'chronic'.
Among the 204 patients prescribed the HREZ regimen, 172 (84.3%) were actually taking it; the others were prescribed other drug regimens. Conclusion: Correct recording of TB notification forms in the private sector is necessary to support an effective TB surveillance system in Korea.
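The concordance rates quoted above are simple agreement ratios. A small sketch, using the counts reported in the abstract for the 'new' registration type, shows the computation:

```python
# Concordance rate = cases where the notification form agrees with the
# medical record, divided by the total cases examined.
def concordance_rate(agreeing: int, total: int) -> float:
    return agreeing / total

# Per the abstract: 246 cases labeled 'new' on the notification forms,
# of which 217 were confirmed as true 'new' cases in the medical records.
rate = concordance_rate(217, 246)
print(f"{rate:.0%}")  # 88%, matching the figure reported for 'new' cases
```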
Maintenance of ICT infrastructure and prevention of failures through anomaly detection are becoming increasingly important. System monitoring data are multidimensional time series, which makes it difficult to account simultaneously for the characteristics of multidimensional data and those of time series. With multidimensional data, correlations between variables must be considered; existing approaches, such as probabilistic and linear methods or distance-based methods, degrade under the so-called curse of dimensionality. In addition, time series data are typically preprocessed with sliding windows and time series decomposition for autocorrelation analysis; these techniques increase the dimensionality of the data, so they need to be supplemented. Anomaly detection is a long-standing research field: statistical methods and regression analysis were used early on, while current work actively applies machine learning and artificial neural networks. Statistically based methods are difficult to apply when the data are non-homogeneous, and they do not detect local outliers well. Regression-based methods learn a regression model under parametric statistical assumptions and detect anomalies by comparing predicted and actual values; their performance degrades when the model is not solid or the data contain noise or outliers, and they are restricted by the requirement for training data free of noise and outliers. An autoencoder, built from artificial neural networks, is trained to reproduce its input as closely as possible at its output. It has many advantages over probabilistic and linear models, cluster analysis, and supervised learning: it can be applied to data that do not satisfy probability-distribution or linearity assumptions.
Moreover, it can be trained in an unsupervised manner, without labeled data. However, anomaly detection with a plain autoencoder is limited in identifying local outliers in multidimensional data, and the characteristics of time series data greatly increase the data's dimensionality. In this study, we propose a Conditional Multimodal Autoencoder (CMAE) that improves anomaly detection performance by considering local outliers and time series characteristics. First, we applied a Multimodal Autoencoder (MAE) to address the limitations of local outlier identification in multidimensional data. Multimodal models are commonly used to learn different types of inputs, such as voice and images; the different modalities share the autoencoder's bottleneck and thereby learn their correlations. In addition, a Conditional Autoencoder (CAE) was used to learn the characteristics of time series data effectively without increasing the data's dimensionality. Conditional inputs are usually categorical variables, but in this study time was used as the condition so that periodicity could be learned. The proposed CMAE model was verified by comparison with a Unimodal Autoencoder (UAE) and a Multimodal Autoencoder (MAE). The reconstruction performance of the autoencoders on 41 variables was measured for the proposed and comparison models. Reconstruction performance differs by variable: the Memory, Disk, and Network modalities are reconstructed well, with small loss values in all three autoencoder models. The Process modality showed no significant difference across the three models, while the CPU modality showed excellent performance with CMAE. ROC curves were prepared to evaluate anomaly detection performance, and AUC, accuracy, precision, recall, and F1-score were compared. On all indicators, performance ranked in the order of CMAE, MAE, and UAE.
In particular, the recall of CMAE was 0.9828, confirming that it detects almost all anomalies. The model's accuracy also improved, to 87.12%, and its F1-score was 0.8883, which is considered suitable for anomaly detection. In practical terms, the proposed model has advantages beyond raw performance: techniques such as time series decomposition and sliding windows entail unnecessary procedures to manage, and the dimensional increase they cause can slow inference. The proposed model is easy to apply to practical tasks with respect to inference speed and model management.
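The core idea, modality-specific encoders sharing one bottleneck, a time condition injected as sin/cos features rather than extra series dimensions, and reconstruction error as the anomaly score, can be sketched in a few lines of numpy. This is an illustrative forward pass with arbitrary untrained weights and made-up layer sizes, not the authors' CMAE implementation or its 41-variable setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Toy dimensions (assumptions): two modalities, e.g. "cpu" (5 features)
# and "memory" (4 features), plus a 2-d time condition (sin/cos of hour).
d_cpu, d_mem, d_cond, d_bottleneck = 5, 4, 2, 3

# Random untrained weights, just to show the data flow.
W_enc_cpu = rng.normal(size=(d_cpu + d_cond, d_bottleneck)) * 0.1
W_enc_mem = rng.normal(size=(d_mem + d_cond, d_bottleneck)) * 0.1
W_dec_cpu = rng.normal(size=(d_bottleneck, d_cpu)) * 0.1
W_dec_mem = rng.normal(size=(d_bottleneck, d_mem)) * 0.1

def anomaly_score(x_cpu, x_mem, hour):
    # Time enters as a condition encoding daily periodicity, rather than
    # as extra dimensions from sliding windows or decomposition.
    cond = np.array([np.sin(2 * np.pi * hour / 24),
                     np.cos(2 * np.pi * hour / 24)])
    # Modality-specific encoders feed one shared bottleneck (summed here),
    # which is where cross-modal correlation would be learned in training.
    z = relu(np.concatenate([x_cpu, cond]) @ W_enc_cpu) \
        + relu(np.concatenate([x_mem, cond]) @ W_enc_mem)
    # Modality-specific decoders reconstruct each input; the summed
    # reconstruction error serves as the anomaly score.
    err_cpu = np.mean((z @ W_dec_cpu - x_cpu) ** 2)
    err_mem = np.mean((z @ W_dec_mem - x_mem) ** 2)
    return err_cpu + err_mem

score = anomaly_score(rng.normal(size=d_cpu), rng.normal(size=d_mem), hour=14)
```

In actual use, the weights would be trained to minimize reconstruction error on normal data, and observations whose score exceeds a threshold would be flagged as anomalies.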
The object of this study is to detect land-cover change in the western DMZ and its vicinity, as a basic study toward a decision support system for the conservation or sustainable development of the DMZ and its vicinity in the near future. The DMZ is 4 km wide and 250 km long; it is one of the most highly fortified boundaries in the world and also a unique thin green line. Environmentalists want to declare the DMZ a natural reserve and a biodiversity zone, but with the strengthening of inter-Korean economic cooperation, some developers are trying to build a new town or an industrial complex inside it. This study investigates current environmental conditions, especially deforestation, in the western DMZ using remote sensing and GIS techniques. Land covers were identified through linear spectral mixture analysis (LSMA), which was used to handle the spectral mixture problem of the low-spatial-resolution Landsat TM and ETM+ imagery. To analyze quantitative and spatial change of vegetation cover in the western DMZ, a GIS overlay method was used. In the LSMA, to develop high-quality fraction images, three endmembers, green vegetation (GV), soil, and water, were derived from pure features in the imagery. Over the 15 years from 1987 to 2002, forest in the western DMZ and its vicinity was devastated and converted to urban, farmland, or barren land. The northern part of the western DMZ and its vicinity was more deforested than the southern part. (
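Linear spectral mixture analysis models each pixel spectrum as a fractional combination of endmember spectra, with the fractions constrained to sum to one. A minimal numpy sketch under made-up six-band spectra (the study's actual GV, soil, and water endmembers were derived from pure features in the imagery):

```python
import numpy as np

# Hypothetical endmember spectra, 6 bands x 3 endmembers
# (green vegetation, soil, water); real ones come from pure pixels.
E = np.array([[0.05, 0.30, 0.02],
              [0.08, 0.35, 0.03],
              [0.04, 0.40, 0.02],
              [0.50, 0.45, 0.01],
              [0.30, 0.50, 0.01],
              [0.15, 0.55, 0.01]])

def unmix(pixel, endmembers, weight=100.0):
    """Least-squares endmember fractions with a soft sum-to-one
    constraint, imposed by appending a heavily weighted row of ones."""
    A = np.vstack([endmembers, weight * np.ones(endmembers.shape[1])])
    b = np.append(pixel, weight * 1.0)
    f, *_ = np.linalg.lstsq(A, b, rcond=None)
    return f

# A pixel that is an exact 60/30/10 GV/soil/water mixture is recovered.
pixel = E @ np.array([0.6, 0.3, 0.1])
fractions = unmix(pixel, E)
print(fractions)  # ≈ [0.6, 0.3, 0.1]
```

Applying this per pixel to each Landsat scene yields the GV fraction images whose differences over the 1987-2002 period quantify the deforestation described above; a fully constrained variant would additionally clip fractions to be non-negative.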