• Title/Summary/Keyword: Performance Monitoring

Development of simultaneous analytical method for investigation of ketamine and dexmedetomidine in feed (사료 내 케타민과 덱스메데토미딘의 잔류조사를 위한 동시분석법 개발)

  • Chae, Hyun-young;Park, Hyejin;Seo, Hyung-Ju;Jang, Su-nyeong;Lee, Seung Hwa;Jeong, Min-Hee;Cho, Hyunjeong;Hong, Seong-Hee;Na, Tae Woong
    • Analytical Science and Technology
    • /
    • v.35 no.3
    • /
    • pp.136-142
    • /
    • 2022
  • According to media reports, the carcasses of euthanized abandoned dogs were processed at high temperature and pressure into powder and then used as a feed material (meat and bone meal), raising the possibility that residues of ketamine and dexmedetomidine, the anesthetics used for euthanasia, remain in the feed. Therefore, a simultaneous analysis method using QuEChERS combined with high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry was developed for rapid residue analysis. The method developed in this study exhibited linearity of 0.999 or higher. Selectivity was evaluated by analyzing blank and spiked samples at the limit of quantification. The MRM chromatograms of blank samples were compared with those of samples spiked with the analytes, and there were no interferences at the respective retention times of ketamine and dexmedetomidine. The instrumental detection and quantitation limits were 0.6 µg/L and 2 µg/L, respectively, and the limit of quantitation for the method was 10 µg/kg. The recovery test on meat and bone meal, meat meal, and pet food showed recoveries of 80.48-98.63 % for ketamine with RSDs below 5.00 %, and 72.75-93.00 % for dexmedetomidine with RSDs below 4.83 %. When six feeds, including meat and bone meal, produced while the raw material was in distribution were collected and analyzed, 10.8 µg/kg of ketamine was detected in one meat and bone meal sample, while dexmedetomidine was below the limit of quantitation. It was confirmed that the detected sample had been distributed before the safety issue became known; thereafter, all meat and bone meal made from the carcasses of euthanized abandoned dogs was recalled and completely discarded. To ensure the safety of the meat and bone meal, 32 samples of meat and bone meal as well as compound feed were collected, and additional residue investigations were conducted for ketamine and dexmedetomidine. Neither compound was detected in any of these samples. However, this investigation confirmed that some animal drugs, such as anesthetics, can remain without decomposing even at high temperature and pressure; therefore, further investigation of other potentially hazardous substances not controlled in feed is needed.
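
As a worked illustration of the recovery and precision figures reported above, the sketch below computes mean recovery (%) and relative standard deviation (RSD %) from replicate spiked-sample results. The spike level and measured values are hypothetical, not the study's data.

```python
import statistics

def recovery_and_rsd(measured_ug_per_kg, spiked_ug_per_kg):
    """Return (mean recovery %, RSD %) for replicate spiked-sample results."""
    recoveries = [100.0 * m / spiked_ug_per_kg for m in measured_ug_per_kg]
    mean_rec = statistics.mean(recoveries)
    rsd = 100.0 * statistics.stdev(recoveries) / mean_rec  # relative standard deviation
    return mean_rec, rsd

# Hypothetical replicates for a 10 ug/kg ketamine spike in meat and bone meal
measured = [9.1, 9.4, 8.9, 9.2, 9.3]
mean_rec, rsd = recovery_and_rsd(measured, 10.0)
print(f"recovery = {mean_rec:.2f} %, RSD = {rsd:.2f} %")
```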

Verification of Multi-point Displacement Response Measurement Algorithm Using Image Processing Technique (영상처리기법을 이용한 다중 변위응답 측정 알고리즘의 검증)

  • Kim, Sung-Wan;Kim, Nam-Sik
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.30 no.3A
    • /
    • pp.297-307
    • /
    • 2010
  • Recently, maintenance engineering and technology for civil and building structures have begun to draw considerable attention, and the number of structures whose structural safety must be evaluated due to deterioration and performance degradation is rapidly increasing. When stiffness decreases because of deterioration or member cracks, the dynamic characteristics of a structure change. It is therefore important to correctly evaluate the damaged areas and the extent of damage by analyzing the dynamic characteristics obtained from the actual behavior of the structure. In general, the typical measurement instruments used for structure monitoring are dynamic instruments. Existing dynamic instruments have difficulty obtaining reliable data when the cable connecting the sensors to the device is long, and the one-to-one connection between each sensor and the instrument is uneconomical. Therefore, a method that measures vibration remotely, without attaching sensors, is required. Representative non-contact methods for measuring structural vibration are the laser Doppler method, GPS-based measurement, and image processing techniques. The laser Doppler method shows relatively high accuracy but is uneconomical, while the GPS-based method requires expensive equipment, carries its own signal error, and has a limited sampling rate. In contrast, the image-based method is simple and economical, and is well suited to obtaining the vibration and dynamic characteristics of inaccessible structures. Camera image signals, rather than sensors, have recently been used by many researchers. However, the existing approach, which records a single target point attached to a structure and then measures vibration using image processing, is relatively limited in what it can measure. Therefore, this study conducted shaking table tests and a field load test to verify the validity of a method that can measure multi-point displacement responses of structures using an image processing technique.
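
The multi-point idea can be sketched as follows: track several target markers across video frames and convert their pixel motion into displacement with a calibrated scale factor. The snippet below is a minimal OpenCV illustration, not the authors' algorithm; the template-matching approach, the marker templates, and the mm-per-pixel factor are assumptions for the example.

```python
import cv2
import numpy as np

def track_targets(frames, templates, mm_per_pixel):
    """Track each target template across frames and return its displacement in mm.

    frames       : list of grayscale images (numpy arrays)
    templates    : list of small grayscale patches, one per measurement point
    mm_per_pixel : scale factor obtained from image calibration
    """
    displacements = []
    ref_positions = None
    for frame in frames:
        positions = []
        for tpl in templates:
            # Normalized cross-correlation template matching
            res = cv2.matchTemplate(frame, tpl, cv2.TM_CCOEFF_NORMED)
            _, _, _, max_loc = cv2.minMaxLoc(res)
            positions.append(np.array(max_loc, dtype=float))
        if ref_positions is None:
            ref_positions = positions  # first frame defines the zero position
        disp = [(p - r) * mm_per_pixel for p, r in zip(positions, ref_positions)]
        displacements.append(disp)
    return displacements
```

Subtracting the first-frame positions gives displacement time histories for every marker simultaneously, which is what distinguishes the multi-point approach from single-target measurement.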

Seasonal Variations of Microphytobenthos in Sediments of the Estuarine Muddy Sandflat of Gwangyang Bay: HPLC Pigment Analysis (광합성색소 분석을 통한 광양만 갯벌 퇴적물 중 저서미세조류의 계절변화)

  • Lee, Yong-Woo;Choi, Eun-Jung;Kim, Young-Sang;Kang, Chang-Keun
    • The Sea:JOURNAL OF THE KOREAN SOCIETY OF OCEANOGRAPHY
    • /
    • v.14 no.1
    • /
    • pp.48-55
    • /
    • 2009
  • Seasonal variations of microalgal biomass and community composition in both the sediment and the seawater were investigated by HPLC pigment analysis in an estuarine muddy sandflat of Gwangyang Bay from January to November 2002. Based on the photosynthetic pigments, fucoxanthin, diadinoxanthin, and diatoxanthin were the most dominant pigments all year round, indicating that diatoms were the predominant algal group in both the sediment and the seawater of Gwangyang Bay. The other algal pigments except the diatom-marker pigments showed relatively low concentrations. Microphytobenthic chlorophyll a concentrations in the upper layer (0.5 cm) of sediments ranged from 3.44 (March at the middle site of the tidal flat) to 169 (July at the upper site) mg m⁻², with annual mean concentrations of 68.4 ± 45.5, 21.3 ± 14.3, and 22.9 ± 15.6 mg m⁻² at the upper, middle, and lower tidal sites, respectively. Depth-integrated chlorophyll a concentrations in the overlying water column ranged from 1.66 (November) to 11.7 (July) mg m⁻², with an annual mean of 6.96 ± 3.04 mg m⁻². Microphytobenthic biomasses were about 3~10 times higher than the depth-integrated phytoplankton biomass in the overlying water column. The physical characteristics of this shallow estuarine tidal flat, the similarity in taxonomic composition of the phytoplankton and microphytobenthos, and the similar seasonal patterns in their biomasses suggest that resuspended microphytobenthos are an important component of phytoplankton biomass in Gwangyang Bay. Therefore, considering the importance of microphytobenthos as a possible food source for estuarine benthic and pelagic consumers, consistent monitoring of the behavior of microphytobenthos is needed in tidal flat ecosystems.

Studies on Xylooligosaccharide Analysis Method Standardization using HPLC-UVD in Health Functional Food (건강기능식품에서 HPLC-UVD를 이용한 자일로올리고당 시험법의 표준화 연구)

  • Se-Yun Lee;Hee-Sun Jeong;Kyu-Heon Kim;Mi-Young Lee;Jung-Ho Choi;Jeong-Sun Ahn;Kwang-Il Kwon;Hye-Young Lee
    • Journal of Food Hygiene and Safety
    • /
    • v.39 no.2
    • /
    • pp.72-82
    • /
    • 2024
  • This study aimed to develop a scientifically and systematically standardized xylooligosaccharide analytical method that can be applied to products with various formulations. The analysis was performed by HPLC with a Cadenza C18 column, involving pre-column derivatization with 1-phenyl-3-methyl-5-pyrazolone (PMP) and UV detection at 254 nm. The xylooligosaccharide content was analyzed by converting xylooligosaccharide into xylose through acid hydrolysis. The pre-treatment methods were compared and evaluated by varying the sonication time, acid hydrolysis time, and acid concentration. Optimal instrument conditions were achieved with a mobile phase of 20 mM potassium phosphate buffer (pH 6)-acetonitrile (78:22, v/v) under isocratic elution at a flow rate of 0.5 mL/min (254 nm). Furthermore, we validated the standardized analysis method in terms of specificity, linearity, limit of detection (LOD), limit of quantitation (LOQ), accuracy, and precision to support the suitability of the proposed analytical procedure. The standardized analysis method is now in use for monitoring relevant health functional food products available in the market. Our results demonstrate that the standardized method is expected to enhance the reliability of quality control for health functional foods containing xylooligosaccharide.
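
As a rough numerical companion to the quantitation step described above (acid hydrolysis to xylose, then HPLC-UVD measurement), the sketch below fits an external-standard calibration line for xylose and converts the xylose found back to xylooligosaccharide with an anhydro correction factor. The calibration points, sample response, dilution factor, and the 0.88 factor are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical external-standard calibration for xylose (PMP derivative, 254 nm)
conc_mg_per_ml = np.array([0.05, 0.1, 0.2, 0.4, 0.8])       # standard concentrations
peak_area      = np.array([1210, 2390, 4820, 9610, 19150])  # invented detector responses

slope, intercept = np.polyfit(conc_mg_per_ml, peak_area, 1)
r = np.corrcoef(conc_mg_per_ml, peak_area)[0, 1]
print(f"calibration linearity r = {r:.4f}")

def xos_content(sample_area, dilution_factor, anhydro_factor=0.88):
    """Xylose found from the calibration line, converted to xylooligosaccharide.

    anhydro_factor (0.88 = 132/150) corrects for the water added during acid
    hydrolysis; it is stated here as an assumption for the sketch.
    """
    xylose = (sample_area - intercept) / slope * dilution_factor
    return xylose * anhydro_factor

print(f"xylooligosaccharide (mg/mL) = {xos_content(5300, dilution_factor=10):.3f}")
```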

<Field Action Report> Local Governance for COVID-19 Response of Daegu Metropolitan City (<사례보고> 코로나바이러스감염증-19 유행과 로컬 거버넌스 - 2020년 대구광역시 유행에 대한 대응을 중심으로 -)

  • Kyeong-Soo Lee;Jung Jeung Lee;Keon-Yeop Kim;Jong-Yeon Kim;Tae-Yoon Hwang;Nam-Soo Hong;Jun Hyun Hwang;Jaeyoung Ha
    • Journal of agricultural medicine and community health
    • /
    • v.49 no.1
    • /
    • pp.13-36
    • /
    • 2024
  • Objectives: The purpose of this field case report is 1) to analyze the community's strategy and performance in responding to infectious diseases through the case of Daegu Metropolitan City's COVID-19 infectious disease crisis response, 2) to interpret this case using governance theory and an infectious disease response governance framework, and 3) to propose a strategic model for the community to prepare for future infectious disease outbreaks. Methods: Cases of Daegu Metropolitan City's infectious disease crisis response were analyzed through the researchers' participatory observations, a review of the COVID-19 White Paper of Daegu Metropolitan City and the Daegu Medical Association's COVID-19 White Paper, a literature review of domestic and international governance, and administrative documents. Results: Through the researchers' participatory observation and literature review, the following were derived: 1) establishment of leadership and a response system to respond to the infectious disease crisis in Daegu Metropolitan City, 2) citizen participation and communication strategies through the pan-citizen response committee, 3) cooperation between Daegu Metropolitan City and governance of public-private medical facilities, 4) decision-making and crisis response through participation and communication among the Daegu Metropolitan City Medical Association, the Medi-City Daegu Council, and private-sector medical experts, 5) symptom monitoring, patient triage strategies, and treatment response for confirmed infectious disease patients by members of the Daegu Medical Association, and 6) strategies and implications for establishing and utilizing a local infectious disease crisis response information system. Conclusions: The results of the study empirically demonstrate that collaborative governance of the community, through the participation of citizens, private-sector experts, and community medical facilities, is a key element for an effective response to infectious disease crises.

Comparative study of flood detection methodologies using Sentinel-1 satellite imagery (Sentinel-1 위성 영상을 활용한 침수 탐지 기법 방법론 비교 연구)

  • Lee, Sungwoo;Kim, Wanyub;Lee, Seulchan;Jeong, Hagyu;Park, Jongsoo;Choi, Minha
    • Journal of Korea Water Resources Association
    • /
    • v.57 no.3
    • /
    • pp.181-193
    • /
    • 2024
  • The increasing atmospheric imbalance caused by climate change leads to an elevation in precipitation, resulting in a heightened frequency of flooding. Consequently, there is a growing need for technology to detect and monitor these occurrences as the frequency of flooding events rises. To minimize flood damage, continuous monitoring is essential, and flood areas can be detected using Synthetic Aperture Radar (SAR) imagery, which is not affected by weather conditions. The observed data undergo a preprocessing step in which a median filter is applied to reduce noise. Classification techniques were then employed to separate water bodies from non-water bodies, with the aim of evaluating the effectiveness of each method in flood detection. In this study, the Otsu method and the Support Vector Machine (SVM) technique were utilized for the classification of water bodies and non-water bodies. The overall performance of the models was assessed using a confusion matrix. The suitability for flood detection was evaluated by comparing the Otsu method, an optimal threshold-based classifier, with SVM, a machine learning technique that minimizes misclassification through training. The Otsu method was suitable for delineating boundaries between water and non-water bodies but exhibited a higher rate of misclassification due to the influence of mixed substances. Conversely, SVM resulted in a lower false positive rate and proved less sensitive to mixed substances. Consequently, SVM exhibited higher accuracy under non-flood conditions. While the Otsu method showed slightly higher accuracy in flood conditions compared to SVM, the difference in accuracy was less than 5% (Otsu: 0.93, SVM: 0.90). However, in pre-flooding and post-flooding conditions, the accuracy difference was more than 15%, indicating that SVM is more suitable for water body and flood detection (Otsu: 0.77, SVM: 0.92). Based on these findings, it is anticipated that more accurate detection of water bodies and floods will contribute to minimizing flood-related damage and losses.
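
The comparison described above can be sketched in a few lines: apply a median filter, derive a single Otsu threshold, train an SVM on labeled backscatter values, and score both predictions with a confusion matrix. The snippet below is a simplified illustration under assumed inputs (a 2-D array of Sentinel-1 backscatter in dB and a matching reference water mask); it is not the authors' processing chain.

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage.filters import threshold_otsu
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix, accuracy_score

def compare_methods(sar_db, water_mask):
    """sar_db: 2-D backscatter image (dB); water_mask: 2-D array of 0/1 labels."""
    # Noise reduction with a median filter, as in the preprocessing step
    filtered = median_filter(sar_db, size=3)

    # Otsu: one optimal threshold separating water from non-water
    t = threshold_otsu(filtered)
    otsu_pred = (filtered < t).astype(int)  # low backscatter -> water

    # SVM: trained on a random subsample of labeled pixels (backscatter as the feature)
    X, y = filtered.reshape(-1, 1), water_mask.ravel()
    idx = np.random.choice(X.shape[0], size=min(5000, X.shape[0]), replace=False)
    svm_pred = SVC(kernel="rbf").fit(X[idx], y[idx]).predict(X)

    for name, pred in (("Otsu", otsu_pred.ravel()), ("SVM", svm_pred)):
        print(name, "accuracy:", accuracy_score(y, pred))
        print(confusion_matrix(y, pred))
```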

A New Exploratory Research on Franchisor's Provision of Exclusive Territories (가맹본부의 배타적 영업지역보호에 대한 탐색적 연구)

  • Lim, Young-Kyun;Lee, Su-Dong;Kim, Ju-Young
    • Journal of Distribution Research
    • /
    • v.17 no.1
    • /
    • pp.37-63
    • /
    • 2012
  • In franchise business, exclusive sales territory (sometimes EST in table) protection is a very important issue from economic, social, and political points of view. It affects the growth and survival of both franchisor and franchisee and often raises social and political conflicts. When franchisees are not familiar with the related laws and regulations, the franchisor has a high chance of exploiting this. Exclusive sales territory protection by the manufacturer and distributors (wholesalers or retailers) means a sales area restriction by which only certain distributors have the right to sell products or services. A distributor who has been granted an exclusive sales territory can protect its own territory but may be prohibited from entering other regions. Even though exclusive sales territory is quite a critical problem in franchise business, there is little rigorous research, based on empirical data, about its reasons, results, evaluation, and future direction. This paper tries to address the problem not only in terms of logical and nomological validity but also through empirical validation. While we pursue an empirical analysis, we take into account the difficulties of real data collection and of statistical analysis techniques. We use a set of disclosure document data collected by the Korea Fair Trade Commission, instead of the conventional survey method, which is often criticized for measurement error. Existing theories about exclusive sales territory can be summarized into two groups as shown in the table below. The first concerns the effectiveness of exclusive sales territory from both the franchisor's and franchisee's points of view. In fact, the outcome of exclusive sales territory protection can be positive for franchisors but negative for franchisees, and it can be positive in terms of sales but negative in terms of profit. Therefore, variables and viewpoints should be set properly. The second concerns the motive or reason why exclusive sales territories are protected. The reasons can be classified into four groups: industry characteristics, franchise system characteristics, capability to maintain exclusive sales territory, and strategic decision. Within these four groups of reasons, there are more specific variables and theories as below. Based on these theories, we develop nine hypotheses, which are briefly shown in the last table below with the results. To test the hypotheses, data were collected from the government (FTC) homepage, which is open source. The sample consists of 1,896 franchisors and contains about three years of operating data, from 2006 to 2008. Within the sample, 627 have an exclusive sales territory protection policy, and these franchisors are not evenly distributed over the 19 representative industries. Additional data were also collected from another government agency homepage, such as Statistics Korea. We also combined data from various secondary sources to create meaningful variables, as shown in the table below. All variables were dichotomized by mean or median split if they were not inherently dichotomized by definition, since each hypothesis is composed of multiple variables and there is no solid statistical technique to incorporate all these conditions in testing the hypotheses. This paper uses a simple chi-square test because the hypotheses and theories are built upon quite specific conditions such as industry type, economic condition, company history, and various strategic purposes.
It is almost impossible to find samples that satisfy all of these conditions, and they cannot be manipulated in experimental settings. More advanced statistical techniques work well on clean data without exogenous variables, but not on real, complex data. The chi-square test is applied by grouping samples into four cells using two criteria: whether they protect exclusive sales territories, and whether they satisfy the conditions of each hypothesis. A hypothesis is supported when the proportion of franchisors that satisfy its conditions and protect exclusive sales territories significantly exceeds the proportion that satisfy the conditions but do not protect them. In fact, the chi-square test is equivalent to a Poisson regression, which allows more flexible application. As a result, only three hypotheses were supported. When the attitude toward risk is high, so that the royalty fee is determined according to sales performance, EST protection produces poor results, as expected. When the franchisor protects ESTs in order to recruit franchisees more easily, EST protection produces better results. Also, when EST protection is intended to improve the efficiency of the franchise system as a whole, it shows better performance. High efficiency is achieved because EST protection prevents free riding by franchisees who exploit others' marketing efforts, encourages proper investment, and distributes franchisees evenly across regions. The other hypotheses were not supported by the significance tests. Exclusive sales territories should be protected for proper motives and administered for mutual benefit. Legal restrictions driven by government agencies such as the FTC could be misused and cause misunderstandings, so more careful monitoring of real practices and more rigorous studies by both academics and practitioners are needed.
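
The grouping logic described above reduces to a 2×2 contingency table per hypothesis. The sketch below runs the corresponding chi-square test with scipy; the cell counts are invented for illustration (they only respect the 627 vs. 1,269 split between franchisors with and without EST protection).

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table for one hypothesis:
# rows = protects EST / does not, columns = satisfies the condition / does not.
table = [[210, 417],    # EST protected     (627 total)
         [310, 959]]    # EST not protected (1,269 total)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A small p-value indicates that the proportion satisfying the condition differs
# between franchisors that protect ESTs and those that do not.
```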

A Monitoring of Aflatoxins in Commercial Herbs for Food and Medicine (식·약공용 농산물의 아플라톡신 오염 실태 조사)

  • Kim, Sung-dan;Kim, Ae-kyung;Lee, Hyun-kyung;Lee, Sae-ram;Lee, Hee-jin;Ryu, Hoe-jin;Lee, Jung-mi;Yu, In-sil;Jung, Kweon
    • Journal of Food Hygiene and Safety
    • /
    • v.32 no.4
    • /
    • pp.267-274
    • /
    • 2017
  • This paper deals with the natural occurrence of total aflatoxins (B1, B2, G1, and G2) in commercial herbs for food and medicine. To monitor aflatoxins in commercial herbs for food and medicine not included in the specifications of the Food Code, a total of 62 samples of 6 different herbs (Bombycis Corpus, Glycyrrhizae Radix et Rhizoma, Menthae Herba, Nelumbinis Semen, Polygalae Radix, Zizyphi Semen) were collected from the Yangnyeong market in Seoul, Korea. The samples were treated by the immunoaffinity column clean-up method and quantified by high performance liquid chromatography (HPLC) with on-line post-column photochemical derivatization (PHRED) and fluorescence detection (FLD). The analytical method for aflatoxins was validated for accuracy, precision, and detection limits. The method showed recovery values in the range of 86.9~114.0% and percent coefficients of variation (CV%) in the range of 0.9~9.8%. The limits of detection (LOD) and quantitation (LOQ) in herbs ranged from 0.020 to 0.363 µg/kg and from 0.059 to 1.101 µg/kg, respectively. Of the 62 samples analyzed, 6 semen samples (the original form of 2 Nelumbinis Semen and 2 Zizyphi Semen, and the powder of 1 Nelumbinis Semen and 1 Zizyphi Semen) were aflatoxin-positive. Aflatoxin B1 or B2 was detected in all positive samples, while aflatoxins G1 and G2 were not detected. The total aflatoxin (B1, B2, G1, and G2) content in the powder and original forms of Nelumbinis Semen and Zizyphi Semen was in the range of ND~21.8 µg/kg, which is not presently regulated in Korea. The remaining 56 samples presented levels below the limits of detection and quantitation.
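
For context on how detection and quantitation limits like those above are commonly estimated, the sketch below applies the ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S, where S is the calibration slope and σ the standard deviation of the regression residuals. The calibration data are invented, and the paper may have estimated its limits differently (e.g., from signal-to-noise ratios).

```python
import numpy as np

# Hypothetical aflatoxin B1 calibration data (concentration in ug/kg vs. peak area)
conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
area = np.array([152, 300, 1510, 3020, 15100, 30150])

slope, intercept = np.polyfit(conc, area, 1)
sigma = (area - (slope * conc + intercept)).std(ddof=2)  # residual standard deviation

lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantitation
print(f"LOD = {lod:.3f} ug/kg, LOQ = {loq:.3f} ug/kg")
```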

The Characteristics and Performances of Manufacturing SMEs that Utilize Public Information Support Infrastructure (공공 정보지원 인프라 활용한 제조 중소기업의 특징과 성과에 관한 연구)

  • Kim, Keun-Hwan;Kwon, Taehoon;Jun, Seung-pyo
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.4
    • /
    • pp.1-33
    • /
    • 2019
  • Small and medium-sized enterprises (hereinafter SMEs) are already at a competitive disadvantage compared to large companies with more abundant resources. Manufacturing SMEs not only need a great deal of information for new product development to sustain growth and survival, but also seek networking to overcome resource constraints; however, they face limitations because of their size. In a new era in which connectivity increases the complexity and uncertainty of the business environment, SMEs are increasingly urged to find information and solve networking problems. To address these problems, government-funded research institutes play an important role in solving the information asymmetry problem of SMEs. The purpose of this study is to identify the differentiating characteristics of SMEs that utilize the public information support infrastructure provided to enhance their innovation capacity, and to determine how such use contributes to corporate performance. We argue that an infrastructure for providing information support to SMEs is needed as part of the effort to strengthen the role of government-funded institutions; in this study, we specifically identify the target of such a policy and empirically demonstrate the effects of such policy-based efforts. Our goal is to help establish strategies for building the information supporting infrastructure. To achieve this purpose, we first classified the characteristics of SMEs that have been found to utilize the information supporting infrastructure provided by government-funded institutions. This allows us to verify whether selection bias appears in the analyzed group, which helps clarify the interpretative limits of our results. Next, we performed mediator and moderator effect analyses for multiple variables to analyze the process through which the use of the information supporting infrastructure led to an improvement in external networking capabilities and, in turn, to enhanced product competitiveness. This analysis helps identify the key factors to focus on when offering indirect support to SMEs through the information supporting infrastructure, which in turn helps us more efficiently manage research related to SME supporting policies implemented by government-funded institutions. The results of this study showed the following. First, SMEs that used the information supporting infrastructure differed significantly in size from domestic R&D SMEs, but there was no significant difference in the cluster analysis that considered various variables. Based on these findings, we confirmed that SMEs that use the information supporting infrastructure are larger and include a relatively higher proportion of companies that transact extensively with large companies, compared to the general group of SMEs. We also found that companies already receiving support from the information infrastructure include a high concentration of companies that need collaboration with government-funded institutions.
Secondly, among the SMEs that use the information supporting infrastructure, we found that increased external networking capabilities contributed to enhanced product competitiveness; this was not the effect of direct assistance, but an indirect contribution made by increasing open marketing capabilities: in other words, it was the result of an indirect-only mediator effect. Also, the number of times a company received additional support in this process through mentoring related to information utilization was found to have a mediated moderation effect on improving external networking capabilities and, in turn, strengthening product competitiveness. The results of this study provide several insights that will help establish policies. The findings on KISTI's information support infrastructure may lead to the conclusion that it intentionally supports groups in which marketing is already well underway and which are able to achieve good performance. As a result, the government should set clear priorities on whether to support underdeveloped companies or to aid those already performing well. Through our research, we have identified how the public information infrastructure contributes to product competitiveness, from which we can draw some policy implications. First, the public information support infrastructure should enhance firms' ability to interact with, or find, the experts who provide the required information. Second, if the utilization of the public information support (online) infrastructure is effective, it is not necessary to continuously provide informational mentoring, which is a parallel offline support; rather, offline support such as mentoring should be used as an appropriate device for monitoring abnormal symptoms. Third, SMEs need to improve their ability to utilize the infrastructure, because the effect of enhancing networking capacity and product competitiveness through the public information support infrastructure appears in most types of companies rather than only in specific SMEs.
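
The indirect-only mediation pattern mentioned above can be illustrated with simple regression steps. The sketch below uses simulated data and made-up variable names (infra_use, marketing, networking); it shows a Baron-Kenny style decomposition into direct and indirect effects, not the study's actual models or estimates.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated firm-level data; variable names are assumptions for the sketch.
rng = np.random.default_rng(0)
n = 300
infra_use = rng.integers(0, 2, n)                     # use of the information infrastructure
marketing = 0.5 * infra_use + rng.normal(size=n)      # open marketing capability (mediator)
networking = 0.6 * marketing + rng.normal(size=n)     # external networking capability
df = pd.DataFrame(dict(infra_use=infra_use, marketing=marketing, networking=networking))

# Baron-Kenny style steps: total effect, path to the mediator, direct vs. indirect effect
total  = smf.ols("networking ~ infra_use", df).fit()
a_path = smf.ols("marketing ~ infra_use", df).fit()
b_path = smf.ols("networking ~ infra_use + marketing", df).fit()

indirect = a_path.params["infra_use"] * b_path.params["marketing"]
direct   = b_path.params["infra_use"]
print(f"total = {total.params['infra_use']:.3f}, direct = {direct:.3f}, indirect = {indirect:.3f}")
# An 'indirect-only' pattern corresponds to a negligible direct effect alongside a
# non-negligible indirect effect through the mediator.
```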

Development of a complex failure prediction system using Hierarchical Attention Network (Hierarchical Attention Network를 이용한 복합 장애 발생 예측 시스템 개발)

  • Park, Youngchan;An, Sangjun;Kim, Mintae;Kim, Wooju
    • Journal of Intelligence and Information Systems
    • /
    • v.26 no.4
    • /
    • pp.127-148
    • /
    • 2020
  • The data center is a physical facility for accommodating computer systems and related components, and is an essential foundation for next-generation core industries such as big data, smart factories, wearables, and smart homes. In particular, with the growth of cloud computing, proportional expansion of the data center infrastructure is inevitable. Monitoring the health of these data center facilities is a way to maintain and manage the system and prevent failure. If a failure occurs in some elements of the facility, it may affect not only the relevant equipment but also other connected equipment, and may cause enormous damage. In particular, failures in IT facilities occur irregularly because of interdependence, and it is difficult to identify the cause. Previous studies on predicting failures in data centers predicted failure by treating a single server as a single state, without assuming that devices interact. Therefore, in this study, data center failures were classified into failures occurring inside the server (Outage A) and failures occurring outside the server (Outage B), and the analysis focused on complex failures occurring within the server. Server-external failures include power, cooling, user errors, and so on. Since such failures can be prevented in the early stages of data center facility construction, various solutions are being developed. On the other hand, the cause of a failure occurring inside a server is difficult to determine, and adequate prevention has not yet been achieved. This is because server failures do not occur in isolation; they may cause failures in other servers or be triggered by failures propagating from other servers. In other words, while existing studies analyzed failures under the assumption of a single server that does not affect other servers, this study assumes that failures have effects between servers. To define the complex failure situation in the data center, failure history data for each piece of equipment in the data center were used. Four major failures are considered in this study: Network Node Down, Server Down, Windows Activation Services Down, and Database Management System Service Down. The failures occurring for each device are sorted in chronological order, and when a failure occurs in one piece of equipment, any failure in another piece of equipment within 5 minutes of that occurrence is defined as occurring simultaneously. After constructing sequences of the devices that failed at the same time, five devices that frequently failed simultaneously within the constructed sequences were selected, and the cases where the selected devices failed at the same time were confirmed through visualization. Since the server resource information collected for failure analysis is time-series data with temporal flow, we used Long Short-Term Memory (LSTM), a deep learning algorithm that can predict the next state from previous states. In addition, unlike the single-server case, the Hierarchical Attention Network deep learning model structure was used, considering that the contribution of each server to a complex failure differs. This algorithm increases prediction accuracy by giving more weight to servers with a greater impact on the failure. The study began by defining the types of failure and selecting the analysis target.
In the first experiment, the same collected data were treated first as a single-server state and then as a multiple-server state, and the results were compared and analyzed. The second experiment improved the prediction accuracy for the complex (multi-server) case by optimizing the threshold for each server. In the first experiment, under the single-server assumption, three of the five servers were predicted not to have failed even though failures actually occurred; under the multiple-server assumption, all five servers were predicted to have failed. This result supports the hypothesis that there are effects between servers. Overall, this study confirmed that prediction performance was superior when multiple servers were assumed than when a single server was assumed. In particular, applying the Hierarchical Attention Network algorithm, which assumes that the effect of each server differs, contributed to improving the analysis. In addition, applying a different threshold for each server further improved the prediction accuracy. This study showed that failures whose causes are difficult to determine can be predicted from historical data, and it presents a model that can predict failures occurring in data center servers. It is expected that failures can be prevented in advance using the results of this study.
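
The two-level idea described above (per-server LSTM encodings combined through attention weights that reflect each server's contribution to a complex failure) can be sketched as follows. This is a minimal PyTorch illustration under assumed dimensions and layer sizes, not the paper's configuration.

```python
import torch
import torch.nn as nn

class ServerAttentionFailureModel(nn.Module):
    """Sketch: per-server LSTM encoders plus attention over servers."""

    def __init__(self, n_features, hidden=64, n_servers=5):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)          # scores each server's encoding
        self.head = nn.Linear(hidden, n_servers)  # per-server failure logits

    def forward(self, x):
        # x: (batch, n_servers, time_steps, n_features) resource time series
        b, s, t, f = x.shape
        _, (h, _) = self.lstm(x.reshape(b * s, t, f))
        server_repr = h[-1].reshape(b, s, -1)                    # (batch, servers, hidden)
        weights = torch.softmax(self.attn(server_repr), dim=1)   # attention over servers
        context = (weights * server_repr).sum(dim=1)             # weighted summary
        return self.head(context)                                # logits per server

# Hypothetical usage: 8 samples, 5 servers, 60 time steps, 12 resource metrics
model = ServerAttentionFailureModel(n_features=12)
print(model(torch.randn(8, 5, 60, 12)).shape)  # torch.Size([8, 5])
```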