• Title/Summary/Keyword: Evaluation Data Set

Search Results: 1,089

A Survey on the Actual State of Laboratory Facilities and Equipments at Nursing Schools (간호교육기관의 실험실습설비 보유실태 조사)

  • Lim, N.Y.;Lee, S.O.;Suh, M.J.;Kim, H.S.;Kim, M.S.;Oh, K.O.
    • The Korean Nurse
    • /
    • v.36 no.1
    • /
    • pp.108-117
    • /
    • 1997
  • This study was carried out to examine the standards for evaluating laboratory facilities and equipment, which constitute the most important yet most vulnerable area of our system of higher education among the six school evaluation categories provided by the Korean Council for University Education. To obtain data on the current holdings and management of laboratory facilities and equipment at nursing schools in Korea, questionnaires were prepared by members of a special committee of the Korea Nursing Education Society on the basis of the Standards for University Laboratory Facilities and Equipment issued by the Ministry of Education. The questionnaires were mailed to nursing schools across the nation on October 4, 1995, and 39 institutions completed and returned them by December 31 of the same year. The results of the analysis were as follows:
1. The Physical Environment of Laboratories. According to the responses from 14 nursing departments at four-year colleges, laboratories vary in size from 24 to 274.91 pyeong ($1\;\text{pyeong} = 3.3\,\mathrm{m}^2$). The average number of students in a laboratory class was 46.93 at four-year colleges, while the number ranged from 40 to 240 at junior colleges. The average floor space of laboratories at junior colleges, however, was almost the same as that of laboratories at four-year colleges.
2. The Actual State of Laboratory Facilities and Equipment. Laboratory equipment held by nursing schools at colleges and universities showed a very wide distribution by type, but most of it does not meet the government standards set out in the applicable regulations, while some types of equipment are in excess supply. The same is true of junior colleges, where laboratory equipment must meet a different set of government standards specifically established for junior colleges. Closer investigation is called for regarding the types of equipment that are in short supply at more than 80 percent of colleges and universities. As for the types of equipment in excess supply, investigation should determine whether they are really needed in such large quantities. In many cases, it appears that unnecessary equipment is procured, even if it is already obsolete, merely for the sake of holding a seemingly impressive armamentarium.
3. Basic Science Laboratory Equipment. Among the 39 institutions, five four-year colleges were found to possess equipment for basic science. Only one type of essential equipment, tele-thermometers, and only two types of recommended equipment, rotators and dip chambers, were installed in sufficient numbers to meet the standards. All junior colleges failed to meet the standards in every equipment category. Overall, nursing schools at all of the institutions surveyed were found to be below par in terms of laboratory equipment.
4. Required Equipment. In response to the question concerning which type of equipment was most needed but not currently possessed, cardiopulmonary resuscitation (CPR) machines and electrocardiogram (ECG) monitors topped the list with four respondents each, followed by measuring equipment.
5. Management of Laboratory Equipment. According to the survey, professors in charge of clinical training and teaching assistants are responsible for managing the laboratory at nursing schools at colleges and universities, whereas the chief of the general affairs section or the chairman of the nursing department manages the laboratory at junior colleges. This suggests that the administrative systems differ somewhat.
According to the above results, laboratory training can be defined as the process by which nursing students acquire many of the skills necessary to become fully qualified nurses. Laboratory training should therefore be carefully planned to give students extensive hands-on experience so that they can handle problems and emergencies effectively in actual situations. All nursing students should be thoroughly drilled and given as much hands-on experience as possible. In this regard, there is a clear need to update the equipment criteria to reflect society's present demands rather than simply filling laboratory equipment quotas according to the current criteria.

  • PDF

Experimental Study on the Determination of Slope and Height of Curbs Considering the VRUs (교통약자를 고려한 보도의 경사도와 높이 결정을 위한 실험연구)

  • Kim, Hyunjin;Lim, Joonbeom;Choe, Byongho;Oh, Cheol;Kang, Inhyeng
    • International Journal of Highway Engineering
    • /
    • v.20 no.1
    • /
    • pp.107-115
    • /
    • 2018
  • PURPOSES: As the population of the mobility handicapped, such as the disabled, the elderly, pregnant women, and children, has increased, calls for guaranteeing their rights have grown as well. Accordingly, design manuals for roads and sidewalks for the mobility handicapped have been developed by central and local governments, such as the Ministry of Land, Transport, and Tourism and the Seoul Metropolitan Government. However, according to the 2013 survey results of the Seoul Metropolitan City, the mobility handicapped still find sidewalks uncomfortable and particularly request improvements to the step and slope of the sidewalk curb. Therefore, in this study, we conducted an empirical experiment to determine the slope of the sidewalk curb and the height of the steps in consideration of the mobility handicapped, and analyzed whether there is a statistically significant difference.
METHODS: The study followed an empirical experimental design. Five non-disabled people, 10 wheelchair users, and 10 eye patch and stick users walked for about 2-3 min on sidewalk plates of the sloped type (0%, 5%, 6.3%, 8.3%) and the stepped type (0 cm, 1 cm, 3 cm, 6 cm), and their physiological responses, such as skin temperature, volume of perspiration on the forehead and chest, and heart rate, were measured and recorded. After combining the data, we conducted a nonparametric test, ANOVA, or t-test to determine whether there was a statistically significant difference for each slope and step type (an illustrative sketch of this comparison follows the abstract).
RESULTS: For the non-disabled, there was no significant difference in physiological responses according to the slope and steps of the sidewalk, which suggests that the non-disabled do not feel much physiological discomfort while walking. In the case of the sloped sidewalk plate, the heart rate of the wheelchair users increased when the slope was 6.3%. In the case of the eye patch and stick users, the volume of perspiration on the chest increased at a slope of 5.0%. In general, a sidewalk with a slope of less than 5% is judged not to cause a change in physiological response. In the case of the stepped sidewalk plate, when 0 cm, 1 cm, and 3 cm were compared for wheelchair users, the amount of forehead perspiration increased from 1 cm onward. Meanwhile, for the eye patch and stick users, when 0 cm and 6 cm were compared, the amount of perspiration on the forehead and chest as well as the heart rate all increased at 6 cm. Taken together, for wheelchair users a difference appeared when the step height of the sidewalk plate was 1 cm, suggesting that installing it at 0 cm does not cause physiological discomfort. For the eye patch and stick users, when comparing only 0 cm and 6 cm, 0 cm was considered suitable, as there was a difference in physiological response at 6 cm.
CONCLUSIONS: In this study, we used physiological responses such as chest skin temperature, amount of perspiration, and heart rate as evaluation items, and the experiment is considered meaningful in that it targeted wheelchair users as well as eye patch and stick users. The validity of the evaluation items was confirmed, as the physiological response results were significant. As for sidewalk design, the experimental results suggest that differential application should be implemented according to the type of mobility handicap, rather than uniformly applying the current legal standards of a 2 cm sidewalk step and a 1/25 sidewalk slope.
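As a purely illustrative companion to the METHODS description above, the sketch below compares a physiological measure between two curb conditions with either a t-test or a nonparametric Mann-Whitney U test, chosen by a normality check. The heart-rate numbers, group sizes, and the 0.05 threshold are assumptions for demonstration, not the study's data or exact procedure.

```python
# Hypothetical group comparison in the spirit of the study's ANOVA / t-test /
# nonparametric analysis; numbers are invented for illustration only.
from scipy import stats

hr_slope_0 = [72, 75, 71, 74, 73, 76, 70, 72, 74, 73]     # bpm, wheelchair users, 0% slope (made up)
hr_slope_63 = [78, 82, 80, 79, 84, 81, 77, 83, 80, 82]    # bpm, wheelchair users, 6.3% slope (made up)

# Use a parametric test if both samples look normal, otherwise a nonparametric one
normal = all(stats.shapiro(x)[1] > 0.05 for x in (hr_slope_0, hr_slope_63))
if normal:
    stat, p = stats.ttest_ind(hr_slope_0, hr_slope_63)
else:
    stat, p = stats.mannwhitneyu(hr_slope_0, hr_slope_63, alternative="two-sided")

print(f"statistic={stat:.2f}, p={p:.4f}")  # p < 0.05 would indicate a significant difference
```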

Comparison of Ultrasound Image Quality using Edge Enhancement Mask (경계면 강조 마스크를 이용한 초음파 영상 화질 비교)

  • Jung-Min, Son;Jun-Haeng, Lee
    • Journal of the Korean Society of Radiology
    • /
    • v.17 no.1
    • /
    • pp.157-165
    • /
    • 2023
  • Ultrasound imaging uses high-frequency sound waves that undergo physical interactions such as reflection, absorption, refraction, and transmission at the boundaries between different tissues. Improvement is needed because the data generated by ultrasound equipment are inherently noisy, and it is difficult to grasp the shape of the tissue being observed because the boundaries are vague. Edge enhancement is used to address cases where boundaries appear smeared owing to degraded image quality. In this paper, as a way of strengthening the boundaries, quality improvement was confirmed by enhancing the boundary, i.e., the high-frequency component, of each image using an unsharpening (unsharp) mask and high-boost filtering. The mask filtering applied to each image was evaluated by measuring PSNR and SNR. Abdominal, head, heart, liver, kidney, breast, and fetal images were obtained from Philips epiq5g and affiniti70g and Alpinion E-cube 15 ultrasound systems. The algorithm was implemented in MATLAB R2022a from MathWorks. The unsharpening and high-boost mask array size was set to 3×3, and the Laplacian filter, a spatial filter used to create outline-enhanced images, was applied equally to both masks. The ImageJ program was used for quantitative evaluation of image quality. As a result of applying the mask filters to various ultrasound images, the subjective assessment showed that the overall contour lines of the image were clearly visible when the unsharpening or high-boost mask was applied to the original image. In the quantitative comparison, the images to which the unsharpening mask and the high-boost mask were applied were rated higher than the original images. In the portal vein, head, gallbladder, and kidney images, the SNR, PSNR, RMSE, and MAE of the image with the high-boost mask applied were measured to be higher. Conversely, for images of the heart, breast, and fetus, the SNR, PSNR, RMSE, and MAE values were higher for the images with the unsharpening mask applied. Using the optimal mask for each image is therefore expected to help improve image quality by providing enhanced contour information.
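The abstract describes boosting the high-frequency boundary component with a 3×3 Laplacian through unsharp-mask and high-boost filtering, then scoring the results with PSNR/SNR. The authors worked in MATLAB R2022a; the Python sketch below only illustrates the general idea, and the kernel sign convention, boost factor `A`, and 8-bit intensity range are assumptions.

```python
# Minimal unsharp-mask / high-boost sketch with a 3x3 Laplacian (illustrative only).
import numpy as np
from scipy.ndimage import convolve

LAPLACIAN_3X3 = np.array([[ 0, -1,  0],
                          [-1,  4, -1],
                          [ 0, -1,  0]], dtype=float)

def edge_enhance(image, A=1.0):
    """A = 1.0 -> unsharp masking; A > 1.0 -> high-boost filtering."""
    img = image.astype(float)
    edges = convolve(img, LAPLACIAN_3X3, mode="reflect")   # high-frequency (boundary) component
    return np.clip(A * img + edges, 0, 255)                # assumes an 8-bit grayscale image

def psnr(reference, processed):
    """Peak signal-to-noise ratio against an 8-bit reference image."""
    mse = np.mean((reference.astype(float) - processed.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```

With `A` slightly above 1, high-boost retains more of the original low-frequency content than plain unsharp masking, which is one plausible reason the two masks can rank differently across organ images.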

DEVELOPMENT OF STATEWIDE TRUCK TRAFFIC FORECASTING METHOD BY USING LIMITED O-D SURVEY DATA (한정된 O-D조사자료를 이용한 주 전체의 트럭교통예측방법 개발)

  • 박만배
    • Proceedings of the KOR-KST Conference
    • /
    • 1995.02a
    • /
    • pp.101-113
    • /
    • 1995
  • The objective of this research is to test the feasibility of developing a statewide truck traffic forecasting methodology for Wisconsin by using Origin-Destination surveys, traffic counts, classification counts, and other data that are routinely collected by the Wisconsin Department of Transportation (WisDOT). Development of a feasible model will permit estimation of future truck traffic for every major link in the network. This will provide the basis for improved estimation of future pavement deterioration. Pavement damage rises exponentially as axle weight increases, and trucks are responsible for most of the traffic-induced damage to pavement. Consequently, forecasts of truck traffic are critical to pavement management systems. The Pavement Management Decision Supporting System (PMDSS) prepared by WisDOT in May 1990 combines pavement inventory and performance data with a knowledge base consisting of rules for evaluation, problem identification and rehabilitation recommendation. Without a reasonable truck traffic forecasting methodology, PMDSS is not able to project pavement performance trends in order to make assessments and recommendations for future years. However, none of WisDOT's existing forecasting methodologies has been designed specifically for predicting truck movements on a statewide highway network. For this research, the Origin-Destination survey data available from WisDOT, including two stateline areas, one county, and five cities, are analyzed and the zone-to-zone truck trip tables are developed. The resulting Origin-Destination Trip Length Frequency (OD TLF) distributions by trip type are applied to the Gravity Model (GM) for comparison with comparable TLFs from the GM. The gravity model is calibrated to obtain friction factor curves for the three trip types, Internal-Internal (I-I), Internal-External (I-E), and External-External (E-E). Both "macro-scale" calibration and "micro-scale" calibration are performed. The comparison of the statewide GM TLF with the OD TLF for the macro-scale calibration does not provide suitable results because the available OD survey data do not represent an unbiased sample of statewide truck trips. For the "micro-scale" calibration, "partial" GM trip tables that correspond to the OD survey trip tables are extracted from the full statewide GM trip table. These "partial" GM trip tables are then merged and a partial GM TLF is created. The GM friction factor curves are adjusted until the partial GM TLF matches the OD TLF. Three friction factor curves, one for each trip type, resulting from the micro-scale calibration produce a reasonable GM truck trip model. A key methodological issue for GM calibration involves the use of multiple friction factor curves versus a single friction factor curve for each trip type in order to estimate truck trips with reasonable accuracy. A single friction factor curve for each of the three trip types was found to reproduce the OD TLFs from the calibration data base. Given the very limited trip generation data available for this research, additional refinement of the gravity model using multiple friction factor curves for each trip type was not warranted. In traditional urban transportation planning studies, the zonal trip productions and attractions and region-wide OD TLFs are available. However, for this research, the information available for the development of the GM model is limited to Ground Counts (GC) and a limited set of OD TLFs.
The GM is calibrated using the limited OD data, but the OD data are not adequate to obtain good estimates of truck trip productions and attractions. Consequently, zonal productions and attractions are estimated using zonal population as a first approximation. Then, Selected Link based (SELINK) analyses are used to adjust the productions and attractions and possibly recalibrate the GM. The SELINK adjustment process involves identifying the origins and destinations of all truck trips that are assigned to a specified "selected link" as the result of a standard traffic assignment. A link adjustment factor is computed as the ratio of the actual volume for the link (ground count) to the total assigned volume. This link adjustment factor is then applied to all of the origin and destination zones of the trips using that "selected link" (a toy sketch of the gravity model and this link-adjustment step follows the abstract). Selected link based analyses are conducted by using both 16 selected links and 32 selected links. The SELINK analysis using 32 selected links provides the lowest %RMSE in the screenline volume analysis. In addition, the stability of the GM truck estimating model is preserved by using 32 selected links with three SELINK adjustments; that is, the GM remains calibrated despite substantial changes in the input productions and attractions. The coverage of zones provided by 32 selected links is satisfactory. Increasing the number of repetitions beyond four is not reasonable because the stability of the GM model in reproducing the OD TLF reaches its limits. The total volume of truck traffic captured by 32 selected links is 107% of total trip productions. But more importantly, SELINK adjustment factors for all of the zones can be computed. Evaluation of the travel demand model resulting from the SELINK adjustments is conducted by using screenline volume analysis, functional class and route specific volume analysis, area specific volume analysis, production and attraction analysis, and Vehicle Miles of Travel (VMT) analysis. Screenline volume analysis using four screenlines with 28 check points is used to evaluate the adequacy of the overall model. The total trucks crossing the screenlines are compared to the ground count totals. LV/GC ratios of 0.958 using 32 selected links and 1.001 using 16 selected links are obtained. The %RMSE for the four screenlines is inversely proportional to the average ground count totals by screenline. The magnitude of the %RMSE for the four screenlines resulting from the fourth and last GM run using 32 and 16 selected links is 22% and 31% respectively. These results are similar to the overall %RMSE achieved for the 32 and 16 selected links themselves of 19% and 33% respectively. This implies that the SELINK analysis results are reasonable for all sections of the state. Functional class and route specific volume analysis is possible by using the available 154 classification count check points. The truck traffic crossing the Interstate highways (ISH) with 37 check points, the US highways (USH) with 50 check points, and the State highways (STH) with 67 check points is compared to the actual ground count totals. The magnitude of the overall link volume to ground count ratio by route does not show any specific pattern of over- or underestimation. However, the %RMSE for the ISH shows the lowest value while that for the STH shows the largest value. This pattern is consistent with the screenline analysis and the overall relationship between %RMSE and ground count volume groups.
Area specific volume analysis provides another broad statewide measure of the performance of the overall model. The truck traffic in the North area with 26 check points, the West area with 36 check points, the East area with 29 check points, and the South area with 64 check points is compared to the actual ground count totals. The four areas show similar results. No specific patterns in the LV/GC ratio by area are found. In addition, the %RMSE is computed for each of the four areas. The %RMSEs for the North, West, East, and South areas are 92%, 49%, 27%, and 35% respectively, whereas the average ground counts are 481, 1383, 1532, and 3154 respectively. As in the screenline and volume range analyses, the %RMSE is inversely related to average link volume. The SELINK adjustments of productions and attractions resulted in a very substantial reduction in the total in-state zonal productions and attractions. The initial in-state zonal trip generation model can now be revised with a new trip production trip rate (total adjusted productions/total population) and a new trip attraction trip rate. Revised zonal production and attraction adjustment factors can then be developed that only reflect the impact of the SELINK adjustments that cause increases or decreases from the revised zonal estimates of productions and attractions. Analysis of the revised production adjustment factors is conducted by plotting the factors on the state map. The east area of the state, including the counties of Brown, Outagamie, Shawano, Winnebago, Fond du Lac, and Marathon, shows comparatively large values of the revised adjustment factors. Overall, both small and large values of the revised adjustment factors are scattered around Wisconsin. This suggests that more independent variables beyond just population are needed for the development of the heavy truck trip generation model. More independent variables, including zonal employment data (office employees and manufacturing employees) by industry type, zonal private trucks owned, and zonal income data, which are not currently available, should be considered. A plot of the frequency distribution of the in-state zones as a function of the revised production and attraction adjustment factors shows the overall adjustment resulting from the SELINK analysis process. Overall, the revised SELINK adjustments show that the productions for many zones are reduced by a factor of 0.5 to 0.8, while the productions for a relatively few zones are increased by factors from 1.1 to 4, with most of the factors in the 3.0 range. No obvious explanation for the frequency distribution could be found. The revised SELINK adjustments overall appear to be reasonable. The heavy truck VMT analysis is conducted by comparing the 1990 heavy truck VMT forecasted by the GM truck forecasting model, 2.975 billion, with the WisDOT computed data. This gives an estimate that is 18.3% less than the WisDOT computation of 3.642 billion VMT. The WisDOT estimates are based on sampling the link volumes for USH, STH, and CTH. This implies potential error in sampling the average link volume. The WisDOT estimate of heavy truck VMT cannot be tabulated by the three trip types, I-I, I-E (E-I), and E-E. In contrast, the GM forecasting model shows that the proportion of E-E VMT out of total VMT is 21.24%. In addition, tabulation of heavy truck VMT by route functional class shows that the proportion of truck traffic traversing the freeways and expressways is 76.5%.
Only 14.1% of total freeway truck traffic is I-I trips, while 80% of total collector truck traffic is I-I trips. This implies that freeways are traversed mainly by I-E and E-E truck traffic while collectors are used mainly by I-I truck traffic. Other tabulations, such as average heavy truck speed by trip type, average travel distance by trip type, and the VMT distribution by trip type, route functional class and travel speed, are useful information for highway planners to understand the characteristics of statewide heavy truck trip patterns. Heavy truck volumes for the target year 2010 are forecasted by using the GM truck forecasting model. Four scenarios are used. For better forecasting, ground count-based segment adjustment factors are developed and applied. ISH 90 & 94 and USH 41 are used as example routes. The forecasting results using the ground count-based segment adjustment factors are satisfactory for long range planning purposes, but additional ground counts would be useful for USH 41. Sensitivity analysis provides estimates of the impacts of the alternative growth rates, including information about changes in the trip types using key routes. The network-based GM can easily model scenarios with different rates of growth in rural versus urban areas, small versus large cities, and in-state zones versus external stations.
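The core machinery described above is a doubly constrained gravity model with calibrated friction factors, followed by SELINK adjustment factors (ground count divided by assigned link volume) applied to the zones using each selected link. The sketch below shows both steps on a made-up three-zone example; the exponential friction-factor form, zone totals, and link volumes are illustrative assumptions, not the study's calibrated values.

```python
# Toy gravity-model distribution plus a SELINK-style link adjustment factor.
import numpy as np

def gravity_model(productions, attractions, friction, n_iter=50):
    """Doubly constrained gravity model via iterative proportional fitting."""
    trips = np.outer(productions, attractions) * friction          # seed trip table
    for _ in range(n_iter):
        trips *= (productions / trips.sum(axis=1))[:, None]        # match production totals
        trips *= (attractions / trips.sum(axis=0))[None, :]        # match attraction totals
    return trips

P = np.array([100.0, 200.0, 150.0])           # zonal truck trip productions (hypothetical)
A = np.array([150.0, 120.0, 180.0])           # zonal truck trip attractions (hypothetical)
cost = np.array([[1.0, 4.0, 7.0],
                 [4.0, 1.0, 5.0],
                 [7.0, 5.0, 1.0]])             # zone-to-zone travel cost (hypothetical)
F = np.exp(-0.3 * cost)                        # assumed friction-factor curve
trips = gravity_model(P, A, F)

# SELINK-style adjustment: scale the productions/attractions of every zone whose
# trips use a selected link by ground count / assigned volume for that link.
assigned_volume, ground_count = 420.0, 460.0   # hypothetical link totals
link_adjustment_factor = ground_count / assigned_volume
```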

  • PDF

Evaluation of the Usefulness of Restricted Respiratory Period at the Time of Radiotherapy for Non-Small Cell Lung Cancer Patient (비소세포성 폐암 환자의 방사선 치료 시 제한 호흡 주기의 유용성 평가)

  • Park, So-Yeon;Ahn, Jong-Ho;Suh, Jung-Min;Kim, Yung-Il;Kim, Jin-Man;Choi, Byung-Ki;Pyo, Hong-Ryul;Song, Ki-Won
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.24 no.2
    • /
    • pp.123-135
    • /
    • 2012
  • Purpose: It is essential to minimize the movement of the tumor due to respiration during respiration-controlled radiotherapy of non-small cell lung cancer patients. Accordingly, this study aims to evaluate the usefulness of a restricted respiratory period by comparing and analyzing treatment plans that apply a free and a restricted respiratory period, respectively. Materials and Methods: Nine non-small cell lung cancer patients (tumors n=10) were trained from April to December 2011 in a 'signal monitored-breathing (guided-breathing)' method for both the 'free respiratory period' measured on the basis of each patient's regular breathing and a 'restricted respiratory period' that was intentionally shortened. A total of 10 CT images, one for each respiratory phase, were then acquired by performing 4D CT for treatment planning using RPM and a 4-dimensional computed tomography simulator. The gross tumor volume (GTV) and internal target volume (ITV) contoured by observer 1 and observer 2 were measured and compared on the CT image of each respiratory interval. Moreover, the amplitude of tumor movement was measured from the center of mass (COM) at the 0% phase, which is end-inspiration (EI), and at the 50% phase, which is end-exhalation (EE). In addition, both observers established treatment plans applying the two respiratory periods, and the mean dose to normal lung (MDTNL) was compared and analyzed through dose-volume histograms (DVH). The normal tissue complication probability (NTCP) of the normal lung volume was also compared using a dose-volume histogram analysis program (DVH analyzer v.1), and statistical analysis was performed for a quantitative evaluation of the measured data. Results: In the analysis of the treatment plans that applied the 'restricted respiratory period', the 3-dimensional movement of the tumor was reduced by 38.75% for observer 1 and by 41.10% for observer 2 in comparison to the 'free respiratory period'. For the volumes, the GTV was reduced by $14.96{\pm}9.44%$ for observer 1 and $19.86{\pm}10.62%$ for observer 2, while the ITV was reduced by $8.91{\pm}5.91%$ for observer 1 and $15.52{\pm}9.01%$ for observer 2. The MDTNL was reduced by $3.98{\pm}5.62%$ for observer 1 and $7.62{\pm}10.29%$ for observer 2, while the NTCP was reduced by $21.70{\pm}28.27%$ for observer 1 and $37.83{\pm}49.93%$ for observer 2. In addition, analysis of the correlation between the two observers' results showed a significant difference between the observers for the 'free respiratory period', but no significantly different reduction rates between the observers for the 'restricted respiratory period'. Conclusion: The usefulness and appropriateness of a 'restricted respiratory period' in respiration-controlled radiotherapy of non-small cell lung cancer patients could be verified, as the treatment plans that applied the 'restricted respiratory period' showed a relative reduction in the evaluation factors in comparison to the 'free respiratory period'.
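As a small, hedged illustration of two quantities compared above, the sketch below computes the 3-D COM excursion between the 0% (EI) and 50% (EE) phases and the percent reduction obtained with the restricted breathing period; the coordinates are invented, not patient data, and the same reduction-rate formula applies to GTV, ITV, MDTNL, and NTCP.

```python
# Illustrative COM excursion and reduction-rate calculation (hypothetical numbers).
import numpy as np

def com_excursion(com_ei, com_ee):
    """3-D amplitude of tumor motion between the 0% (EI) and 50% (EE) phases."""
    return float(np.linalg.norm(np.asarray(com_ei) - np.asarray(com_ee)))

def reduction_rate(free, restricted):
    """Percent reduction of a metric when the restricted period is used."""
    return 100.0 * (free - restricted) / free

free_amp = com_excursion([0.0, 0.0, 0.0], [2.1, 1.0, 7.5])    # mm, free breathing (made up)
restr_amp = com_excursion([0.0, 0.0, 0.0], [1.3, 0.6, 4.5])   # mm, restricted breathing (made up)
print(round(reduction_rate(free_amp, restr_amp), 1))          # ~39.9%, of the same order as 38.75-41.10%
```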

  • PDF

Corporate Bond Rating Using Various Multiclass Support Vector Machines (다양한 다분류 SVM을 적용한 기업채권평가)

  • Ahn, Hyun-Chul;Kim, Kyoung-Jae
    • Asia pacific journal of information systems
    • /
    • v.19 no.2
    • /
    • pp.157-178
    • /
    • 2009
  • Corporate credit rating is a very important factor in the market for corporate debt. Information concerning corporate operations is often disseminated to market participants through the changes in credit ratings published by professional rating agencies such as Standard and Poor's (S&P) and Moody's Investor Service. Since these agencies generally charge a large fee for the service, and the periodically provided ratings sometimes do not reflect the default risk of the company at the time, it may be advantageous for bond-market participants to be able to classify credit ratings before the agencies actually publish them. As a result, it is very important for companies (especially financial companies) to develop a proper credit rating model. From a technical perspective, credit rating constitutes a typical multiclass classification problem because rating agencies generally have ten or more rating categories. For example, S&P's ratings range from AAA for the highest-quality bonds to D for the lowest-quality bonds. The professional rating agencies emphasize the importance of analysts' subjective judgments in determining credit ratings. In practice, however, a mathematical model that uses the financial variables of companies plays an important role in determining credit ratings, since it is convenient to apply and cost-efficient. These financial variables include ratios that represent a company's leverage status, liquidity status, and profitability status. Several statistical and artificial intelligence (AI) techniques have been applied as tools for predicting credit ratings. Among them, artificial neural networks are the most prevalent in finance because of their broad applicability to many business problems and their preeminent ability to adapt. However, artificial neural networks also have many defects, including the difficulty of determining the values of the control parameters and the number of processing elements in each layer, as well as the risk of over-fitting. Of late, because of their robustness and high accuracy, support vector machines (SVMs) have become popular for problems requiring accurate prediction. An SVM's solution may be globally optimal because SVMs seek to minimize structural risk, whereas artificial neural network models may tend to find locally optimal solutions because they seek to minimize empirical risk. In addition, no parameters need to be tuned in SVMs, barring the upper bound for non-separable cases in linear SVMs. However, since SVMs were originally devised for binary classification, they are not intrinsically geared to multiclass classifications such as credit ratings. Thus, researchers have tried to extend the original SVM to multiclass classification, and a variety of techniques for extending standard SVMs to multiclass SVMs (MSVMs) have been proposed in the literature. Only a few types of MSVM, however, have been tested in prior studies that apply MSVMs to credit rating. In this study, we examined six different MSVM techniques: (1) One-Against-One, (2) One-Against-All, (3) DAGSVM, (4) ECOC, (5) the method of Weston and Watkins, and (6) the method of Crammer and Singer (an illustrative sketch of the first two strategies follows the abstract). In addition, we examined the prediction accuracy of some modified versions of conventional MSVM techniques. To find the most appropriate MSVM technique for corporate bond rating, we applied all of these techniques to a real-world case of credit rating in Korea.
The best-known application is corporate bond rating, which is the most frequently studied area of credit rating for specific debt issues or other financial obligations. For our study, the research data were collected from National Information and Credit Evaluation, Inc., a major bond-rating company in Korea. The data set comprises the bond ratings for the year 2002 and various financial variables for 1,295 companies from the manufacturing industry in Korea. We compared the results of these techniques with one another and with those of traditional methods for credit rating, such as multiple discriminant analysis (MDA), multinomial logistic regression (MLOGIT), and artificial neural networks (ANNs). As a result, we found that DAGSVM with an ordered list was the best approach for predicting bond ratings. In addition, we found that the modified version of the ECOC approach can yield higher prediction accuracy for cases showing clear patterns.
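For illustration only, the sketch below contrasts two of the six strategies named above, One-Against-One and One-Against-All, using scikit-learn wrappers around a binary SVM on synthetic 10-class data. The data generator, kernel, and parameters are stand-ins; this is not the authors' implementation or data set, nor the DAGSVM, ECOC, Weston-Watkins, or Crammer-Singer variants.

```python
# Illustrative comparison of One-Against-One vs. One-Against-All multiclass SVMs.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

# Hypothetical stand-in for the 1,295-company financial-ratio data (10 rating classes)
X, y = make_classification(n_samples=1295, n_features=20, n_informative=10,
                           n_classes=10, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("One-Against-One", OneVsOneClassifier(SVC(kernel="rbf", C=1.0))),
                  ("One-Against-All", OneVsRestClassifier(SVC(kernel="rbf", C=1.0)))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", round(clf.score(X_te, y_te), 3))
```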

Analysis of media trends related to spent nuclear fuel treatment technology using text mining techniques (텍스트마이닝 기법을 활용한 사용후핵연료 건식처리기술 관련 언론 동향 분석)

  • Jeong, Ji-Song;Kim, Ho-Dong
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.2
    • /
    • pp.33-54
    • /
    • 2021
  • With the fourth industrial revolution and the arrival of the New Normal era due to COVID-19, the importance of non-contact technologies such as artificial intelligence and big data research has been increasing. Convergence research is being conducted in earnest to keep up with these trends, but few studies in the nuclear field have used artificial intelligence and big data-related technologies such as natural language processing and text mining. This study was conducted to confirm the applicability of data science analysis techniques to the field of nuclear research. Furthermore, identifying trends in the public perception of spent nuclear fuel is critical, as it makes it possible to set directions for nuclear industry policies and respond in advance to changes in industrial policy. For these reasons, this study conducted a media trend analysis of pyroprocessing, a spent nuclear fuel treatment technology. We objectively analyze changes in media perception of spent nuclear fuel dry treatment technology by applying text mining techniques. Text data from Naver web news articles containing the keywords "Pyroprocessing" and "Sodium Cooled Reactor" were collected with Python code to identify changes in perception over time. The analysis period was set from 2007, when the first article was published, to 2020, and a detailed, multi-layered analysis of the text data was carried out using methods such as word cloud generation based on frequency analysis, TF-IDF, and degree centrality calculation (a minimal sketch of this pipeline follows the abstract). Keyword frequency analysis showed that media perception of spent nuclear fuel dry treatment technology changed in the mid-2010s, influenced by the Gyeongju earthquake in 2016 and the implementation of the new government's energy transition policy in 2017. Therefore, trend analysis was conducted for the corresponding periods, and word frequencies, TF-IDF, degree centrality values, and semantic network graphs were derived. The analysis shows that before the mid-2010s, media perception of spent nuclear fuel dry treatment technology was diplomatic and positive. Over time, however, the frequency of keywords such as "safety", "reexamination", "disposal", and "disassembly" increased, indicating that the sustainability of spent nuclear fuel dry treatment technology is being seriously questioned. It was confirmed that social awareness also changed as spent nuclear fuel dry treatment technology, once regarded as a political and diplomatic technology, became ambiguous due to changes in domestic policy. This means that domestic policy changes such as nuclear power policy have a greater impact on media perception than issues of spent nuclear fuel processing technology itself, presumably because nuclear policy is a more widely discussed and public-friendly topic than spent nuclear fuel. Therefore, to improve social awareness of spent nuclear fuel processing technology, sufficient information should be provided, and linking it to nuclear policy issues would also be helpful. In addition, the study highlights the importance of social science research in nuclear power: the social sciences should be applied widely to nuclear engineering, and, considering national policy changes, this could help keep the nuclear industry sustainable.
However, this study is limited in that it applied big data analysis methods only to a narrow research area, namely "Pyroprocessing," a spent nuclear fuel dry processing technology. Furthermore, there was no firm basis for establishing the causes of the changes in social perception, and only news articles were analyzed to gauge that perception. If future work also considers reader comments, more reliable results are expected to be produced and used efficiently in nuclear policy research when media trend analyses of nuclear power are conducted. The academic significance of this study is that it confirmed the applicability of data science analysis techniques in the field of nuclear research. Furthermore, as current government energy policies such as nuclear power plant reductions prompt a re-evaluation of spent fuel treatment technology research, analysis of the key keywords in the field can help orient future research. It is important to consider outside perspectives, not just the safety technology and engineering integrity of nuclear power, and to reconsider whether it is appropriate to discuss nuclear engineering technology only internally. In addition, if multidisciplinary research on nuclear power is carried out, reasonable alternatives can be prepared to sustain the nuclear industry.
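The pipeline described above (keyword frequency, TF-IDF, and degree centrality on a co-occurrence network) can be illustrated in a few lines of Python; the snippets and the article-level co-occurrence window below are assumptions, not the Naver news corpus or the study's exact preprocessing.

```python
# Toy frequency / TF-IDF / degree-centrality pipeline (illustrative only).
import itertools
from collections import Counter
import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer

articles = [
    "pyroprocessing spent nuclear fuel safety reexamination",
    "sodium cooled reactor pyroprocessing research policy",
    "spent nuclear fuel disposal policy safety",
]

freq = Counter(w for doc in articles for w in doc.split())   # keyword frequency analysis

vec = TfidfVectorizer()
tfidf = vec.fit_transform(articles)                          # TF-IDF weights per article
terms = vec.get_feature_names_out()
top = tfidf[0].toarray().ravel().argsort()[::-1][:3]
print("top TF-IDF terms, article 0:", [terms[i] for i in top])

# Degree centrality on a word co-occurrence network (words co-occurring in the same article)
G = nx.Graph()
for doc in articles:
    G.add_edges_from(itertools.combinations(sorted(set(doc.split())), 2))
print(sorted(nx.degree_centrality(G).items(), key=lambda kv: -kv[1])[:5])
```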

Brief Introduction of Research Progresses in Control and Biocontrol of Clubroot Disease in China

  • He, Yueqiu;Wu, Yixin;He, Pengfei;Li, Xinyu
    • 한국균학회소식:학술대회논문집
    • /
    • 2015.05a
    • /
    • pp.45-46
    • /
    • 2015
  • Clubroot disease of crucifers has occurred in China since 1957. It has spread throughout the country, especially in the southwest and northeast, where it causes 30-80% losses in some fields. The disease has been expanding in recent years as seeds are imported and the floating seedling system is practiced. For its effective control, the Ministry of Agriculture of China set up a program in 2010 and a research team led by Dr. Yueqiu He, Yunnan Agricultural University. The team includes 20 main researchers from 11 universities and 5 institutes. After 5 years, the team has made considerable progress in disease occurrence regulation, resource collection, resistance identification and breeding, biological agent exploration, formulation, chemical evaluation, and control strategy. About 1,200 collections of local and commercial crucifers were screened in the field and by artificial inoculation in the laboratory, and 10 resistant cultivars were bred, including 7 Chinese cabbages and 3 cabbages. More than 800 antagonistic strains were isolated, including bacteria, streptomycetes, and fungi. Around 100 chemicals were evaluated in the field and greenhouse based on their control effect; among them, 6 showed a high control effect. In particular, fluazinam and cyazofamid controlled about 80% of the disease, although fluazinam has a negative effect on soil microbes. Clubroot disease cannot be controlled by bioagents or chemicals once the pathogen Plasmodiophora brassicae has infected its host and established the parasitic relationship. We found that the earlier the pathogen infected its host, the more severe the disease was; therefore, early control was the most effective. For Chinese cabbage, all control measures should be taken within the first 30 days, because new infections do not cause severe symptoms after 30 days from seeding. For example, the biocontrol agent Bacillus subtilis strain XF-1 controlled the disease by 70%-85% on average when it was mixed with the seedling substrate and drenched 3 times after transplanting, i.e., immediately and at 7 and 14 days. XF-1 has been studied in depth with respect to its control mechanisms, its genome, and the development and application of biocontrol formulations. It produces antagonistic proteins, enzymes, antibiotics, and IAA, which promote rhizogenesis and growth. Its genome was sequenced on an Illumina/Solexa Genome Analyzer and assembled into 20 scaffolds; the gaps between scaffolds were then filled by long-fragment PCR amplification to obtain a complete genome of 4,061,186 bp. The whole genome has 43.8% GC content, 108 tandem repeats with an average of 2.65 copies, and 84 transposons. A total of 3,853 CDSs were predicted, of which 112 were assigned to secondary metabolite biosynthesis, transport, and catabolism. Among these, five giant NRPS/PKS gene clusters responsible for the biosynthesis of polyketide (pksABCDEFHJLMNRS, 72.9 kb), surfactin (srfABCD, 26.148 kb), bacilysin (bacABCDE, 5.903 kb), bacillibactin (dhbABCEF, 11.774 kb), and fengycin (ppsABCDE, 37.799 kb) show high homology to function-confirmed biosynthesis genes in other strains. Moreover, many key regulatory genes for secondary metabolites in XF-1, such as comABPQKXZ, degQ, sfp, yczE, degU, ycxABCD, and ywfG, were also predicted. Therefore, XF-1 has the potential to biosynthesize the secondary metabolites surfactin, fengycin, bacillibactin, bacilysin, and bacillaene.
Thirty-two compounds were detected in cell extracts of XF-1 by MALDI-TOF-MS, including one macrolactin (m/z 441.06), two fusaricidins (m/z 850.493 and 968.515), one circulocin (m/z 852.509), nine surfactins (m/z 1044.656~1102.652), five iturins (m/z 1096.631~1150.57), and forty fengycins (m/z 1449.79~1543.805). The top three composition types (accounting for 56.67% of the total extract) are surfactin, iturin, and fengycin; the most abundant is the surfactin type at 30.37% of the total extract, followed by fengycin at 23.28% with a rich diversity of chemical structures, while the smallest is iturin at 3.02%. Moreover, the same main compositions were detected in Bacillus sp. 355, which is also an effective biocontrol bacterium against clubroot of crucifers. Therefore, surfactin, iturin, and fengycin may be the main active compositions of XF-1 against P. brassicae. Twenty-one fengycin-type compounds with antifungal activity were evaluated by LC-ESI-MS/MS, including fengycin A $C_{16{\sim}19}$, fengycin B $C_{14{\sim}17}$, fengycin C $C_{15{\sim}18}$, fengycin D $C_{15{\sim}18}$, and fengycin S $C_{15{\sim}18}$. Furthermore, one novel compound was identified as dehydroxyfengycin $C_{17}$ according to its MS and 1D and 2D NMR spectral data; its molecular weight is 1488.8480 Da and its formula is $C_{75}H_{116}N_{12}O_{19}$. The fengycin-type compounds (FTCPs, $250{\mu}g/mL$) were used to treat the resting spores of P. brassicae ($10^7/mL$), and leakage of cytoplasm components and cell destruction were monitored. After 12 h of treatment, the absorbances at 260 nm (A260) and 280 nm (A280) increased gradually toward their maxima, accompanied by the collapse of P. brassicae resting spores, and nearly no intact cells were observed after 24 h of treatment. The results suggest that the cells can be lysed by the FTCPs of XF-1, and that the diversity of FTCPs is a main contributor to the mechanism of clubroot disease biocontrol. Among the five media tested (MOLP, PSA, LB, Landy, and LD), the most suitable medium for growth of the strain is MOLP, and the least suitable for strain longevity is the Landy sucrose medium; however, the highest lipopeptide yield is obtained in the Landy sucrose medium. The lipopeptides produced in the five media were analyzed by HPLC, and the results showed that the lipopeptide components were the same, while their contents from B. subtilis XF-1 fermented in the five media differed. We found that it is the lipopeptide content, rather than the composition, that is affected by the medium, and nutrient limitation seems to promote lipopeptide secretion by XF-1. The volatile components with inhibitory activity against the fungus Cylindrocarpon spp., collected in sealed vessels, were detected by HS-SPME-GC-MS in eight biocontrol Bacillus species and four positive mutant strains of XF-1 mutagenized with chemical mutagens. They share the same main volatile components, including pyrazines, aldehydes, oxazolidinone, and sulfides, which together account for 91.62% in XF-1; the most abundant is the pyrazine type at 47.03%, followed by aldehydes at 23.84% and oxazolidinone at 15.68%, while the smallest is sulfide at 5.07%.

  • PDF

Study on PM10, PM2.5 Reduction Effects and Measurement Method of Vegetation Bio-Filters System in Multi-Use Facility (다중이용시설 내 식생바이오필터 시스템의 PM10, PM2.5 저감효과 및 측정방법에 대한 연구)

  • Kim, Tae-Han;Choi, Boo-Hun
    • Journal of the Korean Institute of Landscape Architecture
    • /
    • v.48 no.5
    • /
    • pp.80-88
    • /
    • 2020
  • With the issuance of week-long emergency fine dust reduction measures in March 2019, public anxiety about fine dust has been growing. In order to assess the application of air-purifying plant-based bio-filters to public facilities, this study presented a method for measuring pollutant reduction effects by creating an indoor environment with a continuous discharge of particulate pollutants, and conducted a basic study to verify whether indoor air quality improved through the system. In this study, conducted in a lecture room in spring, the background concentration was created by burning mosquito repellent incense as a pollutant for one hour before monitoring. Then, according to the schedule, the fine dust reduction capacity was monitored by irrigating for two hours and venting air for one hour. PM10, PM2.5, and temperature and humidity sensors were installed two meters in front of the bio-filter, and velocity probes were installed at the center of the three air vents for time-series monitoring. The average face velocity of the three air vents in the bio-filter was 0.38±0.16 m/s. The total air-conditioning air volume was calculated at 776.89±320.16 ㎥/h by applying an air vent area of 0.29 m × 0.65 m after deducting the damper area (this calculation is sketched below). With the system in operation, the average temperature and average relative humidity were maintained at 21.5-22.3℃ and 63.79-73.6%, respectively, which satisfies the temperature and humidity ranges reported under various conditions in preceding studies. If the effect of rapidly raising relative humidity by operating the system's air-conditioning function is used efficiently, it should be possible to reduce indoor fine dust while maintaining seasonally appropriate relative humidity. The fine dust concentration increased in the same way in all cycles before the bio-filter system was operated. After the system was operated, in the cycle 1 blast section (C-1, β=-3.83, β=-2.45), particulate matter (PM10) was lowered by up to 28.8%, or 560.3 ㎍/㎥, and fine particulate matter (PM2.5) was reduced by up to 28.0%, or 350.0 ㎍/㎥. The fine dust concentration (PM10, PM2.5) was then reduced by up to 32.6% (647.0 ㎍/㎥) and 32.4% (401.3 ㎍/㎥), respectively, in the cycle 2 blast section (C-2, β=-5.50, β=-3.30), and by up to 30.8% (732.7 ㎍/㎥) and 31.0% (459.3 ㎍/㎥), respectively, in the cycle 3 blast section (C-3, β=5.48, β=-3.51). By referring to the standards and regulations related to the installation of vegetation bio-filters in public facilities, this study proposed how to set up an objective performance evaluation environment. In doing so, it was possible to create a monitoring environment more objective than a regular lecture room and to secure relatively reliable data.
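The air-volume figure quoted above follows from the mean face velocity, the effective vent area, and the three vents; a quick check using only the values stated in the abstract is sketched below.

```python
# Quick check of the reported ventilation volume from the stated measurements.
face_velocity = 0.38           # m/s, average face velocity over the three vents
vent_area = 0.29 * 0.65        # m^2 per vent, effective area after deducting the damper
n_vents = 3

air_volume_m3_per_h = face_velocity * vent_area * n_vents * 3600
print(round(air_volume_m3_per_h, 1))   # ~773.6 m^3/h, consistent with the reported 776.89 m^3/h
```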

Export Control System based on Case Based Reasoning: Design and Evaluation (사례 기반 지능형 수출통제 시스템 : 설계와 평가)

  • Hong, Woneui;Kim, Uihyun;Cho, Sinhee;Kim, Sansung;Yi, Mun Yong;Shin, Donghoon
    • Journal of Intelligence and Information Systems
    • /
    • v.20 no.3
    • /
    • pp.109-131
    • /
    • 2014
  • As the demand for nuclear power plant equipment continues to grow worldwide, the importance of handling nuclear strategic materials is also increasing. While the number of cases submitted for the export of nuclear-power commodities and technologies is increasing dramatically, preadjudication (or, simply, prescreening) of strategic materials has so far been done by experts with long experience and extensive field knowledge. However, there is a severe shortage of experts in this domain, not to mention that it takes a long time to develop an expert. Because human experts must manually evaluate all the documents submitted for export permission, the current practice of nuclear material export control is neither time-efficient nor cost-effective. To alleviate the problem of relying only on costly human experts, our research proposes a new system designed to help field experts make their decisions more effectively and efficiently. The proposed system is built upon case-based reasoning, which in essence extracts key features from existing cases, compares these features with those of a new case, and derives a solution for the new case by referencing similar cases and their solutions. Our research proposes a framework for a case-based reasoning system, designs a case-based reasoning system for the control of nuclear material exports, and evaluates the performance of alternative keyword extraction methods (fully automatic, fully manual, and semi-automatic). A keyword extraction method is an essential component of the case-based reasoning system, as it is used to extract the key features of the cases. The fully automatic method used TF-IDF, a widely used de facto standard method for representative keyword extraction in text mining. TF (Term Frequency) is based on the frequency of the term within a document, showing how important the term is within that document, while IDF (Inverse Document Frequency) is based on the infrequency of the term within the document set, showing how uniquely the term represents the document. The results show that the semi-automatic approach, based on collaboration between machine and human, is the most effective solution regardless of whether the human is a field expert or a student majoring in nuclear engineering. Moreover, we propose a new approach for computing nuclear document similarity along with a new framework of document analysis. The proposed algorithm considers both document-to-document similarity (${\alpha}$) and document-to-nuclear-system similarity (${\beta}$) in order to derive a final score (${\gamma}$) for deciding whether the presented case concerns strategic material or not (a toy sketch of this scoring follows the abstract). The final score (${\gamma}$) represents the document similarity between the past cases and the new case. The score is derived not only by exploiting conventional TF-IDF but also by utilizing a nuclear system similarity score, which takes the context of the nuclear system domain into account. Finally, the system retrieves the top-3 documents stored in the case base that are considered the most similar to the new case and provides them together with their degree of credibility. With this final score and the credibility score, it becomes easier for a user to see which documents in the case base are most worth looking up, so that the user can make a proper decision at relatively lower cost. The evaluation of the system was conducted by developing a prototype and testing it with field data.
The system workflows and outcomes were verified by the field experts. This research is expected to contribute to the growth of the knowledge service industry by proposing a new system that can effectively reduce the burden of relying on costly human experts for the export control of nuclear materials and that can be considered a meaningful example of a knowledge service application.
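To make the scoring idea concrete, the sketch below combines a TF-IDF document-to-document similarity (α) with a separate domain similarity (β) into a final score (γ) and retrieves the top-3 past cases. The case texts, the toy β values, and the linear weight are assumptions for illustration; the paper's actual combination rule and credibility score are not reproduced here.

```python
# Toy case retrieval: gamma = weighted blend of TF-IDF similarity (alpha) and an
# assumed domain (nuclear-system) similarity (beta); returns the top-3 past cases.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

past_cases = [
    "heat exchanger for reactor coolant system export request",
    "zirconium alloy cladding tube shipment documentation",
    "general-purpose industrial valve for water treatment plant",
    "neutron flux monitoring instrumentation export case",
]
new_case = "reactor coolant pump and heat exchanger components"

X = TfidfVectorizer().fit_transform(past_cases + [new_case])
alpha = cosine_similarity(X[-1], X[:-1]).ravel()       # document-to-document similarity

beta = np.array([0.9, 0.7, 0.1, 0.8])                  # assumed document-to-nuclear-system scores
w = 0.6                                                # assumed blending weight
gamma = w * alpha + (1 - w) * beta                     # final similarity score

for i in np.argsort(gamma)[::-1][:3]:                  # top-3 most similar past cases
    print(f"case {i}: gamma={gamma[i]:.3f} -> {past_cases[i]}")
```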