• Title/Summary/Keyword: Performance tools and methodology


ARISING TECHNICAL ISSUES IN THE DEVELOPMENT OF A TRANSPORTATION AND STORAGE SYSTEM OF SPENT NUCLEAR FUEL IN KOREA

  • Yoo, Jeong-Hyoun;Choi, Woo-Seok;Lee, Sang-Hoon;Seo, Ki-Seog
    • Nuclear Engineering and Technology
    • /
    • v.43 no.5
    • /
    • pp.413-420
    • /
    • 2011
  • In Korea, although the concept of a dry storage system for PWR spent fuel first emerged in the early 1990s, wet storage inside nuclear reactor buildings remains the dominant storage paradigm. Furthermore, as the amount of fuel discharged from nuclear power plants increases, the plants are confronted with the problem of meeting storage capacity demand. Various measures have been taken to resolve this problem, and dry storage systems, together with transportation of spent fuel either on-site or off-site, are regarded as the most feasible one. In order to develop dry storage and transportation systems, safety analyses, development of design techniques, full-scale performance tests, and research on key material degradation should be conducted. This paper deals with two topics: a structural analysis methodology for assessing cumulative damage to transportation packages, and the effects of an aircraft engine crash on a dual-purpose cask. These newly emerging issues are selected from among the many technical issues related to the development of transportation and storage systems for spent fuel. In the design process, appropriate analytical methods, procedures, and tools are used in conjunction with a suitably selected test procedure and assumptions, such as jet engine simulation, for postulated design events and a beyond-design-basis accident.

Cloud Attack Detection with Intelligent Rules

  • Pradeepthi, K.V;Kannan, A
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.9 no.10
    • /
    • pp.4204-4222
    • /
    • 2015
  • Cloud is the latest buzzword in the internet community among developers, consumers, and security researchers. There have been many attacks on the cloud in the recent past in which services were interrupted and consumer privacy was compromised. Denial of Service (DoS) attacks affect service availability for genuine users. Customers are paying to use the cloud, so enhancing the availability of services is a paramount task for the service provider, and in the presence of DoS attacks availability is reduced drastically. Such attacks must be detected and prevented as early as possible, and the power of computational approaches can be used to do so. In the literature, machine learning techniques have been used to detect the presence of attacks. In this paper, a novel approach is proposed in which intelligent rule-based feature selection and classification are performed for DoS attack detection in the cloud. The performance of the proposed system has been evaluated on an experimental cloud set-up with real-time DoS tools. It was observed that the proposed system achieved an accuracy of 98.46% on the experimental data of 10,000 instances with 10-fold cross-validation. By using this methodology, service providers will be able to offer a more secure cloud environment to their customers.
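
The abstract does not reproduce the intelligent rules themselves, so the sketch below only illustrates the general workflow it describes: feature selection followed by a rule-style classifier evaluated with 10-fold cross-validation. The scikit-learn pipeline, the synthetic traffic-feature matrix, and all parameter choices are assumptions for illustration, not the authors' implementation.

```python
# Hedged sketch: generic feature selection + rule-style (decision tree)
# classification with 10-fold cross-validation, not the paper's exact rules.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.pipeline import Pipeline
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Placeholder traffic-feature matrix: rows = flows, columns = features,
# labels 1 = DoS, 0 = benign (synthetic stand-in for the cloud testbed data).
X, y = make_classification(n_samples=10_000, n_features=20, random_state=0)

pipeline = Pipeline([
    ("select", SelectKBest(mutual_info_classif, k=10)),            # keep 10 most informative features
    ("clf", DecisionTreeClassifier(max_depth=8, random_state=0)),  # rule-like classifier
])

scores = cross_val_score(pipeline, X, y, cv=10)                    # 10-fold cross-validation
print(f"Mean accuracy over 10 folds: {scores.mean():.4f}")
```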

The Implementation of Total Quality Management in Controlling the Cost of Manufacturing

  • Seetharaman, A.;Raj, John Rudolph;Seetharaman, Saravanan Arumugam
    • Journal of Distribution Science
    • /
    • v.13 no.8
    • /
    • pp.27-40
    • /
    • 2015
  • Purpose - Total Quality Management (TQM) has received significant attention and interest from a large number of organizations around the world in various industries. These organizations have tried to embody TQM concepts in areas such as engineering and product design, marketing, R&D, procurement, production, personnel, and product inspection. Research design, data, and methodology - This study presents an overview of the fundamentals of TQM and an in-depth review of the obstacles to its successful implementation. Results - In order to control the cost of manufacturing, tracking the cost of quality (COQ) allows companies to capture the actual overall cost incurred in producing a unit of product or service. The study explores the reasons why companies track the COQ and ways to address it. Conclusions - Based on the results, COQ is one of the key performance indicators for making more accurate strategic decisions as well as a critical aspect of TQM. The study also presents a few popular quality improvement tools that have been widely used in organizations successfully implementing TQM.
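
The abstract does not state how the cost of quality is computed; a minimal sketch under the common prevention-appraisal-failure breakdown (an assumption, not necessarily the model used in the paper) shows how tracking COQ per unit reduces to simple arithmetic:

```python
# Hedged sketch of cost-of-quality (COQ) tracking using the common
# prevention-appraisal-failure breakdown; all category values are illustrative.
costs = {
    "prevention": 12_000.0,        # e.g., training, process design
    "appraisal": 8_000.0,          # e.g., inspection, testing
    "internal_failure": 15_000.0,  # e.g., scrap, rework
    "external_failure": 9_000.0,   # e.g., warranty claims, returns
}

units_produced = 10_000
total_coq = sum(costs.values())
coq_per_unit = total_coq / units_produced

print(f"Total COQ: {total_coq:,.2f}")
print(f"COQ per unit produced: {coq_per_unit:.2f}")
```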

Estimating floor spectra in multiple degree of freedom systems

  • Calvi, Paolo M.;Sullivan, Timothy J.
    • Earthquakes and Structures
    • /
    • v.7 no.1
    • /
    • pp.17-38
    • /
    • 2014
  • As the desire for high performance buildings increases, it is increasingly evident that engineers require reliable methods for the estimation of seismic demands on both structural and non-structural components. To this end, improved tools for the prediction of floor spectra would assist in the assessment of acceleration-sensitive non-structural and secondary components. Recently, a new procedure was successfully developed and tested for the simplified construction of floor spectra, at various levels of elastic damping, atop single-degree-of-freedom structures. This paper extends the methodology to multi-degree-of-freedom (MDOF) supporting systems responding in the elastic range, proposing a simplified modal combination approach for floor spectra over upper storeys and accounting for the limited filtering of the ground motion input that occurs over lower storeys. The procedure is tested numerically by comparing predictions with floor spectra obtained from time-history analyses of RC wall structures of 2 to 20 storeys in height. Results demonstrate that the method performs well for MDOF systems responding in the elastic range. Future research should further develop the approach to permit the prediction of floor spectra in MDOF systems that respond in the inelastic range.
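
The simplified modal combination rule itself is not given in the abstract. As a generic illustration only, and not the authors' procedure, the sketch below combines per-mode floor-spectrum contributions at one storey with a square-root-sum-of-squares (SRSS) rule; the participation factors, mode-shape ordinates, and spectral values are made-up placeholders.

```python
# Generic SRSS combination of per-mode floor-spectrum contributions at one
# storey; illustrative only, not the simplified rule proposed in the paper.
import numpy as np

periods = np.linspace(0.05, 2.0, 40)             # secondary-component periods (s)

# Hypothetical inputs for a 3-mode model at one floor level:
gamma = np.array([1.3, -0.5, 0.2])               # modal participation factors
phi_floor = np.array([0.9, 0.4, -0.2])           # mode-shape ordinates at the floor
# S_mode[i, k]: floor spectral acceleration from mode i alone at period k
S_mode = np.random.default_rng(0).uniform(0.5, 3.0, size=(3, periods.size))

contrib = (gamma * phi_floor)[:, None] * S_mode          # per-mode contributions
floor_spectrum = np.sqrt(np.sum(contrib**2, axis=0))     # SRSS over the modes

print(floor_spectrum[:5])
```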

A New Algorithm Based on ASH in Local Modes Detection of Pathrate (ASH를 이용한 Pathrate에서의 Local Mode 검출 알고리즘)

  • Huang, Yue;Kim, Yong-Soo
    • Journal of the Korea Society of Computer and Information
    • /
    • v.11 no.5 s.43
    • /
    • pp.1-8
    • /
    • 2006
  • Network measurement is a vital part of network traffic engineering. In a network, the metric 'capacity' characterizes the maximum throughput a path can provide when there is no traffic load, i.e., the minimum transmission rate among all links in the path. Pathrate is one of the most widely used network capacity measurement tools today. It is known for its accurate estimates and for not depending on the prevailing network traffic conditions. After several years of development, its performance has become more stable and reliable. The existing local-mode detection algorithm in Pathrate is based on a conventional histogram. This paper suggests a new algorithm for local-mode detection based on the ASH (Averaged Shifted Histogram). We have implemented this algorithm and show that it accomplishes the same task as the original one with better results.
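
A minimal sketch of the averaged shifted histogram idea the paper builds on, followed by a simple local-mode search, is shown below; the bin width, number of shifts, and sample data are illustrative choices, not the parameters used in Pathrate.

```python
# Hedged sketch of an Averaged Shifted Histogram (ASH) followed by a simple
# local-mode search; bin width and shift count are illustrative choices.
import numpy as np

def ash_density(samples, bin_width, n_shifts=10):
    """Average n_shifts histograms whose origins are shifted by bin_width/n_shifts."""
    lo, hi = samples.min() - bin_width, samples.max() + bin_width
    delta = bin_width / n_shifts
    grid = np.arange(lo, hi, delta)                  # fine grid of spacing `delta`
    density = np.zeros(grid.size)
    for s in range(n_shifts):
        edges = np.arange(lo - s * delta, hi + bin_width, bin_width)
        hist, _ = np.histogram(samples, bins=edges, density=True)
        # Assign to each fine-grid point the density of the coarse bin covering it.
        idx = np.clip(np.searchsorted(edges, grid, side="right") - 1, 0, hist.size - 1)
        density += hist[idx]
    return grid, density / n_shifts

def local_modes(grid, density):
    """Return grid points where the averaged density is a strict local maximum."""
    is_peak = (density[1:-1] > density[:-2]) & (density[1:-1] > density[2:])
    return grid[1:-1][is_peak]

# Example: capacity-rate measurements (Mbps) clustered around two modes.
rng = np.random.default_rng(1)
rates = np.concatenate([rng.normal(95, 2, 400), rng.normal(100, 1, 600)])
grid, dens = ash_density(rates, bin_width=1.0)
print(local_modes(grid, dens))
```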

Drug-Drug Interaction Prediction Using Krill Herd Algorithm Based on Deep Learning Method

  • Al-Marghilani, Abdulsamad
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.6
    • /
    • pp.319-328
    • /
    • 2021
  • Parallel administration of numerous drugs increases the risk of drug-drug interaction (DDI), because one drug might affect the activity of another. A DDI can have a negative or positive impact on therapeutic outcomes, so DDIs need to be discovered to enhance the safety of drug use. Although several DDI systems exist to predict interactions, it has become impossible to keep up with the rapidly growing volume of biomedical texts. Most existing DDI systems address the task as a classification problem and rely on handcrafted features, some of which depend on particular domain tools. The objective of this paper is to predict DDIs so that adverse effects caused by the consumed drugs can be avoided. To capture similarities among drugs, a drug pair similarity calculation is performed. The Krill Herd Algorithm (KHA) is used to find the best optimal weights, which are then used by an LSTM network to predict DDIs; our methodology therefore depends on LSTM-KHA for DDI detection. The experiments were conducted on three datasets, DS1 (CYP), DS2 (NCYP), and DS3, taken from the DrugBank database, and performance was evaluated in terms of accuracy, recall, precision, F-measure, AUPR, AUC, and AUROC. The experimental results show that the proposed method outperforms existing methods for predicting DDIs, and LSTM-KHA produces reasonable performance metrics compared to existing DDI prediction models.
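
The abstract does not specify which similarity measure underlies the drug pair similarity calculation; one common choice, shown here purely as an illustration, is the Jaccard (Tanimoto) similarity between binary drug feature profiles. The drug names and profiles below are hypothetical.

```python
# Hedged sketch: Jaccard (Tanimoto) similarity between binary drug feature
# profiles, one common way to compute drug-pair similarity; the abstract does
# not state which measure the authors actually use.
import numpy as np

def jaccard_similarity(profile_a, profile_b):
    """Similarity of two binary feature vectors (e.g., target/enzyme profiles)."""
    a, b = np.asarray(profile_a, bool), np.asarray(profile_b, bool)
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 0.0

# Hypothetical binary profiles for three drugs over six shared features.
drugs = {
    "drug_A": [1, 0, 1, 1, 0, 0],
    "drug_B": [1, 0, 1, 0, 0, 1],
    "drug_C": [0, 1, 0, 0, 1, 1],
}

names = list(drugs)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        print(x, y, round(jaccard_similarity(drugs[x], drugs[y]), 3))
```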

Knowledge graph-based knowledge map for efficient expression and inference of associated knowledge (연관지식의 효율적인 표현 및 추론이 가능한 지식그래프 기반 지식지도)

  • Yoo, Keedong
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.4
    • /
    • pp.49-71
    • /
    • 2021
  • Users who intend to utilize knowledge to actively solve given problems proceed by exploring, both across and in sequence, associated knowledge that is related according to certain criteria, such as content relevance. A knowledge map is a diagram or taxonomy that gives an overview of the knowledge currently managed in a knowledge base and supports users' exploration based on the relationships between knowledge items. A knowledge map must therefore be expressed in a networked form that links related knowledge by specific types of relationships, and should be implemented with technologies or tools specialized in defining and inferring such relationships. To this end, this study suggests a methodology for developing a knowledge graph-based knowledge map using a graph database (Graph DB), which is well suited to expressing and inferring relationships between the entities stored in a knowledge base. The procedures of the proposed methodology are modeling the graph data; creating nodes, properties, and relationships; and composing knowledge networks by combining the identified links between knowledge items. Among the various Graph DBs, Neo4j is used in this study for its credibility and applicability demonstrated across a wide range of application cases. To examine the validity of the proposed methodology, a knowledge graph-based knowledge map is implemented on the Graph DB, and a performance comparison is carried out by applying data from a previous study to check whether this study's knowledge map yields the same level of performance. The previous study built a process-based knowledge map using ontology technology, identifying links between related knowledge based on the sequences of tasks that produce knowledge or are activated by it. In other words, since a task is activated by knowledge as an input and also produces knowledge as an output, input and output knowledge are linked as a flow through the task; and since a business process is composed of affiliated tasks that fulfill the purpose of the process, the knowledge networks within a business process can be derived from the sequences of its tasks. Accordingly, using Neo4j, processes, tasks, and knowledge, together with the relationships among them, are defined as nodes and relationships so that knowledge links can be identified from the task sequences. The resulting knowledge network, obtained by aggregating the identified knowledge links, is a knowledge map with the functionality of a knowledge graph, so its performance needs to be tested against the previous study's validation results. The performance test examines two aspects, the correctness of knowledge links and the possibility of inferring new types of knowledge: the former is examined using seven questions, and the latter is checked by extracting two new types of knowledge. As a result, the knowledge map constructed through the proposed methodology showed the same level of performance as the previous one, and handled knowledge definition and knowledge relationship inference more efficiently.
Furthermore, compared with the previous study's ontology-based approach, this study's Graph DB-based approach also proved more beneficial in managing only the knowledge of interest, dynamically defining knowledge and relationships to reflect various meanings from situations to purposes, agilely inferring knowledge and relationships through Cypher queries, and easily creating new relationships by aggregating existing ones. The artifacts of this study can be applied to implement user-friendly knowledge exploration that reflects users' cognitive processes toward associated knowledge, and can further underpin the development of an intelligent knowledge base that expands autonomously through the inference-driven discovery of new knowledge and relationships. Beyond this, the study has an immediate effect on implementing the networked knowledge map that contemporary users need in order to find the right knowledge to use.
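
As a rough illustration of the kind of Graph DB modelling and Cypher-based inference the abstract describes, the sketch below uses the official Neo4j Python driver to link two knowledge items through a task and then infer a direct knowledge-to-knowledge relationship. The node labels, relationship types, example names, and connection details are assumptions, not the schema used in the study.

```python
# Hedged sketch of task-mediated knowledge links in Neo4j; labels, relationship
# types, example names, and credentials are illustrative assumptions.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # A task consumes one knowledge item as input and produces another as output.
    session.run(
        """
        MERGE (k1:Knowledge {name: $k_in})
        MERGE (k2:Knowledge {name: $k_out})
        MERGE (t:Task {name: $task})
        MERGE (k1)-[:INPUT_TO]->(t)
        MERGE (t)-[:PRODUCES]->(k2)
        """,
        k_in="market survey report", k_out="product requirements", task="define requirements",
    )

    # Infer a direct knowledge-to-knowledge link by following the task sequence.
    result = session.run(
        """
        MATCH (a:Knowledge)-[:INPUT_TO]->(:Task)-[:PRODUCES]->(b:Knowledge)
        RETURN a.name AS source, b.name AS target
        """
    )
    for record in result:
        print(record["source"], "->", record["target"])

driver.close()
```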

Contribution of thermal-hydraulic validation tests to the standard design approval of SMART

  • Park, Hyun-Sik;Kwon, Tae-Soon;Moon, Sang-Ki;Cho, Seok;Euh, Dong-Jin;Yi, Sung-Jae
    • Nuclear Engineering and Technology
    • /
    • v.49 no.7
    • /
    • pp.1537-1546
    • /
    • 2017
  • Many thermal-hydraulic tests have been conducted at the Korea Atomic Energy Research Institute for verification of the SMART (System-integrated Modular Advanced ReacTor) design, the standard design approval of which was issued by the Korean regulatory body. In this paper, the contributions of these tests to the standard design approval of SMART are discussed. First, an integral effect test facility named VISTA-ITL (Experimental Verification by Integral Simulation of Transients and Accidents-Integral Test Loop) has been utilized to assess the TASS/SMR-S (Transient and Set-point Simulation/Small and Medium) safety analysis code and confirm its conservatism, to support standard design approval, and to construct a database for the SMART design optimization. In addition, many separate effect tests have been performed. The reactor internal flow test has been conducted using the SCOP (SMART COre flow distribution and Pressure drop test) facility to evaluate the reactor internal flow and pressure distributions. An ECC (Emergency Core Coolant) performance test has been carried out using the SWAT (SMART ECC Water Asymmetric Two-phase choking test) facility to evaluate the safety injection performance and to validate the thermal-hydraulic model used in the safety analysis code. The Freon CHF (Critical Heat Flux) test has been performed using the FTHEL (Freon Thermal Hydraulic Experimental Loop) facility to construct a database from the 5 × 5 rod bundle Freon CHF tests and to evaluate the DNBR (Departure from Nucleate Boiling Ratio) model in the safety analysis and core design codes. These test results were used for standard design approval of SMART to verify its design bases, design tools, and analysis methodology.
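
As a reminder of the quantity the CHF tests feed into (a definition-level sketch with made-up numbers, not the TASS/SMR-S or core design code implementation), the DNBR at a point is the ratio of the predicted critical heat flux to the local heat flux, and the design check is on its minimum value along the hot channel:

```python
# Illustrative DNBR calculation: ratio of predicted critical heat flux to the
# local heat flux, checked against a design limit; all values are made up.
import numpy as np

q_chf_predicted = np.array([3.2, 3.0, 2.8, 2.9])   # MW/m^2, from a CHF correlation
q_local = np.array([1.1, 1.5, 1.9, 1.6])           # MW/m^2, along the hot channel

dnbr = q_chf_predicted / q_local
min_dnbr = dnbr.min()
design_limit = 1.3                                  # illustrative limit, not SMART's

print(f"Minimum DNBR = {min_dnbr:.2f}, "
      f"{'meets' if min_dnbr >= design_limit else 'violates'} the {design_limit} limit")
```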

Analysis of the Satisfaction with Computer Based Test Program and Test Environment in Medical School (의과대학의 컴퓨터기반시험 프로그램 및 시험환경 만족도 분석)

  • Kim, Soon Gu;Lee, Aehwa;Hwang, Ilseon
    • Korean Medical Education Review
    • /
    • v.22 no.3
    • /
    • pp.198-206
    • /
    • 2020
  • This study aimed to identify needed improvements to current evaluation methods in medical school computer-based test (CBT) programs and test environments. To that end, an analysis of importance and satisfaction was conducted through a survey of 3rd- and 4th-year medical students who had sufficient experience with CBT programs. An importance-performance analysis methodology using the correlation coefficient was applied to assess average satisfaction and importance. The first quadrant (keep up the good work) contained the review and time management factors among the conveniences of the CBT program, together with test facility factors; the second quadrant (concentrate here) contained convenience factors of the CBT program and the computer monitor and chair factors within the test facilities; the third quadrant (low priority) contained the cheating and computer failure factors; and the fourth quadrant (possible overkill) contained the location, spacing, and temperature factors of the test facilities. Improvements are needed to reduce eye fatigue and help students focus on and understand the questions in the CBT programs. It is also necessary to improve computer monitors, desks, and chairs with the test takers' body types in mind, to assign a manager to cope with computer breakdowns and peripheral failures, and to provide spare computers. These findings are meaningful in that they identify the factors of the CBT program and test environment that require improvement as assessment tools change.
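
The study derives importance using the correlation coefficient; the sketch below shows only the generic quadrant-assignment step of importance-performance analysis, with illustrative item names and scores rather than the survey data.

```python
# Hedged sketch of importance-performance analysis (IPA): items are placed in
# quadrants by comparing each item's mean importance and satisfaction with the
# overall means; item names and scores are illustrative, not the survey data.
import numpy as np

items = ["review function", "time management", "monitor", "chair", "room temperature"]
importance = np.array([4.5, 4.3, 4.4, 4.2, 3.2])     # mean importance per item
satisfaction = np.array([4.2, 4.1, 3.1, 3.0, 4.0])   # mean satisfaction per item

imp_cut, sat_cut = importance.mean(), satisfaction.mean()

def quadrant(imp, sat):
    """Classify one item by its position relative to the grand means."""
    if imp >= imp_cut and sat >= sat_cut:
        return "keep up the good work"
    if imp >= imp_cut and sat < sat_cut:
        return "concentrate here"
    if imp < imp_cut and sat < sat_cut:
        return "low priority"
    return "possible overkill"

for name, imp, sat in zip(items, importance, satisfaction):
    print(f"{name}: {quadrant(imp, sat)}")
```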

Development of an Intelligent Ultrasonic Signature Classification Software for Discrimination of Flaws in Weldments (용접 결함 종류 판별을 위한 지능형 초음파 신호 분류 소프트웨어의 개발)

  • Kim, H.J.;Song, S.J.;Jeong, H.D.
    • Journal of the Korean Society for Nondestructive Testing
    • /
    • v.17 no.4
    • /
    • pp.248-261
    • /
    • 1997
  • Ultrasonic pattern recognition is the most effective approach to the problem of discriminating types of flaws in weldments based on ultrasonic flaw signals. In spite of significant progress in research on this methodology, it has not been widely used in practical ultrasonic inspections of weldments in industry. Hence, for convenient application of this approach in practical situations, we develop intelligent ultrasonic signature classification software that can discriminate types of flaws in weldments from their ultrasonic signals using artificial intelligence tools such as neural networks. The software shows excellent performance in an experimental problem in which flaws in weldments are classified into two categories, cracks and non-cracks. This performance demonstrates the strong potential of the software as a practical tool for ultrasonic flaw classification in weldments.
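
A minimal sketch of a neural-network crack/non-crack classifier of the kind the abstract describes is given below; the synthetic feature vectors, network size, and training settings are illustrative assumptions, not the configuration of the developed software.

```python
# Hedged sketch of a neural-network crack/non-crack classifier on ultrasonic
# signal features; the data and network settings are illustrative only.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Placeholder feature vectors extracted from flaw echoes (e.g., rise time,
# pulse duration, frequency-domain descriptors), labelled crack / non-crack.
X, y = make_classification(n_samples=400, n_features=12, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print(f"Test accuracy: {clf.score(X_test, y_test):.3f}")
```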
