• Title/Summary/Keyword: multiple corresponding analysis


On the Study of System Reliability Analysis of Tension Leg Platforms (TLP 해양구조물의 시스템 신뢰성 해석에 관한 연구)

  • Lee, Joo-Sung
    • Bulletin of the Society of Naval Architects of Korea
    • /
    • v.27 no.2
    • /
    • pp.55-62
    • /
    • 1990
  • In this paper, another method for system reliability analysis, called the extended incremental load method, is introduced. The method extends the conventional incremental load method and was developed to evaluate the probability of system failure (or the system reliability) of continuous structures, such as floating offshore structures, under multiple loading conditions. It considers the post-ultimate behaviour of failed components more realistically and directly uses the strength formulae of the principal components of a structure, employing the modified safety margin equation proposed herein in the system analysis. The method has been applied to the Hutton TLP operated in the Hutton field in the North Sea and to a variant of that design, using the TLP Rule Case Committee type improved strength models. System failure probabilities and the corresponding system reliability indices are derived for a more economical and efficient design, and the redundancy characteristics are also addressed. The TLP forms are shown to possess high reserve strength and system safety.


Bayesian Model Selection for Linkage Analyses: Considering Collinear Predictors (연관분석을 위한 베이지안 모형 선택: 상호상관성 변수를 중심으로)

  • Suh, Young-Ju
    • The Korean Journal of Applied Statistics
    • /
    • v.18 no.3
    • /
    • pp.533-541
    • /
    • 2005
  • We identify the correct chromosome and locate the markers closest to the QTL in the linkage analysis of a quantitative trait by using the SSVS method. We consider several markers linked to the QTL, as well as to each other, so the i.b.d. values at these loci generate collinear predictors to be evaluated by the SSVS approach. When only markers closely linked to two QTL were considered simultaneously, the results showed clear evidence in favor of the marker closest to each QTL over the other markers. The results of the SSVS analysis of collinear markers showed high concordance with those obtained using traditional multiple regression. Based on this simulation study, we conclude that SSVS is quite useful for identifying linkage with multiple linked markers simultaneously for a complex quantitative trait.
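The SSVS idea above can be illustrated with a minimal spike-and-slab Gibbs sampler in the style of George and McCulloch. This is a generic sketch on simulated collinear predictors, not the authors' linkage model; the data, the known error sd, and the tuning constants (`tau0`, `tau1`, `p_incl`) are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated design: three correlated predictors (stand-ins for i.b.d. values
# at linked markers); only the first actually drives the trait y.
n = 200
x0 = rng.normal(size=n)
X = np.column_stack([x0,
                     0.7 * x0 + 0.7 * rng.normal(size=n),
                     0.7 * x0 + 0.7 * rng.normal(size=n)])
sigma = 0.5                                 # error sd, treated as known here
y = 2.0 * X[:, 0] + sigma * rng.normal(size=n)

# Spike-and-slab prior: beta_j ~ N(0, tau1^2) if gamma_j = 1 (included),
# beta_j ~ N(0, tau0^2) (the narrow "spike") otherwise.
tau0, tau1, p_incl = 0.1, 3.0, 0.5
p = X.shape[1]
gamma = np.ones(p, dtype=int)
XtX, Xty = X.T @ X, X.T @ y

def normal_pdf(v, s):
    return np.exp(-0.5 * (v / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

keep = []
for it in range(3000):
    # (1) beta | gamma, y : conjugate multivariate normal draw
    D_inv = np.diag(1.0 / np.where(gamma == 1, tau1, tau0) ** 2)
    V = np.linalg.inv(XtX / sigma**2 + D_inv)
    beta = rng.multivariate_normal(V @ Xty / sigma**2, V)
    # (2) gamma_j | beta_j : Bernoulli from the two mixture components
    slab = p_incl * normal_pdf(beta, tau1)
    spike = (1.0 - p_incl) * normal_pdf(beta, tau0)
    gamma = (rng.uniform(size=p) < slab / (slab + spike)).astype(int)
    if it >= 500:                           # discard burn-in
        keep.append(gamma.copy())

incl = np.mean(keep, axis=0)   # posterior inclusion frequency per predictor
print(incl)
```

With a strong true effect, the posterior inclusion frequency of the causal predictor goes to one while its collinear neighbours are visited only occasionally, which is the behaviour the abstract reports for the marker closest to the QTL.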

A Development of Sound Quality Index of an Intake and Exhaust System for High Quality Improvement of Luxury Vehicles (차량 고급감 향상을 위한 흡배기계 음질지수 개발)

  • Lee, Jong-Kyu;Cho, Teock-Hyeong;Seo, Dae-Won;Lim, Yun-Soo;Won, Kwang-Min
    • Transactions of the Korean Society for Noise and Vibration Engineering
    • /
    • v.22 no.3
    • /
    • pp.234-243
    • /
    • 2012
  • In this paper, sound quality indices for the evaluation of vehicle intake and exhaust noise were developed through a correlation analysis of objective measurement data and subjective evaluation data. First, intake and exhaust orifice noise were measured under the wide-open-throttle sweep condition. The acoustic transfer function between the intake orifice noise and the interior noise was then measured at the steady-state condition, and the acoustic transfer function for the exhaust system was measured in the same manner. In parallel, subjective evaluation was carried out by 27 engineers using the paired comparison and semantic differential methods. Next, a correlation analysis between the psycho-acoustic parameters derived from the measured data and the subjective evaluations was performed. The most critical factor was determined, and the corresponding sound quality index for the intake and exhaust noise was obtained by multiple-factor regression. Finally, the effectiveness of the proposed index was investigated.
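The final regression step can be sketched generically: fit a linear model from psycho-acoustic metrics to mean subjective ratings and use the fitted coefficients as the index. All numbers below are synthetic placeholders, not the paper's data, and the three metrics are just the usual psycho-acoustic candidates, not necessarily the factors the study selected.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for measured psycho-acoustic metrics of 12 sounds:
# loudness [sone], sharpness [acum], roughness [asper].
loudness = rng.uniform(10.0, 30.0, size=12)
sharpness = rng.uniform(1.0, 3.0, size=12)
roughness = rng.uniform(0.5, 2.0, size=12)
# Synthetic mean subjective ratings (10-point scale), generated from a known
# linear rule plus noise so the fit has something to recover.
ratings = (10.0 - 0.2 * loudness - 1.0 * sharpness - 1.5 * roughness
           + 0.1 * rng.normal(size=12))

# Multiple regression: rating ~ b0 + b1*loudness + b2*sharpness + b3*roughness
A = np.column_stack([np.ones(12), loudness, sharpness, roughness])
coef, *_ = np.linalg.lstsq(A, ratings, rcond=None)

def sq_index(l, s, r):
    """Predicted subjective rating for a new intake/exhaust measurement."""
    return float(coef @ np.array([1.0, l, s, r]))

fitted = A @ coef
```

`sq_index` is then the sound-quality index: a new measurement's metrics map to a predicted rating without another jury test.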

Seismic risk assessment of intake tower in Korea using updated fragility by Bayesian inference

  • Alam, Jahangir;Kim, Dookie;Choi, Byounghan
    • Structural Engineering and Mechanics
    • /
    • v.69 no.3
    • /
    • pp.317-326
    • /
    • 2019
  • This research aims to assess a tight seismic risk curve for the intake tower at the Geumgwang reservoir by considering the recorded historical earthquake data of the Korean Peninsula. The seismic fragility, a significant part of the risk assessment, is updated by Bayesian inference to account for uncertainties and to improve computational efficiency. The reservoir is one of the largest in Korea for the supply of agricultural water, and the intake tower controls the release of water from it, so the seismic risk assessment of the tower plays an important role in the risk management of the reservoir. The site-specific seismic hazard is computed from four different seismic source maps of Korea, and Probabilistic Seismic Hazard Analysis (PSHA) is used to estimate the annual exceedance rate of the hazard for the corresponding Peak Ground Acceleration (PGA). Hazard deaggregation is shown at two customary hazard levels. Multiple dynamic analyses and a nonlinear static pushover analysis are performed to derive the fragility parameters; Bayesian inference with Markov Chain Monte Carlo (MCMC) is then used to update these parameters by integrating the results of the analyses. The approach is shown to reduce the uncertainties associated with the fragility and risk curves and to improve statistical and computational efficiency significantly. The range of seismic risk curves of the intake tower is extracted for the reservoir site from the four source models and the updated fragility function, which can be used effectively for risk management and mitigation of the reservoir.
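The hazard-fragility convolution behind such a risk curve can be sketched numerically. The lognormal fragility form and the power-law hazard curve below are generic textbook choices with made-up parameters, not values from this study, and the Bayesian MCMC updating of the fragility parameters is left out for brevity.

```python
import numpy as np
from math import erf, log, sqrt

def fragility(pga, median=0.6, beta=0.5):
    """Lognormal fragility: P(failure | PGA); median/beta are illustrative."""
    return 0.5 * (1.0 + erf(log(pga / median) / (beta * sqrt(2.0))))

def hazard(pga, k0=1e-4, k=2.5):
    """Illustrative power-law hazard curve: annual exceedance rate of PGA."""
    return k0 * pga ** (-k)

# Annual failure rate: integrate the fragility against the hazard "density",
#   lambda_f = -∫ P_f(a) dλ/da da,
# evaluated here by the trapezoidal rule on a PGA grid [g].
a = np.linspace(0.01, 2.0, 4000)
Pf = np.array([fragility(x) for x in a])
dlam = -np.gradient(hazard(a), a)            # positive exceedance density
f = Pf * dlam
annual_failure_rate = float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(a)))
print(annual_failure_rate)
```

Repeating this integral for each of the four source models, and for fragility parameters sampled from the Bayesian posterior instead of fixed values, yields the range of risk curves the abstract describes.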

A Systematic Review of MRI, Scintigraphy, FDG-PET and PET/CT for Diagnosis of Multiple Myeloma Related Bone Disease - Which is Best?

  • Weng, Wan-Wen;Dong, Meng-Jie;Zhang, Jun;Yang, Jun;Xu, Qin;Zhu, Yang-Jun;Liu, Ning-Hu
    • Asian Pacific Journal of Cancer Prevention
    • /
    • v.15 no.22
    • /
    • pp.9879-9884
    • /
    • 2014
  • Aim: The purpose of the current study was to conduct a systematic review of the published literature to evaluate the diagnostic accuracy of FDG-PET, PET/CT, MRI and scintigraphy for multiple myeloma related bone disease. Methods: Through a search of PubMed, EMBASE, and the Cochrane Library, two reviewers independently assessed the methodological quality of each study. We estimated pooled sensitivity, specificity, and positive and negative likelihood ratios (PLR and NLR), and conducted two-sample Z-tests to evaluate differences in sensitivity, specificity, area under the curve (AUC), and the $Q^*$ index between any two diagnostic modalities. Results: A total of 17 studies were reviewed. MRI had a pooled sensitivity of 0.88, specificity of 0.68, AUC of 0.897, and $Q^*$ index of 0.828; for MIBI the corresponding values were 0.98, 0.90, 0.991, and 0.962; for bone scan they were 0.66, 0.83, 0.805, and 0.740; and for PET and PET/CT they were 0.91, 0.69, 0.927, and 0.861, respectively. No statistically significant differences were found in sensitivity, specificity, AUC, or $Q^*$ index among MRI, scintigraphy, FDG-PET and PET/CT. Conclusions: On the condition that X-ray is taken as the reference in our study, FDG-PET, PET/CT, MRI and scintigraphy are all associated with a high detection rate of bone disease in patients with MM. In clinical practice, these tests can therefore be chosen according to the condition of the patient.
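The two-sample Z-test used to compare modalities works directly on the pooled estimates. A minimal sketch, using the AUCs reported above but with hypothetical standard errors (the abstract does not report them):

```python
from math import erf, sqrt

def two_sample_z(est1, se1, est2, se2):
    """Z statistic and two-sided p-value for the difference of two pooled
    estimates (AUC, sensitivity, ...), assuming approximate normality."""
    z = (est1 - est2) / sqrt(se1 ** 2 + se2 ** 2)
    phi = 0.5 * (1.0 + erf(abs(z) / sqrt(2.0)))   # standard normal CDF
    return z, 2.0 * (1.0 - phi)

# MRI AUC 0.897 vs bone-scan AUC 0.805 (values from the review); the
# standard errors 0.04 and 0.05 are made up purely for illustration.
z, p = two_sample_z(0.897, 0.04, 0.805, 0.05)
print(z, p)
```

At these assumed standard errors the difference is not significant at the 5% level, which matches the direction of the review's conclusion that no modality separated significantly from the others.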

Design of Multi-node Real-time Diagnostic and Management System Using Zigbee Sensor Network (Zigbee 센서 네트워크를 활용한 다중노드 실시간 진단 및 관리시스템 설계)

  • Kang, Moonsik
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.51 no.6
    • /
    • pp.152-161
    • /
    • 2014
  • In this paper, a multi-node real-time diagnostic and management system based on a Zigbee sensor network is proposed, which monitors and diagnoses multiple nodes and collectively controls the data generated by their various sensors. The proposed system transmits the collected wireless and wired data to a server, which monitors and controls the condition of the multiple nodes efficiently by taking the appropriate actions according to its analysis. The system is implemented to manage the sensor data by classifying them, the data being issued from clustered sources with a number of remote sensors. To evaluate the performance of the proposed system, we measure and analyze both the transmission delay according to distance and the loss rate of data issued from multiple sensors. The results show that the proposed system performs well.

Verification Control Algorithm of Data Integrity Verification in Remote Data sharing

  • Xu, Guangwei;Li, Shan;Lai, Miaolin;Gan, Yanglan;Feng, Xiangyang;Huang, Qiubo;Li, Li;Li, Wei
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.2
    • /
    • pp.565-586
    • /
    • 2022
  • Cloud storage's elastic expansibility not only provides flexible services for data owners to store their data remotely, but also reduces the storage, operation, and management costs of sharing that data. Outsourcing data to the storage space of a cloud service provider, however, raises security concerns about data integrity, and data integrity verification has become an important technology for detecting the integrity of remotely shared data. Allowing users without data access rights to verify data integrity causes unnecessary overhead for the data owner and the cloud service provider; in particular, malicious users who constantly launch integrity verification greatly waste service resources. Since the data owner is a consumer purchasing cloud services, he must bear both the cost of data storage and that of data verification. This paper proposes a verification control algorithm for the integrity verification of remotely outsourced data. It designs an attribute-based encryption verification control algorithm for multiple verifiers. Moreover, the data owner and the cloud service provider together construct a common access structure and generate a verification sentinel to check the authority of verifiers against that access structure. Finally, since the cloud service provider cannot know the access structure or the sentinel generation operation, it can only authorize verifiers satisfying the access policy to verify the integrity of the corresponding outsourced data. Theoretical analysis and experimental results show that the proposed algorithm achieves fine-grained access control over multiple verifiers for data integrity verification.
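The control idea, authorize a verifier against the access policy before letting it run the integrity check, can be caricatured in a few lines. This toy replaces attribute-based encryption with a plain attribute-set test and the verification sentinel with a salted hash; all names are hypothetical, and none of this reflects the paper's actual cryptographic construction.

```python
import hashlib

def make_sentinel(block: bytes, salt: bytes) -> bytes:
    """Stand-in 'sentinel': a salted digest kept alongside the policy."""
    return hashlib.sha256(salt + block).digest()

def verify(block: bytes, salt: bytes, sentinel: bytes,
           verifier_attrs: set, policy: set) -> bool:
    # Access control first: only verifiers satisfying the policy may proceed.
    if not policy.issubset(verifier_attrs):
        raise PermissionError("verifier does not satisfy the access policy")
    # Then the integrity check itself.
    return make_sentinel(block, salt) == sentinel

salt = b"\x01" * 16
block = b"outsourced data block"
sentinel = make_sentinel(block, salt)

ok = verify(block, salt, sentinel, {"auditor", "tenantA"}, {"auditor"})
tampered = verify(block + b"!", salt, sentinel, {"auditor"}, {"auditor"})
print(ok, tampered)
```

The point the scheme makes is that the provider can run `verify` without learning the policy internals; in this toy that separation is not enforced, whereas in the paper it comes from the attribute-based encryption.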

Numerical Analysis of Deformation Characteristics in the Double-Layer Liner According to Explosive Material Distribution (이중층 라이너에서 폭발 재료 분포에 따른 변형 특성 수치해석)

  • Mun, Sang Ho;Kim, See Jo;Lee, Chang Hee;Lee, Seong
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.19 no.5
    • /
    • pp.618-628
    • /
    • 2016
  • The development of new liner concepts is required to effectively neutralize enemy fire power concealed in armored vehicles. A multiple-layer liner is one such possibility; it has a mechanism for explosion after penetrating the target, known as the "behind-armor effect." A multiple-layer explosive liner should have sufficient kinetic energy to penetrate the protective structure, with the explosive material reacting after target penetration. With this in mind, double-layer liner materials were obtained by cold-spray coating, and their material properties were experimentally characterized and used in the present simulations of double-layer liners. In this study, numerical simulations of three layer types, i.e., single, A/B, and A/B/A in terms of layer location, were verified with respect to finite element mesh size, and the numerical results for the jet tip velocity, kinetic energy, and corresponding jet deformation characteristics were analysed in detail for each layer structure.

Multi-Vector Document Embedding Using Semantic Decomposition of Complex Documents (복합 문서의 의미적 분해를 통한 다중 벡터 문서 임베딩 방법론)

  • Park, Jongin;Kim, Namgyu
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.3
    • /
    • pp.19-41
    • /
    • 2019
  • In line with the rapidly increasing demand for text data analysis, research and investment in text mining are being actively conducted not only in academia but also in various industries. Text mining is generally conducted in two steps. In the first step, the text of the collected documents is tokenized and structured to convert the original documents into a computer-readable form. In the second step, tasks such as document classification, clustering, and topic modeling are conducted according to the purpose of analysis. Until recently, text mining studies have focused on applications in the second step. However, with the recognition that the text structuring process substantially influences the quality of the analysis results, various embedding methods have been actively studied to improve that quality by preserving the meaning of words and documents when representing text data as vectors. Unlike structured data, which can be used directly in a variety of operations and traditional analysis techniques, unstructured text must first be structured into a form the computer can understand. Mapping arbitrary objects into a space of a given dimension while maintaining their algebraic properties, for the purpose of structuring text data, is called "embedding." Recently, attempts have been made to embed not only words but also sentences, paragraphs, and entire documents. In particular, as the demand for document embedding grows rapidly, many algorithms have been developed to support it. Among them, doc2Vec, which extends word2Vec and embeds each document into one vector, is the most widely used. However, the traditional document embedding method represented by doc2Vec generates a vector for each document from the whole corpus of words in the document. This has the limitation that the document vector is affected not only by core words but also by miscellaneous words. Additionally, traditional document embedding schemes usually map each document to a single vector, so it is difficult to represent a complex document with multiple subjects accurately. In this paper, we propose a new multi-vector document embedding method to overcome these limitations. This study targets documents that explicitly separate body content and keywords; for a document without keywords, the method can be applied after extracting keywords through various analysis techniques, but since that is not the core subject of the proposed method, we describe the process for documents whose keywords are predefined in the text. The proposed method consists of (1) parsing, (2) word embedding, (3) keyword vector extraction, (4) keyword clustering, and (5) multiple-vector generation. Specifically, all text in a document is tokenized, and each token is represented as an N-dimensional real-valued vector through word embedding. Then, to overcome the limitation that the document vector is affected by miscellaneous as well as core words, the vectors corresponding to the keywords of each document are extracted to form a set of keyword vectors per document. Next, clustering is conducted on each document's keyword set to identify the multiple subjects included in the document. Finally, multiple vectors are generated from the keyword vectors constituting each cluster. Experiments on 3,147 academic papers revealed that the traditional single-vector approach cannot properly map complex documents because of interference among subjects within each vector, whereas the proposed multi-vector method vectorizes complex documents more accurately by eliminating that interference.
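Steps (3)-(5) of the pipeline can be sketched with toy vectors: extract the keyword vectors, cluster them (a bare-bones 2-means here), and emit one document vector per cluster. The vocabulary, the embeddings, and the fixed cluster count are all made up for the demo; a real system would take them from a trained word2vec model and choose the cluster count per document.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy word-embedding lookup (hypothetical; a real pipeline would use word2vec).
dim = 8
vocab = ["bayesian", "mcmc", "prior", "wavelet", "fourier", "spectrum"]
emb = {w: rng.normal(size=dim) for w in vocab}
for w in ("bayesian", "mcmc", "prior"):        # separate the two "subjects"
    emb[w] = emb[w] + np.array([8.0] + [0.0] * (dim - 1))

def multi_vector_embedding(keywords, iters=20):
    """Keyword vectors -> 2-means clustering -> one vector per cluster."""
    V = np.stack([emb[w] for w in keywords])
    # deterministic farthest-point initialization of the two centers
    c0 = V[0]
    c1 = V[np.argmax(((V - c0) ** 2).sum(axis=1))]
    centers = np.stack([c0, c1])
    for _ in range(iters):
        d = ((V[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        centers = np.stack([V[labels == c].mean(axis=0) for c in (0, 1)])
    return centers            # two document vectors instead of a single one

doc_vectors = multi_vector_embedding(vocab)
print(doc_vectors.shape)
```

Each returned row is the mean of one keyword cluster, so a document about two subjects yields two well-separated vectors instead of a single blurred one, which is the interference problem the paper targets.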

An Exact Analysis of Steel Box Girders with the Effects of Distortional Deformation of Sections (단면변형의 효과를 포함한 강상자형 거더의 엄밀한 해석)

  • 진만식;이병주;김문영
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.17 no.1
    • /
    • pp.11-20
    • /
    • 2004
  • The main goal of this study is to develop a MATLAB program for the analysis of distortional deformations and stresses of straight box girders. For this purpose, the distortional deformation theory is first summarized, and a BEF (beam on elastic foundation) theory is then presented using the analogy between the corresponding variables. Finally, from the governing equations of a beam-column element on an elastic foundation, an exact element stiffness matrix of the beam element and the nodal forces equivalent to concentrated and distributed loads are evaluated via a generalized linear eigenvalue problem. To verify the efficiency and accuracy of this method, the distortional stresses of box girders with multiple diaphragms are presented and compared with FEA results.
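The BEF analogy invoked above has a compact statement; the symbols here follow common usage for the box-girder distortion problem, not necessarily the paper's own notation:

```latex
\begin{aligned}
  EI\,\frac{d^4 w}{dx^4} + k\,w &= q(x)
    &&\text{(beam on elastic foundation)}\\[4pt]
  EI_{Dw}\,\frac{d^4 \gamma}{dx^4} + K_D\,\gamma &= m_D(x)
    &&\text{(distortion of the box section)}
\end{aligned}
```

The correspondences are $w \leftrightarrow \gamma$ (deflection vs. distortional angle), $EI \leftrightarrow EI_{Dw}$ (bending vs. distortional warping stiffness), $k \leftrightarrow K_D$ (foundation modulus vs. transverse frame stiffness of the cross-section), and $q \leftrightarrow m_D$ (distributed load vs. distributed distortional moment), so BEF solutions carry over term by term.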