• Title/Summary/Keyword: Linear Probability Model


Efficiency of various structural modeling schemes on evaluating seismic performance and fragility of APR1400 containment building

  • Nguyen, Duy-Duan; Thusa, Bidhek; Park, Hyosang; Azad, Md Samdani; Lee, Tae-Hyung
    • Nuclear Engineering and Technology / v.53 no.8 / pp.2696-2707 / 2021
  • The purpose of this study is to investigate the efficiency of various structural modeling schemes for evaluating the seismic performance and fragility of the reactor containment building (RCB) structure in the advanced power reactor 1400 (APR1400) nuclear power plant (NPP). Four structural modeling schemes, i.e., the lumped-mass stick model (LMSM), solid-based finite element model (Solid FEM), multi-layer shell model (MLSM), and beam-truss model (BTM), are developed to simulate the seismic behavior of the containment structure. A full three-dimensional finite element model (full 3D FEM) is additionally constructed to verify the aforementioned numerical models. A set of input ground motions with response spectra matched to the US NRC Regulatory Guide 1.60 design spectrum is generated to perform linear and nonlinear time-history analyses. Floor response spectra (FRS) and floor displacements are obtained at different elevations of the structure, since they are critical outputs for evaluating the seismic vulnerability of the RCB and secondary components. The results show that the difference in seismic responses between linear and nonlinear analyses grows as the earthquake intensity increases. It is observed that the linear analysis underestimates floor displacements while overestimating floor accelerations. Moreover, a systematic assessment of the capability and efficiency of each structural model is presented. MLSM can serve as an alternative to a full 3D FEM, which is complicated to model and extremely time-consuming in dynamic analyses. Specifically, BTM is recommended as the optimal model for evaluating the nonlinear seismic performance of NPP structures. Thereafter, linear and nonlinear BTMs are employed in a series of time-history analyses to develop fragility curves of the RCB for different damage states. It is shown that the linear analysis underestimates the probability of damage of the RCB at a given earthquake intensity compared to the nonlinear analysis. The nonlinear analysis approach is therefore strongly recommended for assessing the vulnerability of NPP structures.
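
For readers less familiar with the fragility-curve step mentioned in this abstract, the sketch below shows one common way such curves are produced from time-history results: the probability of reaching a damage state is modeled as a lognormal CDF of the intensity measure and its two parameters are fit by maximum likelihood. The intensity levels, run counts, and variable names are illustrative assumptions, not data from the paper.

```python
# Minimal sketch (not the paper's code): fitting a lognormal fragility curve
# P(damage >= state | IM) = Phi(ln(IM/theta)/beta) to exceedance counts
# obtained from time-history analyses. IM levels and counts are assumed.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

im = np.array([0.2, 0.4, 0.6, 0.8, 1.0])      # intensity measure, e.g. PGA (g)
n_runs = np.array([20, 20, 20, 20, 20])       # analyses per IM level
n_exceed = np.array([1, 4, 9, 14, 18])        # runs exceeding the damage state

def neg_log_like(params):
    theta, beta = params
    p = norm.cdf(np.log(im / theta) / beta)
    p = np.clip(p, 1e-9, 1 - 1e-9)
    # binomial log-likelihood of the observed exceedance counts
    return -np.sum(n_exceed * np.log(p) + (n_runs - n_exceed) * np.log(1 - p))

res = minimize(neg_log_like, x0=[0.5, 0.4], bounds=[(1e-3, None), (1e-3, None)])
theta_hat, beta_hat = res.x
print(f"median capacity ~ {theta_hat:.2f} g, log-std ~ {beta_hat:.2f}")
```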

Mean estimation of small areas using penalized spline mixed-model under informative sampling

  • Chytrasari, Angela N.R.; Kartiko, Sri Haryatmi; Danardono, Danardono
    • Communications for Statistical Applications and Methods / v.27 no.3 / pp.349-363 / 2020
  • The penalized spline is a suitable nonparametric approach for estimating the mean model in small areas; however, published applications of the approach under informative sampling are uncommon. We propose a semiparametric mixed model using penalized splines under informative sampling to estimate the small-area mean. The response variable is expressed in terms of the mean model, an informative-sample effect, an area random effect, and unit error. We approximate the mean model by a penalized spline and use a penalized spline function of the inclusion probability to account for the informative-sample effect. We determine best unbiased estimators for the model coefficients and derive restricted maximum likelihood estimators for the variance components. A simulation study shows a decrease in the average absolute bias produced by the proposed model. A decrease in the root mean square error also occurs, except in some quadratic cases. Using a linear versus a quadratic penalized spline to approximate the inclusion-probability function produces no significant difference in the distribution of the root mean square error, except for a few smaller samples.
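
As a rough guide to the model structure described in this abstract, one plausible formulation (with assumed notation for knots, random effects, and variances, not necessarily the authors' exact specification) is:

```latex
% y_{ij}: response for unit j in area i, x_{ij}: covariate,
% \pi_{ij}: inclusion probability, u_i: area random effect, e_{ij}: unit error
\begin{aligned}
y_{ij} &= m(x_{ij}) + g(\pi_{ij}) + u_i + e_{ij},\\
m(x)   &= \beta_0 + \beta_1 x + \sum_{k=1}^{K} b_k (x - \kappa_k)_+, \qquad b_k \sim N(0,\sigma_b^2),\\
g(\pi) &= \gamma_1 \pi + \sum_{l=1}^{L} c_l (\pi - \tau_l)_+, \qquad c_l \sim N(0,\sigma_c^2),\\
u_i &\sim N(0,\sigma_u^2), \qquad e_{ij} \sim N(0,\sigma_e^2).
\end{aligned}
```

The truncated-line bases with random coefficients give the penalized-spline fits for both the mean model and the inclusion-probability term, which is how the model can be handled as a mixed model.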

An educational tool for binary logistic regression model using Excel VBA

  • Park, Cheolyong; Choi, Hyun Seok
    • Journal of the Korean Data and Information Science Society / v.25 no.2 / pp.403-410 / 2014
  • Binary logistic regression analysis is a statistical technique that explains a binary response variable by quantitative or qualitative explanatory variables. In the binary logistic regression model, the probability that the response variable equals one of the binary values, say 1, is explained as a transformation of a linear combination of explanatory variables. This is one of the big barriers that non-statisticians must overcome in order to understand the model. In this study, an educational tool is developed using Excel VBA that explains the need for binary logistic regression analysis. More precisely, the tool explains the problems that arise when the probability of the response variable being equal to 1 is modeled directly as a linear combination of explanatory variables, and then shows how these problems can be solved by transforming that linear combination.
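
The modeling problem the tool demonstrates can be reproduced in a few lines: fitting the probability of the response being 1 directly as a linear function yields fitted values outside [0, 1], while the logistic transformation keeps them in (0, 1). This is an illustrative Python sketch with simulated data, not the Excel VBA tool itself.

```python
# Illustrative sketch (not the Excel VBA tool): why modeling P(y=1) directly as
# a linear function of x can fail, and how the logit transform fixes it.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)
p_true = 1 / (1 + np.exp(-2 * x))               # true success probability
y = rng.binomial(1, p_true)                     # observed binary responses

# Linear probability model: regress y on x with ordinary least squares.
X = np.column_stack([np.ones_like(x), x])
beta_lpm, *_ = np.linalg.lstsq(X, y, rcond=None)
p_lpm = X @ beta_lpm
print("LPM fitted 'probabilities' outside [0, 1]:",
      np.sum((p_lpm < 0) | (p_lpm > 1)))

# Logistic model: the same kind of linear combination passed through the
# logistic transformation always yields values in (0, 1).
p_logit = 1 / (1 + np.exp(-(X @ beta_lpm)))     # any linear predictor works here
print("Logistic-transformed values outside [0, 1]:",
      np.sum((p_logit < 0) | (p_logit > 1)))
```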

A Study on a Multi-period Inventory Model with Quantity Discounts Based on the Previous Order

  • Lim, Sung-Mook
    • Journal of Korean Society of Industrial and Systems Engineering / v.32 no.4 / pp.53-62 / 2009
  • Lee [15] examined quantity discount contracts between a manufacturer and a retailer in a stochastic, two-period inventory model where quantity discounts are provided based on the previous order size. During the two periods, the retailer faces stochastic (truncated Poisson distributed) demands and places orders to meet them. The manufacturer offers the retailer a price discount on the second-period order if its quantity exceeds the first-period order quantity. In this paper we extend the above two-period model to a k-period one (where k > 2) and propose a stochastic nonlinear mixed binary integer program for it. In order to make the program tractable, the nonlinear term involving the sum of truncated-Poisson cumulative probability values over a certain range of demand is approximated by an i-interval piecewise-linear function. With the value of i selected and fixed, the piecewise-linear function is determined using an evolutionary algorithm that maximizes its fit to the original nonlinear term. The resulting piecewise-linear mixed binary integer program is then transformed into a mixed binary integer linear program. With the k-period model developed, we suggest a receding-horizon-control-style solution procedure for n-period (n > k) order decision problems. We implement Lee's two-period model and the proposed k-period model for use in the receding horizon control style to solve n-period order decision problems, and compare the two models in terms of the pattern of order quantities and the total profit. Our computational study shows that the proposed model is superior to the two-period model with respect to total profit, and that order quantities from the proposed model fluctuate more across periods.
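
To make the approximation step concrete, the sketch below builds an i-interval piecewise-linear approximation of a nonlinear term formed from truncated-Poisson cumulative probabilities. The breakpoints here are simply placed uniformly over the demand range; the paper instead tunes them with an evolutionary algorithm, and all numbers below are assumptions.

```python
# Sketch (assumptions, not the paper's algorithm): approximate a nonlinear term
# built from truncated-Poisson cumulative probabilities with an i-interval
# piecewise-linear function with uniformly placed breakpoints.
import numpy as np
from scipy.stats import poisson

lam, d_max = 12.0, 40                # demand rate and truncation point
d = np.arange(0, d_max + 1)
pmf = poisson.pmf(d, lam) / poisson.cdf(d_max, lam)   # truncated Poisson pmf
cdf = np.cumsum(pmf)

def g(q):
    """Example nonlinear term: sum of truncated-Poisson CDF values up to q."""
    q = np.atleast_1d(q).astype(int)
    return np.array([cdf[: qi + 1].sum() for qi in q])

i_intervals = 5
breaks = np.linspace(0, d_max, i_intervals + 1).astype(int)
g_breaks = g(breaks)

def g_pwl(q):
    """Piecewise-linear interpolation of g between the chosen breakpoints."""
    return np.interp(q, breaks, g_breaks)

q_grid = np.arange(0, d_max + 1)
err = np.max(np.abs(g(q_grid) - g_pwl(q_grid)))
print(f"max approximation error over the demand range: {err:.4f}")
```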

Performance Analysis of Nonlinear Energy-Harvesting DF Relay System in Interference-Limited Nakagami-m Fading Environment

  • Cvetkovic, Aleksandra; Blagojevic, Vesna; Ivanis, Predrag
    • ETRI Journal / v.39 no.6 / pp.803-812 / 2017
  • A decode-and-forward system with an energy-harvesting relay is analyzed for the case when an arbitrary number of independent interference signals affect the communication at both the relay and the destination nodes. The scenario in which the relay harvests energy from both the source and interference signals using a time switching scheme is analyzed. The analysis is performed for the interference-limited Nakagami-m fading environment, assuming a realistic nonlinearity for the electronic devices. The closed-form outage probability expression for the system with a nonlinear energy harvester is derived. An asymptotic expression valid for the case of a simpler linear harvesting model is also provided. The derived analytical results are corroborated by an independent simulation model. The impacts of the saturation threshold power, the energy-harvesting ratio, and the number and power of the interference signals on the system performance are analyzed.
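
A Monte Carlo sketch of the setup described above (a time-switching relay with a saturation-type nonlinear harvester in an interference-limited Nakagami-m environment) is given below. The harvester form, power levels, and fading parameters are assumptions for illustration; the paper's contribution is the closed-form outage expression, which is not reproduced here.

```python
# Monte Carlo sketch (assumed parameters, not the paper's closed-form result):
# time-switching EH relay with a saturation-type nonlinear harvester in an
# interference-limited Nakagami-m environment. Nakagami-m power gains are
# Gamma(m, Omega/m) distributed.
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
m, omega = 2.0, 1.0                  # Nakagami-m parameter and mean power gain
P_s, eta, alpha = 1.0, 0.7, 0.3      # source power, EH efficiency, TS ratio
P_sat = 0.5                          # harvester saturation power
P_I, L = 0.1, 3                      # per-interferer power, number of interferers
gamma_th = 1.0                       # SIR threshold for outage

gain = lambda size: rng.gamma(m, omega / m, size)     # channel power gains

h_sr, h_rd = gain(N), gain(N)                          # S->R and R->D links
I_r = P_I * gain((N, L)).sum(axis=1)                   # interference at relay
I_d = P_I * gain((N, L)).sum(axis=1)                   # interference at destination

# Nonlinear harvester: linear in the input power up to a saturation level.
P_in = P_s * h_sr + I_r
P_relay = np.minimum(eta * alpha / (1 - alpha) * P_in, P_sat)

sir_r = P_s * h_sr / I_r                               # relay decodes only if SIR is adequate
sir_d = P_relay * h_rd / I_d
outage = np.mean((sir_r < gamma_th) | (sir_d < gamma_th))
print(f"simulated outage probability: {outage:.3f}")
```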

Korea-specified Maximum Expected Utility Model for the Probability of Default

  • Park, You-Sung; Song, Ji-Hyun; Choi, Bo-Seung
    • The Korean Journal of Applied Statistics / v.20 no.3 / pp.573-584 / 2007
  • A well-estimated probability of default is the most important ingredient for constructing a good credit scoring process. The maximum expected utility (MEU) model has been suggested as an alternative to the traditional logistic regression model. Because the MEU model was constructed using financial data from North American and European countries, it may not be suitable for Korean private firms. Thus, we propose a Korea-specific MEU model by estimating the parameters involved in the kernel functions. This Korea-specific MEU model is illustrated using 34,057 private firms to show its performance relative to the usual logistic regression model.

Seismic fragility assessment of shored mechanically stabilized earth walls

  • Sheida Ilbagitaher; Hamid Alielahi
    • Geomechanics and Engineering / v.36 no.3 / pp.277-293 / 2024
  • Shored Mechanically Stabilized Earth (SMSE) walls are a type of soil-retaining structure that increases soil stability under static and dynamic loads. The damage caused by an earthquake can be determined by evaluating the probabilistic seismic response of SMSE walls. This study aimed to assess the seismic performance of SMSE walls and to provide fragility curves for evaluating failure levels. The generated fragility curves can help improve the seismic performance of these walls by assessing and controlling variables such as backfill surface settlement, lateral deformation of the facing, and permanent displacement of the wall. A parametric study was performed based on a nonlinear elastoplastic constitutive model, the hardening soil model with small-strain stiffness (HSsmall). The analyses were conducted using PLAXIS 2D, a finite element method (FEM) program, under plane-strain conditions to study the effect of the number of geogrid layers and the axial stiffness of the geogrids on the performance of SMSE walls. In this study, three damage states (minor, moderate, and severe) were observed, and in no case did the wall reach complete destruction. For the base model (Model A), at the highest ground acceleration coefficient (1 g), the fragility probability in the moderate damage state was 76%. These values were 62% and 54%, respectively, when the number of geogrid layers was increased (Model B) and when the geogrid stiffness was increased (Model C). Meanwhile, the fragility values were 99%, 98%, and 97%, respectively, for the minor damage state. Notably, the probability of complete destruction was zero percent in all models.

Modified Test Statistic for Identity of Two Distribution on Credit Evaluation

  • Hong, C.S.; Park, H.S.
    • The Korean Journal of Applied Statistics / v.22 no.2 / pp.237-248 / 2009
  • In credit evaluation studies, the probability of default is represented as a linear combination of the two distributions of defaults and non-defaults, and the distribution of the probability of default is generally known in most cases. In addition to the well-known Kolmogorov-Smirnov statistic for testing the identity of two distributions, the Kuiper, Cramér-von Mises, Anderson-Darling, and Watson test statistics are introduced in this work. Under the assumption that the population distribution is known, modified Cramér-von Mises, Anderson-Darling, and Watson statistics are proposed. Based on score data generated from various probability density functions of the probability of default, the modified test statistics are discussed and compared.
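
As a small illustration of the empirical-distribution-function statistics listed above, the sketch below computes the two-sample Kolmogorov-Smirnov and Kuiper statistics directly from simulated default and non-default score samples; the modified Cramér-von Mises, Anderson-Darling, and Watson statistics proposed in the paper are not reproduced.

```python
# Sketch (illustrative, not the paper's modified statistics): compute the
# two-sample Kolmogorov-Smirnov and Kuiper statistics from empirical CDFs of
# default and non-default credit scores. Sample data are simulated assumptions.
import numpy as np

rng = np.random.default_rng(2)
scores_default = rng.beta(2, 5, size=400)        # hypothetical default scores
scores_good = rng.beta(5, 2, size=1600)          # hypothetical non-default scores

def edf_diffs(x, y):
    """Signed differences F_x(t) - F_y(t) over the pooled sample points."""
    t = np.sort(np.concatenate([x, y]))
    F_x = np.searchsorted(np.sort(x), t, side="right") / len(x)
    F_y = np.searchsorted(np.sort(y), t, side="right") / len(y)
    return F_x - F_y

d = edf_diffs(scores_default, scores_good)
ks = np.max(np.abs(d))                           # Kolmogorov-Smirnov statistic
kuiper = np.max(d) - np.min(d)                   # Kuiper statistic D+ + D-
print(f"KS = {ks:.3f}, Kuiper = {kuiper:.3f}")
```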

Complex Segregation Analysis of Categorical Traits in Farm Animals: Comparison of Linear and Threshold Models

  • Kadarmideen, Haja N.; Ilahi, H.
    • Asian-Australasian Journal of Animal Sciences / v.18 no.8 / pp.1088-1097 / 2005
  • The main objectives of this study were to investigate the accuracy, bias, and power of linear and threshold model segregation analysis methods for the detection of major genes affecting categorical traits in farm animals. The Maximum Likelihood Linear Model (MLLM), Bayesian Linear Model (BALM), and Bayesian Threshold Model (BATM) were applied to simulated data on normal, categorical, and binary scales, as well as to disease data in pigs. Simulated data on the underlying normally distributed liability (NDL) were used to create the categorical and binary data. The MLLM method was applied to data on all scales (normal, categorical, and binary), and the BATM method was developed and applied only to binary data. The MLLM analyses underestimated parameters for binary as well as categorical traits compared to normal traits, with the bias being very severe for binary traits. The accuracy of major-gene and polygene parameter estimates was also very low for binary data compared with categorical data; the latter gave results similar to normal data. When the disease incidence (on the binary scale) is close to 50%, segregation analysis is more accurate and less biased than for diseases with rare incidence. NDL data were always better than categorical data. Under the MLLM method, the test statistics for categorical and binary data were consistently and unusually high (whereas the opposite is expected due to the loss of information in categorical data), indicating high false discovery rates of major genes if linear models are applied to categorical traits. With Bayesian segregation analysis, the 95% highest probability density regions of the major-gene variances were checked for whether they included the value of zero (a boundary parameter); by the nature of this difference between the likelihood and Bayesian approaches, the Bayesian methods are likely to be more reliable for categorical data. The BATM segregation analysis of binary data also showed a significant advantage over MLLM in terms of higher accuracy. Based on these results, threshold models are recommended when the trait distributions are discontinuous. Further, segregation analysis could be used in an initial scan of the data for evidence of major genes before embarking on molecular genome mapping.
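
The liability-threshold idea underlying the comparison can be sketched as follows: a binary (or categorical) trait is generated by cutting an underlying normally distributed liability, composed of a major-gene effect, a polygenic effect, and a residual, at one or more thresholds. The allele frequency, effect sizes, and thresholds below are assumptions, not the paper's simulation settings.

```python
# Sketch (assumed effect sizes, not the paper's simulation settings): generate a
# binary trait from an underlying normal liability carrying a biallelic major
# gene, a polygenic effect, and a residual, then dichotomize at a threshold.
import numpy as np

rng = np.random.default_rng(3)
n, p = 5_000, 0.3                     # animals, frequency of the major allele

genotype = rng.binomial(2, p, n)      # 0/1/2 copies of the major allele
a = 1.0                               # additive effect of the major gene
polygenic = rng.normal(0.0, np.sqrt(0.3), n)   # polygenic variance = 0.3
residual = rng.normal(0.0, np.sqrt(0.7), n)    # residual variance = 0.7

liability = a * (genotype - 2 * p) + polygenic + residual   # centered major-gene effect

threshold = 0.0                       # near-50% incidence; raise it to mimic a rare disease
disease = (liability > threshold).astype(int)
print(f"incidence: {disease.mean():.2%}")

# A categorical (e.g., 3-class) version cuts the same liability at two thresholds.
categories = np.digitize(liability, bins=[-0.5, 0.8])
print("category counts:", np.bincount(categories))
```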

A Linear Approximation Model for an Asset-based Weapon Target Assignment Problem (자산기반 무기할당 문제의 선형 근사 모형)

  • Jang, Jun-Gun; Kim, Kyeongtaek; Choi, Bong-Wan; Suh, Jae Joon
    • Journal of Korean Society of Industrial and Systems Engineering / v.38 no.3 / pp.108-116 / 2015
  • A missile defense system is composed of radars detecting incoming missiles aimed at defense assets, command and control units making the decisions on weapon target assignment, and artillery batteries firing defensive weapons at the incoming missiles. Although the technology behind the development of radars and weapons is very important, effective assignment of the weapons against missile threats is even more crucial. When incoming missiles targeting valuable assets in the defense area are detected, the asset-based weapon target assignment model addresses the issue of assigning weapons to these missiles so as to maximize the total value of the surviving assets threatened by them. In this paper, we present a model for an asset-based weapon target assignment problem with a shoot-look-shoot engagement policy and a fixed set-up time between successive anti-missile launches from each defense unit. We then show a detailed linear approximation process for the nonlinear portions of the model and propose the final linear approximation model. After that, the proposed model is applied to several ballistic missile defense scenarios. In each defense scenario, the number of incoming missiles, the speed and position of each missile, the number of defense artillery batteries, the number of anti-missiles in each artillery battery, the single-shot kill probability of each weapon against each target, the asset values, and the air defense coverage are given. After running the lpSolveAPI package of the R language on a personal computer with the given data for each scenario, we summarize the resulting weapon target assignments together with the launch-order times for each artillery battery. We also report the computer processing time needed to obtain the result for each scenario.
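
For orientation, the tiny sketch below evaluates the nonlinear asset-based objective that the paper linearizes: the expected total value of surviving assets given single-shot kill probabilities, maximized here by brute force over a toy scenario. All data are made up, and the shoot-look-shoot policy and set-up times are omitted.

```python
# Tiny brute-force sketch (assumed data; omits shoot-look-shoot and set-up
# times): evaluate the nonlinear asset-based objective that the paper
# approximates linearly. Each missile j threatens the asset with value v[j];
# p[i][j] is the probability that weapon i kills missile j.
from itertools import product

v = [10.0, 6.0, 8.0]                       # value of the asset each missile targets
p = [[0.7, 0.5, 0.4],                      # kill probabilities, weapon i vs missile j
     [0.3, 0.8, 0.6]]
n_weapons, n_missiles = len(p), len(v)

def expected_surviving_value(assign):
    """assign[i] = missile index engaged by weapon i (or None for hold fire)."""
    total = 0.0
    for j in range(n_missiles):
        leak = 1.0                         # probability missile j survives all shots
        for i in range(n_weapons):
            if assign[i] == j:
                leak *= 1.0 - p[i][j]
        total += v[j] * (1.0 - leak)       # asset j is assumed lost if missile j leaks through
    return total

best = max(product([None, *range(n_missiles)], repeat=n_weapons),
           key=expected_surviving_value)
print("best assignment:", best,
      "expected surviving value:", round(expected_surviving_value(best), 3))
```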