• Title/Summary/Keyword: cumulative distribution functions

Rapid seismic vulnerability assessment by new regression-based demand and collapse models for steel moment frames

  • Kia, M.;Banazadeh, M.;Bayat, M.
    • Earthquakes and Structures
    • /
    • v.14 no.3
    • /
    • pp.203-214
    • /
    • 2018
  • Predictive demand and collapse fragility functions are two essential components of probabilistic seismic demand analysis; they are commonly developed from statistics that require enormous, costly, and time-consuming data gathering. Although this approach may be justified for research purposes, its computational cost makes it unappealing for practical applications. Thus, in this paper, Bayesian regression-based demand and collapse models are proposed to eliminate the need for time-consuming analyses. The demand model, developed in the form of a linear equation, predicts the overall maximum inter-story drift of low- to mid-rise regular steel moment resisting frames (SMRFs), while the collapse model, mathematically expressed by a lognormal cumulative distribution function, provides the collapse occurrence probability for a given spectral acceleration at the fundamental period of the structure. Next, as an application, the proposed demand and collapse functions are implemented in a seismic fragility analysis to develop fragility and, consequently, seismic demand curves of three example buildings. The accuracy of the proposed models, along with the reduction in computation, is compared with results obtained directly from incremental dynamic analysis (IDA), a computer-intensive procedure.
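
The collapse model described above reduces to evaluating a lognormal CDF at a given spectral acceleration. A minimal sketch in Python (the median capacity and dispersion below are illustrative placeholders, not the paper's fitted coefficients):

```python
import math

def collapse_fragility(sa, median, beta):
    """Collapse probability at spectral acceleration `sa`, modeled as a
    lognormal CDF with median collapse capacity `median` and logarithmic
    standard deviation (dispersion) `beta`."""
    z = (math.log(sa) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative values only (not the paper's regression results):
theta, beta = 1.2, 0.4    # median capacity 1.2 g, dispersion 0.4
print(collapse_fragility(theta, theta, beta))   # 0.5 at the median, by definition
```

By construction the curve passes through 0.5 at the median capacity and rises monotonically with spectral acceleration.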

An Approximation Approach for Solving a Continuous Review Inventory System Considering Service Cost (서비스 비용을 고려한 연속적 재고관리시스템 해결을 위한 근사법)

  • Lee, Dongju;Lee, Chang-Yong
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.38 no.2
    • /
    • pp.40-46
    • /
    • 2015
  • A modular assembly system makes it possible for a variety of products to be assembled in a short lead time. In this system, necessary components are assembled with optional components tailored to customers' orders. The budget for inventory investment, composed of inventory and purchasing costs, is limited in practice, and the purchasing cost is often paid when an order arrives. Service cost is assumed to be proportional to the service level and is included in the budget constraint. We develop a heuristic procedure to find a good solution for a continuous review inventory system of the modular assembly system with a budget constraint. A regression analysis using a quadratic function based on the exponential function is applied to the cumulative distribution function of a normal distribution. With the regression result, an efficient heuristic is proposed by approximating some complex functions with expressions composed of exponential functions only. A simple problem is introduced to illustrate the proposed heuristic.
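
The abstract does not give the paper's exact regression form, so the sketch below illustrates the underlying idea with a different, well-known exponential-only surrogate: the logistic approximation Φ(x) ≈ 1/(1 + e^(−1.702x)), which replaces the intractable normal CDF with a closed form whose maximum absolute error is below 0.01:

```python
import math

def norm_cdf(x):
    # Exact standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def approx_cdf(x):
    # Logistic surrogate built from exponentials only; the paper fits its
    # own quadratic-exponential form by regression instead.
    return 1.0 / (1.0 + math.exp(-1.702 * x))

worst = max(abs(norm_cdf(x / 100.0) - approx_cdf(x / 100.0))
            for x in range(-400, 401))
print(worst)   # maximum gap over [-4, 4] stays below 0.01
```

Replacing Φ with such a closed form is what makes the downstream budget-constrained optimization tractable without numerical integration.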

Class-Based Histogram Equalization for Robust Speech Recognition

  • Suh, Young-Joo;Kim, Hoi-Rin
    • ETRI Journal
    • /
    • v.28 no.4
    • /
    • pp.502-505
    • /
    • 2006
  • A new class-based histogram equalization method is proposed for robust speech recognition. The proposed method aims not only to compensate for the acoustic mismatch between training and test environments, but also to reduce the discrepancy between the phonetic distributions of training and test speech data. The algorithm utilizes multiple class-specific reference and test cumulative distribution functions, classifies the noisy test features into their corresponding classes, and equalizes the features using their corresponding class-specific reference and test distributions. Experiments on the Aurora 2 database proved the effectiveness of the proposed method, which reduced relative errors by 18.74%, 17.52%, and 23.45% over the conventional histogram equalization method and by 59.43%, 66.00%, and 50.50% over mel-cepstral-based features for test sets A, B, and C, respectively.
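
The core CDF-matching step behind histogram equalization can be sketched as follows (single-class case with nearest-rank quantiles; the paper's method additionally classifies each noisy feature and applies this mapping with its class-specific distributions):

```python
import bisect

def equalize(test_feats, ref_feats):
    """Map each test feature through the test ECDF and invert it with the
    reference ECDF, i.e. y = F_ref^{-1}(F_test(x))."""
    test_sorted = sorted(test_feats)
    ref_sorted = sorted(ref_feats)
    n_test, n_ref = len(test_sorted), len(ref_sorted)
    out = []
    for x in test_feats:
        r = bisect.bisect_right(test_sorted, x)      # rank of x: ECDF value r/n_test
        idx = min(r * n_ref // n_test, n_ref - 1)    # nearest-rank quantile of reference
        out.append(ref_sorted[idx])
    return out

# Features from a shifted/scaled test distribution are mapped onto the
# reference scale:
print(equalize([10, 20, 30, 40, 50], [0, 1, 2, 3, 4]))   # [1, 2, 3, 4, 4]
```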

Geostatistics for Bayesian interpretation of geophysical data

  • Oh Seokhoon;Lee Duk Kee;Yang Junmo;Youn Yong-Hoon
    • Korean Society of Exploration Geophysicists: Conference Proceedings
    • /
    • 2003.11a
    • /
    • pp.340-343
    • /
    • 2003
  • This study presents a practical procedure for the Bayesian inversion of geophysical data by Markov chain Monte Carlo (MCMC) sampling and geostatistics. We applied geostatistical techniques to acquire prior model information, and then adopted the MCMC method to infer the characteristics of the marginal distributions of the model parameters. For the Bayesian inversion of dipole-dipole array resistivity data, we used indicator kriging and simulation techniques to generate cumulative distribution functions from Schlumberger array resistivity data and well logging data, and obtained prior information by cokriging and simulations from covariogram models. The indicator approach makes it possible to incorporate non-parametric information into the probability density function. We also adopted the MCMC approach, based on Gibbs sampling, to examine the characteristics of the posterior probability density function and the marginal distribution of each parameter. This approach provides an effective way to treat the Bayesian inversion of geophysical data and reduces the non-uniqueness by incorporating various kinds of prior information.
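
The MCMC step can be illustrated with a minimal one-dimensional sampler. The paper uses Gibbs sampling with geostatistically derived priors; the sketch below substitutes a generic Metropolis sampler with a toy Gaussian likelihood/prior pair, purely to show how marginal posterior characteristics are estimated from the chain:

```python
import math, random

random.seed(1)

def log_post(m):
    # Toy 1D posterior: Gaussian likelihood around an "observation" of 2.0
    # plus a Gaussian prior at 0.0 (stand-ins for the resistivity forward
    # model and the geostatistical prior in the paper).
    return -0.5 * (m - 2.0) ** 2 / 0.5 ** 2 - 0.5 * m ** 2 / 2.0 ** 2

def metropolis(n, step=0.5, m0=0.0):
    samples, m, lp = [], m0, log_post(m0)
    for _ in range(n):
        cand = m + random.gauss(0.0, step)        # symmetric random-walk proposal
        lp_c = log_post(cand)
        if math.log(random.random()) < lp_c - lp:  # Metropolis accept/reject
            m, lp = cand, lp_c
        samples.append(m)
    return samples

chain = metropolis(20000)[2000:]          # drop burn-in
print(sum(chain) / len(chain))            # estimates the posterior mean (analytically ~1.88)
```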

TIME-DOMAIN TECHNIQUE FOR FRONT-END NOISE SIMULATION IN NUCLEAR SPECTROSCOPY

  • Neamintara, Hudsaleark;Mangclaviraj, Virul;Punnachaiya, Suvit
    • Nuclear Engineering and Technology
    • /
    • v.39 no.6
    • /
    • pp.717-724
    • /
    • 2007
  • A measurement-based time-domain simulation of radiation detector-preamplifier (front-end) noise in nuclear spectroscopy is described. The time-domain noise simulation was performed by generating "noise random numbers" using the Monte Carlo inverse-transform method. The probability distribution of the unpredictable noise was derived from the empirical cumulative distribution function of noise sampled from a preamplifier output. The simulated noise was investigated in the time, frequency, and statistical domains. Noise behavior was evaluated using the signal wave-shaping function and compared with the actual noise. The response characteristics of the simulated and actual preamplifier output noises were found to be similar. The simulated noise and a computed nuclear pulse signal were also combined to generate a simulated preamplifier output signal. Such simulated output signals could be used in nuclear spectroscopy to determine the energy resolution degradation caused by front-end noise.
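
The inverse-transform step can be sketched as follows; the Gaussian "measured" noise below is a toy stand-in for digitized preamplifier samples:

```python
import random

random.seed(7)

def noise_generator(measured_noise):
    """Return a draw() function that generates 'noise random numbers' by
    inverting the empirical CDF of the sampled noise
    (Monte Carlo inverse-transform method)."""
    table = sorted(measured_noise)
    n = len(table)
    def draw():
        u = random.random()          # uniform variate in [0, 1)
        return table[int(u * n)]     # nearest-rank inverse of the empirical CDF
    return draw

# Toy stand-in for noise digitized from a preamplifier output:
measured = [random.gauss(0.0, 1.0) for _ in range(5000)]
draw = noise_generator(measured)
simulated = [draw() for _ in range(5000)]
# The simulated sequence reproduces the measured amplitude distribution.
```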

Parametric study on probabilistic local seismic demand of IBBC connection using finite element reliability method

  • Taherinasab, Mohammad;Aghakouchak, Ali A.
    • Steel and Composite Structures
    • /
    • v.37 no.2
    • /
    • pp.151-173
    • /
    • 2020
  • This paper aims to probabilistically evaluate the performance of two types of I-beam to box column (IBBC) connection. To account for the variability of the seismic loading demand, statistical features of the inter-story drift ratio at the second, fifth, and eleventh stories of a 12-story steel special moment resisting frame are extracted through incremental dynamic analysis at the global collapse state. Variability of the geometrical variables and material strength is also taken into account. All of these random variables are exported as inputs to a probabilistic finite element model that simulates the connection. Finally, cumulative distribution functions of the local seismic demand for each component of each connection are provided using histogram sampling. Through a parametric study on the probabilistic local seismic demand, the influence of several geometrical random variables on the performance of IBBC connections is demonstrated. Furthermore, the probabilistic study reveals that the IBBC connection with a widened flange performs better than the one with an un-widened flange. A design procedure is also proposed for WF connections to achieve the same connection performance in different stories.
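
The histogram-sampling step, i.e., turning Monte Carlo outputs of a finite element model into a cumulative distribution function, can be sketched generically (the lognormal "demand" samples below are hypothetical stand-ins for FE outputs):

```python
import bisect, random

random.seed(5)

def empirical_cdf(samples):
    """Build an empirical CDF from Monte Carlo samples, mirroring the
    histogram-sampling step that turns demand samples into CDFs."""
    xs = sorted(samples)
    n = len(xs)
    def F(x):
        # fraction of samples at or below x
        return bisect.bisect_right(xs, x) / n
    return F

# Hypothetical local demand samples (not the paper's FE results):
demand = [random.lognormvariate(0.0, 0.3) for _ in range(2000)]
F = empirical_cdf(demand)
print(F(1.0))   # near 0.5, since 1.0 is the median of this toy distribution
```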

Estimation of Suitable Methodology for Determining Weibull Parameters for the Vortex Shedding Analysis of Synovial Fluid

  • Singh, Nishant Kumar;Sarkar, A.;Deo, Anandita;Gautam, Kirti;Rai, S.K.
    • Journal of Biomedical Engineering Research
    • /
    • v.37 no.1
    • /
    • pp.21-30
    • /
    • 2016
  • The Weibull distribution with two parameters, shape (k) and scale (s), is used to model the fatigue failure analysis due to periodic vortex shedding of the synovial fluid in knee joints. In order to determine the latter parameter, a suitable statistical model is required for the velocity distribution of the synovial fluid flow. Hence, the wide applicability of the Weibull distribution in life testing and reliability analysis can be exploited to describe the probability distribution of the synovial fluid flow velocity. In this work, the three most widely used methods for estimating the Weibull parameters are compared, i.e., the least square estimation method (LSEM), the maximum likelihood estimator (MLE), and the method of moments (MOM), to study fatigue failure of the bone joint due to periodic vortex shedding of synovial fluid. The performance of these methods is compared through the analysis of computer-generated synovial fluid flow velocity distributions in the physiological range. Significant values for the k and s parameters are obtained by comparing these methods. Criteria such as the root mean square error (RMSE), the coefficient of determination ($R^2$), the maximum error between the cumulative distribution functions (CDFs), i.e., the Kolmogorov-Smirnov (K-S) statistic, and the chi-square test are used to compare the suitability of these methods. The results show that the maximum likelihood method performs well for most of the cases studied and is hence recommended.
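
The MLE route the authors recommend can be sketched with the classic fixed-point iteration for the shape parameter, tested here on synthetic data with known parameters (not the paper's synovial-flow velocities):

```python
import math, random

random.seed(0)

def weibull_mle(x, iters=200):
    """Maximum likelihood estimates of the Weibull shape k and scale s,
    via the standard one-dimensional fixed-point iteration for k."""
    n = len(x)
    mean_log = sum(math.log(v) for v in x) / n
    k = 1.0
    for _ in range(iters):
        xk = [v ** k for v in x]
        sxk = sum(xk)
        sxklog = sum(w * math.log(v) for w, v in zip(xk, x))
        k = 1.0 / (sxklog / sxk - mean_log)      # MLE stationarity condition for k
    s = (sum(v ** k for v in x) / n) ** (1.0 / k)
    return k, s

# Synthetic Weibull sample with known k=2.0, s=1.5, via inverse transform:
k_true, s_true = 2.0, 1.5
sample = [s_true * (-math.log(1.0 - random.random())) ** (1.0 / k_true)
          for _ in range(20000)]
k_hat, s_hat = weibull_mle(sample)
print(k_hat, s_hat)   # recovered estimates close to (2.0, 1.5)
```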

Multivariate design estimations under copulas constructions. Stage-1: Parametrical density constructions for defining flood marginals for the Kelantan River basin, Malaysia

  • Latif, Shahid;Mustafa, Firuza
    • Ocean Systems Engineering
    • /
    • v.9 no.3
    • /
    • pp.287-328
    • /
    • 2019
  • Comprehensive flood risk assessment via frequency analysis often demands multivariate designs under different notions of the return period. A flood is a tri-variate random event, which highlights the unreliability of univariate return periods and calls for a joint dependency structure that accounts for its multiple inter-correlated vectors, i.e., flood peak, volume, and duration. Selecting the most parsimonious probability functions for the univariate flood marginal distributions is a mandatory pre-processing step before establishing the joint dependency, especially under the copula methodology, which allows the practitioner to model univariate marginals separately from their joint construction. Parametric density approximation hypothesizes that the random samples follow some specific or predefined probability density function, and different candidates usually yield different estimates, especially in the tails of the distributions. The upper tail is of particular interest in flood modelling; since no evidence favours any fixed distribution, candidates are characterized through a trial-and-error procedure based on goodness-of-fit measures. Moreover, model performance evaluation and selection of the best-fitted distributions demand precise investigation by comparing relative sample-reproducing capabilities; otherwise, inconsistencies may introduce uncertainty. The strengths and weaknesses of different fitness statistics also vary in their ability to reveal gaps and discrepancies among fitted distributions. In this study, marginal distributions of the flood variables are selected by fitting a comprehensive set of parametric functions to event-based (or block annual maxima) samples drawn from 50 years of continuously distributed streamflow records for the Kelantan River basin at Guillemard Bridge, Malaysia. Model fitness is examined based on the degree of agreement between empirical and theoretical cumulative probabilities. Both analytical and graphical inspections are undertaken to provide decisive evidence in favour of the best-fitted probability densities.
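
The fitness criterion of maximum disagreement between empirical and theoretical cumulative probabilities (the Kolmogorov-Smirnov statistic) can be sketched as:

```python
import math, random

random.seed(3)

def ks_statistic(sample, cdf):
    """Kolmogorov-Smirnov statistic D: the maximum gap between the
    empirical CDF of `sample` and a candidate theoretical CDF."""
    xs = sorted(sample)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        f = cdf(x)
        # compare against the ECDF just before and just at each order statistic
        d = max(d, abs(f - i / n), abs(f - (i + 1) / n))
    return d

# Exponential sample scored against the CDF it was drawn from (good fit)
# and against a mis-specified alternative (poor fit):
sample = [-math.log(1.0 - random.random()) for _ in range(2000)]
good = ks_statistic(sample, lambda x: 1.0 - math.exp(-x))
bad = ks_statistic(sample, lambda x: 1.0 - math.exp(-2.0 * x))
print(good, bad)   # the correct marginal yields a much smaller D
```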

Economic Analysis of Insulation Wall Panel System using LCC Method (LCC기법을 활용한 단열외벽패널시스템의 경제성분석)

  • Kim, Min-Woo;Jeon, Kyu-Nam;Lee, Gun-Cheol;Cho, Byoung-Young;Han, Min-Cheol;Han, Cheon-Goo
    • Proceedings of the Korean Institute of Building Construction Conference
    • /
    • 2011.05a
    • /
    • pp.153-155
    • /
    • 2011
  • In this study, the insulation panel system with the best long-term economic feasibility from an LCC viewpoint is identified among candidate insulation panel construction methods for the outer walls of structures. In the deterministic LCC analysis, the initial investment cost is about 80,000 Won/㎡ for extrusion ceramic panels. Although the costs of maintenance, disassembly, and disposal show no large differences among the panel systems, metal panels are about 1.5 times more expensive than the others. In the probability density functions used to analyze the variation of probabilistic cost among the panel systems and their economic feasibility, metal panels show the highest cost distribution, while extrusion and stone panels show low cost distributions. In the cumulative distribution functions composed from these probability density functions, the extrusion ceramic panel shows the best economic feasibility and reliability and is the superior alternative among those considered in this study.

Extreme Value Analysis of Statistically Independent Stochastic Variables

  • Choi, Yongho;Yeon, Seong Mo;Kim, Hyunjoe;Lee, Dongyeon
    • Journal of Ocean Engineering and Technology
    • /
    • v.33 no.3
    • /
    • pp.222-228
    • /
    • 2019
  • An extreme value analysis (EVA) is essential to obtain design values for highly nonlinear variables such as long-term environmental data for wind and waves, and slamming or sloshing impact pressures. According to extreme value theory (EVT), the extreme value distribution is derived by multiplying the initial cumulative distribution functions of independent and identically distributed (IID) random variables. However, in the position mooring rules of DNVGL, the sampled global maxima of the mooring line tension are assumed to be IID stochastic variables without checking their independence. The ITTC Recommended Procedures and Guidelines for Sloshing Model Tests never address the independence of the sampled data. Hence, a design value estimated without an IID check can be under- or over-estimated, because observations far from a Weibull or generalized Pareto distribution (GPD) are treated as outliers. In this study, the sampled data are first checked for the IID property in an EVA. When the variables are not IID, an automatic resampling scheme is recommended, using the block maxima approach for a generalized extreme value (GEV) distribution and the peaks-over-threshold (POT) approach for a GPD. A partial autocorrelation function (PACF) is used to check for IID variables. In this study, only one 5 h sample of sloshing test results was used for a feasibility study of the IID resampling approach. Based on this study, resampling to IID variables may reduce the number of outliers, and a statistically more appropriate design value can be achieved with independent samples.
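
The effect the authors rely on, that block maxima of a dependent series are far closer to IID than the raw samples, can be illustrated with an AR(1) toy series (not sloshing data) and a lag-1 autocorrelation check, a simplified stand-in for the PACF used in the paper:

```python
import random

random.seed(42)

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation coefficient."""
    n = len(x)
    m = sum(x) / n
    num = sum((x[i] - m) * (x[i + 1] - m) for i in range(n - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

# AR(1) stand-in for a serially correlated response time series:
phi, series = 0.8, [0.0]
for _ in range(20000):
    series.append(phi * series[-1] + random.gauss(0.0, 1.0))

# Block maxima approach: keep only the maximum of each block
block = 100
maxima = [max(series[i:i + block]) for i in range(0, len(series), block)]

print(lag1_autocorr(series))   # near 0.8: raw samples are clearly dependent
print(lag1_autocorr(maxima))   # near 0: block maxima are nearly independent
```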