• Title/Summary/Keyword: probability sampling


PERFORMANCE EVALUATION VIA MONTE CARLO IMPORTANCE SAMPLING IN SINGLE USER DIGITAL COMMUNICATION SYSTEMS

  • Oh Man-Suk
    • Journal of the Korean Statistical Society / v.35 no.2 / pp.157-166 / 2006
  • This research proposes an efficient Monte Carlo algorithm for computing error probability in high-performance digital communication systems. It characterizes special features of the problem and suggests an importance sampling algorithm specially designed to handle them. It uses a shifted exponential density as the importance sampling density and shows an adaptive way of choosing the rate and the origin of that density. Instead of equal allocation, an intelligent allocation of the samples is proposed so that more samples are allocated to the more important parts of the error probability. The algorithm exploits the nested structure of the error space and avoids redundancy in estimating the probability. Applied to an example data set, the algorithm shows a great improvement in the accuracy of the error probability estimation.
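The shifted-exponential importance sampling idea above can be illustrated with a minimal sketch (this is not the paper's adaptive algorithm: the Gaussian target, the tail level t = 4, and the rate choice lam = t are assumptions chosen for the demo):

```python
import math, random

random.seed(0)

def normal_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def tail_prob_shifted_exp(t, lam, n):
    """Estimate P(Z > t) for Z ~ N(0,1) by importance sampling with a
    shifted exponential proposal g(x) = lam * exp(-lam * (x - t)), x > t."""
    total = 0.0
    for _ in range(n):
        x = t + random.expovariate(lam)   # draw from the shifted exponential
        # likelihood ratio f(x) / g(x); the indicator 1{x > t} is always 1
        total += normal_pdf(x) / (lam * math.exp(-lam * (x - t)))
    return total / n

# lam = t is a reasonable rate choice for a Gaussian tail at level t
est = tail_prob_shifted_exp(t=4.0, lam=4.0, n=20000)
```

Because the proposal is supported entirely on the error region, every draw contributes, and the estimator's variance is far below that of crude Monte Carlo at this tail level.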

A Study of Dependent Nonstationary Multiple Sampling Plans (종속적 비평형 다중표본 계획법의 연구)

  • 김원경
    • Journal of the Korea Society for Simulation / v.9 no.2 / pp.75-87 / 2000
  • In this paper, nonstationary multiple sampling plans, which are difficult to solve analytically when the sample data are dependent, are discussed. The initial solution is found by a sequential sampling plan using the sequential probability ratio test. The acceptance and rejection numbers at each step of the multiple sampling plan are found initially by grouping the sequential plan's solution. The optimal multiple sampling plans are then found by simulation. Four search methods are developed to find the optimum sampling plans satisfying the Type I and Type II error probabilities. The performance of the sampling plans is measured and their algorithms are also shown. To handle the nonstationary property of the dependent sampling plan, simulation is used to find the lot rejection and acceptance probability functions. As a numerical example, a Markov chain model is inspected, and the effects of the dependency factor and the search methods are analyzed by varying their parameters.

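The sequential probability ratio test used to seed the multiple sampling plan can be sketched as a generic Wald SPRT for a Bernoulli defect rate (the hypotheses and error levels below are illustrative assumptions, not the paper's settings):

```python
import math

def sprt_bernoulli(data, p0, p1, alpha, beta):
    """Wald's sequential probability ratio test of H0: p = p0 against
    H1: p = p1 for a stream of 0/1 inspection results.
    Returns (decision, number of samples inspected)."""
    log_a = math.log((1.0 - beta) / alpha)   # cross above: accept H1
    log_b = math.log(beta / (1.0 - alpha))   # cross below: accept H0
    llr = 0.0
    for i, x in enumerate(data, start=1):
        llr += math.log(p1 / p0) if x else math.log((1.0 - p1) / (1.0 - p0))
        if llr >= log_a:
            return 'accept H1', i
        if llr <= log_b:
            return 'accept H0', i
    return 'continue', len(data)

# three consecutive defectives already push the log-likelihood ratio
# past the upper boundary log((1 - beta) / alpha) = log 18
decision, n_used = sprt_bernoulli([1, 1, 1, 1, 1], p0=0.05, p1=0.20,
                                  alpha=0.05, beta=0.10)
```

Grouping such a sequential solution into fixed inspection stages is what yields the initial acceptance/rejection numbers of a multiple sampling plan.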

An importance sampling for a function of a multivariate random variable

  • Jae-Yeol Park;Hee-Geon Kang;Sunggon Kim
    • Communications for Statistical Applications and Methods / v.31 no.1 / pp.65-85 / 2024
  • The tail probability of a function of a multivariate random variable is not easy to estimate by crude Monte Carlo simulation. When the occurrence of function values over a threshold is rare, accurate estimation of the corresponding probability requires a huge number of samples. When the explicit form of the cumulative distribution function of each component of the variable is known, the inverse transform likelihood ratio method is a directly applicable scheme for estimating the tail probability efficiently. The method is a type of importance sampling, and its efficiency depends on the selection of the importance sampling distribution. When the cumulative distribution of the multivariate random variable is represented by a copula and its marginal distributions, we develop an iterative algorithm to find the optimal importance sampling distribution, and we show the convergence of the algorithm. The performance of the proposed scheme is compared numerically with crude Monte Carlo simulation.
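A minimal likelihood-ratio importance sampling sketch in the spirit of this setting (not the paper's copula-based algorithm: the sum function, the i.i.d. Exp(1) marginals, and the tilted rate are assumptions chosen so the densities are explicit):

```python
import math, random

random.seed(2)

def tail_prob_is(t, rate_is, n, dim=2):
    """Estimate P(X1 + ... + Xd > t) for i.i.d. Exp(1) components by
    likelihood-ratio importance sampling: each component is drawn from
    Exp(rate_is) with rate_is < 1 (heavier tail) and reweighted."""
    total = 0.0
    for _ in range(n):
        s = sum(random.expovariate(rate_is) for _ in range(dim))
        if s > t:
            # likelihood ratio of the d-dimensional sample:
            # prod_i exp(-x_i) / (rate * exp(-rate * x_i))
            total += math.exp(-(1.0 - rate_is) * s) / rate_is ** dim
    return total / n

# rate_is = dim / t is the standard exponential tilt for this event
est = tail_prob_is(t=20.0, rate_is=0.1, n=50000)
```

Crude Monte Carlo would need on the order of 10^8 samples to see this event at all; under the tilted proposal roughly half the draws land in the rare region.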

An Optimal Scheme of Inclusion Probability Proportional to Size Sampling

  • Kim Sun Woong
    • Communications for Statistical Applications and Methods / v.12 no.1 / pp.181-189 / 2005
  • This paper suggests a method of inclusion probability proportional to size sampling that provides a non-negative and stable variance estimator. The sampling procedure is quite simple and flexible, since a sampling design is easily obtained using mathematical programming. This scheme appears preferable to the method of Nigam, Kumar and Gupta (1984), which uses balanced incomplete block designs. A comparison is made with their method through an example from the literature.
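For context, the first-order inclusion probabilities that any πps design must reproduce can be computed as below (a standard calculation with capping at 1; this is not the paper's mathematical-programming construction of the design itself):

```python
def pps_inclusion_probs(sizes, n):
    """First-order inclusion probabilities proportional to size:
    pi_i = n * x_i / sum(x), with units whose ratio reaches 1 included
    with certainty and the rest rescaled, iterating until stable."""
    certain = set()
    while True:
        rest = [i for i in range(len(sizes)) if i not in certain]
        total = sum(sizes[i] for i in rest)
        m = n - len(certain)                  # draws left for the rest
        over = [i for i in rest if m * sizes[i] / total >= 1.0]
        if not over:
            break
        certain.update(over)
    pis = [0.0] * len(sizes)
    for i in certain:
        pis[i] = 1.0
    for i in rest:
        pis[i] = m * sizes[i] / total
    return pis

# one dominant unit becomes certain; the others share the remaining draw
pis = pps_inclusion_probs([50, 10, 10, 10, 10, 10], n=2)
```

The probabilities sum to the sample size n, which is the constraint a πps sampling design must satisfy.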

Bootstrap Confidence Intervals for a One Parameter Model using Multinomial Sampling

  • Jeong, Hyeong-Chul;Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society / v.10 no.2 / pp.465-472 / 1999
  • We consider a bootstrap method for constructing confidence intervals for a one-parameter model using multinomial sampling. The convergence rates of the proposed bootstrap method are calculated for model-based maximum likelihood estimators (MLE) under multinomial sampling. Monte Carlo simulation is used to compare the performance of the bootstrap methods with normal approximations in terms of the average coverage probability criterion.

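The bootstrap construction can be sketched for a single multinomial cell probability (a plain percentile interval; the counts and the number of resamples below are illustrative assumptions, not the paper's model):

```python
import random

random.seed(3)

def percentile_bootstrap_ci(counts, category=0, n_boot=2000, level=0.95):
    """Percentile bootstrap CI for one cell probability of a multinomial
    sample: resample the n observations with replacement, re-estimate the
    proportion each time, and take empirical quantiles of the estimates."""
    n = sum(counts)
    # expand the count vector into individual category labels
    obs = [k for k, c in enumerate(counts) for _ in range(c)]
    boots = []
    for _ in range(n_boot):
        resample = [random.choice(obs) for _ in range(n)]
        boots.append(resample.count(category) / n)
    boots.sort()
    lo = boots[int((1 - level) / 2 * n_boot)]
    hi = boots[int((1 + level) / 2 * n_boot) - 1]
    return lo, hi

lo, hi = percentile_bootstrap_ci([30, 50, 20])   # MLE of the first cell is 0.30
```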

Estimation of Overflow Probabilities in Parallel Networks with Coupled Inputs

  • Lee, Jiyeon;Kweon, Min Hee
    • Communications for Statistical Applications and Methods / v.8 no.1 / pp.257-269 / 2001
  • Simulation is used to estimate an overflow probability in a stable parallel network with coupled inputs. Since crude simulation requires an extremely large number of trials to estimate such a small probability, a fast simulation is proposed instead. Using Cramér's theorem, we first obtain an optimally changed measure under which the variance of the estimator is minimized. We then use it to derive an importance sampling estimator of the overflow probability, which enables the fast simulation.

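The change of measure suggested by Cramér's theorem can be illustrated on a toy overflow event (i.i.d. Bernoulli arrivals rather than the paper's coupled parallel network; the tilt is chosen so the tilted mean sits exactly at the overflow level):

```python
import math, random

random.seed(7)

def overflow_prob_tilted(n, p, a, n_rep):
    """Estimate P(X1 + ... + Xn >= n*a) for i.i.d. Bernoulli(p), a > p,
    by sampling under the exponentially tilted measure of Cramer's
    theorem: the tilt theta is chosen so the tilted mean equals a."""
    theta = math.log(a * (1.0 - p) / (p * (1.0 - a)))  # tilted mean = a
    m = 1.0 - p + p * math.exp(theta)                  # mgf M(theta)
    p_tilt = p * math.exp(theta) / m                   # equals a
    total = 0.0
    for _ in range(n_rep):
        s = sum(1 for _ in range(n) if random.random() < p_tilt)
        if s >= n * a:
            # likelihood ratio of the path: M(theta)^n * exp(-theta * s)
            total += m ** n * math.exp(-theta * s)
    return total / n_rep

est = overflow_prob_tilted(n=100, p=0.1, a=0.3, n_rep=20000)
```

Under the tilted measure the overflow event occurs on roughly half the trials, so far fewer replications are needed than under the original measure, where the event has probability on the order of 10^-8.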

Optimal Latinized partially stratified sampling for structural reliability analysis

  • Majid Ilchi Ghazaan;Amirreza Davoodi Yekta
    • Structural Engineering and Mechanics / v.92 no.1 / pp.111-120 / 2024
  • Sampling methods are powerful approaches to solving structural reliability analysis problems and estimating the failure probability of structures. In this paper, a new sampling method is proposed that offers lower variance and lower computational cost for complex and high-dimensional problems. The method is called Optimal Latinized partially stratified sampling (OLPSS), as it is based on Latinized Partially Stratified Sampling (LPSS), which itself merges the Stratified Sampling (SS) and Latin Hypercube Sampling (LHS) algorithms. While LPSS has low variance, its generated samples may in some cases lack good space-filling. In OLPSS, this issue is resolved by employing a new columnwise-pairwise exchange optimization procedure for sample generation. The efficiency of OLPSS has been tested on several benchmark mathematical functions and structural examples, including structures with a large number of variables (e.g., a structure with 67 variables). The proposed method provides highly accurate estimates of the failure probability of structures with significantly lower variance than Monte Carlo simulation, Latin hypercube sampling, and standard LPSS.
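Since OLPSS builds on LHS, a minimal LHS generator is sketched below (the standard construction only, not the OLPSS columnwise-pairwise exchange optimization): each axis is split into n equal-probability strata and the strata are paired across dimensions by random permutations.

```python
import random

random.seed(4)

def latin_hypercube(n, dim):
    """Generate n Latin hypercube points in [0,1)^dim: each axis is cut
    into n equal-probability strata, exactly one point lands in every
    stratum, and strata are paired across axes by random permutations."""
    pts = [[0.0] * dim for _ in range(n)]
    for d in range(dim):
        perm = list(range(n))
        random.shuffle(perm)                     # random stratum pairing
        for i in range(n):
            # one uniform draw inside stratum perm[i], width 1/n
            pts[i][d] = (perm[i] + random.random()) / n
    return pts

pts = latin_hypercube(10, 2)
```

The columnwise-pairwise idea is then to swap entries within a column's permutation and keep swaps that improve a space-filling criterion.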

A study on unequal probability sampling over two successive occasions in time series (시계열 계속 표본조사에서 불균등확률 추출법 연구)

  • 박홍래;이계오
    • The Korean Journal of Applied Statistics / v.6 no.1 / pp.145-162 / 1993
  • We review sampling schemes on successive occasions with partial replacement of units and propose a Rao-Hartley-Cochran (RHC) type sampling scheme over two successive occasions, with probability proportional to the observations on the previous occasion. To compare the reviewed and proposed sampling schemes, the optimal estimator of the population mean on the second occasion and its variance are derived. The relative efficiency of the proposed sampling scheme is compared with other equal and unequal probability sampling schemes through theoretical and numerical simulation studies. For the simulation study, three artificial populations are generated by a time series model. It is observed that the RHC-type sampling scheme generally has small variance and deviation.

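The RHC selection step referenced above can be sketched as follows (the random-grouping-then-PPS-draw construction only; the size measures are made-up numbers, and the successive-occasion estimator is not included):

```python
import random

random.seed(5)

def rhc_sample(sizes, n):
    """Rao-Hartley-Cochran selection: randomly split the population into
    n groups of near-equal size, then draw one unit from each group with
    probability proportional to its size measure."""
    units = list(range(len(sizes)))
    random.shuffle(units)
    k, r = divmod(len(units), n)
    groups, start = [], 0
    for g in range(n):
        width = k + (1 if g < r else 0)      # first r groups get one extra
        groups.append(units[start:start + width])
        start += width
    chosen = [random.choices(grp, weights=[sizes[u] for u in grp], k=1)[0]
              for grp in groups]
    return groups, chosen

# hypothetical size measures (previous-occasion observations) for 10 units
sizes = [5, 1, 3, 7, 2, 4, 6, 2, 8, 2]
groups, sample = rhc_sample(sizes, n=3)
```

The appeal of the RHC scheme is that it always admits a simple non-negative variance estimator, which is why it is a natural base for successive-occasion designs.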

Supervised Classification Using Training Parameters and Prior Probability Generated from VITD - The Case of QuickBird Multispectral Imagery

  • Eo, Yang-Dam;Lee, Gyeong-Wook;Park, Doo-Youl;Park, Wang-Yong;Lee, Chang-No
    • Korean Journal of Remote Sensing / v.24 no.5 / pp.517-524 / 2008
  • In order to classify a satellite image into geospatial features of interest, a supervised classifier must be trained to distinguish these features through training sampling. However, even when an image is classified, different classification results can be generated depending on the operator's experience and expertise in the training process. Users who apply classification results in practice need consistent results as well as improved accuracy. The experiment produced classification results using VITD polygons as prior probabilities and training parameters instead of manual sampling. Classification using VITD polygons as prior probabilities showed the highest accuracy among the several methods tested. Training based on unsupervised classification with VITD produced results similar to manual training, with or without prior probabilities.
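Using external prior probabilities in supervised classification amounts to Bayes-rule scoring, which can be sketched as below (a one-band Gaussian toy with hypothetical class statistics and priors; the actual study uses multispectral QuickBird data and VITD polygons):

```python
import math

def classify_with_prior(x, class_stats, priors):
    """Score each class by prior * Gaussian likelihood of the pixel value
    x and return the highest-scoring label (Bayes-rule classification)."""
    best, best_score = None, -math.inf
    for label, (mu, sigma) in class_stats.items():
        lik = (math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))
               / (sigma * math.sqrt(2.0 * math.pi)))
        score = priors[label] * lik
        if score > best_score:
            best, best_score = label, score
    return best

# hypothetical single-band class statistics (mean, std) and priors
stats = {'water': (30.0, 5.0), 'forest': (80.0, 10.0), 'urban': (150.0, 20.0)}
priors = {'water': 0.2, 'forest': 0.5, 'urban': 0.3}
label = classify_with_prior(35.0, stats, priors)
```

Supplying the priors and class statistics from an external source such as VITD polygons, rather than from operator-drawn training samples, is what makes the result reproducible across operators.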

Weighted Latin Hypercube Sampling to Estimate Clearance-to-stop for Probabilistic Design of Seismically Isolated Structures in Nuclear Power Plants

  • Han, Minsoo;Hong, Kee-Jeung;Cho, Sung-Gook
    • Journal of the Earthquake Engineering Society of Korea / v.22 no.2 / pp.63-75 / 2018
  • This paper proposes an extension of Latin Hypercube Sampling (LHS) that removes the necessity of using intervals with the same probability area, allowing intervals with different probability areas instead. The method is called Weighted Latin Hypercube Sampling (WLHS). This paper describes the equations and the detailed procedure necessary to apply a weight function in WLHS. WLHS is verified through numerical examples by comparing the estimated distribution parameters with those from other methods such as random sampling and LHS. WLHS provides a more flexible way of selecting samples than LHS, and the accuracy of its estimates of the distribution parameters depends on the selection of the weight function. The proposed WLHS is applied to seismically isolated structures in nuclear power plants. In this application, clearance-to-stops (CSs) calculated using the LHS proposed by Huang et al. [1] and the WLHS proposed in this paper, respectively, are compared to investigate the effect of the choice of sampling technique.
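The core WLHS idea of intervals with unequal probability areas and compensating weights can be illustrated in one dimension (a stratified estimator with one draw per interval; this is a simplification for intuition, not the paper's multi-dimensional WLHS with its weight-function machinery):

```python
import math, random

random.seed(6)

def weighted_stratified_mean(cuts, g, inv_cdf):
    """Estimate E[g(X)] with one draw per probability interval: the unit
    interval is cut at the given points (unequal areas allowed), a uniform
    is drawn inside each piece, mapped through the inverse CDF, and the
    result is weighted by the piece's probability area."""
    edges = [0.0] + list(cuts) + [1.0]
    est = 0.0
    for a, b in zip(edges, edges[1:]):
        # one draw in this stratum; guard against rounding u up to 1.0
        u = min(random.uniform(a, b), 1.0 - 1e-12)
        est += (b - a) * g(inv_cdf(u))      # weight = probability area
    return est

# X ~ Exp(1): inverse CDF is -log(1 - u), and E[X] = 1
cuts = [0.1, 0.25, 0.5, 0.7, 0.85, 0.95, 0.99]   # unequal probability areas
inv = lambda u: -math.log(1.0 - u)
est = sum(weighted_stratified_mean(cuts, lambda x: x, inv)
          for _ in range(200)) / 200
```

Placing narrow intervals in the tail, as above, concentrates samples where the response matters while the area weights keep the estimator unbiased.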