• Title/Summary/Keyword: Random Sampling


Improvement of ASIFT for Object Matching Based on Optimized Random Sampling

  • Phan, Dung;Kim, Soo Hyung;Na, In Seop
    • International Journal of Contents
    • /
    • v.9 no.2
    • /
    • pp.1-7
    • /
    • 2013
  • This paper proposes an efficient matching algorithm based on ASIFT (Affine Scale-Invariant Feature Transform), which is fully invariant to affine transformation. In our approach, we propose a method that reduces the similarity-measure matching cost and the number of outliers. First, we replace the Euclidean metric with a linear combination of the Manhattan and Chessboard metrics for measuring the similarity of keypoints. These two metrics are simple yet efficient. Using our method, the computation time of the matching step is reduced and the number of correct matches is increased. By applying an Optimized Random Sampling Algorithm (ORSA), we remove most of the outlier matches to make the result meaningful. The method was tested on various combinations of affine transforms. The experimental results show that our method is superior to SIFT and ASIFT.
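
The core change described above is the similarity metric: a linear combination of the Manhattan (L1) and Chessboard (Chebyshev) distances in place of the Euclidean distance, followed by ORSA outlier removal. Below is a minimal sketch of such a combined metric for SIFT/ASIFT-style descriptors; the weight `alpha` and the ratio test are illustrative assumptions, and the ORSA step is not reproduced.

```python
import numpy as np

def combined_distance(d1, d2, alpha=0.5):
    """Linear combination of Manhattan (L1) and Chessboard (L-infinity) distances.

    d1, d2 : 1-D descriptor vectors (e.g. 128-D SIFT/ASIFT descriptors).
    alpha  : mixing weight (assumed value; the paper's weighting is not given here).
    """
    diff = np.abs(np.asarray(d1, dtype=float) - np.asarray(d2, dtype=float))
    return alpha * diff.sum() + (1.0 - alpha) * diff.max()

def match_keypoints(desc_a, desc_b, alpha=0.5, ratio=0.8):
    """Brute-force matching with a Lowe-style ratio test (assumed, for illustration).

    Requires at least two candidate descriptors in desc_b.
    """
    matches = []
    for i, da in enumerate(desc_a):
        dists = np.array([combined_distance(da, db, alpha) for db in desc_b])
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:   # keep only unambiguous matches
            matches.append((i, best))
    return matches
```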

Adaptive Random Packet Sampling for Traffic Load Measurement (트래픽 부하측정을 위한 적응성 있는 랜덤 패킷 샘플링 기법)

  • ;;Zhi-Li Zhang
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.11B
    • /
    • pp.1038-1049
    • /
    • 2003
  • Measuring traffic load exactly is the basis of efficient traffic engineering. However, precise traffic measurement involves inspecting every packet traversing a link, resulting in significant overhead on routers with high-speed links. Sampling techniques have been proposed as an alternative way to reduce the measurement overhead. But since sampling inevitably introduces error, there should be a way to control, or at least limit, that error for traffic engineering applications to work correctly. In this paper, we address the problem of bounding the sampling error within a pre-specified tolerance level. We derive a relationship between the number of samples, the accuracy of estimation, and the squared coefficient of variation of the packet size distribution. Based on this relationship, we propose an adaptive random sampling technique that determines the minimum sampling probability adaptively according to traffic dynamics. Using real network traffic traces, we show that the proposed adaptive random sampling technique indeed produces the desired accuracy, while also yielding a significant reduction in the amount of sampled traffic.
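
The abstract's key quantitative relationship ties the required number of samples to the target accuracy and to the squared coefficient of variation (SCV) of the packet sizes. A normal-approximation sketch of that kind of bound is given below; the symbols (epsilon, eta, and the predicted packet count of the next block) are assumptions of this sketch, not notation taken from the paper.

```latex
% Sketch: required sample count n for relative error <= \epsilon with
% probability 1-\eta, given packet sizes S; p is the per-packet sampling
% probability for a block expected to carry \hat{m} packets.
\[
  n \;\ge\; \left(\frac{z_{1-\eta/2}}{\epsilon}\right)^{\!2}
            \frac{\operatorname{Var}(S)}{\bigl(\mathbb{E}[S]\bigr)^{2}} ,
  \qquad
  p \;=\; \min\!\left(1,\ \frac{n}{\hat{m}}\right).
\]
```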

Random number generation by use of de Bruijn sequence

  • Harada, Hiroshi;Kashiwagi, Hiroshi;Oguri, Kazuo
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1988.10b
    • /
    • pp.1033-1036
    • /
    • 1988
  • This paper proposes a new method for generating uniform random numbers using binary random sequences. These binary sequences are obtained from a de Bruijn sequence by a random sampling method. Several statistical tests are carried out on the random numbers generated by the proposed method, and it is shown that they have good randomness properties.
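
A minimal sketch of the idea above: build a binary de Bruijn sequence with the standard Lyndon-word construction, then read fixed-length windows at randomly sampled cyclic offsets and scale them to [0, 1). The window length and the use of Python's `random` module for the sampling positions are illustrative assumptions, not the authors' exact procedure.

```python
import random

def de_bruijn(k, n):
    """k-ary de Bruijn sequence B(k, n) via the standard Lyndon-word construction."""
    a = [0] * (k * n)
    seq = []
    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)
    db(1, 1)
    return seq

def sampled_uniforms(seq, word_bits=16, count=5, rng=random):
    """Read `word_bits` consecutive bits at random cyclic offsets and scale to [0, 1)."""
    out = []
    for _ in range(count):
        start = rng.randrange(len(seq))
        bits = [seq[(start + i) % len(seq)] for i in range(word_bits)]
        out.append(int("".join(map(str, bits)), 2) / 2 ** word_bits)
    return out

seq = de_bruijn(2, 16)          # binary de Bruijn sequence of order 16
print(sampled_uniforms(seq, word_bits=16, count=5))
```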

A Stratified Multi-proportions Randomized Response Model (층화 다지 확률화응답모형)

  • Lee, Gi-Sung;Park, Kyung-Soon
    • The Korean Journal of Applied Statistics
    • /
    • v.28 no.6
    • /
    • pp.1113-1120
    • /
    • 2015
  • We propose a multi-proportion randomized response model under stratified simple random sampling for surveys of sensitive issues in a polychotomous population composed of several strata. We also establish the theoretical validity of applying multi-proportion randomized response models (the Abul-Ela et al. model and the Eriksson model) to stratified simple random sampling, and derive the estimator of the proportion of the sensitive characteristic of the population and its dispersion matrix using the suggested model. Two types of sample allocation (proportional allocation and optimum allocation) are considered under fixed cost. In terms of efficiency, the Eriksson model under stratified sampling is compared to the Abul-Ela et al. model.
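
For orientation, the general stratified structure behind such estimators is sketched below for L strata with weights W_h = N_h/N; this is the textbook form, not the specific Abul-Ela et al. or Eriksson within-stratum estimators, whose forms and dispersion matrices are derived in the paper.

```latex
% General stratified form (sketch); \hat{\pi}_h is the within-stratum
% randomized-response estimator of the sensitive-proportion vector.
\[
  \hat{\pi} \;=\; \sum_{h=1}^{L} W_h\,\hat{\pi}_h ,
  \qquad
  \widehat{V}(\hat{\pi}) \;=\; \sum_{h=1}^{L} W_h^{2}\,\widehat{V}(\hat{\pi}_h),
  \qquad
  n_h^{\text{prop}} \;=\; n\,W_h \quad\text{(proportional allocation).}
\]
```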

Probabilistic Evaluation of Voltage Quality on Distribution System Containing Distributed Generation and Electric Vehicle Charging Load

  • CHEN, Wei;YAN, Hongqiang;PEI, Xiping
    • Journal of Electrical Engineering and Technology
    • /
    • v.12 no.5
    • /
    • pp.1743-1753
    • /
    • 2017
  • Since there are multiple random variables in the probabilistic load flow (PLF) calculation of a distribution system containing distributed generation (DG) and electric vehicle charging load (EVCL), a Monte Carlo method based on a composite sampling method is put forward, building on the existing simple random sampling Monte Carlo simulation method (SRS-MCSM), to perform a probabilistic assessment of the voltage quality of a distribution system containing DG and EVCL. This method considers not only the randomness of wind speed and light intensity and the uncertainty of the base load and EVCL, but also other stochastic disturbances such as the failure rate of the transmission line. According to the different characteristics of the random factors, different sampling methods are applied. Simulation results on the IEEE 9-bus and IEEE 34-bus systems demonstrate the validity, accuracy, speed, and practicability of the proposed method. In contrast to the SRS-MCSM, the proposed method has higher computational efficiency and better simulation accuracy. The variation of nodal voltages in the distribution system before and after connecting DG and EVCL is compared and analyzed, especially the voltage fluctuation at the grid connection point of the DG and EVCL.
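
The point of the composite sampling scheme above is that each random factor is drawn from a model matched to its character inside the Monte Carlo loop. The sketch below shows only that pattern; the distribution parameters and the `solve_load_flow` routine are placeholders, not the paper's wind, irradiance, load, or network models.

```python
import numpy as np

rng = np.random.default_rng(0)

def solve_load_flow(wind_speed, irradiance, load_mw, line_up):
    """Placeholder for a real power-flow solver; returns a fake per-unit bus voltage."""
    injection = 0.02 * wind_speed + 0.5 * irradiance - 0.001 * load_mw
    return 1.0 + injection - (0.0 if line_up else 0.05)

N = 10_000
voltages = np.empty(N)
for i in range(N):
    wind = rng.weibull(2.0) * 8.0          # wind speed: Weibull (assumed parameters)
    irr = rng.beta(2.0, 2.0)               # light intensity: Beta (assumed parameters)
    load = rng.normal(50.0, 5.0)           # base load + EVCL: normal (assumed)
    line = rng.random() > 0.01             # line availability: Bernoulli outage rate (assumed)
    voltages[i] = solve_load_flow(wind, irr, load, line)

print("P(0.95 <= V <= 1.05) =", np.mean((voltages >= 0.95) & (voltages <= 1.05)))
```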

Identification of the Most Conservative Condition for the Safety Analysis of a Nuclear Power Plant by Use of Random Sampling (무작위 추출 방법을 이용한 원자력발전소 보수적 안전해석 조건 결정)

  • Jeong, Hae-Yong
    • Journal of the Korean Society of Safety
    • /
    • v.30 no.5
    • /
    • pp.131-137
    • /
    • 2015
  • For the evaluation of the safety margin of a nuclear power plant using a conservative methodology, the influence of applied assumptions such as initial conditions and boundary conditions needs to be assessed deliberately. Usually, a combination of the most conservative initial conditions is determined, and the safety margin for the transient is evaluated through the analysis of these conservative conditions. In existing conservative methodologies, the most conservative condition is searched for through analyses of the maximum, minimum, and nominal values of the major parameters. In the present study, we investigate a new approach that can be applied to choose the most conservative initial condition effectively when a best-estimate computer code and a conservative evaluation methodology are used for the evaluation of the safety margin of transients. By constituting a band of various initial conditions through random sampling of the input parameters, sensitivity studies for the various parameters are performed systematically. A method of sampling the values of control or operating parameters over a certain range is adopted using the MOSAIQUE program, which minimizes the effort needed to achieve a steady state for many different conditions. Representative control parameters are identified that govern the reactor coolant flow rate, pressurizer pressure, pressurizer level, and steam generator level, respectively. It is shown that an appropriate distribution of each input parameter is obtained by adjusting the range and distribution of the corresponding control parameter.
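
A minimal sketch of the sampling idea described above: draw control/operating parameters within assumed bands, evaluate each sampled case with a transient model, and keep the case with the smallest margin as the most conservative condition. The parameter names, bands, and `run_transient` function are hypothetical placeholders; the study itself drives a best-estimate code through MOSAIQUE.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical operating bands (illustrative values only, not plant data).
bands = {
    "coolant_flow_frac": (0.95, 1.05),   # fraction of nominal RCS flow
    "przr_pressure_mpa": (15.2, 15.8),
    "przr_level_frac":   (0.45, 0.60),
    "sg_level_frac":     (0.40, 0.55),
}

def run_transient(case):
    """Placeholder for a best-estimate transient calculation; returns a fake safety margin."""
    return (case["coolant_flow_frac"] - 0.9) * (case["przr_pressure_mpa"] / 15.5) \
           * (1.0 + case["sg_level_frac"] - case["przr_level_frac"])

cases = [
    {name: rng.uniform(lo, hi) for name, (lo, hi) in bands.items()}
    for _ in range(200)                   # randomly sampled initial conditions
]
margins = [run_transient(c) for c in cases]
worst = cases[int(np.argmin(margins))]    # most conservative (smallest-margin) case
print("most conservative sampled condition:", worst)
```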

Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

  • Wang, Zhi-Yong;Kang, Dae-Ki
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.11 no.4
    • /
    • pp.37-42
    • /
    • 2019
  • In this paper, we explore the details of three classic data augmentation methods and two generative-model-based oversampling methods. The three classic data augmentation methods are random sampling (RANDOM), the Synthetic Minority Over-sampling Technique (SMOTE), and Adaptive Synthetic Sampling (ADASYN). The two generative-model-based oversampling methods are the Conditional Generative Adversarial Network (CGAN) and the Wasserstein Generative Adversarial Network (WGAN). In imbalanced data, the instances are divided into a majority class and a minority class, where the majority class occupies most of the instances in the training set and the minority class includes only a few instances. Generative models have an advantage when they are used to generate more plausible samples that follow the distribution of the minority class. We also adopt CGAN to compare its data augmentation performance with the other methods. The experimental results show that the WGAN-based oversampling technique is more stable than the other approaches (RANDOM, SMOTE, ADASYN, and CGAN) even with very limited training data. However, when the imbalance ratio is too small, the generative-model-based approaches cannot achieve more satisfying performance than the conventional data augmentation techniques. These results suggest one of our future research directions.
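
The RANDOM baseline above is the simplest of the five techniques; the sketch below implements it as random oversampling of the minority class with replacement (an interpretation of "random sampling (RANDOM)" in this oversampling setting; the function and toy data are illustrative only).

```python
import numpy as np

def random_oversample(X, y, rng=None):
    """Duplicate minority-class samples (with replacement) until all classes match the majority size."""
    rng = np.random.default_rng(rng)
    X, y = np.asarray(X), np.asarray(y)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    X_parts, y_parts = [X], [y]
    for cls, cnt in zip(classes, counts):
        if cnt == target:
            continue
        idx = np.flatnonzero(y == cls)
        extra = rng.choice(idx, size=target - cnt, replace=True)
        X_parts.append(X[extra])
        y_parts.append(y[extra])
    return np.concatenate(X_parts), np.concatenate(y_parts)

# Toy usage: 95 majority vs 5 minority samples become 95 vs 95 after oversampling.
X = np.random.randn(100, 4)
y = np.array([0] * 95 + [1] * 5)
X_bal, y_bal = random_oversample(X, y, rng=0)
print(np.bincount(y_bal))
```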

A Comparison of Ensemble Methods Combining Resampling Techniques for Class Imbalanced Data (데이터 전처리와 앙상블 기법을 통한 불균형 데이터의 분류모형 비교 연구)

  • Lee, Hee-Jae;Lee, Sungim
    • The Korean Journal of Applied Statistics
    • /
    • v.27 no.3
    • /
    • pp.357-371
    • /
    • 2014
  • There are many studies related to imbalanced data, in which the class distribution is highly skewed. To address the problem of imbalanced data, previous studies employ resampling techniques that correct the skewness of the class distribution in each sampled subset by using under-sampling, over-sampling, or hybrid-sampling such as SMOTE. Ensemble methods have also been used to alleviate the problem of class-imbalanced data. In this paper, we compare around a dozen algorithms that combine ensemble methods and resampling techniques, based on simulated data sets generated by the Backbone model, which can handle different imbalance rates. Results on various real imbalanced data sets are also presented to compare the effectiveness of the algorithms. As a result, we highly recommend resampling techniques combined with ensemble methods for imbalanced data in which the proportion of the minority class is less than 10%. We also find that each ensemble method has a well-matched sampling technique. Algorithms that combine bagging or random forest ensembles with random undersampling tend to perform well; the boosting ensemble, however, appears to perform better with over-sampling. All ensemble methods combined with SMOTE perform well in most situations.
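
One of the well-matched pairings reported above is a bagging-style ensemble trained on randomly undersampled data ("underbagging"). Below is a minimal sketch, assuming scikit-learn is available and binary labels coded 0 (majority) / 1 (minority); it is illustrative, not the authors' experimental code.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class UnderBagging:
    """Bagging where each base tree sees a balanced, randomly undersampled bootstrap."""

    def __init__(self, n_estimators=25, random_state=0):
        self.n_estimators = n_estimators
        self.rng = np.random.default_rng(random_state)
        self.trees = []

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        minority = np.flatnonzero(y == 1)   # assumes class 1 is the (smaller) minority
        majority = np.flatnonzero(y == 0)
        for _ in range(self.n_estimators):
            # Undersample the majority class to the minority size, then bootstrap the balanced pool.
            maj = self.rng.choice(majority, size=minority.size, replace=False)
            idx = self.rng.choice(np.concatenate([minority, maj]),
                                  size=2 * minority.size, replace=True)
            self.trees.append(DecisionTreeClassifier().fit(X[idx], y[idx]))
        return self

    def predict_proba_pos(self, X):
        """Averaged minority-class probability over all trees (assumes both classes in each bootstrap)."""
        return np.mean([t.predict_proba(X)[:, 1] for t in self.trees], axis=0)
```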

A Case Study on the Target Sampling Inspection for Improving Outgoing Quality (타겟 샘플링 검사를 통한 출하품질 향상에 관한 사례 연구)

  • Kim, Junse;Lee, Changki;Kim, Kyungnam;Kim, Changwoo;Song, Hyemi;Ahn, Seoungsu;Oh, Jaewon;Jo, Hyunsang;Han, Sangseop
    • Journal of Korean Society for Quality Management
    • /
    • v.49 no.3
    • /
    • pp.421-431
    • /
    • 2021
  • Purpose: To improve outgoing quality, this study presents a novel sampling framework based on predictive analytics. Methods: The proposed framework is composed of three steps. The first step is variable selection: knowledge-based and data-driven approaches are employed to select important variables. The second step is model learning: in this step, we consider supervised classification methods, anomaly detection methods, and rule-based methods. The third step is applying the model; it includes all the processes needed for real-time prediction. Each prediction model classifies a product as either a target sample or a random sample, and intensive quality inspections are then executed on the designated target samples. Results: The inspection data of three Samsung products (mobile, TV, refrigerator) are used to check for functional defects with the proposed method. The results demonstrate that target sampling is more effective and efficient than random sampling. Conclusion: The results of this paper show that the proposed method can efficiently detect products in a lot that are likely to exhibit user-perceived defects. Additionally, our study can guide practitioners on how to easily detect defective products using stratified sampling.
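
A minimal sketch of the "applying the model" step: a classifier trained on historical inspection records scores each unit in an outgoing lot, the highest-risk units become target samples, and a few additional units are drawn at random from the remainder. The logistic-regression model, the synthetic history, and the cut-off sizes are assumptions for illustration, not the authors' production models.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# Hypothetical historical inspection data: six features per unit plus a defect flag.
X_hist = rng.normal(size=(2000, 6))
y_hist = (X_hist[:, 0] + 0.5 * X_hist[:, 1] + rng.normal(size=2000) > 2.0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X_hist, y_hist)

# Score a new outgoing lot and split it into target samples and random samples.
X_lot = rng.normal(size=(300, 6))
risk = model.predict_proba(X_lot)[:, 1]

n_target, n_random = 20, 10
target_idx = np.argsort(risk)[::-1][:n_target]               # highest predicted defect risk
rest = np.setdiff1d(np.arange(len(X_lot)), target_idx)
random_idx = rng.choice(rest, size=n_random, replace=False)   # plus a small random sample

print("target samples:", sorted(target_idx.tolist()))
print("random samples:", sorted(random_idx.tolist()))
```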

Reliability Analysis for Structure Design of Automatic Ocean Salt Collector Using Sampling Method of Monte Carlo Simulation

  • Song, Chang Yong
    • Journal of Ocean Engineering and Technology
    • /
    • v.34 no.5
    • /
    • pp.316-324
    • /
    • 2020
  • This paper presents comparative studies of reliability analysis and meta-modeling using sampling methods for Monte Carlo simulation in the structure design of an automatic ocean salt collector (AOSC). The thickness sizing variables of the structural members are treated as random variables. Probabilistic performance functions are selected from the strength performances evaluated via finite element analysis of the AOSC. The sampling methods used in the comparative studies are simple random sampling and Sobol sequences with varied numbers of samples. An approximation method, the Kriging model, is applied to the meta-model generation. Reliability measures such as the probability of failure and its distribution are compared as the sampling method of the Monte Carlo simulation is varied. The meta-modeling accuracy is evaluated for Kriging models generated from the Monte Carlo simulation and Sobol sequence results. It is found that the Sobol sequence method is applicable not only to reliability analysis for the structural design of marine equipment such as the AOSC, but also to Kriging meta-modeling, owing to its high numerical efficiency.
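
To make the comparison concrete, the sketch below estimates a probability of failure for a toy limit-state function with (a) simple random sampling and (b) a scrambled Sobol sequence mapped through the inverse normal CDF (via scipy.stats.qmc). The limit-state function and the thickness distributions are illustrative stand-ins for the AOSC finite-element responses, not values from the paper.

```python
import numpy as np
from scipy.stats import norm, qmc

def limit_state(t):
    """Toy limit state: g(t) < 0 means failure; t are member-thickness random variables (mm)."""
    return 4.0 * t[:, 0] + 3.0 * t[:, 1] - 16.5

mean, std, n = np.array([3.0, 3.0]), np.array([0.3, 0.3]), 2 ** 14

# (a) Simple random sampling (crude Monte Carlo).
rng = np.random.default_rng(0)
t_srs = rng.normal(mean, std, size=(n, 2))

# (b) Scrambled Sobol sequence transformed to the same normal marginals.
u = qmc.Sobol(d=2, scramble=True, seed=0).random(n)
t_sobol = norm.ppf(u) * std + mean

print("Pf (SRS)   =", np.mean(limit_state(t_srs) < 0.0))
print("Pf (Sobol) =", np.mean(limit_state(t_sobol) < 0.0))
```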