• Title/Summary/Keyword: Sampling-Based Algorithm


A Fast Block Matching Algorithm Using Hierarchical Search Point Sampling (계층적인 탐색점 추출을 이용한 고속 블록 정합 알고리즘)

  • 정수목
    • Journal of the Korea Computer Industry Society, v.4 no.12, pp.1043-1052, 2003
  • In this paper, we present a fast motion estimation algorithm that reduces the computation of the block matching algorithm used for motion estimation in video coding. The proposed algorithm is based on the Multi-level Successive Elimination Algorithm and the Efficient Multi-level Successive Elimination Algorithms. The best estimate of the motion vector is obtained by hierarchical search point sampling, which lets the proposed algorithm reduce the number of computationally intensive matching evaluations. The efficiency of the proposed algorithm was verified by experimental results.
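
The successive-elimination idea the abstract builds on can be illustrated with a minimal sketch. This is a single-level version under assumed array conventions, not the paper's hierarchical multi-level search point sampling: a cheap lower bound on the SAD, computed from block sums, rejects most candidate positions before the full matching evaluation is spent on them.

```python
import numpy as np

def sea_block_match(cur_block, ref, top, left, search_range):
    """Single-level successive-elimination block matching (illustrative sketch).

    A candidate is skipped whenever |sum(current) - sum(candidate)| already
    exceeds the best SAD found so far, because that difference is a lower
    bound on the SAD (reverse triangle inequality).
    """
    cur_block = cur_block.astype(np.int64)
    ref = ref.astype(np.int64)
    h, w = cur_block.shape
    cur_sum = cur_block.sum()
    best_sad, best_mv = np.inf, (0, 0)
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + h > ref.shape[0] or x + w > ref.shape[1]:
                continue
            cand = ref[y:y + h, x:x + w]
            if abs(cur_sum - cand.sum()) >= best_sad:   # elimination test
                continue                                # full SAD not needed
            sad = np.abs(cur_block - cand).sum()
            if sad < best_sad:
                best_sad, best_mv = sad, (dy, dx)
    return best_mv, best_sad
```

The multi-level variants the paper starts from tighten this bound using sums over progressively smaller sub-blocks, and the proposed hierarchical sampling further reduces how many candidates reach the full evaluation.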


A Method for RBF-based Approximate Optimization of Expensive Black Box Functions (고비용 블랙박스 함수의 RBF기반 근사 최적화 기법)

  • Park, Sangkun
    • Korean Journal of Computational Design and Engineering, v.21 no.4, pp.443-452, 2016
  • This paper proposes a method for expensive black box optimization using radial basis functions (RBFs). The proposed algorithm is a computational strategy that uses an RBF model approximating the expensive black box function to predict an optimum. First, an RBF-based approximation technique is introduced and a sampling plan for estimating the black box function is described. The proposed algorithm is then explained, with pseudo-code for its implementation and a detailed description of each step of the optimization process. In addition, numerical experiments are given to analyze the performance of the proposed algorithm in terms of computational accuracy, number of function evaluations, and convergence history. Finally, a geometric distance problem is presented as an application example to show the algorithm's applicability to different engineering problems.
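
A minimal sketch of the RBF-surrogate loop described above follows. The Gaussian kernel, the random initial sampling plan, and the candidate-search infill rule are illustrative assumptions rather than the paper's exact choices: fit an RBF model to the points evaluated so far, minimize the cheap model, and spend the next expensive evaluation at its predicted optimum.

```python
import numpy as np

def fit_rbf(X, y, eps=1.0):
    """Interpolating Gaussian RBF model: f(x) ~ sum_i w_i * exp(-eps^2 * ||x - X_i||^2)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    A = np.exp(-(eps ** 2) * d2)
    return np.linalg.solve(A + 1e-10 * np.eye(len(X)), y)   # tiny ridge for stability

def rbf_predict(x, X, w, eps=1.0):
    d2 = np.sum((X - x) ** 2, axis=-1)
    return np.exp(-(eps ** 2) * d2) @ w

def rbf_optimize(f, bounds, n_init=10, n_iter=20, n_candidates=2000, seed=0):
    """Surrogate-based minimization of an expensive black box function f (sketch)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(n_init, len(lo)))          # initial sampling plan
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        w = fit_rbf(X, y)
        cand = rng.uniform(lo, hi, size=(n_candidates, len(lo)))
        pred = np.array([rbf_predict(c, X, w) for c in cand])
        x_new = cand[np.argmin(pred)]                         # infill point: surrogate minimum
        X = np.vstack([X, x_new])
        y = np.append(y, f(x_new))
    i = int(np.argmin(y))
    return X[i], y[i]

# Usage on a toy "expensive" function
x_best, y_best = rbf_optimize(lambda x: np.sum((x - 0.3) ** 2), bounds=[(-1, 1), (-1, 1)])
```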

Research on Multiple-image Encryption Scheme Based on Fourier Transform and Ghost Imaging Algorithm

  • Zhang, Leihong; Yuan, Xiao; Zhang, Dawei; Chen, Jian
    • Current Optics and Photonics, v.2 no.4, pp.315-323, 2018
  • A new multiple-image encryption scheme based on compressive ghost imaging together with a Fourier-transform sampling principle is proposed, which further improves security. The scheme applies a Fourier transform to sample each of the original images, exploiting the centrosymmetric conjugation property of the images' spatial spectra to obtain the Fourier coefficients in the most information-rich spatial frequency band. Based on this sampling principle, the multiple images to be encrypted are grouped into a combined image, and the compressive ghost imaging algorithm is then used to improve security, which reduces the amount of information transmitted and improves the transmission rate. Because a compressive sensing algorithm is used, the scheme also improves the accuracy of image reconstruction.
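
The Fourier-sampling principle mentioned above can be shown in isolation; this sketch covers only the conjugate-symmetry step, not the image grouping or the compressive ghost-imaging encryption itself. For a real-valued image the spectrum satisfies F(-u, -v) = conj(F(u, v)), so roughly half of the Fourier coefficients determine the whole image.

```python
import numpy as np

def sample_half_spectrum(img):
    """Keep only the non-negative-frequency rows of the 2-D spectrum of a real image."""
    F = np.fft.fft2(img)
    return F[: img.shape[0] // 2 + 1, :]

def reconstruct_from_half(F_half, shape):
    """Rebuild the full spectrum from F(-u,-v) = conj(F(u,v)) and invert it."""
    H, W = shape
    F = np.zeros((H, W), dtype=complex)
    F[: H // 2 + 1, :] = F_half
    for u in range(H // 2 + 1, H):
        for v in range(W):
            F[u, v] = np.conj(F[(-u) % H, (-v) % W])
    return np.real(np.fft.ifft2(F))

img = np.random.rand(64, 64)
rec = reconstruct_from_half(sample_half_spectrum(img), img.shape)
print(np.allclose(img, rec))   # True: the retained half-plane suffices for a real image
```

In the scheme described above, this redundancy underlies the selection of each image's most information-rich frequency band before the images are grouped and passed to the ghost-imaging step.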

Improvement of ASIFT for Object Matching Based on Optimized Random Sampling

  • Phan, Dung; Kim, Soo Hyung; Na, In Seop
    • International Journal of Contents, v.9 no.2, pp.1-7, 2013
  • This paper proposes an efficient matching algorithm based on ASIFT (Affine Scale-Invariant Feature Transform), which is fully invariant to affine transformation. In our approach, we propose a method for reducing the similarity-matching cost and the number of outliers. First, we replace the Euclidean metric with a linear combination of the Manhattan and Chessboard metrics for measuring the similarity of keypoints. These two metrics are simple yet efficient; with our method, the computation time of the matching step is reduced and the number of correct matches is increased. By applying an Optimized Random Sampling Algorithm (ORSA), we remove most of the outlier matches to make the result meaningful. The method was evaluated on various combinations of affine transformations. The experimental results show that our method is superior to SIFT and ASIFT.
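
A minimal sketch of the combined metric and the nearest-neighbour matching it plugs into is given below. The weights of the linear combination and the ratio-test threshold are illustrative assumptions, and the ORSA outlier-removal step is not reproduced here.

```python
import numpy as np

def combined_dist(a, B, alpha=1.0, beta=1.0):
    """Linear combination of the Manhattan (L1) and Chessboard (L-infinity) distances
    between one descriptor `a` and an array of descriptors `B`."""
    diff = np.abs(B - a)
    return alpha * diff.sum(axis=1) + beta * diff.max(axis=1)

def match_keypoints(desc1, desc2, ratio=0.8):
    """Nearest-neighbour matching with a Lowe-style ratio test under the combined metric."""
    matches = []
    for i, d in enumerate(desc1):
        dist = combined_dist(d, desc2)
        j, k = np.argsort(dist)[:2]          # best and second-best candidates
        if dist[j] < ratio * dist[k]:        # keep only unambiguous matches
            matches.append((i, j))
    return matches
```

Both component metrics need only additions, absolute values, and a maximum, which is the kind of simplicity the abstract credits for the reduced matching time.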

RANDOM SAMPLING AND RECONSTRUCTION OF SIGNALS WITH FINITE RATE OF INNOVATION

  • Jiang, Yingchun; Zhao, Junjian
    • Bulletin of the Korean Mathematical Society, v.59 no.2, pp.285-301, 2022
  • In this paper, we mainly study the random sampling and reconstruction of signals living in the subspace V_p(Φ, Λ) of L^p(ℝ^d), which is generated by a family of molecules Φ located on a relatively separated subset Λ ⊂ ℝ^d. The space V_p(Φ, Λ) is used to model signals with finite rate of innovation, such as streams of pulses in GPS applications, cellular radio, and ultra-wideband communication. The sampling set is independently and randomly drawn from a general probability distribution over ℝ^d. Under suitable conditions on the generators Φ = {φ_λ : λ ∈ Λ} and the probability density function ρ, we first approximate V_p(Φ, Λ) by a finite-dimensional subspace V_p^N(Φ, Λ) on any bounded domain. Then, we prove that random sampling stability holds with high probability for all signals in V_p(Φ, Λ) whose energy concentrates on a cube, provided the sampling size is large enough. Finally, a reconstruction algorithm based on random samples is given for signals in V_p^N(Φ, Λ).
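
In the abstract's notation, the signal model and the type of stability estimate being proved have roughly the following shape. This is the standard form such statements take in the random-sampling literature, written out for orientation; the precise constants, normalization, and conditions are those of the paper, not the ones shown here.

```latex
% Finite-rate-of-innovation model: the space generated by the molecules \Phi on \Lambda
V_p(\Phi,\Lambda) \;=\; \Bigl\{\, f = \sum_{\lambda \in \Lambda} c_\lambda\,\phi_\lambda
    \;:\; (c_\lambda)_{\lambda \in \Lambda} \in \ell^p(\Lambda) \,\Bigr\}
    \;\subset\; L^p(\mathbb{R}^d), \qquad 1 \le p < \infty.

% Random sampling stability (schematic): for i.i.d. samples x_1, \dots, x_n drawn from
% the density \rho, with high probability, for every f whose energy concentrates on a cube,
A\,\|f\|_{L^p}^{p} \;\le\; \sum_{j=1}^{n} |f(x_j)|^{p} \;\le\; B\,\|f\|_{L^p}^{p},
\qquad 0 < A \le B < \infty.
```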

Accelerating the EM Algorithm through Selective Sampling for Naive Bayes Text Classifier (나이브베이즈 문서분류시스템을 위한 선택적샘플링 기반 EM 가속 알고리즘)

  • Chang Jae-Young; Kim Han-Joon
    • The KIPS Transactions: Part D, v.13D no.3 s.106, pp.369-376, 2006
  • This paper presents a new method that significantly improves the conventional Bayesian statistical text classifier by incorporating an accelerated EM (Expectation-Maximization) algorithm. The EM algorithm suffers from slow convergence and degraded performance in its iterative process, especially when real online textual documents do not follow its assumptions. In this study, we propose a new accelerated EM algorithm with uncertainty-based selective sampling, which is simple, converges quickly, and allows a more accurate classification model to be estimated for the Naive Bayes text classifier. Experiments using the popular Reuters-21578 document collection showed that the proposed algorithm effectively improves classification accuracy.
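
The acceleration idea can be sketched as follows. The selection rule shown (an entropy threshold that keeps only confidently classified unlabeled documents for the M-step) is one plausible reading of uncertainty-based selective sampling, not necessarily the paper's exact rule, and `nb_fit` / `nb_predict_proba` are assumed helper callables.

```python
import numpy as np

def selective_em(nb_fit, nb_predict_proba, X_lab, y_lab, X_unlab,
                 n_iter=10, max_entropy=0.3):
    """Semi-supervised Naive Bayes EM with selective sampling (illustrative sketch).

    nb_fit(X, y, w)             -> model   (fits NB with per-document weights)
    nb_predict_proba(model, X)  -> (n_docs, n_classes) posterior matrix
    """
    model = nb_fit(X_lab, y_lab, np.ones(len(y_lab)))
    for _ in range(n_iter):
        # E-step: class posteriors for the unlabeled documents
        post = nb_predict_proba(model, X_unlab)
        entropy = -np.sum(post * np.log(post + 1e-12), axis=1)
        keep = entropy < max_entropy            # selective sampling step
        # M-step: refit on labeled data plus the selected pseudo-labeled documents
        X_all = np.vstack([X_lab, X_unlab[keep]])
        y_all = np.concatenate([y_lab, post[keep].argmax(axis=1)])
        w_all = np.concatenate([np.ones(len(y_lab)), post[keep].max(axis=1)])
        model = nb_fit(X_all, y_all, w_all)
    return model
```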

SMCS/SMPS Simulation Algorithms for Estimating Network Reliability (네트워크 신뢰도를 추정하기 위한 SMCS/SMPS 시뮬레이션 기법)

  • 서재준
    • Journal of Korean Society of Industrial and Systems Engineering, v.24 no.63, pp.33-43, 2001
  • To estimate the reliability of a large and complex network with small variance, we propose two dynamic Monte Carlo sampling methods: the sequential minimal cut set (SMCS) and the sequential minimal path set (SMPS) methods. These methods do not require all minimal cut sets or path sets to be given in advance and do not simulate all arcs at each trial, which reduces the variance of the network reliability estimate. Based on the proposed methods, we develop importance sampling estimators, the total hazard (or safety) estimator and the hazard (or safety) importance sampling estimator, and compare the performance of these simulation estimators. We find that these estimators can significantly reduce the variance of the raw simulation estimator and of the usual importance sampling estimator. In particular, the SMCS algorithm is very effective when the failure probabilities of arcs are low, whereas the SMPS algorithm is effective when the success probabilities of arcs are low.
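
For orientation, the raw (crude) Monte Carlo estimator that the proposed SMCS/SMPS and importance-sampling estimators aim to improve on can be sketched as below; the two-terminal connectivity setting and the graph representation are assumptions for the example.

```python
import random
from collections import defaultdict

def crude_mc_reliability(arcs, s, t, n_trials=50_000, seed=0):
    """Estimate P(s-t connected) when each arc (u, v, p_work) fails independently."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        adj = defaultdict(list)
        for u, v, p in arcs:
            if rng.random() < p:          # the arc survives this trial
                adj[u].append(v)
                adj[v].append(u)
        # depth-first search from s: is t reachable through surviving arcs?
        stack, seen = [s], {s}
        while stack:
            x = stack.pop()
            if x == t:
                hits += 1
                break
            for y in adj[x]:
                if y not in seen:
                    seen.add(y)
                    stack.append(y)
    return hits / n_trials

# Example: a small bridge network with arc reliability 0.9
arcs = [(1, 2, 0.9), (1, 3, 0.9), (2, 3, 0.9), (2, 4, 0.9), (3, 4, 0.9)]
print(crude_mc_reliability(arcs, s=1, t=4))
```

When arc failures are rare, this estimator wastes almost every trial on the "network works" outcome, which is the high-reliability regime the sequential cut-set and importance-sampling constructions target.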


Digital Simulation of Narrow-Band Ocean Systems (협대역 해양시스템의 Digital simulation)

  • 김영균
    • Journal of the Korean Institute of Telematics and Electronics, v.18 no.2, pp.22-26, 1981
  • Truncated expansions based on the sampling theorem but containing only a few terms can be very useful for practical interpolation of band-limited or narrow-band random signals. The major goal of this work is to find and compare efficient and "statistically accurate" algorithms for the dynamic analysis of ocean systems. The statistical accuracy of truncated sampling interpolations is investigated, and one simple ocean system is indicated, which yields a Runge-Kutta simulation algorithm of improved accuracy with very little increase in computation.
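
The truncated sampling-theorem interpolation referred to above can be sketched as follows; the number of retained terms is an assumption for the example. A band-limited signal sampled at interval T is interpolated from only the few samples nearest the evaluation point.

```python
import numpy as np

def truncated_sinc_interp(samples, T, t, n_terms=4):
    """Approximate x(t) from uniform samples x[k] = x(k*T) using only the
    2*n_terms samples nearest to t (truncated Shannon reconstruction)."""
    k0 = int(np.floor(t / T))
    ks = np.arange(k0 - n_terms + 1, k0 + n_terms + 1)
    ks = ks[(ks >= 0) & (ks < len(samples))]
    return float(np.sum(samples[ks] * np.sinc(t / T - ks)))

# Example: a 1 Hz sine sampled at 10 Hz, interpolated between samples
T = 0.1
samples = np.sin(2 * np.pi * np.arange(0, 2, T))
print(truncated_sinc_interp(samples, T, t=0.123))   # close to sin(2*pi*0.123)
```

Keeping only a handful of terms is what makes this usable inside a step-by-step simulation such as the Runge-Kutta scheme mentioned in the abstract, at the price of a small, quantifiable interpolation error.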


A Study on Modeling of Search Space with GA Sampling

  • Banno, Yoshifumi; Ohsaki, Miho; Yoshikawa, Tomohiro; Shinogi, Tsuyoshi; Tsuruoka, Shinji
    • Proceedings of the Korean Institute of Intelligent Systems Conference, 2003.09a, pp.86-89, 2003
  • To model a numerical problem space under the limitation of available data, we need to extract sparse but key points from the space and to efficiently approximate the space with them. This study proposes a sampling method based on the search process of a genetic algorithm, and a space modeling method based on least-squares approximation using a summation of Gaussian functions. We conducted simulations to evaluate them on several kinds of problem spaces: DeJong's, Schaffer's, and our original one. We then compared our sampling method with sampling at regular intervals, and our modeling method with polynomial modeling. The results showed that, for many problem spaces, the error between the problem space and its model was smallest for the combination of our sampling and modeling methods when the number of samples was very small.
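
The modeling half of the approach, least-squares approximation by a summation of Gaussian functions, can be sketched on a toy one-dimensional problem space. Placing one Gaussian at every sampled point and the fixed width are assumptions, the random sample stands in for the GA-driven sampling, and the degree-5 polynomial plays the role of the baseline model the study compares against.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D "problem space" and an irregular sample of it
f = lambda x: np.sin(3 * x) + 0.1 * x ** 2
X = rng.uniform(-2, 2, size=30)          # stand-in for points visited by a GA search
y = f(X)

# Model 1: least-squares summation of Gaussians centred at the sampled points
sigma = 0.4
G = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2))
w, *_ = np.linalg.lstsq(G, y, rcond=None)
gauss_model = lambda t: np.exp(-(t[:, None] - X[None, :]) ** 2 / (2 * sigma ** 2)) @ w

# Model 2: a polynomial fit as the comparison baseline
poly = np.polynomial.Polynomial.fit(X, y, deg=5)

t = np.linspace(-2, 2, 400)
print("Gaussian-sum RMSE:", np.sqrt(np.mean((gauss_model(t) - f(t)) ** 2)))
print("Polynomial RMSE:  ", np.sqrt(np.mean((poly(t) - f(t)) ** 2)))
```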


Error-robust model-based sampling in accounting (회계감사예에 적용시켜본 오차로버스터적 모델표본론)

  • 김영일
    • The Korean Journal of Applied Statistics, v.6 no.1, pp.29-40, 1993
  • In a model-based sampling problem, it often happens that the functional form of the variance of the error terms in the regression model cannot be specified exactly. The goal of an error-robust sampling design is to minimize the 'ill effects' resulting from a lack of knowledge of the error structure. A sampling criterion, which is optimal if it minimizes the average of an inefficiency measure taken over all candidate error structures, is proposed, and a computer algorithm is developed for constructing optimal sampling plans. The auditing problem is of particular relevance because of the uncertainty that currently clouds specification of the error structure.
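
The averaged-inefficiency criterion can be made concrete with a small sketch. The working regression model, the prediction-variance formula, the candidate error structures, and the exhaustive search over plans are all illustrative assumptions; the paper's algorithm for constructing optimal plans is not reproduced here.

```python
import numpy as np
from itertools import combinations

def prediction_variance(sample_idx, x, v):
    """Error variance of the best linear unbiased predictor of the population total
    under the working model y_i = beta * x_i + e_i with Var(e_i) = v(x_i)."""
    s = np.asarray(sample_idx)
    r = np.setdiff1d(np.arange(len(x)), s)            # non-sampled units
    var_beta = 1.0 / np.sum(x[s] ** 2 / v(x[s]))      # variance of the weighted LS slope
    return (x[r].sum()) ** 2 * var_beta + v(x[r]).sum()

def robust_plan(x, n, error_models):
    """Choose the n-unit plan minimizing the average inefficiency over all
    candidate error structures (exhaustive search; only for tiny populations)."""
    plans = list(combinations(range(len(x)), n))
    best = {v: min(prediction_variance(p, x, v) for p in plans) for v in error_models}
    def avg_inefficiency(p):
        return np.mean([prediction_variance(p, x, v) / best[v] for v in error_models])
    return min(plans, key=avg_inefficiency)

# Book amounts as the size measure; candidate variance structures v(x) = x, x^1.5, x^2
x = np.array([5.0, 10.0, 20.0, 40.0, 80.0, 160.0])
models = [lambda t: t, lambda t: t ** 1.5, lambda t: t ** 2]
print(robust_plan(x, n=3, error_models=models))
```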
