• Title/Summary/Keyword: Size-based selection


Determination of sample size to serological surveillance plan for pullorum disease and fowl typhoid (추백리-가금티푸스의 혈청학적 모니터링 계획수립을 위한 표본크기)

  • Pak, Son-Il;Park, Choi-Kyu
    • Korean Journal of Veterinary Research
    • /
    • v.48 no.4
    • /
    • pp.457-462
    • /
    • 2008
  • The objective of this study was to determine appropriate sample sizes under different assumptions about diagnostic test characteristics and true prevalence when designing a serological surveillance plan for pullorum disease and fowl typhoid in domestic poultry production. The number of flocks and the total number of chickens to be sampled were obtained to provide 95% confidence of detecting at least one infected flock, taking imperfect diagnostic tests into account. Due to a lack of reliable data, within-flock prevalence (WFP) was modeled with a minimum of 1%, most likely value of 5%, and maximum of 9%, and true flock prevalence with 0.1%, 0.5%, and 1%, respectively. Sensitivity was modeled using the Pert distribution: minimum 75%, most likely 80%, and maximum 90% for the plate agglutination test, and 80%, 85%, and 90% for the ELISA. Similarly, specificity was modeled as 85%, 90%, and 95% for the plate agglutination test and 90%, 95%, and 99% for the ELISA. In accordance with the current regulation, flock-level test characteristics were calculated assuming that 30 samples are taken per flock. The model showed that the current annual testing plan of 112,000 samples, based on random selection of flocks, far exceeds the sample size estimated in this study. The sample size was further reduced with increased sensitivity and specificity of the test and decreased WFP. The effect of increasing the number of samples per flock on the total sample size, and the optimal combination of test sensitivity and specificity for the purpose of the surveillance, are discussed with regard to cost.
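
The core calculation the abstract describes can be sketched as follows; this is a minimal illustration of the sample size needed to detect at least one positive with an imperfect test (ignoring the Pert-distributed inputs and the two-stage flock/bird structure of the authors' simulation):

```python
import math

def sample_size_detect(prevalence, sensitivity, confidence=0.95):
    """Samples needed to detect at least one positive with the given
    confidence, assuming binomial sampling and specificity close to 1."""
    p_detect = prevalence * sensitivity   # chance one random sample tests positive
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_detect))

# e.g. most-likely within-flock prevalence 5%, plate agglutination Se 80%
n = sample_size_detect(0.05, 0.80)
```

With a perfect test (`sensitivity=1.0`) this reduces to the textbook freedom-from-disease formula; lower sensitivity inflates the required sample, which is the effect the study quantifies.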

An Optimized Iterative Semantic Compression Algorithm And Parallel Processing for Large Scale Data

  • Jin, Ran;Chen, Gang;Tung, Anthony K.H.;Shou, Lidan;Ooi, Beng Chin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.6
    • /
    • pp.2761-2781
    • /
    • 2018
  • With the continuous growth of data size and the use of compression technology, data reduction has great research value and practical significance. Addressing the shortcomings of existing semantic compression algorithms, this paper builds on an analysis of the ItCompress algorithm and designs a bidirectional order-selection method based on interval partitioning, named the Optimized Iterative Semantic Compression Algorithm (Optimized ItCompress Algorithm). To further improve the speed of the algorithm, we propose a parallel optimized iterative semantic compression algorithm using GPUs (POICAG) and an optimized iterative semantic compression algorithm using Spark (DOICAS). Extensive experiments carried out on four kinds of datasets fully verify the efficiency of the proposed algorithms.
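
The representative-row idea underlying ItCompress can be illustrated with a toy sketch (list-of-lists data and function names are invented here; this is not the optimized or parallel algorithm the paper proposes): each row is encoded by its best-matching representative plus only the columns that deviate from it.

```python
def compress(rows, reps):
    """Encode each row as (index of best-matching representative,
    {column: outlier value}); matching columns need no storage."""
    encoded = []
    for row in rows:
        # pick the representative sharing the most column values with the row
        best = max(range(len(reps)),
                   key=lambda i: sum(a == b for a, b in zip(reps[i], row)))
        outliers = {c: v for c, v in enumerate(row) if reps[best][c] != v}
        encoded.append((best, outliers))
    return encoded

def decompress(encoded, reps):
    return [[out.get(c, v) for c, v in enumerate(reps[i])] for i, out in encoded]

rows = [[1, 2, 3], [1, 2, 9], [4, 5, 6]]
reps = [[1, 2, 3], [4, 5, 6]]
encoded = compress(rows, reps)   # middle row stores only its deviating column
```

Iterating between choosing representatives and re-encoding rows is what makes the scheme "iterative"; the paper's contribution lies in how that selection order is optimized.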

Reliability Modeling and Computational Algorithm of Network Systems with Dependent Components (구성요소가 서로 종속인 네트워크시스템의 신뢰성모형과 계산알고리즘)

  • 홍정식;이창훈
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.14 no.1
    • /
    • pp.88-96
    • /
    • 1989
  • A general measure in reliability analysis is the k-terminal reliability, the probability that the specified vertices are connected by working edges. To compute the k-terminal reliability, components are usually assumed to be statistically independent. In this study, the modeling and analysis of the k-terminal reliability are investigated when dependency among components is considered. As the size of the network increases, the number of joint probability parameters required to represent the dependency among components grows exponentially. To avoid this difficulty, the structured-event-based reliability model (SERM) is presented. This model combines the network topology (physical representation) and the reliability block diagram (logical representation), which enables us to represent the dependency among components in network form. Computational algorithms for the k-terminal reliability in SERM are based on the factoring algorithm. Two features of the factoring algorithm are reliability-preserving reduction and the pivoting-edge selection strategy. The pivoting-edge selection strategy is modified in two different ways to handle the replicated edges occurring in SERM. Two algorithms are presented according to each modified pivoting strategy and illustrated by a numerical example.
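
The factoring algorithm the paper builds on can be sketched for independent edges (the SERM dependency handling, reductions, and replicated-edge pivoting rules are beyond this toy): condition on one edge working or failing, then recurse on the contracted and deleted graphs.

```python
def k_terminal_reliability(edges, probs, terminals):
    """Factoring theorem: R(G) = p_e * R(G/e) + (1 - p_e) * R(G - e).
    edges: list of (u, v) tuples; probs: matching edge working probabilities;
    terminals: set of vertices that must all be connected."""
    if len(set(terminals)) == 1:
        return 1.0                      # all terminals merged: connected
    if not edges:
        return 0.0                      # no edges left, terminals still apart
    (u, v), p = edges[0], probs[0]
    rest, prest = edges[1:], probs[1:]
    # contract e = (u, v): relabel v as u everywhere
    merged = [(u if a == v else a, u if b == v else b) for a, b in rest]
    mterms = {u if t == v else t for t in terminals}
    return (p * k_terminal_reliability(merged, prest, mterms)
            + (1 - p) * k_terminal_reliability(rest, prest, set(terminals)))

# two 0.9-reliable edges in series between terminals 0 and 2
r_series = k_terminal_reliability([(0, 1), (1, 2)], [0.9, 0.9], {0, 2})
```

The pivoting strategy the paper modifies corresponds to the choice of `edges[0]` here; a good pivot order is what keeps the recursion tree small.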


A New Stereo Matching Using Compact Genetic Algorithm (소형 유전자 알고리즘을 이용한 새로운 스테레오 정합)

  • 한규필;배태면;권순규;하영호
    • Proceedings of the IEEK Conference
    • /
    • 1999.06a
    • /
    • pp.474-478
    • /
    • 1999
  • The genetic algorithm is an efficient search method based on principles of natural selection and population genetics. In conventional genetic algorithms, however, the size of the gene pool must be increased to ensure convergence, so large memory space and long computation times are needed. Also, since child chromosomes are generated by chromosome crossover and gene mutation, these algorithms have a complex structure. In this paper, a compact stereo matching algorithm using population-based incremental learning (PBIL), which relies on a probability vector, is proposed to reduce these problems. The PBIL method is modified for the matching environment. Since the proposed algorithm uses a probability vector and eliminates the gene pool, chromosome crossover, and gene mutation, the matching algorithm is simple and the computational load is considerably reduced. Even if the characteristics of the images change, stable outputs are obtained without modifying the matching algorithm.
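
The PBIL scheme underlying the matching algorithm can be sketched as follows; OneMax (maximizing the number of ones) stands in for the real matching cost, and all parameter values are illustrative, not the paper's:

```python
import random

def pbil(fitness, n_bits, iters=200, pop=20, lr=0.1, seed=0):
    """Population-Based Incremental Learning: evolve a probability vector
    instead of a gene pool, with no crossover or mutation operators."""
    rng = random.Random(seed)
    p = [0.5] * n_bits
    for _ in range(iters):
        samples = [[1 if rng.random() < pi else 0 for pi in p] for _ in range(pop)]
        best = max(samples, key=fitness)
        p = [pi + lr * (b - pi) for pi, b in zip(p, best)]  # shift toward best sample
    return p

# maximize the number of ones as a stand-in for a stereo matching score
prob = pbil(sum, n_bits=8)
```

Only the length-`n_bits` probability vector is stored between iterations, which is exactly the memory saving the abstract claims over a conventional gene pool.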


Discovery of promising business items by technology-industry concordance and keyword co-occurrence analysis of US patents. (기술-산업 연계구조 및 특허 분석을 통한 미래유망 아이템 발굴)

  • Cho Byoung-Youl;Rho Hyun-Sook
    • Journal of Korea Technology Innovation Society
    • /
    • v.8 no.2
    • /
    • pp.860-885
    • /
    • 2005
  • This study develops a quantitative method for discovering and selecting promising technology-based business items. We utilized patent trend analysis, technology-industry concordance analysis, and keyword co-occurrence analysis of US patents. By analyzing patent trends and technology-industry concordance, we identified emerging industry trends: the prevalence of the bio industry, the service industry, and B2C business. From direct and co-occurrence analysis of patent keywords newly appearing in the year 2000, 28 promising business item candidates were extracted. Finally, the candidates were prioritized using four business attractiveness determinants: market size, product life cycle, degree of technological innovation, and coincidence with industry trends. These results indicate that reliable discovery and selection of promising technology-based business items can be performed by a quantitative, objective, and low-cost knowledge discovery process over a patent database instead of peer review.
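
Keyword co-occurrence counting of this kind can be sketched in a few lines (the patent keyword lists below are invented for illustration, not the study's data):

```python
from collections import Counter
from itertools import combinations

def cooccurrence(docs):
    """Count unordered keyword pairs appearing in the same patent record."""
    counts = Counter()
    for keywords in docs:
        for a, b in combinations(sorted(set(keywords)), 2):
            counts[(a, b)] += 1
    return counts

patents = [["bio", "sensor", "b2c"], ["bio", "sensor"], ["service", "b2c"]]
pairs = cooccurrence(patents)
```

Frequently co-occurring pairs point at keyword clusters, which is the raw material from which item candidates are extracted before the attractiveness scoring step.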


Logic Synthesis Algorithm for Multiplexer-based FPGA's Using BDD (멀티플렉서 구조의 FPGA를 위한 BDD를 이용한 논리 합성 알고리듬)

  • 강규현;이재흥;정정화
    • Journal of the Korean Institute of Telematics and Electronics A
    • /
    • v.30A no.12
    • /
    • pp.117-124
    • /
    • 1993
  • In this paper we propose a new technology mapping algorithm for multiplexer-based FPGAs. The algorithm consists of three phases. First, it converts the logic functions and the basic logic module into BDDs. Second, it covers the logic function with the basic logic modules. Lastly, it reduces the number of basic logic modules used to implement the logic function through a cell merging procedure. Binate selection is employed to determine the order of the input variables of the logic function so as to construct a balanced BDD with low depth, which enables us to construct circuits with small size and short delay time. Previous technology mapping algorithms used one basic logic module to implement each two-input or three-input function. The algorithm proposed here merges almost all pairs of two-input and three-input functions that would each occupy one basic logic module, improving the mapping results. We show the effectiveness of the algorithm by comparing our experimental results with those of previous systems.
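
The reason BDDs map naturally onto multiplexer-based cells is the Shannon expansion f = x·f|x=1 + x'·f|x=0, i.e. one 2-to-1 multiplexer per BDD node. A minimal sketch (truth-function input; the paper's binate variable ordering and cell merging are not reproduced):

```python
def to_mux_tree(f, variables):
    """Shannon-expand f into a multiplexer tree: each internal node
    ('MUX', x, hi, lo) selects hi when x = 1 and lo when x = 0."""
    if not variables:
        return f({})
    x, rest = variables[0], variables[1:]
    hi = to_mux_tree(lambda env, f=f, x=x: f({**env, x: 1}), rest)
    lo = to_mux_tree(lambda env, f=f, x=x: f({**env, x: 0}), rest)
    return hi if hi == lo else ("MUX", x, hi, lo)  # drop redundant nodes

and_tree = to_mux_tree(lambda env: env["a"] & env["b"], ["a", "b"])
```

The variable order chosen for the expansion determines the tree's depth and size, which is why the paper's binate selection heuristic matters for the mapping result.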


A Study on the Optimal Probe Path Generation for Sculptured Surface Inspection Using the Coordinate Measuring Machine (3차원 측정기를 이용한 자유곡면 측정시 최적의 경로 결정에 관한 연구)

  • Cho, Myung-Wo;Yi, Seung-Jong;Kim, Moon-Ki
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.12 no.10
    • /
    • pp.121-129
    • /
    • 1995
  • The objective of this research is to develop an effective inspection planning strategy for sculptured surfaces using a three-dimensional Coordinate Measuring Machine (CMM). First, the CAD/CAM database is generated using the Bezier surface patch method and a variable cutter step size approach for design and machining of the workpiece model. Then, optimum measuring point locations are determined based on mean curvature analysis to obtain more effective inspection results for the given number of samples. An optimal probe sequence generation method is proposed by implementing a Traveling Salesperson Problem (TSP) algorithm, and new guide point selection methods are suggested based on the concept of variable distance between the first and second guide points. Finally, a simulation study and experimental work show the effectiveness of the proposed strategy.
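
A probe-sequencing step of this kind can be sketched with a nearest-neighbour TSP heuristic (a simple stand-in; the paper's exact TSP algorithm and guide-point logic are not reproduced, and the points below are invented):

```python
import math

def probe_sequence(points, start=0):
    """Order measuring points by repeatedly visiting the nearest
    unvisited point, a greedy heuristic for the probe-path TSP."""
    unvisited = set(range(len(points))) - {start}
    path = [start]
    while unvisited:
        last = points[path[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, points[i]))
        path.append(nxt)
        unvisited.remove(nxt)
    return path

pts = [(0, 0, 0), (5, 0, 1), (1, 0, 0), (6, 0, 1)]
order = probe_sequence(pts)
```

Minimizing total probe travel is what makes the inspection cycle time acceptable when many curvature-selected points must be measured.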


Cost-Benefit based User Review Selection Method

  • Neung-Hoe Kim;Man-Soo Hwang
    • International journal of advanced smart convergence
    • /
    • v.12 no.4
    • /
    • pp.177-181
    • /
    • 2023
  • User reviews posted in application markets show high relevance to the satisfaction of application users, and their significance has been proven in numerous studies. User reviews are also crucial data, as they are essential for improving applications after release. However, because vast numbers of user reviews are posted every day, application developers are unable to examine and address every one. Simply addressing reviews in chronological order is not enough for adequate user satisfaction given developers' limited resources. Therefore, this research suggests a systematic method of analyzing user reviews with a cost-benefit analysis, in which the benefit of each user review is quantified based on the number of positive/negative words, and the cost of each user review is quantified using function points, a technique that measures software size.
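
The cost-benefit ranking can be sketched as below; the sentiment word lists and function-point values are invented placeholders, not the paper's data or exact scoring formula:

```python
POSITIVE = {"great", "love", "fast"}
NEGATIVE = {"crash", "slow", "bug"}

def prioritize(reviews):
    """Rank reviews by benefit/cost: benefit from positive/negative word
    counts, cost from an assumed pre-computed function-point estimate."""
    def score(review):
        words = review["text"].lower().split()
        benefit = sum(w in POSITIVE or w in NEGATIVE for w in words)
        return benefit / review["function_points"]
    return sorted(reviews, key=score, reverse=True)

reviews = [
    {"text": "app is slow and crash often", "function_points": 4},
    {"text": "love the new design", "function_points": 1},
]
ranked = prioritize(reviews)
```

Reviews whose feedback is cheap to act on relative to its signal rise to the top, which is the allocation problem the paper frames for resource-limited developers.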

A study on the optimal sizing and topology design for Truss/Beam structures using a genetic algorithm (유전자 알고리듬을 이용한 트러스/보 구조물의 기하학적 치수 및 토폴로지 최적설계에 관한 연구)

  • 박종권;성활경
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.14 no.3
    • /
    • pp.89-97
    • /
    • 1997
  • A genetic algorithm (GA) is a stochastic direct search strategy that mimics the process of genetic evolution. The GA applied herein works on a population of structural designs at any one time and uses a structured information exchange, based on the principles of natural selection and survival of the fittest, to recombine the most desirable features of the designs over a sequence of generations until the process converges to a "maximum fitness" design. Principles of genetics are thus adapted into a search procedure for structural optimization. The method consists of three genetic operations: selection, crossover, and mutation. In this study, a method of finding the optimum topology of truss/beam structures using the GA is proposed. To apply the GA to the optimum topology problem, chromosomes are assigned to FEM elements, and a penalty function is used to incorporate constraints into the fitness function. The results show that the GA has the potential to be an effective tool for the optimal design of structures accounting for sizing, geometrical, and topological variables.
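
The GA-with-penalty formulation can be sketched on a toy binary topology problem (all parameters, the truncation-selection choice, and the toy objective are illustrative assumptions, not the paper's operators):

```python
import random

def ga_optimize(fitness, penalty, n_genes, pop_size=30, generations=100, seed=1):
    """Plain GA with selection, crossover, and mutation; constraints enter
    the objective through a penalty term subtracted from the fitness."""
    rng = random.Random(seed)
    def fit(ind):
        return fitness(ind) - penalty(ind)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fit, reverse=True)
        parents = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_genes)        # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_genes)             # occasional single-bit mutation
            child[i] ^= rng.random() < 0.1
            children.append(child)
        pop = parents + children
    return max(pop, key=fit)

# toy topology: minimize active members, but keep at least 3 (constraint)
best = ga_optimize(fitness=lambda x: -sum(x),
                   penalty=lambda x: 100 * max(0, 3 - sum(x)),
                   n_genes=8)
```

Each gene switches one FEM element on or off, mirroring the chromosome-to-element assignment the abstract describes; the penalty keeps infeasible topologies from winning.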


Effect of outliers on the variable selection by the regularized regression

  • Jeong, Junho;Kim, Choongrak
    • Communications for Statistical Applications and Methods
    • /
    • v.25 no.2
    • /
    • pp.235-243
    • /
    • 2018
  • Many studies exist on the influence of one or a few observations on estimators in a variety of statistical models under the "large n, small p" setup; however, diagnostic issues in regression models have rarely been studied in a high-dimensional setup. In high-dimensional data, the influence of observations is more serious because the sample size n is significantly less than the number of variables p. Here, we investigate the influence of observations on the least absolute shrinkage and selection operator (LASSO) estimates, suggested by Tibshirani (Journal of the Royal Statistical Society, Series B, 58, 267-288, 1996), and on the variables selected by the LASSO in the high-dimensional setup. We also derive an analytic expression for the influence of the k-th observation on LASSO estimates in simple linear regression. Numerical studies based on artificial and real data are presented for illustration. The numerical results show that the influence of observations on the LASSO estimates and on the variables selected by the LASSO is more severe in the high-dimensional setup than in the usual "large n, small p" setup.
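
A minimal coordinate-descent LASSO plus a leave-one-out refit gives the flavor of the influence question (a generic sketch on simulated data; this is case deletion, not the paper's analytic influence expression):

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """LASSO for (1/2n)||y - Xb||^2 + lam*||b||_1 by cyclic coordinate
    descent with soft-thresholding."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]   # partial residual for j
            rho = X[:, j] @ r / n
            z = (X[:, j] @ X[:, j]) / n
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / z
    return beta

# influence of one observation on the fit: refit with it deleted
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
y = X[:, 0] * 2.0 + rng.normal(scale=0.1, size=50)
full = lasso_cd(X, y, lam=0.1)
drop1 = lasso_cd(np.delete(X, 0, axis=0), np.delete(y, 0), lam=0.1)
```

Comparing the nonzero pattern of `full` and `drop1` over each deleted observation is the selection-stability diagnostic the paper studies; in the p >> n regime a single deletion can change the selected set.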