• Title/Summary/Keyword: Statistical optimization


Application of Resampling Method based on Statistical Hypothesis Test for Improving the Performance of Particle Swarm Optimization in a Noisy Environment (노이즈 환경에서 입자 군집 최적화 알고리즘의 성능 향상을 위한 통계적 가설 검정 기반 리샘플링 기법의 적용)

  • Choi, Seon Han
    • Journal of the Korea Society for Simulation
    • /
    • v.28 no.4
    • /
    • pp.21-32
    • /
    • 2019
  • Inspired by social behavior models such as bird flocking and fish schooling, particle swarm optimization (PSO) is a popular metaheuristic optimization algorithm that has been widely used for tasks ranging from solving complex optimization problems to training artificial neural networks. However, PSO is difficult to apply to many real-life optimization problems involving stochastic noise, since it originated in a deterministic setting. To resolve this problem, this paper incorporates a resampling method, called the uncertainty evaluation (UE) method, into PSO. The UE method allows the particles to converge quickly on an accurate optimal solution in a noisy environment by correctly selecting the particles' global best position, one of the most significant factors in the performance of PSO. The results of comparative experiments on several benchmark problems demonstrate the improved performance of the proposed algorithm compared to existing studies. In addition, the results of a case study emphasize the necessity of this work. The proposed algorithm is expected to be effectively applied to the optimization of complex systems through digital twins in the fourth industrial revolution.
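The resampling idea above can be illustrated with a toy sketch: a minimal PSO in which every fitness value is averaged over repeated noisy evaluations before the personal and global bests are updated. This is a generic mean-based resampling stand-in, not the paper's UE method; the noisy objective, swarm parameters, and sample count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_sphere(x):
    """Sphere function corrupted by additive Gaussian observation noise."""
    return float(np.sum(x**2)) + rng.normal(0.0, 0.5)

def pso_resampled(f, dim=2, n_particles=20, iters=100, n_resample=10):
    """Minimal PSO; every fitness is averaged over n_resample evaluations
    so personal/global bests are selected from de-noised estimates."""
    w, c1, c2 = 0.7, 1.5, 1.5                      # inertia and acceleration
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros_like(x)
    avg = lambda p: np.mean([f(p) for _ in range(n_resample)])
    pbest = x.copy()
    pbest_val = np.array([avg(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.array([avg(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

best = pso_resampled(noisy_sphere)   # should land near the optimum at 0
```

With n_resample = 1 this degenerates to plain PSO, which tends to latch onto spuriously low noisy fitness values; increasing n_resample trades extra evaluations for a more reliable global best, which is the trade-off an adaptive scheme such as the UE method manages.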

Combinatorial particle swarm optimization for solving blocking flowshop scheduling problem

  • Eddaly, Mansour;Jarboui, Bassem;Siarry, Patrick
    • Journal of Computational Design and Engineering
    • /
    • v.3 no.4
    • /
    • pp.295-311
    • /
    • 2016
  • This paper addresses the flowshop scheduling problem with blocking constraints. The objective is to minimize the makespan criterion. We propose a hybrid combinatorial particle swarm optimization algorithm (HCPSO) as a resolution technique for this problem. At initialization, different priority rules are exploited; an experimental study and statistical analysis were performed to select the one best adapted to this problem. Then, the swarm behavior is tested for solving a combinatorial optimization problem, namely a sequencing problem under constraints. Finally, an iterated local search algorithm based on probabilistic perturbation is sequentially introduced into the particle swarm optimization algorithm to improve solution quality. The computational results show that our approach is able to improve several best known solutions from the literature: 76 solutions out of 120 were improved. Moreover, HCPSO outperforms the compared methods in terms of solution quality under short time requirements. The performance of the proposed approach is also evaluated on a real-world industrial problem.

An Empirical Study on Statistical Optimization Model for the Portfolio Construction of Sponsored Search Advertising(SSA) (키워드검색광고 포트폴리오 구성을 위한 통계적 최적화 모델에 대한 실증분석)

  • Yang, Hognkyu;Hong, Juneseok;Kim, Wooju
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.2
    • /
    • pp.167-194
    • /
    • 2019
  • This research starts from four basic concepts that are confronted when making decisions in keyword bidding: incentive incompatibility, limited information, myopia, and the decision variable. To make these concepts concrete, four framework approaches are designed: a strategic approach for incentive incompatibility, a statistical approach for limited information, alternative optimization for myopia, and a new model approach for the decision variable. The purpose of this research is to propose a statistical optimization model for constructing the portfolio of Sponsored Search Advertising (SSA) from the sponsor's perspective, through empirical tests that can be used in portfolio decision making. Previous research to date formulates the CTR estimation model using CPC, Rank, Impression, CVR, etc., individually or collectively as the independent variables. However, many of these variables are not controllable in keyword bidding; only CPC and Rank can be used as decision variables in the bidding system. The classical SSA model is designed on the basic assumption that CPC is the decision variable and CTR is the response variable. However, this classical model faces many hurdles in the estimation of CTR. The main problem is the uncertainty between CPC and Rank: in keyword bidding, CPC fluctuates continuously even at the same Rank. This uncertainty raises questions about the credibility of CTR, along with practical management problems. Sponsors make decisions in keyword bidding under limited information, so a strategic portfolio approach based on statistical models is necessary. To solve the problem of the classical SSA model, the new SSA model frame is designed on the basic assumption that Rank is the decision variable. Rank is proposed as the best decision variable for predicting CTR in many papers, and most search engine platforms provide options and algorithms that make it possible to bid with Rank.
Sponsors can therefore participate in keyword bidding with Rank, and this paper tests the validity of the new SSA model and its applicability to constructing the optimal portfolio in keyword bidding. The research process is as follows. To perform the optimization analysis for constructing the keyword portfolio under the new SSA model, this study proposes criteria for categorizing the keywords, selects representative keywords for each category, shows the non-linear relationship, screens the scenarios for CTR and CPC estimation, selects the best-fit model through a Goodness-of-Fit (GOF) test, formulates the optimization models, confirms the spillover effects, and suggests a modified optimization model reflecting spillover along with some strategic recommendations. Tests of optimization models using these CTR/CPC estimation models are performed empirically with the objective functions of (1) maximizing CTR (the CTR optimization model) and (2) maximizing expected profit reflecting CVR (the CVR optimization model). Both the CTR and CVR optimization test results show that the suggested SSA model yields significant improvements and is valid for constructing the keyword portfolio using the CTR/CPC estimation models suggested in this study. However, one critical problem is found in the CVR optimization model: important keywords are excluded from the keyword portfolio because of the myopia of their immediately low profit at present. To solve this problem, a Markov chain analysis is carried out and the concepts of Core Transit Keyword (CTK) and Expected Opportunity Profit (EOP) are introduced. A revised CVR optimization model is proposed, tested, and shown to be valid for constructing the portfolio. The strategic guidelines and insights are as follows. Brand keywords are usually dominant in almost every aspect of CTR, CVR, expected profit, etc.
It is found that Generic keywords are the CTK and have spillover potential that may increase consumer awareness and lead consumers to Brand keywords; this is why Generic keywords should be a focus of keyword bidding. The contributions of the thesis are to propose the novel SSA model based on Rank as the decision variable, to propose managing the keyword portfolio by categories according to the characteristics of keywords, to propose statistical modeling and management based on Rank in constructing the keyword portfolio, and to perform empirical tests and propose new strategic guidelines: to focus on the CTK and to adopt the modified CVR optimization objective function reflecting the spillover effect instead of the previous expected profit models.

Selecting the Number and Location of Knots for Presenting Densities

  • Ahn, JeongYong;Moon, Gill Sung;Han, Kyung Soo;Han, Beom Soo
    • Communications for Statistical Applications and Methods
    • /
    • v.11 no.3
    • /
    • pp.609-617
    • /
    • 2004
  • To present graphs of probability densities, many software packages and graphical tools use methods that link points with straight lines. However, these methods cannot display the graph exactly and smoothly, and they are inefficient in terms of processing time. One way to overcome these shortcomings is to use interpolation methods, in which selecting the number and location of the knots is an important factor. This article proposes an algorithm for selecting knots for graphically presenting densities and implements graph components based on the algorithm.

Time dependent equations for the compressive strength of self-consolidating concrete through statistical optimization

  • Hossain, K.M.A.;Lachemi, M.
    • Computers and Concrete
    • /
    • v.3 no.4
    • /
    • pp.249-260
    • /
    • 2006
  • Self-consolidating concrete (SCC) in the fresh state is known for its excellent deformability, high resistance to segregation, and use, without applied vibration, in congested reinforced concrete structures characterized by difficult casting conditions. Such a concrete can be obtained by incorporating either mineral or chemical admixtures. This paper presents the results of an investigation to assess the applicability of Abram's law in predicting the compressive strength of SCC at any given age. Abram's law is based on the assumption that the strength of concrete with a specific type of aggregate at a given age, cured at a prescribed temperature, depends primarily on the water-to-cement ratio (W/C). It is doubtful that such a W/C law applies to concrete mixes with mineral or chemical admixtures, as is the case for SCC, where the water-to-binder ratio (W/B) is used instead of W/C as the basis for mix design. Strength data for various types of SCC mixtures were collected from different sources to check the performance of Abram's law, and an attempt has been made to generalize the law by applying various optimization methodologies to the collected data. A set of generalized equations is developed for the prediction of SCC strength at various ages; their performance is found to be better than that of the original Abram's equations.
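The fitting step behind such a generalization can be sketched numerically: taking logarithms of Abram's relation S = A / B^(w/b) gives a linear model, so the constants A and B are recoverable by ordinary least squares. The values below are synthetic stand-ins, not the collected SCC strength data.

```python
import numpy as np

# Abram's law: S = A / B**(w/b). Taking logs gives the linear model
# log S = log A - (w/b) * log B, so A and B can be fitted by least
# squares. Synthetic, noise-free values stand in for real SCC data.
A_true, B_true = 100.0, 7.0
wb = np.linspace(0.3, 0.6, 10)             # water-to-binder ratios
S = A_true / B_true**wb                    # corresponding strengths

D = np.column_stack([np.ones_like(wb), -wb])   # columns for log A, log B
logA, logB = np.linalg.lstsq(D, np.log(S), rcond=None)[0]
A_fit, B_fit = np.exp(logA), np.exp(logB)      # recovers A_true, B_true
```

With real, noisy strength data the same regression yields age-specific constants, which is essentially what fitting a generalized Abram equation at each age amounts to.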

Application of Response Surface Methodology and Plackett Burman Design assisted with Support Vector Machine for the Optimization of Nitrilase Production by Bacillus subtilis AGAB-2

  • Ashish Bhatt;Darshankumar Prajapati;Akshaya Gupte
    • Microbiology and Biotechnology Letters
    • /
    • v.51 no.1
    • /
    • pp.69-82
    • /
    • 2023
  • Nitrilases are a hydrolase group of enzymes that catalyze the conversion of nitrile compounds into industrially important organic acids. The objective of the current work is to optimize nitrilase production from a novel nitrile-degrading isolate using statistical methods assisted by an artificial intelligence (AI) tool. A nitrile-hydrolyzing bacterium, Bacillus subtilis AGAB-2 (GenBank accession number MW857547), was isolated from industrial effluent waste through an enrichment culture technique. The culture conditions were optimized by creating an orthogonal design with 7 variables to investigate the effect of the significant factors on nitrilase activity. On the basis of the obtained data, an AI-driven support vector machine was used for the fitted regression, which yielded new sets of predicted responses with zero mean error and reduced root mean square error. The results of this global optimization were regarded as the theoretical optimal function conditions. A nitrilase activity of 9832 ± 15.3 U/ml was obtained under the optimized conditions, a 5.3-fold increase compared to the unoptimized conditions (1822 ± 18.42 U/ml). The statistical optimization method involving Plackett-Burman design and response surface methodology, in combination with an AI tool, created a better response prediction model with a significant improvement in enzyme production.

A Study on Automatic Learning of Weight Decay Neural Network (가중치감소 신경망의 자동학습에 관한 연구)

  • Hwang, Chang-Ha;Na, Eun-Young;Seok, Kyung-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.12 no.2
    • /
    • pp.1-10
    • /
    • 2001
  • Neural networks are increasingly being seen as an addition to the statistics toolkit, to be considered alongside both classical and modern statistical methods. Neural networks are typically useful for classification and function estimation. In this paper we concentrate on function estimation using neural networks with a weight decay factor. The use of weight decay seems both to help the optimization process and to avoid overfitting. In this type of neural network, deciding the number of hidden nodes, the weight decay parameter, and the number of learning iterations is very important; this is called the optimization of weight decay neural networks. In this paper we propose an automatic optimization based on genetic algorithms. Moreover, we compare the weight decay neural network learned by the automatic optimization with an ordinary neural network, projection pursuit regression, and support vector machines.
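The role of the weight decay factor can be sketched with a minimal one-hidden-layer regression network trained by full-batch gradient descent, where each update shrinks the weights by a decay term equivalent to an L2 penalty on the loss. The architecture, learning rate, and decay value below are illustrative assumptions, and no genetic-algorithm tuning is shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Noisy 1-D regression target for the sketch.
X = rng.uniform(-1.0, 1.0, (200, 1))
y = np.sin(3.0 * X[:, 0]) + rng.normal(0.0, 0.1, 200)

H, lr, lam = 10, 0.05, 1e-4           # hidden nodes, step size, decay
W1, b1 = rng.normal(0.0, 1.0, (1, H)), np.zeros(H)
W2, b2 = rng.normal(0.0, 1.0, H), 0.0

for _ in range(5000):                 # full-batch gradient descent
    h = np.tanh(X @ W1 + b1)          # hidden activations, shape (200, H)
    err = h @ W2 + b2 - y             # residuals, shape (200,)
    gW2, gb2 = h.T @ err / len(y), err.mean()
    dh = np.outer(err, W2) * (1.0 - h**2)
    gW1, gb1 = X.T @ dh / len(y), dh.mean(axis=0)
    W2 -= lr * (gW2 + lam * W2)       # decay shrinks the weights,
    W1 -= lr * (gW1 + lam * W1)       # but not the biases
    b2 -= lr * gb2
    b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

The quantities the paper optimizes automatically (H, lam, and the iteration count) are exactly the hand-picked constants here; a genetic algorithm would search over them instead.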


Evaluation of concrete compressive strength based on an improved PSO-LSSVM model

  • Xue, Xinhua
    • Computers and Concrete
    • /
    • v.21 no.5
    • /
    • pp.505-511
    • /
    • 2018
  • This paper investigates the potential of a hybrid model which combines the least squares support vector machine (LSSVM) and an improved particle swarm optimization (IMPSO) technique for the prediction of concrete compressive strength. The modified PSO algorithm is employed to determine the optimal values of the LSSVM parameters and thereby improve the forecasting accuracy. Experimental data on concrete compressive strength from the literature were used to validate and evaluate the performance of the proposed IMPSO-LSSVM model. Further, predictions from five models (IMPSO-LSSVM, PSO-LSSVM, a genetic algorithm (GA) based LSSVM, a back propagation (BP) neural network, and a statistical model) were compared with the experimental data. The results show that the proposed IMPSO-LSSVM model is a feasible and efficient tool for predicting concrete compressive strength with high accuracy.

Application of Analysis of Response Surface and Experimental Designs ; Optimization Methodology of Statistical Model (반응표면(反應表面) 분석(分析)을 위한 실험계획(實驗計劃)과 그 응용(鷹用) 통계적(統計的) 모형(模型)의 최적화수법론(最適化手法論)을 중심으로)

  • Lee, Myeong-Ju
    • Journal of Korean Society for Quality Management
    • /
    • v.7 no.2
    • /
    • pp.22-28
    • /
    • 1979
  • The problem considered in this paper is to select the vital factor effects on product quality through experimental design and response surface analysis, so as to control the quality improvement of industrial products. Even though the mathematical model is unknown, this approach is applicable to controlling the quality of industrial products and determining optimum operating conditions in many technical fields, particularly in industrial manufacturing processes. When a set of data is available from an experimental design, it is often of interest to fit a polynomial regression model in the independent variables (e.g., time, temperature, pressure) in order to optimize the response variable (e.g., yield, strength). This paper proposes a method for obtaining the optimum operating condition, shows how to find that condition using tables of orthogonal array experiments, and presents an optimization methodology for the statistical model. The resulting criterion can be applied to determining optimum operating conditions in manufacturing industry and to improving the fit of the response surface, which may be used for the prediction of responses and the quality control of industrial products.
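The core computation described above can be sketched as fitting a second-order response surface by least squares and locating its stationary point. The factor levels and yields below are hypothetical, and only one coded factor is used for brevity.

```python
import numpy as np

# Coded factor levels from a small designed experiment and
# hypothetical yield responses (illustrative numbers only).
x = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y = np.array([60.0, 70.0, 75.0, 73.0, 65.0])

# Fit the response surface y = b0 + b1*x + b2*x^2 by least squares.
X = np.column_stack([np.ones_like(x), x, x**2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

# Stationary point of the fitted surface: dy/dx = 0  =>  x* = -b1/(2*b2).
x_opt = -b1 / (2.0 * b2)
```

A negative b2 confirms the stationary point is a maximum, so x_opt is the estimated optimum operating condition in coded units; with several factors the same logic uses the gradient and Hessian of the fitted quadratic.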


Multinomial Kernel Logistic Regression via Bound Optimization Approach

  • Shim, Joo-Yong;Hong, Dug-Hun;Kim, Dal-Ho;Hwang, Chang-Ha
    • Communications for Statistical Applications and Methods
    • /
    • v.14 no.3
    • /
    • pp.507-516
    • /
    • 2007
  • Multinomial logistic regression is probably the most popular representative of probabilistic discriminative classifiers for multiclass classification problems. In this paper, a kernel variant of multinomial logistic regression is proposed by combining Newton's method with a bound optimization approach. This formulation allows us to apply highly efficient approximation methods that effectively overcome the conceptual and numerical problems of standard multiclass kernel classifiers. We also provide the approximate cross validation (ACV) method for choosing the hyperparameters that affect the performance of the proposed approach. Experimental results are then presented to indicate the performance of the proposed procedure.
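The bound optimization idea can be sketched for the linear (non-kernel) case: the Hessian of the multinomial negative log-likelihood is bounded above by a fixed matrix, so every iteration can take a Newton-like step with constant, precomputed curvature. The sketch below uses a simple scalar bound, looser than the Böhning-type matrix bound typically used in such formulations, and synthetic data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 3-class data with linearly separable-ish labels.
n, d, Kc = 150, 2, 3
X = rng.normal(size=(n, d))
y = np.argmax(X @ rng.normal(size=(d, Kc)), axis=1)
Y = np.eye(Kc)[y]                         # one-hot targets

# Fixed curvature bound: each class block of the NLL Hessian is
# dominated by 0.5 * X^T X, so steps of grad / (0.5 * lambda_max(X^T X))
# never increase the loss (majorize-minimize).
L = 0.5 * np.linalg.eigvalsh(X.T @ X)[-1]
W = np.zeros((d, Kc))
for _ in range(1000):
    P = np.exp(X @ W)
    P /= P.sum(axis=1, keepdims=True)     # softmax probabilities
    W -= (X.T @ (P - Y)) / L              # constant-curvature MM step

acc = float(np.mean(np.argmax(X @ W, axis=1) == y))
```

Because the curvature is fixed, no Hessian is rebuilt or refactorized per iteration, which is the efficiency argument for bound optimization over a plain Newton iteration; the kernel variant replaces X by a kernel expansion but keeps the same structure.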