• Title/Summary/Keyword: statistical potential

Search Result 1,059

Evaluation of concrete compressive strength based on an improved PSO-LSSVM model

  • Xue, Xinhua
    • Computers and Concrete
    • /
    • v.21 no.5
    • /
    • pp.505-511
    • /
    • 2018
  • This paper investigates the potential of a hybrid model that combines the least squares support vector machine (LSSVM) with an improved particle swarm optimization (IMPSO) technique for prediction of concrete compressive strength. A modified PSO algorithm is employed to determine the optimal values of the LSSVM parameters and improve forecasting accuracy. Experimental data on concrete compressive strength from the literature were used to validate and evaluate the performance of the proposed IMPSO-LSSVM model. Further, predictions from five models (the IMPSO-LSSVM, PSO-LSSVM, genetic algorithm (GA) based LSSVM, back propagation (BP) neural network, and a statistical model) were compared with the experimental data. The results show that the proposed IMPSO-LSSVM model is a feasible and efficient tool for predicting concrete compressive strength with high accuracy.
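The hybrid scheme this abstract describes can be illustrated with a small sketch: a plain global-best PSO searching for the two LSSVM hyperparameters (regularization gamma and RBF width sigma) that minimize validation error. The toy data, search bounds, and PSO constants below are illustrative assumptions, and the PSO is the standard global-best variant, not the paper's improved IMPSO.

```python
import numpy as np

def lssvm_fit_predict(X, y, Xq, gamma, sigma):
    """Train a least-squares SVM (RBF kernel) and predict at Xq."""
    def K(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    n = len(y)
    # LSSVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K(X, X) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return K(Xq, X) @ alpha + b

def pso_minimize(f, bounds, n_particles=12, iters=40, seed=0):
    """Plain global-best PSO; the paper's IMPSO modifies these updates."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.array([f(p) for p in x])
        improved = val < pval
        pbest[improved], pval[improved] = x[improved], val[improved]
        g = pbest[pval.argmin()]
    return g

# Toy data standing in for the concrete compressive strength measurements
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (40, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2
tr, va = np.arange(0, 30), np.arange(30, 40)

def cv_error(p):
    pred = lssvm_fit_predict(X[tr], y[tr], X[va], gamma=p[0], sigma=p[1])
    return float(np.mean((pred - y[va]) ** 2))

best = pso_minimize(cv_error, bounds=[(0.1, 100.0), (0.05, 2.0)])
```

The same loop accommodates any modified PSO update rule; only `pso_minimize` changes, while the LSSVM solve stays fixed.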

Importance of Meta-Analysis and Practical Obstacles in Oncological and Epidemiological Studies: Statistics Very Close but Also Far!

  • Tanriverdi, Ozgur;Yeniceri, Nese
    • Asian Pacific Journal of Cancer Prevention
    • /
    • v.16 no.3
    • /
    • pp.1303-1306
    • /
    • 2015
  • Studies of epidemiological and prognostic factors are very important for oncology practice. There is a rapidly increasing amount of research and resultant knowledge in the scientific literature. This means that health professionals face major challenges in accessing relevant information, and they increasingly require the best available evidence to make their clinical decisions. Meta-analyses of prognostic and other epidemiological factors are very practical statistical approaches for defining clinically important parameters. However, they also face many obstacles in terms of data collection, standardization of results from multiple centers, bias, and commentary for interpretation. In this paper, the obstacles of meta-analysis are briefly reviewed, and potential problems with this statistical method are discussed.

Characterization of Photoresist Processing by Statistical Design of Experiment (DOE)

  • Kim, Gwang-Beom;Park, Jae-Hyun;Soh, Dae-Wha;Hong, Sang-Jeen
    • Proceedings of the Korean Institute of Electrical and Electronic Material Engineers Conference
    • /
    • 2005.11a
    • /
    • pp.43-44
    • /
    • 2005
  • SU-8 is an epoxy-based photoresist designed for MEMS applications, where a thick, chemically and thermally stable image is desired. However, SU-8 has proven to be very sensitive to variation in processing variables and hence difficult to use in the fabrication of useful structures. In this paper, negative SU-8 photoresist processing is characterized in terms of delamination, based on a full factorial designed experiment. Employing design of experiment (DOE), the process parameters are established, and an analysis of the full factorial design is performed to investigate the degree of delamination associated with three process parameters: post exposure bake (PEB) temperature, PEB time, and exposure energy. The results identify acceptable ranges of the three process variables that avoid delamination of the SU-8 film, which might otherwise lead to defects in MEMS device fabrication.


EMPIRICAL BAYES THRESHOLDING: ADAPTING TO SPARSITY WHEN IT IS ADVANTAGEOUS TO DO SO

  • Silverman, Bernard W.
    • Journal of the Korean Statistical Society
    • /
    • v.36 no.1
    • /
    • pp.1-29
    • /
    • 2007
  • Suppose one is trying to estimate a high dimensional vector of parameters from a series of observations, one per parameter. Often, it is possible to take advantage of sparsity in the parameters by thresholding the data in an appropriate way. A marginal maximum likelihood approach, within a suitable Bayesian structure, has excellent properties. For very sparse signals, the procedure chooses a large threshold and takes advantage of the sparsity, while for signals where there are many non-zero values, the method does not perform excessive smoothing. The scope of the method is reviewed and demonstrated, and various theoretical, practical and computational issues are discussed, in particular exploring the wide potential and applicability of the general approach, and the way it can be used within more complex thresholding problems such as curve estimation using wavelets.
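The marginal-maximum-likelihood thresholding idea can be sketched with a simplified point-normal prior in place of the heavier-tailed priors the paper analyses: estimate the mixing weight on a grid by maximizing the marginal likelihood, then zero out coordinates whose posterior probability of a nonzero mean falls below one half. The prior variance, grid, and decision rule here are illustrative assumptions, not the paper's exact procedure.

```python
import math, random

def phi(x, var):
    """Normal density with mean 0 and the given variance."""
    return math.exp(-x * x / (2 * var)) / math.sqrt(2 * math.pi * var)

def mml_weight(xs, tau2=9.0):
    """Choose the mixing weight w by marginal maximum likelihood on a grid.
    Marginal of x_i: (1 - w) N(0, 1) + w N(0, 1 + tau2)."""
    def loglik(w):
        return sum(math.log((1 - w) * phi(x, 1.0) + w * phi(x, 1.0 + tau2))
                   for x in xs)
    grid = [i / 200 for i in range(1, 200)]
    return max(grid, key=loglik)

def eb_threshold(xs, tau2=9.0):
    """Zero out x_i whose posterior probability of a nonzero mean is < 1/2."""
    w = mml_weight(xs, tau2)
    out = []
    for x in xs:
        p1 = w * phi(x, 1.0 + tau2)      # nonzero-mean component
        p0 = (1 - w) * phi(x, 1.0)       # zero-mean (pure noise) component
        out.append(x if p1 > p0 else 0.0)
    return out

# Sparse example: a few large means among pure-noise coordinates
random.seed(0)
signal = [5.0] * 5 + [0.0] * 95
data = [m + random.gauss(0.0, 1.0) for m in signal]
est = eb_threshold(data)
```

Because w is estimated from the data, the implied threshold grows when the signal is sparse and shrinks when many coordinates are nonzero, which is the adaptivity the abstract describes.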

New generalized inverse Weibull distribution for lifetime modeling

  • Khan, Muhammad Shuaib;King, Robert
    • Communications for Statistical Applications and Methods
    • /
    • v.23 no.2
    • /
    • pp.147-161
    • /
    • 2016
  • This paper introduces the four-parameter new generalized inverse Weibull distribution and investigates the potential usefulness of this model with application to reliability data from engineering studies. The new extended model has an upside-down hazard rate function and provides an alternative to existing lifetime distributions. Various structural properties of the new distribution are derived, including explicit expressions for the moments, moment generating function, quantile function and the moments of order statistics. Model parameters are estimated by the method of maximum likelihood, and the performance of the maximum likelihood estimation is evaluated using simulation.
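The four-parameter generalization is not reproduced here, but the base two-parameter inverse Weibull it extends has closed-form CDF, quantile function, and moments, which is what makes the derivations the abstract lists tractable. A minimal sketch of the base distribution (parameter names are the conventional ones, assumed rather than taken from the paper):

```python
import math

def inv_weibull_cdf(x, alpha, beta):
    """CDF of the base inverse Weibull: F(x) = exp(-(beta/x)^alpha), x > 0."""
    return math.exp(-((beta / x) ** alpha))

def inv_weibull_quantile(u, alpha, beta):
    """Quantile function: inverts the CDF in closed form, 0 < u < 1."""
    return beta * (-math.log(u)) ** (-1.0 / alpha)

def inv_weibull_mean(alpha, beta):
    """E[X] = beta * Gamma(1 - 1/alpha), finite only for alpha > 1."""
    return beta * math.gamma(1.0 - 1.0 / alpha)

# Round trip: the quantile at u recovers a point whose CDF is u
x = inv_weibull_quantile(0.3, alpha=2.5, beta=1.2)
```

The closed-form quantile also gives direct simulation via inversion, `inv_weibull_quantile(u, ...)` with uniform `u`, which is how the simulation study of the maximum likelihood estimates can be driven.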

Determining the Optimal Subsampling Rate for Refusal Conversion in RDD Surveys

  • Park, In-Ho
    • Communications for Statistical Applications and Methods
    • /
    • v.16 no.6
    • /
    • pp.1031-1036
    • /
    • 2009
  • Amid recent dramatic declines in response rates, various procedures have been considered among survey practitioners to reduce nonresponse and avoid its potential impairment of inference. In random digit dialing telephone surveys, substantial efforts are often required to obtain the initial contact for the screener interview. To reduce the burden of higher data collection costs, refusal conversion can be administered to only a random portion of the sample, reducing nonresponse (bias) at the expense of increased sample variability due to the associated weight adjustment. In this paper, we provide ways to determine the optimal subsampling rate using a linear cost model. Our approach to refusal subsampling is to predetermine a random portion of the full sample and to apply refusal conversion efforts, if needed, only to that subsample.
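The trade-off the abstract describes can be sketched numerically: subsampling refusals at rate f inflates variance through the 1/f weight adjustment, while a linear cost model determines how large a sample the budget affords at each f. The variance expression and cost constants below are simplified, Hansen–Hurwitz-style assumptions, not the paper's exact formulation.

```python
def optimal_subsampling_rate(refusal_rate, c_initial, c_conversion, budget,
                             s2=1.0):
    """Grid-search the refusal-subsampling rate f that minimizes the variance
    of the weight-adjusted mean under a linear cost model (a simplified
    sketch, not the paper's exact formulation)."""
    best_f, best_v = None, float("inf")
    for i in range(1, 101):
        f = i / 100.0
        # Expected cost per sampled case: screening plus conversion effort
        # on the fraction f of refusals
        cost_per_case = c_initial + c_conversion * f * refusal_rate
        n = budget / cost_per_case                 # affordable sample size
        # Subsampling refusals inflates variance via the 1/f weight adjustment
        v = s2 * (1.0 + refusal_rate * (1.0 / f - 1.0)) / n
        if v < best_v:
            best_f, best_v = f, v
    return best_f

f_star = optimal_subsampling_rate(refusal_rate=0.3, c_initial=1.0,
                                  c_conversion=4.0, budget=1000.0)
```

Under this simplified model the grid optimum matches the closed-form stationary point sqrt(c_initial / ((1 - R) * c_conversion)), about 0.60 for the hypothetical constants above.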

An Algorithm for Support Vector Machines with a Reject Option Using Bundle Method

  • Choi, Ho-Sik;Kim, Yong-Dai;Han, Sang-Tae;Kang, Hyun-Cheol
    • Communications for Statistical Applications and Methods
    • /
    • v.16 no.6
    • /
    • pp.997-1004
    • /
    • 2009
  • A standard approach is to classify all future observations. In some cases, however, it is desirable to defer a decision, in particular for observations that are hard to classify. That is, it can be better to run more advanced tests than to make a decision right away. This motivates a classifier with a reject option that reports a warning for those observations that are hard to classify. In this paper, we present a method that gives efficient computation for the classifier with a reject option. Numerical results show the strong potential of the proposed method.
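The reject option itself is simple to state: defer any observation whose decision score falls inside a band around the classifier's boundary. The fixed band width `delta` below is a hypothetical stand-in for the rule the paper trains directly with the bundle method.

```python
def classify_with_reject(score, delta=0.5):
    """Return +1/-1 for confident decision scores, or 'reject' when the
    score falls inside the band (-delta, delta) around the boundary."""
    if score > delta:
        return 1
    if score < -delta:
        return -1
    return "reject"

# Scores near zero are deferred rather than forced into a label
labels = [classify_with_reject(s) for s in (1.3, -0.2, 0.1, -2.0)]
```

Training a margin classifier with this band built into the loss, rather than applying the band after the fact, is what makes the optimization non-trivial and motivates the bundle method.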

Modeling Aided Lead Design of FAK Inhibitors

  • Madhavan, Thirumurthy
    • Journal of Integrative Natural Science
    • /
    • v.4 no.4
    • /
    • pp.266-272
    • /
    • 2011
  • Focal adhesion kinase (FAK) is a potential target for the treatment of primary cancers as well as prevention of tumor metastasis. To understand the structural and chemical features of FAK inhibitors, we report a comparative molecular field analysis (CoMFA) for a series of 7H-pyrrolo(2,3-d)pyrimidines. The CoMFA models showed good correlation between the actual and predicted values for the training set molecules. Our results indicated that the ligand-based alignment produced better statistical results for CoMFA ($q^2$ = 0.505, $r^2$ = 0.950). The models were validated using test set compounds and gave good predictive values of 0.537. The statistical parameters from the generated 3D-QSAR models indicated that the data are well fitted and have high predictive ability. The contour maps from the 3D-QSAR models nicely explain the structure-activity relationships of FAK inhibitors, and our results should give proper guidelines for further enhancing the activity of novel inhibitors.
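The quoted $q^2$ is a leave-one-out cross-validated $r^2$: each compound is left out in turn, the model is refit, and the prediction error is compared against the variance of the observed activities. A minimal sketch with a hypothetical one-descriptor least-squares model (CoMFA itself fits PLS on 3D field descriptors, so this illustrates only the statistic, not the model):

```python
def loo_q2(xs, ys):
    """Leave-one-out cross-validated r^2 (q^2) for a simple one-descriptor
    least-squares fit: q^2 = 1 - PRESS / SS_tot."""
    n = len(xs)
    press = 0.0
    for i in range(n):
        # Refit on all compounds except i, then predict compound i
        xt = [x for j, x in enumerate(xs) if j != i]
        yt = [y for j, y in enumerate(ys) if j != i]
        mx, my = sum(xt) / (n - 1), sum(yt) / (n - 1)
        sxx = sum((x - mx) ** 2 for x in xt)
        sxy = sum((x - mx) * (yv - my) for x, yv in zip(xt, yt))
        b = sxy / sxx
        press += (ys[i] - (my + b * (xs[i] - mx))) ** 2
    mean_y = sum(ys) / n
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1.0 - press / ss_tot

# Hypothetical descriptor/activity pairs, nearly linear so q^2 is high
q2 = loo_q2([1.0, 2.0, 3.0, 4.0, 5.0, 6.0],
            [1.1, 1.9, 3.2, 3.9, 5.1, 6.0])
```

Because each prediction is made without the held-out compound, $q^2$ is always below the fitted $r^2$, which is why the pair (0.505, 0.950) is reported together.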

Characterization of Negative Photoresist Processing by Statistical Design of Experiment (DOE)

  • Mun, Sei-Young;Kim, Gwang-Beom;Soh, Dae-Wha;Hong, Sang-Jeen
    • Journal of information and communication convergence engineering
    • /
    • v.3 no.4
    • /
    • pp.191-194
    • /
    • 2005
  • SU-8 is an epoxy-based photoresist designed for MEMS applications, where a thick, chemically and thermally stable image is desired. However, SU-8 has proven to be very sensitive to variation in processing variables and hence difficult to use in the fabrication of useful structures. In this paper, negative SU-8 photoresist processing is characterized in terms of delamination, based on a full factorial designed experiment. Employing design of experiment (DOE), the process parameters are established, and an analysis of the full factorial design is performed to investigate the degree of delamination associated with three process parameters: post exposure bake (PEB) temperature, PEB time, and exposure energy. The results identify acceptable ranges of the three process variables that avoid delamination of the SU-8 film, which might otherwise lead to defects in MEMS device fabrication.
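A $2^3$ full factorial over the three process parameters named above can be generated and screened for main effects as follows; the low/high levels and the delamination scores are hypothetical placeholders, since the abstract does not give the actual settings or responses.

```python
import itertools

# Two levels for each of the three SU-8 process parameters (values assumed)
factors = {
    "peb_temp_C": (65, 95),
    "peb_time_min": (1, 5),
    "exposure_mJ": (150, 300),
}

# All 2^3 = 8 treatment combinations of the full factorial design
runs = [dict(zip(factors, combo))
        for combo in itertools.product(*factors.values())]

def main_effect(runs, responses, factor):
    """Average response at the high level minus average at the low level."""
    lo, hi = factors[factor]
    hi_avg = sum(r for run, r in zip(runs, responses) if run[factor] == hi) / 4
    lo_avg = sum(r for run, r in zip(runs, responses) if run[factor] == lo) / 4
    return hi_avg - lo_avg

# Hypothetical delamination scores for the 8 runs, in itertools.product order
scores = [3, 2, 4, 1, 7, 5, 8, 4]
effect = main_effect(runs, scores, "peb_temp_C")
```

Comparing the three main effects (and, with the same bookkeeping, their interactions) is how the acceptable ranges of the process variables are identified from only eight runs.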

Potential of regression models in projecting sea level variability due to climate change at Haldia Port, India

  • Roshni, Thendiyath;K., Md. Sajid;Samui, Pijush
    • Ocean Systems Engineering
    • /
    • v.7 no.4
    • /
    • pp.319-328
    • /
    • 2017
  • Higher prediction efficacy is a very challenging task in any field of engineering. Due to global warming, there is a considerable increase in the global sea level. Through this work, an attempt has been made to find the sea level variability due to climate change impact at Haldia Port, India. Different statistical downscaling techniques are available, and in this paper the authors compare and illustrate the performances of three regression models. The models, Wavelet Neural Network (WNN), Minimax Probability Machine Regression (MPMR), and Feed-Forward Neural Network (FFNN), are used for projecting the sea level variability due to climate change at Haldia Port, India. Model performance indices such as PI, RMSE, NSE, MAPE and RSR were evaluated to get a clear picture of model accuracy. All the indices point to the outperformance of WNN in projecting the sea level variability. The findings strongly recommend ensemble models, especially wavelet-decomposed neural networks, to improve projection efficiency in any time series modeling.
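Several of the skill indices quoted above have standard definitions; a minimal sketch of three of them, where `obs` and `sim` are hypothetical stand-ins for the observed and model-projected sea levels.

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, below 0 is worse than
    simply predicting the observed mean."""
    mean_o = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_o) ** 2 for o in obs)
    return 1.0 - num / den

def mape(obs, sim):
    """Mean absolute percentage error (undefined if any observation is 0)."""
    return 100.0 * sum(abs((o - s) / o) for o, s in zip(obs, sim)) / len(obs)

obs = [1.0, 2.0, 3.0, 4.0]
sim = [1.1, 1.9, 3.2, 3.8]
```

Reporting several indices together, as the paper does, guards against any single metric flattering one model; a low RMSE with a poor NSE, for instance, signals a model no better than the climatological mean.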