• Title/Abstract/Keyword: statistical limits


계량형 관리도와 관련된 불편화 상수의 비교 (Comparison of the Unbiasing Constants in Connection with Variable Control Charts)

  • 안해일
    • 산업경영시스템학회지 / Vol. 37, No. 4 / pp.134-144 / 2014
  • With the advent of the lean six sigma era, extensive use of analytic tools such as control charts is required in the field of manufacturing. In relation to statistical quality control (SQC) or statistical process control (SPC), the Korean standards have undergone a meaningful change. In this study, the theoretical background for evaluating the control limits of variable control charts is examined, with a view to better understanding the related constants and coefficients. This paper is intended to help quality control practitioners understand the mathematical background by comparing the related quality control constants, and to encourage them to take advantage of variable control charts, which are very useful for implementing the concept of lean six sigma at many industrial sites.
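
For readers who want to reproduce the constants discussed in this abstract, the following is a minimal sketch (not code from the paper) computing the standard unbiasing constant c4 and the derived X-bar/S chart coefficients A3, B3, B4; the formulas are the textbook ones for 3-sigma Shewhart limits.

```python
# A minimal sketch of the standard unbiasing constants used for variable
# control charts, e.g. the X-bar/S chart (textbook formulas, not the paper's code).
from math import gamma, sqrt

def c4(n: int) -> float:
    """Unbiasing constant: E[S] = c4 * sigma for a normal sample of size n."""
    return sqrt(2.0 / (n - 1)) * gamma(n / 2) / gamma((n - 1) / 2)

def xbar_s_constants(n: int):
    """Coefficients A3, B3, B4 for 3-sigma X-bar/S chart limits."""
    c = c4(n)
    a3 = 3.0 / (c * sqrt(n))                          # X-bar chart: CL +/- A3 * s-bar
    b3 = max(0.0, 1.0 - 3.0 * sqrt(1.0 - c * c) / c)  # S chart lower coefficient
    b4 = 1.0 + 3.0 * sqrt(1.0 - c * c) / c            # S chart upper coefficient
    return a3, b3, b4

for n in (2, 5, 10):
    print(n, round(c4(n), 4), [round(v, 4) for v in xbar_s_constants(n)])
```

For n = 5 this reproduces the familiar tabulated values c4 = 0.9400, A3 = 1.427, B4 = 2.089.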

프리스트레스트 콘크리트 교량의 크리프와 건조수축효과의 민감도 해석 (Sensitivity Analysis of Creep and Shrinkage Effects of Prestressed Concrete Bridges)

  • 오병환;양인환
    • 한국콘크리트학회:학술대회논문집 / 한국콘크리트학회 1998년도 가을 학술발표대회 논문집(III) / pp.656-661 / 1998
  • This paper presents a method for the statistical and sensitivity analysis of creep and shrinkage effects in PSC box girder bridges. The statistical and sensitivity analyses are performed using numerical simulation with Latin Hypercube sampling. For each sample, a time-dependent structural analysis is performed to produce response data, which are then statistically analyzed. Probabilistic predictions of the confidence limits on the long-term effects of creep and shrinkage are then obtained. Three measures are examined to quantify the sensitivity of the outputs to each of the input variables: the rank correlation coefficient (RCC), the partial rank correlation coefficient (PRCC), and the standardized rank regression coefficient (SRRC), all computed on the ranks of the observations. The probability band widens with time, indicating an increase of prediction uncertainty with time. The creep model uncertainty factor and the relative humidity appear to be the most dominant factors with regard to the model output uncertainty.
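
As an illustration of the sampling-plus-rank-statistics machinery the abstract describes, here is a minimal sketch assuming a toy two-input model (the inputs loosely standing in for a creep factor and relative humidity; all numbers are fabricated), computed with SciPy's Latin Hypercube sampler and Spearman rank correlation.

```python
# A rough sketch of Latin Hypercube sampling plus rank-based sensitivity
# measures (RCC/PRCC-style) on a toy model, not the bridge model from the paper.
import numpy as np
from scipy.stats import qmc, rankdata, spearmanr

rng = np.random.default_rng(0)
sampler = qmc.LatinHypercube(d=2, seed=0)
u = sampler.random(n=200)                                    # LHS samples on [0,1)^2
x = qmc.scale(u, l_bounds=[1.0, 0.4], u_bounds=[3.0, 0.9])   # toy input ranges

y = x[:, 0] ** 2 + 0.3 * x[:, 1] + 0.05 * rng.normal(size=200)  # toy response

# RCC: Spearman rank correlation of each input with the output.
for j in range(x.shape[1]):
    rcc, _ = spearmanr(x[:, j], y)
    print(f"RCC input {j}: {rcc:.3f}")

# PRCC for input 0: correlate rank residuals after removing the linear effect
# of the other ranked input from both input 0 and the output.
rx = np.column_stack([rankdata(c) for c in x.T])
ry = rankdata(y)
other = np.column_stack([np.ones(len(ry)), rx[:, 1]])
res_x = rx[:, 0] - other @ np.linalg.lstsq(other, rx[:, 0], rcond=None)[0]
res_y = ry - other @ np.linalg.lstsq(other, ry, rcond=None)[0]
print("PRCC input 0:", np.corrcoef(res_x, res_y)[0, 1])
```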


피로강도 데이터의 정밀도 향상에 관한 연구 (A Study on the Accuracy Improvement of Fatigue Strength Data)

  • 최창섭
    • 한국안전학회지 / Vol. 11, No. 4 / pp.42-48 / 1996
  • Since the fatigue phenomenon is probabilistic in nature and test data cannot easily be collected in large numbers, small-sample fatigue data are uncertain, so statistical evaluation methods should be introduced into data evaluation. With this basic concept in mind, this study applies conventional statistical processing methods to fatigue data and presents a new evaluation method that reflects the fact that fatigue tests are usually performed on a limited number of specimens. That is, a package evaluation method is adopted which correlates parameters between different stress levels of E-N or S-N data. So far, fatigue limits have been determined by means of the staircase method, but it was also found that this method has a disadvantage of its own, because the limited number of trials is not duly considered.
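
The staircase method mentioned at the end of the abstract can be illustrated with a short simulation; a minimal sketch follows, assuming a hypothetical true fatigue limit of 200 MPa, a scatter of 10 MPa, a 5 MPa step, and 20 specimens (all fabricated numbers).

```python
# A minimal sketch of the staircase (up-and-down) method for estimating
# a fatigue limit; all parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
true_limit, scatter, step = 200.0, 10.0, 5.0     # MPa, fabricated values

level = 210.0
levels = []
for _ in range(20):                               # a deliberately small test series
    fails = rng.normal(true_limit, scatter) < level  # specimen fails if its own limit is below the applied level
    levels.append(level)
    level += -step if fails else step             # step down after a failure, up after a runout

# Naive estimate: mean of the tested stress levels (Dixon-Mood gives a refined one).
print("estimated fatigue limit ~", round(float(np.mean(levels)), 1), "MPa")
```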


A fast approximate fitting for mixture of multivariate skew t-distribution via EM algorithm

  • Kim, Seung-Gu
    • Communications for Statistical Applications and Methods / Vol. 27, No. 2 / pp.255-268 / 2020
  • A mixture of multivariate canonical fundamental skew t-distributions (CFUST) has been of interest in various fields; interest in the unsupervised learning community is particularly noteworthy. However, fitting the model via the EM algorithm suffers from long processing times, mainly caused by the calculation of many multivariate t-cdfs (cumulative distribution functions) in the E-step. In this article, we provide an approximate but fast calculation method that works in a univariate fashion, using the product of successive conditional univariate t-cdfs with a first-order Taylor approximation. By replacing all multivariate t-cdfs in the E-step with the proposed approximate versions, we obtain admissible model fits, with an 85% reduction in processing time for the five-dimensional skewness case of the Australian Institute of Sport data set. Rough properties, advantages, and limits of this approach are also discussed.
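
The sequential univariate construction mentioned here can be related to the exact chain-rule factorization of a joint cdf; the identity below uses generic symbols, since the paper's own notation is not reproduced in the abstract. The paper's contribution is to approximate each conditional factor by a univariate t-cdf via a first-order Taylor expansion.

```latex
% Chain-rule factorization underlying a sequential univariate approximation
% of a p-variate cdf (generic notation, not the paper's):
P(X_1 \le a_1, \dots, X_p \le a_p)
  = P(X_1 \le a_1) \prod_{j=2}^{p}
    P\!\left(X_j \le a_j \mid X_1 \le a_1, \dots, X_{j-1} \le a_{j-1}\right)
```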

DERIVING ACCURATE COST CONTINGENCY ESTIMATE FOR MULTIPLE PROJECT MANAGEMENT

  • Jin-Lee Kim;Ok-Kyue Kim
    • 국제학술발표논문집 / The 1st International Conference on Construction Engineering and Project Management / pp.935-940 / 2005
  • This paper presents the results of a statistical analysis using historical cost contingency data. As a result, a model that predicts and estimates an accurate cost contingency value using the least squares estimation method was developed. Data such as original contract amounts, estimated contingency amounts set by maximum funding limits, and actual contingency amounts were collected and used for model development. The more effective of the two developed prediction models was selected based on its prediction capability. The model should help guide project managers in making financial decisions when cost contingency amounts must be determined for multiple projects.
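
As an illustration of the least squares estimation step, here is a hedged sketch assuming a single-predictor linear model of contingency on original contract amount, with fabricated numbers; the paper's actual models and predictors may differ.

```python
# A hedged sketch of least-squares contingency prediction; the data and the
# single-predictor model form are illustrative assumptions, not the paper's.
import numpy as np

contract = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.5])                  # $M, fabricated
actual_contingency = np.array([0.10, 0.22, 0.24, 0.41, 0.47, 0.63])  # $M, fabricated

X = np.column_stack([np.ones_like(contract), contract])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, actual_contingency, rcond=None)
print("intercept, slope:", np.round(beta, 4))
print("predicted contingency for a $5.0M contract:",
      round(float(beta[0] + beta[1] * 5.0), 3), "$M")
```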


Comprehensive studies of Grassmann manifold optimization and sequential candidate set algorithm in a principal fitted component model

  • Lee, Chaeyoung;Yoo, Jae Keun
    • Communications for Statistical Applications and Methods / Vol. 29, No. 6 / pp.721-733 / 2022
  • In this paper we compare parameter estimation by Grassmann manifold optimization and by the sequential candidate set algorithm in a structured principal fitted component (PFC) model. The structured PFC model extends the form of the covariance matrix of the random error to relieve the limitations that arise from an overly simple form of the matrix. However, unlike other PFC models, the structured PFC model has no closed form for parameter estimation in dimension reduction, which signals the need for numerical computation. This computation can be done through Grassmann manifold optimization or the sequential candidate set algorithm. We conducted numerical studies comparing the two methods through sequential dimension testing results and trace correlation values, with which we can compare performance in determining the dimension and estimating the basis. We conclude that Grassmann manifold optimization outperforms the sequential candidate set algorithm in dimension determination, while the sequential candidate set algorithm is better at basis estimation when conducting dimension reduction. Applying the methods to real data led to the same conclusion.
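
The trace correlation used here as a comparison metric has a standard definition in the sufficient dimension reduction literature; a minimal sketch follows, assuming bases given as column matrices (the arrays are illustrative, not from the paper).

```python
# A small sketch of the trace correlation between an estimated basis B_hat and
# a true basis B of a d-dimensional subspace (standard definition, not the
# paper's code): sqrt(tr(P_B P_Bhat) / d) for the two projection matrices.
import numpy as np

def trace_correlation(B: np.ndarray, B_hat: np.ndarray) -> float:
    """sqrt(tr(P_B P_Bhat) / d); equals 1.0 when the two spans coincide."""
    P = B @ np.linalg.pinv(B.T @ B) @ B.T
    P_hat = B_hat @ np.linalg.pinv(B_hat.T @ B_hat) @ B_hat.T
    d = B.shape[1]
    return float(np.sqrt(np.trace(P @ P_hat) / d))

B = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])       # true 2-dim basis in R^3
B_hat = np.array([[0.9, 0.1], [0.1, 0.9], [0.2, -0.1]])  # an illustrative estimate
print(round(trace_correlation(B, B_hat), 4))
```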

화질의 국소적 변화를 고려한 의용화상처리 (Medical Image Processing with Local Variation of the Image Quality)

  • 홍승홍
    • 대한전자공학회논문지 / Vol. 12, No. 1 / pp.1-6 / 1975
  • The boundary region separating the background from the target object in a noisy, low-quality medical image carries important information and is of great significance for medical diagnosis. The purpose of this paper is to quantify the intensity variation of an image and to determine boundary-detecting thresholds by statistical methods; the approach was applied experimentally to liver scintigrams. The entire picture is divided into 64 small regions, and the kurtosis and variance of each region are used as indicators to select the regions in which a boundary is present. For those regions, thresholds are computed by the method of maximum likelihood, which minimizes the probability of misclassification; thresholds for all pixels are then obtained by interpolation, yielding a binarized image that includes the contour. The results agree closely with human recognition, demonstrating the applicability of this boundary-detection approach, and it can be used as diagnostic data for liver disease.
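
The per-region maximum likelihood threshold the abstract describes can be illustrated for two Gaussian intensity classes; a minimal sketch follows, assuming hypothetical class means, standard deviations, and priors, and solving for the intensity at which the weighted class densities intersect.

```python
# A minimal sketch of a maximum-likelihood / minimum-error threshold between
# two Gaussian intensity classes; all class parameters are hypothetical.
import numpy as np

def ml_threshold(m1, s1, p1, m2, s2, p2):
    """Intensity t where p1*N(t; m1, s1) = p2*N(t; m2, s2)."""
    # Setting the weighted log-densities equal gives a quadratic in t.
    a = 1.0 / (2 * s1**2) - 1.0 / (2 * s2**2)
    b = m2 / s2**2 - m1 / s1**2
    c = (m1**2 / (2 * s1**2) - m2**2 / (2 * s2**2)
         + np.log((p2 * s1) / (p1 * s2)))
    roots = np.roots([a, b, c]) if abs(a) > 1e-12 else np.array([-c / b])
    roots = roots[np.isreal(roots)].real
    return float(roots[(roots > min(m1, m2)) & (roots < max(m1, m2))][0])

# Background vs. organ intensity classes (illustrative numbers only).
print(round(ml_threshold(40.0, 8.0, 0.6, 90.0, 12.0, 0.4), 2))
```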


STATISTICAL GAUSSIAN DISTRIBUTION FUNCTION AS A DISTANCE INDICATOR TO STELLAR GROUPS

  • Abdel-Rahman, H.I.;Issa, I.A.;Sharaf, M.A.;Nouh, M.I.;Bakry, A.;Osman, A.I.;Saad, A.S.;Kamal, F.Y.;Essam, Essam
    • 천문학회지 / Vol. 42, No. 4 / pp.71-79 / 2009
  • In this paper, statistical distribution functions are developed for distance determination of stellar groups. This method depends on the assumption that absolute magnitudes and apparent magnitudes follow a Gaussian distribution function. Due to the limits of the integrands of the frequency function of apparent and absolute magnitudes, we introduce Case A, B, and C Gaussian distributions. The developed approaches have been implemented to determine distances to some clusters and stellar associations. The comparison with the distances derived by different authors reveals good agreement.
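
The Gaussian-magnitude machinery rests on the distance modulus relation m - M = 5 log10(d) - 5; a minimal sketch follows, assuming an illustrative Gaussian absolute-magnitude distribution and fabricated apparent magnitudes (extinction ignored; this is the underlying relation, not the paper's Case A/B/C formulation).

```python
# A short sketch of distance estimation when absolute magnitude M ~ N(M0, sigma^2):
# each star's apparent magnitude m gives d = 10**((m - M + 5)/5) parsecs.
import numpy as np

M0, sigma = 0.6, 0.3                    # assumed Gaussian absolute magnitudes (mag)
m = np.array([10.2, 10.5, 9.9, 10.8])   # fabricated apparent magnitudes (mag)

d = 10 ** ((m - M0 + 5.0) / 5.0)        # parsecs, using the mean M0 and no extinction
print("individual distances (pc):", np.round(d, 0))
print("group distance estimate (pc):", round(float(np.mean(d)), 0))
```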

방사선치료에서의 품질보증을 위한 통계적공정관리의 활용 (Use of Statistical Process Control for Quality Assurance in Radiation Therapy)

  • 정광호
    • 한국의학물리학회지:의학물리 / Vol. 26, No. 2 / pp.59-71 / 2015
  • The purpose of quality assurance is to minimize systematic errors in order to maintain the quality of a process. Statistical process control (SPC) has been applied in the field of radiation therapy since 2005 and is changing the paradigm of quality assurance. The aim of SPC is to keep a process stable within control limits while also monitoring patterns of variation. SPC is applicable to all areas of quality assurance in radiation therapy, but knowledge of SPC is required to use it properly. This paper explains the concepts of SPC and reviews studies that have applied it in the field of radiation therapy.
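
As a concrete example of the control limits mentioned above, here is a minimal sketch of an individuals/moving-range (I-MR) chart, a form commonly used for repeated QA measurements; the daily QA values are fabricated, and 2.66 and 3.267 are the standard SPC constants for moving ranges of size two.

```python
# A hedged sketch of an I-MR control chart on fabricated daily QA measurements
# (e.g. an output ratio); not taken from the reviewed studies.
import numpy as np

x = np.array([1.002, 0.998, 1.001, 0.997, 1.003, 0.999, 1.004, 0.996])  # daily QA ratio
mr = np.abs(np.diff(x))                   # moving ranges of consecutive points

x_bar, mr_bar = x.mean(), mr.mean()
ucl_x, lcl_x = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar  # individuals chart limits
ucl_mr = 3.267 * mr_bar                                      # moving-range chart limit

print(f"I chart:  CL={x_bar:.4f}  LCL={lcl_x:.4f}  UCL={ucl_x:.4f}")
print(f"MR chart: CL={mr_bar:.4f}  UCL={ucl_mr:.4f}")
print("out-of-control points:", np.where((x < lcl_x) | (x > ucl_x))[0])
```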

PREDICTION OF THE DETECTION LIMIT IN A NEW COUNTING EXPERIMENT

  • Seon, Kwang-Il
    • 천문학회지 / Vol. 41, No. 4 / pp.99-107 / 2008
  • When a new counting experiment is proposed, it is crucial to predict whether the desired source signal will be detected, or how much observation time is required in order to detect the signal at a certain significance level. The concept of the a priori prediction of the detection limit in a newly proposed experiment should be distinguished from the a posteriori claim or decision whether a source signal was detected in an experiment already performed, and the calculation of statistical significance of a measured source signal. We formulate precise definitions of these concepts based on the statistical theory of hypothesis testing, and derive an approximate formula to estimate quickly the a priori detection limit of expected Poissonian source signals. A more accurate algorithm for calculating the detection limits in a counting experiment is also proposed. The formula and the proposed algorithm may be used for the estimation of required integration or observation time in proposals of new experiments. Applications include the calculation of integration time required for the detection of faint emission lines in a newly proposed spectroscopic observation, and the detection of faint sources in a new imaging observation. We apply the results to the calculation of observation time required to claim the detection of the surface thermal emission from neutron stars with two virtual instruments.
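
A quick version of such an a priori estimate can be sketched under a simple Gaussian approximation (this is not the paper's exact algorithm): require the expected source counts s to satisfy s / sqrt(s + b) >= n_sigma over the expected background counts b, and solve for s or for the exposure time.

```python
# A rough sketch of an a priori detection limit for a counting experiment,
# using a Gaussian significance approximation rather than the paper's algorithm.
from math import sqrt

def required_source_counts(b: float, n_sigma: float = 5.0) -> float:
    """Solve s / sqrt(s + b) = n_sigma for s (simple significance measure)."""
    return 0.5 * (n_sigma**2 + n_sigma * sqrt(n_sigma**2 + 4.0 * b))

def required_time(src_rate: float, bkg_rate: float, n_sigma: float = 5.0) -> float:
    """Smallest exposure t with src_rate*t / sqrt((src_rate + bkg_rate)*t) >= n_sigma."""
    return n_sigma**2 * (src_rate + bkg_rate) / src_rate**2

print(required_source_counts(b=100.0))                     # counts needed over b = 100
print(required_time(src_rate=0.02, bkg_rate=0.5), "s")     # hypothetical count rates
```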