• Title/Summary/Keyword: Statistical Estimation Methods (통계적 추정법)


Optimization of Color Sorting Process of Shredded ELV Bumper using Reaction Surface Method (반응표면법을 이용한 폐자동차 범퍼 파쇄물의 색채선별공정 최적화 연구)

  • Lee, Hoon
    • Resources Recycling
    • /
    • v.28 no.2
    • /
    • pp.23-30
    • /
    • 2019
  • A color sorting technique was introduced to recycle shredded bumpers from end-of-life vehicles. Color sorting is an innovative method that separates materials which are difficult to separate by gravity or size classification, exploiting differences in their color using a camera and image processing techniques. Experiments were planned and optimal conditions were derived by applying a Box-Behnken design (BBD) within the response surface method. The effects of color sensitivity, feed rate, and sample size were analyzed, and a second-order response model was obtained from regression and statistical analysis; its $R^2$ and p-value were 99.56% and < 0.001, respectively. The optimum recovery of 94.1% was obtained at a color sensitivity of 32%, a feed rate of 200 kg/h, and a particle size of 33 mm. The recovery in the actual experiment was 93.8%. The experimental data agreed well with the predicted value, confirming that the model was appropriate.
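The second-order response model used in this kind of optimization can be sketched generically. This is not the paper's code; the function names and factor data below are hypothetical, and the sketch simply fits a quadratic surface (linear, pure quadratic, and interaction terms) by ordinary least squares:

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Fit y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj) by least squares.

    X: (n, k) matrix of coded factor levels; y: (n,) responses.
    Returns the design matrix and the fitted coefficient vector.
    """
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]                 # linear terms
    cols += [X[:, i] ** 2 for i in range(k)]            # pure quadratic terms
    cols += [X[:, i] * X[:, j]                          # two-factor interactions
             for i in range(k) for j in range(i + 1, k)]
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A, beta

def r_squared(A, beta, y):
    """Coefficient of determination of the fitted surface."""
    resid = y - A @ beta
    ss_res = float(resid @ resid)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    return 1.0 - ss_res / ss_tot
```

A BBD would supply a specific set of coded design points for `X`; the least-squares fit itself is the same.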

Reliability Evaluation for Prediction of Concrete Compressive Strength through Impact Resonance Method and Ultra Pulse Velocity Method (충격공진법과 초음파속도법을 통한 콘크리트 압축강도 예측의 신뢰성 평가)

  • Lee, Han-Kyul;Lee, Byung-Jae;Oh, Kwang-Chin;Kim, Yun-Yong
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.19 no.4
    • /
    • pp.18-24
    • /
    • 2015
  • Non-destructive testing (NDT) methods are widely used in the construction industry to diagnose defects in and the strength of concrete structures. However, it has been reported that results obtained from NDT have low reliability. To address this issue, four kinds of NDT (ultrasonic velocity measurements using P-waves and S-waves, and impact resonance methods using longitudinal and deformation vibration) were carried out on 180 concrete cylinders made with two mix proportions. The reliability of the NDT results was analyzed and compared against measurements of the actual compressive strength of the cylinders. Statistical analysis revealed that the S-wave ultrasonic velocity method had the lowest coefficient of variation and provided the most stable observations. Analytical equations were established to estimate the compressive strength of the concrete from the NDT results by relating them to the actual compressive strength. Moreover, the equation based on the S-wave ultrasonic velocity method had the highest coefficient of determination. Further studies on the stability of non-destructive testing under various mixing conditions will be necessary.
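The coefficient of variation used here as the stability criterion is a standard statistic. A minimal sketch (the sample data would be repeated NDT readings; nothing below is taken from the paper):

```python
import math

def coefficient_of_variation(samples):
    """CV = sample standard deviation / mean.

    A lower CV indicates a more stable, repeatable measurement,
    which is why it serves as a reliability criterion for NDT readings.
    """
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)  # unbiased sample variance
    return math.sqrt(var) / mean
```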

Statistical review and explanation for Lanchester model (란체스터 모형에 대한 통계적 고찰과 해석)

  • Yoo, Byung Joo
    • The Korean Journal of Applied Statistics
    • /
    • v.33 no.3
    • /
    • pp.335-345
    • /
    • 2020
  • This paper deals with estimating a log-transformed linear regression model that fits actual battle data from the Ardennes Campaign of World War II to the Lanchester model. The problems of determining a global solution for the parameters and of multicollinearity are identified and addressed by examining the results of previous studies on the data. The least squares method requires care because a local rather than a global solution can be found when a specific constraint or a limited candidate set is considered. Multicollinearity can be detected with a statistic known as the variance inflation factor. The Lanchester model is therefore simplified to avoid these problems, and a combat power attrition rate model is proposed that is statistically significant and easy to interpret. When fitting the model, dependence between observations arose due to autocorrelation. Potential under- or overestimation was resolved by the Cochrane-Orcutt method, which also restores independence and normality.
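The variance inflation factor mentioned in the abstract has a simple definition: regress each predictor on the others and compute $\mathrm{VIF}_j = 1/(1 - R_j^2)$. A generic sketch (not the paper's code; the data and names are illustrative):

```python
import numpy as np

def variance_inflation_factors(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    column j of X on the remaining columns (with an intercept).
    Values far above ~10 are commonly read as severe multicollinearity.
    """
    n, k = X.shape
    vifs = []
    for j in range(k):
        y = X[:, j]
        A = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1.0 - float(resid @ resid) / float(((y - y.mean()) ** 2).sum())
        vifs.append(1.0 / (1.0 - r2))
    return vifs
```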

Developing statistical models and constructing clinical systems for analyzing semi-competing risks data produced from medicine, public health, and epidemiology (의료, 보건, 역학 분야에서 생산되는 준경쟁적 위험자료를 분석하기 위한 통계적 모형의 개발과 임상분석시스템 구축을 위한 연구)

  • Kim, Jinheum
    • The Korean Journal of Applied Statistics
    • /
    • v.33 no.4
    • /
    • pp.379-393
    • /
    • 2020
  • In semi-competing risks data, often seen in medicine, public health, and epidemiology, a terminal event such as death may censor an intermediate event such as relapse, but not vice versa. We propose a Weibull regression model with a normal frailty to analyze semi-competing risks data when all three transition times of the illness-death model are possibly interval-censored. We construct the conditional likelihood separately by subject type: alive with or without the intermediate event, dead with or without the intermediate event, and dead with the intermediate event status missing. Optimal parameter estimates are obtained from an iterative quasi-Newton algorithm after marginalizing the full likelihood using adaptive importance sampling. We illustrate the proposed method with extensive simulation studies and the PAQUID (Personnes Agées Quid) data.
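The building block of such likelihoods is the interval-censored contribution of a Weibull transition time: an event known only to lie in $(L, R]$ contributes $S(L) - S(R)$, with $S(t) = \exp(-(t/\lambda)^k)$. A minimal sketch of that piece alone (the full model with frailty and three transitions is far richer; everything below is an illustrative simplification, not the authors' code):

```python
import math

def weibull_survival(t, shape, scale):
    """Weibull survival function S(t) = exp(-(t/scale)^shape)."""
    return math.exp(-((t / scale) ** shape))

def interval_censored_loglik(intervals, shape, scale):
    """Log-likelihood for interval-censored Weibull data.

    Each observation (L, R) contributes log(S(L) - S(R));
    R = inf denotes right censoring and contributes log S(L).
    """
    ll = 0.0
    for left, right in intervals:
        s_left = weibull_survival(left, shape, scale)
        s_right = weibull_survival(right, shape, scale) if math.isfinite(right) else 0.0
        ll += math.log(s_left - s_right)
    return ll
```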

Reliability-Based Design Optimization Using Akaike Information Criterion for Discrete Information (이산정보의 아카이케 정보척도를 이용한 신뢰성 기반 최적설계)

  • Lim, Woo-Chul;Lee, Tae-Hee
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.36 no.8
    • /
    • pp.921-927
    • /
    • 2012
  • Reliability-based design optimization (RBDO) can be used to determine the reliability of a system by means of probabilistic design criteria, i.e., the possibility of failure considering the stochastic features of design variables and input parameters. To satisfy these criteria, various reliability analysis methods have been developed, most of which assume that distribution functions are continuous. However, because real data is often discrete in form, it is important to estimate distributions for discrete information during reliability analysis. In this study, we employ the Akaike information criterion (AIC) in reliability analysis to determine the best estimated distribution for discrete information, and we suggest an RBDO method using AIC. Mathematical and engineering examples are presented to verify the proposed method.

Determinants of Municipal Water Prices and Costs (지자체간 수돗물 판매가격과 생산비용 격차의 결정 요인 분석)

  • Kwon, Oh-Sang
    • Environmental and Resource Economics Review
    • /
    • v.18 no.4
    • /
    • pp.695-713
    • /
    • 2009
  • This study investigates the determinants of municipal water prices and costs in Korea. A panel data set of 164 municipalities over the period 2000~2007 is used. Both random and fixed effects models, with an appropriate set of instruments, are applied to the data. Substantial differences in prices and costs among municipalities are observed. The study finds that prices and costs increase when the leakage rate is high, the quality of raw water is poor, and the municipality has to purchase raw water from K-water, the sole corporation for the construction and management of water resources facilities in Korea. Prices and costs decline when the customer base is large, the proportion of paying customers is high, and the subsidy from the central government is large.
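The fixed effects model used on such panel data is typically estimated by the within transformation: demean each variable within each municipality, then run OLS on the demeaned data. A generic sketch (not the study's code; the simulated panel below is purely illustrative):

```python
import numpy as np

def within_estimator(ids, X, y):
    """Fixed-effects (within) estimator for panel data.

    ids: group identifier per observation; X: (n, k) regressors; y: (n,) outcome.
    Demeaning within each group wipes out group-level fixed effects,
    leaving OLS on the demeaned data to identify the slope coefficients.
    """
    ids = np.asarray(ids)
    Xd = np.asarray(X, dtype=float).copy()
    yd = np.asarray(y, dtype=float).copy()
    for g in np.unique(ids):
        m = ids == g
        Xd[m] -= Xd[m].mean(axis=0)
        yd[m] -= yd[m].mean()
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta
```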


Calculation of Joint Center Volume (JCV) for Estimation of Joint Size Distribution in Non-Planar Window Survey (비평면 조사창에서의 암반절리 크기분포 추정을 위한 Joint Center Volume (JCV) 산정 기법 제안)

  • Lee, Yong-Ki;Song, Jae-Joon
    • Tunnel and Underground Space
    • /
    • v.29 no.2
    • /
    • pp.89-107
    • /
    • 2019
  • Rock joints play an extremely important role in analyzing the mechanical stability and hydraulic characteristics of rock mass structures. Most rock joint parameters are generally represented as distributions estimated by statistical techniques. In this research, techniques for calculating the Joint Center Volume (JCV), which is required for estimating the joint size distribution (the joint parameter with the largest uncertainty), are analyzed, and a new technique is proposed that is applicable regardless of the shape of the survey window. The existing theoretical JCV calculation technique can be applied only to planar windows, and complete enumeration techniques are limited in the joint trace types they can handle and in analysis time. This research overcomes the limitations on survey window shape and joint trace type by calculating JCV using Monte Carlo simulation. The applicability of the proposed technique is validated through estimation results on non-planar survey windows such as curved surfaces and tunnel surfaces.
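The Monte Carlo idea that makes the method geometry-independent can be shown with a generic hit-or-miss volume estimate: sample points uniformly in a bounding box and count how many satisfy a geometric condition. This is only a sketch of the principle, not the paper's JCV algorithm; the region and parameter names are hypothetical:

```python
import random

def monte_carlo_volume(contains, bounds, n_samples=100_000, seed=42):
    """Hit-or-miss Monte Carlo volume estimate.

    contains: point -> bool, the geometric condition defining the region;
    bounds: list of (lo, hi) per dimension of a bounding box.
    The estimate is box_volume * (fraction of samples inside the region),
    and works for any region the condition can describe.
    """
    rng = random.Random(seed)
    box_volume = 1.0
    for lo, hi in bounds:
        box_volume *= hi - lo
    hits = 0
    for _ in range(n_samples):
        p = [rng.uniform(lo, hi) for lo, hi in bounds]
        if contains(p):
            hits += 1
    return box_volume * hits / n_samples
```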

Statistical estimation of the epochs of observation for the 28 determinative stars in the Shi Shi Xing Jing and the table in Cheonsang Yeolcha Bunyajido (석씨성경과 천상열차분야지도의 이십팔수 수거성 관측 연도의 통계적 추정)

  • Ahn, Sang-Hyeon
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.44 no.2
    • /
    • pp.61.3-61.3
    • /
    • 2019
  • The epochs of observation for the 28 determinative stars in the Shi Shi Xing Jing and the Cheonsang Yeolcha Bunyajido are estimated using two fitting methods. The coordinate values in these tables are thought to have been measured with meridian instruments, and so they contain axis-misalignment errors as well as random errors. We adopt a Fourier method, devise a least squares fitting method, and use bootstrap resampling to estimate the variance of the epochs. As a result, we find that both data sets were made during the 1st century BCE, the latter period of the Former Han dynasty. The sample mean of the epoch for the Shi Shi Xing Jing data is about 15-20 years earlier than that for the Cheonsang Yeolcha Bunyajido. However, the variances are so large that we cannot decide whether the Shi Shi Xing Jing data were formed around 77 BCE and the Cheonsang Yeolcha Bunyajido data measured in 52 BCE. We need either more data points or data points measured with better precision. We will also discuss the other 120 star coordinates listed in the Shi Shi Xing Jing.
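Bootstrap variance estimation, as used here for the fitted epochs, resamples the data with replacement and recomputes the statistic many times. A minimal generic sketch (the statistic and data are placeholders, not the star-coordinate fit of the paper):

```python
import random

def bootstrap_variance(data, statistic, n_boot=2000, seed=7):
    """Estimate the sampling variance of `statistic` by bootstrap.

    Draw n_boot resamples of the data with replacement, evaluate the
    statistic on each, and return the sample variance of those values.
    """
    rng = random.Random(seed)
    n = len(data)
    stats = []
    for _ in range(n_boot):
        resample = [data[rng.randrange(n)] for _ in range(n)]
        stats.append(statistic(resample))
    mean = sum(stats) / n_boot
    return sum((s - mean) ** 2 for s in stats) / (n_boot - 1)
```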


Concept of Trend Analysis of Hydrologic Extreme Variables and Nonstationary Frequency Analysis (극치수문자료의 경향성 분석 개념 및 비정상성 빈도해석)

  • Lee, Jeong-Ju;Kwon, Hyun-Han;Kim, Tae-Woong
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.30 no.4B
    • /
    • pp.389-397
    • /
    • 2010
  • This study introduces a Bayesian frequency analysis that incorporates statistical trend analysis for hydrologic extreme series. The proposed model employs the Gumbel extreme-value distribution to characterize extreme events, and a fully coupled Bayesian frequency model is used to estimate design rainfalls in Seoul. Posterior distributions of the model parameters, in both the Gumbel distribution and the trend analysis, were updated through Markov chain Monte Carlo simulation, mainly using a Gibbs sampler. This study proposes a way to use the nonstationary frequency model for dynamic risk analysis and shows an increase of hydrologic risk over time through time-varying probability density functions. The proposed approach has the advantage of assessing the statistical significance of the trend parameters through inference on the derived posterior distributions.
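Design rainfall under a Gumbel model is the return level $x_T = \mu - \beta \ln(-\ln(1 - 1/T))$, the quantile exceeded on average once every $T$ years. A minimal sketch of that relationship alone (the location/scale values are hypothetical; the paper's nonstationary model lets these parameters vary with time):

```python
import math

def gumbel_cdf(x, mu, beta):
    """Gumbel CDF: F(x) = exp(-exp(-(x - mu) / beta))."""
    return math.exp(-math.exp(-(x - mu) / beta))

def gumbel_return_level(mu, beta, T):
    """T-year return level: the quantile with non-exceedance probability 1 - 1/T."""
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))
```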

Modeling of Visual Attention Probability for Stereoscopic Videos and 3D Effect Estimation Based on Visual Attention (3차원 동영상의 시각 주의 확률 모델 도출 및 시각 주의 기반 입체감 추정)

  • Kim, Boeun;Song, Wonseok;Kim, Taejeong
    • Journal of KIISE
    • /
    • v.42 no.5
    • /
    • pp.609-620
    • /
    • 2015
  • Viewers of videos are likely to absorb more information from the part of the screen that attracts visual attention. This fact has led to visual attention models being used in producing and evaluating videos. In this paper, we investigate the factors that significantly affect visual attention and the mathematical form of a visual attention model. We then estimate visual attention probability using statistical design of experiments. Analysis of variance (ANOVA) verifies that motion velocity, distance from the screen, and amount of defocus blur significantly affect human visual attention. Using response surface modeling (RSM), we create a visual attention score model covering the three factors, from which we calculate the visual attention probabilities (VAPs) of image pixels. The VAPs are applied directly to an existing gradient-based 3D effect perception measurement; by weighting according to the VAPs, our algorithm achieves more accurate measurement than the existing method. The performance of the proposed measurement is assessed by comparing it with subjective evaluations as well as with existing methods. The comparison verifies that the proposed measurement outperforms the existing ones.
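The ANOVA used to screen the three factors reduces, in its simplest one-way form, to comparing between-group and within-group variation. A generic sketch (not the paper's analysis; the group data below are placeholders for attention scores at different factor levels):

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic.

    F = (between-group mean square) / (within-group mean square);
    a large F indicates the factor levels produce different mean responses.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```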