• Title/Summary/Keyword: small sample size


On Convergence of Stratification Algorithms for Skewed Populations

  • Park, In-Ho
    • The Korean Journal of Applied Statistics
    • /
    • v.22 no.6
    • /
    • pp.1277-1287
    • /
    • 2009
  • For stratifying skewed populations, the Lavallée-Hidiroglou (LH) algorithm is often used to form a take-all stratum containing the largest units and several take-some strata for the middle-sized and small units. Owing to its iterative nature, numerical difficulties have been reported, such as the dependence of the final stratum boundaries on the choice of initial boundaries and slow convergence to locally optimal boundaries. Geometric stratification has recently been proposed to provide initial boundaries that avoid these numerical difficulties when implementing the LH algorithm. Since geometric stratification pursues the equalization of the stratum CVs rather than optimization, the corresponding stratum boundaries may not be (near) optimal. This paper revisits these issues of convergence and near-optimality of optimal stratification algorithms using artificial numerical examples. We also discuss the formation of the strata and the sample allocation under the optimization process, as well as some aspects of the discontinuity arising from the finiteness of both the population and the sample.
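The geometric stratification mentioned in this abstract has a simple closed form (Gunning-Horgan): stratum boundaries are placed in geometric progression between the population minimum and maximum. A minimal Python sketch of that rule, not the paper's code:

```python
def geometric_boundaries(x_min, x_max, n_strata):
    """Geometric stratification: boundaries form a geometric progression
    between the population minimum and maximum, which tends to equalize
    stratum CVs for skewed populations."""
    r = (x_max / x_min) ** (1.0 / n_strata)        # common ratio
    return [x_min * r ** h for h in range(n_strata + 1)]

# Example: a skewed population ranging from 10 to 10,000, four strata.
bounds = geometric_boundaries(10.0, 10_000.0, 4)
```

These boundaries can then serve as the initial values handed to the iterative LH algorithm, which is exactly the usage the abstract discusses.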

A Bayesian inference for fixed effect panel probit model

  • Lee, Seung-Chun
    • Communications for Statistical Applications and Methods
    • /
    • v.23 no.2
    • /
    • pp.179-187
    • /
    • 2016
  • The fixed effects panel probit model faces the "incidental parameters problem" because the number of parameters to be estimated grows with the sample size. Maximum likelihood estimation fails to give a consistent estimator of the slope parameter. Unlike the panel regression model, it is not feasible to find an orthogonal reparameterization of the fixed effects that yields a consistent estimator. In this note, a hierarchical Bayesian model is proposed. The model is essentially equivalent to the frequentist's random effects model, but the individual specific effects are estimable with the help of Gibbs sampling. The Bayesian estimator is shown to reduce the small sample bias. The maximum likelihood estimator in the random effects model is also efficient, which contradicts Green (2004)'s conclusion.
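The role Gibbs sampling plays for probit models can be illustrated on the simplest case. The sketch below uses Albert-Chib data augmentation for a one-covariate probit with a flat prior on the slope; it is an illustrative stand-in with synthetic data, not the paper's hierarchical panel model:

```python
import math
import random
from statistics import NormalDist

def gibbs_probit(x, y, draws=500, burn=100, seed=1):
    """Albert-Chib data augmentation for a one-covariate probit model:
    alternate between latent normals z_i truncated by the observed y_i
    and the conjugate normal draw of the slope beta (flat prior)."""
    nd, rng = NormalDist(), random.Random(seed)
    sxx = sum(v * v for v in x)
    beta, kept = 0.0, []
    for it in range(draws):
        z = []
        for xi, yi in zip(x, y):
            mu = beta * xi
            p0 = nd.cdf(-mu)                      # P(z_i <= 0 | beta)
            u = rng.uniform(p0, 1.0) if yi == 1 else rng.uniform(0.0, p0)
            u = min(max(u, 1e-10), 1.0 - 1e-10)   # guard the extreme tails
            z.append(mu + nd.inv_cdf(u))          # truncated-normal draw
        mean = sum(xi * zi for xi, zi in zip(x, z)) / sxx
        beta = rng.gauss(mean, 1.0 / math.sqrt(sxx))
        if it >= burn:
            kept.append(beta)
    return sum(kept) / len(kept)

# Synthetic data with true slope 1.0, for illustration only.
rng = random.Random(7)
x = [rng.uniform(-2, 2) for _ in range(200)]
y = [1 if xi + rng.gauss(0, 1) > 0 else 0 for xi in x]
post_mean = gibbs_probit(x, y)
```

The same augmentation idea extends to the hierarchical random-effects formulation, which is how the individual specific effects become estimable.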

Study on Sampling Techniques for Digital Elevation Model (수치표고모형에 있어서 표고추출법의 연구)

  • Kang, In-Joon;Jung, Jae-Hyung;Kwak, Jae-Ha
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.10 no.2
    • /
    • pp.49-55
    • /
    • 1992
  • Sampling techniques are very important in digital elevation modeling. Sampling can be done by scanning or by digitizing; this study is limited to the digitizing method. The continuous sampling method treats each contour line as a single entity, the grid method reads sample elevations directly at each grid point, and the triangulated irregular network method requires identifying topographical lines to sample elevation data. As a result, the authors found that the continuous sampling method is economical for data input and the triangulated irregular network method requires a small memory size.
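For the grid method described above, elevations between sampled grid points are commonly recovered by bilinear interpolation. A minimal sketch, assuming coordinates are expressed in grid-cell units (a simplification, not taken from the paper):

```python
def grid_elevation(dem, x, y):
    """Bilinear interpolation of elevation from a regular-grid DEM.
    dem[i][j] holds the elevation at row i, column j; (x, y) are in
    grid units, so cell corners sit at integer coordinates."""
    i, j = int(y), int(x)            # lower-left corner of the cell
    dy, dx = y - i, x - j            # fractional position inside the cell
    z00, z01 = dem[i][j], dem[i][j + 1]
    z10, z11 = dem[i + 1][j], dem[i + 1][j + 1]
    return (z00 * (1 - dx) * (1 - dy) + z01 * dx * (1 - dy)
            + z10 * (1 - dx) * dy + z11 * dx * dy)

# Toy 2x2 grid: the cell-center elevation is the average of the corners.
dem = [[0.0, 10.0], [20.0, 30.0]]
z = grid_elevation(dem, 0.5, 0.5)
```

A TIN, by contrast, stores only the irregular sample points and their triangulation, which is why it can get by with less memory.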

Development of Optimal Accelerated Life Test Plans for Weibull Distribution Under Intermittent Inspection

  • Seo, Sun-Keun
    • Journal of Korean Society for Quality Management
    • /
    • v.17 no.1
    • /
    • pp.89-106
    • /
    • 1989
  • For Weibull distributed lifetimes, this paper presents asymptotically optimal accelerated life test plans for practical applications under intermittent inspection and type-I censoring. Computational results show that the asymptotic variance of a low quantile at the design stress, used as the optimality criterion, is insensitive to the number of inspections at overstress levels. Sensitivity analyses indicate that the optimal plans are robust to moderate departures of the estimated failure probabilities at the design and high stresses, which serve as input parameters for planning accelerated life tests, from their true values. A Monte Carlo simulation is conducted as a small-sample study of the optimal plans developed by asymptotic maximum likelihood theory. The simulation results suggest that the optimal plans are satisfactory for sample sizes used in practice.
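The kind of small-sample Monte Carlo check described above can be sketched in a stripped-down setting: a known Weibull shape parameter, no censoring, and no stress acceleration (all simplifications of the paper's setup). The simulation measures the small-sample bias of an estimated low quantile:

```python
import math
import random

def weibull_sample(shape, scale, n, rng):
    # Inverse-transform sampling: T = scale * (-ln U)^(1/shape)
    return [scale * (-math.log(1.0 - rng.random())) ** (1.0 / shape)
            for _ in range(n)]

def scale_mle_known_shape(data, shape):
    # Closed-form scale MLE when the shape parameter is treated as known
    return (sum(t ** shape for t in data) / len(data)) ** (1.0 / shape)

def weibull_quantile(p, shape, scale):
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

rng = random.Random(42)
reps, n, shape, scale = 2000, 20, 2.0, 100.0
true_q = weibull_quantile(0.10, shape, scale)
estimates = [
    weibull_quantile(0.10, shape,
                     scale_mle_known_shape(weibull_sample(shape, scale, n, rng),
                                           shape))
    for _ in range(reps)
]
bias = sum(estimates) / reps - true_q    # small-sample bias of the quantile MLE
```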

An Accelerated Test Acceptance Control Chart for Process Quality Assurance (공정보증을 위한 가속시험 합격판정 관리도)

  • Kim Jong Gurl
    • Journal of the Korea Safety Management & Science
    • /
    • v.1 no.1
    • /
    • pp.123-134
    • /
    • 1999
  • There are several models for process quality assurance: quality systems (ISO 9000), process capability analysis, acceptance control charts, and so on. When a high level of process capability has been achieved, it takes a long time to monitor a process shift, so a quicker monitoring system is sometimes necessary. To achieve a quicker quality assurance model for high-reliability processes, this paper presents a model for process quality assurance when the fraction nonconforming is very small. We design an acceptance control chart based on a variable quality characteristic and time-censored accelerated testing. The distribution of the characteristic is assumed to be normal or lognormal, with a location parameter that is a linear function of the stress. The design parameters are the sample size, the control limits, and the sample proportions allocated to the low stress. These parameters are obtained by minimizing the relative variance of the MLE of the location parameter subject to APL and RPL constraints.

Comparison of MLE and REMLE of Linear Mixed Models in Assessing Bioequivalence based on 2x2 Crossover Design with Missing data

  • Chung, Yun-Ro;Park, Sang-Gue
    • Journal of the Korean Data and Information Science Society
    • /
    • v.19 no.4
    • /
    • pp.1211-1218
    • /
    • 2008
  • Maximum likelihood estimation (MLE) and restricted maximum likelihood estimation (REMLE) approaches are available for analyzing linear mixed models (LMM) such as those used in bioequivalence trials. The US FDA (2001) guidance notes that REMLE may be useful for assessing bioequivalence (BE). This paper studies the statistical behavior of the two methods in BE tests when some observations are missing at random. The simulation results show that REMLE maintains the given nominal level well, while MLE gives somewhat higher power. Considering both levels and powers, the REMLE approach is recommended when the sample sizes are small to moderate, and the MLE approach should be used when the sample size is greater than 30.
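The ML/REML distinction driving these results is easiest to see in the simplest special case, variance estimation for one sample: ML divides by n and is biased downward, while the REML-style estimator divides by n - 1. A toy analogue of the LMM variance components, not the 2x2 crossover analysis itself:

```python
import statistics

def ml_variance(xs):
    """ML variance estimate: divides by n, biased downward because it
    ignores the degree of freedom consumed by estimating the mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def reml_variance(xs):
    """REML-style estimate: dividing by n - 1 removes that bias, the
    same correction REML applies to variance components in an LMM."""
    return statistics.variance(xs)

xs = [1.0, 2.0, 3.0, 4.0]
```

With missing data the degrees-of-freedom accounting matters more, which is consistent with REMLE holding its nominal level better in the simulations.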

A Study on Autoignition of Fish Meal with Change of Ambient Temperature (주위온도 변화에 따른 어분의 자연발화에 관한연구)

  • 목연수;최재욱
    • Journal of the Korean Society of Safety
    • /
    • v.7 no.1
    • /
    • pp.47-56
    • /
    • 1992
  • Spontaneous ignition characteristics of fish meal were observed by performing experiments at constant ambient temperatures and with sinusoidally varying ambient temperatures. In the constant-temperature experiments, the critical spontaneous ignition temperatures of the sample in the large, intermediate, and small vessels were 170.5°C, 177.5°C, and 188.5°C, respectively; the critical spontaneous ignition temperature decreased as the sample vessel size increased. The apparent activation energy of the fish meal, calculated from Frank-Kamenetskii's thermal ignition theory, was 37.60 kcal/mol. When the ambient temperature was varied sinusoidally, the temperature amplitudes were 10°C, 20°C, and 30°C, with periods of 1 hr, 2 hrs, and 3 hrs at each amplitude. The results showed that the critical spontaneous ignition temperatures under varying ambient temperature were lower than that at the constant ambient temperature and increased as the amplitude increased. At the same amplitude, the critical spontaneous ignition temperature increased with the period.
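The Frank-Kamenetskii calculation reduces to a linear fit: ln(δ_c T_c² / r²) is linear in 1/T_c with slope -E/R. The sketch below uses the critical temperatures from the abstract but assumes hypothetical vessel radii and the spherical critical parameter δ_c = 3.32, so the resulting activation energy is illustrative only and will not match the paper's 37.60 kcal/mol:

```python
import math

def activation_energy(temps_c, radii_m, delta_c=3.32):
    """Frank-Kamenetskii plot: least-squares fit of ln(delta_c*T^2/r^2)
    against 1/T over vessels of different sizes; the slope is -E/R."""
    R = 8.314  # gas constant, J/(mol K)
    xs = [1.0 / (t + 273.15) for t in temps_c]
    ys = [math.log(delta_c * (t + 273.15) ** 2 / r ** 2)
          for t, r in zip(temps_c, radii_m)]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope * R  # activation energy in J/mol

# Critical temperatures from the abstract (large, intermediate, small
# vessel); the radii are hypothetical values chosen for illustration.
E = activation_energy([170.5, 177.5, 188.5], [0.05, 0.035, 0.02])
```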

The Effect of Cold-rolling on Microstructure and Transformation Behavior of Cu-Zn-Al Shape Memory Alloy (냉간가공에 의한 CuZnAl계 형상기억합금의 결정립미세화와 특성평가)

  • Lee, Sang-Bong;Park, No-Jin
    • Korean Journal of Materials Research
    • /
    • v.9 no.3
    • /
    • pp.322-326
    • /
    • 1999
  • In this study, cold rolling followed by appropriate annealing was adopted for grain refinement of a Cu-26.65Zn-4.05Al-0.31Ti (wt%) shape memory alloy. For cold deformation, the alloy must contain the ductile α-phase. After heat treatment at 550°C, an (α+β) dual-phase structure with 40 vol.% α-phase was obtained, which could be rolled at room temperature. The alloy was cold rolled to a final thickness of 1.0 mm with total reductions of 70% and 90%. The rolled sheets were betatized at 800°C for various times and then quenched into ice water. The grain size of the cold-rolled samples was 60-80 μm, much smaller than that of hot-rolled samples, and the 90% rolled sample showed a smaller grain size than the 70% rolled one. The small grain size influenced the phase transformation temperatures and the stabilization of the austenitic phases.

Locally Powerful Unit-Root Test (국소적 강력 단위근 검정)

  • Choi, Bo-Seung;Woo, Jin-Uk;Park, You-Sung
    • Communications for Statistical Applications and Methods
    • /
    • v.15 no.4
    • /
    • pp.531-542
    • /
    • 2008
  • The unit root test is the major tool for deciding whether to use differencing or detrending to eliminate the trend from time series data. The Dickey-Fuller test (Dickey and Fuller, 1979) has low power when the sample size is small or the true coefficient of the AR(1) process is close to unity, while Bayesian unit root tests have complicated testing procedures. We propose a new unit root testing procedure that combines the Bayesian approach with the traditional testing procedure. Simulation studies show that our approach has locally higher power than the Dickey-Fuller test when the sample size is small or the time series is nearly a unit root process, and a simpler procedure than the Bayesian unit root test. The proposed testing procedure can also be applied to time series data that are not observed as a unit root process.
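The Dickey-Fuller statistic the paper benchmarks against is just a t-ratio from regressing Δy_t on y_{t-1}. A minimal simulation sketch for the near-unit-root, small-sample setting the abstract describes (this is the classical test only, not the authors' mixed Bayesian procedure):

```python
import random

def df_statistic(y):
    """Dickey-Fuller t-statistic (no drift, no trend): OLS regression of
    delta y_t on y_{t-1}; under H0 the series has a unit root."""
    dy = [y[t] - y[t - 1] for t in range(1, len(y))]
    ylag = y[:-1]
    sxx = sum(v * v for v in ylag)
    b = sum(u * v for u, v in zip(ylag, dy)) / sxx      # phi_hat - 1
    resid = [u - b * v for u, v in zip(dy, ylag)]
    s2 = sum(e * e for e in resid) / (len(resid) - 1)
    return b / (s2 / sxx) ** 0.5

def simulate_mean_stat(phi, n, reps, seed=0):
    """Average DF statistic over simulated AR(1) paths with coefficient phi."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        y = [rng.gauss(0, 1)]
        for _ in range(n - 1):
            y.append(phi * y[-1] + rng.gauss(0, 1))
        total += df_statistic(y)
    return total / reps

# Near-unit-root AR(1), small sample: the setting where DF power is low.
mean_stat = simulate_mean_stat(phi=0.95, n=50, reps=200)
```

With phi this close to one and n = 50, the simulated statistics cluster near the 5% critical value of about -1.95, which is why the classical test struggles to reject in this regime.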

A pilot study using machine learning methods about factors influencing prognosis of dental implants

  • Ha, Seung-Ryong;Park, Hyun Sung;Kim, Eung-Hee;Kim, Hong-Ki;Yang, Jin-Yong;Heo, Junyoung;Yeo, In-Sung Luke
    • The Journal of Advanced Prosthodontics
    • /
    • v.10 no.6
    • /
    • pp.395-400
    • /
    • 2018
  • PURPOSE. This study tried to identify the most significant factors predicting implant prognosis using machine learning methods. MATERIALS AND METHODS. The data were based on a systematic search of chart files at Seoul National University Bundang Hospital over one year. In this period, oral and maxillofacial surgeons inserted 667 implants in 198 patients after consultation with a prosthodontist. Traditional statistical methods were inappropriate for this study, which analyzed data with a small sample size to find factors affecting the prognosis. Machine learning methods were used instead, since they retain analyzing power for a small sample size and can uncover previously unknown factors that affect the result. A decision tree model and a support vector machine were used for the analysis. RESULTS. The results identified the mesio-distal position of the inserted implant as the most significant factor determining its prognosis. Both machine learning methods, the decision tree model and the support vector machine, yielded similar results. CONCLUSION. Dental clinicians should be careful in locating implants in patients' mouths, especially mesio-distally, to minimize complications against implant survival.
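The decision tree's split-selection logic can be illustrated with a one-level tree (a decision stump) chosen by Gini impurity. The data below are hypothetical values for illustration only; this is not the study's model or its data:

```python
def gini(labels):
    """Gini impurity of a set of 0/1 labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n                 # fraction of positives
    return 2 * p * (1 - p)

def best_stump(xs, ys):
    """One-level decision tree: pick the threshold on a single feature
    that minimizes the weighted Gini impurity of the two branches."""
    best_score, best_t = float("inf"), None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_score, best_t = score, t
    return best_t

# Hypothetical feature (e.g. a positional score) vs. outcome
# (0 = survival, 1 = failure); a clean split exists at 3.
xs = [1, 2, 3, 6, 7, 8]
ys = [0, 0, 0, 1, 1, 1]
threshold = best_stump(xs, ys)
```

A full decision tree applies this search recursively over all features, which is how a categorical factor such as mesio-distal position can surface as the dominant split.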