• Title/Summary/Keyword: mean and variance

Search Results: 2,029

Nonlinear Goal Programming Approach for Robust Parameter Experiments (로버스트 변수모형의 비선형 목표계획법 접근방법)

  • Lee, Sang-Heon
    • Journal of the military operations research society of Korea
    • /
    • v.28 no.1
    • /
    • pp.47-66
    • /
    • 2002
  • Instead of using the signal-to-noise ratio, we attempt to optimize both the mean and variance responses using a dual response optimization technique. This alternative experimental strategy analyzes a robust parameter design problem to obtain the best settings that put the mean on target while minimizing its variance. The mean and variance are treated as the two responses of interest to be optimized. Unlike the crossed array and combined array approaches, our experimental setup requires replicated runs for each control factor treatment under noise sampling. When the postulated response models are true, they allow the coefficients to be estimated and the desired performance measure to be analyzed more efficiently. A procedure and an illustrative example are given for dual response optimization via nonlinear goal programming.
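
A minimal sketch of the dual response idea described above, assuming hypothetical fitted quadratic response models for the mean and standard deviation (the coefficients, target, and factor bounds below are illustrative, not from the paper); it minimizes the variance response subject to the mean hitting its target with a general-purpose constrained solver rather than the paper's nonlinear goal programming formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical fitted response-surface models (coefficients are illustrative only).
def mean_response(x):
    x1, x2 = x
    return 80.0 + 5.0 * x1 - 3.0 * x2 + 1.5 * x1 * x2

def sd_response(x):
    x1, x2 = x
    return 4.0 + 2.0 * x1**2 + 1.0 * x2**2   # standard-deviation response

TARGET = 85.0  # target condition on the mean (assumed value)

# Dual response optimization: minimize the variance response while
# holding the mean response at its target (an equality constraint).
result = minimize(
    fun=lambda x: sd_response(x) ** 2,
    x0=np.zeros(2),
    constraints=[{"type": "eq", "fun": lambda x: mean_response(x) - TARGET}],
    bounds=[(-2, 2), (-2, 2)],   # coded factor levels
)

print("control-factor settings:", result.x)
print("mean at optimum       :", mean_response(result.x))
print("std. dev. at optimum  :", sd_response(result.x))
```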

Efficient Use of Auxiliary Variables in Estimating Finite Population Variance in Two-Phase Sampling

  • Singh, Housila P.;Singh, Sarjinder;Kim, Jong-Min
    • Communications for Statistical Applications and Methods
    • /
    • v.17 no.2
    • /
    • pp.165-181
    • /
    • 2010
  • This paper presents some chain ratio-type estimators for estimating the finite population variance using two auxiliary variables in a two-phase sampling setup. Expressions for the biases and mean squared errors of the suggested classes of estimators are given. Asymptotic optimum estimators (AOEs) in each class are identified along with their approximate mean squared error formulae. The theoretical and empirical properties of the suggested classes of estimators are investigated. In the simulation study, we use a real dataset on pulmonary disease available on the CD accompanying the book by Rosner (2005).
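
As a rough illustration of the ratio-type idea, the sketch below implements a single-auxiliary ratio-type estimator of the finite population variance in two-phase sampling; the paper's chain ratio-type classes use two auxiliary variables and are more general, and the simulated data are purely illustrative.

```python
import numpy as np

def ratio_type_variance_estimator(x_phase1, x_phase2, y_phase2):
    """Ratio-type estimator of the finite population variance of y in
    two-phase sampling: the second-phase sample variance of y is scaled
    by the ratio of the first-phase to second-phase sample variances of
    the auxiliary variable x (a simplified, single-auxiliary stand-in
    for the chain ratio-type classes discussed in the paper)."""
    s2_y  = np.var(y_phase2, ddof=1)
    s2_x1 = np.var(x_phase1, ddof=1)   # large preliminary sample (x only)
    s2_x2 = np.var(x_phase2, ddof=1)   # subsample on which y is observed
    return s2_y * s2_x1 / s2_x2

# Illustrative use with simulated data (x and y positively related).
rng = np.random.default_rng(0)
x1 = rng.gamma(shape=3.0, scale=2.0, size=500)       # phase-1 sample
idx = rng.choice(500, size=80, replace=False)        # phase-2 subsample
x2 = x1[idx]
y2 = 2.0 * x2 + rng.normal(scale=1.0, size=80)

print(ratio_type_variance_estimator(x1, x2, y2))
```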

Mean Estimation in Two-phase Sampling (이중추출에서 모평균 추정)

  • 김규성;김진석;이선순
    • The Korean Journal of Applied Statistics
    • /
    • v.14 no.1
    • /
    • pp.13-24
    • /
    • 2001
  • In this paper, we investigated mean estimation methods in two-phase sampling. Under a fixed expected cost, we reviewed the optimal sample sizes, minimum variances and approximately unbiased variance estimators for the usual ratio estimator, the stratified sample mean with proportional allocation, and Rao's allocation of the second-phase sample. We also proposed a combined ratio estimator, which uses both ratio estimation and stratification, and derived its optimal sample size, minimum variance and unbiased variance estimator. In a limited simulation study we compared the estimators by design effect and found that the ratio estimator is more efficient than the stratified sample mean in some cases and less efficient in others, while the combined ratio estimator is more efficient than both in most cases.
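
A minimal sketch of the usual ratio estimator of the mean reviewed above (the combined ratio estimator with stratification is not shown); the simulated first- and second-phase samples are illustrative only.

```python
import numpy as np

def two_phase_ratio_mean(x_phase1, x_phase2, y_phase2):
    """Usual ratio estimator of the population mean in two-phase sampling:
    the second-phase mean of y is scaled by the ratio of the first-phase
    to second-phase means of the auxiliary variable x."""
    return np.mean(y_phase2) * np.mean(x_phase1) / np.mean(x_phase2)

rng = np.random.default_rng(1)
x1 = rng.uniform(10, 50, size=1000)              # first-phase sample (x only)
sub = rng.choice(1000, size=120, replace=False)  # second-phase subsample
x2 = x1[sub]
y2 = 0.8 * x2 + rng.normal(scale=3.0, size=120)

print(two_phase_ratio_mean(x1, x2, y2))
```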

Optimal Portfolio Models for an Inefficient Market

  • GINTING, Josep;GINTING, Neshia Wilhelmina;PUTRI, Leonita;NIDAR, Sulaeman Rahman
    • The Journal of Asian Finance, Economics and Business
    • /
    • v.8 no.2
    • /
    • pp.57-64
    • /
    • 2021
  • This research attempts to formulate a new mean-risk model to replace the Markowitz mean-variance model, altering the risk measure by using the ARCH variance instead of the ordinary variance. In building the portfolio, the samples used are closing prices of the Indonesia Composite Stock Index and the Indonesia Composite Bonds Index from 2013 to 2018. The study uses secondary data from the Indonesia Stock Exchange and the Indonesia Bonds Pricing Agency. The research found that Markowitz's model remains superior on daily data, while the mean-ARCH model is more appropriate for data with wider gaps between observations, such as monthly data. The historical return also proved more appropriate than the risk-free rate as a benchmark for selecting an optimal portfolio in an inefficient market. The findings show that the resulting portfolio combination is inefficient because of market inefficiency, indicated by the meager return of the stocks coupled with a notable standard deviation. The researchers therefore propose replacing the risk-free rate benchmark with the historical return, which proved more realistic than the risk-free rate under inefficient market conditions.
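
The sketch below illustrates the substitution the paper describes: an ARCH-type conditional variance replacing the ordinary sample variance as the risk measure. The ARCH(1) parameters are estimated here by a crude least-squares fit rather than maximum likelihood, and the simulated return series stands in for the index data used in the study.

```python
import numpy as np

def arch1_variance(returns):
    """One-step-ahead ARCH(1) variance, with (omega, alpha) estimated by a
    simple least-squares fit of r_t^2 on r_{t-1}^2 (a rough stand-in for
    maximum likelihood, used here only for illustration)."""
    r2 = returns**2
    X = np.column_stack([np.ones(len(r2) - 1), r2[:-1]])
    omega, alpha = np.linalg.lstsq(X, r2[1:], rcond=None)[0]
    return omega + alpha * r2[-1]

rng = np.random.default_rng(2)
r = rng.normal(scale=0.01, size=1500)   # placeholder for daily index returns

print("sample variance :", np.var(r, ddof=1))
print("ARCH(1) variance:", arch1_variance(r))
```

Either quantity can then be plugged into the usual mean-risk portfolio selection as the risk input; the paper's point is that the choice matters once the observation interval widens.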

Evaluation of Non - Normal Process Capability by Johnson System (존슨 시스템에 의한 비정규 공정능력의 평가)

  • 김진수;김홍준
    • Journal of the Korea Safety Management & Science
    • /
    • v.3 no.3
    • /
    • pp.175-190
    • /
    • 2001
  • We propose a new process capability index $C_{psk}$(WV) that applies the weighted variance control charting method to non-normally distributed processes. The main idea of the weighted variance method (WVM) is to split a skewed or asymmetric distribution at its mean into two normal distributions that have the same mean but different standard deviations. Using a distribution generated from the Johnson family of distributions as an example, we demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and the Wright methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in terms of sensitivity to departures of the process mean/median from the target value for non-normal processes. A second comparison evaluates the percentage nonconforming under the Pearson, Johnson and Burr systems; it shows little difference between the Pearson and Burr systems, but the Johnson system underestimates process capability relative to the other two.
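
A sketch of one common weighted-variance capability calculation, assuming the split-at-the-mean formulation with upper and lower standard deviations $\sigma\sqrt{2P}$ and $\sigma\sqrt{2(1-P)}$, where $P = P(X \le \bar{X})$; the paper's $C_{psk}$(WV) index and its Johnson-system comparisons are not reproduced here.

```python
import numpy as np

def cpk_weighted_variance(x, lsl, usl):
    """Weighted-variance (WV) capability index: the skewed distribution is
    split at its mean into two halves whose standard deviations are
    sigma*sqrt(2*P) and sigma*sqrt(2*(1-P)), with P = P(X <= mean).
    One common WV formulation; C_psk(WV) in the paper adds further terms."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    p = np.mean(x <= mu)                        # estimate of P(X <= mean)
    sigma_u = sigma * np.sqrt(2.0 * p)          # upper-side std. dev.
    sigma_l = sigma * np.sqrt(2.0 * (1.0 - p))  # lower-side std. dev.
    cpu = (usl - mu) / (3.0 * sigma_u)
    cpl = (mu - lsl) / (3.0 * sigma_l)
    return min(cpu, cpl)

rng = np.random.default_rng(3)
data = rng.lognormal(mean=0.0, sigma=0.4, size=400)   # right-skewed process
print(cpk_weighted_variance(data, lsl=0.3, usl=3.5))
```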

A Study on a Measure for Non-Normal Process Capability (비정규 공정능력 측도에 관한 연구)

  • 김홍준;김진수;조남호
    • Proceedings of the Korean Reliability Society Conference
    • /
    • 2001.06a
    • /
    • pp.311-319
    • /
    • 2001
  • All indices now in use assume normally distributed data, and any use of these indices on non-normal data results in inaccurate capability measurements. Therefore $C_{s}$ is proposed, which extends the most useful index to date, the Pearn-Kotz-Johnson $C_{pmk}$, by not only taking into account that the process mean may not lie midway between the specification limits and incorporating a penalty when the mean deviates from its target, but also incorporating a penalty for skewness. We therefore propose a new process capability index $C_{psk}$(WV) that applies the weighted variance control charting method to non-normally distributed processes. The main idea of the weighted variance method (WVM) is to split a skewed or asymmetric distribution at its mean into two normal distributions that have the same mean but different standard deviations. Using a distribution generated from the Johnson family of distributions as an example, we demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and the Wright methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in terms of sensitivity to departures of the process mean/median from the target value for non-normal processes.

Stationary bootstrapping for structural break tests for a heterogeneous autoregressive model

  • Hwang, Eunju;Shin, Dong Wan
    • Communications for Statistical Applications and Methods
    • /
    • v.24 no.4
    • /
    • pp.367-382
    • /
    • 2017
  • We consider an infinite-order long-memory heterogeneous autoregressive (HAR) model, motivated by a long-memory property of realized volatilities (RVs), as an extension of the finite-order HAR-RV model. We develop bootstrap tests for structural mean or variance changes in the infinite-order HAR model via stationary bootstrapping. A functional central limit theorem is proved for the stationary bootstrap sample, which enables us to develop stationary bootstrap cumulative sum (CUSUM) tests: a bootstrap test for a mean break and a bootstrap test for a variance break. Consistency of the bootstrap null distributions of the CUSUM tests is proved, and consistency of the bootstrap CUSUM tests under alternative hypotheses of mean or variance changes is also established. A Monte Carlo simulation shows that stationary bootstrapping improves the sizes of existing tests.
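
A compact sketch of the two ingredients named above: a stationary bootstrap resample with geometric block lengths, and a CUSUM statistic for a mean break. It is applied here to a plain simulated series rather than to the infinite-order HAR model of realized volatility studied in the paper; a variance-break version would apply the same CUSUM to squared, demeaned observations.

```python
import numpy as np

def stationary_bootstrap(x, p, rng):
    """One stationary-bootstrap resample (Politis-Romano): geometric block
    lengths with mean 1/p, blocks wrapped circularly."""
    n = len(x)
    idx = np.empty(n, dtype=int)
    idx[0] = rng.integers(n)
    for t in range(1, n):
        if rng.random() < p:                 # start a new block
            idx[t] = rng.integers(n)
        else:                                # continue the current block
            idx[t] = (idx[t - 1] + 1) % n
    return x[idx]

def cusum_stat(x):
    """CUSUM statistic for a mean break: max_k |S_k| / (sd * sqrt(n))."""
    d = x - x.mean()
    return np.max(np.abs(np.cumsum(d))) / (x.std(ddof=1) * np.sqrt(len(x)))

# Bootstrap p-value for a mean break on a simulated series (illustrative).
rng = np.random.default_rng(4)
x = rng.normal(size=500)
obs = cusum_stat(x)
boot = np.array([cusum_stat(stationary_bootstrap(x, p=0.1, rng=rng))
                 for _ in range(499)])
print("bootstrap p-value:", np.mean(boot >= obs))
```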

A Robust Ship Scheduling Based on Mean-Variance Optimization Model (평균-분산 최적화 모형을 이용한 로버스트 선박운항 일정계획)

  • Park, Nareh;Kim, Si-Hwa
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • v.41 no.2
    • /
    • pp.129-139
    • /
    • 2016
  • This paper presents a robust ship scheduling model based on quadratic programming. Given a set of available carriers under control and a set of cargoes to be transported from origin to destination, a robust ship schedule can be modeled that minimizes a mean-variance objective function while meeting a required level of profit. Computational experiments on relevant maritime transportation problems are performed on randomly generated configurations of tanker scheduling in bulk trade. In the first stage, the revenue-maximizing transportation problem is solved through the traditional set-packing model that includes all feasible schedules for each carrier. In the second stage, the robust ship scheduling problem is formulated as the quadratic program described above. A single index model is used to calculate the variance-covariance matrix of the objective function efficiently. The reported results validate that the proposed model can support ship scheduling decisions that account for robustness and the required level of profit.
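
A minimal sketch of the single index model covariance construction mentioned above, $\beta\beta^{\prime}\sigma_m^2$ plus a diagonal matrix of residual variances; simulated asset returns stand in for the schedule profit series of the scheduling problem, so the inputs are illustrative only.

```python
import numpy as np

def single_index_covariance(asset_returns, market_returns):
    """Variance-covariance matrix under the single index model:
    Cov = beta beta' * var(market) + diag(residual variances).
    asset_returns: (T, N) array; market_returns: (T,) array."""
    var_m = np.var(market_returns, ddof=1)
    cov_with_m = np.cov(asset_returns.T, market_returns)[-1, :-1]
    beta = cov_with_m / var_m
    alpha = asset_returns.mean(axis=0) - beta * market_returns.mean()
    resid = asset_returns - np.outer(market_returns, beta) - alpha
    resid_var = np.var(resid, axis=0, ddof=2)   # two parameters estimated
    return np.outer(beta, beta) * var_m + np.diag(resid_var)

rng = np.random.default_rng(5)
rm = rng.normal(0.0, 0.02, size=250)                        # common factor
R = np.outer(rm, [0.8, 1.1, 1.4]) + rng.normal(0, 0.01, (250, 3))
print(single_index_covariance(R, rm))
```

The resulting matrix is what the mean-variance (quadratic) objective would use in the second stage, avoiding the need to estimate every pairwise covariance directly.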

Updating algorithms in statistical computations (통계계산에서의 갱신 알고리즘에 관한 연구)

  • 전홍석
    • The Korean Journal of Applied Statistics
    • /
    • v.5 no.2
    • /
    • pp.283-292
    • /
    • 1992
  • Updating algorithms are studied for the basic statistics (mean, variance). For a linear model, recursive formulae for the least squares estimators of the regression coefficients, the residual sum of squares and the variance-covariance matrix are also studied. Hotelling's $T^2$ statistic can be calculated recursively using the recursive formulae for the mean vector and the variance-covariance matrix, without computing the sample variance-covariance matrix at each stage.
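
A standard one-pass updating recursion of the kind studied in the paper (Welford's algorithm) for the sample mean and variance; the linear-model and Hotelling's $T^2$ recursions are not shown.

```python
class RunningMeanVar:
    """One-pass updating recursion for the sample mean and variance
    (Welford's algorithm): each new observation updates the statistics
    without revisiting earlier data."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)    # uses the updated mean

    @property
    def variance(self):                       # sample variance (ddof = 1)
        return self.m2 / (self.n - 1) if self.n > 1 else float("nan")

acc = RunningMeanVar()
for value in [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]:
    acc.update(value)
print(acc.mean, acc.variance)   # -> 5.0 and about 4.571
```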

On statistical properties of some difference-based error variance estimators in nonparametric regression with a finite sample

  • Park, Chun-Gun
    • Journal of the Korean Data and Information Science Society
    • /
    • v.22 no.3
    • /
    • pp.575-587
    • /
    • 2011
  • We investigate some statistical properties of several difference-based error variance estimators in the nonparametric regression model. Most existing difference-based methods are developed through asymptotic arguments. Our focus is on the exact form of the mean and variance of the lag-k difference-based estimator and the second-order difference-based estimator in a finite sample. Our approach can be extended to Tong's (2005) estimator and is helpful for obtaining the optimal k.
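
A sketch of the two estimators whose exact finite-sample mean and variance the paper studies: the lag-k difference-based estimator and a second-order difference-based estimator for an equally spaced design; the simulated regression function and noise level below are illustrative only.

```python
import numpy as np

def lag_k_variance(y, k=1):
    """Lag-k difference-based error variance estimator:
    sum of (y[i+k] - y[i])^2 divided by 2*(n - k)."""
    d = y[k:] - y[:-k]
    return np.sum(d**2) / (2.0 * (len(y) - k))

def second_order_variance(y):
    """Second-order difference-based estimator (equally spaced design):
    pseudo-residuals (y[i-1] + y[i+1]) / 2 - y[i] each have variance
    1.5 * sigma^2 under a smooth mean, hence the 2/3 normalisation."""
    e = 0.5 * y[:-2] + 0.5 * y[2:] - y[1:-1]
    return 2.0 * np.sum(e**2) / (3.0 * (len(y) - 2))

# Illustration on a smooth signal plus noise with known sigma = 0.5.
rng = np.random.default_rng(6)
t = np.linspace(0, 1, 200)
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.5, size=t.size)
print(lag_k_variance(y, k=1), second_order_variance(y))   # both near 0.25
```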