• Title/Summary/Keyword: Mean function


Determination of the Resetting Time to the Process Mean Shift by the Loss Function (손실함수를 적용한 공정평균 이동에 대한 조정시기 결정)

  • Lee, Do-Kyung
    • Journal of Korean Society of Industrial and Systems Engineering / v.40 no.1 / pp.165-172 / 2017
  • Machines are physically or chemically degraded by continuous usage. One result of this degradation is the process mean shift. Under a process mean shift, the production cost, failure cost, and quality loss function cost increase continuously, so periodic preventive resetting of the process is necessary. We suppose that the wear level is observable; in this case, the process mean shift problem has characteristics similar to maintenance policy models. Previous studies have treated the process mean shift problem in several fields, such as 'tool wear limit', 'canning process', and 'quality loss function', separately or in partially integrated form. This paper proposes an integrated cost model that involves the production cost of material, the failure cost of nonconforming items, the quality loss function cost of the deviation of the quality characteristic from the target value, and the process resetting cost. We extend the process mean shift problem by treating the process variance as a function rather than a constant, and we suggest a multiplier-function model for the process variance based on analysis of practical data. We adopt a two-sided specification, with the initial process mean generally set somewhat above the lower specification limit. The objective function is the total integrated cost per unit wear, and the decision variables are the wear limit and the initial process mean. The optimum is derived numerically because the objective function cannot be integrated in closed form. A numerical example is presented.
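The cost structure described in this abstract can be sketched numerically. The following Python sketch is a minimal illustration, not the paper's model: the coefficients `K`, `R`, and target `T` are hypothetical, and a simple quadratic quality loss with a linear mean drift stands in for the integrated cost model. It grid-searches the total cost per unit wear over the two decision variables, the initial process mean and the wear limit:

```python
import numpy as np

# Hypothetical coefficients -- illustrative assumptions, not the paper's values.
K = 2.0   # quadratic quality-loss coefficient
R = 5.0   # cost of one process resetting
T = 10.0  # target value of the quality characteristic

def total_cost_per_unit_wear(m0, w, n=400):
    """Average cost per unit wear for one cycle: the process starts at mean
    m0, drifts linearly with wear, and is reset at wear limit w."""
    t = np.linspace(0.0, w, n)        # wear accumulated so far
    loss = K * (m0 + t - T) ** 2      # quality loss at the shifted mean m0 + t
    quality = np.sum((loss[:-1] + loss[1:]) * np.diff(t)) / 2.0  # trapezoid rule
    return (R + quality) / w

# Grid search over the two decision variables.
best = min((total_cost_per_unit_wear(m0, w), m0, w)
           for m0 in np.linspace(8.0, 10.0, 81)
           for w in np.linspace(0.5, 5.0, 91))
cost, m0_opt, w_opt = best
print(f"initial mean {m0_opt:.2f}, wear limit {w_opt:.2f}, cost {cost:.3f}")
```

For this toy quadratic loss the optimum can be checked analytically: the best initial mean is $T - w/2$ (the mean is set below target so wear carries it symmetrically past the target), and the best wear limit solves $w = (6R/K)^{1/3}$.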

Automatic Contrast Enhancement by Transfer Function Modification

  • Bae, Tae Wuk;Ahn, Sang Ho;Altunbasak, Yucel
    • ETRI Journal / v.39 no.1 / pp.76-86 / 2017
  • In this study, we propose an automatic contrast enhancement method based on transfer function modification (TFM) by histogram equalization. Previous histogram-based global contrast enhancement techniques employ histogram modification, whereas we propose a direct TFM technique that considers the mean brightness of an image during contrast enhancement. A mean point shifting method using the transfer function is proposed to preserve the mean brightness of the image, and a linearization-of-transfer-function technique, which has a histogram-flattening effect, is designed to reduce visual artifacts. An attenuation factor is determined automatically from the maximum value of the probability density function of the image to control the rate of contrast enhancement. A new quantitative measure, called the sparsity of a histogram, is proposed to enable a better objective comparison with previous global contrast enhancement methods. Experimental results demonstrate the performance of the proposed method in terms of both generalized measures and the newly proposed measure.
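As a rough illustration of the mean-preserving idea (a crude stand-in, not the authors' TFM algorithm), the sketch below builds the standard histogram-equalization transfer function for an 8-bit image and then shifts it so that the output mean matches the input mean:

```python
import numpy as np

def he_transfer_function(image):
    """Histogram-equalization transfer function for an 8-bit grayscale image."""
    hist = np.bincount(image.ravel(), minlength=256)
    cdf = np.cumsum(hist) / image.size
    return np.round(255 * cdf)          # float lookup table, 256 entries

def mean_preserving_enhance(image):
    """Equalize, then shift the transfer function so the output mean
    roughly matches the input mean (mean point shifting); clipping to
    [0, 255] can still move the mean slightly."""
    tf = he_transfer_function(image)
    shift = image.mean() - tf[image].mean()
    tf = np.clip(tf + shift, 0, 255)
    return tf[image].astype(np.uint8)

rng = np.random.default_rng(0)
img = rng.integers(60, 120, size=(64, 64), dtype=np.uint8)  # low-contrast input
out = mean_preserving_enhance(img)
print(f"mean {img.mean():.1f} -> {out.mean():.1f}, std {img.std():.1f} -> {out.std():.1f}")
```

The contrast (standard deviation) increases substantially while the mean brightness stays close to the input, which is the property plain histogram equalization does not guarantee.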

A Simple Estimator of Mean Residual Life Function under Random Censoring

  • Jeong, Dong-Myung;Song, Myung-Unn;Song, Jae-Kee
    • Journal of the Korean Data and Information Science Society / v.8 no.2 / pp.225-230 / 1997
  • In this paper, we propose an estimator of the mean residual life function that uses the residual survival function under random censoring, and we prove the uniform consistency and weak convergence of this estimator. An example with real data is also presented.
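A minimal version of the quantity being estimated can be sketched as follows, with the Kaplan-Meier estimator substituted for the survival function (an illustrative plug-in, not the paper's estimator; ties beyond repeated processing are ignored):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function at the sorted
    observation times (event = 1 means an observed failure)."""
    order = np.argsort(times)
    times = np.asarray(times, float)[order]
    events = np.asarray(events, int)[order]
    n = len(times)
    s, surv = 1.0, []
    for i, d in enumerate(events):
        if d:                          # observed failure, not a censoring
            s *= 1.0 - 1.0 / (n - i)
        surv.append(s)
    return times, np.array(surv)

def mean_residual_life(times, events, t0, step=0.005):
    """MRL(t0) = int_{t0}^{t_max} S(u) du / S(t0), with the KM estimate
    plugged in for S; the integral is truncated at the last observation."""
    ts, S = kaplan_meier(times, events)
    def S_at(u):
        idx = np.searchsorted(ts, u, side="right") - 1
        return 1.0 if idx < 0 else S[idx]
    grid = np.arange(t0, ts[-1], step)
    return sum(S_at(u) for u in grid) * step / S_at(t0)

# Uncensored sanity check: MRL(0) should equal the sample mean.
times = [1, 2, 3, 4, 5]
events = [1, 1, 1, 1, 1]
print(mean_residual_life(times, events, 0.0))
```

With no censoring, the Kaplan-Meier curve reduces to the empirical survival function, so MRL(0) recovers the sample mean (3.0 here) and MRL(2) the mean of the remaining lifetimes beyond 2.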

Nonparametric Estimation of Bivariate Mean Residual Life Function under Univariate Censoring

  • Dong-Myung Jeong;Jae-Kee Song;Joong Kweon Sohn
    • Journal of the Korean Statistical Society / v.25 no.1 / pp.133-144 / 1996
  • In this paper, we propose a nonparametric estimator of the bivariate mean residual life function based on Lin and Ying's (1993) estimator of the bivariate survival function for paired failure times under univariate censoring, and we prove the uniform consistency and weak convergence of this estimator. The performance of the proposed estimator is tabulated through Monte Carlo simulation and illustrated with the skin grafts data.

MEAN-VALUE PROPERTY AND CHARACTERIZATIONS OF SOME ELEMENTARY FUNCTIONS

  • Matkowski, Janusz
    • Bulletin of the Korean Mathematical Society / v.50 no.1 / pp.263-273 / 2013
  • A mean-value result, which says that the difference quotient of a differentiable function on a real interval is a mean value of its derivatives at the endpoints of the interval, leads to the functional equation $$\frac{f(x)-F(y)}{x-y}=M(g(x),\;G(y)),\;x{\neq}y,$$ where M is a given mean and $f$, $F$, $g$, $G$ are the unknown functions. Solving this equation for the arithmetic, geometric, and harmonic means yields, respectively, characterizations of quadratic polynomials, homographic functions, and square-root functions. A new criterion for the monotonicity of a real function is also presented.
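As a quick worked check of the arithmetic-mean case (an illustration, not part of the paper's abstract): taking $M$ to be the arithmetic mean with $F=f$ and $g=G=f^{\prime}$, the equation becomes $$\frac{f(x)-f(y)}{x-y}=\frac{f^{\prime}(x)+f^{\prime}(y)}{2},\;x{\neq}y,$$ and every quadratic polynomial satisfies it: for $f(x)=ax^2+bx+c$ the left side is $$\frac{a(x^2-y^2)+b(x-y)}{x-y}=a(x+y)+b,$$ while the right side is $$\frac{(2ax+b)+(2ay+b)}{2}=a(x+y)+b.$$ The content of the characterization is the converse: quadratic polynomials are the only solutions.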

Accurate application of Gaussian process regression for cosmology

  • Hwang, Seung-gyu;L'Huillier, Benjamin
    • The Bulletin of The Korean Astronomical Society / v.46 no.1 / pp.48.1-48.1 / 2021
  • Gaussian process regression (GPR) is a powerful method for model-independent analysis of cosmological observations. In GPR, it is important to choose the input mean function and the hyperparameters, since they affect the reconstruction results. Depending on how the input mean function and hyperparameters are determined in the literature, I divide applications of GPR into four main categories and compare their results. In particular, a zero mean function is commonly used as the input mean function, which may be inappropriate for reconstructing cosmological observations such as the distance modulus. Using mock data based on the Pantheon compilation of type Ia supernovae, I point out the problem of using a zero input mean function and suggest a new way to deal with the input mean function.
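The role of the input mean function can be sketched in a few lines of Python. The kernel, hyperparameters, and the toy trend-plus-noise data below are illustrative assumptions, not the paper's setup: the GP models the residuals about the chosen mean function, which is added back at prediction time.

```python
import numpy as np

def rbf(a, b, amp=1.0, ell=1.0):
    """Squared-exponential kernel matrix between 1-D point sets a and b."""
    d = a[:, None] - b[None, :]
    return amp**2 * np.exp(-0.5 * (d / ell) ** 2)

def gpr_mean(x_train, y_train, x_test, mean_fn, noise=0.1):
    """GP posterior mean with an explicit input mean function: regress the
    residuals y - m(x), then add m(x*) back at the test points."""
    K = rbf(x_train, x_train) + noise**2 * np.eye(x_train.size)
    resid = y_train - mean_fn(x_train)
    return mean_fn(x_test) + rbf(x_test, x_train) @ np.linalg.solve(K, resid)

# Toy data with a smooth increasing trend plus noise (not the Pantheon data).
rng = np.random.default_rng(1)
x = np.linspace(0.1, 2.0, 40)
y = 5 * np.log10(x) + 40 + rng.normal(0, 0.1, x.size)

zero_mean = lambda t: np.zeros_like(t)
trend_mean = lambda t: 5 * np.log10(t) + 40   # assumed fiducial mean function

x_new = np.array([2.5])   # mild extrapolation beyond the data
print("zero input mean:  ", gpr_mean(x, y, x_new, zero_mean))
print("trend input mean: ", gpr_mean(x, y, x_new, trend_mean))
```

Away from the data a zero-mean GP reverts toward zero, its prior mean, whereas with a fiducial trend as the input mean the GP only has to model small residuals, which is the concern the abstract raises for distance-modulus reconstruction.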

ON THE EMPIRICAL MEAN LIFE PROCESSES FOR RIGHT CENSORED DATA

  • Park, Hyo-Il
    • Journal of the Korean Statistical Society / v.32 no.1 / pp.25-32 / 2003
  • In this paper, we define the mean life process for right censored data and show the asymptotic equivalence between two kinds of mean life processes. We use the Kaplan-Meier and Susarla-Van Ryzin estimators of the survival function to construct the mean life processes. As an application, we also show the asymptotic equivalence between two mean residual life processes, and finally we discuss some difficulties caused by the censoring mechanism.

Determination of the Resetting Time to the Process Mean Shift based on the Cpm+ (Cpm+ 기준에서의 공정평균이동에 대한 재조정 기간 결정)

  • Lee, Do-Kyung
    • Journal of Korean Society of Industrial and Systems Engineering / v.41 no.1 / pp.110-117 / 2018
  • Machines and facilities degrade physically or chemically through continuous usage. One result of this degradation is the process mean shift, which produces nonconforming products and machine malfunctions. Therefore, periodic preventive resetting of the process is necessary; this type of preventive action is called a 'preventive maintenance policy.' Preventive maintenance presupposes that the preventive (process resetting) cost is smaller than the failure cost caused by machine malfunction. The process mean shift problem is a part of preventive maintenance that deals with the interrelationship between the quality cost and the process resetting cost before the machine breaks down. The quality cost is the sum of the nonconforming item cost and the quality loss cost, where the quality loss cost is due to the deviation of the quality characteristic from the target value. Under a process mean shift, the quality cost increases continuously, whereas the process resetting cost is constant. The objective function is the total cost per unit wear, and the decision variables are the wear limit (resetting period) and the initial process mean. In contrast to previous studies, we set the process variance as an increasing concave function and simultaneously use Cpm+ as the quality loss function; in Cpm+, the loss function has different cost coefficients according to the direction of the deviation of the quality characteristic from the target value. A numerical example is presented.

A Comparative Study of Software Reliability Model Considering Log Type Mean Value Function (로그형 평균값함수를 고려한 소프트웨어 신뢰성모형에 대한 비교연구)

  • Shin, Hyun Cheul;Kim, Hee Cheul
    • Journal of Korea Society of Digital Industry and Information Management / v.10 no.4 / pp.19-27 / 2014
  • Software reliability is an important issue in the software development process, and software process improvement helps deliver a reliable software product. Infinite-failure NHPP software reliability models presented in the literature exhibit constant, monotonically increasing, or monotonically decreasing failure occurrence rates per fault. This paper proposes reliability models with log-type mean value functions (the Musa-Okumoto and log-power models), which are efficient for software reliability applications. The parameters were estimated by maximum likelihood using the bisection method, and model selection was based on the mean squared error (MSE) and the coefficient of determination ($R^2$). The proposed log-type mean value functions were analyzed using a real failure data set, and the Laplace trend test was employed to ensure the reliability of the data. The study confirms that the log-type models are efficient in terms of reliability (with coefficients of determination of 70% or more) and can be used as alternatives to conventional models. Software developers should therefore use prior knowledge of the software to choose a growth model and identify failure modes.
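The Musa-Okumoto (logarithmic Poisson) mean value function named above, $$m(t)=\frac{1}{\theta}\ln(1+\lambda_0\theta t),$$ can be sketched as follows. The failure data here are synthetic and the grid search is a crude stand-in for the paper's maximum-likelihood estimation with the bisection method:

```python
import numpy as np

def musa_okumoto(t, lam0, theta):
    """Logarithmic-Poisson mean value function m(t) = ln(1 + lam0*theta*t)/theta."""
    return np.log1p(lam0 * theta * t) / theta

# Synthetic cumulative-failure data (illustrative, not the paper's data set).
t = np.arange(1, 21, dtype=float)
true = musa_okumoto(t, lam0=5.0, theta=0.3)
rng = np.random.default_rng(2)
obs = true + rng.normal(0, 0.2, t.size)

# Crude least-squares grid search on (lam0, theta).
best = min((np.mean((obs - musa_okumoto(t, l, th)) ** 2), l, th)
           for l in np.linspace(1.0, 10.0, 91)
           for th in np.linspace(0.05, 1.0, 96))
mse, lam_hat, theta_hat = best
r2 = 1 - mse * t.size / np.sum((obs - obs.mean()) ** 2)
print(f"lam0={lam_hat:.2f}, theta={theta_hat:.2f}, MSE={mse:.4f}, R^2={r2:.3f}")
```

The MSE and $R^2$ computed at the end mirror the two selection criteria the abstract names; on this synthetic data the fitted curve explains well over 70% of the variation, the threshold the study uses.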

A Study for Robustness of Objective Function and Constraints in Robust Design Optimization

  • Lee Tae-Won
    • Journal of Mechanical Science and Technology / v.20 no.10 / pp.1662-1669 / 2006
  • Since randomness and uncertainty in design parameters are inherent, robust design has gained ever-increasing importance in mechanical engineering. Robustness is assessed by a measure of performance variability around the mean value, namely the standard deviation. Hence, constraints in a robust optimization problem can be treated as probability constraints, as in reliability-based optimization. The FOSM (first-order second-moment) method or the AFOSM (advanced first-order second-moment) method can then be used to calculate the mean values and standard deviations of the functions describing the constraints and the objective. Of the two, the AFOSM method has an advantage over the FOSM method in evaluating probability. Nevertheless, it is difficult to obtain the mean value and standard deviation of the objective function using the AFOSM method, because the method requires that the mean value of the function always be positive. This paper presents a technique to overcome this weakness of the AFOSM method. As the examples show, the mean value and standard deviation of the objective function obtained by the proposed method are reliable compared with results from the FOSM method.
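The FOSM moment propagation mentioned above can be sketched in a few lines: the mean of $g(X)$ is approximated by $g(\mu)$ and its variance by the gradient-weighted sum of the input variances. The example function and input statistics are hypothetical:

```python
import numpy as np

def fosm(g, mu, sigma, h=1e-6):
    """First-order second-moment estimates of the mean and standard deviation
    of g(X) for independent inputs X_i with means mu_i and std devs sigma_i."""
    mu = np.asarray(mu, float)
    sigma = np.asarray(sigma, float)
    mean = g(mu)                       # first-order mean: evaluate at the means
    grad = np.empty_like(mu)
    for i in range(mu.size):           # central finite-difference gradient
        e = np.zeros_like(mu)
        e[i] = h
        grad[i] = (g(mu + e) - g(mu - e)) / (2 * h)
    return mean, float(np.sqrt(np.sum((grad * sigma) ** 2)))

# Example: g(x1, x2) = x1 * x2 with hypothetical parameter statistics.
g = lambda x: x[0] * x[1]
mean, std = fosm(g, mu=[3.0, 4.0], sigma=[0.1, 0.2])
print(mean, std)   # 12.0 and sqrt((4*0.1)^2 + (3*0.2)^2)
```

Note that this first-order estimate needs no positivity assumption on $g$, which is the AFOSM limitation the paper works around for the objective function.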