• Title/Summary/Keyword: Optimal Distribution Estimation


Optimal estimation of rock joint characteristics using simulated annealing technique - A case study

  • Hong, Chang-Woo;Jeon, Seok-Won
    • Korean Society of Earth and Exploration Geophysicists: Conference Proceedings
    • /
    • 2003.11a
    • /
    • pp.78-82
    • /
    • 2003
  • In this paper, a simulated annealing technique was used to estimate rock joint characteristics, namely RMR (rock mass rating) values, in order to overcome the shortcomings of ordinary kriging. Ordinary kriging reduces the variance of the data and therefore loses the character of their distribution, whereas simulated annealing can reflect both the distribution features and the spatial correlation of the original data. By comparing three simulation runs, the uncertainty of the simulation could be quantified, and satisfactory results were obtained.

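The contrast drawn in the abstract, that kriging smooths away the data's distribution while simulated annealing preserves it, can be illustrated with a toy sketch. The grid size, the lag-1 correlation target, and the value-swapping move are illustrative assumptions, not the authors' implementation:

```python
import math
import random

def lag1_corr(grid):
    """Sample correlation between horizontally adjacent cells."""
    pairs = [(grid[r][c], grid[r][c + 1])
             for r in range(len(grid)) for c in range(len(grid[0]) - 1)]
    n = len(pairs)
    mx = sum(a for a, _ in pairs) / n
    my = sum(b for _, b in pairs) / n
    cov = sum((a - mx) * (b - my) for a, b in pairs)
    vx = sum((a - mx) ** 2 for a, _ in pairs)
    vy = sum((b - my) ** 2 for _, b in pairs)
    return cov / math.sqrt(vx * vy)

def anneal(values, rows, cols, target=0.6, steps=20000, rng=None):
    """Place the observed values on a grid and swap random pairs under a
    cooling schedule until the lag-1 correlation approaches the target.
    Swapping preserves the sample histogram exactly, unlike kriging."""
    rng = rng or random.Random(0)
    cells = values[:]
    rng.shuffle(cells)
    grid = [cells[r * cols:(r + 1) * cols] for r in range(rows)]
    energy = abs(lag1_corr(grid) - target)
    for step in range(steps):
        temp = 1.0 * (1 - step / steps) + 1e-6      # linear cooling
        r1, c1 = rng.randrange(rows), rng.randrange(cols)
        r2, c2 = rng.randrange(rows), rng.randrange(cols)
        grid[r1][c1], grid[r2][c2] = grid[r2][c2], grid[r1][c1]
        new_energy = abs(lag1_corr(grid) - target)
        if new_energy > energy and rng.random() > math.exp((energy - new_energy) / temp):
            grid[r1][c1], grid[r2][c2] = grid[r2][c2], grid[r1][c1]  # undo
        else:
            energy = new_energy                      # accept (Metropolis)
    return grid, energy

rng0 = random.Random(5)
vals = [rng0.gauss(0.0, 1.0) for _ in range(100)]
grid, energy = anneal(vals, 10, 10)
print(energy)
```

Because the move only swaps existing values, the simulated field keeps the sample histogram exactly; the annealing loop then steers the spatial correlation toward the target.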

Estimation of the optimal probability distribution for daily electricity generation by wind power in rural green-village planning (농촌 그린빌리지 계획을 위한 일별 풍력발전량의 적정확률분포형 추정)

  • Kim, Dae-Sik;Koo, Seung-Mo;Nam, Sang-Woon
    • Journal of The Korean Society of Agricultural Engineers
    • /
    • v.50 no.6
    • /
    • pp.27-35
    • /
    • 2008
  • This study aims to estimate the optimal probability distribution of daily electricity generation by wind power, in order to contribute to rural green-village planning. Wind power is now recognized as one of the most popular renewable resources across the country. Although it is also being adopted in rural areas for many reasons, it is important to estimate the magnitude of power output with reliable statistical methodologies applied to historical daily wind data, so that the feasibility analysis is correct. In this study, a well-known statistical methodology is employed to identify the appropriate statistical distributions of monthly power output for specific rural areas. The results imply that assuming a normal distribution may in many cases lead to incorrect decision-making and therefore to an unreliable feasibility analysis. Goodness-of-fit testing of the normal distribution on all the cases in this study suggests that other types of statistical distributions should be considered for a more precise feasibility analysis.
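The goodness-of-fit check described above can be sketched with a one-sample Kolmogorov-Smirnov statistic against a fitted normal distribution. The sample sizes, parameters, and the Weibull stand-in for skewed wind data are illustrative assumptions, not the paper's data or method:

```python
import math
import random
import statistics

def ks_stat_normal(data):
    """One-sample Kolmogorov-Smirnov statistic of the data against a
    normal distribution fitted by the sample mean and std deviation."""
    mu = statistics.fmean(data)
    sigma = statistics.stdev(data)
    xs = sorted(data)
    n = len(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
        d = max(d, abs((i + 1) / n - cdf), abs(i / n - cdf))
    return d

rng = random.Random(0)
normal_like = [rng.gauss(5.0, 1.0) for _ in range(500)]
skewed = [rng.weibullvariate(1.0, 0.8) for _ in range(500)]  # wind-like, skewed
print(ks_stat_normal(normal_like), ks_stat_normal(skewed))
```

A large statistic on the skewed sample signals that the normality assumption the abstract warns about would mislead the feasibility analysis.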

RFID Tag Number Estimation and Query Time Optimization Methods (RFID 태그 개수 추정 방법 및 질의 시간 최소화 방안)

  • Woo, Kyung-Moon;Kim, Chong-Kwon
    • Journal of KIISE:Information Networking
    • /
    • v.33 no.6
    • /
    • pp.420-427
    • /
    • 2006
  • An RFID system is an important technology that could replace the traditional bar code system, changing the paradigm of the manufacturing, distribution, and service industries. An RFID reader can recognize several hundred tags per second. Tag identification is done by the tags' random transmission of their IDs within a frame that the reader assigns at each round. To minimize tag identification time, the optimal frame size should be selected according to the number of tags. This paper presents new query optimization methods for RFID systems. Query optimization consists of a tag number estimation problem and a frame length determination problem. We propose a simple yet efficient tag estimation method and calculate optimal frame lengths that minimize the overall query time. We conducted rigorous performance studies. The results show that the new tag number estimation technique is more accurate than previous methods. We also observe that a simple greedy method is as efficient as the optimal method in minimizing query time.
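The estimation step the abstract describes, inferring the tag count from the empty/success/collision pattern of one framed-slotted-ALOHA round, can be sketched as follows. The simulator and Vogt's lower-bound estimator are standard textbook illustrations, not necessarily the paper's exact method:

```python
import random

def run_frame(n_tags, frame_size, rng):
    """One framed-slotted-ALOHA round: each tag transmits its ID in a
    uniformly random slot; return (empty, success, collision) counts."""
    slots = [0] * frame_size
    for _ in range(n_tags):
        slots[rng.randrange(frame_size)] += 1
    empty = slots.count(0)
    success = slots.count(1)
    return empty, success, frame_size - empty - success

def estimate_tags(success, collision):
    """Vogt's lower-bound estimator: a collided slot holds at least two
    tags, so the tag count is at least success + 2 * collision."""
    return success + 2 * collision

rng = random.Random(42)
empty, success, collision = run_frame(100, 128, rng)
n_hat = estimate_tags(success, collision)
print(n_hat)
```

The reader would then set the next frame length near `n_hat`, since the expected number of singleton slots is maximized when the frame length equals the tag count.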

Estimation of the Optimal Dredge Amount to Maintain the Water Supply Capacity on Asan-Lake (아산호 용수공급용량 유지를 위한 적정 준설량 산정)

  • Jang Tae-Il;Kim Sang-Min;Kang Moon-Seong;Park Seung-Woo
    • Journal of The Korean Society of Agricultural Engineers
    • /
    • v.48 no.2
    • /
    • pp.45-55
    • /
    • 2006
  • This study analyzes the hydrologic conditions and the effects of selected runoff characteristics in an attempt to estimate the optimal dredge amount for Asan Lake in Korea. The runoff features were calculated using the water balance simulation of DIROM (Daily Irrigation Reservoir Operation Model), with land-use changes over 14 years quantified by remote sensing. The distribution of prospective sediment deposits was tallied based on the changes in land use, and the quantity of incoming sediment was estimated. From these findings, we simulated the fluctuation of the water level, gauging the pumping days not already in use, to determine the frequency distribution of the required annual water storage and of the changing water level. The optimal dredge amount was then calculated on the basis of this frequency distribution, taking into account the design criteria for agricultural water with a 10-year frequency of resistant capacity.
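The kind of daily water-balance simulation underlying such an analysis can be sketched in a few lines; the update rule, the shortage criterion, and all the numbers here are simplified assumptions, not the DIROM model itself:

```python
def simulate_storage(inflows, demand, capacity, dead_storage, initial):
    """Toy daily reservoir water balance: storage rises by inflow and
    falls by the water-supply demand, clipped to [dead_storage, capacity];
    a day on which the demand cannot be met counts as a shortage day."""
    storage, trace, shortage_days = initial, [], 0
    for q in inflows:
        storage += q - demand
        if storage > capacity:
            storage = capacity            # spill over the weir
        if storage < dead_storage:
            storage = dead_storage        # supply fails this day
            shortage_days += 1
        trace.append(storage)
    return trace, shortage_days

# A wet spell followed by a drought; dredging is modelled here simply
# as restoring storage capacity lost to sediment deposits.
inflows = [5.0] * 30 + [0.0] * 30
_, short_silted = simulate_storage(inflows, 3.0, 60.0, 0.0, 50.0)
_, short_dredged = simulate_storage(inflows, 3.0, 100.0, 0.0, 50.0)
print(short_silted, short_dredged)
```

Repeating such runs over many simulated inflow years yields the frequency distribution of shortages from which a dredge amount can be chosen against a design return period.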

A Study on Optimal Release Time for Software Systems based on Mixture Weibull NHPP Model (혼합 와이블 NHPP 모형에 근거한 소프트웨어 최적방출시기에 관한 연구)

  • Lee, Sang Sik;Kim, Hee Cheul
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.6 no.2
    • /
    • pp.183-191
    • /
    • 2010
  • The decision problem known as optimal release policy, namely when to stop testing a software system in the development phase and transfer it to the user, is studied. The release-time model employs an infinite non-homogeneous Poisson process, which reflects the possibility of introducing new faults when correcting or modifying the software. For complicated systems, the failure life-cycle distribution uses a mixture with varying intensities. The software release policy that minimizes the total average cost of development and maintenance, under the constraint of satisfying a software reliability requirement, then becomes the optimal release policy. In a numerical example, after applying a trend test and estimating the parameters by maximum likelihood from inter-failure time data, the optimal software release time was estimated.
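The cost-minimization step common to these release-time papers can be sketched with the simpler Goel-Okumoto mean value function standing in for the mixture-Weibull intensity; the fault counts, cost coefficients, and life-cycle length are made-up numbers for illustration:

```python
import math

def mean_value(t, a=100.0, b=0.05):
    """Goel-Okumoto NHPP mean value function m(t) = a(1 - e^{-bt}),
    a simple stand-in for the paper's mixture-Weibull intensity."""
    return a * (1.0 - math.exp(-b * t))

def total_cost(T, life=400.0, c_test=1.0, c_op=8.0, c_time=0.5):
    """Cost of fixing faults found in testing, the (more expensive)
    faults escaping to operation, and the testing time itself."""
    return (c_test * mean_value(T)
            + c_op * (mean_value(life) - mean_value(T))
            + c_time * T)

# Grid search for the release time minimizing total expected cost.
T_opt = min((t * 0.5 for t in range(0, 801)), key=total_cost)
print(T_opt)
```

Testing longer removes faults cheaply but costs time; releasing early shifts fault repair to the expensive operational phase, so the cost curve has an interior minimum.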

The Comparative Study of Software Optimal Release Time Based on Log-Logistic Distribution (Log-Logistic 분포 모형에 근거한 소프트웨어 최적방출시기에 관한 비교연구)

  • Kim, Hee-Cheul
    • Journal of the Korea Society of Computer and Information
    • /
    • v.13 no.7
    • /
    • pp.1-9
    • /
    • 2008
  • In this paper, we study the decision problem known as optimal release policy: when to stop testing a software system in the development phase and transfer it to the user. Because of the possibility of introducing new faults when correcting or modifying the software, infinite failure non-homogeneous Poisson process models are presented, and we propose optimal release policies in which the life distribution follows the log-logistic distribution, which can capture the increasing/decreasing nature of the failure occurrence rate per fault. We discuss optimal software release policies that minimize the total average cost of development and maintenance under the constraint of satisfying a software reliability requirement. In a numerical example, after applying a trend test and estimating the parameters by maximum likelihood from inter-failure time data, the optimal software release time is estimated.


The Comparative Study of Software Optimal Release Time Based on Weibull Distribution Property (와이블 분포 특성에 근거한 소프트웨어 최적 방출시기에 관한 비교 연구)

  • Kim, Hee-Cheul;Park, Hyoung-Keun
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.10 no.8
    • /
    • pp.1903-1910
    • /
    • 2009
  • In this paper, we studied the decision problem known as optimal release policy: when to stop testing a software system in the development phase and transfer it to the user. The release-time model employs an infinite failure non-homogeneous Poisson process, which reflects the possibility of introducing new faults when correcting or modifying the software. The failure life-cycle distribution uses the Weibull distribution, whose flexible shape makes it efficient in many settings. The software release policy that minimizes the total average cost of development and maintenance, under the constraint of satisfying a software reliability requirement, then becomes the optimal release policy. In a numerical example, after applying a trend test and estimating the parameters by maximum likelihood from inter-failure time data, the optimal software release time was estimated.

The Comparative Study of Software Optimal Release Time of Finite NHPP Model Considering Property of Nonlinear Intensity Function (비선형 강도함수 특성을 이용한 유한고장 NHPP모형에 근거한 소프트웨어 최적방출시기 비교 연구)

  • Kim, Kyung-Soo;Kim, Hee-Cheul
    • Journal of Digital Convergence
    • /
    • v.11 no.9
    • /
    • pp.159-166
    • /
    • 2013
  • In this paper, we study the decision problem known as optimal release policy: when to stop testing a software system in the development phase and transfer it to the user. To account for new faults introduced when correcting or modifying the software, a finite failure non-homogeneous Poisson process model is presented, and we propose release policies in which the life distribution follows a half-logistic model, used in reliability work because of its shape and scale parameters. We discuss optimal software release policies that minimize the total average cost of development and maintenance under the constraint of satisfying a software reliability requirement. In a numerical example, the parameters were estimated by maximum likelihood from failure time data, and the optimal software release time was estimated. When the software release time is used as prior information, potential security damage can be reduced.
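The maximum-likelihood step these NHPP abstracts share can be sketched for the simpler Goel-Okumoto model (not the half-logistic or Weibull forms of the papers): for a fixed shape parameter b, the MLE of a has a closed form, so a one-dimensional grid search over b suffices. The synthetic data and grid range are illustrative assumptions:

```python
import math
import random

def go_loglik(b, times, T):
    """Profile log-likelihood of the Goel-Okumoto NHPP with intensity
    a*b*exp(-b*t): for fixed b the MLE of a is n / (1 - exp(-b*T)),
    which makes the mean-value term m(T) equal to n."""
    n = len(times)
    a = n / (1.0 - math.exp(-b * T))
    return sum(math.log(a * b) - b * t for t in times) - n

def fit_go(times, T):
    """Grid-search the shape parameter b, then recover a."""
    b_hat = max((k * 0.001 for k in range(1, 1001)),
                key=lambda b: go_loglik(b, times, T))
    return len(times) / (1.0 - math.exp(-b_hat * T)), b_hat

# Synthetic failure times from the same model, generated by thinning
# a homogeneous Poisson process with the dominating rate a*b.
rng = random.Random(7)
t, times = 0.0, []
while True:
    t += rng.expovariate(60.0 * 0.03)
    if t > 100.0:
        break
    if rng.random() < math.exp(-0.03 * t):   # accept with lam(t)/lam_max
        times.append(t)
a_hat, b_hat = fit_go(times, 100.0)
print(len(times), a_hat, b_hat)
```

The fitted parameters would then feed the cost model from which the optimal release time is read off.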

Estimation of Design Rainfall by the Regional Frequency Analysis using Higher Probability Weighted Moments and GIS Techniques (III) - On the Method of LH-moments and GIS Techniques - (고차확률가중모멘트법에 의한 지역화빈도분석과 GIS기법에 의한 설계강우량 추정 (III) - LH-모멘트법과 GIS 기법을 중심으로 -)

  • 이순혁;박종화;류경식;지호근;신용희
    • Magazine of the Korean Society of Agricultural Engineers
    • /
    • v.44 no.5
    • /
    • pp.41-53
    • /
    • 2002
  • This study was conducted to derive the regional design rainfall by regional frequency analysis, based on the regionalization of precipitation suggested in the first report of this project. For each region and consecutive duration, optimal design rainfalls were derived by regional frequency analysis using L-moments in the second report of this project. Using the LH-moment ratios and the Kolmogorov-Smirnov test, the optimal regional probability distribution was identified as the generalized extreme value (GEV) distribution among the applied distributions. Regional and at-site parameters of the GEV distribution were estimated by linear combinations of the higher probability weighted moments, the LH-moments. Design rainfalls for each consecutive duration were derived by regional and at-site analysis using the observed data and data simulated by Monte Carlo techniques. The relative root-mean-square error (RRMSE), relative bias (RBIAS), and relative reduction (RR) in RRMSE of the design rainfall were computed and compared between the regional and at-site frequency analyses. Consequently, it was shown that regional analysis reduces the RRMSE, RBIAS, and RR in RRMSE substantially more than at-site analysis in the prediction of design rainfall. The relative efficiency (RE) of an optimal order of L-moments was also computed by the methods of L, L1, L2, L3 and L4-moments for the GEV distribution. The method of L-moments was found to be more effective than the others for obtaining the optimal design rainfall for each region and consecutive duration in the regional frequency analysis. Diagrams of the design rainfall derived by regional frequency analysis using L-moments were drawn for each region and consecutive duration using GIS techniques.
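The parameter-estimation step can be sketched for plain L-moments (the LH-moment generalization adds extra weight to the upper order statistics): compute sample L-moments via probability-weighted moments, then apply Hosking's approximation for the GEV parameters. This is the standard textbook recipe, not the paper's regional procedure, and the synthetic annual-maximum data are an illustrative assumption:

```python
import math
import random

def sample_lmoments(data):
    """First three sample L-moments via the unbiased probability-
    weighted moments b0, b1, b2 on the ascending ordered sample."""
    x = sorted(data)
    n = len(x)
    b0 = sum(x) / n
    b1 = sum(i * x[i] for i in range(n)) / (n * (n - 1))
    b2 = sum(i * (i - 1) * x[i] for i in range(n)) / (n * (n - 1) * (n - 2))
    l1, l2 = b0, 2 * b1 - b0
    t3 = (6 * b2 - 6 * b1 + b0) / l2          # L-skewness
    return l1, l2, t3

def gev_from_lmoments(l1, l2, t3):
    """Hosking's approximation: shape k from t3, then scale alpha
    and location xi from l2 and l1."""
    c = 2.0 / (3.0 + t3) - math.log(2.0) / math.log(3.0)
    k = 7.8590 * c + 2.9554 * c * c
    g = math.gamma(1.0 + k)
    alpha = l2 * k / ((1.0 - 2.0 ** (-k)) * g)
    xi = l1 - alpha * (1.0 - g) / k
    return xi, alpha, k

# Check on synthetic annual maxima from a known GEV (inverse CDF).
rng = random.Random(11)
xi0, a0, k0 = 100.0, 20.0, 0.1
data = [xi0 + a0 * (1.0 - (-math.log(rng.random())) ** k0) / k0
        for _ in range(2000)]
xi_hat, a_hat, k_hat = gev_from_lmoments(*sample_lmoments(data))
print(xi_hat, a_hat, k_hat)
```

In a regional analysis, the L-moment ratios would be averaged over the sites of a homogeneous region before this fitting step, which is what makes the regional quantile estimates more robust than at-site ones.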

Power spectrum estimation of EEG signal using robust method (로보스트 방법을 이용한 EEG 신호의 전력밀도 추정)

  • 김택수;허재만;김종순;유선국;박상희
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1991.10a
    • /
    • pp.736-740
    • /
    • 1991
  • EEG (electroencephalogram) background signals can be represented as the sum of a conventional AR (autoregressive) process and an innovation process, or prediction error process. Conventional estimation techniques, such as least-squares estimation (LSE) or Gaussian maximum likelihood estimation (MLE-G), are optimal when the innovation process satisfies the Gaussian or presumed distribution. But when the data are contaminated by outliers, or artifacts, these assumptions are not met, and conventional estimation techniques can fail badly and be strongly biased. EEG is known to be easily affected by artifacts, so we suggest a robust estimation technique that performs considerably well against such artifacts.

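The robust-versus-conventional contrast the abstract draws can be sketched with an AR(1) toy: ordinary least squares versus a simple hard-rejection robust refit on a series contaminated by spike artifacts. The AR order, the contamination pattern, and the rejection cutoff are illustrative assumptions, not the paper's estimator:

```python
import random

def ar1_ls(x):
    """Ordinary least-squares estimate of the AR(1) coefficient."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(v * v for v in x[:-1])
    return num / den

def ar1_robust(x, cutoff=3.0, iters=10):
    """Hard-rejection robust fit: repeatedly refit by least squares
    after discarding equations whose residual exceeds `cutoff` robust
    scales (scale = median absolute residual / 0.6745)."""
    phi = ar1_ls(x)
    for _ in range(iters):
        res = [x[t] - phi * x[t - 1] for t in range(1, len(x))]
        mad = sorted(abs(r) for r in res)[len(res) // 2]
        scale = mad / 0.6745 if mad > 0 else 1.0
        keep = [t for t in range(1, len(x)) if abs(res[t - 1]) <= cutoff * scale]
        num = sum(x[t] * x[t - 1] for t in keep)
        den = sum(x[t - 1] ** 2 for t in keep)
        phi = num / den
    return phi

# AR(1) EEG-like background plus sparse spike artifacts.
rng = random.Random(1)
x = [0.0]
for _ in range(999):
    x.append(0.8 * x[-1] + rng.gauss(0.0, 1.0))
for i in range(50, 1000, 100):
    x[i] += 30.0          # e.g. eye-blink or electrode artifacts
print(ar1_ls(x), ar1_robust(x))
```

The least-squares estimate is dragged far from the true coefficient by the spikes, while the robust refit stays close to it; the AR coefficients would then feed the power-spectrum estimate.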