Title/Summary/Keyword: and Input Measure


Derivation of uncertainty importance measure and its application

  • Park, Chang-K.
    • Proceedings of the Korean Operations and Management Science Society Conference / 1990.04a / pp.272-288 / 1990
  • The uncertainty quantification process in probabilistic risk assessment usually involves specifying the uncertainty in the input data and propagating this uncertainty to the final risk results. Distributional sensitivity analysis studies the impact of the various assumptions made during the quantification of input parameter uncertainties on the final output uncertainty. The uncertainty importance of an input parameter should therefore reflect changes in the whole output distribution, not just in a point estimate. A measure of uncertainty importance, called the distributional sensitivity measure (DSM), is proposed in this paper; it is derived explicitly from the definition of Kullback's discrimination information. The DSM is applied to three typical cases of input distributional change: 1) the uncertainty is completely eliminated, 2) the uncertainty range is increased by a factor of 10, and 3) the type of distribution is changed. In all three cases, the DSM-based importance ranking agrees very well with the observed changes of the output distribution, while other statistical parameters are shown to be insensitive.
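
The abstract does not spell out the computation, so the following is only a minimal sketch of a Kullback-Leibler style distributional sensitivity score: propagate the baseline input distributions through a model, repeat with one input's distribution changed, and compare the two output histograms. The risk model, the lognormal inputs, the inflated spread, and the binning below are illustrative assumptions, not the paper's case study.

```python
import numpy as np

rng = np.random.default_rng(0)

def risk_model(x1, x2, x3):
    # Placeholder risk model (assumption): a simple combination of the inputs.
    return x1 * x2 + x3

def output_samples(dists, n=100_000):
    # dists: dict of callables that draw n samples for each input.
    x1, x2, x3 = (dists[k](n) for k in ("x1", "x2", "x3"))
    return risk_model(x1, x2, x3)

def kl_divergence(p_samples, q_samples, bins=60):
    # Histogram-based estimate of Kullback's discrimination information D(p || q).
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    edges = np.linspace(lo, hi, bins + 1)
    p, _ = np.histogram(p_samples, bins=edges)
    q, _ = np.histogram(q_samples, bins=edges)
    p = (p + 1e-12) / p.sum()
    q = (q + 1e-12) / q.sum()
    return float(np.sum(p * np.log(p / q)))

baseline = {
    "x1": lambda n: rng.lognormal(0.0, 0.5, n),
    "x2": lambda n: rng.lognormal(0.0, 0.5, n),
    "x3": lambda n: rng.lognormal(0.0, 0.5, n),
}
y_base = output_samples(baseline)

# Case 2 of the abstract: widen one input's uncertainty and measure how much
# the whole output distribution shifts relative to the baseline.
for name in ("x1", "x2", "x3"):
    changed = dict(baseline)
    changed[name] = lambda n: rng.lognormal(0.0, 1.5, n)  # inflated spread (illustrative)
    y_changed = output_samples(changed)
    print(name, "DSM-like score:", round(kl_divergence(y_changed, y_base), 4))
```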


Measures of modal and gross controllability/observability for linear time-varying systems (선형 시변 시스템에 대한 모드 및 총가제어성/가관측성 척도)

  • Choe, Jae-Won;Lee, Ho-Chul;Lee, Dal-Ho
    • Journal of Institute of Control, Robotics and Systems / v.5 no.6 / pp.647-655 / 1999
  • For linear time-varying systems described by the triple (A(t), B(t), C(t)), where A(t), B(t), and C(t) are the system, input, and output matrices, respectively, we propose measures of modal and gross controllability/observability. We introduce a differential algebraic eigenvalue theory for linear time-varying systems to calculate the PD-eigenvalues and the left and right PD-eigenvectors of the system matrix A(t), which are used to derive the measures. The time-dependent angle between the left PD-eigenvectors of A(t) and the columns of the input matrix B(t), together with the magnitude of each element of B(t), is used to define the modal controllability measure. Similarly, the time-dependent angle between the right PD-eigenvectors of A(t) and the rows of the output matrix C(t) is used to define the modal observability measure. Gross measures of the controllability of a mode from all inputs and of its observability in all outputs are also proposed for linear time-varying systems. Numerical examples illustrate the proposed concepts.
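
As a rough illustration of an angle-based modal controllability measure, the sketch below handles only the time-invariant special case with ordinary eigenvectors (a Hamdan/Nayfeh-style measure); the paper's PD-eigenvalue machinery for time-varying A(t), B(t), C(t) is not reproduced, and the matrices are made up.

```python
import numpy as np

def modal_controllability(A, B):
    """Time-invariant special case: measure for mode i and input j is
    |cos(angle(l_i, b_j))| * ||b_j||, where l_i is the i-th left eigenvector
    of A and b_j is the j-th column of B."""
    eigvals, right_vecs = np.linalg.eig(A)
    W = np.linalg.inv(right_vecs)          # row i of W is the left eigenvector of mode i
    n_modes, n_inputs = A.shape[0], B.shape[1]
    M = np.zeros((n_modes, n_inputs))
    for i in range(n_modes):
        l = W[i, :]
        for j in range(n_inputs):
            b = B[:, j]
            cos_angle = abs(l @ b) / (np.linalg.norm(l) * np.linalg.norm(b))
            M[i, j] = cos_angle * np.linalg.norm(b)
    return eigvals, M

A = np.array([[0.0, 1.0], [-2.0, -3.0]])
B = np.array([[0.0], [1.0]])
eigvals, M = modal_controllability(A, B)
print("modes:", eigvals)
print("modal controllability measures:\n", M)
# A gross measure of controllability of mode i from all inputs can then be
# taken as a row norm, e.g. np.linalg.norm(M, axis=1).
```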


Simulation Input Modeling: Sample Size Determination for Parameter Estimation of Probability Distributions (시뮬레이션 입력 모형화 : 확률분포 모수 추정을 위한 표본크기 결정)

  • Park Sung-Min
    • Journal of the Korean Operations Research and Management Science Society / v.31 no.1 / pp.15-24 / 2006
  • In simulation input modeling, it is important to identify a probability distribution that represents the input process of interest. In this paper, an appropriate sample size is determined for parameter estimation of some typical probability distributions frequently encountered in simulation input modeling. For this purpose, a statistical measure, the ratio of the square root of the mean square error to the parameter value, is proposed to evaluate the effect of sample size on both the precision and the accuracy of parameter estimation. With this evaluation measure, the sample size effect can be analyzed dimensionlessly with respect to the parameter's unit and scaled regardless of the parameter's magnitude. In the Monte Carlo simulation experiments, three continuous and one discrete probability distributions are investigated: 1) exponential, 2) gamma, 3) normal, and 4) Poisson. The parameter magnitudes tested are chosen to represent distinct degrees of skewness. Results show that 1) the evaluation measure improves drastically until the sample size approaches about 200; 2) up to a sample size of about 400, the improvement continues but becomes marginal; and 3) plots of the evaluation measure show a similar plateau pattern beyond a sample size of 400. A case study with real datasets is presented to verify the experimental results.
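
A rough sketch of the kind of Monte Carlo experiment described, assuming an exponential input distribution with a maximum-likelihood rate estimate; the evaluation measure is the square root of the mean square error divided by the true parameter, which makes it dimensionless and scale-free. The distribution, parameter value, and replication count are illustrative choices, not the paper's experimental design.

```python
import numpy as np

rng = np.random.default_rng(42)

def rmse_to_parameter_ratio(true_rate=2.0, sample_size=100, replications=2000):
    """Monte Carlo estimate of sqrt(MSE(estimator)) / true parameter for the
    rate of an exponential distribution, estimated by maximum likelihood
    (1 / sample mean)."""
    estimates = np.empty(replications)
    for r in range(replications):
        sample = rng.exponential(scale=1.0 / true_rate, size=sample_size)
        estimates[r] = 1.0 / sample.mean()
    mse = np.mean((estimates - true_rate) ** 2)
    return np.sqrt(mse) / true_rate

# Watch how the measure improves and then plateaus as the sample size grows.
for n in (25, 50, 100, 200, 400, 800):
    print(f"n={n:4d}  RMSE/parameter = {rmse_to_parameter_ratio(sample_size=n):.4f}")
```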

The Optimal Partition of Initial Input Space for Fuzzy Neural System: Measure of Fuzziness (퍼지뉴럴 시스템을 위한 초기 입력공간분할의 최적화 : Measure of Fuzziness)

  • Baek, Deok-Soo;Park, In-Kue
    • Journal of the Institute of Electronics Engineers of Korea TE / v.39 no.3 / pp.97-104 / 2002
  • In this paper we describe a method that optimizes the partition of the input space for a fuzzy neural network by means of a measure of fuzziness. The method covers the generation of fuzzy rules for each input subspace and verifies the performance of the system for various input time intervals. It divides the input space into several fuzzy regions and assigns a degree to each rule generated for the partitioned subspaces from the given data, using the Shannon function and a fuzzy entropy function, thereby producing an optimal knowledge base without irrelevant rules. In this scheme, the basic idea of the fuzzy neural network is to realize the fuzzy rule base and the reasoning process with a neural network and to adapt the corresponding parameters of the fuzzy control rules by the steepest-descent algorithm. The proposed inference procedure shows that, across input intervals, the fast convergence of the root mean square error (RMSE) is due to the optimal partition of the input space.
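
The sketch below illustrates one way a measure of fuzziness built from the Shannon function (a De Luca-Termini style fuzzy entropy) could be used to compare candidate partitions of a one-dimensional input; the triangular membership shape, the data, and the candidate partitions are assumptions for illustration, not the paper's procedure.

```python
import numpy as np

def shannon_function(mu):
    # Shannon function S(u) = -u*ln(u) - (1-u)*ln(1-u), clipped for stability.
    mu = np.clip(mu, 1e-12, 1 - 1e-12)
    return -mu * np.log(mu) - (1 - mu) * np.log(1 - mu)

def triangular_membership(x, centers):
    """Membership of each data point in triangular fuzzy sets centered at
    `centers`, with supports spanning neighboring centers (illustrative shape)."""
    mu = np.zeros((len(x), len(centers)))
    for j, c in enumerate(centers):
        left = centers[j - 1] if j > 0 else c - (centers[1] - centers[0])
        right = centers[j + 1] if j < len(centers) - 1 else c + (centers[-1] - centers[-2])
        mu[:, j] = np.maximum(
            np.minimum((x - left) / (c - left), (right - x) / (right - c)), 0.0
        )
    return mu

def partition_fuzziness(x, centers):
    # Average fuzzy entropy over data and regions: low values mean the data fall
    # crisply into regions, high values mean the partition covers them ambiguously.
    return float(shannon_function(triangular_membership(x, centers)).mean())

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2, 0.3, 300), rng.normal(2, 0.3, 300)])

# Compare candidate partitions of the input range by their measure of fuzziness.
for k in (2, 3, 5, 7):
    centers = np.linspace(data.min(), data.max(), k)
    print(f"{k} regions: fuzziness = {partition_fuzziness(data, centers):.4f}")
```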

Generation of Simulation Input Stream Using Threshold Bootstrap (임계값 부트스트랩을 사용한 시뮬레이션 입력 시나리오의 생성)

  • Kim Yun Bae;Kim Jae Bum
    • Korean Management Science Review / v.22 no.1 / pp.15-26 / 2005
  • The bootstrap is a method of computational inference that simulates the creation of new data by resampling from a single data set. We propose a new job for the bootstrap: generating simulation inputs from one historical trace using the threshold bootstrap. For this purpose, the most important quality of bootstrap samples is that they be functionally indistinguishable from independent samples of the same stochastic process. We describe a quantitative measure of the difference between two time series and demonstrate the sensitivity of this measure for discriminating between two data-generating processes. Using this distance measure, we show a way of tuning the bootstrap from a single observed trace. This application of the threshold bootstrap can be a powerful tool for Monte Carlo simulation analysis, which otherwise relies on built-in input generators that make unrealistic assumptions about independence and marginal distributions. The alternative source of inputs, historical trace data, is realistic by definition but provides only a single input stream for simulation. One benefit of our method is that it multiplies the number of realistic input streams by driving system models with resampled versions of the actual historical input series. Another benefit is the automatic generation of lifelike scenarios for the field of finance.
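
A minimal sketch of the threshold-bootstrap idea, assuming the common formulation in which the trace is cut into cycles at up-crossings of a threshold (here the sample median) and whole cycles are resampled with replacement; the paper's tuning via the time-series distance measure is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)

def threshold_bootstrap(trace, length, threshold=None):
    """Cut the trace into 'cycles' at up-crossings of a threshold, then resample
    whole cycles with replacement and concatenate until `length` values exist.
    The threshold choice and cycle definition are simplifying assumptions."""
    trace = np.asarray(trace, dtype=float)
    if threshold is None:
        threshold = np.median(trace)
    above = trace > threshold
    # Indices where the series crosses the threshold from below (cycle starts).
    starts = [0] + [i for i in range(1, len(trace)) if above[i] and not above[i - 1]]
    cycles = [trace[s:e] for s, e in zip(starts, starts[1:] + [len(trace)])]
    out = []
    while sum(len(c) for c in out) < length:
        out.append(cycles[rng.integers(len(cycles))])
    return np.concatenate(out)[:length]

# Example: an autocorrelated historical trace (AR(1)), resampled into a new stream.
historical = np.zeros(500)
for t in range(1, 500):
    historical[t] = 0.7 * historical[t - 1] + rng.normal()
synthetic = threshold_bootstrap(historical, length=500)
print("historical lag-1 autocorr:", round(np.corrcoef(historical[:-1], historical[1:])[0, 1], 3))
print("bootstrap  lag-1 autocorr:", round(np.corrcoef(synthetic[:-1], synthetic[1:])[0, 1], 3))
```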

A Study on Measuring the Efficiency of Telecommunication Operators Using DEA (DEA를 이용한 통신 사업자의 효율성 측정에 관한 연구)

  • 김찬규;김현종
    • Proceedings of the Korean Operations and Management Science Society Conference / 2001.10a / pp.213-217 / 2001
  • This paper deals with estimating the efficiency of telecommunication operators (TOs) using DEA (Data Envelopment Analysis). Measuring efficiency with DEA involves two steps: first, the efficient frontier is derived from the input and output data; then the efficiency of each decision-making unit is measured by its distance from this frontier. To measure efficiency, we consider a one-output, three-input production function, using tangible assets, investment cost, and the number of employees as inputs and revenue as the output. After measuring efficiency, we compare domestic TOs with international TOs, and wireline TOs with wireless ones, by average efficiency. Finally, internal and external efficiency are examined together through the correlation of efficiency with profitability and quality level.
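
For reference, an input-oriented CCR (constant-returns-to-scale) DEA efficiency score can be computed as a small linear program; the sketch below uses scipy and made-up data for five operators with the abstract's three inputs and one output, and it is not the paper's dataset or necessarily its exact DEA variant.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_input_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU `o`.
    X: inputs, shape (m, n); Y: outputs, shape (s, n); columns are DMUs.
    Solves: min theta  s.t.  X @ lam <= theta * X[:, o],  Y @ lam >= Y[:, o],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimize theta; variables are [theta, lam]
    A_in = np.hstack([-X[:, [o]], X])           # X @ lam - theta * x_o <= 0
    b_in = np.zeros(m)
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -Y @ lam <= -y_o
    b_out = -Y[:, o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

# Illustrative data (not the paper's): 3 inputs x 1 output for 5 operators.
X = np.array([[4.0, 7.0, 8.0, 4.0, 2.0],    # tangible assets
              [3.0, 3.0, 1.0, 2.0, 4.0],    # investment cost
              [5.0, 6.0, 4.0, 3.0, 5.0]])   # employees
Y = np.array([[6.0, 8.0, 7.0, 5.0, 4.0]])   # revenue
for o in range(X.shape[1]):
    print(f"DMU {o}: efficiency = {ccr_input_efficiency(X, Y, o):.3f}")
```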


Measuring the Degree of Integration into the Global Production Network by the Decomposition of Gross Output and Imports: Korea 1970-2018

  • KIM, DONGSEOK
    • KDI Journal of Economic Policy / v.43 no.3 / pp.33-53 / 2021
  • The import content of exports (ICE) is defined as the amount of foreign input embodied in one unit of export, and it has been used as a measure of the degree of integration into the global production network. In this paper, we suggest an alternative measure based on the decomposition of gross output and imports into the contributions of final demand terms. This measure considers, simultaneously, the manner in which a country manages its domestic production base (gross output) and utilizes the foreign sector (imports), and it can thus be regarded as a more comprehensive measure than ICE. Korea's input-output tables for 1970-2018 are used in this paper. These tables were rearranged according to the same 26-industry classification so that the measures can be computed with time-series continuity and the results interpreted clearly. The results obtained in this paper are based on extended time-series data and are expected to be reliable and robust. The suggested indicators were applied to these tables, and based on the results we conclude that the overall importance of the global economy in Korea's economic strategy has risen and that the degree of Korea's integration into the global production network increased over the entire period. This paper also shows that ICE incorrectly measures the movement of the degree of integration into the global production network in some periods.
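
For context, one common textbook formula for the import content of exports can be evaluated from an input-output table with the Leontief inverse, as sketched below; this is the benchmark concept the paper compares against, not the paper's new decomposition, and the 3-industry coefficients are made up.

```python
import numpy as np

def import_content_of_exports(A_d, m, exports):
    """Textbook-style ICE: foreign input embodied per unit of export,
    m' (I - A_d)^{-1} e / sum(e), where A_d is the domestic input coefficient
    matrix and m holds imported-intermediate coefficients per unit of output."""
    n = A_d.shape[0]
    leontief_inverse = np.linalg.inv(np.eye(n) - A_d)   # (I - A_d)^-1
    embodied_imports = m @ leontief_inverse @ exports    # imports induced by exports
    return embodied_imports / exports.sum()

# Tiny 3-industry illustration with invented coefficients.
A_d = np.array([[0.10, 0.20, 0.05],
                [0.15, 0.10, 0.10],
                [0.05, 0.05, 0.15]])
m = np.array([0.08, 0.20, 0.03])          # imported inputs per unit of output
exports = np.array([50.0, 120.0, 30.0])
print("ICE =", round(import_content_of_exports(A_d, m, exports), 4))
```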

Evaluation of Uncertainty Importance Measure for Monotonic Function (단조함수에 대한 불확실성 중요도 측도의 평가)

  • Cho, Jae-Gyeun
    • Journal of Korea Society of Industrial Information Systems / v.15 no.5 / pp.179-185 / 2010
  • In a sensitivity analysis, an uncertainty importance measure is often used to assess how much of the uncertainty of an output is attributable to the uncertainty of an input, and thus to identify those inputs whose uncertainties need to be reduced in order to effectively reduce the uncertainty of the output. A function is called monotonic if the output is either increasing or decreasing with respect to each of the inputs. In this paper, for a monotonic function, we propose a method for evaluating the measure that assesses the expected percentage reduction in the variance of the output due to ascertaining the value of an input. The proposed method can be applied when the output is expressed as a linear or nonlinear monotonic function of the inputs and when the inputs follow symmetric or asymmetric distributions. In addition, the proposed method provides a stable uncertainty importance for each input by discretizing the input's distribution. However, the method is computationally demanding, since it is based on Monte Carlo simulation.
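
One common variance-based reading of this measure is Var(E[Y|Xi]) / Var(Y), the expected fractional reduction in output variance from learning the value of Xi. The sketch below estimates it by Monte Carlo with the input discretized into equal-probability bins, in the spirit of the discretization mentioned in the abstract; the monotonic test function and the input distributions are illustrative assumptions, not the paper's method or examples.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(x1, x2, x3):
    # Illustrative monotonic test function (not the paper's case study).
    return 2.0 * x1 + np.exp(x2) + 0.5 * x3

def variance_reduction_importance(sampler, which, n=200_000, n_bins=50):
    """Estimates Var(E[Y | X_which]) / Var(Y); the conditional mean is
    approximated by discretizing that input into equal-probability bins."""
    X = sampler(n)                       # shape (n, 3)
    Y = model(*X.T)
    xi = X[:, which]
    edges = np.quantile(xi, np.linspace(0, 1, n_bins + 1))
    bins = np.clip(np.searchsorted(edges, xi, side="right") - 1, 0, n_bins - 1)
    cond_means = np.array([Y[bins == b].mean() for b in range(n_bins)])
    return cond_means.var() / Y.var()

def sampler(n):
    # One symmetric and two asymmetric input distributions, echoing the abstract's setting.
    return np.column_stack([rng.normal(0, 1, n),
                            rng.gamma(2.0, 1.0, n),
                            rng.lognormal(0, 0.8, n)])

for i, name in enumerate(["x1 (normal)", "x2 (gamma)", "x3 (lognormal)"]):
    print(name, "->", round(variance_reduction_importance(sampler, i), 3))
```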

Semantic Conceptual Relational Similarity Based Web Document Clustering for Efficient Information Retrieval Using Semantic Ontology

  • Selvalakshmi, B;Subramaniam, M;Sathiyasekar, K
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.9 / pp.3102-3119 / 2021
  • In the modern, rapidly growing web era, web publication is largely about making web resources accessible. Due to the increased size of the web, search engines face many challenges in indexing web pages as well as in producing results for user queries. Methodologies discussed in the literature for clustering web documents struggle to achieve high clustering accuracy. This problem is mitigated by the proposed Semantic Conceptual Relational Similarity (SCRS) based clustering algorithm, which measures similarity by considering the relationship of a document in two ways: the number of semantic relations of a document class covered by the input document, and the number of conceptual relations the input document covers with respect to a document class. Given a data set Ds, the method estimates the SCRS measure for each document Di with respect to each available document class; the class with the maximum SCRS is identified and the document is indexed under that class. The SCRS measure is computed from the semantic relevancy of the input document to each document of a class. Similarly, the input query is scored with a Query Relational Semantic Score (QRSS) against each class of documents; based on the QRSS value, the relevant document class is identified, and its documents are retrieved and ranked by QRSS to produce the final result set. In both cases, the semantic measures are estimated from the concepts available in the semantic ontology. The proposed method yields efficient indexing, and search efficiency is also improved.
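
A toy illustration of the two-directional scoring idea (how much of a class's relations a document covers, and how much of the document's relations a class covers); the relation triples, the simple set-overlap scoring, and the class definitions are invented for illustration and are not the paper's SCRS/QRSS formulas or its ontology.

```python
# Stand-in "relations" for the ontology-derived semantic/conceptual relations
# described in the abstract, represented here as (subject, predicate, object) triples.
class_relations = {
    "sports": {("player", "plays", "game"), ("team", "wins", "match")},
    "finance": {("bank", "issues", "loan"), ("stock", "has", "price")},
}

def scrs_like_score(doc_relations, cls_relations):
    # Direction 1: fraction of the class's relations covered by the document.
    semantic = len(cls_relations & doc_relations) / len(cls_relations)
    # Direction 2: fraction of the document's relations covered by the class.
    conceptual = len(cls_relations & doc_relations) / len(doc_relations)
    return semantic + conceptual

doc = {("player", "plays", "game"), ("stock", "has", "price"), ("team", "wins", "match")}
scores = {cls: scrs_like_score(doc, rels) for cls, rels in class_relations.items()}
best = max(scores, key=scores.get)
print(scores, "-> index under:", best)
```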

A Study of New Production Input Control in an Agile Manufacturing Environment (신속제조환경에서의 새로운 생산입력통제방식에 관한 연구)

  • Kim, Hyun-Soo
    • Journal of Korean Institute of Industrial Engineers / v.23 no.4 / pp.699-708 / 1997
  • Production control is usually composed of due-date assignment, production input control, and a priority dispatching rule. A production input control (PIC) mainly controls the WIP level on the shop floor, whereas a priority dispatching rule (PDR) mainly controls the tardiness/earliness of an order and the number of tardy jobs. Therefore, selecting a PIC that controls only one particular performance measure (e.g., tardiness) may worsen other performance measures (e.g., WIP level, shop-floor time). The newly developed production input control, DRD (Dual Release-Dates), is designed to control the WIP level on the shop floor by employing two release dates for each order (an earliest release date and a latest release date) together with a release condition (the relationship between the current WIP level and a pre-defined maximum WIP level), while still trying to meet the due date of the order.
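
A minimal sketch of a dual-release-date style release decision, built only from the rule stated in the abstract (release within the window only while WIP is below the cap, and force release once the latest release date arrives); the class names, parameters, and forcing rule are assumptions, not the paper's exact DRD logic.

```python
from dataclasses import dataclass

@dataclass
class Order:
    order_id: int
    earliest_release: float   # earliest release date
    latest_release: float     # latest release date
    due_date: float

def should_release(order, now, current_wip, max_wip):
    """Release only inside the [earliest, latest] window while the WIP cap holds,
    except that an order is released anyway once its latest release date arrives."""
    if now < order.earliest_release:
        return False                       # too early to release
    if now >= order.latest_release:
        return True                        # must release to protect the due date
    return current_wip < max_wip           # within the window: respect the WIP cap

orders = [Order(1, earliest_release=0, latest_release=5, due_date=12),
          Order(2, earliest_release=3, latest_release=8, due_date=15)]
for t in range(0, 9, 2):
    decisions = {o.order_id: should_release(o, t, current_wip=4, max_wip=5) for o in orders}
    print(f"t={t}: release decisions {decisions}")
```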
