• Title/Summary/Keyword: Uncertainty Theory


Evaluating the Importance of Certification Criteria for Call Center Services (콜센터 서비스인증 심사기준의 중요도 평가)

  • Lee, Mi-na;So, Soon-Hu
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2016.10a
    • /
    • pp.139-141
    • /
    • 2016
  • With the introduction of national standards (KS) for call centers, customers are expected to have access to better-quality services in the future. The purpose of this study is to evaluate the relative importance of certification criteria for call center services using the Fuzzy AHP (Analytic Hierarchy Process) approach. Fuzzy AHP combines fuzzy set theory with the AHP to deal with the uncertainty or ambiguity of the decision-making process. The evaluation method proposed in this study can be applied to the KS services certification promoted by the government.

  • PDF
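The fuzzy-AHP weighting described above can be sketched in a few lines. This is a minimal illustration using the geometric-mean (Buckley-style) variant with triangular fuzzy judgments; the three criteria and all judgment values are hypothetical, not the paper's survey data.

```python
# Fuzzy-AHP sketch: triangular fuzzy pairwise judgments -> crisp weights.
# Criteria and judgment values are hypothetical illustrations.
import math

# a[i][j] is the fuzzy judgment "criterion i vs criterion j" as (l, m, u).
A = [
    [(1, 1, 1), (2, 3, 4), (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1), (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]

def fuzzy_geometric_mean(row):
    """Component-wise geometric mean of a row of triangular fuzzy numbers."""
    n = len(row)
    return tuple(math.prod(t[k] for t in row) ** (1 / n) for k in range(3))

g = [fuzzy_geometric_mean(row) for row in A]             # fuzzy weight per criterion
total = tuple(sum(gi[k] for gi in g) for k in range(3))  # fuzzy sum of the rows
# Fuzzy division flips the bounds: w_i = (l_i/u_sum, m_i/m_sum, u_i/l_sum).
w_fuzzy = [(gi[0] / total[2], gi[1] / total[1], gi[2] / total[0]) for gi in g]
# Defuzzify by the centroid (l + m + u) / 3 and renormalise.
crisp = [sum(wf) / 3 for wf in w_fuzzy]
weights = [c / sum(crisp) for c in crisp]
print([round(w, 3) for w in weights])  # relative importance of the criteria
```

The resulting ranking (criterion 1 dominating) simply reflects the hypothetical judgments above; with real expert questionnaires each entry would be an aggregated fuzzy comparison.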

Smart modified repetitive-control design for nonlinear structure with tuned mass damper

  • ZY Chen;Ruei-Yuan Wang;Yahui Meng;Timothy Chen
    • Steel and Composite Structures
    • /
    • v.46 no.1
    • /
    • pp.107-114
    • /
    • 2023
  • A new intelligent adaptive control scheme was proposed that combines disturbance-observer-based adaptive control and fuzzy adaptive control for a composite structure with a mass-adjustable damper. The most important advantage is that the control structure does not need to know the uncertainty limits, and the interference effect is eliminated. Three adjustable parameters in the LMI are used to control the gain of the 2D fuzzy control. Binary performance indices with weighted matrices are constructed to separately evaluate validation and training performance using the revalidation learning function. Choosing the appropriate weight matrix balances control and learning efficiency and prevents large control gains. It is proved that the stability of the control system can be ensured by linear matrix inequality (LMI) theory based on Lyapunov theory. Simulation results show that the multilevel simulation approach combines accuracy with high computational efficiency. The M-TMD system, by slightly reducing critical joint load amplitudes, can significantly improve the overall response of an uncontrolled structure.
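The closing stability claim, that a Lyapunov-based matrix condition certifies the closed loop, can be illustrated in miniature: for a stable state matrix A, the Lyapunov equation A&#8315;&#7511; is not needed, simply A^T P + P A = -Q with Q > 0 has a positive-definite solution P. The 2x2 system below is a hypothetical damped oscillator, not the paper's M-TMD model.

```python
# Lyapunov stability check sketch: solve A^T P + P A = -Q and verify P > 0.
# The state matrix is a hypothetical mass-spring-damper, not the paper's plant.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0],
              [-4.0, -0.8]])           # hypothetical second-order system
Q = np.eye(2)                          # any positive-definite choice works
P = solve_continuous_lyapunov(A.T, -Q) # solves A^T P + P A = -Q
stable = bool(np.all(np.linalg.eigvalsh(P) > 0))
print(stable)                          # P > 0 certifies asymptotic stability
```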

Mixed $H_2/H_\infty$ and $\mu$-synthesis Approach to the Coupled Three-Inertia Problem (혼합 $H_2/H_\infty$ 및 $\mu$-설계이론을 이용한 3관성 문제의 해법)

  • Choe, Yeon-Wook
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.7 no.11
    • /
    • pp.896-903
    • /
    • 2001
  • This study investigates the use of mixed $H_2/H_\infty$ and $\mu$-synthesis to construct a robust controller for the benchmark problem. The model treated in the problem is a coupled three-inertia system that reflects the dynamics of mechanical vibrations. This kind of problem requires robust performance to be satisfied in both time- and frequency-domain specifications. We first adopt the mixed $H_2/H_\infty$ theory to design a feedback controller K(s). Next, the $\mu$-synthesis method is applied to the overall system to account for structured parametric uncertainty. This process permits higher levels of controller authority and reduces the conservativeness of the controller. Finally, a feedforward controller is also used to improve the transient response of the output. We confirm that all design specifications except a complementary sensitivity condition can be achieved.

  • PDF
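The plant class the benchmark treats can be sketched directly: an undamped chain of three coupled inertias. The inertia and stiffness values below are hypothetical; the point is that the open-loop modes sit on the imaginary axis (pure oscillation plus a rigid-body mode), which is what makes robust active damping necessary.

```python
# State-space sketch of an undamped coupled three-inertia chain.
# J (inertias) and k (coupling stiffnesses) are hypothetical values.
import numpy as np

J = [1.0, 0.5, 0.8]
k = [2.0, 3.0]

# State x = [th1, th2, th3, w1, w2, w3]; J_i * th_i'' = coupling torques.
A = np.zeros((6, 6))
A[0:3, 3:6] = np.eye(3)
A[3, 0:3] = [-k[0] / J[0],  k[0] / J[0],            0.0]
A[4, 0:3] = [ k[0] / J[1], -(k[0] + k[1]) / J[1],   k[1] / J[1]]
A[5, 0:3] = [          0.0,  k[1] / J[2],          -k[1] / J[2]]

eigs = np.linalg.eigvals(A)
print(np.max(np.abs(eigs.real)))  # ~0: undamped vibration + rigid-body modes
```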

Diagnosis by Rough Set and Information Theory in Reinforcing the Competencies of the Collegiate (러프집합과 정보이론을 이용한 대학생역량강화 진단)

  • Park, In-Kyoo
    • Journal of Digital Convergence
    • /
    • v.12 no.8
    • /
    • pp.257-264
    • /
    • 2014
  • This paper presents a core-competencies diagnosis system targeting our collegiate students, intended to identify the core competencies needed to reinforce learning and employment capabilities. Because present-day data exhibit high redundancy and dimensionality, with attendant time complexity, they are more likely to contain spurious relationships, and even the weakest relationships will appear highly significant under any statistical test. To measure the uncertainty arising from the classification of categorical data, and to implement an analytic system for it, an uncertainty measure combining rough entropy and information entropy is defined; a similar-behaviors analysis is then carried out, and its clustering ability is demonstrated in comparison with the statistical approach. Because the acquired and required competencies of the students, i.e., common core competencies and major core competencies, are deduced from the diagnosis results, they facilitate not only collegiate life and the reinforcement of employment capability but also the revitalization of employment and the adjustment to college life.
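The partition-entropy idea behind such an uncertainty measure can be sketched briefly: categorical attributes induce equivalence classes on the universe of records, and the Shannon entropy of that partition quantifies its classification uncertainty. The student records below are hypothetical.

```python
# Entropy of the partition induced by a categorical attribute.
# The eight "student" labels are hypothetical illustrations.
import math
from collections import Counter

def partition_entropy(labels):
    """Shannon entropy H = -sum p_i * log2(p_i) over the partition blocks."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

coarse = ["A", "A", "A", "A", "B", "B", "B", "B"]  # 2 equal blocks
fine   = ["A", "A", "B", "B", "C", "C", "D", "D"]  # 4 equal blocks
print(partition_entropy(coarse), partition_entropy(fine))  # 1.0 2.0
```

A finer partition carries more information (higher entropy); rough-entropy variants weight this by the roughness of each approximation region.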

The systematic sampling for inferring the survey indices of Korean groundfish stocks

  • Hyun, Saang-Yoon;Seo, Young IL
    • Fisheries and Aquatic Sciences
    • /
    • v.21 no.8
    • /
    • pp.24.1-24.9
    • /
    • 2018
  • The Korean bottom trawl survey has been deployed on a regular basis for about the last decade as part of groundfish stock assessments. The regularity means that the survey samples groundfish once per grid cell, whose sides span half a degree of latitude and half a degree of longitude, respectively, and whose interior is further divided into nine nested grids. Unless there is a special reason to deviate (e.g., running into a rocky bottom), the sample location is the center grid of the nine nested grids. Given the data collected by the survey, we intended to show how to appropriately estimate not only the survey index of a fish stock but also its uncertainty. Because of this regularity, we applied systematic sampling theory for these purposes and compared its results with a reference based on simple random sampling. Using survey data on 11 fish stocks collected by the spring and fall surveys in 2014, the survey indices estimated under systematic sampling were overall more precise than those under simple random sampling. For the survey indices in number, the standard errors of the estimates under systematic sampling were 0.23~27.44% smaller than those under simple random sampling, while for the survey indices in weight they were 0.04~31.97% smaller. In bias, the systematic sampling estimates were the same as the simple random sampling estimates. Our paper is the first to formally show how to apply systematic sampling theory to the actual data collected by the Korean bottom trawl surveys.
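The precision comparison the abstract reports can be sketched on a single sample: estimate the mean and its standard error under the usual simple-random-sampling (SRS) formula, and again with a successive-difference variance estimator that exploits the serial ordering of a systematic sample. The catch values, sample size, and population size below are hypothetical.

```python
# SE of a mean from one systematic sample: SRS formula vs the
# successive-difference estimator. All numbers are hypothetical.
import statistics

y = [12, 14, 13, 15, 30, 28, 27, 25, 8, 9]  # ordered systematic sample
n, N = len(y), 100                          # sample / population sizes
f = n / N                                   # sampling fraction
mean = statistics.fmean(y)

# SRS-style variance of the mean: (1 - f) * s^2 / n.
se2_srs = (1 - f) * statistics.variance(y) / n

# Successive-difference estimator, which uses the ordering of the draw:
sd2 = sum((y[i + 1] - y[i]) ** 2 for i in range(n - 1)) / (2 * (n - 1))
se2_sys = (1 - f) * sd2 / n

print(mean, se2_sys ** 0.5, se2_srs ** 0.5)
```

With spatially clustered data like these, neighboring cells resemble each other, so the successive-difference SE comes out smaller than the SRS SE, mirroring the precision gains the paper reports.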

Probabilistic Approach of Stability Analysis for Rock Wedge Failure (확률론적 해석방법을 이용한 쐐기파괴의 안정성 해석)

  • Park, Hyuck-Jin
    • Economic and Environmental Geology
    • /
    • v.33 no.4
    • /
    • pp.295-307
    • /
    • 2000
  • Probabilistic analysis is a powerful method for quantifying the variability and uncertainty common in engineering geology. In rock slope engineering, uncertainty and variation may take the form of scatter in the orientations and geometries of discontinuities, as well as in test results. In deterministic analysis, however, the factor of safety used to assess the stability of rock slopes is based on fixed representative values for each parameter, without consideration of the scatter in the data. In probabilistic analysis, by contrast, these discontinuity parameters are treated as random variables, and reliability and probability theory are used to evaluate the possibility of slope failure. The factor of safety is therefore itself a random variable and is replaced by the probability of failure as the measure of slope stability. In this study, the stochastic properties of the discontinuity parameters are evaluated and the stability of a rock slope is analyzed on the basis of those random properties. The results of the deterministic and probabilistic analyses are then compared and the differences between the two methods are explained.

  • PDF
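Replacing a single factor of safety by a probability of failure is easy to sketch with Monte Carlo sampling. The simplified planar model FS = tan(phi) / tan(beta) and all parameter values below are hypothetical; the paper's wedge geometry involves two intersecting discontinuity planes and is considerably more involved.

```python
# Monte Carlo probability of failure for a simplified planar slide.
# Model FS = tan(phi)/tan(beta) and all parameters are hypothetical.
import math
import random

random.seed(42)
beta = math.radians(35)  # slope angle (hypothetical)

def simulate_pf(mu_phi=36.0, sd_phi=4.0, trials=100_000):
    failures = 0
    for _ in range(trials):
        phi = math.radians(random.gauss(mu_phi, sd_phi))  # friction angle draw
        fs = math.tan(phi) / math.tan(beta)               # factor of safety
        failures += fs < 1.0
    return failures / trials

pf = simulate_pf()
print(pf)  # probability of failure; the deterministic FS at the mean is > 1
```

The instructive point is the mismatch: the deterministic FS at the mean friction angle exceeds 1 ("safe"), yet roughly 40% of the sampled realizations fail, which is exactly the information the probabilistic approach adds.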

A financial feasibility analysis of architectural development projects that use probabilistic simulation analysis method (확률론적 시뮬레이션 분석방법을 적용한 건축개발사업의 재무적 타당성 분석)

  • Lee, Seong-Soo;Choi, Hee-Bok;Kang, Kyung-In
    • Korean Journal of Construction Engineering and Management
    • /
    • v.8 no.3
    • /
    • pp.76-86
    • /
    • 2007
  • Construction development projects realize their profit only when the completed facility is delivered, so the success or failure of a project depends on accurate analysis and forecasting of business feasibility at the early project stage. A feasibility study is inherently decision-making under uncertainty, because it must estimate, at the present point in time, a future that is uncertain. Decision-making under uncertainty is properly grounded in probability theory, yet feasibility studies have so far been carried out by deterministic rather than probabilistic decision methods. This study therefore presents a probabilistic analysis method using simulation that can supply more accurate and reliable data to the decision-maker in early-stage feasibility studies. The results indicate that the probabilistic method is more suitable than the deterministic method for the financial feasibility analysis of construction development projects. Applied to important projects or careful decision-making, the probabilistic method enables efficient judgments based on accuracy and reliability.
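The probabilistic feasibility idea can be sketched as a Monte Carlo NPV simulation: instead of one deterministic net present value, uncertain inputs are drawn from distributions and the output is an NPV distribution plus the probability the project loses money. All cash-flow figures, rates, and distribution choices below are hypothetical.

```python
# Monte Carlo financial feasibility sketch: NPV distribution and loss
# probability. All monetary figures and distributions are hypothetical.
import random

random.seed(0)

def npv(rate, cashflows):
    """Net present value of cashflows indexed by year (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate(trials=20_000):
    results = []
    for _ in range(trials):
        cost = random.gauss(100.0, 10.0)              # initial cost
        sales = random.triangular(30.0, 60.0, 45.0)   # yearly revenue
        rate = random.uniform(0.04, 0.08)             # discount rate
        results.append(npv(rate, [-cost] + [sales] * 3))
    return results

npvs = simulate()
p_loss = sum(v < 0 for v in npvs) / len(npvs)
print(sum(npvs) / len(npvs), p_loss)  # expected NPV and probability of loss
```

A deterministic study would report only the single NPV at the mean inputs; the simulated loss probability is the extra decision information the paper argues for.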

Fuzzy Minimum Interval Partition for Uncertain Time Interval (불확실한 시간 간격을 위한 퍼지 최소 간격 분할 기법)

  • Heo, Mun-Haeng;Lee, Gwang-Gyu;Lee, Jun-Uk;Ryu, Geun-Ho;Kim, Hong-Gi
    • The KIPS Transactions:PartD
    • /
    • v.9D no.4
    • /
    • pp.571-578
    • /
    • 2002
  • In temporal databases, the time dimension added for history management increases the complexity and cost of join operations. A known remedy is to partition the time range into fixed intervals and join the segmented temporal data. Existing methods, however, cannot resolve the ambiguity of time borders caused by temporal granularity at the partition points. In this paper, we propose the Fuzzy Minimum Interval Partition (FMIP) method, which applies the possibility distributions of fuzzy theory to model the uncertain time-interval borders at the partition lines.
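The core idea of a fuzzy partition border can be sketched in a few lines: instead of a crisp cut at time t = b, membership in the left segment decays linearly over an uncertainty width w, so a timestamp near the border belongs partially to both segments. The function and all values below are hypothetical illustrations, not FMIP itself.

```python
# Fuzzy partition-border sketch: linear (triangular-style) membership of a
# timestamp in the left segment. Names and values are hypothetical.
def left_membership(t, border, width):
    """1 inside the left segment, 0 past the fuzzy zone, linear in between."""
    if t <= border - width:
        return 1.0
    if t >= border + width:
        return 0.0
    return (border + width - t) / (2 * width)

border, width = 100, 10
print(left_membership(80, border, width),   # clearly left
      left_membership(100, border, width),  # on the border
      left_membership(125, border, width))  # clearly right
```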

Dempster-Shafer Fusion of Multisensor Imagery Using Gaussian Mass Function (Gaussian분포의 질량함수를 사용하는 Dempster-Shafer영상융합)

  • Lee Sang-Hoon
    • Korean Journal of Remote Sensing
    • /
    • v.20 no.6
    • /
    • pp.419-425
    • /
    • 2004
  • This study proposes a data fusion method based on the Dempster-Shafer evidence theory. The Dempster-Shafer fusion uses mass functions obtained under a class-independent Gaussian assumption. In the Dempster-Shafer approach, uncertainty is represented by the 'belief interval', the difference between the values of the 'belief' and 'plausibility' functions, which measure imprecision and uncertainty. By utilizing the Dempster-Shafer scheme to fuse data from multiple sensors, classification results can be improved; in particular, it lets users include regions with mixed classes in the training process, since in most practical cases it is hard to find regions with a pure class. In this study, the proposed method was applied to the KOMPSAT-EOC panchromatic image and LANDSAT ETM+ NDVI data acquired over the Yongin/Nuengpyung area of Kyunggi-do. The results show its potential for effective data fusion of multisensor imagery.
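Dempster's rule of combination and the belief interval described above can be sketched on a two-class frame {A, B}. The mass values below are hypothetical stand-ins for what the paper derives from class-conditional Gaussian densities of the two sensors.

```python
# Dempster's rule of combination on a two-class frame {A, B}.
# The mass assignments for the two "sensors" are hypothetical.
from itertools import product

def combine(m1, m2):
    """Dempster's rule: intersect focal sets, renormalise by 1 - conflict."""
    out, conflict = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            out[inter] = out.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {s: v / (1 - conflict) for s, v in out.items()}

AB = frozenset("AB")
m1 = {frozenset("A"): 0.6, frozenset("B"): 0.1, AB: 0.3}  # sensor 1
m2 = {frozenset("A"): 0.5, frozenset("B"): 0.2, AB: 0.3}  # sensor 2
m = combine(m1, m2)

belief_A = m[frozenset("A")]                               # mass committed to A
plausibility_A = sum(v for s, v in m.items() if frozenset("A") & s)
print(round(belief_A, 3), round(plausibility_A, 3))  # belief interval for A
```

The gap between plausibility and belief is the 'belief interval' the abstract refers to: the uncertainty the fused evidence leaves about class A.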

Characterizing Spatiotemporal Variations and Mass Balance of CO2 in a Stratified Reservoir using CE-QUAL-W2 (CE-QUAL-W2를 이용한 성층 저수지에서 CO2의 시공간적 분포 및 물질수지 분석)

  • Park, Hyungseok;Chung, Sewoong
    • Journal of Korean Society on Water Environment
    • /
    • v.36 no.6
    • /
    • pp.508-520
    • /
    • 2020
  • Dam reservoirs have been reported to contribute significantly to global carbon emissions, but unlike natural lakes, there is considerable uncertainty in calculating their carbon emissions due to the complexity of the emission pathways. In particular, calculating the net atmospheric flux (NAF) of carbon dioxide (CO2) from sporadic data using simple gas-exchange theory has limitations in explaining the spatiotemporal variations of the CO2 flux in stratified reservoirs. This study aimed to analyze the spatial and temporal CO2 distribution and mass balance in Daecheong Reservoir, located in the mid-latitude monsoon climate zone, by applying a two-dimensional hydrodynamic and water quality model (CE-QUAL-W2). Simulation results showed that the Daecheong Reservoir is a heterotrophic system in which CO2 is supersaturated as a whole and released to the atmosphere. Spatially, CO2 emissions were greater in the lacustrine zone than in the riverine and transition zones. Temporally, CO2 emissions changed dynamically according to the stratification structure of the reservoir and the variations in algal biomass. CO2 emissions were greater at night than during the day and were seasonally greatest in winter. The CO2 NAF calculated by the CE-QUAL-W2 model and by gas-exchange theory showed a similar range, but the peak values occurred at different times. These findings provide useful information for improving the quantification of CO2 emissions from reservoirs. To reduce the uncertainty in estimating reservoir carbon emissions, more precise monitoring in time and space is required.
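The simple gas-exchange estimate that the model results are compared against can be sketched as a bulk thin-boundary-layer flux, F = k * K_H * (pCO2_water - pCO2_air). The transfer velocity, solubility constant, and pCO2 values below are hypothetical, order-of-magnitude illustrations, not the Daecheong Reservoir measurements.

```python
# Bulk gas-exchange sketch of CO2 net atmospheric flux (NAF).
# All parameter values are hypothetical illustrations.
k_gas = 0.5                  # gas-transfer velocity, m/day (hypothetical)
K_H = 3.39e-5                # approx. CO2 solubility near 25 degC, mol m^-3 uatm^-1
pco2_water = 2000.0          # uatm; supersaturated water (hypothetical)
pco2_air = 410.0             # uatm; atmospheric partial pressure (approx.)

flux = k_gas * K_H * (pco2_water - pco2_air)  # mol CO2 m^-2 day^-1
print(flux * 1000)  # in mmol m^-2 day^-1; positive = CO2 source to atmosphere
```

A positive flux marks the reservoir as a CO2 source, matching the heterotrophic, supersaturated behavior the simulations found; the 2-D model refines this bulk estimate in space and time.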