• Title/Summary/Keyword: uncertain data

Search Result 525

APPLICATION OF FUZZY LOGIC IN THE CLASSICAL CELLULAR AUTOMATA MODEL

  • Chang, Chun-Ling;Zhang, Yun-Jie;Dong, Yun-Ying
    • Journal of applied mathematics & informatics
    • /
    • v.20 no.1_2
    • /
    • pp.433-443
    • /
    • 2006
  • In [1], the authors build a two-population cellular automata model with predation based on the Penna model. In this paper, uncertainty and the problems of imprecise and vague data are considered in this model. A fuzzy cellular automata model containing movable wolves and sheep is built. The results show that the fuzzy cellular automata can simulate the classical CA model and can deal with imprecise and vague data.
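
As a minimal illustration of the idea (not the authors' wolf-sheep model), a one-dimensional elementary CA can be fuzzified by replacing the crisp Boolean update with fuzzy operators, so cell states become membership degrees in [0, 1]; crisp 0/1 states recover the classical rule exactly:

```python
import numpy as np

def fuzzy_rule90_step(cells):
    """One update of a fuzzified elementary CA (rule 90).

    Cell states are membership degrees in [0, 1]; the classical XOR of
    the two neighbours is replaced by the fuzzy XOR
    max(min(a, 1-b), min(1-a, b)). With crisp 0/1 states this reduces
    exactly to the classical rule.
    """
    left = np.roll(cells, 1)
    right = np.roll(cells, -1)
    return np.maximum(np.minimum(left, 1 - right),
                      np.minimum(1 - left, right))

# Crisp initial condition: a single live cell splits to both neighbours.
crisp = np.zeros(9)
crisp[4] = 1.0
print(fuzzy_rule90_step(crisp))

# Vague initial data: a partial membership of 0.7 propagates smoothly.
vague = np.zeros(9)
vague[4] = 0.7
print(fuzzy_rule90_step(vague))
```

With imprecise inputs the partial memberships spread through the lattice instead of being forced to 0 or 1, which is the behaviour the fuzzy CA approach exploits.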

Optimal Design of Nonlinear Structural Systems via EFM Based Approximations (진화퍼지 근사화모델에 의한 비선형 구조시스템의 최적설계)

  • 이종수;김승진
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2000.05a
    • /
    • pp.122-125
    • /
    • 2000
  • The paper describes the adaptation of evolutionary fuzzy modeling (EFM) in developing global function approximation tools for use in genetic-algorithm-based optimization of nonlinear structural systems. EFM is an optimization process that determines the fuzzy membership parameters for constructing a global approximation model in cases where the training data are insufficient or the design process includes uncertain information. The paper presents the performance of EFM in terms of the numbers of fuzzy rules and training data, and then explores EFM-based sizing of an automotive component for passenger protection.
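
A hedged sketch of the kind of fuzzy approximation model EFM tunes: a zero-order Takagi-Sugeno system with Gaussian memberships. The centers, widths, and consequents below are illustrative values; EFM would instead determine them by evolutionary search against the (possibly sparse) training data:

```python
import numpy as np

def fuzzy_approx(x, centers, widths, consequents):
    """Zero-order Takagi-Sugeno fuzzy function approximation.

    Gaussian memberships fire each rule; the output is the
    firing-strength-weighted average of the rule consequents.
    """
    mu = np.exp(-((x[:, None] - centers) ** 2) / (2 * widths ** 2))
    return (mu * consequents).sum(axis=1) / mu.sum(axis=1)

# Three illustrative rules approximating y = x**2 on [0, 1].
centers = np.array([0.0, 0.5, 1.0])
widths = np.array([0.2, 0.2, 0.2])
consequents = np.array([0.0, 0.25, 1.0])

x = np.linspace(0.0, 1.0, 5)
print(fuzzy_approx(x, centers, widths, consequents))
```

A GA would encode `centers`, `widths`, and `consequents` as a chromosome and minimize the approximation error, which is what makes the scheme usable when training data are scarce or uncertain.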

Robust H$\infty$ FIR Filtering for Uncertain Time-Varying Sampled-Data Systems

  • Ryu, Hee-Seob;Kwon, Byung-Moon;Kwon, Oh-Kyu
    • Journal of KIEE
    • /
    • v.11 no.1
    • /
    • pp.21-26
    • /
    • 2001
  • This paper considers the problem of robust H$\infty$ FIR filtering for uncertain time-varying sampled-data systems. The filter is derived using the equivalence relationship between the FIR filter and the recursive filter, and guarantees a prescribed H$\infty$ performance in the continuous-time context, irrespective of parameter uncertainty and unknown initial states.
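
The finite-memory property that makes FIR filters insensitive to unknown initial states can be sketched as follows; the uniform gains here are generic placeholders, not the H$\infty$-optimal gains derived in the paper:

```python
import numpy as np

def fir_filter_estimate(measurements, gains):
    """Finite-impulse-response (finite-memory) estimate.

    The estimate uses only the most recent N measurements, so unlike a
    recursive (IIR) filter it is unaffected by unknown initial states:
    anything older than the window drops out entirely.
    """
    N = len(gains)
    window = np.asarray(measurements[-N:], dtype=float)
    return float(np.dot(gains, window))

y = [5.0, 1.0, 1.2, 0.9, 1.1]   # noisy measurements; t=0 is an outlier
h = np.full(4, 0.25)            # uniform 4-tap gains (placeholder)
print(fir_filter_estimate(y, h))  # 1.05 -- the outlier at t=0 lies outside the window
```

The paper's contribution is the choice of the gain vector so that a prescribed H$\infty$ disturbance-attenuation bound holds despite parameter uncertainty; the windowing structure itself is what removes the dependence on initial conditions.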

Health State Clustering and Prediction Based on Bayesian HMM (Bayesian HMM 기반의 건강 상태 분류 및 예측)

  • Sin, Bong-Kee
    • Journal of KIISE
    • /
    • v.44 no.10
    • /
    • pp.1026-1033
    • /
    • 2017
  • In this paper, a Bayesian modeling and duration-based prediction method is proposed for health clinic time series data using the Hierarchical Dirichlet Process Hidden Markov Model (HDP-HMM). HDP-HMM is a Bayesian extension of the HMM that can find the optimal number of health states, a number which is highly uncertain and even difficult to estimate in the context of health dynamics. Test results of HDP-HMM using simulated data and real health clinic data have shown interesting modeling behaviors and promising prediction performance over spans of up to five years. The future of health change is uncertain and its prediction is inherently difficult, but experimental results on health clinic data suggest that practical long-term prediction is possible and can be made useful if we present multiple hypotheses given dynamic contexts as defined by HMM states.
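
One simple consequence of HMM structure that duration-based prediction can build on: the sojourn time of state i is geometric with self-transition probability A[i][i], so its mean is 1/(1 - A[i][i]). A sketch with a hypothetical 3-state health transition matrix (illustrative numbers, not data from the paper):

```python
import numpy as np

def expected_sojourn(A, state):
    """Expected remaining duration (time steps) of `state`.

    In an HMM, time spent in state i is geometrically distributed with
    self-transition probability A[i, i], giving mean 1 / (1 - A[i, i]).
    """
    return 1.0 / (1.0 - A[state, state])

def most_likely_next_state(A, state):
    """Most probable successor state, ignoring the self-transition."""
    row = A[state].copy()
    row[state] = 0.0
    return int(np.argmax(row))

# Hypothetical transition matrix over three latent health states.
A = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.80, 0.15],
              [0.01, 0.04, 0.95]])

print(expected_sojourn(A, 0))        # about 10 steps on average
print(most_likely_next_state(A, 0))  # state 1
```

An HDP-HMM additionally places a hierarchical Dirichlet process prior over the rows of A, so the number of states is inferred from the data rather than fixed in advance.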

Chaotic Forecast of Time-Series Data Using Inverse Wavelet Transform

  • Matsumoto, Yoshiyuki;Yabuuchi, Yoshiyuki;Watada, Junzo
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2003.09a
    • /
    • pp.338-341
    • /
    • 2003
  • Recently, chaotic methods have been employed to forecast the near future of uncertain phenomena. The method makes this possible by reconstructing an attractor of given time-series data in multi-dimensional space through Takens' embedding theorem. However, many economic time-series data are not sufficiently chaotic; in other words, it is hard to forecast the future trend of such economic data on the basis of chaos theory. In this paper, time-series data are divided into wave components using the wavelet transform. It is shown that some of the divided components are much more chaotic, in the sense of correlation dimension, than the original time series. The highly chaotic nature of these components enables us to forecast the value or movement of the time series in the near future more precisely. The up-and-down movement of the TOPICS value is predicted by this method with an accuracy as high as 70%.
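
The attractor reconstruction step the method relies on, Takens' delay embedding, can be sketched as follows; the embedding dimension and delay must be chosen by the analyst (e.g. via false nearest neighbours and mutual information):

```python
import numpy as np

def takens_embedding(series, dim, delay):
    """Reconstruct a delay-coordinate attractor from a scalar series.

    Row t is (x[t], x[t+delay], ..., x[t+(dim-1)*delay]), following
    Takens' embedding theorem.
    """
    n = len(series) - (dim - 1) * delay
    if n <= 0:
        raise ValueError("series too short for this (dim, delay)")
    return np.column_stack([series[i * delay: i * delay + n]
                            for i in range(dim)])

x = np.arange(20.0)                       # stand-in for a time series
emb = takens_embedding(x, dim=3, delay=5)
print(emb.shape)   # (10, 3)
print(emb[0])      # first delay vector: (0, 5, 10)
```

In the paper's pipeline, each wavelet component would be embedded this way, and the components whose reconstructed attractors show low correlation dimension are the ones forecast chaotically.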

Evaluating LIMU System Quality with Interval Evidence and Input Uncertainty

  • Xiangyi Zhou;Zhijie Zhou;Xiaoxia Han;Zhichao Ming;Yanshan Bian
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.17 no.11
    • /
    • pp.2945-2965
    • /
    • 2023
  • The laser inertial measurement unit is a precision device widely used in rocket navigation systems and other equipment, and its quality is directly related to navigation accuracy. In the quality evaluation of a laser inertial measurement unit, there is inevitably uncertainty in the index input information: first, the input numerical information is in interval form; second, the index input grades and the quality evaluation result grades are given according to different national standards. Transforming the interval information input by each index into a data form consistent with the evaluation result grades is therefore a key step. For the case of uncertain input, this paper puts forward a method based on probability distributions to solve the problem of asymmetry between the reference grades given by the indices and the evaluation result grades when evaluating the quality of a laser inertial measurement unit. By mapping the numerical relationship between the designated reference levels and the evaluation reference levels of the index information under different distributions, index evidence symmetrical with the evaluation reference levels is obtained. After the uncertain input information is transformed into evidence with an interval degree distribution by this method, information fusion of the interval-degree-distribution evidence is carried out by the interval evidential reasoning algorithm, and the evaluation result is obtained by optimization with the projection covariance matrix adaptive evolution strategy. Taking a five-meter redundant laser inertial measurement unit as an example, the applicability and effectiveness of the method are verified.
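
A common rule-based transformation in evidential reasoning distributes a crisp indicator reading linearly over adjacent evaluation grades. The sketch below covers only that crisp case, with hypothetical grade reference values; the paper's contribution extends the transformation to interval-valued inputs and mismatched grade scales:

```python
def value_to_beliefs(x, grades):
    """Distribute a crisp indicator reading over evaluation grades.

    A value between two adjacent grade reference levels is split
    linearly into belief degrees on those two grades; values outside
    the scale saturate at the nearest grade.
    """
    beliefs = [0.0] * len(grades)
    if x <= grades[0]:
        beliefs[0] = 1.0
        return beliefs
    if x >= grades[-1]:
        beliefs[-1] = 1.0
        return beliefs
    for k in range(len(grades) - 1):
        lo, hi = grades[k], grades[k + 1]
        if lo <= x <= hi:
            beliefs[k] = (hi - x) / (hi - lo)
            beliefs[k + 1] = 1.0 - beliefs[k]
            break
    return beliefs

# Hypothetical grade reference values: Excellent=0.0, Good=1.0, Poor=2.0.
print(value_to_beliefs(0.3, [0.0, 1.0, 2.0]))  # belief split between Excellent and Good
```

With interval inputs, this transformation would be applied to both interval endpoints, yielding interval-valued belief degrees that the interval evidential reasoning algorithm then fuses across indices.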

Improvement of Control Performance by Data Fusion of Sensors

  • Na, Seung-You;Shin, Dae-Jung
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.4 no.1
    • /
    • pp.63-69
    • /
    • 2004
  • In this paper, we propose a general framework for sensor data fusion applied to control systems. Since many kinds of disturbances are introduced to a control system, it is necessary to rely on multisensor data fusion to maintain control performance in spite of the disturbances. Multisensor data fusion for a control system is considered as a sequence of decisions about how to combine sensor data into a proper control input under uncertain disturbance effects on the sensors. The proposed method is applied to a typical control system, a flexible link system, in which oscillation is reduced using a photo sensor at the tip of the link; the control performance, however, depends heavily on the ambient light conditions. To overcome the light disturbance, an accelerometer is used in addition to the existing photo sensor. Improvement of control performance through multisensor data fusion is demonstrated for various output responses, showing the feasibility of the proposed method.
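
One standard way to realize such fusion is inverse-variance weighting, which automatically de-weights a sensor whose current disturbance makes it unreliable. A minimal sketch, with made-up numbers standing in for the photo sensor and accelerometer estimates:

```python
import numpy as np

def fuse(estimates, variances):
    """Inverse-variance (minimum-variance) fusion of sensor readings.

    Each sensor contributes with weight 1/variance, so a sensor that is
    currently disturbed (large variance) is automatically de-weighted.
    """
    w = 1.0 / np.asarray(variances, dtype=float)
    fused = float(np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w))
    fused_var = float(1.0 / np.sum(w))
    return fused, fused_var

# Photo sensor disturbed by ambient light (large variance);
# the accelerometer-based estimate is currently more reliable.
tip, var = fuse([1.2, 0.9], [0.30, 0.05])
print(tip, var)   # fused estimate near 0.94, variance smaller than either sensor's
```

The fused variance is always smaller than the smallest individual sensor variance, which is the quantitative sense in which fusion improves on either sensor alone.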

Probabilistic Interpretation of NDE Data in Condition Assessment of Bridge Element (교량안전진단에 있어서 비파괴 시험자료의 통계적 해석 방법)

  • 심형섭;강보순;황성춘
    • Proceedings of the Korea Concrete Institute Conference
    • /
    • 2001.11a
    • /
    • pp.803-808
    • /
    • 2001
  • The mathematical basis of the interpretation of data from nondestructive evaluation (NDE) methods in bridge inspection is presented. In bridge inspection with NDE methods, the NDE data are not themselves assessments; the data must be interpreted as the condition of an element, and that interpretation constitutes the assessment. Correct assessments of the conditions of bridge elements depend on the accuracy and variability of the test data, as well as on the uncertainty of the correlations between attributes (what is measured) and conditions (what is sought in the inspection). Inaccuracy and variability in test data define the quality of the NDE test. The quality of the test itself is important, but in view of condition assessment, the significance of uncertainty in the correlation of attributes and conditions must be combined with it. NDE methods that are accurate in their measurements may still be found to be poor methods if their attributes are uncertain indicators of the condition of bridge elements. This paper reports a mathematical presentation of inaccuracy and variability in test data, and of uncertainty in the correlation of attributes to element conditions, with three examples of NDE methods.
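
The combination of test accuracy and attribute-condition correlation can be made concrete with a Bayesian update; the probabilities below are hypothetical, chosen only to contrast a strongly and a weakly condition-indicating attribute:

```python
def posterior_condition(prior_bad, p_attr_given_bad, p_attr_given_good):
    """Bayesian interpretation of an NDE attribute reading.

    Combines the likelihoods of the measured attribute given element
    condition with prior condition information. An attribute that
    correlates weakly with condition moves the prior very little,
    no matter how accurately it is measured.
    """
    num = p_attr_given_bad * prior_bad
    den = num + p_attr_given_good * (1.0 - prior_bad)
    return num / den

# Accurate test whose attribute strongly indicates a deteriorated element...
print(posterior_condition(0.10, 0.85, 0.10))   # ~0.49
# ...versus an equally accurate test whose attribute barely indicates condition.
print(posterior_condition(0.10, 0.55, 0.45))   # ~0.12
```

The second case is the paper's point: measurement accuracy alone does not make a good NDE method if the attribute-condition correlation is weak.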

THE EFFECTS OF UNCERTAIN TOPOGRAPHIC DATA ON SPATIAL PREDICTION OF LANDSLIDE HAZARD

  • Park, No-Wook;Kyriakidis, Phaedon C.
    • Proceedings of the KSRS Conference
    • /
    • 2008.10a
    • /
    • pp.259-261
    • /
    • 2008
  • GIS-based spatial data integration tasks have used exhaustive thematic maps generated from sparsely sampled data or satellite-based exhaustive data. Due to simplifications of reality and errors in mapping procedures, such spatial data are usually imperfect and of varying accuracy. The objective of this study is to carry out a sensitivity analysis of the input topographic data for landslide hazard mapping. Two different types of elevation estimates, elevation spot heights and a DEM from ASTER stereo images, are considered. The geostatistical framework of kriging is applied to generate more reliable elevation estimates from both the sparse elevation spot heights and the exhaustive ASTER-based elevation values. The effects of the differing accuracy of the resulting terrain-related maps on the prediction performance of landslide hazard are illustrated through a case study of Boeun, Korea.
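
A minimal ordinary-kriging sketch of the kind used to interpolate elevation spot heights; the exponential covariance model and its parameters are illustrative, not the fitted variogram from the study:

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, range_=1000.0, sill=1.0):
    """Ordinary kriging of spot heights at one target location.

    Uses an exponential covariance C(h) = sill * exp(-h / range_) and
    solves the ordinary-kriging system with a Lagrange multiplier to
    enforce unbiasedness (weights sum to one). Returns the estimate
    and the kriging variance, a measure of estimation uncertainty.
    """
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    K = np.ones((n + 1, n + 1))
    K[:n, :n] = sill * np.exp(-d / range_)
    K[n, n] = 0.0
    d0 = np.linalg.norm(xy - xy0, axis=-1)
    k = np.append(sill * np.exp(-d0 / range_), 1.0)
    w = np.linalg.solve(K, k)          # n weights + Lagrange multiplier
    est = float(w[:n] @ z)
    var = float(sill - w @ k)          # kriging variance at xy0
    return est, var

# Three hypothetical spot heights (metres) and a target point.
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
heights = np.array([120.0, 130.0, 125.0])
print(ordinary_kriging(pts, heights, np.array([50.0, 50.0])))
```

The kriging variance is what makes the method attractive for sensitivity studies like this one: it quantifies how elevation uncertainty varies across the map, not just the interpolated values.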

Technical Trends of Time-Series Data Imputation (시계열 데이터 결측치 처리 기술 동향)

  • Kim, E.D.;Ko, S.K.;Son, S.C.;Lee, B.T.
    • Electronics and Telecommunications Trends
    • /
    • v.36 no.4
    • /
    • pp.145-153
    • /
    • 2021
  • Data imputation is a crucial issue in data analysis because data quality is highly correlated with the performance of AI models. In particular, it is difficult to collect quality time-series data in uncertain situations (for example, electricity blackouts or network delays), so effective methods of time-series data imputation need to be studied. Studies on time-series data imputation can be divided into five categories: statistics-based, matrix-based, regression-based, and deep learning-based (RNN- and GAN-based) methodologies. This study reviews and organizes these methodologies. Recently developed deep learning-based imputation methods show excellent performance; however, their computational cost makes them difficult to use in real-time systems. Thus, the direction of future work is to develop imputation methods that combine low computational cost with high performance for application in the field.
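
As a baseline from the statistics-based category, filling gaps by linear interpolation takes only a few lines; the deep-learning methods surveyed aim to outperform such baselines on long or irregular gaps:

```python
import numpy as np

def interpolate_missing(values):
    """Fill NaN gaps in a 1-D time series by linear interpolation.

    A simple statistics-based imputation baseline: each missing point
    is interpolated linearly between its nearest observed neighbours.
    """
    x = np.asarray(values, dtype=float)
    idx = np.arange(len(x))
    missing = np.isnan(x)
    x[missing] = np.interp(idx[missing], idx[~missing], x[~missing])
    return x

series = [1.0, np.nan, np.nan, 4.0, 5.0]
print(interpolate_missing(series))  # [1. 2. 3. 4. 5.]
```

Such a baseline is computationally trivial and usable in real time, which is exactly the trade-off the article highlights against heavier RNN- and GAN-based imputers.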