• Title/Summary/Keyword: conditional probability model


Prediction of Forest Fire Hazardous Area Using Predictive Spatial Data Mining (예측적 공간 데이터 마이닝을 이용한 산불위험지역 예측)

  • Han, Jong-Gyu;Yeon, Yeon-Kwang;Chi, Kwang-Hoon;Ryu, Keun-Ho
    • The KIPS Transactions:PartD
    • /
    • v.9D no.6
    • /
    • pp.1119-1126
    • /
    • 2002
  • In this paper, we propose two predictive spatial data mining methods based on spatial statistics and apply them to predicting forest fire hazardous areas: a conditional probability method and a likelihood ratio method. In both approaches, the prediction models and estimation procedures depend on the basic quantitative relationships between spatial data sets relevant to forest fire and the selected past forest fire ignition areas. To produce forest fire hazard prediction maps with the two proposed methods and to evaluate their predictive power, we applied the FHR (Forest Fire Hazard Rate) and the PRC (Prediction Rate Curve), respectively. Comparing the predictive power of the two proposed models, the likelihood ratio method is more powerful than the conditional probability method. The proposed models for predicting forest fire hazardous areas would help increase the efficiency of forest fire management, such as preventing forest fire occurrence and effectively placing forest fire monitoring equipment and manpower.
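A minimal sketch of the two measures named in this abstract, computed from hypothetical raster counts (cells per factor class and past-fire cells per class); it illustrates the generic conditional-probability and likelihood-ratio calculations, not the authors' implementation.

```python
import numpy as np

# Hypothetical counts per factor class (e.g., slope or aspect classes):
# total cells in each class and past forest-fire cells in each class.
cells_in_class = np.array([12000, 8000, 5000, 3000])
fire_cells_in_class = np.array([60, 120, 150, 30])

total_cells = cells_in_class.sum()
total_fire_cells = fire_cells_in_class.sum()

# Conditional probability of fire given the class: P(fire | class)
p_fire_given_class = fire_cells_in_class / cells_in_class

# Likelihood ratio: P(class | fire) / P(class | no fire)
p_class_given_fire = fire_cells_in_class / total_fire_cells
p_class_given_nofire = (cells_in_class - fire_cells_in_class) / (total_cells - total_fire_cells)
likelihood_ratio = p_class_given_fire / p_class_given_nofire

print("P(fire | class):", p_fire_given_class.round(4))
print("Likelihood ratio:", likelihood_ratio.round(2))
```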

The Application of SIS (Sequential Indicator Simulation) for the Manganese Nodule Fields (망간단괴광상의 매장량평가를 위한 SIS (Sequential Indicator Simulation)의 응용)

  • Park, Chan Young;Kang, Jung Keuk;Chon, Hyo Taek
    • Economic and Environmental Geology
    • /
    • v.30 no.5
    • /
    • pp.493-498
    • /
    • 1997
  • The purpose of this study is to develop a geostatistical model for evaluating the abundance of deep-sea manganese nodules. The abundance data used in this study were obtained from the KODOS (Korea Deep Ocean Study) area. The variation of nodule abundance was very high over short distances, while sampling was very limited. Because the distribution of nodule abundance was non-Gaussian, the indicator simulation method was used instead of conditional simulation and/or ordinary kriging. The abundance data were encoded into a series of indicators with 6 cutoff values, which were used to estimate the conditional probability distribution function (cpdf) of the nodule abundance at any unsampled location. Standardized indicator variogram models were obtained through variogram analysis. The SIS method has advantages over traditional techniques such as the turning bands method and ordinary kriging: the values estimated by indicator conditional simulation near high-abundance areas were more detailed than those from ordinary kriging and indicator kriging, and they showed better spatial characteristics of the nodule abundance distribution.

  • PDF
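The indicator-coding step described in this abstract can be illustrated with a short sketch. The sample values and cutoff levels below are hypothetical, and the full SIS workflow (indicator variogram modelling, sequential simulation) is not reproduced.

```python
import numpy as np

# Hypothetical nodule abundance samples (kg/m^2) and cutoff values.
abundance = np.array([2.1, 5.6, 8.3, 0.7, 12.4, 3.9, 6.8])
cutoffs = np.array([1.0, 3.0, 5.0, 7.0, 9.0, 11.0])  # six cutoffs, as in the study

# Indicator transform: i(x; z_k) = 1 if abundance(x) <= z_k, else 0.
indicators = (abundance[:, None] <= cutoffs[None, :]).astype(int)

# The mean of each indicator column is a non-parametric estimate of
# F(z_k) = Prob(abundance <= z_k); kriging these indicators at an unsampled
# location yields the local conditional probability distribution (cpdf).
print(indicators)
print("Estimated F(z_k):", indicators.mean(axis=0).round(3))
```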

Classification of Forest Fire Occurrence Risk Regions Using Forest Site Digital Map (수치산림입지도를 이용한 산불발생위험지역 구분)

  • An Sang-Hyun;Won Myoung-Soo;Kang Young-Ho;Lee Myung-Bo
    • Fire Science and Engineering
    • /
    • v.19 no.3 s.59
    • /
    • pp.64-69
    • /
    • 2005
  • In order to decrease the area damaged by forest fires and to prevent their occurrence, efforts are being made to improve forest fire prevention measures. The objective of this study is to develop a forest fire occurrence probability model from forest site characteristics such as soil type, topography, soil texture, slope, and drainage, together with past forest fire sites. Conditional probability analysis and GIS were used to develop the model, which was then used to classify forest fire occurrence risk regions.
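A hedged sketch of how per-layer conditional probabilities might be combined into one occurrence probability per map cell, assuming the site-characteristic factors are conditionally independent given fire / no fire (a naive-Bayes-style assumption not stated in the abstract); the class labels and probabilities are hypothetical.

```python
# Hypothetical P(fire | class) tables for two site-characteristic layers.
p_fire_given_soil = {"A": 0.004, "B": 0.010, "C": 0.020}
p_fire_given_slope = {"gentle": 0.005, "moderate": 0.012, "steep": 0.025}
p_fire_prior = 0.008  # hypothetical overall fire rate

def combined_probability(soil_class, slope_class):
    """Combine factor-wise conditional probabilities in odds form, assuming
    conditional independence of the factors given fire / no fire."""
    prior_odds = p_fire_prior / (1 - p_fire_prior)
    odds = prior_odds
    for p in (p_fire_given_soil[soil_class], p_fire_given_slope[slope_class]):
        odds *= (p / (1 - p)) / prior_odds  # likelihood ratio contributed by this factor
    return odds / (1 + odds)

print(combined_probability("C", "steep"))   # high-risk cell
print(combined_probability("A", "gentle"))  # low-risk cell
```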

The Development of Condition Degradation Model of Railway PC Beam Bridge Using Transition Probability (철도 PC Beam교량의 전이확률을 이용한 상태저하 모델개발)

  • Kwon, Se-Gon;Park, Mi-Yun;Kim, Do-Kie;Jin, Nam-Hee;Ku, So-Yeun
    • Proceedings of the KSR Conference
    • /
    • 2009.05a
    • /
    • pp.1-5
    • /
    • 2009
  • Recently, as a means of green development and reduction of carbon dioxide emissions, interest in railways has increased. Intensive studies have also been carried out on maintenance activities, the economic efficiency of maintaining rail structures, and the design of railway structures, as well as on the development of materials. The purpose of this paper is to develop a condition degradation model of PC beam bridges that reflects changes over time and maintenance activities. Typically, there is a definite difference between maintained and non-maintained bridges, and proper maintenance can extend the life of a structure. In this study, we analyze structures under ongoing maintenance and apply the same procedures to structures without maintenance, so that the role of maintenance in the condition change of a structure can be established. On this basis we develop a condition degradation model: Markov theory is applied and a transition probability matrix is developed to represent the life of the bridge. This study will benefit decision making for maintenance activities on railway bridges in the future.

  • PDF
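A minimal sketch of a Markov condition degradation model of the kind the abstract describes: a transition probability matrix over discrete condition states, propagated year by year. The number of states and the matrix entries are hypothetical, not values from the paper.

```python
import numpy as np

# Hypothetical annual transition matrix over 4 condition states (1 = best, 4 = worst).
# Row i gives the probabilities of moving from state i to each state in one year;
# without maintenance the bridge can only stay in its state or degrade.
P = np.array([
    [0.90, 0.10, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00],
    [0.00, 0.00, 0.80, 0.20],
    [0.00, 0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0, 0.0])  # new bridge: certainly in state 1
for year in range(1, 31):
    state = state @ P                    # propagate the state distribution one year
    if year % 10 == 0:
        expected_condition = state @ np.array([1, 2, 3, 4])
        print(f"year {year:2d}: distribution {state.round(3)}, "
              f"expected condition {expected_condition:.2f}")
```

A maintained bridge would be modeled with a second matrix whose rows allow transitions back toward better states after repair, and the two cases compared over the same horizon.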

Probability subtraction method for accurate quantification of seismic multi-unit probabilistic safety assessment

  • Park, Seong Kyu;Jung, Woo Sik
    • Nuclear Engineering and Technology
    • /
    • v.53 no.4
    • /
    • pp.1146-1156
    • /
    • 2021
  • Single-unit probabilistic safety assessment (SUPSA) has complex Boolean logic equations for accident sequences. A multi-unit probabilistic safety assessment (MUPSA) model is developed by revising and combining SUPSA models in order to reflect plant state combinations (PSCs). These PSCs represent combinations of core damage and non-core damage states of nuclear power plants (NPPs). Since all these Boolean logic equations contain complemented gates (NOT gates), it is not easy to generate exact Boolean solutions. The delete-term approximation method (DTAM) has been widely applied for generating approximate minimal cut sets (MCSs) from complex Boolean logic equations with complemented gates. By applying DTAM, approximate conditional core damage probability (CCDP) has been calculated in SUPSA and MUPSA. It was found that the CCDP calculated by DTAM is overestimated when complemented gates contain non-rare events; the overestimation increases drastically when a seismic SUPSA or MUPSA has complemented gates with many non-rare events. The objective of this study is to suggest a new quantification method, the probability subtraction method (PSM), that replaces DTAM. The PSM calculates accurate CCDP even when a SUPSA or MUPSA has complemented gates with many non-rare events. In this paper, the PSM is explained, and its accuracy is validated by applications to a few MUPSAs.
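A small numeric illustration of why an approximation that drops complemented terms overestimates probability once the complemented events are no longer rare; the event probabilities are hypothetical, and this shows only the effect the abstract describes, not the PSM algorithm itself.

```python
# Consider a cut set A AND (NOT B), with independent basic events A and B.
p_a = 0.1

for p_b in (0.001, 0.1, 0.5):          # B rare ... B non-rare
    exact = p_a * (1 - p_b)            # exact probability of A AND (NOT B)
    approx = p_a                       # approximation that ignores the NOT-B term
    print(f"P(B)={p_b:5.3f}  exact={exact:.4f}  approx={approx:.4f}  "
          f"overestimation={100 * (approx - exact) / exact:.1f}%")
```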

Probabilistic Analysis of Slope Stability for Progressive Failure (진행성 파괴에 대한 사면안정의 확률론적 해석)

  • 김영수
    • Geotechnical Engineering
    • /
    • v.4 no.2
    • /
    • pp.5-14
    • /
    • 1988
  • A probabilistic model for progressive failure in a homogeneous soil slope consisting of strain-softening material is presented. The local safety margin of any slice above the failure surface is assumed to follow a normal distribution, and uncertainties in the shear strength along the potential failure surface are expressed by one-dimensional random field models. In this paper, only the case where failure initiates at the toe and propagates up to the crest is considered. The joint distribution of the safety margins of any two adjacent slices above the failure surface is assumed to be bivariate normal, and the overall probability of sliding failure is expressed as a product of the probabilities of a series of conditional events. Finally, the developed procedure is applied in a case study to yield the reliability of a cut slope.

  • PDF
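A hedged sketch of the "product of conditional probabilities" structure described above: with the safety margins of adjacent slices modeled as bivariate normal, the conditional failure probability of each next slice given failure of the previous one follows from a bivariate normal CDF. The means, standard deviations, and correlation below are hypothetical.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

# Hypothetical safety-margin statistics for slices from toe to crest.
means = np.array([0.8, 1.0, 1.2, 1.5])   # E[M_i]
stds = np.array([0.5, 0.5, 0.6, 0.7])    # sigma[M_i]
rho = 0.6                                 # correlation between adjacent slices

# P(progressive failure) = P(M_1 < 0) * prod_i P(M_{i+1} < 0 | M_i < 0)
p_fail = norm.cdf(0.0, loc=means[0], scale=stds[0])
for i in range(len(means) - 1):
    cov = np.array([[stds[i]**2, rho * stds[i] * stds[i+1]],
                    [rho * stds[i] * stds[i+1], stds[i+1]**2]])
    joint = multivariate_normal(mean=means[i:i+2], cov=cov).cdf([0.0, 0.0])
    marginal = norm.cdf(0.0, loc=means[i], scale=stds[i])
    p_fail *= joint / marginal            # P(M_{i+1} < 0 | M_i < 0)

print(f"Probability of progressive (toe-to-crest) failure: {p_fail:.3e}")
```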

QUALITY IMPROVEMENT OF COMPRESSED COLOR IMAGES USING A PROBABILISTIC APPROACH

  • Takao, Nobuteru;Haraguchi, Shun;Noda, Hideki;Niimi, Michiharu
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2009.01a
    • /
    • pp.520-524
    • /
    • 2009
  • In compressed color images, colors are usually represented by luminance and chrominance (YCbCr) components. Considering the characteristics of the human visual system, the chrominance (CbCr) components are generally represented more coarsely than the luminance component. Aiming at recovering the chrominance components, we propose a model-based chrominance estimation algorithm in which color images are modeled by a Markov random field (MRF). A simple MRF model is used whose local conditional probability density function (pdf) for the color vector of a pixel is a Gaussian pdf depending on the color vectors of its neighboring pixels. The chrominance components of a pixel are estimated by maximizing the conditional pdf given its luminance component and its neighboring color vectors. Experimental results show that the proposed chrominance estimation algorithm is effective for quality improvement of compressed color images such as JPEG and JPEG2000.

  • PDF
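A rough sketch of the flavor of estimation described: under a simple Gaussian MRF whose local conditional pdf is centered on the mean of a pixel's 4-neighbors, maximizing that conditional pdf reduces to an iterative neighbor-averaging (ICM-style) update of a chrominance channel. This ignores the conditioning on luminance and is an assumption-laden illustration, not the authors' exact algorithm.

```python
import numpy as np

def smooth_chrominance(cb, n_iters=3):
    """Iteratively move each pixel's chrominance toward the mean of its
    4-neighbors, the maximizer of a Gaussian conditional pdf centered on
    that neighborhood mean (simple Gaussian MRF assumption)."""
    x = cb.astype(float).copy()
    for _ in range(n_iters):
        neighbor_mean = (np.roll(x, 1, axis=0) + np.roll(x, -1, axis=0) +
                         np.roll(x, 1, axis=1) + np.roll(x, -1, axis=1)) / 4.0
        x = neighbor_mean
    return x

# Hypothetical coarsely quantized Cb channel (e.g., after chroma subsampling).
cb_coarse = np.kron(np.random.randint(100, 156, size=(4, 4)), np.ones((2, 2)))
print(smooth_chrominance(cb_coarse).round(1))
```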

A Study of Anomaly Detection for ICT Infrastructure using Conditional Multimodal Autoencoder (ICT 인프라 이상탐지를 위한 조건부 멀티모달 오토인코더에 관한 연구)

  • Shin, Byungjin;Lee, Jonghoon;Han, Sangjin;Park, Choong-Shik
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.3
    • /
    • pp.57-73
    • /
    • 2021
  • Maintenance and failure prevention through anomaly detection of ICT infrastructure is becoming important. System monitoring data is multidimensional time series data, which is difficult to handle because the characteristics of multidimensional data and of time series data must both be considered. For multidimensional data, the correlation between variables should be taken into account; existing probability-based, linear, and distance-based methods degrade because of the curse of dimensionality. Time series data, in turn, is usually preprocessed with sliding windows and time series decomposition for autocorrelation analysis, but these techniques increase the dimension of the data and therefore need to be supplemented. Anomaly detection is an old research field: statistical methods and regression analysis were used early on, and machine learning and artificial neural network techniques are now actively being applied. Statistically based methods are difficult to apply when the data is non-homogeneous and do not detect local outliers well. Regression-based methods learn a regression formula under parametric statistics and flag anomalies by comparing predicted and actual values; their performance drops when the model is not solid or when the data contains noise or outliers, and they require training data free of noise and outliers. The autoencoder, an artificial neural network trained to reproduce its input as closely as possible, has many advantages over probability-based and linear models, cluster analysis, and supervised learning: it can be applied to data that does not satisfy a probability distribution or linearity assumption, and it can be trained without labeled data. However, it still has limitations in identifying local outliers in multidimensional data, and the dimension of the data grows greatly because of the time series characteristics. In this study, we propose a CMAE (Conditional Multimodal Autoencoder) that improves anomaly detection performance by considering local outliers and time series characteristics. First, a Multimodal Autoencoder (MAE) is applied to address the limitations in identifying local outliers in multidimensional data; multimodal networks are commonly used to learn different types of inputs, such as voice and images, and the different modals share the autoencoder bottleneck and learn their correlations. In addition, a Conditional Autoencoder (CAE) is used to learn the characteristics of time series data effectively without increasing the dimension of the data; conditional inputs are usually categorical variables, but in this study time is used as the condition so that periodicity can be learned.
    The proposed CMAE model was verified by comparison with a Unimodal Autoencoder (UAE) and a Multimodal Autoencoder (MAE). The reconstruction performance of the three autoencoders was checked for 41 variables. Reconstruction quality differs by variable; the loss values for the Memory, Disk, and Network modals are small in all three models, so these variables are reconstructed well. The Process modal showed no significant difference across the three models, while the CPU modal showed excellent performance in CMAE. ROC curves were prepared to evaluate anomaly detection performance, and AUC, accuracy, precision, recall, and F1-score were compared. On all indicators, the ranking was CMAE, MAE, then UAE. In particular, the recall of CMAE was 0.9828, showing that it detects almost all anomalies; its accuracy was 87.12% and its F1-score 0.8883, which is considered suitable for anomaly detection. In practical terms, the proposed model has advantages beyond the performance improvement: techniques such as time series decomposition and sliding windows add procedures that must be managed, and their dimensional increase can slow inference, whereas the proposed model is easy to apply to practical tasks in terms of inference speed and model management.
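A hedged PyTorch sketch of a conditional multimodal autoencoder of the kind described: one encoder per modal feeding a shared bottleneck, with a time condition concatenated at both encoder and decoder, and reconstruction error used as the anomaly score. The layer sizes, two-modal split, and sine/cosine time encoding are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class CMAE(nn.Module):
    """Conditional multimodal autoencoder sketch: per-modal encoders,
    a shared bottleneck, and a time condition fed to encoder and decoder."""

    def __init__(self, dim_a=8, dim_b=8, cond_dim=2, latent_dim=4):
        super().__init__()
        self.enc_a = nn.Sequential(nn.Linear(dim_a + cond_dim, 16), nn.ReLU())
        self.enc_b = nn.Sequential(nn.Linear(dim_b + cond_dim, 16), nn.ReLU())
        self.bottleneck = nn.Linear(32, latent_dim)          # shared bottleneck
        self.dec_a = nn.Sequential(nn.Linear(latent_dim + cond_dim, 16),
                                   nn.ReLU(), nn.Linear(16, dim_a))
        self.dec_b = nn.Sequential(nn.Linear(latent_dim + cond_dim, 16),
                                   nn.ReLU(), nn.Linear(16, dim_b))

    def forward(self, x_a, x_b, cond):
        h = torch.cat([self.enc_a(torch.cat([x_a, cond], dim=1)),
                       self.enc_b(torch.cat([x_b, cond], dim=1))], dim=1)
        z = self.bottleneck(h)
        return (self.dec_a(torch.cat([z, cond], dim=1)),
                self.dec_b(torch.cat([z, cond], dim=1)))

# Hypothetical usage: two monitoring modals (e.g., CPU-like and memory-like
# metrics) plus a cyclic time-of-day condition.
model = CMAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_a, x_b = torch.randn(64, 8), torch.randn(64, 8)
hour = torch.rand(64, 1) * 24
cond = torch.cat([torch.sin(2 * torch.pi * hour / 24),
                  torch.cos(2 * torch.pi * hour / 24)], dim=1)

for _ in range(5):                        # tiny training loop on dummy data
    rec_a, rec_b = model(x_a, x_b, cond)
    loss = ((rec_a - x_a) ** 2).mean() + ((rec_b - x_b) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Anomaly score per sample = summed reconstruction error of both modals.
score = ((rec_a - x_a) ** 2).mean(dim=1) + ((rec_b - x_b) ** 2).mean(dim=1)
print("anomaly scores:", score[:5].detach().numpy().round(3))
```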

Assessment of Slope Stability With the Uncertainty in Soil Property Characterization (지반성질 불확실성을 고려한 사면안정 해석)

  • 김진만
    • Proceedings of the Korean Geotechical Society Conference
    • /
    • 2003.03a
    • /
    • pp.123-130
    • /
    • 2003
  • The estimation of key soil properties and the subsequent quantitative assessment of the associated uncertainties have always been an important issue in geotechnical engineering. It is well recognized that soil properties vary spatially as a result of depositional and post-depositional processes, and the stochastic nature of spatially varying soil properties can be treated as a random field. A practical statistical approach that can be used to systematically model various sources of uncertainty is presented in the context of reliability analysis of slope stability. Newly developed expressions for the probabilistic characterization of soil properties incorporate sampling and measurement errors, as well as spatial variability and the variance reduction due to spatial averaging. Reliability analyses of the probability of slope failure using the different statistical representations of soil properties show that incorporating spatial correlation and conditional simulation leads to a significantly lower probability of failure than that obtained using a simple random variable approach.

  • PDF
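A small Monte Carlo sketch of the effect reported above: giving the averaged shear strength along the slip surface a variance reduced by spatial averaging yields a lower probability of failure than treating it as a single random variable with the point variance. The safety-factor model, parameter values, and variance reduction factor are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical lognormal shear strength (point statistics) and a fixed demand.
mean_su, cov_su = 50.0, 0.30            # kPa, coefficient of variation
demand = 32.0                            # kPa, driving shear stress

def p_failure(cov):
    sigma_ln = np.sqrt(np.log(1 + cov**2))
    mu_ln = np.log(mean_su) - 0.5 * sigma_ln**2
    su = rng.lognormal(mu_ln, sigma_ln, n)
    return np.mean(su / demand < 1.0)    # failure when factor of safety < 1

gamma2 = 0.4  # hypothetical variance reduction factor from spatial averaging
print("Pf, point variance (simple random variable):", p_failure(cov_su))
print("Pf, averaged variance (spatial averaging):  ", p_failure(cov_su * np.sqrt(gamma2)))
```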

The Characteristics of Wave Statistical Data and Quality Assurance (파랑 통계자료의 특성과 신뢰성 검토)

  • Park, J.H.
    • Journal of Power System Engineering
    • /
    • v.13 no.2
    • /
    • pp.63-70
    • /
    • 2009
  • This paper discusses the influence on long-term predictions of ship response in the ocean of using the Global Wave Statistics (GWS) data and wave information from remote sensing satellites. It is suggested that GWS's standard scatter diagrams of significant wave height and zero-crossing wave period be corrected to a round number of 0.01/1000, fitted with a statistical model based on a conditional lognormal distribution for the zero-crossing wave period. The GEOSAT satellite data presented by I. R. Young and G. J. Holland (1996; referred to as the GEOSAT data) are utilized. First, the quality of these data is investigated, and their statistical characteristics are studied by fitting known probability distribution functions. The GEOSAT wave height data are compared with data observed on board merchant ships, data from measuring instruments installed on an ocean-going container ship, and so on. To carry out a long-term prediction of ship response, joint probability functions of wave height and wave period are introduced, and long-term statistical predictions are performed using these functions.

  • PDF
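A hedged sketch of the joint probability structure mentioned above: the joint density of significant wave height and zero-crossing period factored as a marginal for the wave height times a conditional lognormal for the period given that height. The choice of a Weibull marginal, the parameter values, and their dependence on wave height are illustrative assumptions.

```python
import numpy as np
from scipy.stats import lognorm, weibull_min

# Hypothetical marginal for significant wave height Hs: a Weibull distribution.
hs_dist = weibull_min(c=1.5, scale=2.5)          # shape, scale in metres

def tz_given_hs(hs):
    """Conditional lognormal for zero-crossing period Tz given Hs;
    mean log-period assumed to grow slowly with Hs (illustrative)."""
    mu = np.log(5.0 + 1.2 * np.sqrt(hs))         # log-scale location
    sigma = 0.15                                  # log-scale spread
    return lognorm(s=sigma, scale=np.exp(mu))

# Joint density f(Hs, Tz) = f(Hs) * f(Tz | Hs), evaluated on a small grid.
hs_grid = np.linspace(0.5, 10.0, 5)
tz_grid = np.linspace(4.0, 14.0, 5)
joint = np.array([[hs_dist.pdf(h) * tz_given_hs(h).pdf(t) for t in tz_grid]
                  for h in hs_grid])
print(joint.round(4))
```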