• Title/Summary/Keyword: lognormal distribution model

Adaptive Iterative Despeckling of SAR Imagery

  • Lee, Sang-Hoon
    • Korean Journal of Remote Sensing / v.23 no.5 / pp.455-464 / 2007
  • Lee (2007) suggested the Point-Jacobian iteration MAP estimation (PJIMAP) for removing multiplicative speckle noise from images. It finds a MAP estimate of noise-free imagery based on a Bayesian model using the lognormal distribution for image intensity and an MRF for image texture. When the image intensity is logarithmically transformed, the speckle noise becomes approximately additive Gaussian noise, and it tends to a normal distribution much faster than the intensity distribution. The MRF is incorporated into digital image analysis by viewing pixel types as states of molecules in a lattice-like physical system. In this study, the MAP estimation is computed by the Point-Jacobian iteration using adaptive parameters. At each iteration, the parameters related to the Bayesian model are adaptively estimated using the updated information. The results of the proposed scheme were compared to those of PJIMAP with SAR simulation data generated by the Monte Carlo method. The experiments demonstrated an improvement in suppressing speckle noise and estimating the noise-free intensity when adaptive parameters are used for the Point-Jacobian iteration.
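The log-transform argument in this abstract can be illustrated numerically. The sketch below is not the authors' PJIMAP implementation; it simply models single-look multiplicative speckle as unit-mean gamma noise and compares the skewness of the intensity and log-intensity distributions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Noise-free intensity (a constant patch) corrupted by multiplicative
# single-look speckle, modeled here as unit-mean gamma (exponential) noise.
true_intensity = 100.0
speckle = rng.gamma(shape=1.0, scale=1.0, size=100_000)
noisy = true_intensity * speckle

# After the log transform the noise is additive, and the log-domain
# distribution is far less skewed (closer to Gaussian) than the
# heavily right-skewed intensity distribution.
skew_intensity = stats.skew(noisy)
skew_log = stats.skew(np.log(noisy))
print(skew_intensity, skew_log)
```

The reduced skewness in the log domain is what makes the Gaussian-additive approximation in the abstract reasonable.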

A Study on Cost Rate Analysis Methodology of Credit Card Value Proposition (신용카드 부가서비스 요율 분석 방법론에 대한 연구)

  • Lee, Chan-Kyung;Roh, Hyung-Bong
    • Journal of Korean Society for Quality Management / v.46 no.4 / pp.797-820 / 2018
  • Purpose: To seek an appropriate cost rate analysis methodology for credit card value propositions in Korea. For this issue, it is claimed that methodologies based on probability distributions are more suitable than methodologies based on data mining. The analysis model constructed for the cost rate estimation is called the VCPM model. Methods: The model includes two major variables denoted as S and P. S is the monthly credit card usage amount. P stands for the proportion of usage at special merchants over the whole monthly usage amount. The distributions assumed for P are positively skewed distributions such as the exponential, gamma and lognormal. The major inputs to the model are also derived from S and P, namely E(S) and the aggregate proportion of usage at special merchants over the total monthly usage amount. Results: When the credit card's value proposition is a general discount, the VCPM model fits well and generates a reasonable cost rate (denoted as R). However, the model does not seem to work well for other types of credit cards. Conclusion: The VCPM model is reliable for calculating the cost rate for credit cards with a positively skewed distribution of P, i.e., general discount cards. However, another model should be built for cards with other types of distributions of P.
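The distributional choice for P can be sketched as follows. The sample below is hypothetical (drawn from a lognormal law, mostly in (0, 1)), and the three positively skewed candidates named in the abstract are compared by maximized log-likelihood:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical sample of P: the proportion of monthly card usage at
# special merchants (positively skewed, values mostly in (0, 1)).
p_sample = rng.lognormal(mean=-2.5, sigma=0.6, size=5000)

# Fit each candidate with location fixed at zero and compare the
# resulting log-likelihoods (higher is better).
fits = {}
for name, dist in [("expon", stats.expon),
                   ("gamma", stats.gamma),
                   ("lognorm", stats.lognorm)]:
    params = dist.fit(p_sample, floc=0)
    fits[name] = np.sum(dist.logpdf(p_sample, *params))

best = max(fits, key=fits.get)
print(best)
```

With genuinely lognormal data the lognormal fit wins; on real card data the same comparison would select whichever skewed family fits best.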

Analysis on the Relationship between Number of Species and Survey Area of Benthic Macroinvertebrates Using Weibull Distribution Function (와이블 분포함수를 이용한 저서성 대형무척추동물의 종수-조사면적 관계 해석)

  • Kong, Dongsoo;Kim, Ah Reum
    • Journal of Korean Society on Water Environment / v.31 no.2 / pp.142-150 / 2015
  • The relationship between the number of benthic macroinvertebrate species and the accumulated survey area was investigated in a clean stream and an impaired stream in Korea. Five models characterizing species-area functions were compared, and the Weibull model fitted the species-area data well. The other models (Arrhenius, Romell-Gleason, Kylin and lognormal) showed small or notable bias. The maximum number of species and the half-saturation area derived from the Weibull model may be used as indicators of the carrying capacity and the habitat complexity, respectively.
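The Weibull species-area model and its half-saturation area can be sketched as below; the observations and parameter values are hypothetical, not the paper's data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Weibull species-area model: S(A) = S_max * (1 - exp(-(A/k)^c)).
# S_max is the asymptotic species number; the half-saturation area is
# the A at which S(A) = S_max / 2, i.e. A = k * ln(2)^(1/c).
def weibull_sa(area, s_max, k, c):
    return s_max * (1.0 - np.exp(-(area / k) ** c))

# Hypothetical species counts over doubling survey areas.
areas = np.array([1, 2, 4, 8, 16, 32, 64, 128], dtype=float)
species = weibull_sa(areas, 40.0, 10.0, 1.2)

popt, _ = curve_fit(weibull_sa, areas, species, p0=[30.0, 5.0, 1.0])
s_max, k, c = popt
half_sat_area = k * np.log(2.0) ** (1.0 / c)
print(round(s_max, 1), round(half_sat_area, 2))
```

The two derived quantities printed here are exactly the indicators the abstract proposes: the carrying capacity (S_max) and a habitat-complexity proxy (half-saturation area).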

Model for Process Quality Assurance When the Fraction Nonconforming is Very Small (극소불량 공정보증을 위한 모형연구)

  • Jong-Gurl Kim
    • Proceedings of the Safety Management and Science Conference / 1999.11a / pp.247-257 / 1999
  • There are several models for process quality assurance: the quality system (ISO 9000), process capability analysis, the acceptance control chart, and so on. When a high level of process capability has been achieved, it takes a long time to detect a process shift, so a quicker monitoring system is sometimes necessary. To achieve a quicker quality assurance model for high-reliability processes, this paper presents a model for process quality assurance when the fraction nonconforming is very small. We design an acceptance control chart based on a variable quality characteristic and time-censored accelerated testing. The distribution of the characteristic is assumed to be normal or lognormal, with a location parameter that is a linear function of the stress. The design parameters are the sample size, the control limits and the sample proportions allocated to low stress. These parameters are obtained by minimizing the relative variance of the MLE of the location parameter subject to APL and RPL constraints.
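A textbook one-sided acceptance control chart under a normal characteristic with known sigma illustrates how APL/RPL constraints fix the sample size and control limit. This is a simplification of the paper's accelerated-testing design, and all numerical values below are hypothetical:

```python
import math
from scipy.stats import norm

# Upper-sided acceptance control chart: accept at the APL with risk
# alpha and reject at the RPL with risk beta (hypothetical values).
apl, rpl = 10.0, 10.5    # acceptable / rejectable process levels (means)
sigma = 0.3              # known process standard deviation
alpha, beta = 0.05, 0.10

z_a, z_b = norm.ppf(1 - alpha), norm.ppf(1 - beta)

# Smallest sample size meeting both risk constraints, then the
# acceptance control limit placed alpha above the APL.
n = math.ceil(((z_a + z_b) * sigma / (rpl - apl)) ** 2)
acl = apl + z_a * sigma / math.sqrt(n)
print(n, round(acl, 3))
```

The paper's actual design replaces the known-sigma normal assumption with a censored-data MLE under stress, but the APL/RPL logic is the same.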

Statistical Analysis of Electrical Tree Inception Voltage, Breakdown Voltage and Tree Breakdown Time Data of Unsaturated Polyester Resin

  • Ahmad, Mohd Hafizi;Bashir, Nouruddeen;Ahmad, Hussein;Piah, Mohamed Afendi Mohamed;Abdul-Malek, Zulkurnain;Yusof, Fadhilah
    • Journal of Electrical Engineering and Technology / v.8 no.4 / pp.840-849 / 2013
  • This paper presents a statistical approach to analyzing the electrical tree inception voltage, electrical tree breakdown voltage and tree breakdown time of unsaturated polyester resin subjected to AC voltage. The aim of this work was to show that the Weibull and lognormal distributions may not be the most suitable for analyzing electrical treeing data. An investigation of the statistical distributions of electrical tree inception voltage, electrical tree breakdown voltage and breakdown time data was performed on 108 leaf-like specimen samples. The test results showed that the Johnson SB distribution is the best fit for the electrical tree inception voltage and tree breakdown time data, while the electrical tree breakdown voltage data is best fitted by the Wakeby distribution. The fitting was performed by means of the Anderson-Darling (AD) goodness-of-fit (GOF) test. Based on the fitting results, the Johnson SB and Wakeby distributions exhibited the lowest error values, respectively, compared to the Weibull and lognormal distributions.
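An Anderson-Darling comparison of fitted candidate distributions can be sketched as follows. The data below are synthetic, not the treeing measurements, and only the Weibull and lognormal candidates are compared (the Wakeby distribution is not available in scipy.stats):

```python
import numpy as np
from scipy import stats

def anderson_darling(data, dist, params):
    """Anderson-Darling statistic for a fitted continuous distribution."""
    x = np.sort(data)
    n = len(x)
    cdf = np.clip(dist.cdf(x, *params), 1e-12, 1 - 1e-12)  # guard the logs
    i = np.arange(1, n + 1)
    return -n - np.mean((2 * i - 1) * (np.log(cdf) + np.log(1 - cdf[::-1])))

rng = np.random.default_rng(2)
# Hypothetical breakdown-voltage sample drawn from a Weibull law (kV).
sample = stats.weibull_min.rvs(2.5, scale=30.0, size=400, random_state=rng)

scores = {}
for name, dist in [("weibull", stats.weibull_min), ("lognorm", stats.lognorm)]:
    params = dist.fit(sample, floc=0)
    scores[name] = anderson_darling(sample, dist, params)

# Lower AD statistic = better fit.
print(min(scores, key=scores.get))
```

The same statistic, computed for each candidate family, is what ranks Johnson SB and Wakeby above Weibull and lognormal in the paper.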

Determination of the Fracture Hydraulic Parameters for Three Dimensional Discrete Fracture Network Modeling (3차원 단열망모델링을 위한 단열수리인자 도출)

  • 김경수;김천수;배대석;김원영;최영섭;김중렬
    • Journal of the Korean Society of Groundwater Environment / v.5 no.2 / pp.80-87 / 1998
  • Since groundwater flow paths play a major role in transporting radioactive nuclides from a radioactive waste repository to the biosphere, the discrete fracture network model is used for rock-block-scale flow instead of the porous continuum model. This study aims to construct a three-dimensional discrete fracture network to interpret the groundwater flow system at the study site. The modeling work includes determining the probabilistic distribution functions of the fracture geometric and hydraulic parameters, three-dimensional fracture modeling and model calibration. The results of constant-pressure tests performed over a fixed interval length in boreholes indicate that the flow dimension around the boreholes shows a mainly radial to spherical flow pattern. The fracture transmissivity calculated by the cubic law is 6.12×10⁻⁷ m²/sec with a lognormal distribution. The conductive fracture intensity estimated by the FracMan code is 1.73. Based on this intensity, the total number of conductive fractures is estimated as 3,080 in a rock block of 100 m × 100 m × 100 m.
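The cubic law mentioned in the abstract links fracture transmissivity to an equivalent hydraulic aperture. A minimal sketch using the abstract's transmissivity value and standard water properties (the property values are assumptions, not from the paper):

```python
# Cubic law: T = (rho * g * b^3) / (12 * mu), so the equivalent
# hydraulic aperture is b = (12 * mu * T / (rho * g)) ** (1/3).
rho = 998.0      # water density, kg/m^3 (assumed, ~20 degC)
g = 9.81         # gravitational acceleration, m/s^2
mu = 1.0e-3      # dynamic viscosity of water, Pa*s (assumed)

T = 6.12e-7      # fracture transmissivity from the abstract, m^2/s
b = (12.0 * mu * T / (rho * g)) ** (1.0 / 3.0)
print(b)         # equivalent aperture in meters (~tens of microns)
```

An aperture on the order of 10⁻⁴ m is typical of the conductive fractures such borehole tests resolve.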

Comparative Studies on the Simulation for the Monthly Runoff (월유출량의 모의발생에 관한 비교 연구)

  • 박명근;서승덕;이순혁;맹승진
    • Magazine of the Korean Society of Agricultural Engineers / v.38 no.4 / pp.110-124 / 1996
  • This study was conducted to simulate long series of synthetic monthly flows by a multi-season first-order Markov model with selection of the best-fitting frequency distribution, and by harmonic synthetic and harmonic regression models, and to compare statistical parameters between observed and synthetic flows of five watersheds in the Geum river system. The results obtained through this study can be summarized as follows. 1. Both the gamma and two-parameter lognormal distributions were found suitable for monthly flows in all watersheds by the Kolmogorov-Smirnov test. 2. The arithmetic means of synthetic monthly flows simulated by the multi-season first-order Markov model with the gamma distribution were much closer to those of the observed data than those of the other models in the applied watersheds. 3. The coefficients of variation and the index of fluctuation of monthly flows simulated by the multi-season first-order Markov model with the gamma distribution were also closer to those of the observed data than those of the other models in the Geum river system. 4. Synthetic monthly flows were simulated over 100 years by the multi-season first-order Markov model with the gamma distribution, which was found to be a suitable simulation model in this study.
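A multi-season first-order Markov (Thomas-Fiering type) generator can be sketched as follows. The monthly statistics are hypothetical, and normal innovations are used here instead of the gamma marginals selected in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical monthly statistics (12 months): mean flow, standard
# deviation, and lag-1 correlation between consecutive months.
means = np.array([30, 35, 50, 80, 120, 200, 350, 300, 150, 80, 50, 35.0])
stds = 0.4 * means
r = np.full(12, 0.5)

def generate_monthly_flows(n_years):
    """Multi-season first-order Markov (Thomas-Fiering) generation."""
    flows = []
    x = means[0]
    for _ in range(n_years):
        for m in range(12):
            nxt = (m + 1) % 12
            eps = rng.standard_normal()
            x = (means[nxt]
                 + r[m] * stds[nxt] / stds[m] * (x - means[m])
                 + eps * stds[nxt] * np.sqrt(1.0 - r[m] ** 2))
            x = max(x, 0.0)  # flows cannot be negative
            flows.append(x)
    return np.array(flows)

sim = generate_monthly_flows(100)   # 100 years, as in the study
print(sim.mean(), means.mean())
```

The season-dependent mean, variance and lag-1 correlation are exactly what the model preserves; swapping the normal innovation for a gamma-distributed one gives the variant the study found best.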

A Rice-lognormal channel model for nongeostationary land mobile satellite systems

  • 황승훈;한규진;안재영;서종수;황금찬
    • The Journal of Korean Institute of Communications and Information Sciences / v.23 no.4 / pp.1113-1120 / 1998
  • This paper introduces a channel model that combines Rice and lognormal statistics, with independent shadowing affecting the direct and diffuse components, respectively. This model extends the combined Rice and lognormal channel model proposed by Corazza to include independent shadowing. The validity of the model is confirmed by comparing it with data collected in the literature, the analytical model, and the computer model in terms of the probability distribution of the envelope of each model. The model reduces to many well-known narrowband models in limiting cases, e.g. the Rayleigh, Rice, lognormal, Suzuki, Loo and Corazza models. Finally, examples of bit error probability evaluations for several values of the elevation angle in the channel are provided.
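Envelope samples from a Rice-lognormal model with independent shadowing on the direct and diffuse components can be generated as in the sketch below; the Rice factor and shadowing spreads are hypothetical, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

n = 200_000
K = 5.0                            # Rice factor: direct/diffuse power ratio
sigma_db1, sigma_db2 = 2.0, 3.0    # independent shadowing spreads, dB

# Independent lognormal (dB-normal) shadowing on each component.
s_direct = 10.0 ** (rng.normal(0.0, sigma_db1, n) / 20.0)
s_diffuse = 10.0 ** (rng.normal(0.0, sigma_db2, n) / 20.0)

# Rice structure: fixed direct ray plus unit-power complex Gaussian
# diffuse part, each multiplied by its own shadowing factor.
direct = np.sqrt(K) * s_direct
diffuse = (rng.normal(0.0, np.sqrt(0.5), n)
           + 1j * rng.normal(0.0, np.sqrt(0.5), n)) * s_diffuse
envelope = np.abs(direct + diffuse)
print(envelope.mean())
```

Setting both shadowing spreads to zero recovers the plain Rice envelope; shadowing only the direct ray recovers a Loo-type model, which is the limiting-case behavior the abstract describes.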

On the Variations of Spatial Correlation Structure of Rainfall (강우공간상관구조의 변동 특성)

  • Kim, Kyoung-Jun;Yoo, Chul-Sang
    • Journal of Korea Water Resources Association / v.40 no.12 / pp.943-956 / 2007
  • Among various statistics, the spatial correlation function, i.e. the "correlogram", is frequently used to evaluate or design rain gauge networks and to model rainfall fields. The spatial correlation structure of rainfall varies significantly due to many factors, and this variation causes serious problems when deciding the spatial correlation function of rainfall within a basin. In this study, the spatial rainfall structure was modeled using bivariate mixed distributions, based on the Gaussian and lognormal distributions, to derive monthly spatial correlograms. The correlograms were derived using hourly data from 28 rain gauge stations in the Geum river basin. From the results, we conclude the following: (1) among the three cases considered (Case A, Case B, Case C), Case A(+,+) seems the most relevant, as it is not distorted much by zero measurements; (2) the spatial correlogram based on the lognormal distribution, which is theoretically as well as practically adequate, is better than that based on the Gaussian distribution; (3) the spatial correlation in July decays exponentially more sharply than in other months; (4) spatial correlograms should be derived considering the temporal resolution (hourly, daily, etc.) of interest.
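An empirical spatial correlogram with an exponential-decay fit can be sketched as follows. The gauge locations, correlation scale and Gaussian field below are all synthetic; the study itself uses mixed bivariate distributions to handle zero-rainfall hours, which this sketch omits:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

# Hypothetical gauge coordinates (km) and a true exponential correlogram.
n_gauges, n_hours = 20, 3000
coords = rng.uniform(0.0, 50.0, size=(n_gauges, 2))
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
true_corr = np.exp(-d / 20.0)   # correlation scale of 20 km

# Simulate a correlated Gaussian (log-rainfall-like) field via Cholesky.
L = np.linalg.cholesky(true_corr + 1e-10 * np.eye(n_gauges))
field = L @ rng.standard_normal((n_gauges, n_hours))

# Empirical correlogram: pairwise correlation vs separation distance,
# fitted with an exponential decay exp(-h / c).
corr = np.corrcoef(field)
iu = np.triu_indices(n_gauges, k=1)
popt, _ = curve_fit(lambda h, c: np.exp(-h / c), d[iu], corr[iu], p0=[10.0])
print(popt[0])   # recovered correlation scale, km
```

Fitting the same curve month by month is what exposes the seasonal variation (e.g. the sharper July decay) the study reports.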

Performance Analysis of Economic VaR Estimation using Risk Neutral Probability Distributions

  • Heo, Se-Jeong;Yeo, Sung-Chil;Kang, Tae-Hun
    • The Korean Journal of Applied Statistics / v.25 no.5 / pp.757-773 / 2012
  • The traditional value at risk (S-VaR) has difficulty predicting the future risk of financial asset prices, since S-VaR is a backward-looking measure based on historical data of the underlying asset prices. To resolve this deficiency, an economic value at risk (E-VaR) using risk-neutral probability distributions is suggested, since E-VaR is a forward-looking measure based on option price data. In this study, E-VaR is estimated by assuming the generalized gamma distribution (GGD) as the risk-neutral density function implied in the option. The E-VaR estimated with the GGD was compared with E-VaR estimates under the Black-Scholes model, a two-lognormal mixture distribution and the generalized extreme value distribution, and with S-VaR estimates under the normal distribution and a GARCH(1, 1) model, respectively. Option market data on the KOSPI 200 index are used to compare the performance of these VaR estimates. The results of the empirical analysis show that the GGD tends to estimate VaR conservatively; however, the GGD is superior to the other models in the overall sense.
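In the Black-Scholes benchmark mentioned above, the risk-neutral distribution of the terminal price is lognormal, so the E-VaR quantile has a closed form. The sketch below uses hypothetical parameter values, not the KOSPI 200 calibration:

```python
import numpy as np
from scipy.stats import norm

# Black-Scholes risk-neutral case: ln(S_T) is normal with mean
# ln(S_0) + (r - sigma^2/2) * T and variance sigma^2 * T, so the
# alpha-quantile of S_T is available in closed form.
s0, r, sigma = 100.0, 0.03, 0.25   # spot, risk-free rate, implied vol
T, alpha = 10 / 252, 0.05          # 10-day horizon, 5% level

mu = np.log(s0) + (r - 0.5 * sigma ** 2) * T
q_alpha = np.exp(mu + sigma * np.sqrt(T) * norm.ppf(alpha))

# E-VaR of a long position: the loss down to the alpha-quantile of
# the risk-neutral terminal-price distribution.
e_var = s0 - q_alpha
print(round(e_var, 2))
```

Replacing the lognormal density with a GGD fitted to option prices gives the study's preferred estimator; the quantile step is then numerical rather than closed-form.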