• Title/Summary/Keyword: stochastic problem

Search Results: 534

A Study on Improvement in Digital Image Restoration by a Recursive Vector Processing (순환벡터처리에 의한 디지털 영상복원에 관한 연구)

  • 이대영;이윤현
    • The Journal of Korean Institute of Communications and Information Sciences / v.8 no.3 / pp.105-112 / 1983
  • This paper describes a recursive restoration technique for images degraded by linear space-invariant blur and additive white Gaussian noise. The image is characterized statistically by its mean and correlation function, and an exponential autocorrelation function is used for the neighborhood model. A vector model was adopted for its analytical simplicity and its ability to incorporate the brightness correlation function. Based on the vector model, a two-dimensional discrete stochastic 12-point neighborhood model for representing images was developed, and a moving-window processing technique was used to restore blurred and noisy images without increasing the dimensionality. The 12-point neighborhood model was found to be more adequate than an 8-point pixel model for obtaining optimum pixel estimates. If the image is highly correlated, it is necessary to use a large number of points in the neighborhood to improve the restored image. These results should be applicable to a wide range of image processing problems, since image processing techniques normally require 2-D linear filtering.


Iterative Bispectrum Estimation and Signal Recovery Based On Weighted Regularization (가중 정규화에 기반한 반복적 바이스펙트럼 추정과 신호복원)

  • Lim, Won-Bae;Hur, Bong-Soo;Lee, Hak-Moo;Kang, Moon-Gi
    • Journal of the Institute of Electronics Engineers of Korea SP / v.37 no.3 / pp.98-109 / 2000
  • While the bispectrum has desirable properties in itself and therefore considerable potential for signal and image restoration, few real-world application results have appeared in the literature. The major obstacle is the difficulty of realizing the expectation operator of the true bispectrum, due to the lack of realizations. In this paper, the true bispectrum is defined as the expectation of the sample bispectrum, which is the Fourier representation of the triple correlation given one realization. The characteristics of the sample bispectrum are analyzed, and a way to obtain an estimate of the true bispectrum without stochastic expectation, using the generalized theory of weighted regularization, is shown. The bispectrum estimated by the proposed algorithm is experimentally demonstrated to be useful for signal recovery under blurred and noisy conditions.

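The sample bispectrum the abstract builds on — the Fourier-domain form of the triple correlation computed from a single realization — can be sketched as follows. This is a minimal 1-D NumPy illustration under assumed names; the paper's weighted-regularization estimation step is not reproduced here.

```python
import numpy as np

def sample_bispectrum(x):
    """Sample bispectrum of a single realization x (1-D):
    B(f1, f2) = X(f1) * X(f2) * conj(X(f1 + f2)),
    the Fourier-domain form of the triple correlation."""
    X = np.fft.fft(np.asarray(x, dtype=float))
    n = len(X)
    f1, f2 = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return X[f1] * X[f2] * np.conj(X[(f1 + f2) % n])
```

Averaging many such sample bispectra approximates the expectation operator; the paper's contribution is obtaining a usable estimate without that averaging.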

An Effective Estimation method for Lexical Probabilities in Korean Lexical Disambiguation (한국어 어휘 중의성 해소에서 어휘 확률에 대한 효과적인 평가 방법)

  • Lee, Ha-Gyu
    • The Transactions of the Korea Information Processing Society / v.3 no.6 / pp.1588-1597 / 1996
  • This paper describes an estimation method for lexical probabilities in Korean lexical disambiguation. In the stochastic approach to lexical disambiguation, lexical probabilities and contextual probabilities are generally estimated from statistical data extracted from corpora. For Korean, it is desirable to apply lexical probabilities in terms of word phrases, because sentences are spaced in units of word phrases. However, Korean word phrases are so multiform that lexical probabilities often cannot be estimated directly in terms of word phrases, even when fairly large corpora are used. To overcome this problem, a similarity measure for word phrases is defined from the lexical-analysis point of view, and an estimation method for Korean lexical probabilities based on this similarity is proposed. In this method, when a lexical probability for a word phrase cannot be estimated directly, it is estimated indirectly through a word phrase similar to the given one. Experimental results show that the proposed approach is effective for Korean lexical disambiguation.

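The back-off idea — estimating the lexical probability of an unseen word phrase through a similar observed one — can be sketched as below. The suffix-based similarity and all names here are illustrative assumptions, not the paper's actual lexical-analysis-based measure.

```python
def suffix_sim(a, b):
    """Toy similarity: length of the common suffix (Korean word
    phrases often share functional-morpheme endings)."""
    n = 0
    while n < min(len(a), len(b)) and a[-1 - n] == b[-1 - n]:
        n += 1
    return n

def lexical_prob(phrase, tag, counts, similarity):
    """P(tag | phrase) from corpus counts; if the phrase is unseen,
    back off to the most similar observed phrase."""
    if phrase in counts:
        tag_counts = counts[phrase]
    else:
        nearest = max(counts, key=lambda p: similarity(phrase, p))
        tag_counts = counts[nearest]
    return tag_counts.get(tag, 0) / sum(tag_counts.values())
```

For example, an unseen phrase ending in the same particle as an observed one would inherit that phrase's tag distribution.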

A Simple Stereo Matching Algorithm using PBIL and its Alternative (PBIL을 이용한 소형 스테레오 정합 및 대안 알고리즘)

  • Han Kyu-Phil
    • The KIPS Transactions:PartB / v.12B no.4 s.100 / pp.429-436 / 2005
  • A simple stereo matching algorithm using population-based incremental learning (PBIL) is proposed in this paper to reduce the general problems of genetic algorithms, such as memory consumption and search inefficiency. PBIL is a variant of genetic algorithms that uses stochastic search and competitive learning based on a probability vector. Because of this probability vector, the structure of PBIL is simpler than that of other members of the genetic algorithm family, such as serial and parallel ones. The PBIL strategy is simplified and adapted to the stereo matching setting: the gene pool, chromosome crossover, and gene mutation are removed, while the evolution rule, that fitter chromosomes should have higher survival probabilities, is preserved. As a result, memory space is decreased, matching rules are simplified, and computation cost is reduced. In addition, a scheme controlling the distance of neighbors for disparity smoothness is inserted to obtain wide-area consistency of disparities, like the result of coarse-to-fine matchers. Because of this scheme, the proposed algorithm can produce a stable disparity map with a small fixed-size window. Finally, an alternative version of the proposed algorithm without the probability vector is also presented for simpler set-ups.
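The simplified PBIL loop described above — sample a population from a probability vector, then nudge the vector toward the fittest sample, with no gene pool, crossover, or mutation — can be sketched on a toy problem. OneMax, the function name, and all parameter values are illustrative assumptions, not the paper's stereo matching fitness.

```python
import random

def pbil_onemax(n_bits=16, pop=20, lr=0.1, iters=200, seed=1):
    """Minimal PBIL sketch on the OneMax toy problem: maintain a
    probability vector, sample a population of bit strings, and move
    the vector toward the fittest sample."""
    rng = random.Random(seed)
    p = [0.5] * n_bits                        # probability vector
    for _ in range(iters):
        samples = [[1 if rng.random() < pi else 0 for pi in p]
                   for _ in range(pop)]
        best = max(samples, key=sum)          # fitness = number of ones
        p = [pi + lr * (bi - pi) for pi, bi in zip(p, best)]
    return p
```

In a stereo matcher the bit string would encode candidate disparities and the fitness would be a window-based matching cost, but the update rule is the same.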

The application of reliability analysis for the design of storm sewer (우수관의 설계를 위한 신뢰성해석기법의 적용)

  • Kwon, Hyuk Jaea;Lee, Kyung Je
    • Journal of Korea Water Resources Association / v.51 no.10 / pp.887-893 / 2018
  • In this study, an optimum design methodology using reliability analysis is suggested. Urban flood inundation now occurs easily because of localized heavy rain, and the traditional deterministic design method for storm sewers may underestimate the pipe size. A stochastic method for storm sewer design is therefore necessary. In the present study, a reliability model using FORM (First-Order Reliability Method) was developed for storm sewers and applied to real storm sewers in five different areas. After estimating the return period of rainfall intensity, the probability of exceeding capacity was calculated for the storm sewers of the five areas, and construction costs according to diameter were compared.
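As a hedged illustration of the FORM idea behind such a reliability model: for a linear limit state g = capacity − load with independent normal variables, the reliability index and the probability of exceeding capacity have closed forms. This special case, where FORM is exact, stands in for the paper's storm-sewer limit state; the function name and numbers are assumptions.

```python
import math

def failure_probability(mu_cap, sd_cap, mu_load, sd_load):
    """First-order reliability sketch for g = capacity - load with
    independent normal variables:
    beta = (mu_C - mu_L) / sqrt(sd_C^2 + sd_L^2),  P_f = Phi(-beta)."""
    beta = (mu_cap - mu_load) / math.sqrt(sd_cap ** 2 + sd_load ** 2)
    pf = 0.5 * math.erfc(beta / math.sqrt(2))  # standard normal CDF at -beta
    return beta, pf
```

For nonlinear limit states, FORM iterates to find the design point (e.g. by the Hasofer-Lind scheme) before evaluating Phi(-beta).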

A Study on Development of Automatic Nesting Software by Vectorizing Technique (벡터라이징을 이용한 자동부재배치 소프트웨어 개발에 관한 연구)

  • Lho T.J.;Kang D.J.;Kim M.S.;Park Jun-Yeong;Park S.W.
    • Proceedings of the Korean Society of Precision Engineering Conference / 2005.10a / pp.748-753 / 2005
  • Among the processes used to manufacture parts from footwear materials such as upper leathers, one of the most essential is cutting, which optimally arranges many parts on the raw material before cutting them out. A new nesting strategy is proposed for the 2-dimensional part layout using a two-stage approach, which can be effectively used for water-jet cutting. In the initial layout stage, SOAL (Self-Organization Assisted Layout), based on a combination of FCM (Fuzzy C-Means) and SOM, is adopted. In the layout improvement stage, an SA (Simulated Annealing) based approach is adopted for a finer layout. The proposed approach saves much CPU time through its two-stage scheme, whereas other annealing-based algorithms so far reported for the nesting problem are computationally expensive. The proposed nesting approach uses a stochastic process and has a much higher possibility of obtaining a global solution than deterministic search techniques. The automatic nesting software NST (ver. 1.1) for the footwear industry was developed by implementing the proposed algorithms. NST applies the optimized automatic arrangement algorithm to cut with minimal loss of leather, where possible after detecting damaged areas, and it can account for several features of both natural and artificial leathers. Lastly, NST reduces the time required for NC code generation, the cutting time, and the waste of raw materials, because it automatically performs part arrangement, cutting path generation, and NC code generation, which demand much effort and time when done manually.

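The SA-based layout improvement stage relies on the standard annealing acceptance rule: accept a worse layout with probability exp(-delta/T) so the stochastic search can escape local minima. A generic sketch follows; the cost, neighbor move, schedule, and all parameter values are placeholder assumptions, not the paper's layout representation.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, alpha=0.95,
                        iters=500, seed=0):
    """Generic simulated annealing loop with a geometric cooling
    schedule; returns the best state and cost seen."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        # always accept improvements; accept worse moves with exp(-delta/t)
        if fy < fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha
    return best, fbest
```

In the nesting setting, the state would be a part layout, the neighbor move a small translation/rotation or swap of parts, and the cost the wasted material area.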

Harmonic Estimation of Power Signal Based on Time-varying Optimal Finite Impulse Response Filter (시변 최적 유한 임펄스 응답 필터 기반 전력 신호 고조파 검출)

  • Kwon, Bo-Kyu
    • The Journal of Korean Institute of Information Technology / v.16 no.11 / pp.97-103 / 2018
  • In this paper, an estimation method for power signal harmonics is proposed using a time-varying optimal finite impulse response (FIR) filter. To estimate the magnitude and phase angle of the harmonic components, the time-varying optimal FIR filter is designed for a state-space representation of the noisy power signal in which the magnitude and phase are treated as a stochastic process. Since the time-varying optimal FIR filter does not require any a priori information about the initial condition and has an FIR structure, the proposed method can overcome the demerits of Kalman filter based methods, such as poor estimation and divergence problems. Due to the FIR structure, the proposed method is also more robust against model uncertainty than the Kalman filter, and it gives a more general solution than harmonic estimation based on the time-invariant optimal FIR filter. To verify the performance and robustness of the proposed method, it is compared with a time-varying Kalman filter based method through simulation.
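A minimal finite-window least-squares fit conveys the core task: recovering harmonic magnitudes and phase angles from a batch of noisy samples, using only a finite window of data as an FIR-structured estimator does. This is only a batch sketch under assumed names and parameters, not the paper's time-varying optimal FIR filter.

```python
import numpy as np

def estimate_harmonics(y, f0, fs, n_harm):
    """Least-squares fit of cos/sin pairs at harmonics of f0 over a
    finite window of samples y taken at rate fs; returns magnitudes
    and phases such that harmonic k is
    mag[k-1] * cos(2*pi*k*f0*t + phase[k-1])."""
    t = np.arange(len(y)) / fs
    cols = []
    for k in range(1, n_harm + 1):
        cols.append(np.cos(2 * np.pi * k * f0 * t))
        cols.append(np.sin(2 * np.pi * k * f0 * t))
    A = np.stack(cols, axis=1)
    theta, *_ = np.linalg.lstsq(A, np.asarray(y, dtype=float), rcond=None)
    a, b = theta[0::2], theta[1::2]
    return np.hypot(a, b), np.arctan2(-b, a)
```

Sliding this window along the signal yields a crude time-varying harmonic tracker; the optimal FIR filter additionally weights the window to minimize the estimation error variance.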

Mid-Term Energy Demand Forecasting Using Conditional Restricted Boltzmann Machine (조건적 제한된 볼츠만머신을 이용한 중기 전력 수요 예측)

  • Kim, Soo-Hyun;Sun, Young-Ghyu;Lee, Dong-gu;Sim, Is-sac;Hwang, Yu-Min;Kim, Hyun-Soo;Kim, Hyung-suk;Kim, Jin-Young
    • Journal of IKEEE / v.23 no.1 / pp.127-133 / 2019
  • Electric power demand forecasting is one of the important research areas for the introduction of future smart grids. However, accurate prediction is difficult because demand is affected by many external factors, and traditional forecasting methods have been limited because they use raw power data directly. In this paper, a probability-based conditional restricted Boltzmann machine (CRBM) is proposed to solve the problem of electric power demand prediction from raw power data; the stochastic model is well suited to capturing the probabilistic characteristics of electric power data. To assess the mid-term power demand forecasting performance of the proposed model, it was compared with a recurrent neural network (RNN). Performance comparison using electric power data provided by the University of Massachusetts showed that the proposed algorithm yields better performance in mid-term energy demand forecasting.

IMPROVING RELIABILITY OF BRIDGE DETERIORATION MODEL USING GENERATED MISSING CONDITION RATINGS

  • Jung Baeg Son;Jaeho Lee;Michael Blumenstein;Yew-Chaye Loo;Hong Guan;Kriengsak Panuwatwanich
    • International conference on construction engineering and project management / 2009.05a / pp.700-706 / 2009
  • Bridges are vital components of any road network, demanding crucial and timely decision-making for Maintenance, Repair and Rehabilitation (MR&R) activities. Bridge Management Systems (BMSs), as decision support systems (DSS), have been developed since the early 1990s to assist in the management of large bridge networks. Historical condition ratings obtained from biennial bridge inspections are the major resource for predicting future bridge deterioration via BMSs. However, the historical condition ratings available in most bridge agencies are very limited, posing a major barrier to obtaining reliable estimates of future structural performance. To alleviate this problem, the verified Backward Prediction Model (BPM) technique has been developed to help generate missing historical condition ratings. This is achieved by establishing correlations between known condition ratings and non-bridge factors such as climate and environmental conditions, traffic volumes, and population growth; these correlations are then used to obtain the bridge condition ratings of the missing years. With the help of these generated datasets, currently available bridge deterioration models can more reliably forecast future bridge conditions. In this paper, the prediction accuracy obtained with 4 and 9 BPM-generated historical condition ratings as input data is compared, using deterministic and stochastic bridge deterioration models. The comparison outcomes indicate that the prediction error decreases as more historical condition ratings are obtained. This implies that the BPM can be utilized to generate unavailable historical data, which is crucial for bridge deterioration models to achieve more accurate prediction results. Nevertheless, there are considerable limitations in the existing bridge deterioration models, and further research is essential to improve their prediction accuracy.


Density Estimation Technique for Effective Representation of Light In-scattering (빛의 내부산란의 효과적인 표현을 위한 밀도 추정기법)

  • Min, Seung-Ki;Ihm, In-Sung
    • Journal of the Korea Computer Graphics Society / v.16 no.1 / pp.9-20 / 2010
  • To visualize participating media in 3D space, the incoming radiance is usually calculated by subdividing the ray path into small subintervals and accumulating their respective light energy due to direct illumination, scattering, absorption, and emission. Among these phenomena, scattering behaves in a very complicated manner in 3D space, often requiring a great deal of simulation effort, and several approximation techniques have been proposed to simulate it effectively. Volume photon mapping takes a simple approach in which the light scattering phenomenon is represented in a volume photon map through stochastic simulation, and the stored information is exploited in the rendering stage. While effective, this method has the problem that the number of necessary photons increases very quickly when higher variance reduction is needed. To resolve this problem, we propose a different approach for rendering particle-based volume data in which kernel smoothing, one of several density estimation methods, is used to represent and reconstruct the light in-scattering effect. The effectiveness of the presented technique is demonstrated with several examples of volume data.
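The kernel-smoothing idea — reconstructing a smooth in-scattering density from discrete photon-like samples instead of counting photons within a fixed radius — can be sketched in one dimension. The function name and the Gaussian kernel are illustrative assumptions; the paper works with samples in 3D.

```python
import math

def kernel_density(x, samples, h):
    """Kernel density estimate at x from point samples with a
    Gaussian kernel of bandwidth h: each sample contributes a smooth
    bump rather than a hard count."""
    norm = 1.0 / (len(samples) * h * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)
```

The bandwidth h plays the role the gather radius plays in photon mapping: larger h trades variance (noise) for bias (blur).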