• Title/Summary/Keyword: Probabilistic Analysis

A Framework of Recognition and Tracking for Underwater Objects based on Sonar Images : Part 2. Design and Implementation of Realtime Framework using Probabilistic Candidate Selection (소나 영상 기반의 수중 물체 인식과 추종을 위한 구조 : Part 2. 확률적 후보 선택을 통한 실시간 프레임워크의 설계 및 구현)

  • Lee, Yeongjun; Kim, Tae Gyun; Lee, Jihong; Choi, Hyun-Taek
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.3 / pp.164-173 / 2014
  • In underwater robotics, vision is a key element for recognizing the environment, but optical cameras are rarely usable underwater because of turbidity. An imaging sonar is the usual alternative, yet it delivers low-quality images that are neither stable nor accurate enough for natural objects to be identified by image processing. For this reason, artificial landmarks based on the characteristics of ultrasonic waves, together with a recognition method based on a shape matrix transformation, were proposed and validated in Part 1. That approach, however, does not work reliably on an undulating, dynamically noisy sea bottom. To solve this, we propose a framework that processes sequential images in four phases: selection of likelihood candidates, selection of final candidates, recognition, and tracking. A particle-filter-based selection mechanism eliminates false candidates, and a mean-shift-based algorithm performs tracking. All four phases run in parallel and in real time, and the framework is flexible enough to add or modify internal algorithms. A pool test and a sea trial were carried out to verify the performance, and the experimental results are analyzed in detail. The information obtained in the tracking phase, such as relative distance and bearing, is expected to be used for the control and navigation of underwater robots.
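
The tracking phase above relies on mean-shift tracking. Below is a minimal, illustrative sketch of that idea, assuming a single bright blob in a 2-D intensity image; the window size and convergence threshold are hypothetical parameters, not values from the paper.

```python
import numpy as np

def mean_shift_track(frame, center, win=15, max_iter=20, eps=0.5):
    """Move a window to the intensity-weighted centroid of a bright blob.

    frame  : 2-D array of pixel intensities (e.g. one sonar frame)
    center : (row, col) initial estimate, e.g. the result from the previous frame
    win    : half-width of the search window (hypothetical parameter)
    """
    r, c = float(center[0]), float(center[1])
    for _ in range(max_iter):
        r0, r1 = int(max(r - win, 0)), int(min(r + win + 1, frame.shape[0]))
        c0, c1 = int(max(c - win, 0)), int(min(c + win + 1, frame.shape[1]))
        patch = frame[r0:r1, c0:c1].astype(float)
        if patch.sum() == 0:                        # nothing bright in the window
            break
        rows, cols = np.mgrid[r0:r1, c0:c1]
        r_new = (rows * patch).sum() / patch.sum()  # intensity-weighted centroid
        c_new = (cols * patch).sum() / patch.sum()
        shift = np.hypot(r_new - r, c_new - c)
        r, c = r_new, c_new
        if shift < eps:                             # converged
            break
    return r, c

# usage: feed consecutive frames, seeding each call with the previous estimate
frame = np.zeros((200, 200)); frame[118:124, 81:87] = 1.0
print(mean_shift_track(frame, (110, 90)))
```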

A Design and Implementation of Reliability Analyzer for Embedded Software using Markov Chain Model and Unit Testing (내장형 소프트웨어 마르코프 체인 모델과 단위 테스트를 이용한 내장형 소프트웨어 신뢰도 분석 도구의 설계와 구현)

  • Kwak, Dong-Gyu; Yoo, Chae-Woo; Choi, Jae-Young
    • Journal of the Korea Society of Computer and Information / v.16 no.12 / pp.1-10 / 2011
  • As the requirements of embedded systems become more complex, tools for analyzing the reliability of embedded software are increasingly needed. Probabilistic modeling is a common way to analyze software reliability, but applying it to embedded software that controls multiple devices requires a model specialized for that domain. In addition, existing reliability analyzers require the transition probability of each state to be measured separately and do not support reusing a model once it has been built. In this paper, we present a reliability analyzer for embedded software that combines an embedded-software Markov chain model with a unit testing tool. The embedded-software Markov chain model specializes the Markov chain model commonly used in reliability analysis to embedded software, and the unit testing tool has a host-target structure suited to embedded development environments. The analyzer measures the transition probabilities between units automatically from the unit test results, so reliability can be analyzed more easily than with existing tools. Because the software model is represented as an XML document, updated test results from the unit testing tool can be applied directly, and the web-based interface and SVN repository make the model easily accessible to many developers. We demonstrate the reliability analysis on an example to show the usefulness of the analyzer.
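
The abstract describes measuring transition probabilities between units and feeding them into a Markov chain reliability model. The sketch below shows one common formulation of that idea, in the style of an architecture-based (Cheung-type) model, which may differ from the tool's actual computation; the transition matrix and per-unit reliabilities are illustrative assumptions.

```python
import numpy as np

def system_reliability(P, R, start=0, final=None):
    """P : n x n transition-probability matrix between software units
       R : per-unit reliabilities (length n)
       Returns the probability that execution starting at `start` reaches the
       final unit with every visited unit behaving correctly."""
    n = len(R)
    final = n - 1 if final is None else final
    Q = np.diag(R) @ np.asarray(P, dtype=float)   # correct execution, then transfer
    S = np.linalg.inv(np.eye(n) - Q)              # expected-visit structure
    return S[start, final] * R[final]

# toy example with three units executed in sequence: 0 -> 1 -> 2 (terminal)
P = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [0.0, 0.0, 0.0]]
R = [0.99, 0.98, 0.995]
print(round(float(system_reliability(P, R)), 4))  # 0.99 * 0.98 * 0.995 ≈ 0.9653
```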

Analysis of the Mean and Standard Deviation due to the Change of the Probability Density Function on Tidal Elevation Data (조위의 확률밀도함수 변화에 따른 평균 및 표준편차 분석)

  • Cho, Hong-Yeon; Jeong, Shin-Taek; Lee, Khil-Ha; Kim, Tae-Heon
    • Journal of Korean Society of Coastal and Ocean Engineers / v.22 no.4 / pp.279-285 / 2010
  • In probability-based design of coastal structures, the probability density function (pdf) of tidal elevation data is usually assumed to be a normal distribution. The pdf of tidal elevation, however, is better fitted by a double-peak normal distribution, so an equivalent mean and standard deviation (SD) must be estimated from the equivalent normal distribution. These equivalent parameters differ from the mean and SD (normal parameters) estimated under the assumption that the tidal-elevation pdf is normal. In this study, the difference between the equivalent and normal parameters, i.e., the estimation error, is compared and analyzed. The difference increases as the tidal elevation and tidal range increase. At Incheon station, where the tidal elevation reaches about ±400 cm, the differences in the mean and SD are above 100 cm and about 80-100 cm, respectively, whereas at Pohang station, where the tidal elevation is about ±60 cm, they are very small, in the range of 2-4 cm.
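
The mean and SD of a double-peak (two-component) normal distribution can be obtained by moment matching. The sketch below shows that step under assumed, illustrative mixture parameters; it is not the study's fitting procedure for the Incheon or Pohang data.

```python
import numpy as np

def mixture_mean_sd(w1, m1, s1, m2, s2):
    """Mean and SD of the mixture  w1*N(m1, s1^2) + (1 - w1)*N(m2, s2^2)."""
    w2 = 1.0 - w1
    mean = w1 * m1 + w2 * m2
    var = w1 * (s1**2 + m1**2) + w2 * (s2**2 + m2**2) - mean**2
    return mean, np.sqrt(var)

# illustrative (made-up) double-peak tidal-elevation mixture, in cm
print(mixture_mean_sd(0.5, -250.0, 80.0, 250.0, 80.0))
```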

A Three-Dimensional Slope Stability Analysis in Probabilistic Solution (3차원(次元) 사면(斜面) 안정해석(安定解析)에 관한 확률론적(確率論的) 연구(研究))

  • Kim, Young Su
    • KSCE Journal of Civil and Environmental Engineering Research / v.4 no.3 / pp.75-83 / 1984
  • The probability of failure, rather than the conventional factor of safety, is used to assess the reliability of three-dimensional slope failure. The strength parameters are assumed to follow normal and beta distributions, and are interval-estimated at a specified confidence level using maximum likelihood estimation. Pseudo-normal and beta random variables are generated by the uniform probability transformation method, using the central limit theorem and the rejection method. By means of Monte Carlo simulation, the probability of failure is defined as P_f = M/N, where N is the total number of trials and M is the total number of failures. Some of the conclusions from the case study are: (1) three-dimensional factors of safety are generally much higher than 2-D factors of safety, although situations appear to exist where the 3-D factor of safety can be lower than the 2-D value; (2) the F_3/F_2 ratio appears to be quite sensitive to c and φ and to the shape of the 3-D shear surface and the slope, but not to the unit weight of the soil; (3) of the two models (normal and beta) considered for the distribution of the factor of safety, the beta distribution generally yields larger values than the normal distribution; (4) results obtained with the beta and normal models are presented in a nomograph relating slope height and slope angle to the probability of failure.
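
A minimal sketch of the Monte Carlo estimator P_f = M/N is shown below. The safety-factor model used here (an infinite-slope-style F = tan φ / tan β with a normally distributed friction angle) is an illustrative stand-in, not the paper's 3-D formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000                                    # total number of trials
beta = np.radians(30.0)                        # slope angle (assumed)
phi = rng.normal(np.radians(32.0),             # friction angle as a normal variate
                 np.radians(3.0), size=N)
F = np.tan(phi) / np.tan(beta)                 # factor of safety for each trial
M = np.count_nonzero(F < 1.0)                  # total number of failures
print("P_f =", M / N)
```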

Robust Design Method for Complex Stochastic Inventory Model

  • Hwang, In-Keuk; Park, Dong-Jin
    • Proceedings of the Korean Operations and Management Science Society Conference / 1999.04a / pp.426-426 / 1999
  • There are many sources of uncertainty in a typical production and inventory system: uncertainty about how many items customers will demand during the next day, week, month, or year, and uncertainty about delivery times. Uncertainty exacts a toll from management in a variety of ways; a spurt in demand or a delay in production may lead to stockouts, with the potential for lost revenue and customer dissatisfaction. Firms typically hold inventory as protection against uncertainty, since a cushion of inventory on hand allows management to face unexpected demand or delivery delays with a reduced chance of a stockout. The proposed strategies are used for the design of a probabilistic inventory system. In the traditional approach, the goal is to find the setting of the inventory control policy parameters, such as the reorder level, review period, and order quantity, that minimizes the total inventory cost. Here the goals of the analysis are defined so that robustness becomes an important design criterion, and appropriate noise variables are identified. There are two main goals for the inventory policy design: to minimize the average inventory cost and the stockouts, and to minimize the variability of the average inventory cost and the stockouts. The total average inventory cost is the sum of three components: the ordering cost, the holding cost, and the shortage cost, where the shortage cost includes the cost of lost sales, loss of goodwill, customer dissatisfaction, and so on. The noise factors for this design problem are the mean demand rate and the mean lead time, both assumed to be normal random variables; robustness is therefore interpreted as insensitivity of the average inventory cost and the stockouts to uncontrollable fluctuations in the mean demand rate and mean lead time. To design the inventory system for robustness, utility theory is used. Utility theory is an analytical method for choosing an action given multiple decision criteria, and it is appropriate here because attributes with different scales, such as demand rate and lead time, are mapped to ranks between zero and one, with higher preference modeled by a higher rank. Using utility theory, three design strategies for the robust inventory system are developed: a distance strategy, a response strategy, and a priority-based strategy.
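
The sketch below illustrates the kind of simulation such a design study rests on: evaluating the average cost and stockouts of a reorder-point policy under noise in the mean demand rate and mean lead time. All policy parameters and cost coefficients are assumed values, not figures from the paper.

```python
import numpy as np

def simulate(Q, r, mean_demand, mean_lead, days=365, seed=0,
             hold=1.0, order=50.0, shortage=20.0):
    """Average daily cost and stockout count of a reorder-point policy
    (order Q when the inventory position drops to r) under noisy demand/lead time."""
    rng = np.random.default_rng(seed)
    inv, pipeline, cost, stockouts = float(Q), [], 0.0, 0
    for _ in range(days):
        pipeline = [(t - 1.0, q) for t, q in pipeline]        # advance one day
        inv += sum(q for t, q in pipeline if t <= 0.0)        # receive arrivals
        pipeline = [(t, q) for t, q in pipeline if t > 0.0]
        d = max(rng.normal(mean_demand, 0.2 * mean_demand), 0.0)
        if d > inv:                                           # stockout day
            stockouts += 1
            cost += shortage * (d - inv)
            inv = 0.0
        else:
            inv -= d
        if inv + sum(q for _, q in pipeline) <= r:            # reorder point hit
            lead = max(rng.normal(mean_lead, 0.2 * mean_lead), 1.0)
            pipeline.append((lead, float(Q)))
            cost += order
        cost += hold * inv                                    # holding cost
    return cost / days, stockouts

# noise factors: mean demand rate and mean lead time (both assumed normal)
print(simulate(Q=400, r=150, mean_demand=50, mean_lead=3))
```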

Variation of probability of sonar detection by internal waves in the South Western Sea of Jeju Island (제주 서남부해역에서 내부파에 의한 소나 탐지확률 변화)

  • An, Sangkyum; Park, Jungyong; Choo, Youngmin; Seong, Woojae
    • The Journal of the Acoustical Society of Korea / v.37 no.1 / pp.31-38 / 2018
  • Based on data measured in the southwestern sea of Jeju Island during SAVEX15 (Shallow Water Acoustic Variability EXperiment 2015), the effect of internal waves on the PPD (Predictive Probability of Detection) of a sonar system was analyzed. The southwestern sea of Jeju Island has complex flows due to internal waves and the USC (Underwater Sound Channel). In this paper, sonar performance is predicted by a probabilistic approach. The LFM (Linear Frequency Modulation) and MLS (Maximum Length Sequence) signals in the 11 kHz - 31 kHz band of the SAVEX15 data were processed to calculate the TL (Transmission Loss) and NL (Noise Level) at a source-receiver distance of approximately 2.8 km. The PDFs (Probability Density Functions) of TL and NL are convolved to obtain the PDF of the SE (Signal Excess), and the PPD is calculated as a function of the source and receiver depths. Analysis of the changes in the PPD over time in the presence of internal waves, such as soliton packets and the internal tide, confirms that the PPD value is affected in different ways.
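
The probabilistic sonar-performance idea described above can be illustrated in a few lines: samples of TL and NL are combined into a signal-excess distribution, and the PPD is the probability that the signal excess is positive. The source level, detection threshold, and TL/NL distributions below are assumed for illustration only, and sampling is used in place of the explicit PDF convolution.

```python
import numpy as np

rng = np.random.default_rng(1)
SL, DT = 190.0, 10.0                   # source level, detection threshold (dB, assumed)
TL = rng.normal(75.0, 4.0, 100_000)    # transmission-loss samples (dB)
NL = rng.normal(105.0, 3.0, 100_000)   # noise-level samples (dB)
SE = SL - TL - NL - DT                 # signal excess per sample
PPD = float(np.mean(SE > 0.0))         # predictive probability of detection
print(f"PPD = {PPD:.3f}")
```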

Relative Navigation Study Using Multiple PSD Sensor and Beacon Module Based on Kalman Filter (복수 PSD와 비콘을 이용한 칼만필터 기반 상대항법에 대한 연구)

  • Song, Jeonggyu; Jeong, Junho; Yang, Seungwon; Kim, Seungkeun; Suk, Jinyoung
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.46 no.3 / pp.219-229 / 2018
  • This paper proposes Kalman filter-based relative navigation algorithms for proximity operations of spacecraft, such as rendezvous, docking, and cluster operation, using PSD sensors and infrared beacon modules. Numerical simulations are performed to compare the performance of each relative navigation technique. Based on the operating principle and optical modeling of the PSD sensor and the infrared beacon module, a measurement model for the Kalman filter is constructed. The Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF) are used for probabilistic, measurement-fusion relative navigation, exploiting kinematic and dynamic information on the translational and rotational motion of the satellites. The relative position and relative attitude estimation performance of the two filters is compared. In particular, simulations of various scenarios show how the performance changes with the number of PSD sensors and IR beacons on the target and chaser satellites.
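
The measurement-fusion filters mentioned above share a common update structure. Below is a generic sketch of an EKF measurement update; the measurement model h(x), its Jacobian, and the toy state are placeholders, not the paper's PSD/beacon optical model.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update.
    x, P : prior state estimate and covariance
    z    : measurement (e.g. PSD image-plane coordinates of a beacon)
    h    : nonlinear measurement model, H : its Jacobian at x, R : noise covariance."""
    y = z - h(x)                                  # innovation
    S = H @ P @ H.T + R                           # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# toy example: direct (linear) observation of relative position only
x = np.array([1.0, 0.5, 2.0, 0.0, 0.0, 0.0])      # relative position + velocity
P = np.eye(6)
H = np.hstack([np.eye(3), np.zeros((3, 3))])
z = np.array([1.1, 0.4, 2.05])
x, P = ekf_update(x, P, z, lambda s: H @ s, H, 0.01 * np.eye(3))
print(np.round(x, 3))
```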

Time-dependent Reduction of Sliding Cohesion due to Rock Bridges along Discontinuities (암석 브리지에 의한 불연속면 점착강도의 시간의존성에 관한 연구)

  • 박철환; 전석원
    • Tunnel and Underground Space / v.14 no.3 / pp.167-174 / 2004
  • This paper introduces an article published in Rock Mechanics and Rock Engineering, 2003. In this research, a fracture mechanics model is developed to illustrate the importance of time dependence for brittle fractured rock, in particular a model for the time-dependent degradation of rock joint cohesion. Degradation of joint cohesion is modeled as the time-dependent breaking of intact patches, or rock bridges, along the joint surface. A fracture mechanics model utilizing subcritical crack growth is developed, which yields a closed-form solution for joint cohesion as a function of time. As an example, a rock block containing rock bridges and subjected to plane sliding is analyzed. The cohesion is found to decrease continually, at first slowly and then more rapidly, until at a particular time it falls to a value that results in slope instability. In a second example, variations in some of the material parameters are assumed, a probabilistic slope analysis is conducted, and the probability of failure is predicted as a function of time. The probability of failure is found to increase with time, from an initial value of 5% to over 40% at 100 years. These examples show the importance of being able to predict the time-dependent behavior of a rock mass containing discontinuities, even for relatively short-term rock structures.
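
A heavily simplified illustration of how a time-dependent cohesion turns into a time-dependent probability of failure is sketched below. The decay law, block geometry, and parameter distributions are assumed for illustration and do not reproduce the paper's subcritical-crack-growth solution or its 5%-to-40% result.

```python
import numpy as np

rng = np.random.default_rng(2)

def prob_failure(t_years, n=50_000):
    """Monte Carlo probability of failure of a plane-sliding block at time t."""
    decay = 1.0 - 0.25 * np.log10(1.0 + t_years)     # assumed cohesion decay law
    c = rng.normal(40.0, 10.0, n) * decay            # joint cohesion (kPa)
    phi = rng.normal(np.radians(30.0), np.radians(2.0), n)
    beta, gamma, H = np.radians(45.0), 25.0, 10.0    # assumed geometry (deg, kN/m3, m)
    W = 0.5 * gamma * H**2 / np.tan(beta)            # block weight per unit width
    A = H / np.sin(beta)                             # sliding-plane area per unit width
    F = (c * A + W * np.cos(beta) * np.tan(phi)) / (W * np.sin(beta))
    return float(np.mean(F < 1.0))

for t in (0, 10, 100):
    print(f"t = {t:>3} years: P_f ≈ {prob_failure(t):.2f}")
```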

Analysis of Soil Erosion Hazard Zone by R Factor Frequency (빈도별 R인자에 의한 토양침식 위험지역 분석)

  • Kim, Joo-Hun; Oh, Deuk-Keun
    • Journal of the Korean Association of Geographic Information Studies / v.7 no.2 / pp.47-56 / 2004
  • The purpose of this study is to estimate the amount of soil loss for rainfall-runoff erosivity (R) factors of different return frequencies and to identify zones in the watershed with a high risk of soil erosion. RUSLE was used to estimate soil loss, and the study area is the Gwanchon watershed, part of the Seomjin river basin. To obtain the R factor by frequency, 39 years of daily maximum rainfall data were used. Probability rainfall was calculated using the normal, log-normal, Pearson type III, log-Pearson type III, and Extreme value type I distributions; the log-Pearson type III distribution was judged the most accurate and was used to estimate the 24-hour probability rainfall, and the R factor by frequency was then estimated by applying the Huff distribution ratio. The estimated average soil loss ranges from 12.8 to 68.0 ton/ha·yr as the frequency increases from 2 years to 200 years. The distribution of soil loss within the watershed was classified into four classes, and the erosion hazard zone was analyzed on the basis of these classes, with the hazard zone defined as class IV. The land-use area in class IV ranges from 0.01 to 5.28 km², or 0.02-9.06% of the total farming area; in particular, for the 200-year frequency, fields account for 77.1% of the total farming area. Accordingly, soil loss is considered to be strongly influenced by land cover and cultivation practices.
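
The per-cell RUSLE computation underlying the study is simply the product of the five factors, A = R · K · LS · C · P. The sketch below applies it to a toy grid for two assumed R-factor values; all factor values are hypothetical.

```python
import numpy as np

def rusle(R, K, LS, C, P):
    """Soil loss A (ton/ha/yr) per grid cell for a given-frequency R factor."""
    return R * K * LS * C * P

# toy 2 x 2 factor grids (hypothetical values)
K  = np.array([[0.25, 0.30], [0.28, 0.22]])    # soil erodibility
LS = np.array([[1.2, 2.5], [0.8, 3.1]])        # slope length and steepness
C  = np.array([[0.05, 0.10], [0.02, 0.20]])    # cover management
P  = np.array([[1.0, 0.8], [1.0, 0.6]])        # support practice

for R in (250.0, 800.0):                        # e.g. low- vs high-frequency R factor
    A = rusle(R, K, LS, C, P)
    print(f"R = {R}: mean soil loss = {A.mean():.1f} ton/ha/yr")
```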

Data processing system and spatial-temporal reproducibility assessment of GloSea5 model (GloSea5 모델의 자료처리 시스템 구축 및 시·공간적 재현성평가)

  • Moon, Soojin; Han, Soohee; Choi, Kwangsoon; Song, Junghyun
    • Journal of Korea Water Resources Association / v.49 no.9 / pp.761-771 / 2016
  • GloSea5 (Global Seasonal forecasting system version 5) is provided and operated by the KMA (Korea Meteorological Administration). GloSea5 provides forecast (FCST) and hindcast (HCST) data, and its horizontal resolution in the mid-latitudes is about 60 km (0.83° × 0.56°). To use these data for watershed-scale water management, GloSea5 requires spatial-temporal downscaling; statistical downscaling was therefore applied to correct systematic biases in the variables and to improve data reliability. The HCST data are provided as an ensemble, and the highest statistical agreement for ensemble precipitation (R² = 0.60, RMSE = 88.92, NSE = 0.57) was obtained for the Yongdam Dam watershed on grid #6. The original GloSea5 precipitation (600.1 mm) differed from observations (816.1 mm) by -26.5% during the summer flood season, whereas the downscaled GloSea5 showed an error of only -3.1%. Most of the underestimation occurred in flood-season precipitation, and the downscaling largely restored those precipitation levels. A spatial autocorrelation analysis using seasonal Moran's I showed that the spatial distribution is statistically significant. These results reduce the uncertainty of the original GloSea5 data and substantiate their spatial-temporal accuracy and validity; this spatial-temporal reproducibility assessment will serve as important basic data for watershed-scale water management.
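
The spatial-autocorrelation statistic used in the reproducibility assessment, global Moran's I, can be computed compactly as sketched below; the weight matrix and values are a toy example, not the study's grid.

```python
import numpy as np

def morans_i(x, W):
    """Global Moran's I for values x on n spatial units with weight matrix W."""
    x = np.asarray(x, dtype=float)
    W = np.asarray(W, dtype=float)
    z = x - x.mean()
    n, s0 = len(x), W.sum()
    return (n / s0) * (z @ W @ z) / (z @ z)

# toy example: 4 cells in a line with rook-contiguity weights
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
print(round(float(morans_i([10.0, 12.0, 30.0, 33.0], W)), 3))
```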