• Title/Summary/Keyword: Poisson process (Poisson 과정)

Search results: 160 (processing time: 0.023 seconds)

Stochastic Probability Model for Preventive Management of Armor Units of Rubble-Mound Breakwaters (경사제 피복재의 유지관리를 위한 추계학적 확률모형)

  • Lee, Cheol-Eung;Kim, Sang Ug
    • KSCE Journal of Civil and Environmental Engineering Research / v.33 no.3 / pp.1007-1015 / 2013
  • A stochastic probability model based on the non-homogeneous Poisson process is presented that can correctly analyze the time-dependent linear and nonlinear behavior of total damage over the occurrence process of loads. Introducing several types of damage intensity functions, the probability of failure and the total damage with respect to mean time to failure are investigated in detail. In particular, by taking the limit state to be a random variable that follows a distribution function, its uncertainty is taken into consideration in this paper. In addition, the stochastic probability model is applied straightforwardly to rubble-mound breakwaters with a damage level defined for the erosion of armor units. The probability of failure and the nonlinear total damage with respect to mean time to failure are analyzed with the damage intensity functions for armor units estimated by fitting the expected total damage to the experimental data. Based on the present results from the stochastic probability model, a preventive management scheme for the armor units of rubble-mound breakwaters is suggested to support quantitative decisions on the repair time and the minimum repair amounts.
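
The load-occurrence process in this abstract is a non-homogeneous Poisson process. As a hedged illustration (the power-law intensity below is an assumed example, not one of the paper's fitted damage intensity functions), event times of such a process can be simulated by Lewis-Shedler thinning:

```python
import random

def simulate_nhpp(intensity, horizon, lam_max, rng):
    """Simulate event times of a non-homogeneous Poisson process on [0, horizon]
    by Lewis-Shedler thinning: generate candidate events at the constant rate
    lam_max and accept each with probability intensity(t) / lam_max."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam_max)           # next candidate arrival
        if t > horizon:
            return events
        if rng.random() < intensity(t) / lam_max:
            events.append(t)                    # accepted load occurrence

# Illustrative power-law intensity lambda(t) = b*c*t^(c-1) with b=0.5, c=1.5;
# it is increasing, so its value at the horizon bounds it on [0, horizon].
intensity = lambda t: 0.5 * 1.5 * t ** 0.5

rng = random.Random(42)
events = simulate_nhpp(intensity, horizon=10.0, lam_max=intensity(10.0), rng=rng)
print(len(events), "load occurrences; expected mean count m(T) = b*T^c =",
      round(0.5 * 10.0 ** 1.5, 1))
```

The expected number of events by time $T$ is the integrated intensity $m(T) = bT^c$, so the simulated count should scatter around that value.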

Data Analysis of Suspension P-S Velocity Logging in Banded Gneiss Area around Hanam, Gyeonggi Province (경기도 하남시 인근 호상편마암 지역에서 Suspension P-S 속도검층 자료분석)

  • Yu, Young-Chul;Song, Moo-Young;Leem, Kook-Mook
    • The Journal of Engineering Geology / v.17 no.4 / pp.623-631 / 2007
  • In this paper, the dynamic elastic moduli of banded gneiss were calculated from suspension P-S (SPS) velocity logging data obtained from a geotechnical test hole in Pungsan-dong, Hanam, Gyeonggi Province, Korea. This study mainly focuses on velocity analysis, calculation of the Q factor related to the attenuation factor, and the generation of crack information and its relation to seismic velocity. As a result, the P-wave and S-wave velocities of fresh hard rock were 5,559 m/s and 3,063 m/s, respectively, with a Poisson's ratio of 0.28. From these results, the dynamic moduli were computed, and crack information analyzed with an acoustic televiewer was incorporated to identify the correlations among the first-arrival delay caused by cracks, the amplitude ratio, and velocity. The results of this study revealed that the logged hole mainly consisted of micro-cracks, and that the number of cracks and the size of the crack apertures act as variables affecting seismic velocity in the micro-crack zones of this type of hard rock.
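
The reported Poisson's ratio can be reproduced from the quoted velocities using the standard isotropic-elasticity relations; the density used below is an assumed typical value for gneiss, not a figure from the paper:

```python
def dynamic_moduli(vp, vs, density):
    """Dynamic elastic constants from P- and S-wave velocities (m/s) and
    density (kg/m^3), using standard isotropic-elasticity relations."""
    nu = (vp**2 - 2 * vs**2) / (2 * (vp**2 - vs**2))  # Poisson's ratio
    g = density * vs**2                                # shear modulus (Pa)
    e = 2 * g * (1 + nu)                               # Young's modulus (Pa)
    return nu, g, e

# Velocities for fresh hard rock from the abstract; the density of 2,650 kg/m^3
# is an assumed value for illustration.
nu, g, e = dynamic_moduli(vp=5559.0, vs=3063.0, density=2650.0)
print(f"Poisson's ratio = {nu:.2f}")   # reproduces the reported 0.28
```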

Queueing Model for Traffic Loading Improvement of DDoS Attacks in Enterprise Networks (엔터프라이즈 네트워크에서 DDoS 공격의 부하 개선을 위한 큐잉 모델)

  • Ha, Hyeon-Tae;Lee, Hae-Dong;Baek, Hyun-Chul;Kim, Sang-Bok
    • Journal of the Korea Institute of Information and Communication Engineering / v.15 no.1 / pp.107-114 / 2011
  • Today, companies adopt network-based information management methods such as the Internet and intranets for the speed of business. The security of information assets and the continuity of business within a company are therefore directly connected to the company's credibility. This paper secures continuity of service for authenticated users by using a queueing model against the business-interruption problem caused by DDoS attacks, which are a serious threat today. To do this, an overloaded-traffic improvement process is reflected in the queueing model through analysis of the traffic information and packets produced when a DDoS attack with a worm/virus occurs. Through experiments, the resulting traffic-load improvement is compared with and analyzed against that of general network equipment.
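
The abstract does not specify the queueing model's parameters, but the baseline M/M/1 relations that such traffic-load analyses build on can be sketched as follows (the arrival and service rates are hypothetical):

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue with Poisson arrivals (rate lam)
    and exponential service (rate mu); requires lam < mu for stability."""
    assert lam < mu, "queue is unstable"
    rho = lam / mu                 # server utilization
    l = rho / (1 - rho)            # mean number in system (Little's law: L = lam*W)
    w = 1 / (mu - lam)             # mean time in system
    wq = rho / (mu - lam)          # mean waiting time in queue
    return rho, l, w, wq

# Hypothetical rates: normal traffic vs. traffic inflated by a DDoS attack.
print(mm1_metrics(lam=2.0, mu=5.0))   # light load
print(mm1_metrics(lam=4.5, mu=5.0))   # attack load: delays blow up near rho = 1
```

The second call shows why filtering attack traffic matters: as utilization approaches 1, queue length and delay grow without bound.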

Analysis of an M/M/1 Queue with an Attached Continuous-type (s,S)-inventory ((s,S)-정책하의 연속형 내부재고를 갖는 M/M/1 대기행렬모형 분석)

  • Park, Jinsoo;Lee, Hyeon Geun;Kim, Jong Hyeon;Yun, Eun Hyeuk;Baek, Jung Woo
    • Journal of Korea Society of Industrial Information Systems / v.23 no.5 / pp.19-32 / 2018
  • This study focuses on an M/M/1 queue with an attached continuous-type inventory. Customers arrive at the system according to a Poisson process and are served in their arrival order, i.e., first-come-first-served. The service times are assumed to be independent and identically distributed exponential random variables. At a service completion epoch, the customer consumes a random amount of inventory. The inventory is controlled by the traditional (s, S) inventory policy with a generally distributed lead time. A customer that arrives during a stock-out period is assumed to be lost. For the number of customers and the inventory size, we derive a product-form stationary joint probability distribution and provide some numerical examples. Besides, an operational strategy for the inventory that minimizes the long-run cost is also discussed.
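
A minimal sketch of the model's inventory dynamics, under simplifying assumptions not made in the paper (zero lead time and uniformly distributed consumption amounts), might look like:

```python
import random

def simulate_inventory(s, big_s, n_customers, rng):
    """Inventory path of the (s, S) policy at the queue's service-completion
    epochs: each served customer consumes a random continuous amount of stock,
    and when the level falls to s or below, it is replenished up to S.  Queue
    timing and the lead time are abstracted away here (the paper allows a
    generally distributed lead time)."""
    level, replenishments, history = float(big_s), 0, []
    for _ in range(n_customers):
        level -= rng.uniform(0.0, 1.0)   # random consumption at a completion
        if level <= s:
            level = float(big_s)         # order-up-to-S replenishment
            replenishments += 1
        history.append(level)
    return history, replenishments

rng = random.Random(7)
history, orders = simulate_inventory(s=2.0, big_s=10.0, n_customers=1000, rng=rng)
print(orders, "replenishments; level stays in (s, S]:",
      all(2.0 < x <= 10.0 for x in history))
```

With zero lead time the recorded level never lingers at or below s, which is what makes the stock-out period (and lost customers) a lead-time phenomenon in the full model.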

The Comparative Study of Software Optimal Release Time Based on Log-Logistic Distribution (Log-Logistic 분포 모형에 근거한 소프트웨어 최적방출시기에 관한 비교연구)

  • Kim, Hee-Cheul
    • Journal of the Korea Society of Computer and Information / v.13 no.7 / pp.1-9 / 2008
  • In this paper, we study the decision problem called optimal release policy: when to stop testing a software system in the development phase and transfer it to the user. Because of the possibility of introducing new faults when correcting or modifying the software, infinite-failure non-homogeneous Poisson process models are presented, and optimal release policies are proposed for a life distribution following the log-logistic distribution, which can capture the increasing/decreasing nature of the failure occurrence rate per fault. We discuss optimal software release policies that minimize the total average software cost of development and maintenance under the constraint of satisfying a software reliability requirement. In a numerical example, after a trend test is applied and the parameters are estimated by maximum likelihood estimation from inter-failure time data, the optimal software release time is estimated.
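
One common way to obtain an infinite-failure NHPP from a life distribution is to take the mean value function equal to the cumulative hazard; under that assumption (the paper's exact parameterization may differ), the log-logistic case looks like:

```python
import math

def log_logistic_hazard(t, lam, kappa):
    """Hazard rate of the log-logistic distribution: for kappa > 1 it first
    increases and then decreases, the property used to model the failure
    occurrence rate per fault."""
    u = (lam * t) ** kappa
    return lam * kappa * (lam * t) ** (kappa - 1) / (1 + u)

def mean_value(t, lam, kappa):
    """Cumulative hazard H(t) = ln(1 + (lam*t)^kappa), playing the role of the
    NHPP mean value function m(t) (expected cumulative failures by time t)."""
    return math.log(1 + (lam * t) ** kappa)

# With kappa = 2, lam = 1 the intensity rises to a peak at t = 1 and decays.
for t in (0.5, 1.0, 2.0):
    print(t, round(log_logistic_hazard(t, lam=1.0, kappa=2.0), 3))
```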


The Comparative Study for NHPP Software Reliability Model based on the Property of Learning Effect of Log Linear Shaped Hazard Function (대수 선형 위험함수 학습효과에 근거한 NHPP 신뢰성장 소프트웨어 모형에 관한 비교 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal / v.12 no.3 / pp.19-26 / 2012
  • In this study, the learning effect that software managers and testing tools acquire in the course of testing software products was studied from the perspective of NHPP software reliability models. A log-linear-type hazard function was applied to a finite-failure NHPP distribution. Although software error detection techniques are known in advance, the problem of precisely setting the error factors is addressed by comparing models that consider the factor for errors found automatically through prior experience (autonomous error detection) with the learning factor introduced by the testing manager. As a result, it could be confirmed that models in which the learning factor is greater than the autonomous error-detection factor are generally efficient. In this paper, failure data were analyzed using times between failures, the parameters were estimated using the maximum likelihood estimation method, and, after checking the efficiency of the data through trend analysis, model selection was performed using the mean squared error and $R^2$ (coefficient of determination).
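
The model-selection criteria mentioned at the end, mean squared error and $R^2$, can be sketched directly (the failure counts and model fits below are hypothetical):

```python
def mse(observed, predicted):
    """Mean squared error between observed and model-predicted cumulative failures."""
    return sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed)

def r_squared(observed, predicted):
    """Coefficient of determination R^2 = 1 - SSE/SST."""
    mean_o = sum(observed) / len(observed)
    sse = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    sst = sum((o - mean_o) ** 2 for o in observed)
    return 1 - sse / sst

# Hypothetical cumulative failure counts and two candidate models' fits;
# the model with lower MSE and higher R^2 is preferred.
observed = [1, 3, 4, 6, 8, 9]
model_a  = [1.2, 2.8, 4.1, 5.9, 7.8, 9.1]
model_b  = [0.5, 2.0, 5.0, 7.0, 7.0, 10.0]
print(mse(observed, model_a) < mse(observed, model_b))
print(r_squared(observed, model_a) > r_squared(observed, model_b))
```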

A Comparison Study between Uniform Testing Effort and Weibull Testing Effort during Software Development (소프트웨어 개발시 일정테스트노력과 웨이불 테스트 노력의 비교 연구)

  • 최규식;장원석;김종기
    • Journal of Information Technology Application / v.3 no.3 / pp.91-106 / 2001
  • We propose a software reliability growth model incorporating the amount of uniform and Weibull testing effort during the software testing phase. The time-dependent behavior of testing effort is described by uniform and Weibull curves. Assuming that the error detection rate with respect to the amount of testing effort spent during the testing phase is proportional to the current error content, the model is formulated as a non-homogeneous Poisson process. Using this model, a data analysis method for software reliability measurement is developed. The optimal release time is determined by considering the value of the initial reliability $R(\chi_0)$. For uniform testing effort, the conditions are $R(\chi_0) \ge R_0$, $R_0 > R(\chi_0) > R_0^d$, and $R(\chi_0) < R_0^d$; the ideal case is $R_0 > R(\chi_0) > R_0^d$. Likewise, for Weibull testing effort, the conditions are $R(\chi_0) \ge R_0$, $R_0 > R(\chi_0) > R$ (equation omitted), and $R(\chi_0) < R$ (equation omitted); the ideal case is $R_0 > R(\chi_0) > R$ (equation omitted).
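
A commonly used form for a Weibull testing-effort function, combined with an NHPP whose detection rate is proportional to the remaining error content, can be sketched as follows; the functional form and parameters are assumptions for illustration, not the paper's estimates:

```python
import math

def weibull_effort(t, alpha, beta, m):
    """Cumulative testing effort consumed by time t under a Weibull curve:
    W(t) = alpha * (1 - exp(-beta * t**m)); alpha is the total effort
    eventually available."""
    return alpha * (1 - math.exp(-beta * t ** m))

def expected_faults(t, a, r, alpha, beta, m):
    """NHPP mean value function when the detection rate per unit of effort is
    proportional to the remaining error content: m(t) = a*(1 - exp(-r*W(t)))."""
    return a * (1 - math.exp(-r * weibull_effort(t, alpha, beta, m)))

# Illustrative parameters (not from the paper): a = initial error content,
# r = detection rate per unit effort.
ts = [0.0, 1.0, 5.0, 20.0]
print([round(expected_faults(t, a=100, r=0.05, alpha=50, beta=0.1, m=1.2), 1)
       for t in ts])
```

Because total effort saturates at `alpha`, the expected fault count also saturates below `a`, which is what drives the trade-off behind the optimal release time.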


A Probabilistic Handover Scheme for Enhancing Spectral Efficiency in Drone-based Wireless Communication Systems (드론 기반의 무선 통신 시스템에서 주파수 효율 향상을 위한 확률적 핸드오버 기법)

  • Jang, Hwan Won;Woo, Dong Hyuck;Hwang, Ho Young
    • Journal of the Korea Institute of Information and Communication Engineering / v.25 no.9 / pp.1220-1226 / 2021
  • In this paper, we propose a probabilistic handover scheme for enhancing spectral efficiency in drone-based wireless communication systems. When a moving drone base station (DBS) provides drone-based wireless communication service to a user equipment (UE) located on the ground, our proposed handover scheme considers both the distance between the DBS and the UE and small-scale fading. In addition, the scheme applies a handover probability to mitigate the signalling overhead that frequent handovers may cause. Through simulations of drone-based wireless communication systems, we evaluate the spectral efficiency and the handover probability of our proposed scheme and of the conventional handover scheme. The simulation results show that our proposed scheme can achieve higher average spectral efficiency than the conventional scheme, which considers only the distance between the DBS and the UE.
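
A minimal simulation sketch of such a probabilistic handover rule, under assumed log-distance path loss and Rayleigh fading (the parameters are hypothetical, not the paper's simulation settings):

```python
import math
import random

def received_power_db(p_tx_db, dist, alpha, rng):
    """Received power (dB) with log-distance path loss (exponent alpha) and a
    Rayleigh small-scale fading term (exponentially distributed power gain)."""
    fading = rng.expovariate(1.0)
    return p_tx_db - 10 * alpha * math.log10(dist) + 10 * math.log10(fading)

def should_handover(serving_db, target_db, p_ho, rng):
    """Hand over only with probability p_ho when the target DBS looks better,
    which damps ping-pong handovers and the associated signalling overhead."""
    return target_db > serving_db and rng.random() < p_ho

rng = random.Random(1)
handovers = 0
for _ in range(10_000):
    s = received_power_db(30.0, dist=120.0, alpha=3.5, rng=rng)  # serving DBS
    t = received_power_db(30.0, dist=100.0, alpha=3.5, rng=rng)  # closer target
    handovers += should_handover(s, t, p_ho=0.3, rng=rng)
print(handovers, "handover decisions out of 10,000 slots")
```

Lowering `p_ho` trades a small loss in instantaneous signal quality for fewer handover events and less signalling overhead.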

The Comparative Study of Software Optimal Release Time for the Distribution Based on Shape parameter (형상모수에 근거한 소프트웨어 최적방출시기에 관한 비교 연구)

  • Shin, Hyun-Cheul;Kim, Hee-Cheul
    • Journal of the Korea Society of Computer and Information / v.14 no.8 / pp.1-9 / 2009
  • In this paper, we study the decision problem called optimal release policy: when to stop testing a software system in the development phase and transfer it to the user. Because of the possibility of introducing new faults when correcting or modifying the software, infinite-failure non-homogeneous Poisson process models are presented, and optimal release policies are proposed for a life distribution with a fixed shape parameter, which can capture the increasing/decreasing nature of the failure occurrence rate per fault. We discuss optimal software release policies that minimize the total average software cost of development and maintenance under the constraint of satisfying a software reliability requirement. In a numerical example, after a trend test is applied and the parameters are estimated by maximum likelihood estimation from inter-failure time data, the optimal software release time is estimated.

A Spatial Statistical Approach to Migration Studies: Exploring the Spatial Heterogeneity in Place-Specific Distance Parameters (인구이동 연구에 대한 공간통계학적 접근: 장소특수적 거리 패러미터의 추출과 공간적 패턴 분석)

  • Lee, Sang-Il
    • Journal of the Korean association of regional geographers / v.7 no.3 / pp.107-120 / 2001
  • This study is concerned with providing a reliable procedure for calibrating a set of place-specific distance parameters and with applying it to U.S. inter-state migration flows between 1985 and 1990. It attempts to conform to recent advances in quantitative geography that are characterized by an integration of ESDA (exploratory spatial data analysis) and local statistics. ESDA aims to detect spatial clustering and heterogeneity by visualizing and exploring spatial patterns. A local statistic is defined as a statistically processed value given to each location, as opposed to a global statistic, which only captures an average trend across a whole study region. Whereas a global distance parameter estimates an averaged level of the friction of distance, place-specific distance parameters calibrate spatially varying effects of distance. It is shown that a Poisson regression with an adequately specified design matrix yields a set of either origin- or destination-specific distance parameters. A case study demonstrates that the proposed model is a reliable device for measuring the spatial dimension of migration, and that place-specific distance parameters are spatially heterogeneous as well as spatially clustered.
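
The paper's design matrix with origin- and destination-specific terms is not reproduced here, but the underlying Poisson regression machinery can be sketched with a single distance-like covariate and synthetic, noise-free flow data:

```python
import math

def poisson_irls(xs, ys, iters=50):
    """Fit a Poisson regression log(mu_i) = b0 + b1 * x_i by iteratively
    reweighted least squares (Newton scoring on the log-likelihood)."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        mus = [math.exp(b0 + b1 * x) for x in xs]
        # Score vector and 2x2 Fisher information for (b0, b1).
        g0 = sum(y - m for y, m in zip(ys, mus))
        g1 = sum((y - m) * x for y, m, x in zip(ys, mus, xs))
        i00 = sum(mus)
        i01 = sum(m * x for m, x in zip(mus, xs))
        i11 = sum(m * x * x for m, x in zip(mus, xs))
        det = i00 * i11 - i01 * i01
        b0 += (i11 * g0 - i01 * g1) / det
        b1 += (-i01 * g0 + i00 * g1) / det
    return b0, b1

# Synthetic flows generated exactly from b0 = 0.5, b1 = 0.3 (no noise),
# so the fit should recover the coefficients.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.exp(0.5 + 0.3 * x) for x in xs]
b0, b1 = poisson_irls(xs, ys)
print(round(b0, 3), round(b1, 3))   # recovers b0 ~ 0.5, b1 ~ 0.3
```

In the paper's setting the design matrix would instead interact distance with origin (or destination) indicators, yielding one distance parameter per place rather than a single global `b1`.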
