• Title/Summary/Keyword: 2-Poisson model (2-포아송 모형)

Search results: 76 (processing time 0.025 seconds)

Analysis of Pull-out Behavior of Tunnel-type Anchorage for Suspended Bridge Using 2-D Model Tests and Numerical Analysis (2차원 모형실험 및 수치해석을 통한 현수교 터널식 앵커리지의 인발거동 특성 분석)

  • Seo, Seunghwan;Park, Jaehyun;Lee, Sungjune;Chung, Moonkyung
    • Journal of the Korean Geotechnical Society / v.34 no.10 / pp.61-74 / 2018
  • In this study, the pull-out behavior of the tunnel-type anchorage of suspension bridges was analyzed based on results from laboratory-scale model tests and numerical analysis. Tunnel-type anchorages have been used only occasionally in domestic and overseas projects, so a design method covering failure mode and safety factor has yet to be clearly established. To improve the design method, scaled model tests were conducted using a simplified version of the shape and structure of the Ulsan Grand Bridge anchorage, the first of its kind in Korea. In the model tests, the anchorage body and the surrounding rock were made of a gypsum mixture, and the pull-out behavior was investigated under plane-strain conditions. The model tests showed that the tunnel-type anchorage underwent wedge-shaped failure. For verification of the model tests, numerical analysis was carried out using ABAQUS, a finite element analysis program. The failure behavior predicted by the numerical analysis was consistent with that observed in the model tests. The numerical results also showed that the effect of Poisson's ratio was negligible, and that a plugging-type failure mode could occur only when the strength of the surrounding rock was 10 times greater than that of the anchorage body.

Analysis of Software Reliability Growth Model with Gamma Family Distribution (감마족 분포를 이용한 소프트웨어 신뢰 성장 모형의 분석)

  • Kan, Kwang-Hyun;Jang, Byeong-Ok;Kim, Hee-Cheul
    • Journal of IKEEE / v.9 no.2 s.17 / pp.143-151 / 2005
  • Finite-failure NHPP models proposed in the literature exhibit either constant, monotonically increasing, or monotonically decreasing failure-occurrence rates per fault. To motivate the shape parameter of the Gamma family of distributions, a special pattern was used. For data sets in which the underlying failure process could not be adequately described by the known models, the Gamma or Weibull model and the Gompertz model were developed. This paper presents an analysis of a failure data set using the Gamma or Weibull model and the Gompertz model, based on arithmetic and Laplace trend tests and bias tests.
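The finite-failure NHPP framework referred to in the abstract models the expected cumulative number of failures as m(t) = a·F(t), where a is the total fault content and F is a lifetime CDF. A minimal sketch, assuming an integer (Erlang) shape so the Gamma-family CDF can be written in closed form; the parameter values are purely illustrative:

```python
import math

def erlang_cdf(t, shape, rate):
    """CDF of a Gamma distribution with integer shape (Erlang form)."""
    s = sum((rate * t) ** n / math.factorial(n) for n in range(shape))
    return 1.0 - math.exp(-rate * t) * s

def nhpp_mean_value(t, a, shape, rate):
    """Expected cumulative failures by time t under a finite-failure NHPP: m(t) = a * F(t)."""
    return a * erlang_cdf(t, shape, rate)

# With shape=1 this reduces to the exponential (Goel-Okumoto) NHPP;
# shape=2 gives the S-shaped behavior the Gamma family can express.
m_early = nhpp_mean_value(10.0, a=100, shape=2, rate=0.05)
m_late = nhpp_mean_value(200.0, a=100, shape=2, rate=0.05)
```

As t grows, m(t) saturates at the assumed fault content a, which is the defining property of the finite-failure class.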


Boundary conditions for Time-Domain Finite-Difference Elastic Wave Modeling in Anisotropic Media (이방성을 고려한 시간영역 유한차분법 탄성파 모델링에서의 경계조건)

  • Lee, Ho-Yong;Min, Dong-Joo;Kwoon, Byung-Doo;Lim, Seung-Chul;Yoo, Hai-Soo
    • Geophysics and Geophysical Exploration / v.11 no.2 / pp.153-160 / 2008
  • Seismic modeling is used to simulate wave propagation in the earth. Although the earth's subsurface is effectively semi-infinite, a semi-infinite model cannot be handled in seismic modeling because of limited computational resources, so a finite-sized model is usually assumed. In that case, the edge reflections arising from the artificial boundaries must be eliminated by introducing a proper boundary condition. In this study, we modified three kinds of boundary conditions (the sponge boundary condition, Clayton and Engquist's absorbing boundary condition, and Higdon's transparent boundary condition) so that they can be applied in elastic wave modeling for anisotropic media, and applied them to several models with different Poisson's ratios. Clayton and Engquist's absorbing boundary condition is unstable in both isotropic and anisotropic media when Poisson's ratio is large, which indicates that it can be applied to anisotropic media only restrictively. Although the sponge boundary condition yields good results for both isotropic and anisotropic media, it requires too much computational memory and time. On the other hand, Higdon's transparent boundary condition is not only inexpensive but also reduces reflections over a wide range of incidence angles. We therefore consider Higdon's transparent boundary condition a method of choice for anisotropic media with large Poisson's ratios.
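The sponge boundary condition the abstract compares works by multiplying the wavefield in a strip of edge cells by a smooth damping profile every time step. A minimal 1-D sketch of a Cerjan-style taper (the width and decay constant are illustrative assumptions, not values from the paper):

```python
import math

def sponge_profile(n, width, alpha=0.015):
    """Cerjan-style damping factors: 1.0 in the interior, smoothly decaying toward both edges."""
    d = [1.0] * n
    for i in range(width):
        # Gaussian taper: deepest damping at the outermost cell (i = 0).
        w = math.exp(-(alpha * (width - i)) ** 2)
        d[i] = w
        d[n - 1 - i] = w
    return d

damp = sponge_profile(200, width=30)
# Inside a finite-difference time loop the wavefield would be attenuated as:
# u = [ui * di for ui, di in zip(u, damp)]
```

The memory cost the authors note comes from having to widen this absorbing strip (often tens of cells per edge, and per dimension in 2-D/3-D) before reflections become acceptably small.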

A Study on Categorizing the Types of Transit Accessibility by Residence and Working Place and Identifying its Association to Personal Transit Travel Frequency (주거와 직장의 대중교통 접근성 유형화와 대중교통 통행발생량과의 연관성에 관한 연구)

  • Sung, Hyungun
    • Journal of Korean Society of Transportation / v.31 no.2 / pp.20-32 / 2013
  • This study identifies the relationship between transit accessibility types and transit travel frequency in the Seoul metropolitan area. A multi-level Poisson regression model is employed after categorizing transit accessibility into 18 types based on the locations of residence and workplace. The analysis results offer three policy implications for improving transit use in the Seoul metropolitan area. First, increases in transit ridership can be attained more effectively when policies address both the competitive and the complementary relationships between bus and rail transit. Second, the transfer system should be designed from both modal perspectives for travel from Seoul to suburban residential areas, given that walking accessibility to bus transit is good there while that to rail transit is poor. Third, to increase transit ridership, it is more effective to prioritize rail transit over bus transit, especially for travel from suburban areas into the city of Seoul.
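The core of the model the abstract describes is a Poisson regression with a log link: each person's expected trip count is exp(xᵢ·β), with dummies for the accessibility types. A single-level sketch of the log-likelihood (the paper's model is multi-level; the data and coefficients below are hypothetical):

```python
import math

def poisson_loglik(beta, X, y):
    """Log-likelihood of a Poisson regression with log link: lambda_i = exp(x_i . beta)."""
    ll = 0.0
    for xi, yi in zip(X, y):
        eta = sum(b * x for b, x in zip(beta, xi))
        lam = math.exp(eta)
        # log P(Y = y_i) = y_i * eta - lambda_i - log(y_i!)
        ll += yi * eta - lam - math.lgamma(yi + 1)
    return ll

# Hypothetical design: intercept plus one accessibility-type dummy.
X = [[1, 0], [1, 0], [1, 1], [1, 1]]
y = [2, 3, 6, 5]  # weekly transit trips (illustrative)
```

Fitting maximizes this log-likelihood over β; a multi-level version adds random intercepts for zones, which shifts η by a zone-level effect but leaves the Poisson kernel unchanged.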

The Study for ENHPP Software Reliability Growth Model Based on Kappa(2) Coverage Function (Kappa(2) 커버리지 함수를 이용한 ENHPP 소프트웨어 신뢰성장모형에 관한 연구)

  • Kim, Hee-Cheul
    • Journal of the Korea Institute of Information and Communication Engineering / v.11 no.12 / pp.2311-2318 / 2007
  • Finite-failure NHPP models presented in the literature exhibit either constant, monotonically increasing, or monotonically decreasing failure-occurrence rates per fault. Accurate prediction of software release times and estimation of the reliability and availability of a software product require knowledge of a critical element of the software testing process: test coverage. A model incorporating coverage is called an enhanced non-homogeneous Poisson process (ENHPP). In this paper, the exponential and S-shaped coverage models are reviewed, and a Kappa coverage model is proposed for efficient application to software reliability. Parameters were estimated by maximum likelihood using the bisection method, and model selection was based on the SSE statistic and the Kolmogorov distance. Numerical examples using a real failure data set are presented, comparing the proposed Kappa coverage model with the existing models by means of arithmetic and Laplace trend tests and bias tests.
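In the ENHPP framework the mean value function factors through a test-coverage function c(t): m(t) = a·c(t). The Kappa coverage form is not given in the abstract, so this sketch uses the exponential coverage function the paper reviews; all parameter values are illustrative:

```python
import math

def enhpp_mean_value(t, a, coverage):
    """ENHPP expected failures by time t: m(t) = a * c(t), with c a coverage function in [0, 1]."""
    return a * coverage(t)

def exp_coverage(b):
    """Exponential coverage function c(t) = 1 - exp(-b t), one of the forms reviewed in the paper."""
    return lambda t: 1.0 - math.exp(-b * t)

c = exp_coverage(0.02)
# Faults expected to remain undetected after 50 time units of testing,
# assuming a = 150 total faults (hypothetical).
remaining = 150 * (1.0 - c(50.0))
```

Because c(t) → 1, the coverage function caps m(t) at the fault content a, and the "remaining faults" quantity above is what release-time criteria are typically set against.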

Spatial Analyses and Modeling of Landscape Dynamics (지표면 변화 탐색 및 예측 시스템을 위한 공간 모형)

  • 정명희;윤의중
    • Spatial Information Research / v.11 no.3 / pp.227-240 / 2003
  • The primary focus of this study is to provide a general methodology for understanding and analyzing environmental issues, such as long-term ecosystem dynamics and land use/cover change, through the development of 2-D dynamic landscape models and model-based simulation. Change processes in land cover and ecosystem function can be understood in terms of the spatial and temporal distribution of land cover resources. In developing a system to understand the major processes of change and obtain predictive information, spatial heterogeneity must first be taken into account, because landscape spatial pattern affects land cover change and the interaction between different land cover types; the relationship between pattern and process therefore has to be part of the research. Landscape modeling requires different approaches depending on the definitions, assumptions, and rules employed for the mechanisms behind processes such as spatial event processes, land degradation, deforestation, desertification, and change in an urban environment. Rule-based models for land cover change caused by natural fires are described in the paper. Finally, a case study is presented as an example of using spatial modeling and simulation to study and synthesize patterns and processes at scales ranging from fine to global.
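A rule-based model of land cover change by fire, of the kind the abstract mentions, can be sketched as a cellular automaton: each cell holds a cover state, and local rules update it per time step. This toy version (states, neighborhood, and ignition probability are all assumptions, not the paper's rules) shows the mechanism:

```python
import random

def spread_fire(grid, steps, p_catch=0.6, rng=None):
    """Rule-based update: a burning cell (2) ignites each 4-neighbour fuel cell (1)
    with probability p_catch, then burns out (0). Burned cells stay burned."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    rows, cols = len(grid), len(grid[0])
    for _ in range(steps):
        nxt = [row[:] for row in grid]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 2:
                    nxt[r][c] = 0
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < rows and 0 <= cc < cols \
                           and grid[rr][cc] == 1 and rng.random() < p_catch:
                            nxt[rr][cc] = 2
        grid = nxt
    return grid

land = [[1] * 9 for _ in range(9)]
land[4][4] = 2  # single ignition point in the middle
after = spread_fire(land, steps=3)
```

Spatial heterogeneity enters naturally here: making `p_catch` a per-cell map (fuel type, moisture, slope) is exactly the pattern-affects-process coupling the study emphasizes.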


A Software Reliability Cost Model Based on the Shape Parameter of Lomax Distribution (Lomax 분포의 형상모수에 근거한 소프트웨어 신뢰성 비용모형에 관한 연구)

  • Yang, Tae-Jin
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.9 no.2 / pp.171-177 / 2016
  • Software reliability is an important issue in the software development process, and software process improvement helps deliver a reliable software product. Infinite-failure NHPP software reliability models presented in the literature exhibit either constant, monotonically increasing, or monotonically decreasing failure-occurrence rates per fault. In this study, a software reliability cost model that takes into account the shape parameter of the lifetime distribution obtained from software product testing was investigated. The cost-comparison problem is presented for a reliability growth model based on the Lomax distribution, which is widely used in the reliability field; the infinite-failure non-homogeneous Poisson process was used as the software failure model, and the parameters were estimated by maximum likelihood. In the modification and maintenance of large software, the occurrence of defects can scarcely be avoided, so the optimal release time is the one that satisfies the reliability requirements while minimizing total cost. Comparative studies with other distributions, such as the Kappa and exponential distributions, would also be worthwhile. The results of this research are expected to help software developers estimate software development cost to some extent.
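The optimal-release-time idea can be sketched with the standard infinite-failure (record-value) construction m(t) = -ln(1-F(t)), which for a Lomax F gives m(t) = α·ln(1 + t/λ), together with a simple cost trade-off. The cost coefficients and parameters below are hypothetical, and this is an illustration of the technique, not the paper's fitted model:

```python
import math

def lomax_mvf(t, shape, scale):
    """Infinite-failure NHPP mean value function m(t) = -ln(1 - F(t)) for a Lomax F."""
    return shape * math.log(1.0 + t / scale)

def total_cost(t, shape, scale, c_test=1.0, c_fix=2.0, c_field=20.0, horizon=1000.0):
    """Illustrative release cost: testing effort + faults fixed during test
    + (more expensive) faults left to surface in the field before the horizon."""
    m_t = lomax_mvf(t, shape, scale)
    m_h = lomax_mvf(horizon, shape, scale)
    return c_test * t + c_fix * m_t + c_field * (m_h - m_t)

# Scan candidate release times for the minimum-cost point.
times = [t / 10 for t in range(1, 2000)]
best = min(times, key=lambda t: total_cost(t, shape=5.0, scale=30.0))
```

Because field fixes are assumed costlier than test fixes, the cost first falls (testing removes faults cheaply) and then rises (testing effort outweighs the dwindling fault discovery rate), giving an interior minimum that defines the release time.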

Developing a Traffic Accident Prediction Model for Freeways (고속도로 본선에서의 교통사고 예측모형 개발)

  • Mun, Sung-Ra;Lee, Young-Ihn;Lee, Soo-Beom
    • Journal of Korean Society of Transportation / v.30 no.2 / pp.101-116 / 2012
  • Accident prediction models have been utilized to predict the possibility of accidents on existing or planned freeways and to evaluate programs or policies for improving safety. In this study, a traffic accident prediction model for freeways was developed for these purposes. When selecting variables for the model, the highest priority was placed on the ease of both collecting the data and applying them in the model. The dependent variables were the number of total accidents and the number of accidents involving casualties, counted per IC (or JCT) section. As a result, two models were developed, an overall accident model and a casualty-related accident model, with the negative binomial distribution and the Poisson distribution as the respective error structures; the more appropriate of the two was selected by statistical estimation. Nine major national freeways were selected, and five years of data (2003~2007) were utilized. Explanatory variables were required to take either a predictable value, such as traffic volume, or a fixed value determined by geometric conditions. Maximum likelihood estimation showed that the significant variables of the overall accident model were the link length between ICs (or JCTs), the daily traffic volume (AADT), the bus volume ratio, and the number of curved segments between ICs (or JCTs); for the casualty-related accident model, the link length between ICs (or JCTs), the daily traffic volume (AADT), and the bus volume ratio had a significant impact. A likelihood ratio test was conducted to verify the spatial and temporal transferability of the estimated parameters of each model: the overall accident model could be transferred only to roads with four lanes or with six or more lanes, whereas the casualty-related accident model was transferable to every road and every time period. In conclusion, the model developed in this study can be extended to various applications in establishing future plans and evaluating policies.
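The choice between the two error structures in the abstract hinges on overdispersion: the Poisson forces variance = mean, while the negative binomial (NB2 parameterisation) allows variance = μ + αμ². A small sketch of the two probability mass functions, with illustrative numbers rather than the paper's data:

```python
import math

def poisson_pmf(k, lam):
    """P(Y = k) for a Poisson with mean lam (variance = mean)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def negbin_pmf(k, mu, alpha):
    """NB2 parameterisation: mean mu, variance mu + alpha * mu**2 (alpha -> 0 recovers Poisson)."""
    r = 1.0 / alpha
    coef = math.exp(math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1))
    p = r / (r + mu)
    return coef * p ** r * (1.0 - p) ** k

# For overdispersed accident counts (many low-accident IC sections, a few bad ones),
# the negative binomial puts noticeably more mass in the right tail than a
# Poisson with the same mean.
tail_nb = negbin_pmf(10, mu=2.0, alpha=0.5)
tail_pois = poisson_pmf(10, 2.0)
```

This is why the overall accident model (raw counts, typically overdispersed) ended up negative binomial while the rarer casualty counts could be handled by the Poisson.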

Analysis of Total Crime Count Data Based on Spatial Association Structure (공간적 연관구조를 고려한 총범죄 자료 분석)

  • Choi, Jung-Soon;Park, Man-Sik;Won, Yu-Bok;Kim, Hag-Yeol;Heo, Tae-Young
    • The Korean Journal of Applied Statistics / v.23 no.2 / pp.335-344 / 2010
  • The reliability of estimation is usually damaged when a linear regression model without spatial dependence is applied to spatial data. In this study, we considered the conditional autoregressive (CAR) model in order to construct spatial association structures, estimated the parameters via Bayesian approaches, and compared the performance of models with and without spatial effects. We analyzed the yearly total crime counts measured in each of the 25 districts of Seoul, South Korea, in 2007.
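The defining feature of a CAR structure is that each area's effect is conditioned on its neighbours through an adjacency matrix. A minimal sketch of the conditional mean under a common CAR specification (the 4-district adjacency and counts below are hypothetical, not the Seoul data):

```python
def car_conditional_mean(i, x, W, rho):
    """Conditional mean of area i under a CAR model:
    E[x_i | x_-i] = rho * (sum_j w_ij * x_j) / (sum_j w_ij)."""
    num = sum(W[i][j] * x[j] for j in range(len(x)) if j != i)
    den = sum(W[i][j] for j in range(len(x)) if j != i)
    return rho * num / den

# Hypothetical adjacency (1 = districts share a border) and district-level counts.
W = [[0, 1, 1, 0],
     [1, 0, 0, 1],
     [1, 0, 0, 1],
     [0, 1, 1, 0]]
x = [10.0, 20.0, 30.0, 40.0]
m0 = car_conditional_mean(0, x, W, rho=0.9)
```

The spatial-association parameter rho controls how strongly each district is shrunk toward its neighbours' average; rho = 0 recovers the independent (non-spatial) model the study compares against.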

Extreme Quantile Estimation of Losses in KRW/USD Exchange Rate (원/달러 환율 투자 손실률에 대한 극단분위수 추정)

  • Yun, Seok-Hoon
    • Communications for Statistical Applications and Methods / v.16 no.5 / pp.803-812 / 2009
  • The application of extreme value theory to financial data is a fairly recent innovation. The classical annual-maximum method fits the generalized extreme value distribution to the annual maxima of a data series. An alternative modern method, the so-called threshold method, fits the generalized Pareto distribution to the excesses over a high threshold in the data series. A more substantial variant takes the point-process viewpoint of high-level exceedances: the exceedance times and excess values over a high threshold are viewed as a two-dimensional point process whose limiting form is a non-homogeneous Poisson process. In this paper, we apply the two-dimensional non-homogeneous Poisson process model to daily losses (daily negative log-returns) in the KRW/USD exchange rate series, collected from January 4th, 1982 to December 31st, 2008. The main question is how to estimate extreme quantiles of losses, such as the 10-year or 50-year return level.
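Under the threshold/point-process view, the m-observation return level follows from the fitted GPD parameters and the empirical exceedance rate. A sketch of the standard formula; the fitted values plugged in below are hypothetical, not the paper's KRW/USD estimates:

```python
def gpd_return_level(u, sigma, xi, zeta_u, m):
    """m-observation return level under the peaks-over-threshold / GPD model (xi != 0):
    x_m = u + (sigma / xi) * ((m * zeta_u)**xi - 1),
    where zeta_u is the empirical exceedance probability P(X > u)."""
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

# Hypothetical fit to daily losses: threshold at a 2% daily loss, exceeded on
# 5% of days; roughly 250 trading days per year, so the 10-year level uses m = 2500.
level_10yr = gpd_return_level(u=2.0, sigma=0.8, xi=0.15, zeta_u=0.05, m=10 * 250)
level_50yr = gpd_return_level(u=2.0, sigma=0.8, xi=0.15, zeta_u=0.05, m=50 * 250)
```

With a positive shape parameter xi (heavy tail, typical for exchange-rate losses), the return level grows without bound in m, which is why the 50-year level sits well above the 10-year one.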