• Title/Summary/Keyword: Laplace distribution

138 search results

A Study on the Dynamic Characteristics of a Composite Beam with a Transverse Open Crack (크랙이 존재하는 복합재료 보의 동적 특성 연구)

  • 하태완;송오섭
    • Journal of KSNVE
    • /
    • v.9 no.5
    • /
    • pp.1019-1028
    • /
    • 1999
  • Free vibration characteristics of cantilevered laminated composite beams with a transverse non-propagating open crack are investigated. In the present analysis a special ply-angle distribution, referred to as an asymmetric stiffness configuration, inducing elastic coupling between chord-wise bending and extension is considered. The open crack is modelled as an equivalent rotational spring whose spring constant is calculated on the basis of the fracture mechanics of composite material structures. Governing equations of a composite beam with an open crack are derived via Hamilton's principle and Timoshenko beam theory, encompassing transverse shear and rotary inertia effects. The effects of various parameters such as the ply angle, fiber volume fraction, crack depth, crack position and transverse shear on the free vibration characteristics of the cracked beam are highlighted. The numerical results show that the natural frequencies obtained from Timoshenko beam theory are always lower than those from Euler beam theory. The presence of intrinsic cracks in anisotropic composite beams modifies the flexibility and, in turn, the free vibration characteristics of the structures. It is revealed that non-destructive crack detection is possible by analyzing the free vibration responses of a cracked beam.

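The uncracked Euler-Bernoulli cantilever provides the baseline that the abstract's Timoshenko results are compared against. A minimal sketch, with illustrative helper names not taken from the paper (the crack's equivalent rotational spring would modify this characteristic equation), solving the clamped-free frequency equation cos(βL)cosh(βL) + 1 = 0 by bisection:

```python
import math

def cantilever_roots(n_modes=3, tol=1e-12):
    """Roots of cos(x)cosh(x) + 1 = 0 (clamped-free Euler-Bernoulli beam)."""
    f = lambda x: math.cos(x) * math.cosh(x) + 1.0
    roots, x = [], 0.5
    while len(roots) < n_modes:
        a, b = x, x + 0.5
        if f(a) * f(b) < 0:          # sign change: bracket a root
            while b - a > tol:       # plain bisection
                m = 0.5 * (a + b)
                if f(a) * f(m) <= 0:
                    b = m
                else:
                    a = m
            roots.append(0.5 * (a + b))
        x += 0.5
    return roots

def natural_frequencies(E, I, rho, A, L, n_modes=3):
    """omega_n = (beta_n L)^2 * sqrt(E I / (rho A)) / L^2, in rad/s."""
    return [(r ** 2) * math.sqrt(E * I / (rho * A)) / L ** 2
            for r in cantilever_roots(n_modes)]
```

The first three roots come out near the textbook values 1.875, 4.694 and 7.855; a crack-induced drop in stiffness lowers the resulting frequencies, which is what makes the vibration-based crack detection described above possible.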

Reliability analysis of a complex system, attended by two repairmen with vacation under marked process with the application of copula

  • Tiwari, N.;Singh, S.B.;Ram, M.
    • International Journal of Reliability and Applications
    • /
    • v.11 no.2
    • /
    • pp.107-122
    • /
    • 2010
  • This paper deals with the reliability analysis of a complex system consisting of two subsystems, A and B, connected in series. Subsystem A has only one unit and B has two units, $B_1$ and $B_2$. A marked process has been applied to model the complex system. The present reliability model incorporates two repairmen, a supervisor and a novice, to repair the failed units. The supervisor is always available, whereas the novice remains on vacation and is called in for repair as per demand. The repair rates for the supervisor and the novice follow general and exponential distributions respectively, and the failure times of both subsystems follow exponential distributions. The model is analyzed under the "head-of-line" repair discipline. By employing the supplementary variable technique, Laplace transformation and the Gumbel-Hougaard family of copulas, various transition-state probabilities, reliability, availability and cost analyses have been obtained, along with the steady-state behaviour of the system. Finally, some special cases of the system are considered.

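The Gumbel-Hougaard copula used above to couple the two repair distributions has the closed form C_θ(u, v) = exp(−[(−ln u)^θ + (−ln v)^θ]^(1/θ)) with θ ≥ 1. A minimal sketch (the function name is illustrative, not from the paper):

```python
import math

def gumbel_hougaard(u, v, theta):
    """Gumbel-Hougaard copula:
    C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1.
    theta = 1 reduces to the independence copula u * v."""
    if not (0 < u <= 1 and 0 < v <= 1):
        raise ValueError("u and v must lie in (0, 1]")
    if theta < 1:
        raise ValueError("theta must be >= 1")
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))
```

As θ grows, the copula approaches min(u, v) (perfect positive dependence), which is how a single θ tunes the dependence between the coupled repair-time margins.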

The Study for Software Future Forecasting Failure Time Using ARIMA AR(1) (ARIMA AR(1) 모형을 이용한 소프트웨어 미래 고장 시간 예측에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal
    • /
    • v.8 no.2
    • /
    • pp.35-40
    • /
    • 2008
  • Software failure times presented in the literature exhibit either constant, monotonically increasing or monotonically decreasing patterns. For data analysis of a software reliability model, data-scale tools of trend analysis have been developed. The methods of trend analysis are the arithmetic mean test and the Laplace trend test; trend analysis offers only outline information. In this paper, we discuss forecasting failure times in the case of failure-time censoring. The software failure time data used for forecasting are random numbers drawn from a Weibull distribution (shape parameter 1, scale parameter 0.5). Using these data, we propose an ARIMA(AR(1)) model and a simulation method for forecasting failure times. The practical ARIMA method is presented.

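A rough sketch of the two ingredients the abstract names — the Laplace trend statistic (here in its time-truncated form) and a one-step AR(1) forecast — applied to Weibull(shape 1, scale 0.5) inter-failure times as in the paper's setup; helper names are illustrative, not the paper's code:

```python
import random

def laplace_trend(times, T):
    """Laplace trend statistic for event times in [0, T]:
    u = (mean(t_i) - T/2) / (T * sqrt(1/(12 n))).
    u near 0: no trend; u >> 0: reliability decay; u << 0: reliability growth."""
    n = len(times)
    return (sum(times) / n - T / 2.0) / (T * (1.0 / (12.0 * n)) ** 0.5)

def fit_ar1(x):
    """Least-squares AR(1) fit: x_t - m = phi * (x_{t-1} - m) + e_t."""
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t - 1] - m) for t in range(1, len(x)))
    den = sum((x[t - 1] - m) ** 2 for t in range(1, len(x)))
    return m, num / den

def forecast_next(x):
    """One-step-ahead AR(1) forecast of the next inter-failure time."""
    m, phi = fit_ar1(x)
    return m + phi * (x[-1] - m)

random.seed(1)
# Inter-failure times: Weibull, shape 1, scale 0.5 (weibullvariate(alpha=scale, beta=shape))
gaps = [random.weibullvariate(0.5, 1.0) for _ in range(100)]
times, acc = [], 0.0
for g in gaps:
    acc += g
    times.append(acc)
u = laplace_trend(times, times[-1])
nxt = forecast_next(gaps)
```

Because shape 1 makes the gaps exponential (a homogeneous Poisson process), the trend statistic should hover near zero and the fitted phi near zero, so the forecast stays close to the mean gap of about 0.5.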

Analysis on particle deposition onto a heated, horizontal free-standing wafer with electrostatic effect (정전효과가 있는 가열 수평웨이퍼로의 입자침착에 관한 해석)

  • Yoo, Kyung-Hoon;Oh, Myung-Do;Myong, Hyon-Kook
    • Transactions of the Korean Society of Mechanical Engineers B
    • /
    • v.21 no.10
    • /
    • pp.1284-1293
    • /
    • 1997
  • The electrostatic effect on particle deposition onto a heated, horizontal free-standing wafer surface was investigated numerically. The deposition mechanisms considered were convection, Brownian and turbulent diffusion, sedimentation, thermophoresis and electrostatic force. The electric charge on a particle, needed to calculate the electrostatic migration velocity induced by the local electric field, was assumed to be the Boltzmann equilibrium charge. The electrostatic forces acting upon the particle included the Coulombic, image, dielectrophoretic and dipole-dipole forces, based on the assumption that the particle and wafer surface are conducting. The electric potential distribution needed to calculate the local electric field around the wafer was obtained from the Laplace equation. The averaged and local deposition velocities were obtained for a temperature difference of 0-10 K and an applied voltage of 0-1000 V. The numerical results were then compared with those of the approximate model suggested in the present study and the available experimental data, and the comparison showed relatively good agreement.
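The potential field in that study comes from the Laplace equation ∇²φ = 0. A generic finite-difference sketch — Jacobi iteration on a Cartesian grid with Dirichlet boundaries; the paper's wafer geometry and boundary conditions will of course differ:

```python
def solve_laplace(nx, ny, top, bottom, left, right, iters=2000):
    """Jacobi iteration for the 2-D Laplace equation with fixed boundary potentials.
    Returns the ny-by-nx grid of potentials after `iters` sweeps."""
    phi = [[0.0] * nx for _ in range(ny)]
    for j in range(nx):                  # top and bottom boundaries
        phi[0][j] = bottom
        phi[-1][j] = top
    for i in range(ny):                  # left and right boundaries
        phi[i][0] = left
        phi[i][-1] = right
    for _ in range(iters):
        new = [row[:] for row in phi]
        for i in range(1, ny - 1):
            for j in range(1, nx - 1):
                # each interior point relaxes to the average of its 4 neighbours
                new[i][j] = 0.25 * (phi[i - 1][j] + phi[i + 1][j]
                                    + phi[i][j - 1] + phi[i][j + 1])
        phi = new
    return phi
```

The local electric field then follows from finite differences of φ, e.g. E_x ≈ −(φ[i][j+1] − φ[i][j−1]) / (2Δx), which is the field that drives the electrostatic migration velocity.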

History of the Error and the Normal Distribution in the Mid Nineteenth Century (19세기 중반 오차와 정규분포의 역사)

  • Jo, Jae-Keun
    • Communications for Statistical Applications and Methods
    • /
    • v.15 no.5
    • /
    • pp.737-752
    • /
    • 2008
  • Around 1800, mathematicians combined the analysis of errors and probability theory into error theory. After being developed by Gauss and Laplace, error theory was widely used in branches of natural science. Motivated by the successful applications of error theory in the natural sciences, scientists like Adolphe Quetelet tried to incorporate social statistics into error theory. But there were quite a few differences between social science and natural science. In this paper we discuss the topics raised at that time. The problems considered are as follows: the interpretation of the individual in society; the arguments against statistical methods; and the history of measures of diversity. From the successes and failures of the 19th-century social statisticians, we can see how statistics became a science essential to both the natural and social sciences, and that those problems, which were not easy for the 19th-century social statisticians to solve, still matter today.

The Study for Process Capability Analysis of Software Failure Interval Time (소프트웨어 고장 간격 시간에 대한 공정능력분석에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal
    • /
    • v.7 no.2
    • /
    • pp.49-55
    • /
    • 2007
  • Software failure times presented in the literature exhibit either constant, monotonically increasing or monotonically decreasing patterns. For data analysis of a software reliability model, data-scale tools of trend analysis have been developed. The methods of trend analysis are the arithmetic mean test and the Laplace trend test; trend analysis offers only outline information. Going beyond this analysis, a new attempt is needed from the quality-control side. In this paper, we discuss process capability analysis using process capability indices. Because software failure interval times take only nonnegative values, instead of a capability analysis assuming a normal distribution, a capability analysis of the process distribution using the Box-Cox transformation is attempted. The software failure time data used for the process capability analysis is SS3; the results of the analysis are listed in Chapters 4 and 5, and the practical use is presented.

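The recipe in the abstract — transform the nonnegative failure-interval data with Box-Cox, then compute capability indices on the transformed values — can be sketched as follows (function names and specification limits are illustrative, not the paper's):

```python
import math

def boxcox(x, lam):
    """Box-Cox transform of positive data:
    y = (x^lam - 1)/lam for lam != 0, y = ln(x) for lam == 0."""
    if lam == 0:
        return [math.log(v) for v in x]
    return [(v ** lam - 1.0) / lam for v in x]

def capability(x, lsl, usl):
    """Process capability indices on (assumed near-normal) data:
    Cp  = (USL - LSL) / (6 sigma)
    Cpk = min(USL - mu, mu - LSL) / (3 sigma)."""
    n = len(x)
    mu = sum(x) / n
    sd = math.sqrt(sum((v - mu) ** 2 for v in x) / (n - 1))
    cp = (usl - lsl) / (6.0 * sd)
    cpk = min(usl - mu, mu - lsl) / (3.0 * sd)
    return cp, cpk
```

The point of the Box-Cox step is exactly the one the abstract makes: Cp and Cpk presuppose rough normality, so they are computed on the transformed intervals (with limits transformed the same way), not on the raw skewed data.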

Highly-closed/-Open Porous Ceramics with Micro-Beads by Direct Foaming

  • Jang, Woo Young;Seo, Dong Nam;Park, Jung Gyu;Kim, Hyung Tae;Lee, Sung Min;Kim, Suk Young;Kim, Ik Jin
    • Journal of the Korean Ceramic Society
    • /
    • v.53 no.6
    • /
    • pp.604-609
    • /
    • 2016
  • This study reports on wet-foam stability with respect to porous ceramics from a particle-stabilized colloidal suspension that is achieved through the addition of polymethyl methacrylate (PMMA) using a wet process. To stabilize the wet foam, an initial colloidal suspension of $Al_2O_3$ was partially hydrophobized by the surfactant propyl gallate (2 wt.%), and $SiO_2$ was added as a stabilizer. The influence of the PMMA content on the bubble size, pore size, and pore distribution in terms of the contact angle, surface tension, adsorption free energy, and Laplace pressure is described in this paper. The results show a wet-foam stability of more than 83%, which corresponds to a particle free energy of $2.7{\times}10^{-12}J$ and a pressure difference of 61.1 mPa for colloidal particles with 20 wt.% of PMMA beads. It was possible to control the uniform distribution of the open/closed pores by increasing the PMMA content and by adding thick struts, leading to a higher-stability wet foam for use in porous ceramics.
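Two of the quantities the abstract reports, the Laplace pressure difference across a bubble interface and the free energy of particle attachment, follow from standard closed forms. A sketch using the textbook expressions (the paper's exact evaluation may differ):

```python
import math

def laplace_pressure(gamma, r):
    """Young-Laplace pressure jump across a spherical gas-liquid interface:
    delta_p = 2 * gamma / r   (gamma in N/m, r in m, result in Pa)."""
    return 2.0 * gamma / r

def attachment_energy(r, gamma, theta_deg):
    """Free energy to detach a spherical particle of radius r from an interface:
    E = pi r^2 gamma (1 - |cos theta|)^2, maximal at contact angle 90 degrees."""
    c = abs(math.cos(math.radians(theta_deg)))
    return math.pi * r * r * gamma * (1.0 - c) ** 2
```

The attachment-energy formula shows why partial hydrophobization with propyl gallate matters: pushing the contact angle toward 90° maximizes the energy holding particles at the bubble surface, which is what stabilizes the wet foam.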

Propagation of Tsunamis Generated by Seabed Motion with Time-History and Spatial-Distribution: An Analytical Approach (시간이력 및 공간분포를 지닌 지반운동에 의한 지진해일 발생 및 전파: 해석적 접근)

  • Jung, Taehwa;Son, Sangyoung
    • Journal of Korean Society of Coastal and Ocean Engineers
    • /
    • v.30 no.6
    • /
    • pp.263-269
    • /
    • 2018
  • Changes in water depth caused by underwater earthquakes and landslides produce sea surface undulations, which propagate to the coast and cause significant damage as wave heights increase through the wave shoaling process. Various types of numerical models have been developed to simulate the generation and propagation of tsunami waves. Most tsunami models determine the initial water surface on the assumption that the movement of the seabed is immediately and identically transmitted to the sea surface. However, this approach does not take into account the characteristics of underwater earthquakes that occur with a time history and spatial variation, so this incomplete description of the initial generation of tsunami waves is carried directly into the simulation error. In this study, the analytical solution proposed by Hammack (1973) was applied in the tsunami model in order to simulate the generation of the initial water surface elevation by a change of water depth with a time history, and its propagation. The developed solution is expected to clarify the relationship among various types of seabed motion, initial surface undulations, and the wave speeds of the elevated water surfaces.
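In the linear theory underlying Hammack's solution, a bottom displacement component of wavenumber k in depth h reaches the free surface attenuated by 1/cosh(kh) (so the seabed motion is not transmitted identically, which is the paper's point), and then propagates at the linear dispersion speed. A minimal per-component sketch with illustrative names, not the paper's full time-history solution:

```python
import math

def surface_response(a, k, h):
    """Free-surface amplitude produced by an impulsive sinusoidal bottom
    displacement of amplitude a and wavenumber k in water of depth h:
    eta = a / cosh(k h)   (the depth acts as a low-pass filter)."""
    return a / math.cosh(k * h)

def phase_speed(k, h, g=9.81):
    """Linear dispersion relation: c = sqrt(g * tanh(k h) / k).
    For k h << 1 this tends to the shallow-water speed sqrt(g h)."""
    return math.sqrt(g * math.tanh(k * h) / k)
```

Short-wavelength components of the seabed motion (large kh) are almost entirely filtered out before reaching the surface, while long tsunami-scale components pass through nearly unchanged and travel at roughly sqrt(gh).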

The Study of Infinite NHPP Software Reliability Model from the Intercept Parameter using Linear Hazard Rate Distribution (선형위험률분포의 절편모수에 근거한 무한고장 NHPP 소프트웨어 신뢰모형에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.9 no.3
    • /
    • pp.278-284
    • /
    • 2016
  • Software reliability is an important issue in the software development process. In infinite-failure NHPP software reliability models, the fault occurrence rate may follow a constant, monotonically increasing or monotonically decreasing pattern. In this paper, infinite-failure NHPP models reflecting the situation in which faults occur during the repair time are presented and their properties compared. The linear hazard rate distribution, commonly used in business economics and actuarial modeling, is adopted as the basis of an infinite-failure software reliability model characterized by its intercept parameter, and is presented for the comparison problem. The results show that a relatively large intercept parameter appears effective. Parameter estimation was carried out by maximum likelihood estimation, and model selection was performed using the mean square error and the coefficient of determination. The linear hazard rate distribution model is also efficient in terms of reliability (its coefficient of determination is 90% or more), confirming that it can be used as an alternative to the conventional models in this field. From this paper, software developers should consider the intercept parameter of the life distribution, informed by prior knowledge of the software, to help identify failure modes.
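On one plausible reading of the model above — an infinite-failure NHPP whose intensity follows the linear hazard rate λ(t) = a + bt, with a the intercept parameter — the mean value function and the two selection criteria the abstract names can be sketched as follows (an assumption-laden sketch, not the paper's estimation code):

```python
def mean_value(t, a, b):
    """Mean value function of an infinite-failure NHPP with intensity
    lambda(t) = a + b t (a: intercept parameter): m(t) = a t + b t^2 / 2."""
    return a * t + 0.5 * b * t * t

def mse(observed, predicted):
    """Mean square error between observed and predicted cumulative counts."""
    n = len(observed)
    return sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n

def r_squared(observed, predicted):
    """Coefficient of determination, the model-selection criterion the paper
    reports at 90% or more for the linear hazard rate model."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```

A larger intercept a raises m(t) uniformly from t = 0, which is one way to read the paper's finding that a relatively large intercept parameter is effective.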

A Bayesian Poisson model for analyzing adverse drug reaction in self-controlled case series studies (베이지안 포아송 모형을 적용한 자기-대조 환자군 연구에서의 약물상호작용 위험도 분석)

  • Lee, Eunchae;Hwang, Beom Seuk
    • The Korean Journal of Applied Statistics
    • /
    • v.33 no.2
    • /
    • pp.203-213
    • /
    • 2020
  • A self-controlled case series (SCCS) study measures the relative risk of an adverse event during the exposure period by setting the patient's non-exposure period as the control period, without a separate control group. This method minimizes the bias that occurs when selecting a control group and is often used to measure the risk of adverse events after taking a drug. This study used SCCS to examine the increased risk of side effects when two or more drugs are used in combination. A conditional Poisson model is assumed and analyzed for the drug interaction between the narcotic analgesic tramadol and frequently co-administered drugs. Bayesian inference is used to solve the overfitting problem of maximum likelihood estimation, and normal and Laplace prior distributions are used to measure the sensitivity to the choice of prior.
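The contrast the abstract draws between the MLE and a Laplace-prior Bayesian fit can be illustrated on the simplest possible case: a single Poisson log-rate β with a Laplace(0, b) prior. A grid-search MAP sketch (not the paper's conditional SCCS likelihood or its sampler; names are illustrative):

```python
import math

def neg_log_post(beta, y, scale):
    """Negative log posterior (constants dropped) for counts y ~ Poisson(exp(beta))
    with a Laplace(0, scale) prior on beta:
    n e^beta - beta * sum(y) + |beta| / scale."""
    return len(y) * math.exp(beta) - beta * sum(y) + abs(beta) / scale

def map_estimate(y, scale, lo=-5.0, hi=5.0, steps=100001):
    """MAP for beta by dense grid search; with a weak prior (large scale)
    this approaches the MLE, beta = ln(mean(y))."""
    best_b, best_v = lo, float("inf")
    for i in range(steps):
        b = lo + (hi - lo) * i / (steps - 1)
        v = neg_log_post(b, y, scale)
        if v < best_v:
            best_b, best_v = b, v
    return best_b
```

A small prior scale shrinks the estimate toward zero (here, all the way to β = 0), which is the regularizing behaviour the paper relies on to tame the MLE's overfitting; the paper's normal prior plays the same role with quadratic rather than absolute-value shrinkage.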