• Title/Summary/Keyword: curve number


Application of the Load Duration Curve (LDC) to Evaluate the Achievement Rate of Target Water Quality in the Han-River Watersheds

  • Kim, Eunkyoung; Ryu, Jichul; Kim, Hongtae; Kim, Yongseok; Shin, Dongseok
    • Journal of Korean Society on Water Environment / v.31 no.6 / pp.732-738 / 2015
  • Water quality in the four major river basins of Korea is managed under the Total Maximum Daily Load (TMDL) system. Each unit watershed in the TMDL system has been evaluated against its Target Water Quality (TWQ) using the average water quality, without considering flow volume. As a result, a unit watershed could meet its TWQ while failing to satisfy its allocated loads, and vice versa. To address this problem, TWQ assessments using the Load Duration Curve (LDC) have been studied in other watersheds. The purpose of this study was to evaluate TWQ achievement with the Flow Duration Curve (FDC) and the Load Duration Curve (LDC) at 26 unit watersheds in the Han River basin. The results showed that the achievement rates under the current assessment method and under the LDC method were 50~56% and 69~73%, respectively. Because the LDC method raised the achievement rate by about 20 percentage points, the number of exceeding unit watersheds in the Han River basin decreased by about 4~6.
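
The LDC evaluation described above converts each daily flow into an allowable daily load at the target concentration and checks observed loads against that curve. Below is a minimal sketch of that idea, assuming a simple kg/day unit conversion and synthetic data; it is illustrative only, not the authors' procedure.

```python
# Minimal sketch of a Load Duration Curve (LDC) check: rank daily flows by
# exceedance probability, convert the target concentration into an allowable
# daily load at each flow, and compare observed loads to that allowable curve.
import numpy as np

def load_duration_curve(flow_cms, target_mgL):
    """flow_cms: daily flows [m^3/s]; target_mgL: target water quality [mg/L].
    Returns (exceedance probability [%], allowable load [kg/day]), flow-sorted."""
    flow = np.sort(np.asarray(flow_cms, float))[::-1]            # descending flows
    exceed = 100.0 * np.arange(1, flow.size + 1) / (flow.size + 1)
    allowable_load = flow * target_mgL * 86.4                    # kg/day conversion
    return exceed, allowable_load

def achievement_rate(flow_cms, conc_mgL, target_mgL):
    """Fraction of observation days whose load is at or below the allowable load."""
    flow = np.asarray(flow_cms, float)
    observed_load = flow * np.asarray(conc_mgL, float) * 86.4
    allowable_load = flow * target_mgL * 86.4
    return float(np.mean(observed_load <= allowable_load))

# Example with synthetic daily data (illustrative values only)
rng = np.random.default_rng(0)
flow = rng.lognormal(2.0, 0.8, size=365)
conc = rng.lognormal(0.5, 0.3, size=365)
exceed, ldc = load_duration_curve(flow, target_mgL=2.0)
print(f"achievement rate: {achievement_rate(flow, conc, target_mgL=2.0):.2f}")
```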

IMPROVEMENT OF DOSE CALCULATION ACCURACY ON kV CBCT IMAGES WITH CORRECTED ELECTRON DENSITY TO CT NUMBER CURVE

  • Ahn, Beom Seok; Wu, Hong-Gyun; Yoo, Sook Hyun; Park, Jong Min
    • Journal of Radiation Protection and Research / v.40 no.1 / pp.17-24 / 2015
  • To improve the accuracy of dose calculation on kilovoltage cone beam computed tomography (kV CBCT) images, a custom-made phantom was fabricated to acquire an accurate CT number to electron density curve under the full-scatter conditions of the cone beam x-ray. To evaluate the dosimetric accuracy, 9 volumetric modulated arc therapy (VMAT) plans for head and neck (HN) cancer and 9 VMAT plans for lung cancer were generated with an anthropomorphic phantom. Both CT and CBCT images of the anthropomorphic phantom were acquired, and dose-volumetric parameters were calculated for each VMAT plan on the CT images with the CT density curve ($CT_{CT}$), on the CBCT images with the CT density curve ($CBCT_{CT}$), and on the CBCT images with the CBCT density curve ($CBCT_{CBCT}$). The differences between $CT_{CT}$ and $CBCT_{CT}$ were similar to those between $CT_{CT}$ and $CBCT_{CBCT}$ for the HN VMAT plans. However, the differences between $CT_{CT}$ and $CBCT_{CT}$ were larger than those between $CT_{CT}$ and $CBCT_{CBCT}$ for the lung VMAT plans. In particular, the differences in $D_{98%}$ and $D_{95%}$ of the lung target volume were statistically significant (4.7% vs. 0.8% with p = 0.033 for $D_{98%}$, and 4.8% vs. 0.5% with p = 0.030 for $D_{95%}$). To calculate dose distributions accurately on CBCT images, a CBCT density curve generated under full-scatter conditions should be used, especially for dose calculations in regions of large inhomogeneity.
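
Applying a CT-number-to-electron-density calibration of the kind discussed above usually amounts to a piecewise-linear lookup. The sketch below illustrates that step with hypothetical calibration points; the actual curve would come from the phantom measurement, not these values.

```python
# Apply a CT-number-to-relative-electron-density curve by piecewise-linear interpolation.
import numpy as np

# Hypothetical calibration points: (Hounsfield units, relative electron density)
hu_points  = np.array([-1000, -700, -90,    0,  300, 1200])
red_points = np.array([ 0.00, 0.29, 0.95, 1.00, 1.15, 1.70])

def hu_to_relative_electron_density(hu_image):
    """Map a CT/CBCT image in Hounsfield units to relative electron density."""
    return np.interp(hu_image, hu_points, red_points)

# Example: convert a small synthetic image
image = np.array([[-1000.0, -50.0], [40.0, 900.0]])
print(hu_to_relative_electron_density(image))
```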

Development of seismic fragility curves for high-speed railway system using earthquake case histories

  • Yang, Seunghoon; Kwak, Dongyoup; Kishida, Tadahiro
    • Geomechanics and Engineering / v.21 no.2 / pp.179-186 / 2020
  • Investigating the damage potential of railway infrastructure requires either a large number of case histories or in-depth numerical analyses, or both, which take considerable effort and time to carry out thoroughly. Rather than performing comprehensive studies for each damage case, in this study we collect and analyze a case history of the high-speed railway system damaged by the 2004 M6.6 Niigata Chuetsu earthquake to develop seismic fragility curves. The development process is: 1) slice the railway system into 200 m segments and assign a damage level and intensity measures (IMs) to each segment; 2) calculate the probability of damage for a given IM; 3) estimate fragility curves using the maximum likelihood estimation regression method. Among the IMs considered, spectral acceleration at a 3-second period has the most predictive power for the probability of damage occurrence. The viaduct-type structure also provides the least scattered probability data points, resulting in the best-fitted fragility curve, whereas the tunnel-type data are so scattered that the fitted fragility curve is not meaningful. For validation, the developed fragility curves were applied to the case history of the 2016 M7.0 Kumamoto earthquake, which damaged another high-speed railway system. The number of segments actually damaged in the 2016 event is 25, and the number of equivalent damaged segments predicted by the fragility curve is 22.21. The two numbers are very similar, indicating that the developed fragility curve fits the Kumamoto region well. Comparing with the railway fragility curves from HAZUS, we found that the HAZUS fragility curves are more conservative.
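
The fitting step above is commonly done by maximizing the likelihood of a lognormal fragility function P(damage | IM) = Φ((ln IM − ln θ)/β). The sketch below shows that step with synthetic segment data; it is not the authors' code, and the parameter values are purely illustrative.

```python
# Fit a lognormal fragility curve by maximum likelihood from binary damage data.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def fit_fragility(im, damaged):
    """im: intensity measure per segment; damaged: 0/1 damage flag per segment.
    Returns (theta, beta): median capacity and lognormal dispersion."""
    im = np.asarray(im, float)
    y = np.asarray(damaged, int)

    def neg_log_like(params):
        ln_theta, ln_beta = params            # optimize in log space so beta > 0
        beta = np.exp(ln_beta)
        p = norm.cdf((np.log(im) - ln_theta) / beta)
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

    res = minimize(neg_log_like, x0=[np.log(np.median(im)), np.log(0.5)])
    return np.exp(res.x[0]), np.exp(res.x[1])

# Example with synthetic segment data (illustrative only)
rng = np.random.default_rng(1)
im = rng.lognormal(mean=-1.0, sigma=0.6, size=200)      # e.g., Sa at 3 s, in g
damaged = (rng.random(200) < norm.cdf((np.log(im) + 1.2) / 0.4)).astype(int)
print(fit_fragility(im, damaged))
```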

AN EFFICIENT AND SECURE STRONG DESIGNATED VERIFIER SIGNATURE SCHEME WITHOUT BILINEAR PAIRINGS

  • Islam, Sk Hafizul; Biswas, G.P.
    • Journal of applied mathematics & informatics / v.31 no.3_4 / pp.425-441 / 2013
  • In the literature, several strong designated verifier signature (SDVS) schemes have been devised using elliptic curve bilinear pairing and a map-to-point (MTP) hash function. The bilinear pairing requires a supersingular elliptic curve group with a large number of elements, and its relative computation cost is approximately two to three times that of an elliptic curve point multiplication, which makes bilinear pairing an expensive operation. Moreover, the MTP function, which maps a user identity to an elliptic curve point, is more expensive than an elliptic curve scalar point multiplication. Hence, SDVS schemes built from bilinear pairing and the MTP hash function are not efficient in real environments. Thus, this paper proposes a cost-efficient SDVS scheme using pairing-free elliptic curve cryptography, which uses a general cryptographic hash function instead of the MTP hash function. The security analysis shows that our scheme is secure in the random oracle model under the hardness assumption of the CDH problem. In addition, the formal security validation of the proposed scheme is performed with the AVISPA tool (Automated Validation of Internet Security Protocols and Applications), which demonstrated that our scheme is unforgeable against passive and active attacks. Our scheme also satisfies the other properties of an SDVS scheme, including strongness, source hiding, non-transferability and unforgeability. A comparison of our scheme with others is given, showing that it outperforms them in terms of security, computation cost and bandwidth requirement.
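
The pairing-free primitive underlying such a scheme is ordinary elliptic curve scalar multiplication. The toy sketch below shows double-and-add on a small textbook curve (y^2 = x^3 + 2x + 2 over F_17) purely to illustrate the operation; it is not the paper's signing protocol, and a real implementation would use a standard curve and a vetted cryptographic library.

```python
# Toy elliptic-curve scalar multiplication (double-and-add) over a tiny prime field.
# Curve: y^2 = x^3 + 2x + 2 over F_17, generator G = (5, 1) of order 19.
P, A = 17, 2
G = (5, 1)
INF = None  # point at infinity

def ec_add(p1, p2):
    """Add two points on the toy curve (affine coordinates)."""
    if p1 is INF:
        return p2
    if p2 is INF:
        return p1
    (x1, y1), (x2, y2) = p1, p2
    if x1 == x2 and (y1 + y2) % P == 0:
        return INF                                   # vertical line: result is infinity
    if p1 == p2:
        s = (3 * x1 * x1 + A) * pow(2 * y1, -1, P) % P   # tangent slope
    else:
        s = (y2 - y1) * pow(x2 - x1, -1, P) % P          # chord slope
    x3 = (s * s - x1 - x2) % P
    y3 = (s * (x1 - x3) - y1) % P
    return (x3, y3)

def scalar_mult(k, point):
    """Compute k * point by double-and-add."""
    result, addend = INF, point
    while k:
        if k & 1:
            result = ec_add(result, addend)
        addend = ec_add(addend, addend)
        k >>= 1
    return result

# Example: a key pair (d, Q = d*G), the basic building block of pairing-free ECC schemes.
d = 7
Q = scalar_mult(d, G)
print("public key Q =", Q)
```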

Shear-fatigue behavior of high-strength reinforced concrete beams under repeated loading

  • Kwak, Kae-Hwan; Park, Jong-Gun
    • Structural Engineering and Mechanics / v.11 no.3 / pp.301-314 / 2001
  • The purpose of this experimental study is to investigate the damage mechanism in the shear-fatigue behavior of high-strength reinforced concrete beams under repeated loading. The relationship between the number of cycles and the deflection or strain, the crack growth and failure modes with increasing number of cycles, the fatigue strength, and the S-N curve were observed through fatigue tests. Based on the test results, the high-strength reinforced concrete beams failed at 57-66 percent of their static ultimate strength at 2 million cycles. The fatigue strength at 2 million cycles obtained from the S-N curves was about 60 percent of the static ultimate strength. The fatigue capacity of the high-strength reinforced concrete beams was similar to or lower than that of normal-strength reinforced concrete beams. The fatigue capacity of the normal-strength reinforced concrete beams improved by over 60 percent.
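
An S-N curve of the kind referenced above is often summarized by a log-linear fit S ≈ a + b·log10(N). The sketch below fits such a curve to hypothetical stress-level/cycle data; the numbers are invented, not the paper's measurements.

```python
# Fit an S-N curve of the assumed form S = a + b*log10(N) by least squares.
import numpy as np

# Hypothetical fatigue data: stress level S (fraction of static ultimate strength)
# and cycles to failure N.
S = np.array([0.80, 0.75, 0.70, 0.66, 0.62, 0.60])
N = np.array([2.0e4, 6.0e4, 2.5e5, 6.0e5, 1.4e6, 2.0e6])

b, a = np.polyfit(np.log10(N), S, deg=1)   # slope b, intercept a
print(f"S ~ {a:.3f} + ({b:.3f})*log10(N)")
print(f"predicted fatigue strength at 2 million cycles: {a + b*np.log10(2e6):.2f}")
```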

Numerical solution of singular integral equation for multiple curved branch-cracks

  • Chen, Y.Z.; Lin, X.Y.
    • Structural Engineering and Mechanics / v.34 no.1 / pp.85-95 / 2010
  • In this paper, the numerical solution of the singular integral equation for multiple curved branch cracks is investigated. If a quadrature rule is used, one difficulty in the problem is balancing the number of unknowns against the number of equations. This difficulty was overcome by taking the following steps: (a) placing a point dislocation at the intersection point of the branches, (b) using the curve length method to convert the integral on the curve to an integral on the real axis, and (c) using a semi-open quadrature rule in the integration. After taking these steps, the number of unknowns equals the number of resulting algebraic equations, which is a particular advantage of the suggested method. In addition, accurate results for the stress intensity factors (SIFs) at the crack tips were obtained in a numerical example. Finally, several numerical examples are given to illustrate the efficiency of the presented method.
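
Step (b) above reduces an integral along a curved branch to an integral over its arc-length parameter, which can then be handled with a semi-open rule that avoids the endpoints. The sketch below illustrates this with a midpoint-type rule on a circular arc; it is a simplified illustration, not the authors' quadrature scheme for the singular kernel.

```python
# Integrate a density along a curve parameterized by arc length, using a
# semi-open (midpoint-type) rule whose nodes never touch the curve endpoints.
import numpy as np

def integrate_along_curve(curve, density, length, n):
    """curve(s): point at arc length s in [0, length]; density(z): integrand there."""
    h = length / n
    s = (np.arange(n) + 0.5) * h                     # midpoint nodes, endpoints excluded
    return h * np.sum([density(curve(sk)) for sk in s])

# Example: a circular-arc branch of radius R and opening angle pi/2
R = 1.0
arc_len = R * np.pi / 2
curve = lambda s: R * np.exp(1j * s / R)             # arc-length parameterization
value = integrate_along_curve(curve, lambda z: abs(z) ** 2, arc_len, n=64)
print(value)                                         # equals R^2 * arc length = pi/2 here
```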

Deep learning classifier for the number of layers in the subsurface structure

  • Kim, Ho-Chan; Kang, Min-Jae
    • International journal of advanced smart convergence / v.10 no.3 / pp.51-58 / 2021
  • In this paper, we propose a deep learning classifier for estimating the number of layers in the Earth's subsurface structure. When installing a grounding system, knowledge of the subsurface in the area is essential. The subsurface structure can be modeled by earth parameters, and knowing the exact number of layers can significantly reduce the amount of computation needed to estimate those parameters. The classifier consists of a feedforward neural network trained on apparent resistivity curves. The apparent resistivity at 20 equally spaced log points on each curve is used as the input feature vector of the classifier. The apparent resistivity curve data sets are collected either by theoretical calculation or by the Wenner measurement method. The classifier is coded with Keras, an open source neural network library written in Python, and the model converges with close to 100% accuracy.
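
A feedforward classifier of the kind described above can be expressed compactly in Keras. The sketch below assumes two hidden layers and four layer-count classes, which are illustrative choices, and trains on placeholder data; the paper's exact architecture and data sets may differ.

```python
# Feedforward Keras classifier mapping 20 log-spaced apparent-resistivity samples
# to a class representing the number of subsurface layers.
import numpy as np
from tensorflow import keras

num_features, num_classes = 20, 4   # 20 resistivity samples; e.g., 2- to 5-layer models

model = keras.Sequential([
    keras.Input(shape=(num_features,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Placeholder training data; real inputs would be computed or measured apparent
# resistivity curves sampled at 20 equally spaced log points.
x_train = np.random.rand(1000, num_features).astype("float32")
y_train = np.random.randint(0, num_classes, size=1000)
model.fit(x_train, y_train, epochs=5, batch_size=32, verbose=0)
```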

A Study on the Wear Characteristics of R/S Passing through Curves

  • Lee, Hi-Sung
    • Journal of the Korean Society for Railway / v.10 no.6 / pp.772-778 / 2007
  • To assess the wear characteristics of the Saemaul train passing through curves, an analysis model for a multi-car system was developed. Using this model and ADAMS/Rail, sensitivity analyses of the wear characteristics were conducted by varying the related parameters. At low speed, the wear number and the sliding mean of the right wheel were higher than those of the left wheel, while at high speed those of the left wheel were higher than those of the right wheel. As the curve radius decreased, the wear number and the sliding mean increased. When the length of the transition curve was increased, the wear number and the sliding mean increased, and they also increased with increasing cant.
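
The wear number referenced above is conventionally the energy dissipation index Tγ, the sum of creep force times creepage at the wheel-rail contact. The sketch below computes it from hypothetical contact values; in the study these quantities come from the ADAMS/Rail simulation, not hand-entered numbers.

```python
# Wear number (T-gamma) as commonly defined in rail vehicle dynamics.
def wear_number(Tx, Ty, gamma_x, gamma_y):
    """Tx, Ty: longitudinal/lateral creep forces [N]; gamma_x, gamma_y: creepages [-]."""
    return abs(Tx * gamma_x) + abs(Ty * gamma_y)

# Example contact values (hypothetical)
print(wear_number(Tx=8.0e3, Ty=5.0e3, gamma_x=0.002, gamma_y=0.004))  # -> 36.0 N
```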

A Smoothing Method for Digital Curve by Iterative Averaging with Controllable Error

  • Lyu, Sung-Pil
    • Journal of KIISE / v.42 no.6 / pp.769-780 / 2015
  • Smoothing a digital curve by averaging its connected points is widely employed to reduce sharp changes in the curve that are generally introduced by noise. Choosing an appropriate degree of smoothing is critical, since the area and features of the original shape can be distorted at a higher degree, while the noise is insufficiently removed at a lower degree. In this paper, we derive a mathematical relationship between the parameters involved, such as the number of iterations, the average distance between neighboring points, the weighting factors used for averaging, and the distance a point on the curve moves after smoothing. Based on these findings, we propose a method to control the smoothed curve so that its deviation is bounded by a specified error level, as well as a way to significantly expedite smoothing of a pixel-based digital curve.
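
The iterative averaging described above can be sketched as repeated weighted averaging of each point with its neighbors, stopped once the displacement from the original curve would exceed a prescribed bound. The weighting scheme and stopping rule below are assumptions for illustration, not the paper's exact formulation.

```python
# Smooth a closed digital curve by iterative neighbor averaging, bounding how far
# any point is allowed to drift from its original position.
import numpy as np

def smooth_curve(points, weight=0.25, max_iters=100, max_deviation=1.0):
    """points: (N, 2) closed curve; weight: neighbor weight in [0, 0.5];
    max_deviation: bound on each point's displacement from its original position."""
    original = np.asarray(points, dtype=float)
    current = original.copy()
    for _ in range(max_iters):
        neighbors = 0.5 * (np.roll(current, 1, axis=0) + np.roll(current, -1, axis=0))
        candidate = (1 - 2 * weight) * current + 2 * weight * neighbors
        if np.max(np.linalg.norm(candidate - original, axis=1)) > max_deviation:
            break                      # stop before exceeding the allowed deviation
        current = candidate
    return current

# Example: a noisy circle sampled at pixel-like resolution
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
noisy = np.c_[np.cos(t), np.sin(t)] * 50 + np.random.default_rng(2).normal(0, 0.8, (200, 2))
smoothed = smooth_curve(noisy, weight=0.25, max_deviation=1.0)
print(np.max(np.linalg.norm(smoothed - noisy, axis=1)))   # <= 1.0 by construction
```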