• Title/Summary/Keyword: 확률 모델 (probability model)


Impact Assessment of Sea-Level Rise based on Coastal Vulnerability Index (연안 취약성 지수를 활용한 해수면 상승 영향평가 방안 연구)

  • Lee, Haemi;Kang, Tae soon;Cho, Kwangwoo
    • Journal of Korean Society of Coastal and Ocean Engineers / v.27 no.5 / pp.304-314 / 2015
  • We review the current status of coastal vulnerability indices (CVI) to guide the development of an appropriate CVI for the Korean coast, and apply a methodology to the east coast of Korea to quantify coastal vulnerability under future sea-level rise. The CVIs reviewed include the USGS CVI, the sea-level rise CVI, the compound CVI, and the multi-scale CVI. The USGS CVI, expressed in terms of the external forcings of sea-level rise, waves and tide, and the adaptive capacities of morphology, erosion and slope, is adopted here for CVI quantification. The range of CVI is 1.826~22.361 with a mean of 7.085 for the present condition, and increases to 2.887~30.619 with a mean of 12.361 for the year 2100 (1 m sea-level rise). The "VERY HIGH" class currently covers 8.57% of the coast and occupies 35.56% in 2100. The pattern of CVI change under sea-level rise differs across local areas, with Gangneung, Yangyang and Goseong showing the highest increases. Land use in the "VERY HIGH" class is dominated both by human systems (housing complexes, roads, cropland, etc.) and by natural systems (sand, wetland, forestry, etc.), which suggests that existing land utilization should be reframed in the era of climate change. Although the CVI approach is highly efficient for handling the large set of climate scenarios entailed in climate impact assessment under uncertainty, we also propose a three-level assessment for applying the CVI methodology to site-specific adaptation: a first screening assessment by CVI, a second scoping assessment by impact model, and a final risk quantification with the results of the impact model.
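The USGS-style index adopted in this abstract is conventionally computed as the square root of the product of the ranked variables divided by their number. The sketch below illustrates that convention only; the 1-5 ranks and the example cell are hypothetical, not taken from the paper's ranking tables.

```python
import math

def usgs_cvi(ranks):
    """USGS-style coastal vulnerability index: square root of the
    product of the ranked variables divided by their count."""
    if not all(1 <= r <= 5 for r in ranks):
        raise ValueError("each variable must be ranked 1 (low) to 5 (very high)")
    return math.sqrt(math.prod(ranks) / len(ranks))

# Hypothetical ranks for one shoreline cell, in the order of the six
# variables named above: sea-level rise, wave, tide, morphology,
# erosion, slope.
cell = [5, 3, 2, 3, 4, 4]
cvi = usgs_cvi(cell)
```

With all six variables ranked 5 the index reaches its maximum, and with all ranked 1 its minimum, so cells can be binned into classes such as "VERY HIGH" by quantiles of the resulting range.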

Assessment of Earthquake Induced Landslide Susceptibility with Variation of Groundwater Level (지하수위 변화에 따른 지진 유발 산사태의 취약성 분석)

  • Kim, Ji-Seok;Park, Hyuek-Jin;Lee, Jung-Hyun
    • Economic and Environmental Geology / v.44 no.4 / pp.289-302 / 2011
  • Since the frequency of earthquake occurrence on the Korean peninsula is continuously increasing, the possibility that massive landslides will be triggered by earthquakes is also growing in Korea. Landslides were previously thought to be induced only by large-magnitude earthquakes (magnitude greater than 6.0); however, landslides can be triggered even by small-magnitude earthquakes, especially in fully saturated soil. Therefore, this study analyzes the susceptibility to landslides caused by small-magnitude earthquakes in fully saturated soil. The topographical and geological characteristics of the site were obtained and managed with GIS software, and slope angle, cohesion, friction angle, and unit weight of soil were compiled into spatial database layers. Combining these data sets in a dynamic model based on Newmark's displacement analysis, landslide displacements were estimated for each grid cell. To examine the possibility of earthquake-induced landslides, the groundwater table was varied from dry to 80% saturated soil. In addition, to analyze the effects of earthquake magnitude and distance to the epicenter, four different earthquake epicenters were considered in the study area.
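A minimal sketch of the mechanics behind such an analysis (not the paper's GIS implementation): in a Newmark analysis the static factor of safety of an infinite slope falls as the saturated fraction of the slide depth rises, and the critical acceleration falls with it. All parameter values below are hypothetical.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, gamma_w, t, slope_deg, m):
    """Static factor of safety of an infinite slope.
    c: cohesion (kPa), phi_deg: friction angle, gamma: soil unit weight
    (kN/m^3), gamma_w: unit weight of water, t: slide depth (m),
    m: fraction of the slide depth that is saturated (0=dry, 1=full)."""
    a = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    return (c / (gamma * t * math.sin(a) * math.cos(a))
            + math.tan(phi) / math.tan(a)
            - m * gamma_w * math.tan(phi) / (gamma * math.tan(a)))

def critical_acceleration(fs, slope_deg):
    """Newmark critical acceleration (in units of g); shaking below this
    level produces no permanent displacement. A negative value means the
    slope is statically unstable even without shaking."""
    return (fs - 1.0) * math.sin(math.radians(slope_deg))

# Hypothetical soil: the same slope is stable when dry but fails
# once 80% of the slide depth is saturated.
fs_dry = infinite_slope_fs(c=10, phi_deg=30, gamma=18, gamma_w=9.81,
                           t=2.4, slope_deg=35, m=0.0)
fs_wet = infinite_slope_fs(c=10, phi_deg=30, gamma=18, gamma_w=9.81,
                           t=2.4, slope_deg=35, m=0.8)
```

In a full analysis, each grid cell's critical acceleration would be compared against the ground motion from each scenario epicenter to accumulate Newmark displacement.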

Algorithms for Indexing and Integrating MPEG-7 Visual Descriptors (MPEG-7 시각 정보 기술자의 인덱싱 및 결합 알고리즘)

  • Song, Chi-Ill;Nang, Jong-Ho
    • Journal of KIISE:Software and Applications / v.34 no.1 / pp.1-10 / 2007
  • This paper proposes a new indexing mechanism for MPEG-7 visual descriptors, especially the Dominant Color and Contour Shape descriptors, that guarantees an efficient similarity search over multimedia databases whose visual metadata are represented with MPEG-7. Since the similarity metric used in the Dominant Color descriptor is based on a Gaussian mixture model, the descriptor itself can be transformed into a color histogram in which the distribution of color values follows a Gaussian distribution. The transformed Dominant Color descriptor (i.e., the color histogram) is then indexed by the proposed mechanism. For indexing the Contour Shape descriptor, we use a two-pass algorithm. In the first pass, since the similarity of two shapes can be roughly measured with the global parameters of the Contour Shape descriptor, such as eccentricity and circularity, dissimilar image objects are first excluded using these global parameters. Then, the similarities between the query and the remaining image objects are measured with the peak parameters of the Contour Shape descriptor. This two-pass approach reduces the computational cost of measuring the similarity of image objects with the Contour Shape descriptor. The paper also proposes two schemes for integrating visual descriptors for efficient retrieval: one uses the weight of a descriptor to determine how many similar image objects are selected with respect to that descriptor, and the other uses the weight as the degree of importance of the descriptor in the global similarity measurement. Experimental results show that the proposed indexing and integration schemes produce a remarkable speed-up compared with exact similarity search, although there is some loss of accuracy because of the approximate computation during indexing. The proposed schemes can be used to build a multimedia database represented in MPEG-7 that guarantees efficient retrieval.
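The two-pass idea in this abstract can be sketched generically: a cheap distance over global shape parameters prunes the database before the expensive peak-based distance is computed. The field names (`ecc`, `circ`, `peaks`), the tolerance, and the toy records below are all hypothetical stand-ins, not the MPEG-7 encoding.

```python
def two_pass_search(query, database, global_tol=0.2, top_k=5):
    """Two-pass similarity search: prune with cheap global parameters
    (eccentricity, circularity), then rank survivors by the more
    expensive peak-based distance."""
    def global_dist(a, b):
        return abs(a["ecc"] - b["ecc"]) + abs(a["circ"] - b["circ"])

    def peak_dist(a, b):  # stand-in for the costly peak matching
        return sum((p - q) ** 2 for p, q in zip(a["peaks"], b["peaks"]))

    survivors = [obj for obj in database
                 if global_dist(query, obj) <= global_tol]
    survivors.sort(key=lambda obj: peak_dist(query, obj))
    return survivors[:top_k]

# Hypothetical database of three shape records.
db = [
    {"id": 1, "ecc": 0.50, "circ": 0.50, "peaks": [1, 2, 3]},
    {"id": 2, "ecc": 0.90, "circ": 0.90, "peaks": [9, 9, 9]},
    {"id": 3, "ecc": 0.52, "circ": 0.48, "peaks": [1, 2, 4]},
]
q = {"ecc": 0.50, "circ": 0.50, "peaks": [1, 2, 3]}
res = two_pass_search(q, db, global_tol=0.2, top_k=2)
```

Record 2 is rejected in the first pass, so its peak distance is never computed, which is where the claimed savings come from.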

The Accelerated Life Test of Hard Disk in the Environment of PACS (PACS 환경에서 하드디스크의 가속 수명시험)

  • Cho, Euy-Hyun;Park, Jeong-Kyu;Chae, Jong-Gyu
    • Journal of Digital Contents Society / v.16 no.1 / pp.63-70 / 2015
  • In this paper, we estimate the life cycle of the hard disks in the disk array of the image storage of a PACS through an accelerated life test. A Weibull distribution was selected by the Anderson-Darling goodness-of-fit test on failure-time data at $50^{\circ}C$ and $60^{\circ}C$. An equality test of the shape and scale parameters showed that the difference between the probability distributions estimated from the failure-time data at $50^{\circ}C$ and $60^{\circ}C$ was not statistically significant. By Weibull-Arrhenius analysis incorporating the temperature acceleration factor, the shape parameter was 1.0409 and the characteristic life was 24603.5 hours at the normal use condition ($30^{\circ}C$); the activation energy obtained through Arrhenius modeling was 0.5011 eV. Failure analysis of the accelerated-test samples and the market-return samples showed that the shares of the individual failure modes differed in detail, but their rank order was almost the same. This study suggests a test procedure for accelerated testing of hard disk drives under PACS operating conditions and supports life estimation in manufacturing and use.
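The Arrhenius acceleration factor underlying such a Weibull-Arrhenius analysis follows a standard formula; the sketch below plugs in the activation energy (0.5011 eV) and use temperature (30°C) reported in the abstract, purely to illustrate how a characteristic life measured at stress temperature scales up to use conditions.

```python
import math

K_BOLTZMANN = 8.617e-5  # Boltzmann constant, eV/K

def arrhenius_af(ea_ev, t_use_c, t_test_c):
    """Arrhenius acceleration factor between a stress temperature and
    the use temperature (both in degrees Celsius)."""
    t_use = t_use_c + 273.15
    t_test = t_test_c + 273.15
    return math.exp(ea_ev / K_BOLTZMANN * (1.0 / t_use - 1.0 / t_test))

# Values reported in the abstract: Ea = 0.5011 eV, use condition 30 C.
af_60 = arrhenius_af(0.5011, 30, 60)
# A Weibull characteristic life eta measured at 60 C then maps to the
# use condition as eta_use = eta_60 * af_60.
```

With no temperature difference the factor is exactly 1, and it grows with the gap between stress and use temperatures.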

Effects of Human Error on the Optimal Test Interval and Unavailability of the Safety System (안전계통의 이용불능도 및 최적시험주기에 미치는 인간실수의 영향)

  • Chung, Dae-Wook;Koo, Bon-Hyun
    • Nuclear Engineering and Technology / v.23 no.2 / pp.174-182 / 1991
  • The effects of human error related to periodic testing are incorporated into the evaluation of the unavailability and optimal test interval of a safety system. Two types of possible human error with respect to test and maintenance are considered: the possibility that a good safety system is inadvertently left in a bad state after a test (Type A human error), and the possibility that a bad safety system goes undetected upon a test (Type B human error). An event tree model is developed for the steady-state unavailability of a safety system in order to determine the effects of human errors on the system unavailability and the optimal test interval. A reliability analysis of the Safety Injection System (SIS) was performed to evaluate the effects of human error on the SIS unavailability. The results of various sensitivity analyses show that: (1) the steady-state unavailability of the safety system increases as the probabilities of both types of human error increase, and it is far more sensitive to Type A human error; (2) the optimal test interval increases slightly as the probability of Type A human error increases but decreases as the probability of Type B human error increases; and (3) provided that the test interval of the safety injection pump is kept unchanged, the unavailability of the SIS increases significantly as the probability of Type A human error increases but only slightly as the probability of Type B human error increases. Therefore, to obtain realistic reliability analysis results, one should take a shorter test interval (not the optimal test interval) so that the unavailability of the SIS can be maintained at the same level irrespective of human error. Since Type A human error during test and maintenance strongly influences the system unavailability, special efforts to reduce the possibility of Type A human error are essential in the course of test and maintenance.
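A much-simplified illustrative model (not the paper's event tree) captures the qualitative trade-off: average unavailability from random failures grows with the test interval T, test downtime shrinks with it, a Type A error adds a term for the whole interval, and a Type B error lets a failure survive roughly one extra interval. In this crude form Type A shifts the level of unavailability but not the optimum, while Type B shortens the optimal interval, consistent with findings (1) and (2) above.

```python
import math

def unavailability(T, lam, tau, gamma_a, gamma_b):
    """Simplified steady-state unavailability of a periodically tested
    standby component: random failures averaged over the interval
    (lam*T/2), test downtime (tau/T), a Type A error leaving the
    component bad (gamma_a), and a Type B error letting a failure
    persist one extra interval (gamma_b*lam*T)."""
    return lam * T / 2 + tau / T + gamma_a + gamma_b * lam * T

def optimal_interval(lam, tau, gamma_b):
    """Closed-form minimiser of the model above, from setting
    dU/dT = lam/2 + gamma_b*lam - tau/T^2 = 0."""
    return math.sqrt(tau / (lam * (0.5 + gamma_b)))

# Hypothetical numbers: failure rate 1e-4 /h, 1 h of test downtime.
t_star_no_b = optimal_interval(lam=1e-4, tau=1.0, gamma_b=0.0)
t_star_b = optimal_interval(lam=1e-4, tau=1.0, gamma_b=0.1)
```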


A Study on Trade Area Analysis with the Use of Modified Probability Model (변형확률모델을 활용한 소매업의 상권분석 방안에 관한 연구)

  • Jin, Chang-Beom;Youn, Myoung-Kil
    • Journal of Distribution Science / v.15 no.6 / pp.77-96 / 2017
  • Purpose - This study aims to develop strategies for responding to environmental change in domestic retail store types. New types of retail have recently emerged in the retail industry, and the trade area platform has shifted its focus to the speed of data rather than trade areas defined by district borders. In addition, 'smart' trade areas are changing retail types with the spread of gigabit internet, and contextual shopping is changing consumers' purchase patterns through data capture, technological capability, and algorithm development. For these reasons, sales estimation models built on former notions of scale and time have proved flawed, and a new model is needed. Research design, data, and methodology - This study focuses on measuring retail change in large multi-shopping malls to assess the outlook for the retail industry and competition among trade areas, against a theoretical background of retail store types and overall domestic retail conditions. Competition among retail store types is strong while the borders between them are fading, so sales expectations are hard to obtain under trade-area competition and a new model requires analysis. For comprehensive research, therefore, statistical analysis was excluded; field surveys and literature review were used to identify problems and propose an alternative. Research fidelity was improved by supplementing the research data with input from retail specialists as well as department stores. Results - This study analyzed trade area survival and its patterns through sales estimation and empirical studies of trade areas. The sales estimation, based on the Huff model, counts the expected shopping absorption of households in the trade areas. Based on the results, this paper estimated sales scale and then derived a modified probability model. Conclusions - In times of retail chain destruction and off-line store reorganization, the modified Huff model has problems in estimating sales. The transformed probability model, which supplements these problems, was found to be more effective under competitive business conditions. This study offers a viable alternative for estimating sales in related trade areas by constructing a new modified probability model. The future task is to extend this work from an IT infrastructure toward a DT infrastructure built on data- and evidence-based business.
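For reference, the classic Huff model that the paper modifies assigns each store a patronage probability proportional to its attractiveness (commonly floor area) divided by a power of travel distance. The sketch below shows that baseline form only; the store sizes, distances, and exponents are hypothetical, and the paper's modification is not reproduced here.

```python
def huff_probabilities(attractiveness, distance, alpha=1.0, beta=2.0):
    """Classic Huff model: probability that a consumer at one location
    patronises each store, from store attractiveness (e.g. floor area)
    and travel distance, with sensitivity exponents alpha and beta."""
    utilities = [(s ** alpha) / (d ** beta)
                 for s, d in zip(attractiveness, distance)]
    total = sum(utilities)
    return [u / total for u in utilities]

# Hypothetical: three stores (floor area in m^2) seen from one
# neighbourhood (distance in km).
p = huff_probabilities([5000, 12000, 8000], [2.0, 5.0, 3.0])
# Expected sales for store j then sum p_ij * households_i * spend_i
# over all neighbourhoods i in the trade area.
```

Note that the nearest store wins here despite being the smallest, because distance enters with exponent 2 while size enters linearly.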

Lane Detection in Complex Environment Using Grid-Based Morphology and Directional Edge-link Pairs (복잡한 환경에서 Grid기반 모폴로지와 방향성 에지 연결을 이용한 차선 검출 기법)

  • Lin, Qing;Han, Young-Joon;Hahn, Hern-Soo
    • Journal of the Korean Institute of Intelligent Systems / v.20 no.6 / pp.786-792 / 2010
  • This paper presents a real-time lane detection method that can accurately find lane-mark boundaries in complex road environments. Unlike many existing methods that rely on the post-processing stage to fit lane-mark positions among a great number of outliers, the proposed method aims at removing those outliers as early as the feature extraction stage, so that the search space at the post-processing stage is greatly reduced. To achieve this, a grid-based morphology operation is first used to generate regions of interest (ROI) dynamically. Within these ROIs, a directional edge-linking algorithm with directional edge-gap closing links edge pixels into edge-links that lie in valid directions, and these directional edge-links are then grouped into pairs by checking for a valid lane-mark width at a given height of the image. Finally, lane-mark colors are checked inside the edge-link pairs in the YUV color space, and lane-mark types are estimated with a Bayesian probability model. Experimental results show that the proposed method is effective at identifying lane-mark edges among heavy clutter edges in complex road environments, and the whole algorithm achieves an accuracy of around 92% at an average speed of 10 ms/frame for images of size $320{\times}240$.
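The final Bayesian step can be sketched generically as a posterior over lane-mark types given one piece of color evidence. The type names, priors, and likelihood tables below are hypothetical illustrations, not the paper's trained model.

```python
def lane_type_posterior(priors, likelihoods, evidence):
    """Minimal Bayesian classification: posterior over lane-mark types
    given one observed colour evidence value, via Bayes' rule with a
    small floor probability for unseen evidence."""
    unnorm = {t: priors[t] * likelihoods[t].get(evidence, 1e-6)
              for t in priors}
    z = sum(unnorm.values())
    return {t: v / z for t, v in unnorm.items()}

# Hypothetical priors and colour likelihoods for three lane-mark types.
priors = {"white-solid": 0.4, "white-dashed": 0.4, "yellow-solid": 0.2}
likelihoods = {
    "white-solid":  {"white": 0.90, "yellow": 0.05},
    "white-dashed": {"white": 0.90, "yellow": 0.05},
    "yellow-solid": {"white": 0.10, "yellow": 0.85},
}
post = lane_type_posterior(priors, likelihoods, "yellow")
```

Even with a low prior, strong yellow evidence from the YUV check flips the decision toward the yellow-solid type.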

Data processing system and spatial-temporal reproducibility assessment of GloSea5 model (GloSea5 모델의 자료처리 시스템 구축 및 시·공간적 재현성평가)

  • Moon, Soojin;Han, Soohee;Choi, Kwangsoon;Song, Junghyun
    • Journal of Korea Water Resources Association / v.49 no.9 / pp.761-771 / 2016
  • The GloSea5 (Global Seasonal forecasting system version 5) is provided and operated by the KMA (Korea Meteorological Administration). GloSea5 provides forecast (FCST) and hindcast (HCST) data, and its horizontal resolution is about 60 km ($0.83^{\circ}{\times}0.56^{\circ}$) in the mid-latitudes. To use these data in watershed-scale water management, GloSea5 requires spatio-temporal downscaling; statistical downscaling was therefore applied to correct systematic biases in the variables and to improve data reliability. The HCST data are provided in ensemble format, and the highest statistical correlation of ensemble precipitation ($R^2=0.60$, RMSE = 88.92, NSE = 0.57) was obtained for the Yongdam Dam watershed on grid #6. The original GloSea5 (600.1 mm) showed the greatest difference (-26.5%) from observations (816.1 mm) during the summer flood season, whereas the downscaled GloSea5 had an error of only -3.1%. Most of the underestimation occurred in flood-season precipitation, and the downscaled GloSea5 substantially restored those precipitation levels. The analysis of spatial autocorrelation using seasonal Moran's I showed the spatial distribution to be statistically significant. These results reduce the uncertainty of the original GloSea5 and substantiate its spatio-temporal accuracy and validity, and the spatio-temporal reproducibility assessment will serve as important basic data for watershed-scale water management.
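The global Moran's I statistic used for the spatial autocorrelation check has a standard definition; the sketch below implements it directly, with a hypothetical four-cell chain as the spatial weights (the paper's grid and weighting scheme are not reproduced).

```python
def morans_i(values, weights):
    """Global Moran's I for spatial autocorrelation. weights[i][j] is
    the spatial weight between cells i and j (zero on the diagonal);
    positive I means similar values cluster in space."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(weights[i][j] for i in range(n) for j in range(n))
    return (n / w_sum) * (num / den)

# Hypothetical: 4 cells in a line with rook adjacency.
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
i_clustered = morans_i([1, 1, 5, 5], w)   # like values adjacent
i_dispersed = morans_i([1, 5, 1, 5], w)   # like values alternating
```

Clustered precipitation fields give a positive I and alternating fields a negative one, which is the property tested season by season in the abstract.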

Application and Comparison of Dynamic Artificial Neural Networks for Urban Inundation Analysis (도시침수 해석을 위한 동적 인공신경망의 적용 및 비교)

  • Kim, Hyun Il;Keum, Ho Jun;Han, Kun Yeun
    • KSCE Journal of Civil and Environmental Engineering Research / v.38 no.5 / pp.671-683 / 2018
  • Flood damage caused by heavy rain in urban watersheds is increasing and, as many previous studies have shown, urban flooding usually occurs when runoff exceeds the capacity of the drainage network. Floods in heavily urbanized and densely populated areas cause serious social and economic damage. To address this problem, deterministic and probabilistic studies have been conducted to predict flooding in urban areas, but they remain insufficient for securing lead time and deriving flood-volume predictions within a short time. In this study, an input delay neural network (IDNN), a time delay neural network (TDNN), and a nonlinear autoregressive network with exogenous inputs (NARX) were compared for real-time flood prediction based on urban runoff analysis, to identify the optimal real-time urban flood prediction technique. For the rainfall events of 2010 and 2011 in the Gangnam area, the Nash efficiency coefficients of the IDNN, TDNN, and NARX were 0.86, 0.92, 0.99 and 0.53, 0.41, 0.98, respectively. Together with the error analysis of the predicted results, this indicates that NARX is the most appropriate choice for establishing an urban flood response system in the future.
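The defining feature of a NARX model is that each prediction is regressed on past outputs and past exogenous inputs. A minimal sketch of that input construction (with a linear least-squares fit standing in for the neural network, and toy series standing in for rainfall and flood volume):

```python
import numpy as np

def narx_design(y, x, ny=1, nx=1):
    """Build a NARX-style regression matrix: each row holds the past
    outputs y[t-1..t-ny] and past exogenous inputs x[t-1..t-nx];
    the target is y[t]."""
    start = max(ny, nx)
    rows, targets = [], []
    for t in range(start, len(y)):
        rows.append(np.concatenate([y[t - ny:t][::-1], x[t - nx:t][::-1]]))
        targets.append(y[t])
    return np.array(rows), np.array(targets)

# Toy series generated from known linear dynamics so the fit can be
# checked: x plays the role of rainfall, y the role of flood volume.
rng = np.random.default_rng(0)
x = rng.random(60)
y = np.zeros(60)
for t in range(1, 60):
    y[t] = 0.6 * y[t - 1] + 0.3 * x[t - 1]

A, b = narx_design(y, x)
coef, *_ = np.linalg.lstsq(A, b, rcond=None)
```

A real NARX network replaces the linear fit with a nonlinear map, but the lagged design matrix is built the same way.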

Dynamic Management of Equi-Join Results for Multi-Keyword Searches (다중 키워드 검색에 적합한 동등조인 연산 결과의 동적 관리 기법)

  • Lim, Sung-Chae
    • The KIPS Transactions:PartA / v.17A no.5 / pp.229-236 / 2010
  • With the increasing number of documents on the Internet and in enterprises, it has become crucial to support users' queries on those documents efficiently. Full-text search is generally adopted in this situation because it can answer uncontrolled ad-hoc queries by automatically indexing all the keywords found in the documents. The size of the index files made for full-text search grows with the number of indexed documents, so the disk cost of processing multi-keyword queries against the enlarged index files may be too large. To solve this problem, we propose both an index file structure and a management scheme suited to processing multi-keyword queries against a large volume of index files. We adopt the inverted-file structure, which is widely used for multi-keyword searches, as the basic index structure and modify it into a hierarchical structure for the join and ranking operations performed during query processing. To save disk cost on top of this index structure, we dynamically keep in main memory the results of join operations between two keywords that are highly likely to appear together in users' queries. We also present performance comparisons using a disk cost model to show the performance advantage of the proposed scheme.
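The core idea of caching equi-join results between keyword pairs can be sketched with an in-memory inverted index: posting-list intersections for popular keyword pairs are precomputed, and a multi-keyword query seeds its result from a cached pair before intersecting the remaining lists. This is a generic sketch, not the paper's hierarchical on-disk structure, and the class and method names are illustrative.

```python
from collections import defaultdict

class InvertedIndex:
    """Inverted index with a cache of pairwise 'join' results
    (posting-list intersections) for keyword pairs expected to
    co-occur in user queries."""

    def __init__(self):
        self.postings = defaultdict(set)  # keyword -> set of doc ids
        self.join_cache = {}              # (w1, w2) -> cached intersection

    def add(self, doc_id, text):
        for word in text.lower().split():
            self.postings[word].add(doc_id)

    def cache_join(self, w1, w2):
        """Precompute and keep in memory the join of two keywords."""
        key = tuple(sorted((w1, w2)))
        self.join_cache[key] = self.postings[w1] & self.postings[w2]

    def search(self, *words):
        """Multi-keyword AND query; seed from a cached pair if one of
        the query's keyword pairs has been precomputed."""
        words = [w.lower() for w in words]
        result = None
        for i in range(len(words)):
            for j in range(i + 1, len(words)):
                key = tuple(sorted((words[i], words[j])))
                if key in self.join_cache:
                    result = set(self.join_cache[key])
                    break
            if result is not None:
                break
        if result is None:
            result = set(self.postings[words[0]])
        for w in words:
            result &= self.postings[w]
        return result
```

In the paper's setting the savings come from avoiding disk reads of long posting lists; here the cache simply shrinks the starting set before the remaining intersections.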