• Title/Summary/Keyword: PDF Method

Search Result 190

A Bayesian Approach to Geophysical Inverse Problems (베이지안 방식에 의한 지구물리 역산 문제의 접근)

  • Oh Seokhoon;Chung Seung-Hwan;Kwon Byung-Doo;Lee Heuisoon;Jung Ho Jun;Lee Duk Kee
    • Geophysics and Geophysical Exploration
    • /
    • v.5 no.4
    • /
    • pp.262-271
    • /
    • 2002
  • This study presents a practical procedure for the Bayesian inversion of geophysical data. We applied geostatistical techniques for the acquisition of prior model information, and then the Markov chain Monte Carlo (MCMC) method was adopted to infer the characteristics of the marginal distributions of the model parameters. For the Bayesian inversion of dipole-dipole array resistivity data, we used indicator kriging and simulation techniques to generate cumulative distribution functions from Schlumberger array resistivity data and well logging data, and obtained prior information by cokriging and simulations from covariogram models. The indicator approach makes it possible to incorporate non-parametric information into the probability density function. We also adopted the MCMC approach, based on Gibbs sampling, to examine the characteristics of the a posteriori probability density function and the marginal distribution of each parameter.
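The Gibbs-sampling step of the MCMC approach above can be sketched with a toy two-parameter posterior (a bivariate normal, not the paper's resistivity model); each full conditional is sampled exactly, and the chain's marginals are then inspected:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy posterior: two correlated model parameters with a bivariate normal
# posterior (correlation rho). The full conditionals of a bivariate normal
# are themselves normal, so each Gibbs step is an exact draw.
rho = 0.8
n_samples = 20_000
x = np.zeros(n_samples)
y = np.zeros(n_samples)
for t in range(1, n_samples):
    x[t] = rng.normal(rho * y[t - 1], np.sqrt(1 - rho**2))  # sample x | y
    y[t] = rng.normal(rho * x[t], np.sqrt(1 - rho**2))      # sample y | x

burn = 2_000
# After burn-in, the marginal of each parameter approaches N(0, 1).
print(x[burn:].mean(), x[burn:].std())
```

In a real inversion the conditionals come from the likelihood of the resistivity data combined with the geostatistical prior; the alternating-draw structure is the same.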

The Accessibility of Taif University Blackboard for Visually Impaired Students

  • Alnfiai, Mrim;Alhakami, Wajdi
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.6
    • /
    • pp.258-268
    • /
    • 2021
  • Online learning systems are becoming an effective educational medium for many universities. Accessibility of a university's online learning system means that every student, including the visually impaired, is able to use all of the site's services. This research investigates the accessibility of online learning systems for visually impaired users. The paper's purpose is to understand the perception of visually impaired undergraduate students towards Blackboard's accessibility and to make recommendations for a new Blackboard design with accessible features that support their needs. This paper evaluates the impact of such a design on visually impaired students, using Taif University students as a case study, since Taif's system is similar to most learning systems used by Saudi universities. A study on Taif University's utilization of Blackboard was conducted using mixed-method approaches (an automatic tool and a user study). In the first phase, Taif's use of Blackboard was evaluated with the web accessibility tool AChecker. In the second phase, we conducted a user study to verify previously discovered accessibility challenges and to fully assess them against accessibility and usability guidelines. In this study, the accessibility of Taif University's Blackboard was evaluated by thirteen visually impaired undergraduate students. The results show that Blackboard has accessibility issues: confusing navigation, incompatibility with assistive technologies, untitled pages or parts, unclear identification of visual elements, and inaccessible PDF files. This paper also introduces a set of recommendations that aim to improve the accessibility of Blackboard and other educational websites developed for this population, and highlights the serious need for universities to enhance web accessibility in online learning systems for students with disabilities.

Investigating the future changes of extreme precipitation indices in Asian regions dominated by south Asian summer monsoon

  • Deegala Durage Danushka Prasadi Deegala;Eun-Sung Chung
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2023.05a
    • /
    • pp.174-174
    • /
    • 2023
  • The impact of global warming on the south Asian summer monsoon is of critical importance for the large population of this region. This study aims to investigate future changes in precipitation extremes during the pre-monsoon and monsoon seasons across this region in a more organized regional structure. The study area is divided into six major divisions based on the Köppen-Geiger climate classification and 10 sub-divisions considering geographical location. The future changes of extreme precipitation indices are analyzed for each zone separately using five indices from the ETCCDI (Expert Team on Climate Change Detection and Indices): R10mm, Rx1day, Rx5day, R95pTOT and PRCPTOT. Ten global climate model (GCM) outputs from the latest CMIP6 under four combinations of SSP-RCP scenarios (SSP1-2.6, SSP2-4.5, SSP3-7.0, and SSP5-8.5) are used. The GCMs are bias-corrected using nonparametric quantile transformation based on the smoothing spline method. The future period is divided into near future (2031-2065) and far future (2066-2100), and the changes are compared against the historical period (1980-2014). The analysis is carried out separately for pre-monsoon (March, April, May) and monsoon (June, July, August, September). The changes are compared using probability density functions (PDFs), with kernel density estimation used to plot the PDFs. For this study we did not use a multi-model ensemble output; the changes in each extreme precipitation index are analyzed GCM by GCM. The results show that the performance of the GCMs varies depending on the sub-zone as well as on the precipitation index. Final conclusions are made by removing the poor-performing GCMs and analyzing the overall changes in the PDFs of the remaining GCMs.
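The kernel density estimation step used to plot the PDFs can be sketched as follows, with hypothetical Rx1day samples (the gamma distributions and all parameter values are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical annual Rx1day samples (mm) for a historical and a future period.
hist = rng.gamma(shape=4.0, scale=20.0, size=35)    # e.g. 1980-2014
future = rng.gamma(shape=4.0, scale=24.0, size=35)  # e.g. 2066-2100, wetter tail

def gaussian_kde_pdf(samples, grid):
    """Gaussian kernel density estimate with Silverman's rule-of-thumb bandwidth."""
    n = len(samples)
    h = 1.06 * np.std(samples) * n ** (-1 / 5)
    # Sum a Gaussian kernel centred on every sample point.
    z = (grid[:, None] - samples[None, :]) / h
    return np.exp(-0.5 * z**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))

grid = np.linspace(0.0, 300.0, 600)
pdf_hist = gaussian_kde_pdf(hist, grid)
pdf_future = gaussian_kde_pdf(future, grid)

# A rightward shift of the future PDF indicates intensifying extremes.
print(grid[np.argmax(pdf_hist)], grid[np.argmax(pdf_future)])
```

Comparing the two curves on one axis is exactly the historical-versus-future PDF comparison the abstract describes, done per index and per GCM.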

Detection and Assessment of Forest Cover Change in Gangwon Province, Inter-Korean, Based on Gaussian Probability Density Function (가우시안 확률밀도 함수기반 강원도 남·북한 지역의 산림면적 변화탐지 및 평가)

  • Lee, Sujong;Park, Eunbeen;Song, Cholho;Lim, Chul-Hee;Cha, Sungeun;Lee, Sle-gee;Lee, Woo-Kyun
    • Korean Journal of Remote Sensing
    • /
    • v.35 no.5_1
    • /
    • pp.649-663
    • /
    • 2019
  • The 2018 United Nations Development Programme (UNDP) report announced that deforestation in North Korea is among the most extreme cases and, in terms of climate change, a global-scale issue. To respond to deforestation, various studies and projects have been conducted based on remote sensing, but access to public data in North Korea is limited and objectivity is difficult to guarantee. In this study, forest detection based on statistical density estimation using Landsat imagery was conducted in Gangwon province, the only administrative district divided between South and North Korea. South Korean forest spatial data were used to label forest and non-forest pixels in the Normalized Difference Vegetation Index (NDVI), and a threshold (0.6658) for forest detection was set by category-wise Gaussian probability density function (PDF) estimation. The results show that the forest area decreased until the 2000s in both Koreas, but the area increased in the 2010s. It is also confirmed that the local-scale reduction of forest area is consistent with the policy direction of urbanization and industrialization at the time. The Kappa values for validation indicated strong agreement (0.8) and moderate agreement (0.6), respectively. Detection based on Gaussian PDF estimation is considered a method for complementing the statistical limitations of existing detection methods using satellite imagery. This study can be used as baseline data on deforestation in North Korea, and based on the detection results, it is necessary to protect and restore forest resources.
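The category-wise Gaussian PDF thresholding described above can be sketched as follows; the NDVI samples and class statistics are hypothetical, so the resulting threshold will not match the paper's 0.6658:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical NDVI samples labelled forest / non-forest (e.g. by forest
# spatial data); the means and spreads here are assumptions.
forest = rng.normal(0.75, 0.08, 5000)
nonforest = rng.normal(0.45, 0.12, 5000)

def gauss_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

mu_f, sd_f = forest.mean(), forest.std()
mu_n, sd_n = nonforest.mean(), nonforest.std()

# Threshold = NDVI value where the two fitted class PDFs intersect between
# the class means (minimum-error decision boundary for equal priors).
x = np.linspace(mu_n, mu_f, 10001)
diff = gauss_pdf(x, mu_f, sd_f) - gauss_pdf(x, mu_n, sd_n)
threshold = x[np.argmin(np.abs(diff))]
print(round(float(threshold), 4))
```

Pixels with NDVI above the threshold are then mapped as forest, and change detection compares the resulting masks across years.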

Selecting Climate Change Scenarios Reflecting Uncertainties (불확실성을 고려한 기후변화 시나리오의 선정)

  • Lee, Jae-Kyoung;Kim, Young-Oh
    • Atmosphere
    • /
    • v.22 no.2
    • /
    • pp.149-161
    • /
    • 2012
  • Past research results indicate that, of all the uncertainties arising in climate change research, the uncertainty caused by the choice of climate change scenario is the largest. Therefore, depending upon which climate change scenario one adopts, the projection of future water resources will differ significantly. As a matter of principle, it is highly recommended to utilize all the GCM scenarios offered by the IPCC. However, this can be impractical when a decision has to be made at an action officer's level. Hence, as an alternative, it is necessary to select several scenarios so as to express the possible number of cases to the maximum extent. Objective standards for selecting climate change scenarios have not been properly established, and scenarios have been selected either at random or at the researcher's discretion. In this research, a new scenario selection process is suggested, in which a few principal scenarios achieve the effect of having utilized all the possible scenarios while maintaining most of the uncertainty. The use of cluster analysis and the selection of a representative scenario in each cluster efficiently reduce the number of climate change scenarios. For the cluster analysis, the K-means clustering method, which takes advantage of the statistical features of the scenarios, was employed; for the selection of a representative scenario in each cluster, the selection methods were analyzed and reviewed, and the PDF method was used to select both the best scenarios with the closest simulation accuracy and the principal scenarios suggested by this research.
In the selection of the best scenarios, it was shown that a GCM scenario demonstrating a high level of simulation accuracy in the past does not necessarily demonstrate a similarly high level of accuracy in the future, and various GCM scenarios were selected as the principal scenarios. Secondly, maximum entropy, which can quantify the uncertainty of a climate change scenario, was used to quantify and compare the uncertainties associated with all the scenarios, the best scenarios, and the principal scenarios. The comparison showed that the principal scenarios maintain and explain the uncertainties of all the scenarios better than the best scenarios do. Therefore, through the scenario selection process, it was shown that the principal scenarios achieve the effect of having utilized all the scenarios and retain the uncertainties associated with climate change to the maximum extent, while reducing the number of scenarios at the same time. Lastly, the climate change scenarios most suitable for the climate of the Korean peninsula were suggested. Through the scenario selection process, principal climate change scenarios that are suitable for the Korean peninsula and maintain most of the uncertainties among all the scenarios in the 4th IPCC report were identified. It is therefore assessed that using these scenarios for future projection of water resources on the Korean peninsula can provide projections for water resources management that maintain more than a 70~80% level of the uncertainty of all the scenarios.
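The K-means clustering and representative ("principal") scenario selection can be sketched as follows, with hypothetical GCM scenario statistics standing in for real projections:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical scenarios: 20 GCMs described by two statistics of projected
# change, drawn around four loose groupings.
names = [f"GCM{i:02d}" for i in range(20)]
centers = np.array([[0.0, 0.0], [8.0, 0.0], [0.0, 8.0], [8.0, 8.0]])
X = centers[np.arange(20) % 4] + rng.normal(0.0, 1.0, (20, 2))

def kmeans(X, k, iters=50):
    """Plain Lloyd's algorithm with centroids initialised at data points."""
    centroids = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

labels, centroids = kmeans(X, k=4)

# Principal scenario per cluster = the member closest to its centroid.
principal = []
for j in range(4):
    members = np.where(labels == j)[0]
    if members.size:
        d = ((X[members] - centroids[j]) ** 2).sum(-1)
        principal.append(names[members[np.argmin(d)]])
print(principal)
```

Keeping one member per cluster reduces the scenario count while the clusters themselves preserve the spread (and hence much of the uncertainty) of the full set.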

Human Exposure to BTEX and Its Risk Assessment Using the CalTOX Model According to the Probability Density Function in Meteorological Input Data (기상변수들의 확률밀도함수(PDF)에 따른 CalTOX모델을 이용한 BTEX 인체노출량 및 인체위해성 평가 연구)

  • Kim, Ok;Song, Youngho;Choi, Jinha;Park, Sanghyun;Park, Changyoung;Lee, Minwoo;Lee, Jinheon
    • Journal of Environmental Health Sciences
    • /
    • v.45 no.5
    • /
    • pp.497-510
    • /
    • 2019
  • Objectives: The aim of this study was to secure the reliability of using the CalTOX model when evaluating LADD (or ADD) and Risk (or HQ) among local residents for emissions of BTEX (Benzene, Toluene, Ethylbenzene, Xylene), by closely examining the difference in the confidence intervals of the assessment outcomes according to the probability density functions of the input variables. Methods: The assessment was divided between a method ($I^{\dagger}$) that inputs the probability density functions of the model's meteorological variables as log-normal distributions, and a method ($II^{\ddagger}$) that inputs the optimal probability density functions identified using @Risk. A T-test was carried out in order to analyze the difference in the confidence intervals of the two assessment results. Results: LADD of Benzene was evaluated at 1.46E-03 mg/kg-d, ADD of Toluene at 1.96E-04 mg/kg-d, ADD of Ethylbenzene at 8.15E-05 mg/kg-d, and ADD of Xylene at 2.30E-04 mg/kg-d. As for the predicted confidence intervals in LADD and ADD, there was a significant difference between the $I^{\dagger}$ and $II^{\ddagger}$ methods in $LADD_{Inhalation}$ for Benzene, and in $ADD_{Inhalation}$ and ADD for Toluene and Xylene. Risk was 3.58E-05 for Benzene, and HQ was 3.78E-03 for Toluene, 1.48E-03 for Ethylbenzene, and 3.77E-03 for Xylene. For the HQ of Toluene and Xylene, the difference in confidence intervals between the $I^{\dagger}$ and $II^{\ddagger}$ methods was shown to be significant. Conclusions: The human risk assessment for BTEX was divided between a method ($I^{\dagger}$) that inputs the probability density functions of the meteorological variables for the CalTOX model as log-normal distributions, and a method ($II^{\ddagger}$) that inputs the optimal probability density functions identified using @Risk.
As a result, it was identified that Risk (or HQ) is the same, but that there is a significant difference in the confidence interval of Risk (or HQ) between the $I^{\dagger}$ and $II^{\ddagger}$ methods.
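The contrast between the two input methods can be illustrated with a minimal Monte Carlo sketch; the exposure model, emission term, and both wind-speed distributions below are illustrative assumptions, not CalTOX or the study's fitted inputs:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Toy inhalation-exposure model (not CalTOX): ADD = C_air * IR / BW, with
# the air concentration diluted by wind speed. All values are assumptions.
IR, BW, Q = 15.0, 60.0, 2.0e-3   # m3/day, kg, emission/dilution term

def add_dose(wind):
    c_air = Q / np.clip(wind, 0.1, None)   # crude inverse-wind dilution
    return c_air * IR / BW                 # mg/kg-day

# Method I: wind speed as a log-normal; Method II: a different fitted
# shape (here Weibull) scaled to roughly the same mean wind speed.
wind_I = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)
wind_II = rng.weibull(2.0, size=n) * 2.26

for tag, w in (("I", wind_I), ("II", wind_II)):
    d = add_dose(w)
    lo, hi = np.percentile(d, [2.5, 97.5])
    # Same model, different input PDFs: the medians agree far more closely
    # than the widths of the 95% intervals do.
    print(tag, round(float(np.median(d)), 6), round(float(hi - lo), 6))
```

This is the qualitative pattern the study reports: the central Risk/HQ estimate is stable across input-distribution choices, while the confidence interval is not.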

Effect of Time-dependent Diffusion and Exterior Conditions on Service Life Considering Deterministic and Probabilistic Method (결정론 및 확률론적 방법에 따라 시간의존성 염화물 확산계수 및 외부 영향인자가 내구수명에 미치는 영향)

  • Kwon, Seung-Jun
    • Journal of the Korea institute for structural maintenance and inspection
    • /
    • v.20 no.6
    • /
    • pp.65-72
    • /
    • 2016
  • Service life evaluation for RC structures exposed to chloride attack is very important; however, the two conventional methods (deterministic and probabilistic) show a big difference. This paper presents a service life simulation using the deterministic and probabilistic methods with a time-dependent diffusion coefficient. Three different cases are considered for diffusion coefficient, concrete cover depth, and surface chloride content respectively, and then the PDF (probability of durability failure) and the related service life are obtained. By adopting time-dependent diffusion, the discrepancy between the two methods can be reduced, which yields a reasonable service life. When the diffusion coefficient increases from $2.5{\times}10^{-12}m^2/sec$ to $7.5{\times}10^{-12}m^2/sec$, the service life decreases to the 25.5~35.6% level; when cover depth increases from 75 mm to 125 mm, service life increases to the 267~311% level. When surface chloride content increases from $5.0kg/m^3$ to $15.0kg/m^3$, service life changes to the 40.9~54.5% level. The effect of cover depth is 8~10 times greater than that of the other parameters, which implies that it is a key parameter for service life extension.
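The PDF (probability of durability failure) can be sketched with a minimal Monte Carlo over Fick's erfc solution; all input statistics and the simplified ageing law below are hypothetical, not the paper's values:

```python
import numpy as np
from scipy.special import erfc

rng = np.random.default_rng(5)
n = 20_000

# Hypothetical (not the paper's) input statistics: cover depth, surface
# chloride content, and 28-day diffusion coefficient, all normal, clipped
# to stay physically positive.
cover = np.clip(rng.normal(0.075, 0.010, n), 1e-3, None)      # m
Cs    = np.clip(rng.normal(10.0, 2.0, n), 0.1, None)          # kg/m3
D28   = np.clip(rng.normal(5.0e-12, 1.0e-12, n), 1e-13, None) # m2/s
m_age, t_ref, C_crit = 0.2, 28 / 365, 1.2   # ageing exponent, yr, kg/m3

def pdf_failure(t_years):
    """Probability of durability failure P[C(cover, t) >= C_crit]."""
    D_t = D28 * (t_ref / t_years) ** m_age          # simplified ageing law
    t_s = t_years * 365.25 * 24 * 3600
    C = Cs * erfc(cover / (2.0 * np.sqrt(D_t * t_s)))  # Fick erfc solution
    return float(np.mean(C >= C_crit))

# Service life = first year the failure probability exceeds a 10% target.
for t in range(1, 101):
    if pdf_failure(t) >= 0.10:
        print("service life ~", t, "years")
        break
```

The deterministic method instead plugs mean values into the same erfc solution and reports the crossing of C_crit, which is why the two approaches can disagree when the inputs scatter widely.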

Numerical simulation of gasification of coal-water slurry for production of synthesis gas in a two stage entrained gasifier (2단 분류층 가스화기에서 합성가스 생성을 위한 석탄 슬러리 가스화에 대한 수치 해석적 연구)

  • Seo, Dong-Kyun;Lee, Sun-Ki;Song, Soon-Ho;Hwang, Jung-Ho
    • Proceedings of the Korean Society for New and Renewable Energy Conference
    • /
    • 2007.11a
    • /
    • pp.417-423
    • /
    • 2007
  • Oxy-gasification, or oxygen-blown gasification, enables clean and efficient use of coal and opens a promising way to CO2 capture. The coal gasification process of a slurry-feed-type, entrained-flow coal gasifier was numerically predicted in this paper. The purposes of this study are to develop an evaluation technique for the design and performance optimization of coal gasifiers using numerical simulation, and to confirm the validity of the model. By dividing the complicated coal gasification process into several simplified stages such as slurry evaporation, coal devolatilization, a mixture fraction model, and two-phase reactions coupled with turbulent flow and two-phase heat transfer, a comprehensive numerical model was constructed to simulate the coal gasification process. The influence of turbulence on the gas properties was taken into account by the PDF (Probability Density Function) model. A numerical simulation with the coal gasification model was performed on the ConocoPhillips-type gasifier for an IGCC plant. Gas temperature distribution and product gas composition are also presented. Numerical computations were performed to assess the effect of variation in the oxygen-to-coal ratio and steam-to-coal ratio on the reactive flow field. The concentrations of the major products, CO and H2, were calculated with varying oxygen-to-coal ratio (0.2-1.5) and steam-to-coal ratio (0.3-0.7). To verify the validity of the predictions, predicted CO and H2 concentrations at the exit of the gasifier were compared with previous work on the same geometry and operating points. The predictions showed that the CO and H2 concentrations increased gradually to a maximum with increasing oxygen-to-coal and steam-to-coal ratios and then decreased. When the oxygen-to-coal ratio was between 0.8 and 1.2, and the steam-to-coal ratio was between 0.4 and 0.5, high values of CO and H2 were obtained.
This study also compares the CFD (Computational Fluid Dynamics) results with STATNJAN results, which treat the gasifier as being at chemical equilibrium, in order to assess the effect of flow on the gasifier relative to equilibrium. The gasifier is divided into a few regions so that the evolution of the gasification can be studied locally; by this method, each divided region is found to exhibit its own characteristics.
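The presumed-PDF treatment of turbulence-chemistry interaction can be sketched with a beta-PDF average over mixture fraction; the flamelet temperature profile and all parameters below are illustrative assumptions, not the gasifier model's tables:

```python
import numpy as np
from scipy.stats import beta

# Presumed-shape beta-PDF averaging over mixture fraction f: given the mean
# and variance of f from the turbulence model, any laminar property phi(f)
# is averaged as <phi> = integral of phi(f) * P(f) df.
def beta_params(fmean, fvar):
    # Moment match for the presumed beta PDF; needs fvar < fmean * (1 - fmean).
    g = fmean * (1.0 - fmean) / fvar - 1.0
    return fmean * g, (1.0 - fmean) * g

def pdf_average(phi, fmean, fvar, npts=4001):
    a, b = beta_params(fmean, fvar)
    f = np.linspace(1e-6, 1.0 - 1e-6, npts)
    w = beta.pdf(f, a, b)
    return (phi(f) * w).sum() / w.sum()   # uniform grid, spacing cancels

# Hypothetical flamelet temperature profile peaking at stoichiometric
# mixture fraction f_st = 0.25.
phi_T = lambda f: 300.0 + 1700.0 * np.exp(-((f - 0.25) / 0.1) ** 2)

# Turbulent fluctuations (fvar > 0) pull the mean below the laminar peak.
print(pdf_average(phi_T, fmean=0.25, fvar=0.005))
```

As the mixture-fraction variance shrinks, the beta PDF collapses to a spike and the average recovers the laminar value, which is the limit the turbulence model feeds back in weakly fluctuating regions.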

Automatic Generation of Bibliographic Metadata with Reference Information for Academic Journals (학술논문 내에서 참고문헌 정보가 포함된 서지 메타데이터 자동 생성 연구)

  • Jeong, Seonki;Shin, Hyeonho;Ji, Seon-Yeong;Choi, Sungphil
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.56 no.3
    • /
    • pp.241-264
    • /
    • 2022
  • Bibliographic metadata can help researchers effectively utilize the essential publications they need and grasp academic trends in their own fields. Since manual creation of the metadata is costly and time-consuming, and since rule-based methods struggle with the immoderate variety of article forms and styles across publishers and academic societies, it is nontrivial to automatize metadata construction effectively. Therefore, this study proposes a two-step extraction process based on rules and deep neural networks for generating bibliographic metadata of scientific articles to overcome the difficulties above. The extraction target areas in articles were identified using a deep neural network-based model, and then the details in those areas were analyzed and sub-divided into the relevant metadata elements. The proposed model also includes a model for generating reference summary information, which is able to separate the end of the body text from the starting point of the references, to extract individual references by an essential rule set, and to identify all the bibliographic items in each reference with a deep neural network. In addition, in order to confirm the possibility of a model that generates the bibliographic information of academic papers without pre- and post-processing, we conducted an in-depth comparative experiment with various settings and configurations. As a result of the experiment, the method proposed in this paper showed higher performance.
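The rule-based part of the reference step, separating the body text from the references and splitting individual entries, can be sketched as follows (the section-heading and entry-marker patterns are hypothetical examples of such a rule set):

```python
import re

# Hypothetical article tail: body text followed by a reference section.
TEXT = """... body of the article ...
References
[1] Kim, A. (2020). Deep metadata extraction. J. Inf. Sci. 12(3), 1-10.
[2] Lee, B. (2021). Neural reference parsing. Proc. XYZ, 55-60.
"""

# Rule 1: a line that is only a reference-section heading marks the split.
HEAD = re.compile(r"^(references|bibliography|참고문헌)\s*$", re.I | re.M)
# Rule 2: entries start with a bracketed number like [1], [2], ...
ENTRY = re.compile(r"^\[\d+\]\s+", re.M)

def split_references(text):
    m = HEAD.search(text)
    if not m:
        return []
    tail = text[m.end():]
    starts = [e.start() for e in ENTRY.finditer(tail)]
    return [tail[s:t].strip() for s, t in zip(starts, starts[1:] + [len(tail)])]

refs = split_references(TEXT)
print(len(refs))
```

In the proposed pipeline, each extracted entry would then be passed to the deep neural network that labels its bibliographic items (authors, year, title, venue, pages).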

Development of Quantification Methods for the Myocardial Blood Flow Using Ensemble Independent Component Analysis for Dynamic $H_2^{15}O$ PET (동적 $H_2^{15}O$ PET에서 앙상블 독립성분분석법을 이용한 심근 혈류 정량화 방법 개발)

  • Lee, Byeong-Il;Lee, Jae-Sung;Lee, Dong-Soo;Kang, Won-Jun;Lee, Jong-Jin;Kim, Soo-Jin;Choi, Seung-Jin;Chung, June-Key;Lee, Myung-Chul
    • The Korean Journal of Nuclear Medicine
    • /
    • v.38 no.6
    • /
    • pp.486-491
    • /
    • 2004
  • Purpose: Factor analysis and independent component analysis (ICA) have been used for handling dynamic image sequences. The theoretical advantages of a newly suggested ICA method, ensemble ICA, led us to consider applying this method to the analysis of dynamic myocardial $H_2^{15}O$ PET data. In this study, we quantified patients' blood flow using the ensemble ICA method. Materials and Methods: Twenty subjects underwent $H_2^{15}O$ PET scans using an ECAT EXACT 47 scanner and myocardial perfusion SPECT using a Vertex scanner. After transmission scanning, dynamic emission scans were initiated simultaneously with the injection of $555{\sim}740$ MBq $H_2^{15}O$. Hidden independent components can be extracted from the observed mixed data (PET images) by means of ICA algorithms. Ensemble learning is a variational Bayesian method that provides an analytical approximation to the parameter posterior using a tractable distribution. The variational approximation forms a lower bound on the ensemble likelihood, and the maximization of the lower bound is achieved by minimizing the Kullback-Leibler divergence between the true posterior and the variational posterior. In this study, the posterior PDF was approximated by a rectified Gaussian distribution to incorporate a non-negativity constraint, which is suitable for dynamic images in nuclear medicine. Blood flow was measured in 9 regions: the apex, four areas in the mid wall, and four areas in the base wall. Myocardial perfusion SPECT scores and angiography results were compared with the regional blood flow. Results: Major cardiac components were separated successfully by the ensemble ICA method, and blood flow could be estimated in 15 of the 20 patients. Mean myocardial blood flow was $1.2{\pm}0.40$ ml/min/g at rest and $1.85{\pm}1.12$ ml/min/g under stress. Blood flow values obtained by an operator on two different occasions were highly correlated (r=0.99).
In the myocardium component image, the image contrast between the left ventricle and the myocardium was 1:2.7 on average. Perfusion reserve was significantly different between regions with and without stenosis detected by coronary angiography (P<0.01). In the 66 segments with stenosis confirmed by angiography, the segments with reversible perfusion decrease on perfusion SPECT showed lower perfusion reserve values on $H_2^{15}O$ PET. Conclusions: Myocardial blood flow could be estimated using an ICA method with ensemble learning. We suggest that ensemble ICA incorporating a non-negativity constraint is a feasible method for handling dynamic image sequences obtained by nuclear medicine techniques.
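Ensemble ICA itself is a variational Bayesian method; as a far simpler stand-in that shares only the non-negativity constraint, a Lee-Seung non-negative matrix factorization can separate mixed time-activity curves in synthetic data:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two hypothetical non-negative source curves (blood pool and myocardium
# time-activity) mixed into 100 observed pixel curves, as in a dynamic scan.
t = np.linspace(0.0, 10.0, 60)
S = np.vstack([t * np.exp(-0.5 * t),        # blood pool: early peak
               1.0 - np.exp(-0.3 * t)])     # myocardium: slow uptake
A = rng.uniform(0.2, 1.0, (100, 2))
X = np.clip(A @ S + rng.normal(0.0, 0.01, (100, 60)), 0.0, None)

# Lee-Seung multiplicative updates keep both factors non-negative
# throughout, playing the role of the rectified-Gaussian constraint.
W = rng.uniform(0.1, 1.0, (100, 2))
H = rng.uniform(0.1, 1.0, (2, 60))
for _ in range(500):
    H *= (W.T @ X) / (W.T @ W @ H + 1e-9)
    W *= (X @ H.T) / (W @ H @ H.T + 1e-9)

rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(rel_err)
```

The rows of H play the role of the separated cardiac component curves; the full ensemble ICA method additionally yields posterior uncertainty over them, which this sketch does not.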