Title/Summary/Keyword: probabilistic models

Search Result 461

Markov's Modeling for Screening Strategies for Colorectal Cancer

  • Barouni, Mohsen; Larizadeh, Mohammad Hassan; Sabermahani, Asma; Ghaderi, Hossien
    • Asian Pacific Journal of Cancer Prevention / v.13 no.10 / pp.5125-5129 / 2012
  • Economic decision models are increasingly used to assess medical interventions. Advances in this field are mainly due to the enhanced processing capacity of computers, the availability of specialized software, and refined mathematical techniques. We estimated the incremental cost-effectiveness of ten strategies for colon cancer screening, as well as no screening, incorporating quality of life, noncompliance, and data on the costs and benefits of chemotherapy in Iran. We used a Markov model to measure the costs and quality-adjusted life expectancy of a 50-year-old average-risk Iranian without screening and with screening by each test. We tested the model with data from the Ministry of Health and the published literature. Costs were considered from the perspective of a health insurance organization, inflated to 2011, with Iranian Rials converted into US dollars. We focused on the three tests underlying the 10 strategies currently used for population screening in some Iranian provinces (Kerman, Golestan, Mazandaran, Ardabil, and Tehran): low-sensitivity guaiac fecal occult blood testing, performed annually; fecal immunochemical testing, performed annually; and colonoscopy, performed every 10 years. These strategies reduced the incidence of colorectal cancer by 39%, 60% and 76%, and mortality by 50%, 69% and 78%, respectively, compared with no screening, and generated incremental cost-effectiveness ratios (ICERs) of $9067, $654 and $8700 per quality-adjusted life year (QALY), respectively. Sensitivity analyses were conducted to assess the influence of various parameters on the economic evaluation of screening; the results were sensitive under probabilistic sensitivity analysis. Colonoscopy every ten years yielded the greatest net health value. Screening for colon cancer is economical and cost-effective at conventional levels of willingness to pay (WTP).
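The Markov cohort logic described in the abstract can be sketched in a few lines. All transition probabilities, utilities, and costs below are illustrative placeholders, not the paper's Iranian data, and a three-state model (well, cancer, dead) stands in for the full screening model:

```python
import numpy as np

def markov_cohort(p_transition, utilities, costs, years=30, discount=0.03):
    """Run a cohort through an annual-cycle Markov model;
    return discounted QALYs and discounted costs."""
    dist = np.array([1.0, 0.0, 0.0])           # everyone starts in 'well'
    qalys = cost = 0.0
    for t in range(years):
        d = 1.0 / (1.0 + discount) ** t
        qalys += d * dist @ utilities
        cost += d * dist @ costs
        dist = dist @ p_transition             # advance one annual cycle
    return qalys, cost

# States: well, colorectal cancer, dead (rows sum to 1).
no_screen = np.array([[0.985, 0.010, 0.005],
                      [0.000, 0.850, 0.150],
                      [0.000, 0.000, 1.000]])
screen    = np.array([[0.989, 0.004, 0.007],   # lower incidence with screening
                      [0.000, 0.900, 0.100],
                      [0.000, 0.000, 1.000]])
utilities = np.array([1.0, 0.7, 0.0])          # QALY weight per state
cost_ns   = np.array([0.0, 12000.0, 0.0])      # annual cost per state (USD)
cost_s    = np.array([80.0, 12000.0, 0.0])     # screening adds an annual test cost

q0, c0 = markov_cohort(no_screen, utilities, cost_ns)
q1, c1 = markov_cohort(screen, utilities, cost_s)
icer = (c1 - c0) / (q1 - q0)                   # incremental cost per QALY gained
```

Comparing each screening strategy against no screening in this way yields the ICER per QALY figures the abstract reports.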

Design of RFID Air Protocol Filtering and Probabilistic Simulation of Identification Procedure (RFID 무선 프로토콜 필터링의 설계와 확률적 인식 과정 시뮬레이션)

  • Park, Hyun-Sung; Kim, Jong-Deok
    • The Journal of Korean Institute of Communications and Information Sciences / v.34 no.6B / pp.585-594 / 2009
  • Efficient filtering is an important factor in RFID system performance. Given the huge volume of tag data expected in future ubiquitous environments, if RFID readers transmit tag data to upper-layer applications without filtering, system performance degrades significantly. In this paper, we provide an efficient filtering technique that operates on the RFID air protocol. Air-protocol filtering between tags and a reader has advantages over filtering in readers and middleware, because it reduces the volume of filtering work before readers and middleware begin their own filtering. Exploiting this advantage, we introduce a geometrical algorithm for generating air-protocol filters and verify its performance through simulation with analytical time models. Results for a dense RFID reader environment show that the air-protocol filtering algorithm reduces the total filtering time by almost half compared with linear search.
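The core idea, selecting tags over the air so unwanted data never reaches the middleware, can be sketched with a simplified Select-style bit-pattern match. The bit strings and offsets below are hypothetical, not the paper's filter-generation algorithm:

```python
def matches(epc_bits, offset, pattern):
    """True if the tag's EPC contains `pattern` at bit position `offset`."""
    return epc_bits[offset:offset + len(pattern)] == pattern

def air_filter(tags, filters):
    """Keep only tags whose EPC matches every (offset, pattern) filter,
    mimicking air-protocol selection before data reaches the middleware."""
    return [t for t in tags if all(matches(t, off, pat) for off, pat in filters)]

# Hypothetical 8-bit EPCs; select tags starting with '00' with '11' at bit 6.
tags = ["00101100", "00111100", "10101100", "00101111"]
selected = air_filter(tags, [(0, "00"), (6, "11")])   # -> ["00101111"]
```

In the paper's setting, the geometrical algorithm generates the (offset, pattern) filter set itself; this sketch only shows how such filters prune tag responses.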

Optimal Facial Emotion Feature Analysis Method based on ASM-LK Optical Flow (ASM-LK Optical Flow 기반 최적 얼굴정서 특징분석 기법)

  • Ko, Kwang-Eun; Park, Seung-Min; Park, Jun-Heong; Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.21 no.4 / pp.512-517 / 2011
  • In this paper, we propose an Active Shape Model (ASM) and Lucas-Kanade (LK) optical-flow-based feature extraction and analysis method for analyzing emotional features in facial images. Since the facial emotion feature regions are described by the Facial Action Coding System, we construct feature-related shape models from combinations of landmarks and extract the LK optical flow vectors at each landmark, based on the center pixels of the motion vector windows. The facial emotion features are modeled by combinations of the optical flow vectors, and the emotional state of a facial image can be estimated by a probabilistic estimation technique such as a Bayesian classifier. We also extract the optimal emotional features, those with high correlation between feature points and emotional states, by using common spatial pattern (CSP) analysis, in order to improve the efficiency and accuracy of the emotional feature extraction process.

A Conceptual Approach for Discovering Proportions of Disjunctive Routing Patterns in a Business Process Model

  • Kim, Kyoungsook; Yeon, Moonsuk; Jeong, Byeongsoo; Kim, Kwanghoon
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.2 / pp.1148-1161 / 2017
  • The success of a business process management system stands or falls on the quality of its business processes. Much research has therefore devoted considerable attention to the modeling and analysis of business processes in process-centered organizations. One such effort applies probabilistic theory to the analytical evaluation of business process models in order to improve their quality. In this paper, we develop a conceptual way of applying a probability theory of proportions to modeling business processes. Four types of routing patterns, sequential, disjunctive, conjunctive, and iterative, appear in business process models, and the proportion theory is applicable to them. This paper focuses on applying the proportion theory to disjunctive routing patterns in particular, and formally defines a proportional information control net as the formal representation of the corresponding business process model. We propose a conceptual approach to discover a proportional information control net from the enactment event histories of the corresponding business process, and describe a series of procedural frameworks and operational mechanisms, formally and graphically, supporting the proposed approach. We believe the conceptual approach with the proportional information control net will be very useful for improving the quality of business processes by supporting the reengineering and redesign of the corresponding processes.
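The discovery step can be illustrated with a small sketch. Assuming enactment event histories are simple activity sequences and `a` is a hypothetical OR-split activity, branch proportions fall out of frequency counts over the histories:

```python
from collections import Counter

def branch_proportions(traces, split_activity):
    """Estimate the selection proportion of each branch leaving a
    disjunctive (OR-split) activity from enactment event histories."""
    next_acts = Counter()
    for trace in traces:
        for i, act in enumerate(trace[:-1]):
            if act == split_activity:
                next_acts[trace[i + 1]] += 1   # which branch fired next
    total = sum(next_acts.values())
    return {act: n / total for act, n in next_acts.items()}

# Hypothetical event histories: activity 'a' is an OR-split to 'b' or 'c'.
logs = [["a", "b", "d"], ["a", "c", "d"], ["a", "b", "d"], ["a", "b", "d"]]
props = branch_proportions(logs, "a")          # -> {'b': 0.75, 'c': 0.25}
```

Annotating each disjunctive branch of the control net with such proportions is the essence of the proportional information control net the paper defines.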

Evolutionary Algorithms with Distribution Estimation by Variational Bayesian Mixtures of Factor Analyzers (변분 베이지안 혼합 인자 분석에 의한 분포 추정을 이용하는 진화 알고리즘)

  • Cho Dong-Yeon; Zhang Byoung-Tak
    • Journal of KIISE: Software and Applications / v.32 no.11 / pp.1071-1083 / 2005
  • By estimating the probability distribution of the good solutions in the current population, some researchers try to find the optimal solution more efficiently. In particular, finite mixtures of distributions are very useful for dealing with complex problems. However, it is difficult to choose the number of components in the mixture model and to merge the superior partial solutions represented by each component. In this paper, we propose a new continuous evolutionary optimization algorithm with distribution estimation by variational Bayesian mixtures of factor analyzers. This technique can estimate the number of mixture components automatically and combine good sub-solutions by sampling new individuals from the latent variables. In comparison with two probabilistic model-based evolutionary algorithms, the proposed scheme achieves superior performance on traditional benchmark function optimization. We also successfully estimate the parameters of an S-system for the dynamic modeling of biochemical networks.
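The general distribution-estimation loop can be sketched as follows. For brevity this sketch fits a single diagonal Gaussian to the elite rather than the paper's variational Bayesian mixture of factor analyzers; the benchmark function is the standard sphere function, not one of the paper's test suites:

```python
import random
import statistics

def eda_minimize(f, dim, pop=60, elite=15, gens=40, seed=1):
    """Estimation-of-distribution algorithm: repeatedly fit a probability
    model (here, a diagonal Gaussian) to the best solutions and sample
    the next population from it."""
    rng = random.Random(seed)
    xs = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        xs.sort(key=f)                         # rank by objective value
        best = xs[:elite]                      # truncation selection
        mu = [statistics.mean(x[d] for x in best) for d in range(dim)]
        sd = [statistics.stdev(x[d] for x in best) + 1e-9 for d in range(dim)]
        xs = best + [[rng.gauss(mu[d], sd[d]) for d in range(dim)]
                     for _ in range(pop - elite)]
    return min(xs, key=f)

sphere = lambda x: sum(v * v for v in x)       # minimum 0 at the origin
x_best = eda_minimize(sphere, dim=3)
```

Replacing the single Gaussian with a mixture model is what lets the method keep several superior partial solutions alive at once, which is the difficulty the paper's variational Bayesian approach addresses.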

Reliability analysis of reinforced concrete haunched beams shear capacity based on stochastic nonlinear FE analysis

  • Albegmprli, Hasan M.; Cevik, Abdulkadir; Gulsan, M. Eren; Kurtoglu, Ahmet Emin
    • Computers and Concrete / v.15 no.2 / pp.259-277 / 2015
  • The lack of experimental studies on the mechanical behavior of reinforced concrete (RC) haunched beams leads to difficulties in statistical and reliability analyses. This study performs stochastic and reliability analyses of the ultimate shear capacity of RC haunched beams based on nonlinear finite element analysis. The main aim is to investigate the influence of uncertainty in material properties and geometry parameters on the mechanical performance and shear capacity of RC haunched beams. First, 65 experimentally tested RC haunched beams and prismatic beams are analyzed via the deterministic nonlinear finite element method using a specialized program (ATENA) to verify the efficiency of the numerical models, the shear capacity, and the crack pattern. The accuracy of the nonlinear finite element analyses is verified by comparing them with the experiments; the two sets of results are found to be in good agreement. Afterwards, stochastic analyses are performed for each beam, in which the RC material properties and geometry parameters are assigned probabilistic values using an advanced simulation procedure, and the statistical parameters are determined: the resistance bias factor and the coefficient of variation are found to be 1.053 and 0.137, respectively. Finally, reliability analyses are carried out using the limit state functions of ACI-318 and ASCE-7 with the calculated statistical parameters. The results show that RC haunched beams are more sensitive to uncertainty, and hence riskier, than RC prismatic beams.
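How a bias factor and coefficient of variation emerge from a stochastic analysis can be sketched with Monte Carlo sampling. A toy closed-form shear expression stands in for the paper's nonlinear FE model, and all distributions below are illustrative assumptions:

```python
import random
import statistics

def shear_capacity(fc, bw, d):
    """Toy shear-capacity model, Vc = 0.17*sqrt(fc)*bw*d (an ACI-like form,
    in kN); a stand-in for the nonlinear FE model used in the paper."""
    return 0.17 * fc ** 0.5 * bw * d / 1000.0

rng = random.Random(0)
nominal = shear_capacity(30.0, 300.0, 500.0)   # deterministic design value

# Stochastic analysis: assign probabilistic values to material/geometry inputs.
samples = [shear_capacity(rng.gauss(30.0, 3.0),    # concrete strength fc (MPa)
                          rng.gauss(300.0, 6.0),   # web width bw (mm)
                          rng.gauss(500.0, 10.0))  # effective depth d (mm)
           for _ in range(5000)]

bias = statistics.mean(samples) / nominal          # resistance bias factor
cov = statistics.stdev(samples) / statistics.mean(samples)  # coeff. of variation
```

The bias factor and coefficient of variation computed this way are exactly the statistical parameters (1.053 and 0.137 in the paper) that feed the ACI-318 and ASCE-7 limit state functions in the subsequent reliability analysis.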

FAULT DETECTION COVERAGE QUANTIFICATION OF AUTOMATIC TEST FUNCTIONS OF DIGITAL I&C SYSTEM IN NPPS

  • Choi, Jong-Gyun; Lee, Seung-Jun; Kang, Hyun-Gook; Hur, Seop; Lee, Young-Jun; Jang, Seung-Cheol
    • Nuclear Engineering and Technology / v.44 no.4 / pp.421-428 / 2012
  • Analog instrumentation and control systems in nuclear power plants have recently been replaced with digital systems for safer and more efficient operation. Digital instrumentation and control systems have adopted various fault-tolerant techniques that help the system correctly and safely perform its required functions regardless of the presence of faults. Each fault-tolerant technique has a different inspection period, from real-time monitoring to monthly testing, and the range covered by each technique also differs. A digital instrumentation and control system therefore adopts multiple barriers, consisting of various fault-tolerant techniques, to increase its total fault detection coverage. Even though these techniques are adopted to ensure and improve system safety, their effects on system safety have not yet been properly considered in most probabilistic safety analysis models, so it is necessary to develop an evaluation method that can describe these features of digital instrumentation and control systems. Several issues must be considered in estimating the fault coverage of a digital instrumentation and control system, and two of them are addressed in this work. The first is to quantify the fault coverage of each fault-tolerant technique implemented in the system. The second is to exclude the duplicated effect of fault-tolerant techniques implemented simultaneously at each level of the system's hierarchy, as a fault occurring in the system might be detected by more than one technique. For this work, a fault injection experiment was used to obtain the exact relations between faults and the multiple barriers of fault-tolerant techniques. The experiment was applied to a bistable processor of a reactor protection system.
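The duplicated-effect issue can be illustrated with a toy fault-injection bookkeeping sketch. The technique names and fault IDs below are hypothetical; the point is that summing per-technique coverages double-counts faults caught by more than one barrier, whereas the union over detected faults does not:

```python
# Fault-injection results: which injected faults each fault-tolerant
# technique detected (hypothetical technique names and fault IDs).
detected_by = {
    "watchdog":    {1, 2, 3, 8},
    "memory_test": {3, 4, 5},
    "self_check":  {5, 6, 8},
}
injected = set(range(1, 11))   # 10 injected faults in total

# Per-technique fault detection coverage (issue 1: quantify each barrier).
coverage = {t: len(d) / len(injected) for t, d in detected_by.items()}

# Naively summing coverages double-counts faults 3, 5, and 8, which are
# detected by two barriers each; the union excludes that duplicated effect
# (issue 2).
naive_sum = sum(coverage.values())                               # 1.0 here
total_coverage = len(set.union(*detected_by.values())) / len(injected)  # 0.7
```

A fault injection experiment, as used in the paper, is precisely what populates a table like `detected_by` with the exact fault-to-barrier relations.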

Probability-based Deep Learning Clustering Model for the Collection of IoT Information (IoT 정보 수집을 위한 확률 기반의 딥러닝 클러스터링 모델)

  • Jeong, Yoon-Su
    • Journal of Digital Convergence / v.18 no.3 / pp.189-194 / 2020
  • Recently, various clustering techniques have been studied to efficiently handle data generated by heterogeneous IoT devices. However, existing clustering techniques are not suitable for mobile IoT devices because they focus on statically partitioning the network. This paper proposes a probabilistic deep-learning-based dynamic clustering model for collecting and analyzing information from IoT devices over edge networks. The proposed model establishes subnets by feeding the frequencies of the probabilistically collected attribute values into deep learning. The established subnets are used to group information extracted from seeds into hierarchical structures, improving the speed and accuracy of dynamic clustering for IoT devices. The performance evaluation shows that the proposed model improves data processing time by 13.8% on average over the existing model, while the server's overhead is 10.5% lower on average. The accuracy of extracting IoT information at the server improves by 8.7% on average over the existing model.

Statistical analysis and probabilistic modeling of WIM monitoring data of an instrumented arch bridge

  • Ye, X.W.; Su, Y.H.; Xi, P.S.; Chen, B.; Han, J.P.
    • Smart Structures and Systems / v.17 no.6 / pp.1087-1105 / 2016
  • Traffic load and volume are among the most important physical quantities for bridge safety evaluation and the formulation of maintenance strategies. This paper conducts a statistical analysis of traffic volume information and multimodal modeling of gross vehicle weight (GVW) based on monitoring data obtained from the weigh-in-motion (WIM) system installed on the Jiubao Bridge, an arch bridge located in Hangzhou, China. A genetic algorithm (GA)-based mixture parameter estimation approach is developed to derive the unknown parameters of mixed distribution models. A statistical analysis of one year of WIM data is first performed by vehicle type, single-axle weight, and GVW. The probability density function (PDF) and cumulative distribution function (CDF) of the GVW data for selected vehicle types are then formulated using three kinds of finite mixture distributions (normal, lognormal, and Weibull), with the mixture parameters determined by the proposed GA-based method. The results indicate that the stochastic properties of the GVW data acquired from the field-instrumented WIM sensors are effectively characterized by finite mixture distributions in conjunction with the proposed GA-based mixture parameter identification algorithm. Moreover, the Weibull mixture distribution is relatively superior for modeling the WIM data, on the basis of the calculated Akaike information criterion (AIC) values.
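GA-based mixture parameter estimation can be sketched by fitting a two-component normal mixture to synthetic bimodal data via log-likelihood maximization. The data, GA operators, and settings below are illustrative stand-ins, not the paper's Jiubao Bridge WIM data or its exact GA:

```python
import math
import random
import statistics

def mix_loglik(params, data):
    """Log-likelihood of a two-component normal mixture (w, m1, s1, m2, s2)."""
    w, m1, s1, m2, s2 = params
    ll = 0.0
    for x in data:
        p1 = math.exp(-0.5 * ((x - m1) / s1) ** 2) / (s1 * math.sqrt(2 * math.pi))
        p2 = math.exp(-0.5 * ((x - m2) / s2) ** 2) / (s2 * math.sqrt(2 * math.pi))
        ll += math.log(w * p1 + (1 - w) * p2 + 1e-300)
    return ll

def mutate(ind, rng):
    """Gaussian mutation, clamping the weight and standard deviations."""
    w, m1, s1, m2, s2 = ind
    w = min(0.95, max(0.05, w + rng.gauss(0, 0.05)))
    return [w, m1 + rng.gauss(0, 0.5), max(0.1, s1 + rng.gauss(0, 0.2)),
            m2 + rng.gauss(0, 0.5), max(0.1, s2 + rng.gauss(0, 0.2))]

def ga_fit(data, pop=40, gens=60, seed=3):
    """Truncation-selection GA over mixture parameters: a simplified
    stand-in for the paper's GA-based estimator."""
    rng = random.Random(seed)
    data = sorted(data)
    sd = statistics.stdev(data)
    # Seed one individual from the quartiles; the rest are perturbations.
    seed_ind = [0.5, data[len(data) // 4], sd / 4,
                data[3 * len(data) // 4], sd / 4]
    popn = [seed_ind] + [mutate(seed_ind, rng) for _ in range(pop - 1)]
    for _ in range(gens):
        popn.sort(key=lambda p: -mix_loglik(p, data))
        elite = popn[:pop // 4]                    # keep the fittest quarter
        popn = elite + [mutate(rng.choice(elite), rng)
                        for _ in range(pop - len(elite))]
    return max(popn, key=lambda p: mix_loglik(p, data))

rng = random.Random(0)
# Synthetic GVW-like bimodal sample: light and heavy vehicle clusters (tonnes).
data = [rng.gauss(3, 1) for _ in range(150)] + [rng.gauss(30, 4) for _ in range(150)]
best = ga_fit(data)
```

Once candidate mixtures (normal, lognormal, Weibull) are fitted this way, comparing their AIC values selects the best-fitting family, as the paper does.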

Centroid-model based music similarity with alpha divergence (알파 다이버전스를 이용한 무게중심 모델 기반 음악 유사도)

  • Seo, Jin Soo; Kim, Jeonghyun; Park, Jihyun
    • The Journal of the Acoustical Society of Korea / v.35 no.2 / pp.83-91 / 2016
  • Music-similarity computation is crucial for developing music information retrieval systems for browsing and classification. This paper reviews the recently proposed centroid-model based music retrieval method and applies distributional similarity measures to the model for retrieval-performance evaluation. Probabilistic distance measures (also called divergences) compute the distance between two probability distributions in a certain sense. In this paper, we use the alpha divergence to compute the distance between two centroid models for music retrieval; it includes the widely used Kullback-Leibler divergence and the Bhattacharyya distance as special cases, depending on the value of alpha. Experiments were conducted on both genre and singer datasets, comparing the music-retrieval performance of the distributional similarity measures with that of vector distances. The results show that the alpha divergence improves the performance of centroid-model based music retrieval.
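The special-case behavior of the alpha divergence can be checked numerically for discrete distributions. The parametrization below is one common convention (Amari-style); the paper applies the divergence to centroid models rather than the toy distributions used here:

```python
import math

def alpha_divergence(p, q, alpha):
    """Alpha divergence between discrete distributions, in the convention
    D_a(p||q) = (1 - sum_i p_i^a * q_i^(1-a)) / (a * (1 - a))."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return (1.0 - s) / (alpha * (1.0 - alpha))

def kl(p, q):
    """Kullback-Leibler divergence KL(p||q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]

# As alpha -> 1 the alpha divergence approaches KL(p||q); at alpha = 0.5 it
# equals 4*(1 - BC), where BC is the Bhattacharyya coefficient underlying
# the Bhattacharyya distance -ln(BC).
near_kl = alpha_divergence(p, q, 0.999)
bc = sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))
half_alpha = alpha_divergence(p, q, 0.5)
```

Sweeping alpha thus interpolates between the two classical measures, which is what allows the retrieval experiments to tune the divergence to the task.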