• Title/Summary/Keyword: Consistency Algorithm


Bayesian estimation for finite population proportions in multinomial data

  • Kwak, Sang-Gyu;Kim, Dal-Ho
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.3
    • /
    • pp.587-593
    • /
    • 2012
  • We study Bayesian estimates for finite population proportions in multinomial problems. To do this, we consider a three-stage hierarchical Bayesian model. For the prior, we use a Dirichlet density to model each cell probability in each cluster. Our method does not require complicated computation such as the Metropolis-Hastings algorithm to draw samples from each density of parameters. We draw samples using a Gibbs sampler with the grid method. We apply this algorithm to a couple of simulated data sets under three scenarios and estimate the finite population proportions using two kinds of approaches. We compare the results with the point estimates of the finite population proportions and their standard deviations. Finally, we check the consistency of the computation using different samples drawn from distinct iterates.
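
The grid method mentioned above can be sketched as follows: evaluate the (unnormalized) full conditional on a finite grid, normalize, and draw via the discrete inverse CDF. Below is a minimal sketch for a single cell probability with a Beta-like conditional; the prior parameters, counts, and grid size are illustrative assumptions, not the paper's model or data.

```python
import numpy as np

def grid_draw(log_density, grid, rng):
    """Draw one sample from a 1-D density evaluated on a grid
    (discrete inverse-CDF sampling)."""
    logp = log_density(grid)
    p = np.exp(logp - logp.max())      # subtract max for numerical stability
    p /= p.sum()
    return rng.choice(grid, p=p)

# Illustrative conditional: cell probability with a Dirichlet (Beta)
# prior and multinomial counts -- assumed values, not the paper's.
alpha, beta = 2.0, 2.0                 # prior parameters (assumed)
successes, failures = 30, 70           # observed counts (assumed)

def log_post(theta):
    return (alpha + successes - 1) * np.log(theta) + \
           (beta + failures - 1) * np.log(1 - theta)

rng = np.random.default_rng(0)
grid = np.linspace(0.001, 0.999, 999)
draws = np.array([grid_draw(log_post, grid, rng) for _ in range(2000)])
print(round(draws.mean(), 2))          # close to (2+30)/(4+100), about 0.31
```

Within a Gibbs sampler, one such grid draw would be made per parameter per iteration, conditioning on the current values of the others.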

An Evidence Retraction Scheme on Evidence Dependency Network

  • Lee, Gye Sung
    • International journal of advanced smart convergence
    • /
    • v.8 no.1
    • /
    • pp.133-140
    • /
    • 2019
  • In this paper, we present an algorithm for adjusting degrees of belief for consistency on an evidence dependency network, where various sets of evidence support different sets of hypotheses. It is common for experts to assign a higher degree of belief to a hypothesis when more evidence supports it. A human expert without knowledge of uncertainty handling may not be able to anticipate how evidence combines to produce a given belief value. Belief in a hypothesis changes as a series of evidence items becomes known to be true. In non-monotonic reasoning environments, a belief retraction method is needed to deal clearly with uncertain situations. We create an evidence dependency network from rules and apply the evidence retraction algorithm to refine belief values on the hypothesis set. We also introduce negative belief values to reflect the reverse effect of evidence combination.
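
One way to picture evidence combination and retraction is with MYCIN-style certainty factors: beliefs combine incrementally, and retracting one piece of evidence means recombining the remainder. This is an illustrative stand-in, not the paper's own combination or retraction rule; the certainty values are assumed.

```python
def combine(cf1, cf2):
    """MYCIN-style certainty-factor combination (illustrative)."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

def belief(evidence_cfs):
    """Fold all evidence supporting a hypothesis into one belief value."""
    b = 0.0
    for cf in evidence_cfs:
        b = combine(b, cf)
    return b

def retract(evidence_cfs, retracted):
    """Retract one piece of evidence by recombining the rest;
    the combination is not directly invertible, so we recompute."""
    remaining = list(evidence_cfs)
    remaining.remove(retracted)
    return belief(remaining)

cfs = [0.6, 0.4, -0.3]          # two supporting items, one negative (assumed)
print(round(belief(cfs), 3))    # combined belief with the negative evidence
print(round(retract(cfs, -0.3), 3))  # belief after retracting it
```

Note the negative value plays the role of the "negative belief" the abstract mentions: it pulls the combined belief down, and retracting it restores the purely supportive combination.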

A Variational Model For Longitudinal Brain Tissue Segmentation

  • Tang, Mingjun;Chen, Renwen;You, Zijuan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.11
    • /
    • pp.3479-3492
    • /
    • 2022
  • Longitudinal quantification of brain changes due to development, aging, or disease plays an important role in the field of personalized-medicine applications. However, due to temporal variability in shape and differences in imaging equipment and parameters, estimating anatomical changes in longitudinal studies is significantly challenging. In this paper, a longitudinal Magnetic Resonance (MR) brain image segmentation algorithm is proposed by combining intensity information and an anisotropic smoothness term, which contains a spatial smoothness constraint and a longitudinal consistency constraint, into a variational framework. The minimization of the proposed energy functional is strictly and efficiently derived from a fast optimization algorithm. Extensive experimental results show that the proposed method can guarantee segmentation accuracy and longitudinal consistency on both simulated and real longitudinal MR brain images for analysis of anatomical changes over time.
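
The interplay of a data term, a spatial smoothness term, and a longitudinal consistency term can be illustrated with a toy quadratic energy minimized by gradient descent on a 1-D signal. This is a simplified surrogate under assumed weights, not the paper's functional or optimizer.

```python
import numpy as np

def energy_step(u, f, lam, mu, u_prev_time, step=0.1):
    """One gradient-descent step on a toy energy:
    ||u - f||^2 (data) + lam*||grad u||^2 (spatial smoothness)
    + mu*||u - u_prev_time||^2 (longitudinal consistency).
    Illustrative surrogate, not the paper's functional."""
    lap = np.roll(u, 1) + np.roll(u, -1) - 2 * u   # discrete Laplacian (periodic)
    grad = 2 * (u - f) - 2 * lam * lap + 2 * mu * (u - u_prev_time)
    return u - step * grad

rng = np.random.default_rng(1)
clean = np.concatenate([np.zeros(50), np.ones(50)])
f = clean + 0.2 * rng.standard_normal(100)   # noisy "current" time point
u_prev = clean.copy()                        # previous time point (assumed)

u = f.copy()
for _ in range(200):
    u = energy_step(u, f, lam=1.0, mu=0.5, u_prev_time=u_prev)
# u is now spatially smoother than f and pulled toward the earlier result
```

The smoothness weight `lam` suppresses noise, while `mu` keeps the current estimate consistent with the previous time point, which is the basic role the longitudinal constraint plays in the abstract.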

Optimal dwelling time prediction for package tour using K-nearest neighbor classification algorithm

  • Aria Bisma Wahyutama;Mintae Hwang
    • ETRI Journal
    • /
    • v.46 no.3
    • /
    • pp.473-484
    • /
    • 2024
  • We introduce a machine learning-based web application to help travel agents plan a package tour schedule. K-nearest neighbor (KNN) classification predicts tourists' optimal dwelling time from a variety of information to automatically generate a convenient tour schedule. A database collected in collaboration with an established travel agency is fed into the KNN algorithm implemented in Python, and the predicted dwelling times are sent to the web application via a RESTful application programming interface provided by the Flask framework. The web application displays a page in which agents can configure the initial data, predict the optimal dwelling time, and automatically update the tour schedule. In a performance evaluation simulating a scenario on a computer running the Windows operating system, the average response time was 1.762 s, and the prediction consistency was 100% over 100 iterations.
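
The KNN step can be sketched compactly: classify a query by majority vote among its k nearest training points. The feature encoding and labels below are invented for illustration; the paper's travel-agency schema is not given here.

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among the k nearest
    training points (Euclidean distance). `train` is a list of
    (feature_vector, label) pairs."""
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: (group_size, attraction_type_code) -> dwelling-time class
train = [
    ((10, 0), "short"), ((12, 0), "short"), ((11, 1), "short"),
    ((40, 2), "long"), ((38, 2), "long"), ((42, 1), "long"),
]
print(knn_predict(train, (39, 2)))   # -> "long"
```

In the described system, a prediction like this would be computed server-side and returned to the web page over the Flask REST interface.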

Interface Mapping and Generation Methods for Intuitive User Interface and Consistency Provision (사용자 인터페이스의 직관적인 인식 및 일관성 부여를 위한 인터페이스 매핑 및 생성 기법)

  • Yoon, Hyo-Seok;Woo, Woon-Tack
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2009.02a
    • /
    • pp.135-139
    • /
    • 2009
  • In this paper we present INCUI, a user interface based on a natural view of the physical user interface of target devices and services in a pervasive computing environment. We present the concept of an Intuitively Natural and Consistent User Interface (INCUI), consisting of an image of a physical user interface and a descriptive XML file. We then elaborate how an INCUI template can be used to consistently map user interface components structurally and visually. We describe the process of INCUI mapping and a novel mapping method selection architecture based on domain size and the types of source and target INCUI. In particular, we developed and applied an extended LCS-based algorithm using prefix/postfix/synonym matching for similarity calculation.
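
The core of an LCS-based similarity measure is the classic dynamic program, normalized to [0, 1]; a synonym table can normalize tokens before matching. This sketch shows the plain LCS with a synonym pass; the paper's extended prefix/postfix/synonym algorithm is more elaborate, and the synonym table here is an assumption.

```python
def lcs_length(a, b):
    """Classic dynamic-programming longest-common-subsequence length."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, ca in enumerate(a, 1):
        for j, cb in enumerate(b, 1):
            dp[i][j] = dp[i-1][j-1] + 1 if ca == cb else max(dp[i-1][j], dp[i][j-1])
    return dp[len(a)][len(b)]

def similarity(a, b, synonyms=None):
    """LCS-based similarity in [0, 1] between token sequences,
    after mapping synonyms to a canonical form (illustrative)."""
    synonyms = synonyms or {}
    a = [synonyms.get(t, t) for t in a.lower().split()]
    b = [synonyms.get(t, t) for t in b.lower().split()]
    if not a or not b:
        return 0.0
    return 2 * lcs_length(a, b) / (len(a) + len(b))

syn = {"vol": "volume", "pwr": "power"}        # assumed synonym table
print(similarity("volume up button", "vol up", synonyms=syn))   # -> 0.8
```

Such a score would let the mapping step decide which source UI component best corresponds to each target component.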


GPS Integrity Monitoring Method Using Auxiliary Nonlinear Filters with Log Likelihood Ratio Test Approach

  • Ahn, Jong-Sun;Rosihan, Rosihan;Won, Dae-Hee;Lee, Young-Jae;Nam, Gi-Wook;Heo, Moon-Beom;Sung, Sang-Kyung
    • Journal of Electrical Engineering and Technology
    • /
    • v.6 no.4
    • /
    • pp.563-572
    • /
    • 2011
  • Reliability is an essential factor in a navigation system, so an integrity monitoring system is considered one of the most important parts of an avionic navigation system. A fault due to systematic malfunctioning requires integrity reinforcement through systematic analysis. In this paper, we propose a method to detect faults in the GPS signal by using a distributed nonlinear-filter-based probability test. To detect faults, consistency is examined through a likelihood ratio between the main and auxiliary particle filters (PFs). Specifically, the main PF, which includes all the measurements, and the auxiliary PFs, which use only partial measurements, are used in the consistency test. Through GPS measurements and application to the autonomous integrity monitoring system, the current study illustrates the performance of the proposed fault detection algorithm.
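
The log likelihood ratio test behind this kind of consistency check can be sketched simply: if the auxiliary filter (with a suspect measurement excluded) explains the residuals much better than the main filter (all measurements), a fault is flagged. The Gaussian residual model, noise level, and threshold below are illustrative assumptions, not the paper's particle-filter formulation.

```python
import math

def gaussian_loglik(residuals, sigma):
    """Log-likelihood of zero-mean Gaussian measurement residuals."""
    return sum(-0.5 * (r / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
               for r in residuals)

def fault_detected(main_residuals, aux_residuals, sigma=1.0, threshold=5.0):
    """Flag a fault when the auxiliary filter explains the data much
    better than the main filter. Values are illustrative assumptions."""
    llr = gaussian_loglik(aux_residuals, sigma) - gaussian_loglik(main_residuals, sigma)
    return llr > threshold

print(fault_detected([3.0, 3.2, 2.9], [0.1, -0.2, 0.1]))   # large gap -> True
print(fault_detected([0.2, 0.1, -0.1], [0.1, -0.2, 0.1]))  # -> False
```

In the particle-filter setting of the paper, the likelihoods would be computed from the filters' weighted particle sets rather than a closed-form Gaussian.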

A new Observation Model to Improve the Consistency of EKF-SLAM Algorithm in Large-scale Environments (광범위 환경에서 EKF-SLAM의 일관성 향상을 위한 새로운 관찰모델)

  • Nam, Chang-Joo;Kang, Jae-Hyeon;Doh, Nak-Ju Lett
    • The Journal of Korea Robotics Society
    • /
    • v.7 no.1
    • /
    • pp.29-34
    • /
    • 2012
  • This paper suggests a new observation model for Extended Kalman Filter based Simultaneous Localization and Mapping (EKF-SLAM). Since the EKF framework linearizes nonlinear functions around the current estimate, the conventional line model suffers large linearization errors when a mobile robot is located far away from its initial position. The model that we propose yields less linearization error with respect to the landmark position and is thus suitable for a large-scale environment. To achieve this, we build up a three-dimensional space by adding a virtual axis to the robot's two-dimensional coordinate system and extract a plane by using a line detected in the two-dimensional space and the virtual axis. Since the Jacobian matrix with respect to the landmark position has small values, we can estimate the positions of landmarks better than with the conventional line model. Simulation results verify that the new model yields smaller linearization errors than the conventional line model.
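
The observation model enters EKF-SLAM through the measurement prediction h and its Jacobian H; the paper's contribution is a plane-based h/H with smaller linearization error than the line model. The update step itself is standard, sketched below with a toy linear 1-D observation (not the paper's specific model).

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Textbook EKF measurement update. The observation model
    appears only through h (prediction) and its Jacobian H."""
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy 1-D landmark example with a linear observation z = x_landmark
x = np.array([0.0])        # prior landmark estimate
P = np.array([[4.0]])      # prior covariance
H = np.array([[1.0]])
R = np.array([[1.0]])      # measurement noise
x, P = ekf_update(x, P, np.array([2.0]), lambda s: H @ s, H, R)
print(x.round(2), P.round(2))   # posterior mean 1.6, variance 0.8
```

With a nonlinear h, H would be re-evaluated at the current estimate at each step, which is exactly where small Jacobian entries (as in the proposed plane model) keep linearization errors down.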

An Integrated Database of Engineering Documents and CAD/CAE Information for the Support of Bridge Maintenance (교량 유지관리 지원을 위한 CAD/CAE 정보와 엔지니어링 문서정보의 통합 데이터베이스)

  • Jeong Y.S.;Kim B.G.;Lee S.H.
    • Korean Journal of Computational Design and Engineering
    • /
    • v.11 no.3
    • /
    • pp.183-196
    • /
    • 2006
  • A new operation strategy, which can guarantee the data consistency of engineering information among various intelligent information systems, is presented for the engineering information of bridges, and a construction methodology for an integrated database is developed to support the strategy. Two core standard techniques are adopted to construct the integrated database: the Standard for the Exchange of Product Model Data (STEP) for CAD/CAE information and the Extensible Markup Language (XML) for engineering document information. The former enables structural engineers to handle structural details with three-dimensional geometry-based information of bridges, and the ACIS solid modeling kernel is employed to develop AutoCAD-based application modules. The latter turns document files into a data type for web-based application modules that help end users search and retrieve engineering document data. In addition, a relaying algorithm is developed to integrate the two different kinds of information, i.e., CAD/CAE information and engineering document information. Pilot application modules are also developed, and a case study of the Han-Nam bridge is presented at the end of the paper to illustrate the use of the developed application modules.
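
The relaying idea can be pictured as cross-referencing: each XML document record carries the identifier of the CAD/CAE entity it describes, so queries can hop between the two stores. The element and attribute names below are invented for illustration; the paper's STEP/XML schemas are not given here.

```python
import xml.etree.ElementTree as ET

# Illustrative XML store of engineering documents, each tagged with the
# ID of the structural member it describes (names are assumptions).
doc_xml = """
<documents>
  <document id="D-001" member="GIRDER-12" type="inspection_report"/>
  <document id="D-002" member="GIRDER-12" type="repair_history"/>
  <document id="D-003" member="PIER-03" type="inspection_report"/>
</documents>
"""

# Illustrative stand-in for the STEP-based CAD/CAE store.
cad_members = {"GIRDER-12": {"material": "steel"},
               "PIER-03": {"material": "concrete"}}

def documents_for_member(xml_text, member_id):
    """Relay from a CAD/CAE member ID to its engineering documents."""
    root = ET.fromstring(xml_text)
    return [d.get("id") for d in root.findall("document")
            if d.get("member") == member_id]

print(documents_for_member(doc_xml, "GIRDER-12"))   # ['D-001', 'D-002']
```

The reverse direction (from a document to the member's 3-D geometry) follows the same pattern, looking up the `member` attribute in the CAD/CAE store.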

Integration between XML-based Document Information and Bridge Information Model-based Structural Design Information (교량정보모델 기반의 설계정보와 XML 기반의 문서정보 통합)

  • Jeong Yeon-Suk;Kim Bong-Geun;Jeong Won-Seok;Lee Sang-Ho
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 2006.04a
    • /
    • pp.208-215
    • /
    • 2006
  • This study provides a new operation strategy which can guarantee the data consistency of engineering information among various intelligent information systems. We present strategies for the operation of bridge engineering information and a construction methodology for an integrated database. Two core standard techniques are adopted to construct the integrated database: the Standard for the Exchange of Product Model Data (STEP) for CAD/CAE information and the Extensible Markup Language (XML) for engineering document information. This study can transform a document file into a data type for web-based application modules which assist end users in searching and retrieving engineering document data. In addition, a relaying algorithm is developed to integrate the two different kinds of information, i.e., CAD/CAE information and engineering document information. Pilot application modules for the management and maintenance of existing bridges are also developed to show the application of the strategy.


Conflict Resolution for Data Synchronization in Multiple Devices (다중 디바이스에서 데이터 동기화를 위한 충돌 해결)

  • Oh Seman;La Hwanggyun
    • Journal of Korea Multimedia Society
    • /
    • v.8 no.2
    • /
    • pp.279-286
    • /
    • 2005
  • As the mobile environment has become widespread, data synchronization between mobile devices, or between a mobile device and a PC/server, is required. To address this problem, a consortium was established by companies such as Motorola, Ericsson, and Nokia, which released SyncML (Synchronization Markup Language) as an industry standard for interoperable data synchronization over various transmission protocols. However, in the synchronization process, when more than two clients request data synchronization, data conflicts can occur. This paper studies the various causes of conflict that can arise in data synchronization processes and groups them systematically. Based on this analysis, we compose Change Log Information (CLI) that keeps track of the changed information about synchronization, and we suggest an operation policy using CLI. Finally, we design an algorithm and apply the policy as a method for the safety and consistency of data.
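
A change log makes conflict resolution mechanical: when two devices update the same field, a policy over the log entries picks the winner. The sketch below uses a last-writer-wins policy over an assumed log schema; it is one simple policy for illustration, not the paper's CLI design (SyncML deployments may instead prefer server data or duplicate the record).

```python
from dataclasses import dataclass

@dataclass
class ChangeLogEntry:
    """One change-log record: which device changed which field, and
    when. Field names are assumptions, not the paper's CLI schema."""
    device: str
    field: str
    value: str
    timestamp: int

def resolve(entries):
    """Resolve conflicting per-field updates with last-writer-wins
    over the change log (one illustrative policy)."""
    winner = {}
    for e in sorted(entries, key=lambda e: e.timestamp):
        winner[e.field] = e.value   # later timestamps overwrite earlier ones
    return winner

log = [
    ChangeLogEntry("phone", "phone_number", "010-1111-2222", 100),
    ChangeLogEntry("pc",    "phone_number", "010-3333-4444", 120),
    ChangeLogEntry("phone", "email", "a@example.com", 110),
]
print(resolve(log))   # the later phone_number write wins
```

Keeping the log rather than only the final values is what allows the policy to be swapped (e.g., per-field priorities) without losing information about who changed what.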
