• Title/Summary/Keyword: Consistency Checks

Design and Implementation of a Large-Scale Spatial Reasoner Using MapReduce Framework (맵리듀스 프레임워크를 이용한 대용량 공간 추론기의 설계 및 구현)

  • Nam, Sang Ha;Kim, In Cheol
    • KIPS Transactions on Software and Data Engineering / v.3 no.10 / pp.397-406 / 2014
  • In order to answer questions on behalf of a human in DeepQA environments such as the American quiz show Jeopardy!, a computer must be capable of fast temporal and spatial reasoning over a large-scale commonsense knowledge base. In this paper, we present a scalable spatial reasoning algorithm for efficiently deriving new directional and topological relations using the MapReduce framework, one of the best-known parallel distributed computing environments. The proposed reasoning algorithm takes as input a large-scale spatial knowledge base including CSD-9 directional relations and RCC-8 topological relations. To infer new directional and topological relations from the given spatial knowledge base, it performs cross-consistency checks as well as path-consistency checks on the knowledge base. To maximize the parallelism of the reasoning computation according to the principles of the MapReduce framework, the algorithm partitions the large knowledge base into smaller ones and distributes them over multiple computing nodes in the map phase; in the reduce phase, it then infers new knowledge from the distributed spatial knowledge bases. Through experiments performed on a sample knowledge base with the MapReduce-based implementation of our algorithm, we demonstrated the high performance of our large-scale spatial reasoner.
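
As an editor's illustration of the map/reduce decomposition the abstract describes, here is a minimal Python sketch of one path-consistency step: the map phase keys relation triples by the entity two relations share, and the reduce phase composes each meeting pair via a composition table. The tiny COMPOSE table and the triple format are invented for the example; the paper's reasoner uses the full CSD-9 and RCC-8 composition tables.

```python
from collections import defaultdict

# Toy composition table for a few RCC-8 relations (illustrative only;
# the paper's reasoner uses full RCC-8 and CSD-9 tables).
COMPOSE = {
    ("NTPP", "NTPP"): "NTPP",   # inside of inside is inside
    ("TPP",  "NTPP"): "NTPP",
    ("EQ",   "NTPP"): "NTPP",
}

def map_phase(triples):
    """Key each (a, rel, b) triple by its endpoints so that pairs
    sharing a middle entity land in the same partition."""
    partitions = defaultdict(list)
    for a, rel, b in triples:
        partitions[b].append((a, rel, b))      # triples ending at b
    for a, rel, b in triples:
        partitions[a].append((a, rel, b))      # triples starting at a
    return partitions

def reduce_phase(partitions):
    """Compose pairs (a r1 m) and (m r2 c) that meet at the partition key."""
    inferred = set()
    for m, triples in partitions.items():
        incoming = [(a, r) for a, r, b in triples if b == m]
        outgoing = [(r, c) for a, r, c in triples if a == m]
        for a, r1 in incoming:
            for r2, c in outgoing:
                new = COMPOSE.get((r1, r2))
                if new and a != c:
                    inferred.add((a, new, c))
    return inferred

kb = [("seoul", "NTPP", "korea"), ("korea", "NTPP", "asia")]
print(reduce_phase(map_phase(kb)))  # {('seoul', 'NTPP', 'asia')}
```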

Reliability and Validity on Measurement Instrument for Health Status Assessment in Occupational Workers (직장인들의 건강수준 평가를 위한 측정도구의 신뢰도와 타당도 분석)

  • Koh, Sang-Baek;Chang, Sei-Jin;Kang, Myung-Guen;Cha, Bong-Suk;Park, Jong-Ku
    • Journal of Preventive Medicine and Public Health / v.30 no.2 s.57 / pp.251-266 / 1997
  • In order to test scaling assumptions and to assess the validity, reliability, and acceptability of the Short Form 36 (SF-36) health survey questionnaire, we conducted a survey. The sample comprised 296 workers employed in small-sized companies. All scales passed the tests for item internal consistency (100% success rate) and item discriminant validity (100% success rate). Reliability coefficients ranged from a low of 0.51 to a high of 0.85. For 87.5% of the workers, no inconsistent responses were observed; only 3.0% failed two or more checks. Factor analysis was performed using the principal axis factor method with quartimax rotation. In this survey, the SF-36 retained acceptable psychometric properties even when used in a generally healthy worker group. However, further work on developing health status measurements is needed: first, the definition of health status should be rationalized. Second, the measurement of outcomes is an important consideration in evaluations of quality of care, but ambiguities hinder understanding of this topic. Third, internal consistency should be interpreted with caution as an indication of reliability, because it ignores potentially important sources of variation that can occur over time.
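
The internal-consistency figures quoted above are conventionally summarized with Cronbach's alpha. A minimal sketch of that computation, using made-up item scores rather than the study's data:

```python
def cronbach_alpha(items):
    """items: one inner list of scores per item, aligned by respondent.
    Returns Cronbach's alpha."""
    k = len(items)
    n = len(items[0])
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = sum(var(it) for it in items)
    totals = [sum(it[i] for it in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_vars / var(totals))

# Hypothetical scores for a 3-item scale answered by 5 workers.
scale = [[3, 4, 5, 2, 4],
         [3, 5, 4, 2, 5],
         [4, 4, 5, 3, 4]]
print(round(cronbach_alpha(scale), 2))  # 0.87
```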

A Study on the Duplicate Records Detection in the Serials Union Catalog (연속간행물 종합목록의 중복레코드 최소화 방안 연구)

  • Lee, Hye-jin;Choi, Ho-nam;Kim, Wan-jong;Kim, Soon-young
    • Proceedings of the Korea Contents Association Conference / 2007.11a / pp.445-448 / 2007
  • A serials union catalog is an essential bibliographic control tool for integrating and sharing the serials information scattered across domestic libraries. It provides users with reliable information about serials by creating optimized catalog and holdings records. Consistency of the bibliographic records is essential, and the record duplication ratio is an important criterion in database quality assessment. This paper examines the bibliographic data elements and proposes a duplicate detection process that improves union catalog quality by minimizing duplicate records.
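
The abstract does not list the exact data elements used for matching; the following sketch assumes ISSN plus a normalized title as the comparison key, purely to show the shape of a duplicate detection pass:

```python
import re
from collections import defaultdict

def normalize(title):
    """Lowercase and strip punctuation/whitespace so trivially
    different renderings of the same title compare equal."""
    return re.sub(r"[^a-z0-9가-힣]", "", title.lower())

def find_duplicates(records):
    """Group catalog records by (ISSN, normalized title); any group
    with more than one record is a duplicate candidate."""
    groups = defaultdict(list)
    for rec in records:
        key = (rec.get("issn", ""), normalize(rec["title"]))
        groups[key].append(rec["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

catalog = [
    {"id": "R1", "issn": "1225-0015", "title": "Journal of Korean Data"},
    {"id": "R2", "issn": "1225-0015", "title": "Journal of Korean Data."},
    {"id": "R3", "issn": "1738-9984", "title": "Software Engineering Review"},
]
print(find_duplicates(catalog))  # [['R1', 'R2']]
```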

QUALITY ASSURANCE IMPLEMENTATION IN THE NATIONAL CANCER CENTRE

  • Jui, Wong-Toh
    • Proceedings of the Korean Society of Medical Physics Conference / 2002.09a / pp.19-22 / 2002
  • The importance of accurate dose delivery in radiotherapy is well documented. Studies have shown that a mere 5% deviation from the prescribed dose can produce an undesirable treatment outcome. Uncertainties in dose delivery can arise at different stages of the radiotherapy process; therefore, a good quality assurance programme ensures the best possible results and consistency of radiotherapeutic treatment. Quality assurance in any radiotherapy department is the responsibility of a multi-disciplinary team of radiation oncologists, medical physicists, and radiation technologists. This paper focuses on the physical and technical aspects of QA. The organizational structure and responsibilities of the physics QA team are outlined, along with the types and frequencies of QA checks. For a QA programme to be effective, action levels should be clearly defined and understood by all staff concerned. Data from the Singapore National Cancer Centre's participation over the last ten years in the IAEA/WHO Postal TLD Dose Inter-comparison programme are presented; the results obtained were within the international criteria. For a QA programme to be successfully implemented, management must commit to providing adequate staff, test equipment, machine time, and continual training and education, in addition to the positive attitude of all staff. A quality audit is also necessary, serving as a check and balance to ensure that the QA programme is in order.
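
A tolerance test against a defined action level, as the abstract recommends, can be as simple as the following sketch; the 5% threshold echoes the deviation figure cited above, and all numbers are illustrative:

```python
def check_dose(prescribed_gy, measured_gy, action_level=0.05):
    """Flag dose deliveries whose relative deviation exceeds the
    action level (5% here, following the figure in the abstract)."""
    deviation = abs(measured_gy - prescribed_gy) / prescribed_gy
    status = "OK" if deviation <= action_level else "INVESTIGATE"
    return deviation, status

dev, status = check_dose(prescribed_gy=2.00, measured_gy=2.12)
print(f"deviation = {dev:.1%} -> {status}")  # deviation = 6.0% -> INVESTIGATE
```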

Classifying Rules by In-out Traffic Direction to Avoid Security Policy Anomaly

  • Kim, Sung-Hyun;Lee, Hee-Jo
    • KSII Transactions on Internet and Information Systems (TIIS) / v.4 no.4 / pp.671-690 / 2010
  • The continuous growth of attacks on the Internet leads to the generation of large numbers of rules in security devices such as Intrusion Prevention Systems and firewalls. Policy anomalies in security devices create security holes and prevent the system from quickly determining whether to allow or deny a packet. Policy anomalies exist among the rules in multiple security devices as well as within a single security device, and resolving them requires complex and complicated algorithms. In this paper, we propose a new method to remove policy anomalies in a single security device and to avoid policy anomalies among the rules of distributed security devices. The proposed method classifies rules according to traffic direction and checks for policy anomalies in each device. Since it is unnecessary to compare the rules for outgoing traffic with the rules for incoming traffic, classifying rules by in-out traffic direction can reduce the number of rules to be compared by up to half. Instead of detecting policy anomalies across distributed security devices, each device adopts the rules of others to avoid anomalies: after policy anomalies are removed in each device, other firewalls can keep the policy consistent and anomaly-free by adopting the rules of a trusted firewall. In addition, the method blocks unnecessary traffic, because a source side sends only as much traffic as the destination side accepts. We also describe another policy anomaly that can arise under a connection-oriented communication protocol.
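
A minimal sketch of the core idea, direction-based classification followed by a per-direction shadowing check, with a toy rule format invented for the example (real rules carry address ranges, ports, and protocols):

```python
from collections import defaultdict

# A rule is (direction, src, dst, action); the networks are kept as
# plain strings here, standing in for real address ranges.
rules = [
    ("in",  "any",        "10.0.0.0/8", "deny"),
    ("in",  "any",        "10.0.0.0/8", "allow"),   # shadowed by the rule above
    ("out", "10.0.0.0/8", "any",        "allow"),
]

def classify(rules):
    """Split the rule set by traffic direction so anomaly checks never
    compare an inbound rule with an outbound one."""
    by_dir = defaultdict(list)
    for r in rules:
        by_dir[r[0]].append(r)
    return by_dir

def shadowing_anomalies(rule_list):
    """Report a later rule matched on the same fields as an earlier one
    but with the opposite action (a simplified, exact-match test)."""
    anomalies = []
    for i, hi in enumerate(rule_list):
        for lo in rule_list[i + 1:]:
            if hi[1:3] == lo[1:3] and hi[3] != lo[3]:
                anomalies.append((hi, lo))
    return anomalies

for direction, subset in classify(rules).items():
    print(direction, shadowing_anomalies(subset))
```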

Quality Control of Observed Temperature Time Series from the Korea Ocean Research Stations: Preliminary Application of Ocean Observation Initiative's Approach and Its Limitation (해양과학기지 시계열 관측 자료 품질관리 시스템 구축: 국제 관측자료 품질관리 방안 수온 관측 자료 시범적용과 문제점)

  • Min, Yongchim;Jeong, Jin-Yong;Jang, Chan Joo;Lee, Jaeik;Jeong, Jongmin;Min, In-Ki;Shim, Jae-Seol;Kim, Yong Sun
    • Ocean and Polar Research / v.42 no.3 / pp.195-210 / 2020
  • The observed time series from the Korea Ocean Research Stations (KORS) in the Yellow and East China Seas (YECS) contain various sources of noise, including bio-fouling of the underwater sensors, intermittent depletion of power, cable leakage, and interference between the sensors' signals. Besides these technical issues, intricate waves associated with background tidal currents tend to produce substantial oscillations in the oceanic time series. Such technical and environmental issues call for a regionally optimized automatic quality control (QC) procedure. As a step toward this ultimate goal, we examined the standard QC approach of the Ocean Observatories Initiative (OOI) to investigate whether this procedure is pertinent to the KORS. The OOI QC consists of three categories of tests: global/local data range, temporal variation (spike and gradient), and sensor-related issues (stuck and drift). These OOI QC algorithms were applied to the water temperature time series from the Ieodo station, one of the KORS. Obvious outliers were flagged successfully by the global/local range checks and the spike check. The stuck and drift checks barely detected any sensor-related errors, owing to frequent sensor cleaning and maintenance. The gradient check, however, failed to flag the remaining outliers, which tend to cluster closely together, and often marked probably good data as bad, especially data characterized by considerable fluctuations near the thermocline. These results suggest that the gradient check may not be appropriate for observations involving considerable natural fluctuations as well as technical issues. Our study highlights the need for a new algorithm, such as a standard-deviation-based outlier check using multiple moving windows, to replace the gradient check, and for an additional inter-consistency check against a related variable, in order to build a standard QC procedure for the KORS.
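
A minimal sketch of the kind of standard-deviation-based moving-window outlier check the authors propose as a replacement for the gradient test; the window size, threshold, and data below are illustrative, not the paper's:

```python
import statistics

def moving_std_outliers(series, window=5, n_sigma=3.0):
    """Flag points that deviate from the mean of a centered moving
    window by more than n_sigma local standard deviations."""
    half = window // 2
    flags = []
    for i, x in enumerate(series):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        neighbors = series[lo:i] + series[i + 1:hi]   # exclude x itself
        if len(neighbors) < 2:
            flags.append(False)
            continue
        mu = statistics.mean(neighbors)
        sd = statistics.stdev(neighbors)
        flags.append(sd > 0 and abs(x - mu) > n_sigma * sd)
    return flags

temps = [14.1, 14.2, 14.1, 19.8, 14.3, 14.2, 14.4]  # one obvious spike
print([t for t, bad in zip(temps, moving_std_outliers(temps)) if bad])  # [19.8]
```

The paper suggests multiple moving windows; running this check with several window sizes and flagging points caught by all of them is one natural extension of the sketch.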

Asynchronous Cache Invalidation Strategy to Support Read-Only Transaction in Mobile Environments (이동 컴퓨팅 환경에서 읽기-전용 트랜잭션을 지원하기 위한 비동기적 캐쉬 무효화 기법)

  • Kim, Il-Do;Nam, Sung-Hun
    • The KIPS Transactions:PartC / v.10C no.3 / pp.325-334 / 2003
  • With a stateless server, if an asynchronous cache invalidation scheme attempts to support local processing of read-only transactions in mobile client/server database systems, a critical problem may occur: the asynchronous invalidation reports provide no guarantee on the waiting time for mobile transactions requesting commit. To solve this problem, the server in our algorithm broadcasts two kinds of messages: asynchronous invalidation reports, to reduce transaction latency, and periodic guide messages, to avoid uncertainty in the waiting time for the next invalidation report. Each asynchronous invalidation report carries its own sequence number, and each periodic guide message carries the sequence number of the most recently broadcast asynchronous invalidation report. A mobile client checks its cache validity by comparing the sequence numbers of these messages.
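
A minimal sketch of the client-side validity check the abstract describes, with invented message-handling method names; the paper's full algorithm also covers transaction commit handling, which is omitted here:

```python
class MobileClientCache:
    """Client-side view of the scheme in the abstract: invalidation
    reports carry their own sequence numbers, and periodic guide
    messages carry the number of the last report broadcast."""

    def __init__(self):
        self.last_seq_seen = 0   # newest invalidation report applied
        self.cache = {}          # item -> value

    def on_invalidation_report(self, seq, invalidated_items):
        self.cache = {k: v for k, v in self.cache.items()
                      if k not in invalidated_items}
        self.last_seq_seen = seq

    def on_guide_message(self, latest_report_seq):
        """If a report was missed, this sketch conservatively drops the
        whole cache rather than risk stale reads (the paper's algorithm
        is more precise)."""
        if latest_report_seq > self.last_seq_seen:
            self.cache.clear()
            self.last_seq_seen = latest_report_seq

client = MobileClientCache()
client.cache = {"x": 1, "y": 2}
client.on_invalidation_report(seq=7, invalidated_items={"x"})
client.on_guide_message(latest_report_seq=8)   # report 8 was missed
print(client.cache)  # {} -- cache dropped, client must refetch
```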

Model Checking of Concurrent Object-Oriented Systems (병렬 객체지향 시스템의 검증)

  • Cho, Seung-Mo;Kim, Young-Gon;Bae, Doo-Hwan;Byun, Sung-Won;Kim, Sang-Taek
    • Journal of KIISE:Software and Applications / v.27 no.1 / pp.1-12 / 2000
  • Model checking is a formal verification technique that checks the consistency between a requirement specification and a behavioral model of a system by exploring the state space of the model. We apply model checking to the formal verification of concurrent object-oriented systems, using the existing model checker SPIN, which has been successful in verifying concurrent systems. First, we propose an Actor-based modeling language, called APromela, by extending Promela, the modeling language supported by SPIN. APromela supports not only all the primitives of Promela but also the additional primitives needed to model concurrent object-oriented systems, such as class definition, object instantiation, message send, and synchronization. Second, we provide translation rules for mapping these APromela modeling primitives to Promela's. As an application of APromela, we suggest a verification method for UML models. By giving an example of specification, translation, and verification, we demonstrate the applicability of the proposed approach and discuss its limitations and further research issues.
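
The paper's translation rules are formal; purely as an illustration of the table-driven shape such a mapping can take, here is a toy lookup in Python. The primitive names and Promela targets below are invented for the example, not taken from the paper:

```python
# Invented mapping; the abstract only says APromela adds class
# definition, object instantiation, message send, and synchronization.
PRIMITIVE_MAP = {
    "class":       "proctype",    # class definition -> process type
    "new":         "run",         # object instantiation -> run statement
    "send":        "chan ! msg",  # message send -> channel send
    "synchronize": "chan ? ack",  # synchronization -> blocking receive
}

def translate(primitive):
    """Look up the Promela construct an APromela primitive maps to."""
    return PRIMITIVE_MAP.get(primitive, primitive)

for p in ["class", "new", "send", "synchronize"]:
    print(f"{p:>11} -> {translate(p)}")
```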

Early overcounting in otoliths: a case study of age and growth for gindai (Pristipomoides zonatus) using bomb 14C dating

  • Andrews, Allen H;Scofield, Taylor R.
    • Fisheries and Aquatic Sciences / v.24 no.1 / pp.53-62 / 2021
  • Gindai (Pristipomoides zonatus) is one of six snappers in a management complex called the Deep 7 of the Hawaiian Islands. Little is known about its life history, and a preliminary analysis of otolith thin sections indicated the species may exhibit moderate growth with a lifespan approaching 40 years. Preliminary age estimates from the previous study were reinvestigated using the same otolith sections in an attempt to validate those ages with bomb radiocarbon (14C) dating. From the misalignment of birth years for the otolith 14C measurements with regional references - the post-peak bomb 14C decline period - it was concluded that previous ages were inflated by overcounting of the earliest growth zone structure in otolith sections. The oldest gindai was re-aged to 26 years once the age reading was adjusted for early overcounting, 13 years younger than the original estimate of 39 years for this fish. In general, the earliest otolith growth of gindai was massive and complicated by numerous subannual checks. The approach of lumping the early growth structures was supported by the alignment of 14C measurements from otolith core material (first year of growth). The result was greater consistency of calculated birthdates with the 14C decline reference, along with minor offsets that may indicate age estimation was imprecise by a few years for some individuals. The revised von Bertalanffy growth function applied to the validated age-at-length estimates revealed more rapid growth (k = 0.378 cf. 0.113) and a lifespan of approximately 30 years. The findings presented here are a case study of how the bomb 14C decline period can be used as a tool in the refinement of age reading protocols.
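
The reported k values plug directly into the von Bertalanffy growth function L(t) = L∞(1 − e^(−k(t − t₀))). A small sketch comparing the original and revised curves; the asymptotic length L∞ and t₀ are assumed here, since the abstract reports only k:

```python
import math

def von_bertalanffy(t, l_inf, k, t0=0.0):
    """Length at age t under the von Bertalanffy growth function."""
    return l_inf * (1.0 - math.exp(-k * (t - t0)))

L_INF = 45.0  # assumed asymptotic fork length (cm); not from the abstract
for age in (2, 5, 10, 20):
    original = von_bertalanffy(age, L_INF, k=0.113)  # overcounted ages
    revised = von_bertalanffy(age, L_INF, k=0.378)   # validated ages
    print(f"age {age:2d}: original {original:5.1f} cm, revised {revised:5.1f} cm")
```

The higher revised k means the fish approaches its asymptotic size much earlier, which is exactly the "more rapid growth" the re-aging revealed.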

Design and Implementation of a Spatial-Operation-Trigger for Supporting the Integrity of Meet-Spatial-Objects (상접한 공간 객체의 무결성 지원을 위한 공간 연산 트리거의 설계 및 구현)

  • Ahn, Jun-Soon;Cho, Sook-Kyoung;Chung, Bo-Hung;Lee, Jae-Dong;Bae, Hae-Young
    • Journal of KIISE:Computing Practices and Letters / v.8 no.2 / pp.127-140 / 2002
  • In a spatial database system, semantic integrity should be supported to maintain data consistency. In the real world, spatial objects in a boundary layer must always meet their neighboring objects, and they cannot hold the same name; this characteristic is a concept implied by the real world. When it is violated by update operations on spatial objects, the integrity of the layer must be maintained. In this thesis, we propose a spatial-operation-trigger for supporting the integrity of spatial objects. The proposed trigger is defined on the basis of SQL-3 and executed when the constraint condition is violated. A spatial-operation-trigger has an execution strategy: first, for a given layer, it checks whether the executed operation updates only spatial data, only aspatial data, or both, and executes that layer's spatial and aspatial data triggers accordingly; finally, the aspatial data trigger for the other layers is executed. A spatial-operation-trigger is thus executed in three steps to preserve the semantic integrity of the meet property of spatial objects, and it provides both semantic integrity and convenience for users through its automatic correcting operation.
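
As an illustration of the meet-integrity test such a trigger enforces (the paper defines the trigger in SQL-3; axis-aligned rectangles stand in for real spatial objects here), a pure-Python sketch:

```python
def overlaps(a, b):
    """True if rectangle interiors intersect. Rect = (x1, y1, x2, y2)."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def meets(a, b):
    """True if rectangles touch on a boundary without interior overlap."""
    touch_x = (a[2] == b[0] or b[2] == a[0]) and a[1] < b[3] and b[1] < a[3]
    touch_y = (a[3] == b[1] or b[3] == a[1]) and a[0] < b[2] and b[0] < a[2]
    return (touch_x or touch_y) and not overlaps(a, b)

def check_update(updated, neighbors):
    """Reject an update that breaks the meet property with any
    previously adjacent object -- the role the trigger plays on UPDATE."""
    broken = [n for n in neighbors if not meets(updated, n)]
    return ("rejected", broken) if broken else ("accepted", [])

parcel = (0, 0, 2, 2)
neighbor = (2, 0, 4, 2)        # meets parcel along the line x = 2
print(check_update(parcel, [neighbor]))          # ('accepted', [])
print(check_update((0, 0, 1.5, 2), [neighbor]))  # ('rejected', [(2, 0, 4, 2)])
```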