• Title/Summary/Keyword: Consistency check

Development of Section Load Estimation Program for Smart Distribution Management System (스마트배전 운영시스템용 구간부하 추정 프로그램 개발)

  • Yun, Sang-Yun;Chu, Chul-Min;Kwan, Seung-Chul;Song, Il-Keun;Lim, Sung-Il
    • The Transactions of The Korean Institute of Electrical Engineers / v.61 no.8 / pp.1083-1090 / 2012
  • In this paper, we present a section load estimation program for a smart distribution management system. The proposed program is composed of three parts. The first is the consistency check part for the switch measurements that constitute a section; the consistency check is divided into a current test and an angle test. For the current test, we examine the input and output power flow of the switch group. For the angle test, the result of the power flow calculation at the previous step is used. The second is the voltage estimation part for the measured switches, for which we use the weighted least squares (WLS) method. The third is the final section load calculation part. The database structure needed to implement the estimation program is also proposed. To verify its accuracy, case studies are performed using actual data from Jeju Island. The developed program can be effectively applied to distribution operation systems.
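
A minimal sketch of the WLS voltage-estimation step described in this abstract, assuming a toy measurement model z = Hx + e with a diagonal weight matrix; the measurement matrix, weights, and values are hypothetical and do not reproduce the paper's formulation or database structure.

```python
import numpy as np

# Hypothetical example: estimate two section voltages from three redundant
# switch measurements, z = H @ x + noise, via weighted least squares:
# x_hat = (H^T W H)^{-1} H^T W z.
H = np.array([[1.0, 0.0],    # switch 1 measures voltage at node 1
              [0.0, 1.0],    # switch 2 measures voltage at node 2
              [1.0, -1.0]])  # switch 3 measures the voltage difference
z = np.array([0.98, 0.96, 0.025])                     # per-unit measurements (made up)
W = np.diag([1 / 0.01**2, 1 / 0.01**2, 1 / 0.02**2])  # inverse measurement variances

x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
residuals = z - H @ x_hat            # large residuals would flag inconsistent measurements
print("estimated voltages:", x_hat)
print("measurement residuals:", residuals)
```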

A study on the Development of BIM-based Quality Pre-checking System in Architecture Design Phase

  • Shin, Jihye;Choi, Jungsik;Kim, Inhan
    • International conference on construction engineering and project management / 2015.10a / pp.284-288 / 2015
  • Recently, mandates on utilizing BIM implemented by public institutions in many countries have significantly increased BIM practice. The improvement in work efficiency and productivity brought by BIM adoption depends on the consistency and accuracy of the data. To maximize the benefit of BIM, interest in BIM data quality has been growing all over the world. The BIM data quality pre-check, which is conducted by the designer during the design phase, offers opportunities for quality improvement by continuously assessing BIM data. However, pre-checks are currently carried out under users' arbitrary interpretations because specific review factors and assessment methods for checking BIM quality are absent. The purpose of this study is to establish an automated BIM quality pre-checking system that improves BIM design quality effectively and efficiently. The system is expected to meet the owner's requirements and to minimize the additional cost and time incurred in revising and reproducing data by ensuring its consistency and accuracy.
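
A minimal sketch of the kind of automated pre-check rule such a system might run, assuming a simplified in-memory element model; the element fields and the rule itself are hypothetical illustrations, not the paper's checking system or a real BIM/IFC API.

```python
# Hypothetical pre-check: every wall element must carry a fire-rating property
# and a non-empty material before the model is handed over.
elements = [
    {"id": "W-001", "type": "Wall", "material": "Concrete", "fire_rating": "2H"},
    {"id": "W-002", "type": "Wall", "material": "", "fire_rating": None},
    {"id": "D-101", "type": "Door", "material": "Steel"},
]

def precheck_walls(elements):
    issues = []
    for e in elements:
        if e["type"] != "Wall":
            continue
        if not e.get("material"):
            issues.append((e["id"], "missing material"))
        if not e.get("fire_rating"):
            issues.append((e["id"], "missing fire rating"))
    return issues

for element_id, problem in precheck_walls(elements):
    print(f"{element_id}: {problem}")
```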

The Model of Appraisal Method on Authentic Records (전자기록의 진본 평가 시스템 모형 연구)

  • Kim, Ik-Han
    • The Korean Journal of Archival Studies / no.14 / pp.91-117 / 2006
  • Electronic records need to be appraised for their authenticity as well as for their value. There has been much discussion of how records should be appraised for value, but little about how electronic records should be appraised for authenticity. This article therefore models specific authenticity appraisal methods and shows the stages at which each method should or may be applied. At the Ingest stage, the essential checks are integrity verification right after record creation in the producing organization, quality and integrity verification of the transferred records in the receiving organization, and an integrity check between the SIP and the AIP in the organization that receives and preserves the records. At the Preservation stage, integrity checks between identical AIPs stored separately on different media, validation of whether records are damaged, and recovery of damaged records are needed. At the various Processing stages, suitability evaluation after changes to a record's management control metadata or classification, integrity checks after records migration, and periodic validation and integrity verification of DIPs are required. For these activities, appraisal methods including integrity verification, content consistency checks, suitability evaluation of record metadata, checks for unauthorized updates, and physical status validation should be applied throughout the electronic records management process.
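
A minimal sketch of one integrity check mentioned in this abstract (comparing a SIP against its AIP), assuming fixity is recorded as SHA-256 digests; the directory names are hypothetical and the paper does not prescribe a particular algorithm.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def integrity_check(sip_dir: str, aip_dir: str) -> list[str]:
    """Compare digests of files with the same relative name in the SIP and the AIP."""
    mismatches = []
    for sip_file in Path(sip_dir).rglob("*"):
        if not sip_file.is_file():
            continue
        aip_file = Path(aip_dir) / sip_file.relative_to(sip_dir)
        if not aip_file.exists() or sha256_of(sip_file) != sha256_of(aip_file):
            mismatches.append(str(sip_file.relative_to(sip_dir)))
    return mismatches

# Hypothetical usage: print(integrity_check("submission_sip", "archival_aip"))
```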

An Analysis of the Fake News Assessment Criteria on Fact-check Coverage (팩트체크 보도의 가짜뉴스 판단 기준 검토)

  • Baek, Kanghui
    • The Journal of the Korea Contents Association / v.20 no.2 / pp.172-181 / 2020
  • This study examines the fact-check coverage provided by the SNU fact-check center site (factcheck.snu.ac.kr). A total of 50 articles that were cross-checked by multiple news media organizations were analyzed. The study's variables were topics, types, characteristics, consistency of the news media organizations' judgements, and fact-check sources. This study found that fact-checking coverage generally focused on presidential or general election candidates or politicians, as well as political topics. The types of fact-checking coverage primarily included factual information, as well as some opinions or interpretations. Fact-check coverage was mainly focused on the facts of the statements themselves, causal relationships, or the timing or target of the comparison criteria. On average, the fact-checking coverage most frequently assigned the judgement 'mostly false', and primarily used interviews of individuals or data from organizations involved in the issue, government data, and experts' statements as the bases for its fact-checking judgements.

Concurrency Control based on Serialization Graph for Query Transactions in Broadcast Environment : CCSG/QT (방송환경에서 질의 거래를 위해 직렬화 그래프에 기반을 둔 동시성 제어 기법)

  • 이욱현;황부현
    • Journal of KIISE:Databases / v.30 no.1 / pp.95-107 / 2003
  • The broadcast environment has an asymmetric communication aspect: the communication bandwidth available from the server to clients is typically much greater than in the opposite direction. In addition, most mobile computing systems issue mostly read-only transactions from mobile clients for retrieving different types of information such as stock data, traffic information, and news updates. Since previous concurrency control protocols do not consider these particular characteristics, performance degradation occurs when they are applied to the broadcast environment. In this paper, we propose an efficient concurrency control scheme for query transactions in the broadcast environment. The following requirements are satisfied by adopting weak consistency, the appropriate correctness criterion for read-only transactions: (1) the mutual consistency of data maintained by the server and read by clients, and (2) the currency of data read by clients. We also use a serialization graph scheme to check weak consistency efficiently. As a result, performance is improved by reducing the unnecessary aborts and restarts of read-only transactions that occur when global serializability is adopted.
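
A minimal sketch of the serialization-graph test underlying this scheme: build a graph of precedence edges between conflicting transactions and reject a schedule whose graph contains a cycle. The transactions and conflicts below are hypothetical, and the paper's weak-consistency refinement is not reproduced.

```python
# Hypothetical serialization graph: edge (Ti, Tj) means Ti must precede Tj
# because of a conflicting operation. A cycle means the schedule is not
# serializable, so the offending read-only transaction would be restarted.
graph = {
    "T1": ["T2"],
    "T2": ["T3"],
    "T3": ["T1"],   # closes a cycle T1 -> T2 -> T3 -> T1
}

def has_cycle(graph):
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {node: WHITE for node in graph}

    def visit(node):
        color[node] = GRAY
        for nxt in graph.get(node, []):
            if color.get(nxt, WHITE) == GRAY:
                return True                 # back edge: cycle found
            if color.get(nxt, WHITE) == WHITE and visit(nxt):
                return True
        color[node] = BLACK
        return False

    return any(color[n] == WHITE and visit(n) for n in list(graph))

print("serializable" if not has_cycle(graph) else "cycle detected: abort/restart")
```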

An Analysis of Current Science Instruction Consistency by Micro Instructional Design Theory (미시적 교수설계이론에 의한 현행 과학교수의 일관성 분석 - 과학 I (하) 'V.l.태양계' 단원을 중심으로 -)

  • Paik, Seoung-Hey;Kim, Seung-Hwa;Hong, Sung-Il;Yang, II-Ho;Lee, Jae-Cheon
    • Journal of The Korean Association For Science Education / v.13 no.3 / pp.366-376 / 1993
  • In this study, a part of the high school science instructional materials is evaluated with the Instructional Quality Profile (IQP), based on Merrill's Component Display Theory (CDT). The CDT rests on Gagne's assumption of different conditions of learning for different outcomes. The IQP enables the user to check both the consistency and the adequacy of existing cognitive instruction; it can also be used to predict student performance and to design and develop new instructional materials. The instructional components are classified into five task levels: Use-Generalities on Newly Encountered Examples (UGeg), Remember-Paraphrased-Generalities (RpG), Remember-Verbatim-Generalities (RvG), Remember-Paraphrased-Examples (Rpeg), and Remember-Verbatim-Examples (Rveg). The analysis is composed of three parts: justifying the task level of the objectives, objective-test consistency, and test-presentation consistency. The objectives, presentations, and tests given in a teacher's guide and a textbook are analyzed. The results show that the task levels and content levels of the objectives are not consistent with those of the tests, and the test-presentation consistency indices indicate presentation problems in the instructional materials.
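
A minimal sketch of the objective-test consistency idea, assuming objectives and their matching test items are tagged with IQP-style task levels; the tags and items are hypothetical, not the textbook unit analyzed in the paper.

```python
# Hypothetical tagging of objectives and their matching test items with
# IQP-style task levels; a mismatch signals an objective-test inconsistency.
objectives = {"obj1": "UGeg", "obj2": "RpG", "obj3": "RvG"}
test_items = {"obj1": "UGeg", "obj2": "Rveg", "obj3": "RvG"}

inconsistent = [
    obj for obj, level in objectives.items()
    if test_items.get(obj) != level
]
print("objective-test inconsistencies:", inconsistent)  # ['obj2']
```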

Optimistic Concurrency Control based on TimeStamp Intervals for Broadcast Environment: OCC/TI (방송환경에서 타임스탬프 구간에 기반을 둔 낙관적 동시성 제어 기법)

  • 이욱현;황부현
    • Journal of KIISE:Databases / v.29 no.6 / pp.477-491 / 2002
  • The broadcast environment has an asymmetric communication aspect: the communication bandwidth available from the server to clients is typically much greater than in the opposite direction. In addition, mobile computing systems generate mostly read-only transactions from mobile clients for retrieving different types of information such as stock data, traffic information, and news updates. Since previous concurrency control protocols do not consider these particular characteristics, performance degradation occurs when they are applied to the broadcast environment. In this paper, we propose an optimistic concurrency control scheme based on timestamp intervals for the broadcast environment. The following requirements are satisfied by adopting weak consistency, the appropriate correctness criterion for read-only transactions: (1) the mutual consistency of data maintained by the server and read by clients, and (2) the currency of data read by clients. We also adopt the timestamp interval protocol to check weak consistency efficiently. As a result, performance is improved by reducing the unnecessary aborts and restarts of read-only transactions that occur when global serializability is adopted.
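
A minimal sketch of timestamp-interval validation in the spirit of this scheme: each read shrinks the read-only transaction's allowed interval, and an empty interval forces a restart. The item timestamps and the API are simplified assumptions, not the paper's protocol.

```python
# Hypothetical timestamp-interval check: a read-only transaction keeps an
# interval (low, high); reading a value written at wts_current and overwritten
# at wts_next shrinks the interval. An empty interval means the reads are not
# consistent with any single serialization point, so the transaction restarts.
class ReadOnlyTxn:
    def __init__(self):
        self.low, self.high = 0.0, float("inf")

    def read(self, wts_current, wts_next):
        self.low = max(self.low, wts_current)
        self.high = min(self.high, wts_next)
        if self.low >= self.high:
            raise RuntimeError("empty timestamp interval: restart transaction")

txn = ReadOnlyTxn()
txn.read(wts_current=10.0, wts_next=float("inf"))  # item A, still current
txn.read(wts_current=4.0, wts_next=12.0)           # item B, overwritten at 12
print("interval:", (txn.low, txn.high))            # (10.0, 12.0) -> still consistent
```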

Four Consistency Levels in Trigger Processing (트리거 처리 4 단계 일관성 레벨)

  • ;Eric Hanson
    • Journal of KIISE:Databases / v.29 no.6 / pp.492-501 / 2002
  • An asynchronous trigger processor (ATP) is a software system that processes triggers after update transactions to databases are complete. In an ATP, discrimination networks are used to check trigger conditions efficiently; they store their internal states in memory nodes. TriggerMan is an ATP and uses the Gator network as its discrimination network. Changes in the databases are delivered to TriggerMan in the form of tokens. Processing tokens against a Gator network updates the memory nodes of the network and checks the condition of the trigger for which the network is built. Parallel token processing is one way to improve system performance; however, uncontrolled parallel processing breaks the semantic consistency of trigger processing. In this paper, we propose four trigger processing consistency levels that allow parallel token processing with minimal anomalies. For each consistency level, a parallel token processing technique is developed. The techniques are proven to be valid and are also applicable to materialized view maintenance.
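
A minimal sketch of token processing against a single memory node, assuming insert/delete tokens and a one-condition trigger; the Gator network structure and TriggerMan's actual semantics are not reproduced, and all names are hypothetical.

```python
# Hypothetical one-node discrimination "network": the memory node caches rows
# that satisfy the trigger condition, and each token updates it incrementally.
memory_node = set()

def trigger_condition(row):
    # Toy condition: fire when a price update exceeds 100.
    return row[1] > 100

def process_token(op, row):
    """Apply an ('insert' | 'delete', row) token and report trigger firings."""
    if op == "insert" and trigger_condition(row):
        memory_node.add(row)
        print("trigger fired for", row)
    elif op == "delete":
        memory_node.discard(row)

# Tokens describing database changes, processed in arrival order. Processing
# them serially is what the paper's consistency levels relax in a controlled way.
for token in [("insert", ("IBM", 120)), ("insert", ("HP", 80)), ("delete", ("IBM", 120))]:
    process_token(*token)
```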

Efficient Schemes for Cache Consistency Maintenance in a Mobile Database System (이동 데이터베이스 시스템에서 효율적인 캐쉬 일관성 유지 기법)

  • Lim, Sang-Min;Kang, Hyun-Chul
    • The KIPS Transactions:PartD / v.8D no.3 / pp.221-232 / 2001
  • Due to the rapid advance of wireless communication technology, demand for data services in mobile environments is gradually increasing. Caching at a mobile client can reduce bandwidth consumption and query response time, yet the mobile client must maintain cache consistency. It can be efficient for the server to broadcast a periodic cache invalidation report for cache consistency in a cell. When a long period of disconnection prevents a mobile client from checking the validity of its cache based solely on the invalidation reports received, the client can request the server to check cache validity. In doing so, some schemes may be more efficient than others depending on the number of available channels and the mobile clients involved. In this paper, we propose new cache consistency schemes that are efficient especially (1) when the channel capacity is enough to deal with the mobile clients involved and (2) when it is not, and we evaluate their performance.
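
A minimal sketch of a client checking its cache against a periodic invalidation report, assuming the report lists (item, update-timestamp) pairs covering a fixed window; a client disconnected longer than the window drops its cache, as the simplest variant of such schemes does. All names and values are hypothetical.

```python
# Hypothetical invalidation-report check on the mobile client.
WINDOW = 60.0  # seconds of history covered by each broadcast report

def apply_report(cache, report, last_sync, report_time):
    """Drop stale items, or the whole cache after a long disconnection."""
    if report_time - last_sync > WINDOW:
        # The report does not cover the disconnection period: the client cannot
        # tell what changed, so it discards everything (or asks the server).
        cache.clear()
        return cache
    for item_id, updated_at in report:
        if updated_at > last_sync:
            cache.pop(item_id, None)
    return cache

cache = {"stock:IBM": 120, "stock:HP": 80}
report = [("stock:IBM", 130.0)]            # IBM updated at t=130
print(apply_report(cache, report, last_sync=100.0, report_time=140.0))
# -> {'stock:HP': 80}; the IBM entry was invalidated
```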

Detecting Errors and Checking Consistency in the Object-Oriented Design Models (객체지향 설계방법에서 오류 검출과 일관성 점검기법 연구)

  • Jeong, Gi-Won;Jo, Yong-Seon;Gwon, Seong-Gu
    • The Transactions of the Korea Information Processing Society / v.6 no.8 / pp.2072-2087 / 1999
  • As software size increases and users' requirements become more and more sophisticated, the importance of software quality is increasingly emphasized. However, present techniques for detecting errors and checking consistency in object-oriented design models are not satisfactory. This paper proposes a systematic approach that produces implementable rules to detect errors and check consistency. First, meta-models for UML diagrams are constructed, generalized meta-rules are derived from the meta-models, and the meta-rules are then applied to produce the implementable rules. This approach makes it possible to pursue completeness of the rules and automation of rule application. An example of rule application shows its feasibility.
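
A minimal sketch of one implementable rule of the kind such an approach could produce, assuming a simplified in-memory model of classes and sequence-diagram messages; the rule and the model are hypothetical illustrations, not the paper's meta-rules.

```python
# Hypothetical consistency rule: every message sent to an object in a sequence
# diagram must correspond to an operation declared in the receiver's class.
classes = {
    "Account": {"deposit", "withdraw"},
    "Customer": {"notify"},
}
messages = [
    ("Customer", "Account", "deposit"),
    ("Customer", "Account", "close"),     # not declared in Account
]

def check_message_operations(classes, messages):
    errors = []
    for sender, receiver, operation in messages:
        if operation not in classes.get(receiver, set()):
            errors.append(f"{receiver} has no operation '{operation}' "
                          f"(called by {sender})")
    return errors

for error in check_message_operations(classes, messages):
    print("inconsistency:", error)
```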
