• Title/Summary/Keyword: Database Quality


A Study on Factors Affecting Web Academic Information Service Quality (학술정보 웹 서비스 만족도 향상을 위한 영향 요인에 관한 연구)

  • Park, Cheon-Woong;Lee, Ki-Dong
    • Journal of Digital Convergence
    • /
    • v.7 no.4
    • /
    • pp.91-99
    • /
    • 2009
  • This research investigates the factors that affect scholarly web database service quality and analyzes how these factors influence satisfaction with web academic information services. As a result, three dimensions were identified: information retrieval, ease of use, and interaction. These days, managing the scholarly web database service poses more critical problems than the information service itself, indicating that the qualitative aspect is becoming more important than the quantitative in managing web academic information services.


A Study on the Quality Assurance of National Basemap Digital Mapping Database (국가기본도 수치지도제작 데이터베이스의 품질 확보에 관한 연구)

  • 이현직;최석근;신동빈;박경열
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.15 no.1
    • /
    • pp.117-129
    • /
    • 1997
  • In recent years, the 1:5,000-scale Digital National Basemap (DNB) has been generated under the National Geographic Information System (NGIS) Project. The DNB database will serve as the backdrop data for thematic maps, underground facility maps, and so on. Because the DNB database will be distributed to the government and private sectors in the near future, it must meet the requirements of basic data, so the establishment of a quality assurance process for the database was greatly needed. In this study, we were mainly concerned with improving the quality of the DNB database in the geometric aspect, as well as with the processing time required by the amount of digital data generated. As a result of this study, a quality assurance process for the DNB database was established and an automatic quality assurance program was developed. The program developed in this study contributes to the quality assurance of the DNB database as well as to its economic aspects.


Hiding Sensitive Frequent Itemsets by a Border-Based Approach

  • Sun, Xingzhi;Yu, Philip S.
    • Journal of Computing Science and Engineering
    • /
    • v.1 no.1
    • /
    • pp.74-94
    • /
    • 2007
  • Nowadays, sharing data among organizations is often required during business collaboration. Data mining technology has enabled efficient extraction of knowledge from large databases; this, however, increases the risk of disclosing sensitive knowledge when the database is released to other parties. To address this privacy issue, one may sanitize the original database so that the sensitive knowledge is hidden. The challenge is to minimize the side effect on the quality of the sanitized database so that non-sensitive knowledge can still be mined. In this paper, we study this problem in the context of hiding sensitive frequent itemsets by judiciously modifying the transactions in the database. Unlike previous work, we consider the quality of the sanitized database, especially the preservation of non-sensitive frequent itemsets. To preserve them, we propose a border-based approach to efficiently evaluate the impact of any modification to the database during the hiding process. The quality of the database can be well maintained by greedily selecting the modifications with minimal side effects. Experimental results are also reported to show the effectiveness of the proposed approach.
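The hiding step this abstract describes can be sketched as a greedy sanitization loop: lower the support of a sensitive itemset by deleting its items from supporting transactions, each time picking the deletion that weakens the fewest non-sensitive frequent itemsets. This is a minimal stand-in for the paper's border-based impact evaluation; the function names and the toy database are illustrative, not taken from the paper.

```python
from itertools import combinations

def support(db, itemset):
    """Number of transactions containing every item of `itemset`."""
    s = frozenset(itemset)
    return sum(1 for t in db if s <= set(t))

def frequent_itemsets(db, min_sup, max_len=3):
    """Brute-force enumeration of frequent itemsets up to `max_len` items."""
    items = sorted({i for t in db for i in t})
    return [frozenset(c)
            for k in range(1, max_len + 1)
            for c in combinations(items, k)
            if support(db, c) >= min_sup]

def hide_sensitive(db, sensitive, min_sup):
    """Greedily delete items until the sensitive itemset is infrequent."""
    db = [set(t) for t in db]
    sens = frozenset(sensitive)
    # Non-sensitive frequent itemsets to preserve (supersets of the
    # sensitive set are excluded: they disappear along with it anyway).
    keep = [f for f in frequent_itemsets(db, min_sup) if not f >= sens]
    while support(db, sens) >= min_sup:
        best = None  # (side effect, transaction index, item to delete)
        for idx, t in enumerate(db):
            if sens <= t:
                for item in sens:
                    # Side effect: preserved itemsets this deletion weakens.
                    cost = sum(1 for f in keep if item in f and f <= t)
                    if best is None or cost < best[0]:
                        best = (cost, idx, item)
        _, idx, item = best
        db[idx].discard(item)
    return db
```

On a toy database such as `[{'a','b','c'}, {'a','b'}, {'a','b','d'}, {'c','d'}]` with a threshold of 2, hiding `{'a','b'}` drops its support below the threshold while leaving itemsets that do not involve the deleted items, such as `{'c'}`, untouched.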

A Selection Method of Database System Quality Characteristics Using the Analytic Hierarchy Process (계층분석적 의사결정기법을 이용한 데이터베이스 시스템 품질 특성의 선정 방법)

  • Park, Mi-Young;Seung, Hyon-Woo
    • Journal of the Korean BIBLIA Society for Library and Information Science
    • /
    • v.20 no.4
    • /
    • pp.191-204
    • /
    • 2009
  • Understanding user needs and quality characteristics is essential for estimating and evaluating user satisfaction and for the quality management of database systems. Based on the ISO 25000 series, an earlier study suggested 5 main quality characteristics, 21 sub-quality characteristics, and 48 internal quality characteristics. Methods exist for comparing the significance of the main, sub, and internal quality characteristics, but the quality model is not easy to apply directly in the database system industry. Also, considering the time and cost of quality evaluation, it is impractical to evaluate all 48 internal quality characteristics, and the required level of quality evaluation is not equal across integrity levels of database systems. Using AHP, this study presents a method for selecting quality characteristics by weight, together with the possibility of applying the quality model.
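As a rough illustration of how AHP turns pairwise judgments into weights, the sketch below uses the row geometric mean approximation of the principal eigenvector together with Saaty's consistency ratio; the 3×3 comparison matrix in the usage note is invented, not taken from the study.

```python
import math

# Saaty's random consistency index, indexed by matrix size n
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric mean."""
    n = len(matrix)
    gm = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(matrix):
    """CR = CI / RI; judgments are usually accepted when CR < 0.1."""
    n = len(matrix)
    w = ahp_weights(matrix)
    aw = [sum(a * wj for a, wj in zip(row, w)) for row in matrix]
    lam = sum(awi / wi for awi, wi in zip(aw, w)) / n  # lambda_max estimate
    ci = (lam - n) / (n - 1)
    return ci / RI[n]
```

For example, if one quality characteristic is judged 3 times as important as a second and 5 times as important as a third, `ahp_weights([[1, 3, 5], [1/3, 1, 3], [1/5, 1/3, 1]])` ranks the three accordingly, with a consistency ratio of about 0.03.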

Developing an IFC-based database for construction quality evaluation

  • Xu, Zhao;Li, Bingjing;Li, Qiming
    • International Conference on Construction Engineering and Project Management
    • /
    • 2017.10a
    • /
    • pp.301-312
    • /
    • 2017
  • Quality evaluation and control are increasingly important concerns for construction quality management, and there is an evident need for a standard data model to serve as the basis for computer-aided quality management. This study focuses on how to evaluate construction quality based on BIM and database technology. In this paper, the reinforced concrete main structure is taken as an example, and a BP (back-propagation) neural network evaluation model is established by consulting the current construction quality acceptance specification and evaluation standard. Furthermore, the IFC standard is extended to integrate quality evaluation information and realize its mapping into the BIM model, contributing to the visualization, transfer, and sharing of evaluation information. A conceptual entity model is then designed to build the quality evaluation database, for which MySQL Workbench is selected. This study addresses the requirements of visualization and data integration in construction quality evaluation, making the process more effective, convenient, and intuitive, making quality problems easier to find, and providing more comprehensive and reliable data for the quality management of construction enterprises and construction administrators.
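A BP evaluation model of the kind described can be sketched as a single-hidden-layer network trained by back-propagation on inspection sub-scores. The network size, the toy sub-scores, and the pass/fail labels below are illustrative assumptions, not the paper's actual model or data.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_bp(samples, hidden=4, epochs=3000, lr=0.5):
    """Train a single-hidden-layer BP network by stochastic gradient descent."""
    n_in = len(samples[0][0])
    w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [random.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, y in samples:
            h = [sigmoid(sum(w * xi for w, xi in zip(w1[j], x)) + b1[j])
                 for j in range(hidden)]
            o = sigmoid(sum(w * hj for w, hj in zip(w2, h)) + b2)
            do = (o - y) * o * (1 - o)               # output-layer delta
            for j in range(hidden):
                dh = do * w2[j] * h[j] * (1 - h[j])  # hidden-layer delta
                w2[j] -= lr * do * h[j]
                b1[j] -= lr * dh
                for i in range(n_in):
                    w1[j][i] -= lr * dh * x[i]
            b2 -= lr * do

    def predict(x):
        h = [sigmoid(sum(w * xi for w, xi in zip(w1[j], x)) + b1[j])
             for j in range(hidden)]
        return sigmoid(sum(w * hj for w, hj in zip(w2, h)) + b2)
    return predict

# Hypothetical inspection sub-scores (e.g. rebar, concrete, dimensions)
# mapped to a pass (1.0) / fail (0.0) quality grade.
samples = [([0.9, 0.8, 0.9], 1.0), ([0.2, 0.3, 0.1], 0.0),
           ([0.8, 0.9, 0.7], 1.0), ([0.1, 0.2, 0.3], 0.0)]
grade = train_bp(samples)
```

After training, the returned `grade` function scores new inspection records between 0 and 1; in a BIM workflow, such scores would be written back to the extended IFC entities for visualization.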


Development of a Quality Assurance Safety Assessment Database for Near Surface Radioactive Waste Disposal

  • Park J.W.;Kim C.L.;Park J.B.;Lee E.Y.;Lee Y.M.;Kang C.H.;Zhou W.;Kozak M.W.
    • Nuclear Engineering and Technology
    • /
    • v.35 no.6
    • /
    • pp.556-565
    • /
    • 2003
  • A quality assurance safety assessment database, called QUARK (QUality Assurance Program for Radioactive Waste Management in Korea), has been developed to manage both the analysis information and the parameter database for safety assessment of the low- and intermediate-level radioactive waste (LILW) disposal facility in Korea. QUARK is a tool that serves QA purposes by managing safety assessment information properly and securely. In QUARK, the information is organized and linked to maximize its integrity and traceability. QUARK provides guidance for conducting safety assessment analysis, from scenario generation to result analysis, and provides a window to inspect and trace previous safety assessment analyses and parameter values. QUARK also provides a default database for safety assessment staff who construct input data files using SAGE (Safety Assessment Groundwater Evaluation), a safety assessment computer code.

The National Clinical Database as an Initiative for Quality Improvement in Japan

  • Murakami, Arata;Hirata, Yasutaka;Motomura, Noboru;Miyata, Hiroaki;Iwanaka, Tadashi;Takamoto, Shinichi
    • Journal of Chest Surgery
    • /
    • v.47 no.5
    • /
    • pp.437-443
    • /
    • 2014
  • The JCVSD (Japan Cardiovascular Surgery Database) was organized in 2000 to improve the quality of cardiovascular surgery in Japan. Web-based data harvesting was started for adult cardiac surgery (Japan Adult Cardiovascular Surgery Database, JACVSD) in 2001 and for congenital heart surgery (Japan Congenital Cardiovascular Surgery Database, JCCVSD) in 2008. Both databases grew to become national databases by the end of 2013. This was influenced by the success of the Society of Thoracic Surgeons' National Database, which contains comparable input items. In 2011, the Japanese Board of Cardiovascular Surgery announced that JACVSD and JCCVSD data were to be used for board certification, which improved the quality of the first paperless, web-based board certification review undertaken in 2013. These changes led to a further step. In 2011, the National Clinical Database (NCD) was organized to investigate the feasibility of clinical databases in other medical fields, especially surgery. In the NCD, the board certification system of the Japan Surgical Society, the umbrella association of surgery, was set as the first level in the hierarchy of specialties, and nine associations and six board certification systems were set at the second level as subspecialties. The NCD grew rapidly and now covers 95% of all surgical procedures. The participating associations have released or will release risk models, and studies that use 'big data' from these databases have been published. The national databases have contributed to evidence-based medicine, to the accountability of medical professionals, and to the quality assessment and quality improvement of surgery in Japan.

The Implementation of Database Building System for Korean Medical Paper Database (한의학술논문 데이터베이스 구축을 위한 입력 및 검수 시스템 개발)

  • Yea, Sang-Jun;Kim, Ik-Tae;Jang, Yun-Ji;Seong, Bo-Seok;Jang, Hyun-Chul;Kim, Sang-Kyun;Kim, An-Na;Song, Mi-Young;Kim, Chul
    • Korean Journal of Oriental Medicine
    • /
    • v.18 no.3
    • /
    • pp.141-146
    • /
    • 2012
  • Objectives : KIOM (Korea Institute of Oriental Medicine) has built a Korean medical paper database and services it through the information portal OASIS. The database is updated with about 1,600 papers and 48,000 references annually. Because updating the database requires considerable manpower and time, it is very important to raise its efficiency and quality. Methods : In this paper, we implemented a web-based database building system that utilizes the pre-built OASIS database to improve the working process, data quality, and ease of management. Results : First, we designed and implemented a web-based system for entering the bibliography of a paper efficiently, which raised efficiency by using the OASIS paper and reference databases. Second, we improved the refining process through the web-based system to raise data quality. Third, we developed the manager functions of the system to control and check the working process. Conclusions : If a Korean medical dictionary and links to external paper databases are added in the future, we expect work efficiency and data quality to rise further. Because the database schemas of the OASIS system and the developed system differ, we are also implementing a data transformation system.

Constructing Database and Probabilistic Analysis for Ultimate Bearing Capacity of Aggregate Pier (쇄석다짐말뚝의 극한지지력 데이터베이스 구축 및 통계학적 분석)

  • Park, Joon-Mo;Kim, Bum-Joo;Jang, Yeon-Soo
    • Journal of the Korean Geotechnical Society
    • /
    • v.30 no.8
    • /
    • pp.25-37
    • /
    • 2014
  • In the load and resistance factor design (LRFD) method, resistance factors are typically calibrated using resistance bias factors obtained either from only the data within ±2σ or from the data excluding the tail values of an assumed probability distribution, in order to increase the reliability of the database. However, this data selection approach has a shortcoming: any low-quality data inadvertently included in the database may not be removed. In this study, a data quality evaluation method, developed based on the quality of static load test results, the engineering characteristics of the in-situ soil, and the dimensions of the aggregate piers, is proposed for use in constructing the database. To evaluate the method, a total of 65 static load test results collected from the literature, including static load test reports, were analyzed. Comparison of the bias factors, coefficients of variation, and resistance factors across database quality levels showed that the uncertainty in estimating bias factors can be reduced by using the proposed data quality evaluation method when constructing the database.
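The ±2σ screening that this abstract contrasts with its quality-based method can be sketched as follows: compute bias factors (measured capacity over predicted capacity), then drop values more than two standard deviations from the mean before estimating the coefficient of variation. The toy capacities in the usage note are invented.

```python
import statistics

def bias_factors(measured, predicted):
    """Bias factor = measured capacity / predicted capacity."""
    return [m / p for m, p in zip(measured, predicted)]

def trim_two_sigma(values):
    """Keep only the data within +/-2 standard deviations of the mean."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [v for v in values if abs(v - mu) <= 2 * sd]

def cov(values):
    """Coefficient of variation = standard deviation / mean."""
    return statistics.stdev(values) / statistics.mean(values)
```

For example, given bias factors `[1.0, 1.1, 0.9, 1.05, 0.95, 3.0]`, the outlying value 3.0 falls outside ±2σ and is trimmed, which sharply lowers the COV and hence changes the calibrated resistance factor; the proposed quality screening instead aims to remove such data on engineering grounds before calibration.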

Intelligent Data Governance for the Federated Integration of Air Quality Databases in the Railway Industry (철도 산업의 공기 질 데이터베이스 연합형 통합을 위한 지능형 데이터 거버넌스)

  • Minjeong, Kim;Jong-Un, Won;Sangchan, Park;Gayoung, Park
    • Journal of Korean Society for Quality Management
    • /
    • v.50 no.4
    • /
    • pp.811-830
    • /
    • 2022
  • Purpose: In this paper, we discuss 1) how to prioritize the databases to be integrated; 2) which data elements should be emphasized in federated database integration; and 3) the degree of efficiency of the integration. This paper aims to lay the groundwork for building data governance by presenting guidelines for database integration, using metrics to identify and evaluate the capabilities of the UK's air quality databases. Methods: This paper performs relative efficiency analysis using Data Envelopment Analysis, one of the multi-criteria decision-making methods. In federated database integration, it is important to identify databases with high integration efficiency when prioritizing the databases to be integrated. Results: The outcome of this paper is not a set of performance indicators for implementing and evaluating data governance, but rather a discussion of which criteria should be used when performing federated integration. Using Data Envelopment Analysis in the process of implementing intelligent data governance, the authors establish and present practical strategies for discovering databases with high integration efficiency. Conclusion: Through this study, it was possible to establish internal guidelines from the integrated viewpoint of data governance. The flexibility of federated database integration under data governance practice makes it possible to integrate databases quickly, easily, and effectively. By utilizing the guidelines presented in this study, the authors anticipate that the process of integrating multiple databases, including the air quality databases, will evolve into intelligent data governance based on federated database integration as data governance practice is established in the railway industry.
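As a toy illustration of the efficiency scores involved: in the single-input, single-output special case, the CCR efficiency of Data Envelopment Analysis reduces to each decision-making unit's output/input ratio normalized by the best ratio. The figures below are invented, and a real multi-input, multi-output study would solve one linear program per unit instead.

```python
def dea_efficiency(inputs, outputs):
    """CCR efficiency for the single-input, single-output special case:
    each DMU's output/input ratio, scaled so the best unit scores 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical databases: input = integration effort, output = data coverage
scores = dea_efficiency(inputs=[2.0, 4.0, 5.0], outputs=[4.0, 6.0, 5.0])
```

Under this sketch, units scoring 1.0 lie on the efficient frontier and would be prioritized for federated integration.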