• Title/Summary/Keyword: DB Quality


Quality Evaluation and Management of a Shared Cataloging DB: the Case of KERIS UNICAT DB (공동목록 DB의 품질평가와 품질관리: KERIS의 종합목록 DB를 중심으로)

  • Lee, Jae-Whoan
    • Journal of the Korean Society for Library and Information Science / v.36 no.1 / pp.61-90 / 2002
  • This study intends to evaluate the quality of the KERIS UNICAT DB and to suggest both theoretical and practical methods for its quality improvement. To this end, the study developed a quality evaluation model and verified the quality of the UNICAT DB in a comprehensive way. Emphasis was placed on analyzing the factors causing inferior and substandard bibliographic records in the UNICAT DB. Management strategies and substantial guidelines to improve the quality of the UNICAT DB are also suggested.

GIS DB Quality Improvement from the user's point of view (사용자 관점에서의 지형DB 품질 확보를 위한 연구)

  • 이권한;이한나;김대중
    • Spatial Information Research / v.11 no.4 / pp.359-370 / 2003
  • This study aims at preparing the foundation for continuous development of the Topographic DB by suggesting a practical quality improvement scheme and proposing a draft for complementing the existing implementation and management processes according to this scheme. For this purpose, we surveyed Topographic DB users to grasp the current DB quality and the required quality level. Then, the suggested quality improvement scheme was concretized and verified through an experimental study, which concentrated on evaluating primitive data, correcting geometrical and logical errors, and management. Finally, a plan to improve the related systems was proposed.


Quality Evaluation of a Shared Cataloging DB : the Case of KOLIS-NET (KOLIS-NET 종합목록 DB의 품질평가)

  • Kim, Sun-Ae;Lee, Soo-Sang
    • Journal of the Korean Society for Library and Information Science / v.40 no.1 / pp.95-117 / 2006
  • The purpose of this study is to evaluate the quality of the KOLIS-NET DB, which holds bibliographic data for the collections of nationwide public libraries. The quality evaluation of the KOLIS-NET DB was inspired by the successful experience of precedent research, and the case study focuses on six quality dimensions: coverage, duplication, currentness, accuracy, consistency, and completeness. The study comprehensively verified the quality of the KOLIS-NET DB through a quality evaluation model and analyzed the factors causing inferior and substandard bibliographic records in the KOLIS-NET DB. Based on the results of the quality evaluation, quality improvements for the KOLIS-NET DB were suggested.
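Quality dimensions like those named in this abstract can each be operationalized as a simple measurement over the record set. The sketch below is a hedged illustration only: the record fields (`isbn`, `year`), the duplication key, and the currentness horizon are assumptions for the example, not details taken from the paper.

```python
from datetime import date

# Hypothetical bibliographic records; field names are illustrative only.
records = [
    {"isbn": "9788901234567", "title": "Data Quality", "year": 2005},
    {"isbn": "9788901234567", "title": "Data Quality", "year": 2005},  # duplicate
    {"isbn": "9788909999999", "title": "Cataloging Rules", "year": 1998},
]

def duplication_rate(recs):
    """Share of records whose key (here: ISBN) appeared earlier in the set."""
    seen, dupes = set(), 0
    for r in recs:
        if r["isbn"] in seen:
            dupes += 1
        seen.add(r["isbn"])
    return dupes / len(recs)

def currentness_rate(recs, horizon_years=5, today=date(2006, 1, 1)):
    """Share of records published within the last `horizon_years` years."""
    return sum(r["year"] >= today.year - horizon_years for r in recs) / len(recs)

print(duplication_rate(records))   # 1 of 3 records is a duplicate of an earlier one
print(currentness_rate(records))   # 2 of 3 records fall within the 5-year horizon
```

Each of the other dimensions (coverage, accuracy, consistency, completeness) would similarly reduce to a ratio over records, though some, such as accuracy, require comparison against an authoritative source rather than a purely internal check.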

A Study on Real-time Quality Evaluation Method of Bibliographic Database (실시간 서지데이터베이스 평가방법에 관한 연구)

  • 노경란;권오진;유현종;문영호;홍성화
    • The Journal of the Korea Contents Association / v.2 no.4 / pp.76-84 / 2002
  • In the conventional database evaluation method, the person in charge of each specialty database (the DB manager) manually composes evaluation sheets for correction and revision of the already-constructed database and carries out measurement and re-education of DB workers based upon them. As a result, this approach consumes much time on measurement work and career information about DB workers, causing low time and cost efficiency and a lack of systematic management of DB workers, and thus hinders database quality improvement. This research provides on-line, real-time measurement results on the efficiency of DB production and of DB workers by combining static measurement with dynamic measurement by the DB manager, both of which utilize the system. Therefore, the DB manager can contribute to the improvement of DB quality by deciding whether DB workers continue DB production, or by re-educating DB workers, without being affected by time or spatial constraints.


Methods for Quality Control and Evaluation in the Scientific and Technical Bibliographic Databases (과학기술분야 서지 DB의 품질관리 및 평가 방안: KORDIC의 KRISTAL DB를 중심으로)

  • Lee Jae-Whoan
    • Journal of the Korean Society for Library and Information Science / v.31 no.3 / pp.109-134 / 1997
  • This study discusses the quality issues of large scientific and technical (S&T) bibliographic databases in South Korea. In detail, this study develops criteria to evaluate the quality of S&T bibliographic databases, evaluates the quality of two selected databases, the UNION DB and SATURN DB of KORDIC, and finally suggests both organizational and technical methods for the quality improvement of such bibliographic databases.


A Study on the Quality Management of a Union DB Built in a Distributed System (분산체계로 구축된 통합 DB의 품질관리에 관한 연구)

  • Lee Jae-Whoan
    • Journal of the Korean Society for Library and Information Science / v.32 no.3 / pp.179-206 / 1998
  • The purpose of this study is to discuss theoretical and practical strategies for the quality management of a union database built in a distributed system. To this end, this study introduces the quality control methods employed for the OCLC union catalog, which has been built in a distributed system and is known as the most typical union DB. The main discussion concerns the quality of KORDIC's SATURN DB, the most typical union DB in South Korea. The final recommendation includes management strategies and practical guidelines for the efficient quality control of a union DB built in a distributed system.


A Study on Quality Evaluation of Medical Web DBs : PubMed and Embase (의학 분야 Web DB의 품질평가 -PubMed와 Embase를 대상으로-)

  • Kim, Sang-Jun
    • Journal of the Korean Society for Library and Information Science / v.38 no.2 / pp.161-187 / 2004
  • This study concerns the quality of Web databases produced in medical science. For the quality evaluation of Embase and PubMed, 10 evaluation criteria were developed on the basis of a literature review. The evaluation results showed that PubMed is superior in currentness, accuracy, completeness, consistency, ease of use, customer support, searchability, cost, and network and hardware, while Embase is superior only in coverage. These evaluation results can inform purchasing, user training, and library services.

A Study on the Quality Evaluation of Scholarly Web Databases Focused on NDSL, PubMed, Scopus, and Web of Science (학술 웹 데이터베이스의 품질 비교 평가 : NDSL, PubMed, Scopus와 Web of Science를 중심으로)

  • Kim, Sang-Jun
    • Journal of Information Management / v.36 no.3 / pp.127-165 / 2005
  • This study focuses on the quality of scholarly Web databases. For the quality evaluation of NDSL, PubMed, Scopus, and WoS, 10 evaluation criteria are developed on the basis of a literature review. The evaluation results show that NDSL and PubMed are superior in currentness and cost, while Scopus and WoS are superior in citation information and analysis tools. These evaluation results can inform purchasing, user training, and library services.

Pruning Methodology for Reducing the Size of Speech DB for Corpus-based TTS Systems (코퍼스 기반 음성합성기의 데이터베이스 축소 방법)

  • 최승호;엄기완;강상기;김진영
    • The Journal of the Acoustical Society of Korea / v.22 no.8 / pp.703-710 / 2003
  • Because of their human-like synthesized speech quality, Corpus-Based Text-To-Speech (CB-TTS) systems have recently been actively studied worldwide. However, due to the large size of their speech database (DB), their application is very restricted. In this paper we propose and evaluate three DB reduction algorithms designed to solve this drawback. The first method is based on a K-means clustering approach, which selects k representatives among multiple instances. The second method keeps only those unit instances that are selected during synthesis, using domain-restricted text as input to the synthesizer. The third method is a hybrid of the above two: using a large text as input, the given sentences are synthesized and the used unit instances and their occurrence information are extracted; a modified K-means clustering that also takes this occurrence information into account is then applied. Finally, we compare the three pruning methods by evaluating synthesized speech quality at similar DB reduction rates. Based on perceptual listening tests, we conclude that the last method shows the best performance among the three algorithms. Moreover, the results show that the last method is able to reduce DB size without speech quality loss.
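The first pruning method in this abstract, selecting k representatives among a unit's multiple instances via K-means, can be sketched roughly as below. This is a minimal illustration under assumed data shapes (each instance reduced to a feature vector), not the authors' implementation; the function name and the medoid-style pick of the nearest real instance are choices made for the example.

```python
import numpy as np

def prune_unit_instances(instances, k, iters=20, seed=0):
    """Reduce a unit's instances to at most k representatives:
    run K-means on the feature vectors, then keep the real instance
    nearest to each cluster centroid (a medoid-style pick)."""
    X = np.asarray(instances, dtype=float)
    if len(X) <= k:
        return list(range(len(X)))
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each instance to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; an empty cluster keeps its old centroid.
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    # Keep the instance closest to each centroid; duplicates collapse.
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return sorted(set(int(d[:, j].argmin()) for j in range(k)))

# Toy example: 6 instances of one unit, pruned to 2 representatives.
feats = [[0.0, 0.1], [0.1, 0.0], [0.05, 0.05],
         [5.0, 5.1], [5.1, 5.0], [5.05, 5.05]]
kept = prune_unit_instances(feats, k=2)
print(kept)  # indices of the kept representative instances
```

The occurrence-weighted variant described as the third method would differ mainly in weighting each instance by how often synthesis selected it when computing cluster means.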

Implementation of Wideband Waveform Interpolation Coder for TTS DB Compression (TTS DB 압축을 위한 광대역 파형보간 부호기 구현)

  • Yang, Hee-Sik;Hahn, Min-Soo
    • MALSORI / v.55 / pp.143-158 / 2005
  • An adequate compression algorithm is essential to achieve a high-quality embedded TTS system. In this paper, we propose a waveform interpolation coder for TTS corpus compression, selected after investigating several speech coders. Unlike speech coders in communication systems, compression rate and quality are more important factors in TTS DB compression than other performance criteria. Thus we select the waveform interpolation algorithm because it provides good speech quality under a high compression rate, at the cost of complexity. The implemented coder achieves a bit rate of 6 kbps with a quality degradation of 0.47. This performance indicates that waveform interpolation is adequate for TTS DB compression, with some further study.
