• Title/Summary/Keyword: DB Quality

A Study on the Measurement of Voluntary Disclosure Quality Using Real-Time Disclosure By Programming Technology

  • Shin, YeounOuk; Kim, KiBum
    • International journal of advanced smart convergence / v.7 no.2 / pp.86-94 / 2018
  • This study focuses on presenting an IT program module that provides real-time forecasting and a database of the voluntary disclosure quality measure, in order to address the capital-cost problem caused by information asymmetry between external investors and corporate executives. The study suggests an algorithm model in which the quality of real-time voluntary disclosure can be provided to all investors immediately by an IT program, so as to deliver meaningful value in the domestic capital market. The method generates and analyzes real-time or non-real-time prediction models by transferring the predicted estimates delivered to the Big Data Log Analysis System, through the statistical DB, to the statistical forecasting engine.
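
As a rough illustration of the pipeline this abstract describes (predicted estimates flowing through a statistical DB into a forecasting engine), here is a minimal Python sketch; every name in it (StatsDB, ForecastEngine, process_disclosure_event) is hypothetical, since the paper's module API is not published here.

```python
# Hypothetical sketch of the disclosure-quality pipeline described above;
# none of these names come from the paper.
import statistics

class StatsDB:
    """Toy stand-in for the statistical DB that accumulates predicted estimates."""
    def __init__(self):
        self.estimates = []

    def store(self, estimate):
        self.estimates.append(estimate)

class ForecastEngine:
    """Toy stand-in for the statistical forecasting engine."""
    def predict(self, estimates):
        # naive forecast: mean of the most recent estimates
        return statistics.mean(estimates[-10:])

def process_disclosure_event(db, engine, estimate):
    db.store(estimate)                    # estimate flows into the statistical DB
    return engine.predict(db.estimates)   # engine produces the next forecast

db, engine = StatsDB(), ForecastEngine()
for e in [0.62, 0.58, 0.71]:              # illustrative disclosure-quality estimates
    print(process_disclosure_event(db, engine, e))
```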

The implementation of database for high quality Embedded Text-to-speech system (고품질 내장형 음성합성 시스템을 위한 음성합성 DB구현)

  • Kwon, Oh-Il
    • Journal of the Institute of Electronics Engineers of Korea SP / v.42 no.4 s.304 / pp.103-110 / 2005
  • The speech database is one of the most important parts of a text-to-speech (TTS) system. An embedded TTS system, in particular, needs a much smaller database than a server-based TTS system, so compression and statistical reduction of the database are very important factors in an embedded TTS system. However, such compression and statistical reduction always incur a loss of quality in the synthesized speech. In this paper, we propose a method of constructing a database for a high-quality embedded TTS system and verify the quality of the synthesized speech with a MOS (Mean Opinion Score) test.
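
For context, the MOS test mentioned above averages listener ratings of synthesized speech on a standard 1 (bad) to 5 (excellent) scale. A minimal sketch, with purely illustrative ratings:

```python
# Minimal MOS computation; the rating lists are illustrative, not the paper's data.
def mean_opinion_score(ratings):
    """Average listener ratings on the standard 1 (bad) to 5 (excellent) scale."""
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("MOS ratings must lie between 1 and 5")
    return sum(ratings) / len(ratings)

baseline = [3, 4, 3, 3, 4]   # e.g., heavily compressed DB (hypothetical)
proposed = [4, 5, 4, 4, 4]   # e.g., proposed DB construction (hypothetical)
print(mean_opinion_score(baseline), mean_opinion_score(proposed))
```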

A Forensic Methodology for Detecting Image Manipulations (이미지 조작 탐지를 위한 포렌식 방법론)

  • Jiwon Lee; Seungjae Jeon; Yunji Park; Jaehyun Chung; Doowon Jeong
    • Journal of the Korea Institute of Information Security & Cryptology / v.33 no.4 / pp.671-685 / 2023
  • By applying artificial intelligence to image editing technology, it has become possible to generate high-quality images with minimal traces of manipulation. However, since these technologies can be misused for criminal activities such as the dissemination of false information, destruction of evidence, and denial of facts, it is crucial to implement strong countermeasures. In this study, image file analysis and mobile forensic artifact analysis were conducted to detect image manipulation. Image file analysis involves parsing the metadata of manipulated images and comparing it with a Reference DB to detect manipulation. The Reference DB is a database that collects manipulation-related traces left in image metadata, which serves as a criterion for detecting image manipulation. In the mobile forensic artifact analysis, packages related to image editing tools were extracted and analyzed to aid the detection of image manipulation. The proposed methodology overcomes the limitations of existing graphic-feature-based analysis and, combined with image processing techniques, offers the advantage of reducing false positives. The research results demonstrate the significant role of this methodology in digital forensic investigation and analysis. Additionally, we provide the code for parsing image metadata and the Reference DB, along with a dataset of manipulated images, aiming to contribute to related research.
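
A minimal sketch of the metadata-comparison idea, assuming the Pillow library for EXIF parsing; the trace patterns below are illustrative stand-ins, not the paper's published Reference DB:

```python
# Sketch of metadata-based manipulation detection; REFERENCE_DB entries
# are illustrative examples, not the authors' published database.
from PIL import Image
from PIL.ExifTags import TAGS

REFERENCE_DB = {                 # editor traces left in the EXIF "Software" tag
    "Adobe Photoshop": "desktop photo editor",
    "Snapseed": "mobile photo editor",
}

def detect_manipulation(path):
    exif = Image.open(path).getexif()
    meta = {TAGS.get(tag, tag): value for tag, value in exif.items()}
    software = str(meta.get("Software", ""))
    for trace, category in REFERENCE_DB.items():
        if trace in software:
            return f"possible manipulation: {category} ({trace})"
    return "no known editor trace in metadata"

print(detect_manipulation("sample.jpg"))   # "sample.jpg" is a placeholder path
```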

A base study of an Ecological Mapping technique by using GIS and Remote Sensing (GIS와 RS를 이용한 생태지도 작성기법에 관한 기초연구)

  • Yi, Gi-Chul; Lee, Won-Hwa; Yoon, Hae-Soon; Nam, Chun-Hee; Kim, Gu-Yeon; Kim, Seong-Hwan; Suh, Sang-Hyun
    • Journal of the Korean Association of Geographic Information Studies / v.7 no.3 / pp.57-69 / 2004
  • This study developed an ecological mapping technique with a GIS database, using analyses of existing ecological survey reports and change detection on the Nakdong river estuary. The data used to establish the GIS DB include two Landsat TM images (Nov. 31, 1984 and May 17, 1997), 1:25,000 topographical maps produced by the National Geography Institution, and various ecological survey reports published by the Busan metropolitan city government. The procedure for producing the ecological map is as follows. First, current ecomapping methods and previous ecological surveys of the Nakdong river estuary were carefully examined. Second, land cover maps were created from the classified Landsat images of 1984 and 1997 for spatiotemporal ecosystem analysis. Third, the ecosystem was evaluated using the GIS ecological database based on criteria of botany, zoology, water quality, etc. Each criterion was reclassified into three stages describing the overall quality of the ecological condition. Finally, a comprehensive ecological map was suggested as a prototype ecosystem assessment and management tool, with a discussion of further study. The findings of this study would be a milestone for preserving and managing the ecosystem.
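
The third step (reclassifying each evaluation criterion into three stages) can be sketched as below; the thresholds are hypothetical, since the paper's break values are not given here:

```python
# Reclassification of normalized criterion scores into 3 stages;
# the break values are assumptions for illustration.
def reclassify(score, breaks=(0.33, 0.66)):
    """Map a normalized criterion score in [0, 1] to stage 1 (poor) .. 3 (good)."""
    if score < breaks[0]:
        return 1
    if score < breaks[1]:
        return 2
    return 3

cell = {"botany": 0.72, "zoology": 0.41, "water_quality": 0.15}   # one map cell
print({criterion: reclassify(v) for criterion, v in cell.items()})
```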

Development of Linking & Management System for High-Resolution Raw Geo-spatial Data based on the Point Cloud DB (Point Cloud 기반의 고해상도 원시데이터 연계 및 관리시스템 개발)

  • KIM, Jae-Hak; LEE, Dong-Ha
    • Journal of the Korean Association of Geographic Information Studies / v.21 no.4 / pp.132-144 / 2018
  • 3D geo-spatial information models have been widely used in civil engineering, medicine, computer graphics, urban management, and many other fields. In the surveying and geo-spatial field in particular, demand for high-quality 3D geo-spatial information and indoor spatial information is increasing rapidly. However, it is difficult to provide a low-cost, high-efficiency service to fields that demand the highest quality of 3D model, because pre-constructed spatial data are composed of different formats and storage structures according to the application purpose of each institute. In fact, the techniques used to construct a highly applicable 3D geo-spatial model make collecting and analyzing geo-spatial data very expensive, yet most users of 3D geo-spatial models are unwilling to pay that high cost. This study therefore suggests an effective way to construct a 3D geo-spatial model at low cost. In general, the effective way to reduce the cost of constructing a 3D geo-spatial model, as presented in previous studies, is to combine raw data obtained from point cloud observation and UAV imagery; however, this approach is limited by the difficulty of obtaining approval to use the raw data, because the data have been managed separately by various institutes. To solve this problem, we developed a linking and management system that unifies high-resolution raw geo-spatial data based on a point cloud DB, and applied the system to extract the basic database from the 3D geo-spatial model for road database registration. As a result, the developed system, based on the point cloud DB, can provide the six main entry contents for road registration.
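
A minimal sketch of the linking idea, keying heterogeneous raw sources to point cloud tiles in one table; the SQLite schema, field names, and rows are assumptions, not the paper's actual design:

```python
# Toy point-cloud-keyed registry of raw sources managed by different institutes.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE raw_source (
    tile_id     TEXT,   -- point cloud tile the raw data is linked to
    source_type TEXT,   -- 'point_cloud' or 'uav_image'
    institute   TEXT,   -- institute that manages the raw data
    path        TEXT)""")
con.executemany("INSERT INTO raw_source VALUES (?,?,?,?)", [
    ("T-001", "point_cloud", "Institute A", "/raw/t001.las"),
    ("T-001", "uav_image",   "Institute B", "/raw/t001_ortho.tif"),
])
# One query now returns every raw source linked to a tile,
# regardless of which institute manages it.
for row in con.execute("SELECT * FROM raw_source WHERE tile_id = 'T-001'"):
    print(row)
```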

Initial System for Automation of PDQ-based Shape Quality Verification of Naval Ship Product Model (제품데이터품질(PDQ) 평가에 따른 함정 제품모델의 형상 품질검증 자동화 초기 시스템)

  • Oh, Dae-Kyun; Hwang, In-Hyuck; Ryu, Cheol-Ho; Lee, Dong-Kun
    • Journal of the Korean Society of Marine Environment & Safety / v.20 no.1 / pp.113-119 / 2014
  • Recently, the R.O.K. Navy has been increasing the reusability of design data and the application of M&S (Modeling and Simulation) through the establishment of a collaborative product development environment centered on the Naval Ship Product Model (NSPM). As a result, the reliability of design results is improving, and a study to improve construction quality through simulation of production and operation is in progress. Accordingly, building the database of design data and ensuring DB (database) quality have become important, but related research has been absent or remains at an initial stage. This paper studied a system for the quality verification process of the shape elements that compose the NSPM, based on the NSPM quality verification guideline produced by the preceding study. The verification object was limited to the hull surface. The study verified two things: whether the CAD model of the hull surface applies the basic drawing, and whether there are errors in the geometric quality of the CAD model. To achieve this goal, the verification criteria and algorithm were defined, and a prototype system based on them was developed.
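
One geometric quality check of the kind a PDQ (Product Data Quality) guideline prescribes, flagging gaps between surface patch edges that exceed a tolerance, might look like the sketch below; the tolerance value and edge representation are hypothetical, not the NSPM guideline's actual criteria:

```python
# Toy gap check between surface patch edge endpoints that should coincide;
# tolerance and data are illustrative only.
import math

def check_patch_gaps(edge_pairs, tol=0.001):
    """edge_pairs: list of ((x,y,z), (x,y,z)) endpoints that should meet.
    Returns (index, gap) for every pair whose gap exceeds tol."""
    return [(i, math.dist(p, q))
            for i, (p, q) in enumerate(edge_pairs)
            if math.dist(p, q) > tol]

edges = [((0, 0, 0), (0, 0, 0.0004)),   # within tolerance
         ((1, 2, 0), (1, 2, 0.003))]    # gap error, should be reported
print(check_patch_gaps(edges))           # -> [(1, 0.003)]
```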

Reengineering of the Data Collection Process for Discharge Abstract Database (퇴원환자 진료정보 DB의 데이터 수집 과정 재설계)

  • Hong, Joon Hyun; Choi, Kwisook; Lee, Eun Mee
    • Quality Improvement in Health Care / v.7 no.1 / pp.106-116 / 2000
  • Background : Severance Hospital is a university hospital with 1,580 beds. A LAN system was installed in the Medical Record Department in 1992, and discharge abstract data have since been added to the discharge abstract database (DB). The previous workflow in the Medical Record Department had 5 levels: 1) chart collection from wards, 2) assembling, 3) abstracting data from the medical record onto a worksheet by 2 RRAs, 4) checking deficiencies and coding diagnoses and procedures by 4 RRAs, 5) inputting the data into the discharge abstract database by 1 RRA. The average processing time was 19.3 days from the patient's discharge date, which delayed the production of the monthly statistical report and caused complaints from users in the hospital. Methods : A CQI team was organized to find a way to shorten the processing time to less than 10 days. The team identified the factors that made the processing time long and integrated the three levels from the 3rd level onward into one. Each of 7 RRAs performed the integrated level at her own workstation instead of taking one of the three separate levels. The processing time before and after the change was compared using 3,846 discharges of April 1999 and 4,189 discharges of August 1999. Results : The average processing time was shortened from 19.3 days to 8.7 days. In particular, the integrated level took only 3.6 days, compared with 12.3 days before the change. The percentage of cases in which the whole processing was finished within 10 days of discharge increased to 77.6%, from 2.4% before the integration. The prevalence of data-input errors did not increase under the new method. Conclusions : The integrated processing method has the following advantages: 1) expedited production of the monthly statistical report, 2) an increased utilization rate of discharge abstract data by the Billing Dept., Emergency Room, QI Dept., etc., 3) improved intradepartmental workflow, and 4) enhanced medical record quality through earlier checking of deficiencies.

A Study on the Behaviors and Customer Satisfactions of University Library Users of the Electronic Journals (대학도서관 전자저널이용자의 이용행태와 만족도에 관한 연구 - K대학교 도서관이용자를 중심으로 -)

  • Oh, Dong-Geun; Kim, Sook-Chan
    • Journal of the Korean Society for Information Management / v.23 no.4 s.62 / pp.129-146 / 2006
  • This study analyzed the behaviors of electronic journal users and the influence of electronic journal service quality on customer satisfaction, customer loyalty, and frequency of visits to the library building. Approximately 60 percent of users prefer e-journals to printed formats. Electronic journal service quality was measured along four dimensions: reliability of service, convenience of service, public relations, and user instruction. 100 faculty members and 267 graduate students were surveyed using questionnaires. It was concluded that each dimension of service quality positively influenced customer satisfaction, and that customer satisfaction positively influenced loyalty and negatively influenced frequency of visits to the library building.

Developing dirty data cleansing service between SOA-based services (SOA 기반 서비스 사이의 오류 데이터 정제 서비스 개발)

  • Ji, Eun-Mi; Choi, Byoung-Ju; Lee, Jung-Won
    • The KIPS Transactions: Part D / v.14D no.7 / pp.829-840 / 2007
  • Dirty data cleansing techniques have so far aimed to integrate large amounts of data from various sources and to manage the quality of data residing in a DB so that meaningful information can be extracted. Prompt response to a varying environment is required to survive in a rapidly changing business environment and an age of limitless competition. As system requirements have recently grown more complex, Service-Oriented Architecture (SOA) has proliferated for the integration and implementation of massive distributed systems. SOA therefore necessarily requires data cleansing for the data exchanged among services. In this paper, we performed quality management of the XML data transmitted through events between services while they were integrated into a single system. As a result, we developed a dirty data cleansing service based on SOA, focusing on cleansing between interacting services rather than on detecting data errors in an already integrated DB.
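
A minimal sketch of event-time XML cleansing between services, using only the Python standard library; the rules shown (whitespace trimming, required-field checks) are illustrative, not the paper's rule set:

```python
# Toy cleansing step applied to an XML event message before it is
# forwarded to the consuming service; the schema is hypothetical.
import xml.etree.ElementTree as ET

REQUIRED = {"orderId", "amount"}   # hypothetical required fields

def cleanse(xml_text):
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        if elem.text:
            elem.text = elem.text.strip()              # normalize whitespace
    missing = REQUIRED - {child.tag for child in root}
    if missing:
        raise ValueError(f"dirty message, missing fields: {missing}")
    return ET.tostring(root, encoding="unicode")

msg = "<order><orderId> A-17 </orderId><amount>250</amount></order>"
print(cleanse(msg))   # cleansed XML forwarded to the consuming service
```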

Analysis of Quality Measurement & Evaluation Index in applying Web Information Service I (Web 정보서비스 평가를 위한 기존 측정지표 분석 I)

  • Yoo, Sa-Rah
    • Journal of the Korean Society for Library and Information Science / v.34 no.3 / pp.133-156 / 2000
  • A fundamental issue to consider when searching Web information is the quality of the information itself and of the service. If a Quality Information System (QIS) for the digital library is sought, new measurement criteria and an evaluation index for Web information services are required. Reliance on the existing evaluation criteria is not acceptable if the data retrieved are not information and the service is only noise to the end-user. Applying the existing evaluation criteria for online databases to an Environmental & Energy Engineering Web DB revealed their limitations and provided practical, case-based information for improvement. No attempt was made to comprehensively survey all of the evaluation methods that could possibly be relevant; instead, this discussion concentrates on the information service evaluation index being developed in the KDPC Project. The research found that domestic Web-DB services are still woefully insufficient for conducting comprehensive investigations on environmental topics. More qualified and specialized R&D Web information services, and the development of new evaluation criteria based on this investigation (research I), will be discussed in follow-up research II.
