• Title/Abstract/Keywords: High Quality Data

Search results: 5,191 items (processing time: 0.036 seconds)

공공 한영 병렬 말뭉치를 이용한 기계번역 성능 향상 연구 (A Study on the Performance Improvement of Machine Translation Using Public Korean-English Parallel Corpus)

  • 박찬준;임희석
    • 디지털융복합연구
    • /
    • Vol. 18, No. 6
    • /
    • pp.271-277
    • /
    • 2020
  • Machine translation refers to software by which a computer translates a source language into a target language; research has progressed from rule-based and statistical machine translation to, most recently, active work on neural machine translation. One of the most important ingredients of neural machine translation is a high-quality parallel corpus, but high-quality parallel corpora for Korean language pairs have so far been hard to obtain. Recently, the National Information Society Agency released, through AI HUB, a high-quality Korean-English machine translation parallel corpus of 1.6 million sentences. This paper verifies the quality of that corpus by comparing translation performance against OpenSubtitles, the most widely used Korean-English parallel data to date. For objectivity, the test set released by IWSLT, an official test set for Korean-English machine translation, was used. The experiments showed better performance than previous Korean-English machine translation studies using the same test set, demonstrating the importance of high-quality data.
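The abstract above attributes the performance gain to corpus quality. As a purely illustrative aside (not the paper's method, which compares corpora by training full MT models), a common first-pass quality check on parallel data is a length-ratio filter that drops implausibly matched sentence pairs, a frequent symptom of misaligned subtitle corpora such as OpenSubtitles; the thresholds below are arbitrary examples.

```python
# Minimal sketch of a parallel-corpus quality filter (illustrative only).
# A pair whose token-length ratio is extreme is likely misaligned.

def keep_pair(src: str, tgt: str, max_ratio: float = 3.0, max_len: int = 200) -> bool:
    """Return True if the sentence pair passes basic length-based checks."""
    s, t = len(src.split()), len(tgt.split())
    if s == 0 or t == 0 or s > max_len or t > max_len:
        return False
    return max(s, t) / min(s, t) <= max_ratio

pairs = [
    ("나는 학교에 간다", "I go to school"),                                  # plausible pair
    ("안녕", "This is a very long unrelated English sentence indeed"),       # likely misaligned
]
filtered = [p for p in pairs if keep_pair(*p)]
```

Real corpus-cleaning pipelines add language identification and alignment-score filters on top of this, but the length heuristic alone already removes much subtitle noise.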

Proposal of Public Data Quality Management Level Evaluation Domain Rule Mapping Model

  • Jeong, Ha-Na;Kim, Jae-Woong;Chung, Young-Suk
    • 한국컴퓨터정보학회논문지
    • /
    • Vol. 27, No. 12
    • /
    • pp.189-195
    • /
    • 2022
  • The Korean government has made it a major national agenda to contribute to revitalizing the creative economy, including new industries and job creation, by encouraging the private-sector release and use of public data. To secure high-quality public data, it promotes quality improvement through activities such as the public data quality management level evaluation. However, because the evaluation results vary with the data expertise and understanding of the user of the quality diagnosis tool, the accuracy of the diagnosis results is hard to guarantee. To ensure accurate diagnosis results even for users with low data literacy, this paper proposes a domain rule mapping model for the public data quality management level evaluation, applicable to validity diagnosis among the data quality diagnosis criteria. Applying real data to the model confirmed that it improves the stability and accuracy of public data quality diagnosis.
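The validity diagnosis the abstract describes can be pictured as mapping each column's domain to a machine-checkable rule, so the result no longer depends on the operator's data literacy. The sketch below is a hypothetical illustration of that idea; the domain names and patterns are invented, not the paper's actual model.

```python
# Illustrative domain-rule mapping for validity diagnosis.
# Each column domain maps to a validation rule (regular expressions here).
import re

DOMAIN_RULES = {
    "date": re.compile(r"^\d{4}-\d{2}-\d{2}$"),      # e.g. 2022-12-01
    "phone": re.compile(r"^\d{2,3}-\d{3,4}-\d{4}$"), # e.g. 02-123-4567
}

def diagnose(column_domain: str, values: list) -> float:
    """Return the validity ratio of a column under its mapped domain rule."""
    rule = DOMAIN_RULES[column_domain]
    valid = sum(1 for v in values if rule.fullmatch(v))
    return valid / len(values)

ratio = diagnose("date", ["2022-12-01", "2022/12/02", "2022-12-03"])
```

Because the rule, not the human, decides validity, two users diagnosing the same column get the same ratio, which is the consistency property the proposed model targets.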

LOSA Data 품질(Quality)에 영향을 미치는 요소 (Factors Affecting LOSA Data Quality)

  • 이경호;이장룡
    • 한국항공운항학회지
    • /
    • Vol. 31, No. 2
    • /
    • pp.72-80
    • /
    • 2023
  • Line Operations Safety Audit (LOSA) is a well-known preventive aviation safety program for Threat and Error Management (TEM). LOSA data of a quality suitable for safety management are obtained when a flight crew flies with the same level of attention as on an ordinary flight. Factors contributing to LOSA data quality may include the flight crew's understanding of the purpose of LOSA, the observer's career, and the characteristics of the organization responsible for LOSA operations. This study explored the purposes of TEM and LOSA as well as their relationship. Previous studies noted that the quality of LOSA data can be influenced by heuristic judgment, the Hawthorne effect, and the priming effect. This study recognized the importance of LOSA data quality for effective use in preventive safety management, and confirmed that the level of understanding of the LOSA concept, the experience of the observer, and the characteristics of the department in charge of LOSA operation can affect the quality of LOSA data.

A Study on Quality Checking of National Scholar Content DB

  • Kim, Byung-Kyu;Choi, Seon-Hee;Kim, Jay-Hoon;You, Beom-Jong
    • International Journal of Contents
    • /
    • Vol. 6, No. 3
    • /
    • pp.1-4
    • /
    • 2010
  • National-level management and retrieval services for the national scholar content DB are very important. High-quality content can improve users' utilization and satisfaction and provides a strong basis for both citation index creation and the calculation of journal impact factors. A system is therefore needed to check data quality effectively. We have studied and developed a web-based data quality checking system that supports everything from raw digital data to automatic validation as well as hands-on validation, all of which is discussed in this paper.

Accurate Camera Self-Calibration based on Image Quality Assessment

  • Fayyaz, Rabia;Rhee, Eun Joo
    • Journal of Information Technology Applications and Management
    • /
    • Vol. 25, No. 2
    • /
    • pp.41-52
    • /
    • 2018
  • This paper presents a method for accurate camera self-calibration based on SIFT feature detection and image quality assessment. We performed image quality assessment to select high-quality images for the camera self-calibration process, defining high-quality images as those that contain little or no blur and have the maximum contrast among images captured within a short period. The assessment includes blur detection and contrast assessment. Blur detection is based on statistical analysis of the energy and standard deviation of the images' high-frequency components using the Discrete Cosine Transform; contrast assessment is based on contrast measurement and selection of the highest-contrast images among those captured in a short period. Experimental results show little or no distortion in the perspective view of the images, and the suggested method achieves a camera self-calibration accuracy of approximately 93%.
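The blur-detection step above rests on a simple observation: a blurred image concentrates its DCT spectral energy in low frequencies. A minimal sketch of that measurement follows; the high-frequency cutoff and the toy images are illustrative choices, not the paper's actual thresholds or data.

```python
# Hedged sketch of DCT-based blur detection via high-frequency energy.
import numpy as np

def dct_matrix(n: int) -> np.ndarray:
    """Orthonormal DCT-II transform matrix."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    c = np.cos(np.pi * (2 * m + 1) * k / (2 * n)) * np.sqrt(2.0 / n)
    c[0, :] = np.sqrt(1.0 / n)
    return c

def high_freq_ratio(img: np.ndarray) -> float:
    """Share of spectral energy in the high-frequency DCT coefficients."""
    n = img.shape[0]
    c = dct_matrix(n)
    coef = c @ img @ c.T                              # 2D DCT-II
    k1, k2 = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    high = (k1 + k2) >= n                             # crude high-frequency mask
    energy = coef ** 2
    return float(energy[high].sum() / energy.sum())

rng = np.random.default_rng(0)
sharp = rng.random((16, 16))          # noisy patch: much high-frequency energy
blurred = np.full((16, 16), 0.5)      # flat patch: all energy at DC
```

An image scoring a higher ratio than its neighbours captured in the same short period would be kept as "sharp"; the statistical part of the paper's method (energy plus standard deviation) refines this single-number score.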

Multi-Server based Distributed Coding based on HEVC/H.265 for Studio Quality Video Editing

  • Kim, Jongho;Lim, Sung-Chang;Jeong, Se-Yoon;Kim, Hui-Yong
    • Journal of Multimedia Information System
    • /
    • Vol. 5, No. 3
    • /
    • pp.201-208
    • /
    • 2018
  • High Efficiency Video Coding range extensions (HEVC RExt) is an extension of HEVC designed specifically for high-quality images, which makes it essential for studio editing, where very high-quality images of various types are handled. Handling such massive data in studio editing raises several problems, one of the most important being the re-encoding and decoding required during editing. Various codecs are widely used for studio data editing, but most share a common weakness: re-encoding and decoding occur frequently during editing, causing enormous time consumption and video quality loss. In this paper, we propose a new video coding structure for efficient studio video editing, called "ultra-low delay (ULD)", which has a very simple, low-delay referencing structure. By simplifying the referencing structure, we minimize the number of frames that require decoding and re-encoding, which also prevents the quality degradation caused by frequent re-encoding. We further propose several fast coding algorithms for efficient editing, such as tool-level optimization, multi-server based distributed coding, and SIMD (single instruction, multiple data) based parallel processing, which reduce the enormous computational complexity of the editing procedure. The proposed method is roughly 9,500 times faster with negligible loss of quality and shows better coding gain than an "intra only" structure, confirming that it efficiently solves the existing problems of studio video editing.
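The cost that ULD attacks can be made concrete by counting how many frames must be re-encoded after one frame is edited under different referencing structures. The sketch below is an invented illustration of that trade-off, not the paper's algorithm; the ULD dependency window size is a hypothetical parameter.

```python
# Illustrative re-encoding cost of editing frame `edit_idx` under
# three referencing structures (not the paper's actual model).

def frames_to_reencode(edit_idx: int, n_frames: int, structure: str, window: int = 2) -> int:
    if structure == "intra_only":   # every frame coded independently
        return 1
    if structure == "ipppp":        # chained P-frames: all later frames depend on the edit
        return n_frames - edit_idx
    if structure == "uld":          # short, bounded dependency window (hypothetical size)
        return min(window + 1, n_frames - edit_idx)
    raise ValueError(structure)
```

Editing frame 10 of a 100-frame clip forces 90 re-encodes in a plain P-frame chain but only a handful under a bounded-window structure, which is the intuition behind ULD's editing speedup.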

Feasibility to Expand Complex Wards for Efficient Hospital Management and Quality Improvement

  • CHOI, Eun-Mee;JUNG, Yong-Sik;KWON, Lee-Seung;KO, Sang-Kyun;LEE, Jae-Young;KIM, Myeong-Jong
    • 산경연구논집
    • /
    • Vol. 11, No. 12
    • /
    • pp.7-15
    • /
    • 2020
  • Purpose: This study explores the feasibility of expanding complex wards to provide efficient hospital management and high-quality medical services to local residents of Gangneung Medical Center (GMC). Research Design, Data and Methodology: Four research designs were used to achieve the research objectives. We analyzed three months of big data from social network services (SNS); a questionnaire survey was conducted on 219 patients visiting GMC; surveys were administered to 20 GMC employees; and the feasibility of expanding the GMC ward was assessed through focus group interviews with 12 internal and external experts. The survey data were analyzed with data mining techniques, frequency analysis, and Importance-Performance Analysis, using the IBM SPSS statistical package for data processing. Results: The big data analysis showed that GMC is well recognized on SNS. 95.9% of the residents and 100.0% of the employees stated a need for the complex ward extension. In the analysis of expert opinion, among GMC's future functions, specialized care (△3.3) and public medicine (△1.4) increased significantly. Conclusion: GMC's complex ward extension is an urgent and indispensable project for providing efficient hospital management and service quality.

유황분석과 수질변화 평가를 통한 비점오염원 관리대상지역 선정방법 연구 (Watershed Selection for Diffuse Pollution Management Based on Flow Regime Alteration and Water Quality Variation Analysis)

  • 정우혁;이상진;김건하;정상만
    • 한국물환경학회지
    • /
    • Vol. 27, No. 2
    • /
    • pp.228-234
    • /
    • 2011
  • The goal of water quality management for streams and watersheds should be water quality itself rather than discharged-load management; load management is not the goal but a means of implementing total maximum daily loads. To select a watershed where non-point-source stormwater runoff (NPSSR) must be managed for water quality improvement, the relation between NPSSR and water quality needs to be estimated. To evaluate the effects of NPSSR on stream water quality, we compared water quality in dry and wet seasons using flow duration curve analysis based on flow-rate variation data from field surveys. We quantified the variation characteristics of water quality and estimated the inflow characteristics of pollution sources through water quality and flow-rate monitoring of 10 watersheds, performing regression analysis of water quality against flow rate separately for the high-flow and low-flow data. The analysis of the 10 watersheds shows that the Nonsan and Ganggyeong streams were polluted mainly by NPSSR, while point sources were more important than NPSSR in the other eight streams; the NPSSR-dominated streams showed a wide variation range and a high average concentration of $BOD_5$. Water quality variation was quantified by comparing the high-flow and low-flow estimates using cv1 in the wet season and cv365 in the dry season, and this method can be used as an indicator of water quality variation with flow rate.
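The flow duration curve underlying the cv1/cv365 indices is simply the daily flows sorted in descending order against their exceedance probability. A minimal sketch, assuming one year of daily flow data and the standard Weibull plotting position (the function names are illustrative, not from the paper):

```python
# Minimal flow-duration-curve sketch.
import numpy as np

def flow_duration(flows: np.ndarray):
    """Return (exceedance probability, sorted flows) for daily flow data."""
    q = np.sort(flows)[::-1]                      # descending flows
    p = np.arange(1, len(q) + 1) / (len(q) + 1)   # Weibull plotting position
    return p, q

def exceeded_flow(flows: np.ndarray, percent: float) -> float:
    """Flow that is equalled or exceeded `percent` of the time."""
    p, q = flow_duration(flows)
    return float(np.interp(percent / 100.0, p, q))

daily = np.arange(1.0, 366.0)    # toy record: flows 1..365 m^3/s over a year
q50 = exceeded_flow(daily, 50)   # median-exceedance flow of the toy record
```

Wet-season (rarely exceeded) and dry-season (almost always exceeded) flows read off this curve are what the abstract's comparison of high-flow and low-flow water quality estimates is anchored to.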

시계열 네트워크분석을 통한 데이터품질 연구경향 및 산업연관 분석 (Trend of Research and Industry-Related Analysis in Data Quality Using Time Series Network Analysis)

  • 장경애;이광석;김우제
    • 정보처리학회논문지:소프트웨어 및 데이터공학
    • /
    • Vol. 5, No. 6
    • /
    • pp.295-306
    • /
    • 2016
  • This study analyzes research trends using the meta-information of prior studies on data quality and, through this, aims to forecast trends in industry. Attempts to analyze research trends have been made in various fields, but the scope of data quality is so broad that analyzing the prior literature has been difficult. We collected ten years of research metadata indexed in the Web of Science and performed a time-series network analysis using text mining and social network analysis techniques. The topic analysis shows that the share of research in mathematical and computational biology, chemistry, health care sciences and services, biochemistry and molecular biology, operations research and management science, and medical informatics is decreasing, while the share in environment, water resources, geology, and instruments and instrumentation is increasing. The social network analysis shows that analysis, algorithms, and networks are important high-centrality topics in data quality research, and that images, models, sensors, and optimization are emerging as important topics. The industry-relatedness analysis shows that technology, industry, health, utilities, and customer service are the industries most closely related to data quality. The results should be useful not only to data quality researchers analyzing research patterns and industry linkages but also to industry.
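The centrality finding above comes from a keyword co-occurrence network: keywords appearing in the same paper are linked, and a keyword's centrality grows with its distinct neighbours. A toy sketch of that construction follows; the papers and keywords are invented for illustration, not the study's Web of Science data.

```python
# Hedged sketch of a keyword co-occurrence network with degree centrality.
from collections import Counter, defaultdict
from itertools import combinations

papers = [
    ["analysis", "algorithm", "network"],
    ["analysis", "sensor"],
    ["analysis", "network", "optimization"],
]

# Edges: pairs of keywords that co-occur in the same paper.
edges = Counter()
for kw in papers:
    for a, b in combinations(sorted(set(kw)), 2):
        edges[(a, b)] += 1

# Degree centrality here: number of distinct neighbours per keyword.
neigh = defaultdict(set)
for a, b in edges:
    neigh[a].add(b)
    neigh[b].add(a)
degree = {k: len(v) for k, v in neigh.items()}
```

Repeating this per publication year, as a time series, is what lets the study track topics like "sensor" or "optimization" rising in centrality over the decade.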

A Pilot Study of the Scanning Beam Quality Assurance Using Machine Log Files in Proton Beam Therapy

  • Chung, Kwangzoo
    • 한국의학물리학회지:의학물리
    • /
    • Vol. 28, No. 3
    • /
    • pp.129-133
    • /
    • 2017
  • The machine log files recorded by the scanning control unit of a proton beam therapy system were studied for use as a quality assurance method for scanning beam deliveries. The accuracy of the data in the log files was evaluated with a standard calibration beam scan pattern. The proton beam scan pattern was delivered onto a Gafchromic film located at the isocenter plane of the proton beam treatment nozzle and was found to agree within ±1.0 mm. The machine data accumulated from scanning beam proton therapy of five different cases were analyzed with a statistical method to estimate any systematic error in the data. The high-precision scanning beam log files of the line scanning proton therapy system were validated for off-line scanning beam monitoring and thus as a patient-specific quality assurance method. Using the machine log files for patient-specific quality assurance would simplify the quality assurance procedure while providing accurate scanning beam data.
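A log-file-based QA check of the kind described reduces, at its core, to comparing logged spot positions against the planned ones and flagging deviations beyond a tolerance. The sketch below is an illustration only: the data structures are hypothetical, and the ±1.0 mm tolerance merely echoes the film-measurement agreement quoted in the abstract.

```python
# Illustrative comparison of planned vs. logged scanning-spot positions.

def flag_deviations(planned, logged, tol_mm=1.0):
    """Return indices of spots whose logged (x, y) deviates beyond tol_mm."""
    bad = []
    for i, ((px, py), (lx, ly)) in enumerate(zip(planned, logged)):
        if ((px - lx) ** 2 + (py - ly) ** 2) ** 0.5 > tol_mm:
            bad.append(i)
    return bad

planned = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]   # plan spot positions (mm)
logged = [(0.2, 0.1), (10.0, 1.5), (20.3, -0.2)]   # positions from the log file
```

Run over every field of a treatment, an empty flag list becomes a per-patient pass criterion, which is how log-based QA can replace part of the measurement-based procedure.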