• Title/Summary/Keyword: Classification of Quality

1,574 search results

A Study on the Analysis of Current Issues and the Operation Plan of News Media Asset Management System in Korean Broadcasting Companies: the Case Study of KBS Digital Newsroom (방송사 보도영상관리시스템 운영 현황분석과 개선안 연구 - KBS 디지털뉴스룸 사례를 중심으로 -)

  • Choi, Hyo-jin;Park, Choonwon;Kim, Sooyoung;Song, Jeonga;Park, Yeajin;Shin, Bongseung;Ji, Sunho;Sun, Sangwon
    • Journal of the Korean BIBLIA Society for library and Information Science / v.33 no.3 / pp.123-155 / 2022
  • This study focuses on the management of news production systems in broadcasting companies. It examines the processes of data registration and metadata management to determine whether currently produced news can hold long-term value as a 'public record' and whether reliable, accurate information is being preserved. In addition, the user experience of the current system is analyzed through in-depth interviews with ingest managers, editors, and archive managers, the members of the News Department most closely involved in metadata creation. Finally, a sustainable metadata quality management method is sought to increase the value of news footage as a 'public record'. The study finds that metadata for news footage is entered manually, varying with each user's intent and working style; in other words, a user-friendly metadata input system is lacking. As a result, the quality of news video metadata continues to deteriorate. To overcome this, improvements to workflow, the system, and the classification scheme and metadata are clearly needed in both the short and long term.

Clinical Microscopy: Performance, Maintenance and Laser Safety (임상에서의 현미경: 작동, 유지보수 및 레이저 안전)

  • Lee, Tae Bok
    • Korean Journal of Clinical Laboratory Science / v.51 no.2 / pp.125-133 / 2019
  • A microscope is the fundamental research and diagnostic apparatus for clinical investigation of signal transduction, morphological changes, and physiological tracking of cells and intact tissues from patients in biomedical laboratory science. Proper use, care, and maintenance of the microscope, grounded in a thorough understanding of its mechanism, are essential for reliable image data and accurate diagnostic interpretation in the clinical laboratory. The standard operating procedure (SOP) for light microscopes covers operating procedures and brief information on all mechanical parts of the microscope, with a systematic troubleshooting scheme matched to laboratory capacity. The maintenance program encompasses cleaning of the objective, ocular lenses, and inner optics; replacement and calibration of the light source; XY sample stage management; point spread function (PSF) measurement for confocal laser scanning microscopes (CLSM); a quality control (QC) program for fluorescence microscopy; and systematic troubleshooting. Laser safety is one of the chief concerns for medical technologists working in a CLSM laboratory; laser safety guidelines based on laser classification and risk level, together with recommended lab wear for CLSM users, are also detailed in this overview. Since acquired image data presents a wide range of information at the moment of acquisition, well-maintained microscopes under a proper maintenance program are indispensable for interpretation and diagnosis in the clinical laboratory.

The Model of Appraisal Method on Authentic Records (전자기록의 진본 평가 시스템 모형 연구)

  • Kim, Ik-Han
    • The Korean Journal of Archival Studies / no.14 / pp.91-117 / 2006
  • Electronic records must be appraised for authenticity as well as for value. There has been extensive discussion of how records should be appraised for their value, but little of how electronic records should be appraised for their authenticity. This article therefore models specific authenticity appraisal methods and shows the stages at which each should or may be applied. At the Ingest stage, three checks are essential: integrity verification immediately after creation in the producing organization, quality and integrity verification of transferred records in the receiving organization, and an integrity check between the SIP and the AIP in the organization that receives and preserves the records. At the Preservation stage, what is needed is an integrity check between copies of the same AIP stored separately on different media, along with validation of whether records have been damaged and recovery of any damaged records. At the various Processing stages, the requirements are a suitability evaluation after changes to a record's management control metadata or classification, an integrity check after records migration, and periodic validation and integrity verification of DIPs. For these activities, appraisal methods including integrity verification, content consistency checks, suitability evaluation of a record's metadata, detection of unauthorized updates, and physical status validation should be applied throughout the electronic records management process.
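The Ingest-stage integrity check described here, comparing the preserved AIP bitstream against the submitted SIP, is typically implemented with cryptographic checksums. A minimal sketch in Python, assuming SHA-256 as the fingerprint algorithm (the abstract does not name one):

```python
import hashlib

def fingerprint(payload: bytes) -> str:
    """Compute a SHA-256 digest to serve as an integrity fingerprint."""
    return hashlib.sha256(payload).hexdigest()

def verify_sip_aip(sip_payload: bytes, aip_payload: bytes) -> bool:
    """Integrity check between a SIP and the AIP derived from it:
    the preserved bitstream must be bit-identical to the submitted one."""
    return fingerprint(sip_payload) == fingerprint(aip_payload)

record = b"news bulletin, 2006-05-01"
print(verify_sip_aip(record, record))         # intact copy: True
print(verify_sip_aip(record, record + b"!"))  # damaged copy: False
```

The same comparison applies at the Preservation stage between copies of one AIP stored on different media, and at Processing when verifying DIPs periodically.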

The Determination of Risk Group and Severity by Traffic Accidents Types - Focusing on Seoul City - (교통사고 위험그룹 및 사고유형별 심각도 결정 연구 - 서울시 중심 -)

  • Shim, Kywan-Bho
    • International Journal of Highway Engineering / v.11 no.2 / pp.195-203 / 2009
  • This research examined the relationship between traffic accident types, driver characteristics, and accident severity. Accidents were divided into eight types, and drivers were classified into risk groups by sex, vehicle type, and age; seat-belt use was added to the analysis of injury severity to secure objectivity. Log-linear and logit models were used to analyze the categorical data. In the relationship between accident type and severity, head-on collisions, overtaking accidents, and right-turn accidents showed a high likelihood of serious injury or death. In the risk-group analysis, motorcycle drivers under 20 and taxi drivers aged 41 to 50 proved very dangerous, and women emerged as a higher-risk group than men for cars, mini-buses, and goods vehicles. Therefore, to reduce the loss of life in head-on, overtaking, and right-turn accidents, traffic safety education and enforcement targeted at these risk groups should be studied, and protection for road users vulnerable to traffic accidents must be strengthened.
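The logit model used in the severity analysis relates the probability of a severe (injury or death) accident to driver and accident attributes through the logistic function. A minimal sketch with hypothetical coefficients, since the paper's estimated values are not given in the abstract:

```python
import math

def severe_probability(intercept: float, coefs: list, x: list) -> float:
    """Logit model: P(severe) = 1 / (1 + exp(-(b0 + b.x)))."""
    z = intercept + sum(b * xi for b, xi in zip(coefs, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for three binary indicators:
# [head-on collision, motorcycle, driver under 20]
coefs = [1.2, 0.9, 0.7]
low_risk  = severe_probability(-2.0, coefs, [0, 0, 0])
high_risk = severe_probability(-2.0, coefs, [1, 1, 1])
print(low_risk < high_risk)  # risk factors raise the severity probability: True
```

In practice the coefficients would be estimated by maximum likelihood from the categorized accident records.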


Denoising Self-Attention Network for Mixed-type Data Imputation (혼합형 데이터 보간을 위한 디노이징 셀프 어텐션 네트워크)

  • Lee, Do-Hoon;Kim, Han-Joon;Chun, Joonghoon
    • The Journal of the Korea Contents Association / v.21 no.11 / pp.135-144 / 2021
  • Recently, data-driven decision-making has become a key technology leading the data industry, and the machine learning behind it requires high-quality training datasets. However, real-world data contains missing values for various reasons, which degrades the performance of prediction models learned from such impaired training data. To build high-performance models from real-world datasets, many studies have therefore investigated automatically imputing missing values in the initial training data. Many conventional machine-learning-based imputation techniques are time-consuming and cumbersome because they apply only to numeric columns or build a separate predictive model for each column. This paper proposes a new data imputation technique, the 'Denoising Self-Attention Network (DSAN)', which can be applied to mixed-type datasets containing both numerical and categorical columns. DSAN learns robust feature representation vectors by combining self-attention and denoising techniques, and can impute multiple missing variables in parallel through multi-task learning. To verify the validity of the proposed technique, data imputation experiments were performed after arbitrarily generating missing values in several mixed-type training datasets. We then demonstrate its validity by comparing the performance of binary classification models trained on the imputed data, together with the errors between the original and imputed values.
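The evaluation protocol the abstract describes, blanking out values in a mixed-type table, imputing them, then comparing against the originals, can be sketched with a simple mean/mode baseline. This stands in for DSAN itself, whose attention layers are beyond an abstract-sized example:

```python
import statistics

def impute_mean_mode(rows):
    """Fill None cells column-wise: mean for numeric columns, mode for
    categorical columns (a simple baseline, not DSAN itself)."""
    columns = list(zip(*rows))
    filled = []
    for col in columns:
        observed = [v for v in col if v is not None]
        if all(isinstance(v, (int, float)) for v in observed):
            fill = statistics.fmean(observed)   # numeric column
        else:
            fill = statistics.mode(observed)    # categorical column
        filled.append([v if v is not None else fill for v in col])
    return [list(row) for row in zip(*filled)]

# Mixed-type table with arbitrarily injected missing values (None)
masked = [[1.0, "a"], [None, "a"], [3.0, None], [4.0, "b"]]
completed = impute_mean_mode(masked)
print(completed[1][0])  # mean of the observed numeric values
print(completed[2][1])  # mode of the observed categorical values
```

DSAN replaces the per-column mean/mode estimates with predictions from one shared network, so all columns are imputed in parallel from a joint feature representation.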

National GIS Standards: Contents and Future Directions (국가 GIS 표준의 내용과 표준화 방향)

  • Jang, Sung-Gheel;Kim, Tschang-Ho
    • Journal of Korea Spatial Information System Society / v.1 no.2 s.2 / pp.99-113 / 1999
  • The role of a GIS as a tool for a national information infrastructure can best be fulfilled once GIS standards are implemented. In this paper, we identify the contents of GIS standards in other countries and the future direction a nation's GIS standards should take. Based on a detailed review of GIS standards in the USA, Australia, Japan, and the United Kingdom, we conclude the following: (1) a nation's GIS standards should include both geographic information content standards and geographic information service standards; (2) a nation's GIS standards should be a profile of the ISO GIS standards; (3) each GIS standard should be developed on the basis of the Entity-Relationship Model using the Unified Modeling Language; and (4) GIS experts should pay much more attention to GIS service standardization. For building the national GIS standards of Korea, we recommend that GIS content standards and GIS service standards be developed simultaneously. The content standards comprise a geographic feature content standard, a feature classification standard, a portrayal standard, rules for application standards, a spatial reference model, and terminology. The service standards comprise standards for data sharing, such as a metadata standard and a transfer standard, as well as a quality standard, quality principles, and portrayal standards.


Intelligent VOC Analyzing System Using Opinion Mining (오피니언 마이닝을 이용한 지능형 VOC 분석시스템)

  • Kim, Yoosin;Jeong, Seung Ryul
    • Journal of Intelligence and Information Systems / v.19 no.3 / pp.113-125 / 2013
  • Every company wants to know its customers' requirements and strives to meet them; communication between customer and company has therefore become a core business competency whose importance keeps growing. Among the several ways of identifying customer needs, VOC (Voice of Customer) is one of the most powerful communication tools, and VOC gathered through channels such as telephone, post, e-mail, and websites is highly meaningful. Accordingly, most companies collect VOC and operate VOC systems. VOC matters not only to businesses but also to public organizations, such as government, educational institutes, and medical centers, that must raise public service quality and customer satisfaction; they likewise build VOC gathering and analysis systems and use them to create and improve products and services. In recent years, innovations in the internet and ICT have created diverse channels, including SNS, mobile, websites, and call centers, for collecting VOC data. Although a great deal of VOC data is collected through these channels, proper utilization remains difficult: the data consists of highly emotional voice recordings or informal text, and its volume is large. Such unstructured big data is difficult for humans to store and analyze, so organizations need systems that automatically collect, store, classify, and analyze it. This study proposes an intelligent VOC analysis system based on opinion mining that classifies unstructured VOC data automatically and determines its polarity as well as its type. The basis of the system, a domain-oriented sentiment dictionary, is constructed, and the corresponding stages are presented in detail. An experiment was conducted with 4,300 VOC records collected from a medical website to measure the effectiveness of the proposed system; the records were used to develop the sentiment dictionary by identifying domain-specific sentiment vocabulary and its polarity values. The experiment showed that positive terms such as "칭찬" (praise), "친절함" (kindness), "감사" (thanks), "무사히" (safely), "잘해" (well done), "감동" (being moved), and "미소" (smile) carry high positive opinion values, while negative terms such as "퉁명" (curtness), "뭡니까" (what is this?), "말하더군요" (so they said), and "무시하는" (dismissive) carry strongly negative ones. These terms are in general use, and the results suggest a high probability of correct polarity assignment. Furthermore, the classification accuracy of the proposed VOC model was compared across settings, and the highest accuracy, 77.8%, was obtained at an opinion-classification threshold of -0.50. With the proposed system, real-time opinion classification and VOC response priority can be predicted. The expected benefit is catching customer complaints early and handling them quickly with fewer staff operating the VOC system, freeing up the time and human resources of the customer service department. Above all, this study is a new attempt to analyze unstructured VOC data automatically using opinion mining, and it shows that such a system can classify the positive or negative polarity of VOC opinions. It suggests a practical framework for VOC analysis across diverse uses, and the model could serve as a real VOC analysis system if implemented. Despite these results and expectations, the study has several limitations. First, the sample data was collected from a single hospital website, so the sentiment dictionary built from it may lean heavily toward that hospital and site. Future research should therefore cover additional channels, such as call centers and SNS, and other domains such as government, financial companies, and educational institutes.
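The dictionary-based classification at the heart of the system can be sketched as follows. The lexicon entries and weights here are illustrative stand-ins; only the -0.50 decision threshold is taken from the abstract:

```python
# Illustrative domain sentiment dictionary (term -> polarity weight);
# the paper's actual dictionary is built from medical-domain VOC data.
LEXICON = {
    "thanks": 0.9, "kindness": 0.8, "smile": 0.7,
    "curt": -0.9, "dismissive": -0.8,
}

def classify_voc(tokens, threshold=-0.50):
    """Sum dictionary weights over the tokens; a VOC record whose score
    falls below the threshold is classified as a negative opinion."""
    score = sum(LEXICON.get(t, 0.0) for t in tokens)
    label = "negative" if score < threshold else "positive"
    return score, label

print(classify_voc(["the", "staff", "was", "curt", "and", "dismissive"]))
print(classify_voc(["thanks", "for", "the", "kindness"]))
```

Sweeping the threshold over held-out labeled VOC records is how a value like -0.50 would be selected in practice.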

Study on Change of Algae Occurrence Before & After Gangcheon and Ipoh Weir Construction at Namhan River (남한강 강천보와 이포보 건설 전·후 조류 발생의 변화에 대한 연구)

  • Chae, Soo-Kwon;Oh, Seung-Eun;Chun, Seung-Hoon;Ahn, Hong-Kyu
    • Journal of Wetlands Research / v.18 no.4 / pp.394-403 / 2016
  • This study verified changes in, and relationships between, chlorophyll-a concentration and environmental factors, including weather, water quality, and discharge, before and after construction of the Gangcheon and Ipoh weirs on the Namhan river, based on weather and water quality data from the monitoring network. The periods before and after weir construction were classified by cluster analysis using Ward's method, and the factors influencing algae occurrence (chlorophyll-a) were identified through correlation analysis between chlorophyll-a concentration and the environmental factors. Cluster analysis of 12 factors (water temperature, rainfall, daylight, pH, DO, BOD, COD, T-N, $NH_3-N$, $NO_3-N$, T-P, $PO_4-P$) from 2005 to 2015 indicated a clear division into two periods: before (2006-2007) and after (2012-2013) weir construction. After construction, the BOD class at the Gangcheon weir improved from class II to class Ia, and likewise the BOD class at the Ipoh weir improved from class II-III to class Ia-II. T-P and T-N concentrations also generally improved, and the chlorophyll-a concentration decreased after construction of both weirs. However, the frequency of algae warnings increased from 9 to 15 after the Ipoh weir was built, owing to increases in hydraulic residence time (HRT) and water temperature. After weir construction, the correlation analysis at the Gangcheon weir showed positive correlations with chlorophyll-a in the order BOD (0.579) > COD (0.413) > water temperature (0.237), and negative correlations in the order $NO_3-N$ (-0.344) > T-N (-0.293). At the Ipoh weir, the positive correlations were BOD (0.795) > pH (0.581) > water temperature (0.422), and the negative correlations were $NO_3-N$ (-0.457) > T-N (-0.371) > $NH_3-N$ (-0.326) > $PO_4-P$ (-0.288) > discharge (-0.213). Although water quality generally improved after the Ipoh weir was built, the increased frequency of algae warnings was driven by changes in hydraulic conditions, such as reduced velocity and increased HRT and water temperature, which were influenced more strongly by the altered stream flow than by changes in the water environment after construction.
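The correlation rankings above come from plain Pearson correlation between the chlorophyll-a series and each environmental factor. A minimal sketch, using made-up sample series rather than the study's monitoring data:

```python
import math
import statistics

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up series standing in for chlorophyll-a and BOD measurements
chlorophyll = [2.0, 3.5, 5.0, 4.0, 6.5]
bod         = [1.1, 1.6, 2.2, 1.9, 2.8]
print(round(pearson(chlorophyll, bod), 3))  # strongly positive, as BOD tracks chlorophyll-a here
```

Running this coefficient over each of the 12 factors against chlorophyll-a, separately for the pre- and post-construction clusters, yields orderings like those reported for the two weirs.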

Development of Deep Learning Structure to Improve Quality of Polygonal Containers (다각형 용기의 품질 향상을 위한 딥러닝 구조 개발)

  • Yoon, Suk-Moon;Lee, Seung-Ho
    • Journal of IKEEE / v.25 no.3 / pp.493-500 / 2021
  • In this paper, we propose a deep learning structure to improve the quality of polygonal containers. The structure consists of convolution layers, a bottleneck layer, fully connected layers, and a softmax layer. A convolution layer obtains a feature image by applying several 3x3 convolution filters to the input image or to the feature image of the previous layer. The bottleneck layer selects only the optimal features from the feature image extracted by the convolution layers, reducing the channels with a 1x1 convolution plus ReLU and then performing a 3x3 convolution plus ReLU. A global average pooling operation after the bottleneck layer reduces the size of the feature image while retaining only those optimal features. The output is then produced through six fully connected layers, and the softmax layer converts the resulting node values into values between 0 and 1 through its activation function. After training, the recognition stage classifies non-circular glass bottles by acquiring an image with a camera, detecting the position, and classifying the bottle with the trained deep learning model, as in the training stage. To evaluate the performance of the proposed structure, an experiment at an authorized testing institute measured a good/defective discrimination accuracy of 99%, on par with the world's best. The inspection time averaged 1.7 seconds, within the cycle-time standards of production processes that use non-circular machine vision systems. The effectiveness of the proposed deep learning structure for improving the quality of polygonal containers was thereby demonstrated.
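The softmax layer's conversion of final-layer node values into values between 0 and 1 is the standard softmax function. A minimal, numerically stable sketch (the three raw scores below are illustrative, not the network's actual outputs):

```python
import math

def softmax(logits):
    """Numerically stable softmax: shift by the max logit, exponentiate,
    and normalize so the outputs lie in (0, 1) and sum to 1."""
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 0.5, -1.0]        # raw outputs of the final fully connected layer
probs = softmax(scores)
print(sum(probs))                # 1 up to floating-point error
print(probs.index(max(probs)))   # the largest logit keeps the largest probability
```

Subtracting the maximum logit before exponentiating changes nothing mathematically but prevents overflow for large scores.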

PM2.5 Simulations for the Seoul Metropolitan Area: (II) Estimation of Self-Contributions and Emission-to-PM2.5 Conversion Rates for Each Source Category (수도권 초미세먼지 농도모사 : (II) 오염원별, 배출물질별 자체 기여도 및 전환율 산정)

  • Kim, Soontae;Bae, Changhan;Yoo, Chul;Kim, Byeong-Uk;Kim, Hyun Cheol;Moon, Nankyoung
    • Journal of Korean Society for Atmospheric Environment / v.33 no.4 / pp.377-392 / 2017
  • A set of BFM (Brute Force Method) simulations with the CMAQ (Community Multiscale Air Quality) model were conducted in order to estimate self-contributions and conversion rates of PPM (Primary $PM_{2.5}$), $NO_x$, $SO_2$, $NH_3$, and VOC emissions to $PM_{2.5}$ concentrations over the SMA (Seoul Metropolitan Area). CAPSS (Clean Air Policy Support System) 2013 EI (emissions inventory) from the NIER (National Institute of Environmental Research) was used for the base and sensitivity simulations. SCCs (Source Classification Codes) in the EI were utilized to group the emissions into area, mobile, and point source categories. PPM and $PM_{2.5}$ precursor emissions from each source category were reduced by 50%. In turn, air quality was simulated with CMAQ during January, April, July, and October in 2014 for the BFM runs. In this study, seasonal variations of SMA $PM_{2.5}$ self-sensitivities to PPM, $SO_2$, and $NH_3$ emissions can be observed even when the seasonal emission rates are almost identical. For example, when the mobile PPM emissions from the SMA were 634 TPM (Tons Per Month) and 603 TPM in January and July, self-contributions of the emissions to monthly mean $PM_{2.5}$ were $2.7{\mu}g/m^3$ and $1.3{\mu}g/m^3$ for the months, respectively. Similarly, while $NH_3$ emissions from area sources were 4,169 TPM and 3,951 TPM in January and July, the self-contributions to monthly mean $PM_{2.5}$ for the months were $2.0{\mu}g/m^3$ and $4.4{\mu}g/m^3$, respectively. Meanwhile, emission-to-$PM_{2.5}$ conversion rates of precursors vary among source categories. For instance, the annual mean conversion rates of the SMA mobile, area, and point sources were 19.3, 10.8, and $6.6{\mu}g/m^3/10^6TPY$ for $SO_2$ emissions while those rates for PPM emissions were 268.6, 207.7, and 181.5 (${\mu}g/m^3/10^6TPY$), respectively, over the region. 
The results demonstrate that SMA $PM_{2.5}$ responses to the same amount of precursor emission reduction differ by source category and over time (e.g. by season), which is important when cost-benefit analysis is conducted during air quality improvement planning. On the other hand, the annual mean $PM_{2.5}$ sensitivity to SMA $NO_x$ emissions remains negative even after a 50% reduction in that emission category, which implies that more aggressive $NO_x$ reductions are required for the SMA to overcome the '$NO_x$ disbenefit' of the base condition.
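The emission-to-$PM_{2.5}$ conversion rate reported above is simply the BFM concentration response divided by the size of the emission perturbation, expressed per $10^6$ tons per year. A sketch of the arithmetic, with hypothetical $\Delta PM_{2.5}$ and emission numbers rather than the study's model outputs:

```python
def conversion_rate(delta_pm25_ugm3: float, delta_emission_tpy: float) -> float:
    """Emission-to-PM2.5 conversion rate in ug/m3 per 10^6 tons/year,
    derived from a brute-force (e.g. 50% cut) sensitivity run."""
    return delta_pm25_ugm3 / (delta_emission_tpy / 1e6)

# Hypothetical BFM run: a 50% cut of an 8,000 TPY source removes 4,000 TPY
# and lowers the simulated mean PM2.5 by 1.2 ug/m3.
rate = conversion_rate(1.2, 8000 * 0.5)
print(rate)  # 300.0 ug/m3 per 10^6 TPY
```

Comparing such rates across source categories, as the paper does for mobile, area, and point sources, shows which category yields the most $PM_{2.5}$ benefit per ton of emissions reduced.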