• Title/Summary/Keyword: Performance standards

A Study on Analysis of Research Trends and Intellectual Structure in the Overseas Cataloging Research (해외 목록학 연구동향 및 지적구조 분석)

  • Ji Won Lee;Sung Sook Lee
    • Journal of the Korean Society for Information Management
    • /
    • v.41 no.1
    • /
    • pp.367-387
    • /
    • 2024
  • This study aims to identify the recent trends and intellectual structure of international research in the field of cataloging, which is undergoing major change due to the enactment of new standards and rules and anticipated future developments. For this purpose, we collected 680 articles published in the 14 years since 2010 and analyzed 1,942 author keywords extracted from them after preprocessing. The main findings are as follows. First, overseas cataloging research has seen notable growth since 2017. Second, the most frequent research topics were: cataloging, metadata, RDA, university libraries, authority control, linked data, FRBR, catalog, LCSH, libraries, and online cataloging. Third, the research themes were divided into two clusters, one related to the traditional aspects of library cataloging and the other related to the more recently discussed topics of authority control, cooperative cataloging, RDA, and linked data; these were further subdivided into 14 subclusters. Fourth, we examined the growth index and standard performance index of the 14 keyword clusters and found that all but one cluster showed growth in terms of disciplinary development. This study is significant in that it can serve as a basis for predicting the future development of cataloging for Korean academia and practice, and for related education.

Evaluation method for interoperability of weapon systems applying natural language processing techniques (자연어처리 기법을 적용한 무기체계의 상호운용성 평가방법)

  • Yong-Gyun Kim;Dong-Hyen Lee
    • Journal of The Korean Institute of Defense Technology
    • /
    • v.5 no.3
    • /
    • pp.8-17
    • /
    • 2023
  • Current weapon systems operate as complex systems with various standards and protocols applied, so there is a risk of failure in smooth information exchange during combined and joint operations on the battlefield. Interoperability between weapon systems, which enables precise strikes on key targets through rapid situational judgment, is a key element in the conduct of war. Since these systems entered service in the Korean military, there has been a continuing need to change configurations and improve the performance of a large amount of software and hardware, but there is no verification system for the impact on interoperability, and no related test tools or facilities exist. In addition, during combined and joint training, errors frequently occur when the detailed operating procedures and software of weapon/power support systems are changed arbitrarily and then used. Therefore, periodic verification of interoperability between weapon systems is necessary. Rather than having people schedule an evaluation period and conduct the evaluation once, AI should continuously evaluate the interoperability between weapon and power support systems 24 hours a day to advance warfighting capabilities. To this end, preliminary research was conducted to improve defense interoperability capabilities by applying natural language processing techniques (① the Word2Vec model, ② the FastText model, ③ the Swivel model), using published algorithms and source code. Based on the results of this experiment, we present a methodology (automated evaluation of interoperability requirements and level measurement through natural language processing models) for implementing an automated defense interoperability evaluation tool that does not rely on humans.
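
The abstract names Word2Vec, FastText, and Swivel as the embedding models compared. A minimal sketch of how requirement texts could be embedded and compared with gensim is shown below; Swivel has no gensim implementation, so only the first two are sketched. The tokenized sentences, hyperparameters, and similarity query are illustrative assumptions, not the authors' actual pipeline.

```python
# A minimal sketch (not the authors' pipeline): embedding tokenized
# interoperability-requirement sentences with gensim's Word2Vec and
# FastText, then comparing terms by cosine similarity.
from gensim.models import Word2Vec, FastText

# Hypothetical pre-tokenized requirement texts (illustrative only).
sentences = [
    ["message", "format", "shall", "conform", "to", "link16"],
    ["system", "shall", "exchange", "track", "data", "via", "link16"],
    ["interface", "shall", "support", "variable", "message", "format"],
]

w2v = Word2Vec(sentences, vector_size=100, window=5, min_count=1, epochs=50)
ft = FastText(sentences, vector_size=100, window=5, min_count=1, epochs=50)

# Cosine similarity between terms under each model.
print(w2v.wv.similarity("message", "track"))
print(ft.wv.similarity("message", "track"))
```

On a real corpus, similarity scores between requirement terms could feed a downstream matching step that flags requirement pairs for interoperability-level review.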

The Effect of Corporate Association on the Perceived Risk of the Product (소비자의 제품 지각 위험에 대한 기업연상과 효과: 지식과 관여의 조절적 역활을 중심으로)

  • Cho, Hyun-Chul;Kang, Suk-Hou;Kim, Jin-Yong
    • Journal of Global Scholars of Marketing Science
    • /
    • v.18 no.4
    • /
    • pp.1-32
    • /
    • 2008
  • Brown and Dacin (1997) investigated the relationship between corporate associations and product evaluations. Their study focused on the effects of associations with a company's corporate ability (CA) and its corporate social responsibility (CSR) on consumers' product evaluations, and found that both CA and CSR associations influenced product evaluation, with CA associations having a stronger effect. Brown and Dacin (1997) noted, however, that there is little research on how corporate associations affect product responses. Accordingly, some researchers have identified variables that moderate or mediate the relationship between corporate associations and product responses. In particular, a few studies have tested the influence of reputation on product-relevant perceived risk, but the effects of the two types of corporate association on product-relevant perceived risk had not yet been identified. The primary goal of this article is to identify and empirically examine variables that moderate the effects of CA and CSR associations on the perceived risk of the product. We adopt the concept of corporate associations proposed by Brown and Dacin (1997): CA associations are those related to the company's expertise in producing and delivering its outputs, and CSR associations reflect the organization's status and activities with respect to its perceived societal obligations. This study defines risk as the uncertainty or potential loss regarding the product or company that consumers assume in a particular purchase decision or after having purchased. Risk is classified into product-relevant performance risk and financial risk: performance risk is the possibility or consequence of a product not functioning at some expected level, and financial risk is the monetary loss one perceives to be incurring if a product does not function at some expected level. With regard to consumer knowledge, expert consumers have considerable experience with or knowledge of the product, whereas novice consumers do not. The model tested in this article is shown in Figure 1. The model indicates that both CA and CSR associations influence performance risk and financial risk, and that the effects of CA and CSR are moderated by product category knowledge (product knowledge) and product category involvement (product involvement). The relationships between corporate associations and product-relevant perceived risk are hypothesized in the following form: for example, Hypothesis 1a ($H_{1a}$) states that CA associations have a positive influence on consumers' performance risk. Hypotheses identifying variables that moderate the effects of the two types of corporate association on perceived risk are also laid down; one interaction-effect hypothesis, Hypothesis 3a ($H_{3a}$), states that consumers' product knowledge moderates the negative relationship between CA associations and product-relevant performance risk. A field experiment was conducted to examine our model. The company tested was not real but fictitious, to secure internal validity. Water purifiers were used as the product category.
Four scenarios describing the fictitious company were developed: Type A with both superior CA and CSR, Type B with superior CSR and inferior CA, Type C with superior CA and inferior CSR, and Type D with both inferior CA and CSR. Respondents were divided into four groups, and each respondent received a questionnaire containing one of the four scenarios (Type A, B, C, or D). Data were collected by means of a self-administered questionnaire from a convenience sample. A total of 300 respondents filled out the questionnaire, of which 207 were used for further analysis. Table 1 indicates that the scales in this study are reliable, with Cronbach's $\alpha$ coefficients ranging from 0.85 to 0.92. Composite reliability is in the range of 0.85 to 0.92, and average variance extracted is in the 0.72 to 0.98 range, above the base level of 0.6. As shown in Table 2, the values for CFI, NNFI, root-mean-square error of approximation (RMSEA), and standardized root-mean-square residual (SRMR) are acceptably close to the standards suggested by Hu and Bentler (1999): .95 for CFI and NNFI, .06 for RMSEA, and .08 for SRMR. We also tested discriminant validity following Fornell and Larcker (1981); as shown in Table 2, we found strong evidence of discriminant validity between each possible pair of latent constructs in all samples. Given that these overall goodness-of-fit indices were acceptable, that the model was developed on theoretical bases, and that consistency across samples was high, we proceeded with the previously defined scales. We used moderated hierarchical regression analysis to test the influence of corporate associations (CA and CSR) on product-relevant perceived risk (performance and financial risk) and to identify the variables moderating that relationship. The dependent variables are performance risk and financial risk, the independent variables are CA and CSR associations, and the moderating variables are product category knowledge and product category involvement. The results show, as expected, that CA associations have a statistically significant influence on the perceived risk of the product, whereas CSR associations do not. Product category knowledge and involvement moderate the relationship between CA associations and perceived risk; the effect of CSR associations on perceived risk, however, is not moderated by consumers' knowledge and involvement. Given this result, a company should communicate CA associations to its customers more than CSR associations so that customers perceive reduced risk. The main theoretical contribution of this research is that it replicates the differing effects on product evaluation of the two types of corporate association proposed by Brown and Dacin (1997) and Brown (1998). According to Hunter (2001), establishing the validity of a particular finding is important, and roughly ten studies are needed to draw a rigorous conclusion. A further contribution of this study is the finding that the effects of corporate associations on the perceived risk of the product vary with the moderator variables.
In particular, the moderating effect of knowledge on the relationship between corporate associations and product-relevant perceived risk had not previously been tested in Korea. As a managerial implication, we suggest that firms should stress their ability to manufacture products well (CA associations) more than the fulfillment of their social obligations (CSR associations). This study has several limitations that suggest directions for future research. The moderating effects of product category knowledge and involvement on the relationship between corporate associations and perceived risk need to be replicated. Future research could also explore whether perceived risk mediates the relationship between corporate associations and consumers' product purchases. In addition, using a real rather than a fictitious company will be needed to ensure the external validity of the study.
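
The abstract describes moderated hierarchical regression with interaction terms between the corporate associations and the moderators. A minimal sketch of how such a model could be estimated follows; the variable names, synthetic data, and the statsmodels formulation are assumptions for illustration, not the authors' actual analysis.

```python
# A minimal sketch (not the authors' analysis): moderated hierarchical
# regression of performance risk on CA/CSR associations, with product
# knowledge and involvement as moderators, using statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 207  # usable sample size reported in the abstract
df = pd.DataFrame({
    "ca": rng.normal(size=n),           # CA association (illustrative)
    "csr": rng.normal(size=n),          # CSR association (illustrative)
    "knowledge": rng.normal(size=n),    # product category knowledge
    "involvement": rng.normal(size=n),  # product category involvement
})
# Synthetic outcome: CA lowers performance risk, moderated by knowledge.
df["perf_risk"] = (-0.5 * df.ca - 0.3 * df.ca * df.knowledge
                   + rng.normal(scale=0.5, size=n))

# Step 1: main effects only; Step 2: add the interaction terms.
step1 = smf.ols("perf_risk ~ ca + csr + knowledge + involvement", df).fit()
step2 = smf.ols("perf_risk ~ (ca + csr) * (knowledge + involvement)", df).fit()
print(step2.summary())  # significant interaction terms indicate moderation
```

Comparing the fit of step 1 against step 2 is what makes the regression "hierarchical": moderation is supported when the interaction block significantly improves the model.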

The Prediction of Export Credit Guarantee Accident using Machine Learning (기계학습을 이용한 수출신용보증 사고예측)

  • Cho, Jaeyoung;Joo, Jihwan;Han, Ingoo
    • Journal of Intelligence and Information Systems
    • /
    • v.27 no.1
    • /
    • pp.83-102
    • /
    • 2021
  • The government recently announced various policies for developing the big-data and artificial intelligence fields, providing a great opportunity for the public through the disclosure of high-quality data held by public institutions. KSURE (Korea Trade Insurance Corporation) is a major public institution for financial policy in Korea and is strongly committed to backing export companies with various systems. Nevertheless, there are still few realized business models based on big-data analysis. Against this background, this paper aims to develop a new business model that enables ex-ante prediction of the likelihood of credit guarantee insurance accidents. We utilize internal data from KSURE, which supports export companies in Korea, apply machine learning models, and compare performance among the predictive models: Logistic Regression, Random Forest, XGBoost, LightGBM, and DNN (Deep Neural Network). For decades, researchers have tried to find better models for predicting bankruptcy, since ex-ante prediction is crucial for corporate managers, investors, creditors, and other stakeholders. Prediction of financial distress or bankruptcy originated with Smith (1930), Fitzpatrick (1932), and Merwin (1942). One of the most famous models is Altman's Z-score model (Altman, 1968), based on multiple discriminant analysis and still widely used in both research and practice; it uses five key financial ratios to predict the probability of bankruptcy within the next two years. Ohlson (1980) introduced a logit model to address some limitations of previous models, and Elmer and Borowski (1988) developed and examined a rule-based, automated system for the financial analysis of savings and loans. Since the 1980s, researchers in Korea have also examined the prediction of financial distress or bankruptcy: Kim (1987) analyzed financial ratios and developed a prediction model; Han et al. (1995, 1996, 1997, 2003, 2005, 2006) constructed prediction models using various techniques, including artificial neural networks; Yang (1996) introduced multiple discriminant analysis and a logit model; and Kim and Kim (2001) utilized artificial neural network techniques for ex-ante prediction of insolvent enterprises. Since then, many scholars have tried to predict financial distress or bankruptcy more precisely with models such as Random Forest and SVM. One major distinction of our research from previous work is that we examine the predicted probability of default for each sample case, rather than only the classification accuracy of each model on the entire sample. Most predictive models in this paper achieve classification accuracy of about 70% on the entire sample; specifically, the LightGBM model shows the highest accuracy at 71.1% and the logit model the lowest at 69%. However, these results are open to multiple interpretations. In this business context, emphasis must be placed on minimizing type 2 error, which causes the more harmful operating losses for the guaranty company. Thus, we also compare classification accuracy by splitting the predicted probability of default into ten equal intervals.
When we examine classification accuracy for each interval, the logit model has the highest accuracy, 100%, for the 0~10% interval of predicted default probability, but a relatively low accuracy of 61.5% for the 90~100% interval. In contrast, Random Forest, XGBoost, LightGBM, and DNN produce more desirable results: higher accuracy in both the 0~10% and 90~100% intervals, with lower accuracy around the 50% interval. Regarding the distribution of samples across intervals, both LightGBM and XGBoost place relatively many samples in the 0~10% and 90~100% intervals. Although Random Forest has an advantage in classification accuracy with a small number of cases, LightGBM or XGBoost may be more desirable models because they classify a large number of cases into the two extreme intervals, even allowing for their relatively low classification accuracy. Considering the importance of type 2 error and total prediction accuracy, XGBoost and DNN show superior performance, followed by Random Forest and LightGBM, while logistic regression performs worst. Each predictive model nonetheless has a comparative advantage under particular evaluation standards; for instance, the Random Forest model shows almost 100% accuracy for samples expected to have a high probability of default. Collectively, a more comprehensive ensemble that combines multiple classification models and conducts majority voting could maximize overall performance.
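
A minimal sketch of the interval-wise comparison the abstract describes: bin each model's predicted default probabilities into ten equal intervals and compute accuracy per bin. The synthetic data and model settings are illustrative assumptions, not KSURE's internal data or the authors' exact configuration.

```python
# A minimal sketch (illustrative data, not KSURE's): bin predicted
# default probabilities into ten equal intervals and compute the
# classification accuracy within each bin, as the paper describes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "logit": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    proba = model.predict_proba(X_te)[:, 1]        # predicted P(default)
    pred = (proba >= 0.5).astype(int)
    bins = np.minimum((proba * 10).astype(int), 9)  # 0~10%, ..., 90~100%
    print(name)
    for b in range(10):
        mask = bins == b
        if mask.any():
            acc = (pred[mask] == y_te[mask]).mean()
            print(f"  {b*10:3d}~{(b+1)*10}%: n={mask.sum():4d} acc={acc:.3f}")
```

Reporting the sample count per bin alongside accuracy mirrors the paper's point that a model concentrating many cases in the extreme bins can be preferable even at slightly lower overall accuracy.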

Problems and Improvement Measures of Private Consulting Firms Working on Rural Area Development (농촌지역개발 민간컨설팅회사의 실태와 개선방안)

  • Kim, Jung Tae
    • Journal of Agricultural Extension & Community Development
    • /
    • v.21 no.2
    • /
    • pp.1-28
    • /
    • 2014
  • Private consulting firms currently participating in bottom-up rural area development projects are involved in nearly all areas of rural area development, and the policy environment that emphasizes the bottom-up approach will further expand their participation. Assessments of private consulting firms, which started out with high expectations, are now turning rather negative. Expertise is the key issue in the controversy over private consulting firms, and existing analyses tend to locate the causes of the problems within the firms themselves. This study was conducted on the premise that the causes and structure of these problems lie in the policy promotion process, since government authorities are responsible for managing and supervising the implementation of policies rather than developing them. The current issues with consulting firms emerged because private consulting was hastily introduced on the back of a government policy trend without sufficient consideration, and because the policy environment demanded short-term outcomes even though the purpose of bottom-up rural area development lies in the ideology of endogenous development focused on changes in residents' perceptions. Research was conducted to determine how the problems of private consulting firms that emerged and were addressed in this context influenced the consulting market, using current data and the firms' business performance. For the analysis of types, firms were divided into three groups: top performers including market leaders (9), excellent performers (36), and average performers (34). An analysis of the correlation between the business performance of each type and managerial resources such as each firm's expertise revealed a correlation between human resources and regional development performance only among the excellent performers; none was found for the other types. These results imply that external factors other than a firm's capabilities (e.g., expertise) play a significant role in the standards for selecting private consulting firms. Thus, government authorities must reflect on their error of hastily adopting private consulting firms without sufficient consideration and must urgently establish response measures.

Optimization of Analytical Methods for Ochratoxin A and Zearalenone by UHPLC in Rice Straw Silage and Winter Forage Crops (UHPLC를 이용한 볏짚 사일리지와 동계사료작물의 오크라톡신과 제랄레논 분석법 최적화)

  • Ham, Hyeonheui;Mun, Hye Yeon;Lee, Kyung Ah;Lee, Soohyung;Hong, Sung Kee;Lee, Theresa;Ryu, Jae-Gee
    • Journal of The Korean Society of Grassland and Forage Science
    • /
    • v.36 no.4
    • /
    • pp.333-339
    • /
    • 2016
  • The objective of this study was to optimize analytical methods for ochratoxin A (OTA) and zearalenone (ZEA) in rice straw silage and winter forage crops using ultra-high performance liquid chromatography (UHPLC). Samples free of mycotoxins were spiked with $50{\mu}g/kg$, $250{\mu}g/kg$, or $500{\mu}g/kg$ of OTA and $300{\mu}g/kg$, $1500{\mu}g/kg$, or $3000{\mu}g/kg$ of ZEA. OTA and ZEA were extracted with acetonitrile, cleaned up using an immunoaffinity column, and then analyzed by UHPLC equipped with a fluorescence detector. The calibration curves showed high linearity ($R^2 \geq 0.9999$ for OTA and $R^2 \geq 0.9995$ for ZEA). The limits of detection and quantification were $0.1{\mu}g/kg$ and $0.3{\mu}g/kg$, respectively, for OTA and $5{\mu}g/kg$ and $16.7{\mu}g/kg$, respectively, for ZEA. The recovery and relative standard deviation (RSD) of OTA were as follows: rice straw, 84.23~95.33% and 2.59~4.77%; Italian ryegrass, 79.02~95% and 0.86~5.83%; barley, 74.93~97% and 0.85~9.19%; rye, 77.99~96.67% and 0.33~6.26%. The recovery and RSD of ZEA were: rice straw, 109.6~114.22% and 0.67~7.15%; Italian ryegrass, 98.01~109.44% and 1.65~4.81%; barley, 98~113.53% and 0.25~5.85%; rye, 90.44~108.56% and 2.5~4.66%. Both satisfied the European Commission criteria for quantitative analysis (EC No 401/2006). These results show that the optimized methods can be used for mycotoxin analysis of forages.
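
Recovery and relative standard deviation as reported above are standard spike-recovery statistics. A small sketch of how they could be computed from replicate measurements of a spiked sample follows; the replicate values are made-up numbers for illustration.

```python
# A minimal sketch: percent recovery and RSD from replicate
# measurements of a sample spiked at a known mycotoxin level.
import numpy as np

spike_level = 50.0  # ug/kg of OTA added (one of the abstract's spike levels)
measured = np.array([44.1, 47.2, 45.8])  # hypothetical replicate results

recovery = measured.mean() / spike_level * 100      # % recovery
rsd = measured.std(ddof=1) / measured.mean() * 100  # % RSD
print(f"recovery = {recovery:.2f}%, RSD = {rsd:.2f}%")
```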

Simultaneous Determination of Eight Sugar Alcohols in Foodstuffs by High Performance Liquid Chromatography (HPLC를 이용한 식품 중 당알코올 8종 동시분석)

  • Lim, Ho-Soo;Park, Sung-Kwan;Kwak, In-Shin;Kim, Hyung-Il;Sung, Jun-Hyun;Choi, Jung-Yoon;Kim, So-Hee
    • Journal of Food Hygiene and Safety
    • /
    • v.26 no.1
    • /
    • pp.16-24
    • /
    • 2011
  • A method was established for the simultaneous determination of eight sugar alcohols (erythritol, xylitol, sorbitol, inositol, mannitol, maltitol, lactitol, and isomalt) by High Performance Liquid Chromatography (HPLC). The sugar alcohols were converted into strongly ultraviolet (UV)-absorbing derivatives with p-nitrobenzoyl chloride (PNBC). HPLC was performed on an Imtakt Unison US-$C_{18}$ column, using acetonitrile:water (77:23) as the mobile phase and UV detection at 260 nm. The calibration curves for all sugar alcohols tested were linear in the 10~200 mg/L range. The average recoveries of the sugar alcohols from three confectioneries spiked at 100 ppm of the eight sugar alcohol standards ranged from 81.2 to 123.1%, with relative standard deviations ranging from 0.2 to 4.9%. The limits of detection (LODs) were $0.5{\sim}8\;{\mu}g/L$ and the limits of quantification (LOQs) were $2{\sim}17\;{\mu}g/L$. Reproducibility for the eight sugar alcohols was 0.28~1.97% RSD. Analysis of 130 confectionery samples detected sugar alcohols in 89, with contents varying between 0.4 and 693.7 g/kg. This simultaneous determination method will serve as basic data for the control of sugar alcohols in confectioneries and for quality control in food manufacturing.
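
The abstract reports linear calibration curves and LOD/LOQ values but does not state how they were derived. The sketch below fits a calibration line over the reported 10~200 mg/L range and estimates LOD/LOQ via the common ICH formulas (LOD = 3.3σ/S, LOQ = 10σ/S); this is one standard approach, assumed here for illustration, and the peak-area values are hypothetical.

```python
# A minimal sketch: fit a linear calibration curve and estimate LOD/LOQ
# with the ICH formulas LOD = 3.3*sigma/S, LOQ = 10*sigma/S, where S is
# the slope and sigma the residual standard deviation of the fit.
import numpy as np

conc = np.array([10, 50, 100, 150, 200], dtype=float)  # mg/L standards
area = np.array([0.98, 5.1, 10.2, 15.1, 20.3])         # hypothetical peak areas

slope, intercept = np.polyfit(conc, area, 1)
resid = area - (slope * conc + intercept)
sigma = resid.std(ddof=2)  # n-2 degrees of freedom for a linear fit

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope={slope:.4f}, LOD={lod:.2f} mg/L, LOQ={loq:.2f} mg/L")
```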

SysML-Based System Modeling for Design of BIPV Electric Power Generation (건물일체형 태양광 시스템의 전력발전부 설계를 위한 SysML기반 시스템 모델링)

  • Lee, Seung-Joon;Lee, Jae-Chon
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.19 no.10
    • /
    • pp.578-589
    • /
    • 2018
  • A Building Integrated Photovoltaic (BIPV) system is a typical integrated system that simultaneously performs both a building function and a solar power generation function. To maximize its potential advantage, however, the photovoltaic power generation function must be integrated from the early conceptual design stage, and the design must aim for maximum power generation. To meet these requirements, preliminary research on BIPV design processes based on architectural design models, along with computer simulation results for improving solar power generation performance, has been published. However, the requirements of the BIPV system have not been clearly identified and systematically reflected in the subsequent design, and no model has verified the power generation design. To solve these problems, we systematically model the requirements of the BIPV system and study the power generation design based on the system requirements model. Throughout the study, we consistently use the standard systems modeling language SysML. Specifically, stakeholder requirements were first identified from stakeholders and related BIPV standards. Then, based on the domain model, the design requirements of the BIPV system were derived at the system level, and the functional and physical architectures of the target system were created from the system requirements. Finally, the power generation performance of the BIPV system was evaluated through a simulated SysML model (parametric diagram). If the SysML system model developed herein is reinforced by reflecting the conditions resulting from building design, it will open an opportunity to study and optimize power generation in BIPV systems in an integrated fashion.
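
A SysML parametric diagram binds value properties through constraint equations. A small sketch of the kind of power-generation constraint such a diagram might encode is shown below; the relation P = G · A · η · PR and the parameter values are generic PV sizing assumptions, not the constraints of the authors' model.

```python
# A minimal sketch: the kind of constraint a SysML parametric diagram
# could bind for BIPV power evaluation. P = G * A * eta * PR is a generic
# PV sizing relation, not the authors' actual constraint model.
def pv_power_kw(irradiance_kw_m2: float, area_m2: float,
                efficiency: float, performance_ratio: float = 0.8) -> float:
    """Instantaneous PV output from irradiance, module area, module
    efficiency, and a system performance ratio covering wiring and
    inverter losses."""
    return irradiance_kw_m2 * area_m2 * efficiency * performance_ratio

# Hypothetical facade-integrated array: 120 m2 at 18% module efficiency,
# under 0.75 kW/m2 irradiance on a vertical surface.
print(pv_power_kw(0.75, 120.0, 0.18))  # ~12.96 kW
```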

Innovation Technology Development & Commercialization Promotion of R&D Performance to Domestic Renewable Energy (신재생에너지 기술혁신 개발과 R&D성과 사업화 촉진 방안)

  • Lee, Yong-Seok;Rho, Do-Hwan
    • Journal of Korea Technology Innovation Society
    • /
    • v.12 no.4
    • /
    • pp.788-818
    • /
    • 2009
  • Renewable energy refers to solar energy, biomass energy, hydrogen energy, wind power, fuel cells, coal liquefaction and gasification, marine energy, waste energy, geothermal heat, and liquid fuels made from byproducts of hydrogen and coal; it excludes energy based on coal, oil, nuclear energy, and natural gas. Developed countries have recognized the importance of these energy sources, set mid- to long-term plans to develop and commercialize the technologies, and supported them with drastic political and financial measures. Given the growing recognition of the field, it is necessary to analyze the achievements to date of the government's related projects in terms of the type of renewable energy, the management of sectional goals, and commercialization. The Korean government chiefly follows the renewable energy development and deployment policies of the USA and the UK. However, unlike Japan, which leads the solar power industry, Korea still lacks state-directed support, participation by enterprises, and social recognition. Research on renewable energy has mainly examined the state of supply of each technology and the suitability of specific regions for applying it; such research has focused on the supply and demand of renewable (and general) energy and on enhancing supply capacity in certain areas, while in-depth study of commercialization and of the industrial capacity growth that follows technology development remains inadequate. A 'cost-benefit model for each energy source' is used to analyze renewable energy technology development and the quantitative, macroeconomic effects of its commercialization, in order to foresee the subsequent expansion of related industries and the increase in added value. First, investment in renewable energy technology development is directly proportional to both production and growth, but production shows a slightly higher index than growth for the same amount of R&D investment, indicating that advances in technology greatly influence the final product, energy production. Moreover, while R&D investment in renewable energy production, including the government funds within it, has a proportional influence on renewable energy growth, private investment within the total amount invested has an inverse influence; this shows that research and development is driven mainly by government funds rather than private investment. Finally, while R&D investment affects renewable energy growth proportionally, government funds and private investment show no direct relationship with it, indicating that the effects of R&D on renewable energy do not feed back into government funds or private investment. All of these results signify that although government policy is important for technology development and commercialization, private investment and the active participation of enterprises are the key to success in the industry.

Modern Paper Quality Control

  • Olavi Komppa
    • Proceedings of the Korea Technical Association of the Pulp and Paper Industry Conference
    • /
    • 2000.06a
    • /
    • pp.16-23
    • /
    • 2000
  • The increasing functional needs of top-quality printing papers and packaging paperboards, and especially the rapid developments in electronic printing processes and various computer printers during the past few years, set new targets and requirements for modern paper quality. Most of today's paper grades have relatively high filler content, are moderately or heavily calendered, and have many coating layers for the best appearance and performance. In practice, this means that many of the traditional quality assurance methods, mostly designed to measure papers made of pure, native pulp only, cannot reliably (or at all) be used to analyze or rank the quality of modern papers. Hence, the introduction of new measurement techniques is necessary to assure and further develop paper quality today and in the future. Paper formation, i.e. small-scale (millimeter-scale) variation of basis weight, is the most important quality parameter in papermaking due to its influence on practically all the other quality properties of paper. The ideal paper would be completely uniform, so that the basis weight of each small point (area) measured would be the same. In practice, of course, this is not possible because relatively large local variations always exist in paper. These small-scale basis weight variations are, however, the major reason for many other quality problems, including calender blackening, uneven coating results, uneven printing results, etc. Traditional visual inspection or optical measurement of the paper does not give a reliable understanding of the material variations in the paper, because in the modern papermaking process the optical behavior of paper is strongly affected by the use of fillers, dyes, or coating colors. Furthermore, the opacity (optical density) of the paper changes at different process stages such as wet pressing and calendering. The greatest advantage of using the beta transmission method to measure paper formation is that it can be very reliably calibrated to measure the true basis weight variation of all kinds of paper and board, independently of sample basis weight or paper grade. This makes it possible to measure, compare, and judge papers made of different raw materials or different colors, and even to measure heavily calendered, coated, or printed papers. Scientific research in paper physics has shown that the orientation of the fibers in the top layer (paper surface) of the sheet plays the key role in paper curling and cockling, causing the typical practical problems (paper jams) with modern fax and copy machines, electronic printing, etc. On the other hand, the fiber orientation in the surface and middle layers of the sheet controls the bending stiffness of paperboard. Therefore, a reliable measurement of paper surface fiber orientation provides a magnificent tool to investigate and predict paper curling and cockling tendency, and the necessary information to fine-tune the manufacturing process for optimum quality. Many papers, especially heavily calendered and coated grades, resist liquid and gas penetration very strongly, being beyond the measurement range of traditional instruments or requiring inconveniently long measuring times per sample. The increased surface hardness and the use of filler minerals and mechanical pulp make a reliable, non-leaking sample contact with the measurement head a challenge of its own.
Paper surface coating creates, as expected, a layer with completely different permeability characteristics compared to the other layers of the sheet. The latest developments in sensor technology have made it possible to reliably measure gas flow under well-controlled conditions, allowing investigation of the gas penetration of open structures, such as cigarette paper, tissue, or sack paper, and, in the low-permeability range, analysis of even fully greaseproof papers, silicone papers, heavily coated papers and boards, or even the detection of defects in barrier coatings! Even nitrogen or helium may be used as the gas, giving completely new possibilities to rank products or to find correlations with critical process or converting parameters. All modern paper machines include many on-line measuring instruments which provide the necessary information for automatic process control systems. Hence, the reliability of the information obtained from the different sensors is vital for good optimization and process stability. If any of these on-line sensors does not operate perfectly as planned (having even a small measurement error or malfunction), the process control will set the machine to operate away from the optimum, resulting in loss of profit or eventual problems in quality or runnability. To assure optimum operation of the paper machines, a novel quality assurance policy for the on-line measurements has been developed, including control procedures utilizing traceable, accredited standards for the best reliability and performance.
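
Formation is described above as small-scale (millimeter-scale) basis weight variation. A common way to quantify it, assumed here for illustration, is the coefficient of variation of local basis weight over a measured grid, e.g. from a beta-transmission scan; the grid, values, and CV-based index below are not any instrument's actual algorithm.

```python
# A minimal sketch: quantify formation as the coefficient of variation
# (CV) of small-scale basis weight over a measured grid. Synthetic data;
# not the algorithm of any particular formation analyzer.
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 100 x 100 grid of local basis weights (g/m2), ~1 mm pixels.
basis_weight = rng.normal(loc=80.0, scale=3.2, size=(100, 100))

mean_bw = basis_weight.mean()
cv_percent = basis_weight.std(ddof=1) / mean_bw * 100
print(f"mean basis weight = {mean_bw:.1f} g/m2, formation CV = {cv_percent:.2f}%")
```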