• Title/Summary/Keyword: interpretation key (228 results)

The Principle of Justifiable Granularity and an Optimization of Information Granularity Allocation as Fundamentals of Granular Computing

  • Pedrycz, Witold
    • Journal of Information Processing Systems / v.7 no.3 / pp.397-412 / 2011
  • Granular Computing has emerged as a unified and coherent framework for the design, processing, and interpretation of information granules. Information granules are formalized within various frameworks such as sets (interval mathematics), fuzzy sets, rough sets, shadowed sets, and probabilities (probability density functions), to name several of the most visible approaches. In spite of the apparent diversity of the existing formalisms, there are underlying commonalities articulated in terms of fundamentals, algorithmic developments, and the ensuing application domains. In this study, we introduce two pivotal concepts: a principle of justifiable granularity and a method of optimal information allocation in which information granularity is regarded as an important design asset. We show that these two concepts are relevant to various formal setups of information granularity and offer constructs supporting the design of information granules and their processing. A suite of applied studies focuses on knowledge management, for which we identify several key categories of schemes.
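
Pedrycz's principle of justifiable granularity builds an information granule by trading coverage (how much data the granule includes) against specificity (how tight it is). A minimal interval-granule sketch, assuming an exponential specificity measure and a median anchor; both choices are illustrative, not necessarily the paper's exact functional forms:

```python
import numpy as np

def justifiable_interval(data, alpha=1.0):
    """Build an interval granule [a, b] around the median by maximizing
    coverage(bound) * specificity(bound) on each side.  The exponential
    specificity and the median anchor are illustrative assumptions."""
    data = np.asarray(data, dtype=float)
    med = np.median(data)
    span = data.max() - data.min()

    def score(bound):
        lo, hi = min(bound, med), max(bound, med)
        coverage = np.mean((data >= lo) & (data <= hi))      # data inside
        specificity = np.exp(-alpha * (hi - lo) / span)      # decays with width
        return coverage * specificity

    b = max(data[data >= med], key=score)   # search upper-bound candidates
    a = max(data[data <= med], key=score)   # search lower-bound candidates
    return a, b

a, b = justifiable_interval([1, 2, 2, 3, 3, 3, 4, 5, 9])
```

A large outlier (9 here) is excluded: covering it would cost more specificity than the extra coverage is worth.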

Logistic Regression Type Small Area Estimations Based on Relative Error

  • Hwang, Hee-Jin;Shin, Key-Il
    • The Korean Journal of Applied Statistics / v.24 no.3 / pp.445-453 / 2011
  • Almost all small area estimates are obtained by minimizing the mean squared error. Recently, relative error prediction methods have been developed and adapted to small area estimation. The estimator obtained by relative error prediction is usually called a shrinkage estimator. Especially when a data set spans a wide range of values, the shrinkage estimator is known to have good statistical properties and an easy interpretation. In this paper we study shrinkage estimators based on logistic regression type estimators for small area estimation. Some simulation studies are performed, and the Economically Active Population Survey data of 2005 are used for comparison.
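
The relative-error idea can be illustrated in its simplest case: under squared relative error loss E[((y − t)/y)²], the best constant predictor is t = E[1/y] / E[1/y²], which is pulled toward the small values of a wide-range data set. A sketch under that assumption (the function name is mine; the paper's logistic-regression small-area machinery is not reproduced):

```python
import numpy as np

def relative_error_predictor(y):
    """Best constant predictor t under squared relative error loss
    E[((y - t) / y)^2]; setting the derivative to zero gives
    t = E[1/y] / E[1/y^2] (sample means used here)."""
    y = np.asarray(y, dtype=float)
    return np.mean(1.0 / y) / np.mean(1.0 / y ** 2)

y = np.array([1.0, 2.0, 4.0, 100.0])   # wide-range data
t = relative_error_predictor(y)        # far below y.mean() == 26.75
```

Unlike the mean, t is barely moved by the extreme value 100, which is the "shrinkage" behaviour the abstract refers to.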

Co-ordination between R&D and Human Resource in the post catching-up era

  • Hwang, Gyu-Hee
    • International Journal of Contents / v.8 no.3 / pp.42-51 / 2012
  • Korea has entered the post catching-up era, and the necessity of a new innovation strategy in response is being raised. This study argues for such a strategy and discusses co-ordination between R&D and human resources as the key factor. The empirical analysis suggests a restricted inflow of outstanding human resources into manufacturing sectors and a declining effectiveness of college majors, even as the match between major and job has improved; this is especially severe in industries with high R&D investment. It can be interpreted as a lack of co-ordination between technological innovation efforts, on the R&D investment side, and human resource utilization, on the new human resource side. The analysis and its interpretation suggest that the co-ordination between innovative manufacturers' efforts and human resource utilization should be improved more actively in order to sustain development.

Modeling methods used in bioenergy production processes: A review

  • Akroum, Hamza;Akroum-Amrouche, Dahbia;Aibeche, Abderrezak
    • Advances in Computational Design / v.5 no.3 / pp.323-347 / 2020
  • Enhancing the effectiveness of bioenergy production requires comprehensive experimental study of the several parameters affecting these bioprocesses. Interpreting the experimental results and estimating the optimum yield are complicated tasks in which results are easily misread. Mathematical modeling and statistical experimental designs can consistently supply predictions of the potential yield, identify the defining parameters, and clarify the key relationships between factors and responses. This paper summarizes several mathematical models used to achieve an adequate overall and maximal production yield and rate, and to screen, optimize, identify, describe, and provide useful information on the effect of several factors on bioenergy production processes. The usefulness, validity, and feasibility of each strategy for studying and optimizing bioenergy-producing processes are discussed and confirmed by the good correlation between predicted and measured values.

Conceptual Change: An Interpretation by Radical Constructivism(I) (개념변화: 급진적 구성주의에 의한 해석(I))

  • 유병길
    • Journal of Korean Elementary Science Education / v.19 no.1 / pp.85-99 / 2000
  • Research has shown that learning science frequently requires the process of conceptual change. As a result, many constructivist teaching and learning approaches focus on this kind of learning, and among them cognitive conflict strategies play a key role. Students, however, still have much difficulty in learning science. Theoretically, this work rests on Piaget's genetic epistemology, in which disequilibration demands an interplay between assimilation and accommodation until equilibrium is restored. Radical constructivism has its roots in a variety of disciplines but has been most profoundly influenced by the theories of Jean Piaget as interpreted and extended by Glasersfeld. This study is intended to interpret conceptual change from the radical constructivist perspective and to explain the difficulties with conceptual change that students have in learning science.

Applying Expert System to Statistical Process Control in Semiconductor Manufacturing (반도체 수율 향상을 위한 통계적 공정 제어에 전문가 시스템의 적용에 관한 연구)

  • 윤건상;최문규;김훈모;조대호;이칠기
    • Journal of the Korean Society for Precision Engineering / v.15 no.10 / pp.103-112 / 1998
  • The evolution of semiconductor manufacturing technology has accelerated the reduction of device dimensions and the increase of integrated circuit density. In order to improve yield within a short turnaround time and maintain it at a high level, a system that can rapidly determine problematic processing steps is needed. Statistical process control (SPC) detects abnormal process variation of key parameters, and expert systems in SPC can serve as a valuable tool to automate the analysis and interpretation of control charts. A set of IF-THEN rules was used to formalize a knowledge base of special causes. This research proposes a strategy for applying an expert system to SPC in semiconductor manufacturing. In analysis, the expert system detects instability in a process parameter; in diagnosis, the engineer is supported by a process analyzer program. An example is used to demonstrate the expert system and the process analyzer.
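
IF-THEN control-chart rules of the kind the paper formalizes can be sketched as follows. The two rules shown are classic Western Electric-style rules, used here as an illustrative assumption rather than the paper's actual knowledge base:

```python
import statistics

def detect_special_causes(values):
    """Flag special-cause variation on a Shewhart-style chart using
    two IF-THEN rules (Western Electric style, illustrative only)."""
    mean = statistics.mean(values)
    sigma = statistics.pstdev(values)
    findings = []
    # Rule 1: IF a point lies beyond mean +/- 3 sigma THEN flag it.
    for i, x in enumerate(values):
        if sigma > 0 and abs(x - mean) > 3 * sigma:
            findings.append((i, "point beyond 3-sigma limits"))
    # Rule 2: IF 8 consecutive points fall on the same side of the
    # center line THEN flag a sustained shift.
    run, side = 0, 0
    for i, x in enumerate(values):
        s = 1 if x > mean else (-1 if x < mean else 0)
        run = run + 1 if (s == side and s != 0) else (1 if s != 0 else 0)
        side = s
        if run == 8:
            findings.append((i, "run of 8 points on one side of center line"))
    return findings

# Stable readings followed by a sustained shift upward:
flags = detect_special_causes(
    [0.1, -0.1, 0.2, -0.2, 0.0, 0.1, -0.1] + [1.0] * 8)
```

A real knowledge base would add further rules (trends, alternation, zone tests) in the same IF-THEN form.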

Effect of Normalization on Detection of Differentially-Expressed Genes with Moderate Effects

  • Cho, Seo-Ae;Lee, Eun-Jee;Kim, Young-Chul;Park, Tae-Sung
    • Genomics & Informatics / v.5 no.3 / pp.118-123 / 2007
  • The existing literature offers little guidance on how to decide which method to use to analyze one-channel microarray measurements when dealing with large, grouped samples. Most previous methods have focused on two-channel data; therefore they cannot be easily applied to one-channel microarray data. Thus, a more reliable method is required to determine an appropriate combination of individual basic processing steps for a given dataset in order to improve the validity of one-channel expression data analysis. We address key issues in evaluating the effectiveness of the basic statistical processing steps of microarray data that can affect the final outcome of gene expression analysis, without focusing on the underlying biological interpretation of the data.
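
One common basic processing step of the kind evaluated in such studies is quantile normalization, which forces every sample (column) to share the same empirical distribution. A minimal sketch, offered as a generic illustration rather than the paper's prescribed procedure:

```python
import numpy as np

def quantile_normalize(X):
    """Map each column of X (genes x samples) onto the mean of the
    sorted columns, so all samples share one empirical distribution.
    Ties are broken arbitrarily by argsort (a known caveat)."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # per-column ranks
    reference = np.sort(X, axis=0).mean(axis=1)        # shared quantiles
    return reference[ranks]

X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)   # every column now has the same values, reordered
```

After normalization, between-sample intensity differences are removed, so any remaining per-gene differences are easier to attribute to expression rather than array effects.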

Resistivity and Calibration Error Estimations for Small-Loop Electromagnetic Method

  • Sasaki, Yutaka;Son, Jeong-Sul;Kim, Chang-Ryol;Kim, Jung-Ho
    • 한국지구물리탐사학회:학술대회논문집 / 2007.06a / pp.167-172 / 2007
  • Frequency-domain small-loop electromagnetic (EM) instruments are increasingly used for shallow environmental and geotechnical surveys because of their portability and speed. However, it is well known that the data quality is generally so poor that quantitative interpretation of the data is not justified in many cases. We present an inversion method that corrects for the calibration errors and also constructs multidimensional resistivity models. The key point of this method is that the data are collected at at least two different heights. The forward modeling used in the inversion is based on an efficient 3-D finite-difference method, and its solution was checked against a 2-D finite-element solution. The synthetic and real data examples demonstrate that the joint inversion recovers reliable resistivity models from multi-frequency data severely contaminated by calibration errors.

A Literature Review on Pyoyubu (標幽賦) Written by Tu Han Kyoung (竇漢卿) (I) (두한경(竇漢卿)의 표유부(標幽賦)에 대한 연구 (I))

  • Won, Jin-Hee;Lee, In-Young
    • Korean Journal of Acupuncture / v.28 no.1 / pp.113-123 / 2011
  • Objectives: This study provides a clear interpretation of Pyoyubu (標幽賦), written by Tu Han Kyoung (竇漢卿) during the Kum-Won dynasty of China (A.D. 1196-1280). Methods: The translation was based on the Original Chimgudaesung (原本鍼灸大成) and revised against Chimguchuiyoung (鍼灸聚英), Yukyoungbuik (類經附翼), the New Chimgudaesung (新鍼灸大成), etc. The critical review helps the reader better understand the world of acupuncture and moxibustion. Results & Conclusions: The book covers all of the concepts involved in acupuncture theory and technique; it provided a foundation for, and remains a key reference on, the current theory of acupuncture. An in-depth study of the book leads to the following: 1. a full understanding of the fundamental principles of these fields; 2. clinical practice guidelines for doctors treating patients; 3. promotion of the beneficial effects of acupuncture treatment.

Methodological Review on Functional Neuroimaging Using Positron Emission Tomography (뇌기능 양전자방출단층촬영영상 분석 기법의 방법론적 고찰)

  • Park, Hae-Jeong
    • Nuclear Medicine and Molecular Imaging / v.41 no.2 / pp.71-77 / 2007
  • Advances in neuroimaging techniques have greatly influenced recent brain research. Among the various neuroimaging modalities, positron emission tomography (PET) has played a key role in molecular neuroimaging, though functional MRI has taken over its role in cognitive neuroscience. As analysis techniques for PET data become more sophisticated, their complexity increases. Despite the wide usage of these techniques, the assumptions and limitations of the procedures are seldom spelled out for clinicians and researchers, although they can be critical for the reliability and interpretation of the results. In this paper, the steps of voxel-based statistical analysis of PET, including preprocessing, intensity normalization, spatial normalization, and partial volume correction, are revisited in terms of their principles and limitations. Additionally, newer image analysis techniques such as surface-based PET analysis, correlational analysis, and multimodal imaging combining PET with DTI, TMS, or EEG are discussed.