• Title/Summary/Keyword: sampling model


Korean women wage analysis using selection models (표본 선택 모형을 이용한 국내 여성 임금 데이터 분석)

  • Jeong, Mi Ryang;Kim, Mijeong
    • Journal of the Korean Data and Information Science Society
    • /
    • v.28 no.5
    • /
    • pp.1077-1085
    • /
    • 2017
  • In this study, we identify the major factors affecting Korean women's wages by analysing data from the 2015 Korean Labor and Income Panel Study (KLIPS). In general, wage data are difficult to analyze because random sampling is infeasible. The Heckman sample selection model is the most widely used method for analysing data subject to sample selection. Heckman proposed two kinds of selection models: one estimated by maximum likelihood and the other the Heckman two-stage model. The Heckman two-stage model is known to be robust to departures from the bivariate normal assumption on the error terms. Recently, Marchenko and Genton (2012) proposed the Heckman selection-t model, which generalizes the Heckman selection model by allowing t-distributed errors, and concluded that the selection-t model is more robust to violations of the error assumptions. Employing the two models, we analysed the data and compared the results.
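
To make the two-stage procedure concrete, here is a minimal sketch of Heckman's two-step estimator on simulated data, using statsmodels. It is an illustration only, not the authors' analysis: the regressors (years of education, a children indicator) and all coefficients are hypothetical stand-ins for the KLIPS variables.

```python
# Minimal Heckman two-step sketch on simulated data (illustration only;
# variable names and coefficients are hypothetical, not KLIPS results).
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 5000
educ = rng.normal(13, 2, n)            # hypothetical regressor: years of education
kids = rng.binomial(1, 0.4, n)         # hypothetical exclusion restriction

# Correlated errors create the selection bias the model corrects for.
u, e = rng.multivariate_normal([0, 0], [[1, 0.5], [0.5, 1]], n).T
work = (0.5 + 0.1 * educ - 0.8 * kids + u) > 0     # selection equation: works or not
wage = 1.0 + 0.12 * educ + e                       # outcome equation: log wage

# Step 1: probit for participation, then the inverse Mills ratio.
Z = sm.add_constant(np.column_stack([educ, kids]))
probit = sm.Probit(work.astype(float), Z).fit(disp=0)
xb = Z @ probit.params
imr = norm.pdf(xb) / norm.cdf(xb)

# Step 2: OLS on the selected (working) sample, augmented with the ratio.
X = sm.add_constant(np.column_stack([educ[work], imr[work]]))
ols = sm.OLS(wage[work], X).fit()
print(ols.params)     # intercept, education effect, selection-correction term
```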

RPC MODEL FOR ORTHORECTIFYING VHRS IMAGE

  • Ke, Luong Chinh
    • Proceedings of the KSRS Conference
    • /
    • v.2
    • /
    • pp.631-634
    • /
    • 2006
  • Three main sources for establishing a GIS are the orthomap at scale 1:5,000 with a Ground Sampling Distance of 0.5 m, DEM/DTM data with a height error of ±1.0 m, and the topographic map at scale 1:10,000. The new era of Very High Resolution Satellite (VHRS) images such as IKONOS, QuickBird, EROS, OrbView and others, with a Ground Sampling Distance (GSD) even below 1 m, has the potential for producing orthomaps at the large scale of 1:5,000, for updating existing maps, for compiling general-purpose or thematic maps, and for GIS. The accuracy of an orthomap generated from a VHRS image strongly affects GIS reliability. Nevertheless, the accuracy of an orthomap derived from a VHRS image depends first of all on the chosen sensor geometric model. This paper first presents the theoretical basis of the Rational Polynomial Coefficient (RPC) model implemented in commercial ImageStation systems for orthorectifying VHRS images. The RPC model of a VHRS image is a replacement camera model that represents the indirect relation between the terrain and its image acquired from orbit. Finally, the practical accuracies of IKONOS and QuickBird images orthorectified with the RPC model in the Canadian PCI Geomatica system are presented; they are an important indication for the practical production of digital orthomaps.

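For reference, the RPC model evaluates image coordinates as ratios of cubic polynomials in normalized ground coordinates. The sketch below is my own toy illustration, not code from ImageStation or PCI Geomatica: the coefficients are random placeholders, and the 20-term ordering follows the common RPC00B convention, which is an assumption here.

```python
# Toy sketch of a Rational Polynomial Coefficient (RPC) sensor model:
# image coordinates are ratios of cubic polynomials in normalized ground
# coordinates.  Coefficients are random placeholders, not real metadata.
import numpy as np

def cubic_terms(P, L, H):
    """20 polynomial terms in normalized lat (P), lon (L), height (H)."""
    return np.array([
        1, L, P, H, L*P, L*H, P*H, L*L, P*P, H*H,
        P*L*H, L**3, L*P*P, L*H*H, L*L*P, P**3, P*H*H, L*L*H, P*P*H, H**3,
    ])

def rpc_project(lat, lon, h, coeffs, offsets, scales):
    """Map ground coordinates to (sample, line) with an RPC model."""
    P = (lat - offsets["lat"]) / scales["lat"]     # normalize to roughly [-1, 1]
    L = (lon - offsets["lon"]) / scales["lon"]
    H = (h   - offsets["h"])   / scales["h"]
    t = cubic_terms(P, L, H)
    samp = (coeffs["samp_num"] @ t) / (coeffs["samp_den"] @ t)
    line = (coeffs["line_num"] @ t) / (coeffs["line_den"] @ t)
    # de-normalize back to pixel units
    return (samp * scales["samp"] + offsets["samp"],
            line * scales["line"] + offsets["line"])

rng = np.random.default_rng(1)
coeffs = {k: rng.normal(size=20) * 0.01 for k in
          ["samp_num", "samp_den", "line_num", "line_den"]}
coeffs["samp_den"][0] = coeffs["line_den"][0] = 1.0    # denominators start at 1
offsets = {"lat": 37.5, "lon": 127.0, "h": 100.0, "samp": 5000, "line": 5000}
scales  = {"lat": 0.1, "lon": 0.1, "h": 500.0, "samp": 5000, "line": 5000}
print(rpc_project(37.52, 127.03, 150.0, coeffs, offsets, scales))
```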

A Design of Intelligent and Evolving Receiver Based on Stochastic Morphological Sampling Theorem (Stochastic Morphological Sampling Theorem을 이용한 지능형 진화형 수신기 구현)

  • 박재현;이경록;송문호;김운경
    • Proceedings of the IEEK Conference
    • /
    • 1998.06a
    • /
    • pp.46-49
    • /
    • 1998
  • In this paper, we introduce the notion of intelligent communication by introducing a novel intelligent receiver model. This receiver continually evolves, learning and improving in performance as it compiles experience over time. In a digital communication context, in a typical training mode, it learns the concept of "1" as deteriorated by arbitrary (not necessarily additive, as is typically assumed) disturbance and/or modulation. After learning "1", in test mode, it classifies received signals as "1" or "0" almost perfectly. The intelligent receiver as implemented is grounded on the recently introduced Stochastic Morphological Sampling Theorem (SMST), a distribution-free result which gives theoretical bounds on the sample complexity (training size) needed for the required performance parameters such as accuracy ($\varepsilon$) and confidence ($\delta$). Based on this theorem, we demonstrate that, almost irrespective of the channel and modulation model, the number of samples needed to learn the concept of "1" is not too large, and that the resulting universal receiver structure, corresponding to the classical Nearest Neighbor rule in pattern recognition theory, is trivial. We check the surprising efficiency and validity of this model through some simple simulations.

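The receiver's test phase reduces to a nearest-neighbor rule on stored training waveforms. The sketch below is a toy stand-in for that idea, not the SMST construction itself: the pulse shape, the random-gain distortion, and the sample sizes are all made up for illustration.

```python
# Toy nearest-neighbor receiver: learn noisy waveform examples of "1" and "0",
# then classify new received waveforms by the closest training example.
import numpy as np

rng = np.random.default_rng(2)
T = 16                                   # samples per symbol

def received(bit, n):
    """Simulate n received waveforms for a bit, with random gain and noise."""
    pulse = np.ones(T) if bit == 1 else np.zeros(T)
    gain = rng.uniform(0.5, 1.5, size=(n, 1))      # crude non-additive distortion
    return gain * pulse + 0.3 * rng.standard_normal((n, T))

# Training mode: compile labelled experience over time.
X_train = np.vstack([received(1, 200), received(0, 200)])
y_train = np.array([1] * 200 + [0] * 200)

# Test mode: 1-nearest-neighbor decision on new received signals.
X_test = np.vstack([received(1, 500), received(0, 500)])
y_test = np.array([1] * 500 + [0] * 500)
d = np.linalg.norm(X_test[:, None, :] - X_train[None, :, :], axis=2)
y_hat = y_train[d.argmin(axis=1)]
print("accuracy:", (y_hat == y_test).mean())
```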

Development of Stochastic Finite Element Model for Underground Structure with Discontinuous Rock Mass Using Latin Hypercube Sampling Technique (LHS기법을 이용한 불연속암반구조물의 확률유한요소해석기법개발)

  • 최규섭;정영수
    • Computational Structural Engineering
    • /
    • v.10 no.4
    • /
    • pp.143-154
    • /
    • 1997
  • A stochastic finite element model which reflects both the effect of discontinuities and the uncertainty of material properties in an underground rock mass has been developed. The Latin Hypercube Sampling technique has been employed and compared with the Monte Carlo simulation method. To consider the effect of discontinuities, the joint finite element model, which is known to be suitable for representing faults, cleavage, and similar features, has been used in this study. To reflect the uncertainty of material properties, the joint normal stiffness and the joint shear stiffness are treated as random variables simulated from normal distributions. The computer program developed in this study has been verified with a practical example and applied to the analysis of a circular cavern in a discontinuous rock mass.

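As a rough illustration of the sampling side of the method (not the joint finite element model itself), the sketch below draws Latin hypercube samples of two hypothetical joint stiffness parameters from normal distributions and compares them with plain Monte Carlo samples; each LHS row would then drive one finite element run.

```python
# Latin hypercube vs. Monte Carlo sampling of two normally distributed
# joint stiffness parameters (illustrative values, not from the paper).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
n = 100
means = np.array([5.0e9, 2.0e9])   # hypothetical joint normal / shear stiffness (Pa/m)
stds  = np.array([0.5e9, 0.2e9])

def latin_hypercube_normal(n, means, stds, rng):
    """One stratified (Latin hypercube) draw per equal-probability stratum."""
    d = len(means)
    u = np.empty((n, d))
    for j in range(d):
        # one uniform draw inside each of the n strata, in shuffled order
        u[:, j] = (rng.permutation(n) + rng.uniform(size=n)) / n
    return norm.ppf(u) * stds + means   # map uniforms through the normal inverse CDF

lhs = latin_hypercube_normal(n, means, stds, rng)
mc  = rng.normal(means, stds, size=(n, 2))
print("LHS sample means:", lhs.mean(axis=0))
print("MC  sample means:", mc.mean(axis=0))
```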

Effects of Latin hypercube sampling on surrogate modeling and optimization

  • Afzal, Arshad;Kim, Kwang-Yong;Seo, Jae-won
    • International Journal of Fluid Machinery and Systems
    • /
    • v.10 no.3
    • /
    • pp.240-253
    • /
    • 2017
  • Latin hypercube sampling is a widely used design-of-experiments technique for selecting design points for simulation, which are then used to construct a surrogate model. The exploration/exploitation properties of surrogate models depend on the size and distribution of design points in the chosen design space. The present study aimed at evaluating the performance characteristics of various surrogate models depending on the Latin hypercube sampling (LHS) procedure (sample size and spatial distribution) for a diverse set of optimization problems. The analysis was carried out for two types of problems: (1) thermal-fluid design problems (optimizations of a convergent-divergent micromixer coupled with pulsatile flow and boot-shaped ribs), and (2) analytical test functions (six-hump camel back, Branin-Hoo, Hartman 3, and Hartman 6 functions). Three surrogate models, namely response surface approximation, Kriging, and radial basis neural networks, were tested. The important findings are illustrated using box plots. The surrogate models were analyzed in terms of global exploration (accuracy over the domain space) and local exploitation (ease of finding the global optimum point). Radial basis neural networks showed the best overall performance in global exploration characteristics as well as a tendency to find the approximate optimal solution for the majority of tested problems. To build a surrogate model, it is recommended to use an initial sample size equal to 15 times the number of design variables. The study provides useful guidelines on the effect of initial sample size and distribution on surrogate construction and subsequent optimization using an LHS sampling plan.
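
To make the workflow concrete, the sketch below builds a radial-basis-function surrogate of the six-hump camel-back test function from a Latin hypercube design and checks its global accuracy. It is my own sketch, not the authors' setup: it uses SciPy's RBF interpolator rather than the three surrogates tested in the paper, and the 15 x (number of design variables) rule of thumb quoted above gives 30 design points for this two-variable function.

```python
# Fit an RBF surrogate to the six-hump camel-back function using a
# Latin hypercube design of 15 x (2 design variables) = 30 points.
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RBFInterpolator

def camelback(x):
    x1, x2 = x[:, 0], x[:, 1]
    return (4 - 2.1 * x1**2 + x1**4 / 3) * x1**2 + x1 * x2 + (-4 + 4 * x2**2) * x2**2

lo, hi = [-2.0, -1.0], [2.0, 1.0]           # a common domain for this test function
design = qmc.scale(qmc.LatinHypercube(d=2, seed=0).random(30), lo, hi)
surrogate = RBFInterpolator(design, camelback(design))

# Global exploration: accuracy on an independent random test set.
rng = np.random.default_rng(4)
test = rng.uniform(lo, hi, size=(1000, 2))
err = surrogate(test) - camelback(test)
print("RMSE over the domain:", np.sqrt(np.mean(err**2)))

# Local exploitation: best predicted value on a dense grid as an approximate
# optimum (the true minimum is about -1.0316).
g1, g2 = np.meshgrid(np.linspace(lo[0], hi[0], 200), np.linspace(lo[1], hi[1], 200))
grid = np.column_stack([g1.ravel(), g2.ravel()])
print("surrogate minimum estimate:", surrogate(grid).min())
```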

Improving Inspection Systems for Radio Stations: An Emphasis on the ISO 2859-1 Sampling Method (무선국 검사제도 개선방안에 관한 연구: ISO 2859-1 샘플링 검사기법을 중심으로)

  • Hyojung Kim;Yuri Kim;Sina Park;Seunghwan Jung;Seongjoon Kim
    • Journal of Korean Society for Quality Management
    • /
    • v.51 no.4
    • /
    • pp.515-530
    • /
    • 2023
  • Purpose: This research aims to develop a data-driven inspection policy for radio stations utilizing the KS Q ISO 2859-1 sampling method, addressing potential regulatory relaxations and impending management challenges. Methods: Using radio station inspection big data from the past six years, we established a simulation model to evaluate the current policy. A new inspection sampling policy framework was designed based on the KS Q ISO 2859-1 method. The study compares the performance of the current and proposed inspection systems, offering insights for an improved inspection strategy. Results: This study introduced a simulation model for the inspection system based on the KS Q ISO 2859-1 sampling method. Through various experimental designs, key performance indicators such as the non-detection rate and sample proportion were derived, providing foundational data for the new inspection policy. Conclusion: Using big data from radio station inspections, we evaluated the current inspection system and quantitatively compared it with a new system across diverse scenarios. Our simulation model effectively verified the feasibility and efficiency of the proposed framework. For practical implementation, essential factors such as lot size, inspection cycle, and AQL standards need precise definition and consideration. Enhancing radio station inspections requires a policy-driven approach that factors in socio-economic impacts and solicits feedback from industry participants. Future studies should also explore various perspectives related to legislative, institutional, and operational aspects of inspection organizations.
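
The kind of simulation described in the Methods section can be prototyped in a few lines. The sketch below is a simplified stand-in, not the authors' model: it applies one fixed single sampling plan (n, c) to synthetic lots rather than the full KS Q ISO 2859-1 switching rules, and the non-detection rate is defined here simply as the share of nonconforming items that pass through accepted lots.

```python
# Simplified acceptance-sampling simulation: draw lots with random
# nonconformity rates, inspect a sample of n items, and accept the lot if
# at most c nonconforming items are found.  In practice the plan (n, c)
# would be read from the KS Q ISO 2859-1 tables for a chosen AQL.
import numpy as np

rng = np.random.default_rng(5)

def simulate(lot_size=500, n=32, c=1, n_lots=10000, mean_defect_rate=0.02):
    missed, sampled, total = 0, 0, 0
    for _ in range(n_lots):
        p = rng.beta(2, 2 / mean_defect_rate - 2)      # lot-to-lot quality variation
        lot = rng.random(lot_size) < p                 # True = nonconforming station
        sample_idx = rng.choice(lot_size, size=n, replace=False)
        if lot[sample_idx].sum() <= c:                 # lot accepted
            missed += lot.sum() - lot[sample_idx].sum()   # nonconformities not detected
        sampled += n
        total += lot_size
    return missed / total, sampled / total

nd_rate, sample_prop = simulate()
print(f"non-detection rate: {nd_rate:.4f}, sample proportion: {sample_prop:.3f}")
```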

An Accelerated Life Test Sampling Plan for Bulk Material (벌크재료 가속시험샘플링검사방식설계)

  • Kim Jong-Geol;Kim Dong-Cheol
    • Proceedings of the Safety Management and Science Conference
    • /
    • 2006.04a
    • /
    • pp.411-419
    • /
    • 2006
  • This paper aims at designing an accelerated life test sampling plan for bulk material and showing its application to an arc-welded gas pipe. It is an integrated model of the accelerated life test procedure and the bulk sampling procedure. The accelerated life tests were performed according to regulation RSD 0005 of ATS at KITECH, and bulk sampling was used for acceptance. Design parameters include the total sample size (segments and increments), the stress level, and so on. We focus on deciding the sample size by minimizing the asymptotic variance of the test statistic while satisfying the consumer's risk under a Weibull lifetime distribution with prior information on the shape parameter.

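As a rough sketch of the acceptance side of such a plan (not the paper's design, which also optimizes stress levels and the segment/increment structure of bulk sampling), the snippet below computes the consumer's risk of a simple time-censored life-test plan under a Weibull lifetime with a known shape parameter; all numerical values are made up.

```python
# Consumer's risk of a simple time-censored life-test plan: test n specimens
# for time t0 and accept the lot if at most c failures occur.  The Weibull
# shape parameter is treated as known prior information (values illustrative).
from scipy.stats import binom, weibull_min

shape = 2.0          # assumed known Weibull shape parameter
t0 = 1000.0          # test duration (hours)
n, c = 20, 1         # sample size and acceptance number of the plan
scale_bad = 1500.0   # "unacceptable" characteristic life the consumer wants rejected

# Probability that one specimen fails before t0 at the unacceptable quality level.
p_fail = weibull_min.cdf(t0, shape, scale=scale_bad)

# Consumer's risk = probability that such a bad lot is nevertheless accepted.
consumers_risk = binom.cdf(c, n, p_fail)
print(f"P(fail by t0) = {p_fail:.3f}, consumer's risk = {consumers_risk:.4f}")
```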

Reliability Estimation Using Kriging Metamodel (크리깅 메타모델을 이용한 신뢰도 계산)

  • Cho Tae-Min;Ju Byeong-Hyeon;Jung Do-Hyun;Lee Byung-Chai
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.30 no.8 s.251
    • /
    • pp.941-948
    • /
    • 2006
  • In this study, a new method for reliability estimation is proposed using a kriging metamodel. A kriging metamodel can be determined by an appropriate sampling range and number of sampling points because there are no random errors in the Design and Analysis of Computer Experiments (DACE) model. The first kriging metamodel is built from widely ranged sampling points. The Advanced First Order Reliability Method (AFORM) is applied to the first kriging metamodel to estimate the reliability approximately. Then, the second kriging metamodel is constructed using additional sampling points with an updated sampling range. Monte Carlo Simulation (MCS) is applied to the second kriging metamodel to evaluate the reliability. The proposed method is applied to numerical examples, and the results are almost equal to the reference reliability.
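
The final step of the procedure, Monte Carlo simulation on a fitted kriging metamodel, can be illustrated as follows. This is my own sketch using scikit-learn's Gaussian process regressor on a simple analytic limit-state function, not the paper's two-stage AFORM/MCS implementation, and the sampling range and kernel settings are arbitrary.

```python
# Fit a kriging (Gaussian process) metamodel to a limit-state function, then
# estimate the failure probability by Monte Carlo simulation on the metamodel.
# Failure is defined as g(x) < 0.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):                                  # simple illustrative limit state
    return 3.0 - x[:, 0] - x[:, 1]         # failure when x1 + x2 > 3

rng = np.random.default_rng(6)

# Metamodel built from design points over a generous sampling range.
design = rng.uniform(-5, 5, size=(60, 2))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0), normalize_y=True)
gp.fit(design, g(design))

# Monte Carlo simulation on the metamodel with standard-normal inputs.
x_mc = rng.standard_normal((100_000, 2))
pf = (gp.predict(x_mc) < 0).mean()
print("estimated failure probability:", pf)   # exact value here is about 0.017
```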

Classification of Class-Imbalanced Data: Effect of Over-sampling and Under-sampling of Training Data (계급불균형자료의 분류: 훈련표본 구성방법에 따른 효과)

  • 김지현;정종빈
    • The Korean Journal of Applied Statistics
    • /
    • v.17 no.3
    • /
    • pp.445-457
    • /
    • 2004
  • Given class-imbalanced data in a two-class classification problem, we often over-sample and/or under-sample the training data to make it balanced. We investigate the validity of such practice. We also study the effect of such sampling on boosting of classification trees. Through experiments on twelve real datasets, it is observed that keeping the natural distribution of the training data is the best approach if you plan to apply boosting methods to class-imbalanced data.
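
For readers unfamiliar with the terminology, the sketch below shows what over-sampling and under-sampling of an imbalanced training set look like in code; it is a generic illustration, not the paper's experimental setup.

```python
# Over-sampling duplicates minority-class rows; under-sampling discards
# majority-class rows.  Both produce a balanced training set; the paper's
# finding is that keeping the natural class ratio often works better when
# boosting is applied afterwards.
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(1000, 5))
y = (rng.random(1000) < 0.05).astype(int)     # roughly 5% minority class

minority, majority = np.where(y == 1)[0], np.where(y == 0)[0]

# Over-sampling: resample the minority class with replacement.
over_idx = np.concatenate([majority,
                           rng.choice(minority, size=len(majority), replace=True)])
# Under-sampling: keep only a random subset of the majority class.
under_idx = np.concatenate([minority,
                            rng.choice(majority, size=len(minority), replace=False)])

for name, idx in [("original", np.arange(len(y))),
                  ("over-sampled", over_idx), ("under-sampled", under_idx)]:
    print(name, "class counts:", np.bincount(y[idx]))
```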

A minimum cost sampling inspection plan for destructive testing (破壞檢査時의 最小費用 샘플링 檢査方式)

  • 趙星九;裵道善
    • Journal of the Korean Statistical Society
    • /
    • v.7 no.1
    • /
    • pp.27-43
    • /
    • 1978
  • This paper deals with the problem of obtaining a minimum-cost acceptance sampling plan for destructive testing. The cost model is constructed under the assumption that the sampling procedure takes the following form: 1) lots rejected on the first sample are screened with non-destructive testing; 2) the screening is assumed to be imperfect, and therefore, after the screening, a second sample is taken to determine whether to accept the lot or to scrap it. The usual sampling procedures for destructive testing can be regarded as special cases of the above one. Utilizing Hald's Bayesian approach, procedures for finding globally optimal sampling plans are given. However, when the lot size is large, the global plan is very difficult to obtain even with the aid of an electronic computer. Therefore a method of finding a suboptimal plan is suggested. An example with a uniform prior is also given.

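As a schematic of the kind of cost minimization involved, the sketch below searches over single sampling plans (n, c) for the one with the lowest expected cost under a discrete prior on the lot fraction defective. It is a much-simplified stand-in for the paper's two-stage Bayesian model: the imperfect non-destructive screening of rejected lots is omitted, and all costs and prior weights are illustrative placeholders.

```python
# Grid search for a minimum expected-cost single sampling plan (n, c) for
# destructive testing under a discrete prior on the fraction defective p.
# Costs and the prior are illustrative placeholders, not the paper's model.
import numpy as np
from scipy.stats import binom

N = 1000                                   # lot size
prior_p = np.array([0.01, 0.05, 0.10])     # possible fractions defective
prior_w = np.array([0.6, 0.3, 0.1])        # prior probabilities
c_test, c_escape, c_scrap = 5.0, 50.0, 0.5 # unit costs: destructive test,
                                           # escaped defective, scrapped item

def expected_cost(n, c):
    cost = 0.0
    for p, w in zip(prior_p, prior_w):
        p_acc = binom.cdf(c, n, p)                   # probability the lot is accepted
        sampling = c_test * n                        # destructive tests consumed
        accept = p_acc * c_escape * p * (N - n)      # defectives escaping to the customer
        reject = (1 - p_acc) * c_scrap * (N - n)     # remaining lot scrapped if rejected
        cost += w * (sampling + accept + reject)
    return cost

plans = [(n, c) for n in range(5, 201, 5) for c in range(0, 6)]
best = min(plans, key=lambda nc: expected_cost(*nc))
print("best plan (n, c):", best, "expected cost:", round(expected_cost(*best), 1))
```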