• Title/Summary/Keyword: sampling model


Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

  • Wang, Zhi-Yong;Kang, Dae-Ki
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.11 no.4
    • /
    • pp.37-42
    • /
    • 2019
  • In this paper, we explore the details of three classic data augmentation methods and two oversampling methods based on generative models. The three classic data augmentation methods are random sampling (RANDOM), the Synthetic Minority Over-sampling Technique (SMOTE), and Adaptive Synthetic Sampling (ADASYN). The two generative-model-based oversampling methods are the Conditional Generative Adversarial Network (CGAN) and the Wasserstein Generative Adversarial Network (WGAN). In imbalanced data, the instances are divided into a majority class, which occupies most of the training set, and a minority class, which contains only a few instances. Generative models have the advantage of producing more plausible samples that follow the distribution of the minority class. We adopt CGAN to compare its data augmentation performance with that of the other methods. The experimental results show that WGAN-based oversampling is more stable than the other approaches (RANDOM, SMOTE, ADASYN, and CGAN), even with very limited training data. However, when the imbalance ratio is too small, the generative-model-based approaches cannot match the performance of the conventional data augmentation techniques. These results suggest a direction for future research.
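
The core SMOTE step compared above — pick a minority instance, pick one of its nearest minority neighbours, and interpolate a synthetic point between them — can be sketched in a few lines. This is an illustrative toy version, not the paper's implementation (nor the reference one in the imbalanced-learn library); the function name `smote_sample` and the toy points are invented for the example.

```python
import random

def smote_sample(minority, k=3, n_new=4, seed=0):
    """Core SMOTE step: pick a minority point, pick one of its k nearest
    minority neighbours, and interpolate a synthetic point between them."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted(                  # k nearest by squared distance
            (p for p in minority if p is not x),
            key=lambda p: sum((a - b) ** 2 for a, b in zip(x, p)))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()                    # interpolation factor in [0, 1)
        synthetic.append(tuple(a + gap * (b - a) for a, b in zip(x, nb)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.3)]
print(smote_sample(minority))   # four synthetic minority points
```

Because each synthetic point lies on a segment between two minority instances, SMOTE fills in the minority region rather than duplicating points as RANDOM does; ADASYN differs mainly in biasing the choice of `x` toward harder examples.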

CMAC Learning Controller Implementation With Multiple Sampling Rate: An Inverted Pendulum Example (다중 샘플링 타임을 갖는 CMAC 학습 제어기 실현: 역진자 제어)

  • Lee, Byoung-Soo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.4
    • /
    • pp.279-285
    • /
    • 2007
  • The objective of this research is twofold. The first is to design and propose a stable and robust learning control algorithm: a CMAC learning controller consisting of a model-based controller, such as LQR or PID, serving as a reference control, together with a CMAC. The second is to implement the reference control and the CMAC at two different sampling rates. Generally, a conventional controller is designed from a mathematical plant model. However, the increasing complexity of plants and the accuracy demanded of mathematical models nearly prohibit the conventional controller design approach. To avoid the inherent complexity and unavoidable uncertainty of modeling, biomimetic methods have been developed. One such attempt is the Cerebellar Model Articulation Controller (CMAC) developed by Albus. CMAC has two main disadvantages. The first is a memory requirement that grows with the number of input variables and with the accuracy demanded; this can be addressed with cheap memory thanks to recent advances in memory technology. The second is a demand for processing power, which can be an obstacle especially when CMAC must run in real time. To overcome these disadvantages, we propose a CMAC learning controller with multiple sampling rates: the conventional controller serving as the reference to the CMAC runs at a sufficiently high sampling rate, while the CMAC runs in the processor's idle time. To show the efficiency of the proposed method, an inverted pendulum controller is designed and implemented. We also demonstrate its potential as an industrial control solution and its robustness against modeling uncertainty.
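
The CMAC function approximator at the heart of the controller above can be sketched as overlapping tilings whose active cells are summed and trained by an LMS rule. This is a minimal 1-D illustration of the CMAC idea only, not the paper's controller; the class name and constants are invented, and the multirate scheduling and the reference controller are omitted.

```python
import math

class CMAC1D:
    """Minimal 1-D CMAC: several offset tilings quantize the input, and the
    output is the sum of the weights of the one active cell per tiling."""
    def __init__(self, n_tilings=8, n_cells=32, lo=0.0, hi=1.0):
        self.n_tilings, self.n_cells = n_tilings, n_cells
        self.lo, self.hi = lo, hi
        self.w = [[0.0] * (n_cells + 1) for _ in range(n_tilings)]

    def _active(self, x):
        s = (x - self.lo) / (self.hi - self.lo) * self.n_cells
        # tiling t is offset by t/n_tilings of one cell width
        return [(t, int(s + t / self.n_tilings)) for t in range(self.n_tilings)]

    def predict(self, x):
        return sum(self.w[t][c] for t, c in self._active(x))

    def train(self, x, target, lr=0.3):
        err = target - self.predict(x)
        for t, c in self._active(x):           # LMS update, credit shared
            self.w[t][c] += lr * err / self.n_tilings

net = CMAC1D()
for _ in range(200):                           # learn one period of a sine
    for i in range(20):
        net.train(i / 20, math.sin(2 * math.pi * i / 20))
print(round(net.predict(0.25), 3))             # close to sin(pi/2) = 1
```

The table lookup makes prediction cheap, which is why the processing-power bottleneck mentioned above lies mainly in the training updates that this scheme schedules into the processor's idle time.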

Assessing the Impact of Sampling Intensity on Land Use and Land Cover Estimation Using High-Resolution Aerial Images and Deep Learning Algorithms (고해상도 항공 영상과 딥러닝 알고리즘을 이용한 표본강도에 따른 토지이용 및 토지피복 면적 추정)

  • Yong-Kyu Lee;Woo-Dam Sim;Jung-Soo Lee
    • Journal of Korean Society of Forest Science
    • /
    • v.112 no.3
    • /
    • pp.267-279
    • /
    • 2023
  • This research assessed the feasibility of using high-resolution aerial images and deep learning algorithms for estimating land-use and land-cover areas at the Approach 3 level, as outlined by the Intergovernmental Panel on Climate Change. Results from different sampling densities of high-resolution (51 cm) aerial images were compared with the land-cover map provided by the Ministry of Environment and analyzed to estimate the accuracy of the land-use and land-cover areas. Transfer learning was applied to the VGG16 architecture for the deep learning model, and sampling densities of 4 × 4 km, 2 × 4 km, 2 × 2 km, 1 × 2 km, 1 × 1 km, 500 × 500 m, and 250 × 250 m were used for estimating and evaluating the areas. The overall accuracy and kappa coefficient of the deep learning model were 91.1% and 88.8%, respectively. The F-scores were >90% for all categories except pasture, indicating the superior accuracy of the model. Chi-square tests showed no significant difference in the area ratios relative to the land-cover map provided by the Ministry of Environment for all sampling densities except 4 × 4 km at a significance level of p = 0.1. As the sampling density increased, the standard error and relative efficiency decreased. The relative standard error decreased to ≤15% for all land-cover categories at the 1 × 1 km sampling density. These results indicate that a sampling density finer than 1 × 1 km is appropriate for estimating land-cover area at the local level.
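
The sampling-based area estimation above reduces to estimating class proportions from labelled sample points and scaling by the total area, with the standard error shrinking as the sample grows. A minimal sketch, with invented function names and toy numbers rather than the paper's data:

```python
import math

def area_estimate(labels, total_area):
    """Estimate each class's area and its standard error from point samples,
    treating the dot grid as a simple random sample of size n."""
    n = len(labels)
    out = {}
    for cls in set(labels):
        p = labels.count(cls) / n
        se = math.sqrt(p * (1 - p) / n)        # SE of the class proportion
        out[cls] = (p * total_area, se * total_area)
    return out

# toy sample: 100 grid points labelled by a classifier over 2500 ha
est = area_estimate(["forest"] * 60 + ["crop"] * 30 + ["urban"] * 10, 2500.0)
print(est["forest"])   # (area estimate in ha, standard error in ha)
```

Since the standard error scales as 1/√n, quadrupling the sampling density roughly halves it, which matches the pattern of decreasing standard errors at finer densities reported above.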

Development of an Integrated Variable Sampling Interval Engineering Process Control & Statistical Process Control System (가변 샘플링간격 EPC/SPC 결합시스템의 개발)

  • Lee, Sung-Jae;Seo, Sun-Keun
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.32 no.3
    • /
    • pp.210-218
    • /
    • 2006
  • Traditional statistical process control (SPC), applied to discrete-part industries in the form of control charts, can detect and eliminate assignable causes through process monitoring. On the other hand, engineering process control (EPC), applied to the process industry in the form of feedback control, can keep the process output on target through continual adjustment of an input variable. This study presents controlling and monitoring rules that adopt a variable sampling interval (VSI), changing the sampling interval in a predetermined fashion according to the predicted process level, under integrated EPC and SPC systems. Twelve rules, classified by EPC scheme (MMSE, constrained PI, and bounded or deadband adjustment policies) and by the type of sampling interval, combined with the EWMA chart of SPC, are proposed under an IMA(1,1) disturbance model and a zero-order (responsive) dynamic system. The properties of the twelve control rules under three patterns of process change (sudden shift, drift, and random shift) are evaluated and discussed through simulation, and control rules for integrated VSI EPC/SPC systems are recommended.
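
The integrated scheme above can be illustrated with a toy simulation: an IMA(1,1) disturbance is cancelled by its MMSE one-step forecast (the EPC side), and the adjusted output is monitored by an EWMA chart whose next sampling interval is shortened whenever the statistic leaves a central warning zone (the VSI/SPC side). All names and constants here are illustrative, not the paper's twelve rules:

```python
import math, random

def vsi_epc_spc(n=4000, theta=0.6, lam=0.2, warn=1.0, seed=1):
    """Simulate an IMA(1,1) disturbance d_t = d_{t-1} + a_t - theta*a_{t-1},
    cancel its MMSE one-step forecast (the EPC adjustment), and monitor the
    adjusted output with an EWMA chart; the VSI rule asks for the short
    sampling interval whenever the statistic leaves the central warning zone."""
    rng = random.Random(seed)
    sigma_z = math.sqrt(lam / (2 - lam))   # asymptotic EWMA std (sigma_a = 1)
    d = a_prev = z = 0.0
    short = 0
    for _ in range(n):
        a = rng.gauss(0.0, 1.0)
        forecast = d - theta * a_prev      # MMSE forecast of the next level
        d = d + a - theta * a_prev         # disturbance evolves
        e = d - forecast                   # adjusted output: e_t = a_t (white)
        z = lam * e + (1 - lam) * z        # EWMA monitoring statistic
        if abs(z) > warn * sigma_z:
            short += 1                     # schedule the short interval next
        a_prev = a
    return short / n

print(round(vsi_epc_spc(), 2))  # in-control fraction of short intervals
```

Under MMSE control of a responsive system the adjusted output is white noise, so the fraction of short intervals in control is just the exceedance rate of the warning zone; a process shift inflates the EWMA statistic and drives sampling toward the short interval, which is the point of the VSI rule.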

Distributing data in Virtual-reality: factors influencing purchase intention of cutting tools

  • JITKUSOLRUNGRUENG, Nitichai;VONGURAI, Rawin
    • Journal of Distribution Science
    • /
    • v.19 no.9
    • /
    • pp.41-52
    • /
    • 2021
  • Purpose: Virtual reality is a unique technology for distributing data and improving users' understanding of complex products. The objective of this research is to investigate the impact of virtual reality on real-world purchase intention for automotive cutting tools at exhibitions in Thailand. Hence, the research framework was constructed from telepresence, perception narrative, authenticity, trustworthiness, functional value, aesthetics, and purchase intention. Research design, data and methodology: Samples were collected from 500 visitors to the two leading metalworking exhibitions. A mixed sampling approach was applied, using the non-probability methods of purposive (judgmental) sampling, quota sampling, and convenience sampling, respectively, to reach the target sample. Confirmatory Factor Analysis (CFA) and Structural Equation Modeling (SEM) were used to confirm the goodness-of-fit of the model and to test the hypotheses. Results: The results indicate that authenticity, functional value, and trustworthiness induced higher experiential value toward purchase intention. These variables are stimulated by telepresence and perception narrative in the VR experience. Conclusions: Consumers' purchase intention following a VR experience with engineering cutting tools relies on their sense of authenticity, trustworthiness, and functional value. Hence, marketing practitioners in automotive companies are encouraged to develop VR experiences that focus on these significant factors to enhance consumers' purchase intention.

Multirate LQG Control Based on the State Expansion (상태 공간 확장에 의한 멀티레이트 LQG 제어)

  • 이진우;오준호
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.5 no.2
    • /
    • pp.131-138
    • /
    • 1999
  • In discrete-time controlled systems, the sampling time is one of the critical parameters for control performance. It is useful to employ different sampling rates in the system, considering the feasibility of the measuring or actuating systems. Systems with different sampling rates in their input and output channels are called multirate systems. Even though the original continuous-time system is time-invariant, it is realized as a time-varying state equation that depends on the multirate sampling mechanism. By augmenting the inputs and outputs over one period, this time-varying system equation can be converted into a time-invariant one. In this paper, an alternative time-invariant model is proposed, and the design method and stability of the LQG (Linear Quadratic Gaussian) control scheme for this realization are presented. The realization is flexible with respect to sampling-rate variations, the closed-loop system is shown to be asymptotically stable even over the inter-sampling intervals, and it requires less computation in the on-line control loop than previous time-invariant realizations.
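
The augmentation ("lifting") step described above — stacking the inputs applied during one period so that a periodically time-varying multirate model becomes time-invariant — can be sketched for the input side as follows. This is a generic illustration assuming a single input channel updated every fast step; the function names are invented and the LQG design itself is omitted:

```python
def matmul(X, Y):
    """Plain-Python matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def lift(A, B, p):
    """Lift x_{k+1} = A x_k + B u_k over a period of p fast steps: the state
    map becomes A^p and the p inputs applied within the period are stacked,
    so the periodically time-varying multirate model becomes time-invariant."""
    n = len(A)
    powers = [[[float(i == j) for j in range(n)] for i in range(n)]]  # A^0 = I
    for _ in range(p):
        powers.append(matmul(A, powers[-1]))
    # the input at step i acts through A^(p-1-i): Blift = [A^(p-1)B ... AB B]
    blocks = [matmul(powers[p - 1 - i], B) for i in range(p)]
    Blift = [sum((blk[r] for blk in blocks), []) for r in range(n)]
    return powers[p], Blift

# double integrator sampled fast, lifted over a period of two steps
A = [[1.0, 0.1], [0.0, 1.0]]
B = [[0.0], [0.1]]
Ap, Blift = lift(A, B, 2)
print(Ap)      # A^2
print(Blift)   # columns [A B | B]
```

A standard LQG design can then be run on the constant pair `(Ap, Blift)`, which is what makes the time-invariant treatment of a multirate loop possible.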


A Bayesian inference for fixed effect panel probit model

  • Lee, Seung-Chun
    • Communications for Statistical Applications and Methods
    • /
    • v.23 no.2
    • /
    • pp.179-187
    • /
    • 2016
  • The fixed effects panel probit model faces the "incidental parameters problem" because the number of parameters to be estimated increases with the sample size. Maximum likelihood estimation fails to give a consistent estimator of the slope parameter. Unlike in the panel regression model, it is not feasible to find an orthogonal reparameterization of the fixed effects that yields a consistent estimator. In this note, a hierarchical Bayesian model is proposed. The model is essentially equivalent to the frequentist random effects model, but the individual-specific effects are estimable with the help of Gibbs sampling. The Bayesian estimator is shown to reduce the small-sample bias. The maximum likelihood estimator in the random effects model is also efficient, which contradicts the conclusion of Green (2004).
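
Gibbs sampling for probit-type models typically relies on Albert and Chib's latent-variable augmentation. As an illustration of that mechanism only — a single slope with a flat prior, not the paper's hierarchical panel model; all names and data are invented:

```python
import random, statistics

N = statistics.NormalDist()

def rtruncnorm(mu, lower, upper, rng):
    """Draw from N(mu, 1) truncated to (lower, upper) via the inverse CDF."""
    lo, hi = N.cdf(lower - mu), N.cdf(upper - mu)
    return mu + N.inv_cdf(lo + rng.random() * (hi - lo))

def probit_gibbs(x, y, iters=600, burn=100, seed=7):
    """Albert-Chib Gibbs sampler for a probit model with one slope and a flat
    prior: latent z_i ~ N(beta*x_i, 1), observed y_i = 1{z_i > 0}; alternate
    sampling z | beta (truncated normal) and beta | z (normal)."""
    rng = random.Random(seed)
    sxx = sum(xi * xi for xi in x)
    beta, draws = 0.0, []
    for it in range(iters):
        z = [rtruncnorm(beta * xi, 0.0, 40.0, rng) if yi
             else rtruncnorm(beta * xi, -40.0, 0.0, rng)
             for xi, yi in zip(x, y)]
        mean = sum(xi * zi for xi, zi in zip(x, z)) / sxx
        beta = rng.gauss(mean, (1.0 / sxx) ** 0.5)
        if it >= burn:
            draws.append(beta)
    return sum(draws) / len(draws)

gen = random.Random(0)                       # toy data with true slope 1.0
x = [gen.uniform(-2.0, 2.0) for _ in range(300)]
y = [1 if gen.gauss(xi, 1.0) > 0 else 0 for xi in x]
b_hat = probit_gibbs(x, y)
print(round(b_hat, 2))                       # posterior mean, near 1.0
```

Once the latent `z` are drawn, the probit model becomes an ordinary normal regression, which is the same trick that makes the individual-specific effects estimable in the hierarchical model above.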

Determination of Sampling Points Based on Curvature distribution (곡률 기반의 측정점 결정 알고리즘 개발)

  • 박현풍;손석배;이관행
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2000.11a
    • /
    • pp.295-298
    • /
    • 2000
  • In this research, a novel sampling strategy for a CMM to inspect freeform surfaces is proposed. Unlike for primitive surfaces, it is not easy to determine the number and locations of sampling points for inspecting freeform surfaces. Since a CMM measures more slowly than optical measuring devices, it is important to optimize the number and locations of sampling points in the inspection process; when a complete inspection of a surface is required, this becomes even more critical. Among the various factors that cause shape errors in a final product, the curvature characteristic is essential because of effects such as stair-step errors in rapid prototyping and interpolation errors in NC tool-path generation. Shape errors are defined in terms of the average and standard deviation of the differences between the original model and the produced part. The proposed algorithm determines the locations of sampling points by analyzing the curvature distribution of a given surface. Based on the curvature distribution, the surface is divided into several sub-areas, and in each sub-area the sampling points are placed as far apart as possible. The optimal number of sub-areas is determined by estimating the average curvature. Finally, the proposed method is applied to several surfaces with shape errors for verification.
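
The curvature-driven placement described above can be illustrated in one dimension: compute the curvature along a profile, build its cumulative distribution, and place sampling points at equal quantiles so that high-curvature regions are sampled more densely. A toy sketch with invented names, not the paper's CMM algorithm:

```python
import math

def curvature_sampling(f, a, b, n_points=10, grid=200, h=1e-4):
    """Place sampling points on y = f(x) by inverting the cumulative
    curvature distribution, so high-curvature regions get more points."""
    def kappa(x):
        d1 = (f(x + h) - f(x - h)) / (2 * h)             # numerical f'
        d2 = (f(x + h) - 2 * f(x) + f(x - h)) / (h * h)  # numerical f''
        return abs(d2) / (1 + d1 * d1) ** 1.5
    xs = [a + (b - a) * i / grid for i in range(grid + 1)]
    cum = [0.0]                       # cumulative curvature, with a small
    for x0, x1 in zip(xs, xs[1:]):    # floor so flat spans still get samples
        cum.append(cum[-1] + (kappa((x0 + x1) / 2) + 1e-3) * (x1 - x0))
    pts = []
    for k in range(n_points):         # equal quantiles of the cumulative mass
        target = cum[-1] * (k + 0.5) / n_points
        i = next(j for j, c in enumerate(cum) if c >= target)
        pts.append(xs[i])
    return pts

# a profile whose curvature concentrates near x = 0 gets dense sampling there
pts = curvature_sampling(lambda x: math.sqrt(x * x + 0.01), -1.0, 1.0)
near = sum(1 for p in pts if abs(p) < 0.25)
print(near, "of", len(pts), "points lie in the high-curvature middle")
```

Spreading equal quantiles of cumulative curvature is one way to realize "divide into sub-areas by curvature, then spread points within each"; on a real surface the same idea is applied over two parameters.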


A Study on Estimation of Parameters in Bivariate Exponential Distribution

  • Kim, Jae Joo;Park, Byung-Gu
    • Journal of Korean Society for Quality Management
    • /
    • v.15 no.1
    • /
    • pp.20-32
    • /
    • 1987
  • Estimation of the parameters of the bivariate exponential (BVE) model of Marshall and Olkin (1967) is investigated for the cases of complete sampling and time-truncated parallel sampling. Maximum likelihood estimators, method-of-moments estimators, and Bayes estimators for the parameters of the BVE model are obtained and compared with each other. A Monte Carlo simulation study for moderate-sized samples indicates that the Bayes estimators of the parameters perform better than the maximum likelihood and method-of-moments estimators.
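
The BVE model and a moment-style estimator can be sketched as follows: simulate Marshall and Olkin's common-shock construction, then recover the rates from the marginal means and the fraction of ties. This is a generic illustration for complete sampling only; names and parameter values are invented:

```python
import random

def rbve(l1, l2, l3, n, seed=3):
    """Marshall-Olkin BVE via the shock construction: X = min(U1, U3),
    Y = min(U2, U3) with independent U_i ~ Exp(l_i); the shared shock U3
    puts positive probability on ties X = Y."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u1, u2, u3 = (rng.expovariate(l) for l in (l1, l2, l3))
        out.append((min(u1, u3), min(u2, u3)))
    return out

def moment_estimates(sample):
    """Moment-style estimates: the marginals are Exp(l1+l3) and Exp(l2+l3),
    and the tie fraction estimates l3 / (l1 + l2 + l3)."""
    n = len(sample)
    r13 = n / sum(x for x, _ in sample)      # 1 / mean(X) estimates l1 + l3
    r23 = n / sum(y for _, y in sample)      # 1 / mean(Y) estimates l2 + l3
    p = sum(1 for x, y in sample if x == y) / n
    l3 = p * (r13 + r23) / (1 + p)           # from p = l3 / (r13 + r23 - l3)
    return r13 - l3, r23 - l3, l3

sample = rbve(1.0, 2.0, 0.5, n=20000)
lam_hat = moment_estimates(sample)
print(tuple(round(v, 2) for v in lam_hat))   # near the true (1.0, 2.0, 0.5)
```

The positive tie probability is the distinctive feature of the Marshall-Olkin model, and it is exactly what the third moment-style equation exploits.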


Semiparametric Bayesian Regression Model for Multiple Event Time Data

  • Kim, Yongdai
    • Journal of the Korean Statistical Society
    • /
    • v.31 no.4
    • /
    • pp.509-518
    • /
    • 2002
  • This paper is concerned with a semiparametric Bayesian analysis of the proportional intensity regression model of the Poisson process for multiple event time data. A nonparametric prior distribution is placed on the baseline cumulative intensity function, and a standard parametric prior distribution is given to the regression parameter. We also allow heterogeneity among the intensity processes of different subjects by using unobserved random frailty components. A Gibbs sampling approach with the Metropolis-Hastings algorithm is used to explore the posterior distributions. Finally, the results are applied to a real data set.
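
The posterior exploration described above can be illustrated, in a heavily simplified form, with a random-walk Metropolis step for the regression parameter alone, assuming a known unit baseline intensity and no frailties, so the event counts are Poisson with a log-linear mean. All names and data are invented; the paper's nonparametric baseline and the full Gibbs cycle are omitted:

```python
import math, random

def rpois(lam, rng):
    """Knuth's Poisson sampler (adequate for small lam)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def mh_beta(x, counts, iters=3000, burn=500, step=0.1, seed=5):
    """Random-walk Metropolis for beta in N_i ~ Poisson(exp(x_i * beta)),
    i.e. a proportional-intensity model with unit baseline over the
    observation window and a flat prior on beta."""
    rng = random.Random(seed)
    def loglik(b):
        return sum(c * xi * b - math.exp(xi * b) for xi, c in zip(x, counts))
    beta, cur, draws = 0.0, None, []
    cur = loglik(beta)
    for it in range(iters):
        prop = beta + rng.gauss(0.0, step)
        new = loglik(prop)
        if new >= cur or rng.random() < math.exp(new - cur):
            beta, cur = prop, new            # accept the proposal
        if it >= burn:
            draws.append(beta)
    return sum(draws) / len(draws)

gen = random.Random(4)
x = [gen.uniform(-1.0, 1.0) for _ in range(200)]
counts = [rpois(math.exp(0.8 * xi), gen) for xi in x]   # true beta = 0.8
b_hat = mh_beta(x, counts)
print(round(b_hat, 2))
```

In the full model, a Metropolis-Hastings step like this one for the regression parameter would alternate with Gibbs draws of the baseline increments and the frailty components.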