• Title/Summary/Keyword: Sampling accuracy


Influence Analysis of Sampling Points on Accuracy of Storage Reliability Estimation for One-shot Systems (원샷 시스템의 저장 신뢰성 추정 정확성에 대한 샘플링 시점의 영향 분석)

  • Chung, Yong H.;Oh, Bong S.;Lee, Hong C.;Park, Hee N.;Jang, Joong S.;Park, Sang C.
    • Journal of Applied Reliability
    • /
    • v.16 no.1
    • /
    • pp.32-40
    • /
    • 2016
  • Purpose: The purpose of this study is to analyze the effect of sampling points on the accuracy of storage reliability estimation for one-shot systems, assuming a Weibull distribution as the storage reliability distribution, and to propose a method for determining sampling points that increases the accuracy of reliability estimation. Methods: The Weibull distribution was divided into three sections to confirm whether its parameters can be estimated from samples taken in only one section. Quantal-response data were generated as failure data, and parameter estimation was performed on these data. Results: Reducing the sampling-point interval of section 1 increases the accuracy of reliability estimation even when sampling is restricted to section 1. Even when the total number of sampling points is reduced, shortening the sampling time interval of section 1 improves the accuracy of reliability estimation. Conclusion: The usual way to increase estimation accuracy is to increase the number of samples and sampling points, but this is difficult to apply to one-shot systems because their test cost is high. We therefore propose a method that improves the accuracy of storage reliability estimation for one-shot systems by adjusting the sampling points; by dividing the distribution into sections, the total number of sampling points can be reduced.
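The estimation step this abstract describes, fitting Weibull parameters to quantal-response data (only pass/fail observed at each inspection time) by maximum likelihood, can be sketched as follows. This is an illustrative reconstruction, not the paper's code; the inspection times, lot sizes, and true parameters are invented for the example.

```python
# Sketch: MLE of Weibull (shape, scale) from quantal-response data.
# At inspection time t_i, n_i stored units are tested and k_i have failed,
# so the likelihood is prod F(t_i)^k_i * (1 - F(t_i))^(n_i - k_i).
import numpy as np
from scipy.optimize import minimize

def weibull_cdf(t, shape, scale):
    return 1.0 - np.exp(-(t / scale) ** shape)

def fit_weibull_quantal(times, tested, failed):
    """Maximum-likelihood fit of (shape, scale) to quantal-response data."""
    t, n, k = map(np.asarray, (times, tested, failed))

    def neg_log_lik(theta):
        shape, scale = theta
        if shape <= 0 or scale <= 0:
            return np.inf  # keep the search in the valid parameter region
        p = np.clip(weibull_cdf(t, shape, scale), 1e-12, 1 - 1e-12)
        return -np.sum(k * np.log(p) + (n - k) * np.log(1 - p))

    res = minimize(neg_log_lik, x0=[1.0, float(np.mean(t))], method="Nelder-Mead")
    return res.x  # (shape, scale)

# Invented example: true shape=2, scale=10, denser sampling early in life
rng = np.random.default_rng(0)
times = np.array([1, 2, 3, 4, 5, 8, 12, 16])
tested = np.full(times.shape, 50)
failed = rng.binomial(tested, weibull_cdf(times, 2.0, 10.0))
shape_hat, scale_hat = fit_weibull_quantal(times, tested, failed)
```

Concentrating inspection times in one section of the distribution, as the paper studies, simply changes the `times` vector above.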

A Cost Effective Reference Data Sampling Algorithm Using Fractal Analysis

  • Lee, Byoung-Kil;Eo, Yang-Dam;Jeong, Jae-Joon;Kim, Yong-Il
    • ETRI Journal
    • /
    • v.23 no.3
    • /
    • pp.129-137
    • /
    • 2001
  • A random sampling or systematic sampling method is commonly used to assess the accuracy of classification results. In remote sensing, with these sampling methods, much time and tedious work are required to acquire sufficient ground truth data. So, a more effective sampling method that can represent the characteristics of the population is required. In this study, fractal analysis is adopted as an index for reference sampling. The fractal dimensions of the whole study area and the sub-regions are calculated to select sub-regions that have the most similar dimensionality to that of the whole area. Then the whole area's classification accuracy is compared with those of sub-regions, and it is verified that the accuracies of selected sub-regions are similar to that of whole area. A new kind of reference sampling method using the above procedure is proposed. The results show that it is possible to reduce sampling area and sample size, while keeping the same level of accuracy as the existing methods.
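The fractal-dimension index this abstract relies on can be sketched with a generic box-counting estimator on a binary image; this is our own minimal implementation under stated assumptions, not the authors' code.

```python
# Sketch: box-counting fractal dimension of a 2-D boolean array.
# Count occupied boxes N(s) at box sizes s; the slope of
# log N(s) versus log(1/s) estimates the fractal dimension.
import numpy as np

def box_counting_dimension(img, sizes=(1, 2, 4, 8, 16)):
    img = np.asarray(img, dtype=bool)
    counts = []
    for s in sizes:
        h = (img.shape[0] // s) * s
        w = (img.shape[1] // s) * s
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a filled square should have dimension close to 2
square = np.ones((64, 64), dtype=bool)
dim = box_counting_dimension(square)
```

Comparing this value for each sub-region against the whole study area is the selection criterion the abstract describes.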

  • PDF

Effect Analysis of Sample Size and Sampling Periods on Accuracy of Reliability Estimation Methods for One-shot Systems using Multiple Comparisons (다중비교를 이용한 샘플수와 샘플링 시점수의 원샷 시스템 신뢰도 추정방법 정확성에 대한 영향 분석)

  • Son, Young-Kap
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.15 no.4
    • /
    • pp.435-441
    • /
    • 2012
  • This paper provides simulation-based effect analysis of sample size and sampling periods on the accuracy of reliability estimation methods, using multiple comparisons with analysis of variance. The sum of squared errors in the estimated reliability measures was evaluated by applying seven estimation methods for one-shot systems to simulated quantal-response data. Analysis of variance was used to investigate how these errors change with sample size and sampling periods for each estimation method, and the effect on estimation accuracy was then analyzed using multiple comparisons based on sample size and sampling periods. From the results, an efficient way to allocate both sample size and sampling periods in reliability estimation tests of one-shot systems is proposed.
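The analysis-of-variance step the abstract describes can be illustrated with simulated estimator errors; the group means and spreads below are invented for the example, and SciPy's `f_oneway` stands in for the paper's ANOVA procedure.

```python
# Sketch: one-way ANOVA testing whether sample size affects the squared
# error of a reliability estimate (illustrative data, not the paper's study).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)
# simulated squared errors at three sample sizes: larger samples
# give smaller and less variable errors
err_n10 = rng.normal(0.10, 0.030, 30) ** 2
err_n30 = rng.normal(0.05, 0.015, 30) ** 2
err_n50 = rng.normal(0.03, 0.010, 30) ** 2
stat, pvalue = f_oneway(err_n10, err_n30, err_n50)
```

A small p-value here would justify the follow-up multiple-comparison step, which identifies which sample sizes differ from which.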

The Effect of Transformer Leakage Inductance on the Steady State Performance of Push-pull based Converter with Continuous Current

  • Chen, Qian;Zheng, Trillion Q.;Li, Yan;Shao, Tiancong
    • Journal of Power Electronics
    • /
    • v.13 no.3
    • /
    • pp.349-361
    • /
    • 2013
  • Owing to advantages such as high efficiency, continuous current, and a high stability margin, the push-pull converter with continuous current (PPCWCC) is competitive for the battery discharge regulator (BDR), which plays an important role in the power conditioning unit (PCU). Leakage inductance introduces current spikes into the otherwise low-ripple current of PPCWCCs and adds operating modes, so the steady-state performance is affected, which shows up as spikes in the low-ripple current. PPCWCCs suitable for the BDR can be separated into three types by their current-spike characteristics. Three representative topologies, IIs1, IIcb2, and Is3, are analyzed to investigate the factors governing the magnitude and duration of the spike. The equivalent current sampling method (ECSM), which eliminates the sampling time delay and achieves excellent dynamic performance, is adopted to prevent spike disturbance of the current sampling. However, ECSM reduces the sampling accuracy and telemetry accuracy because it neglects the spike. In this paper, ECSM as used in PPCWCCs is summarized, and the current sampling error is analyzed qualitatively and quantitatively, providing the foundation for offsetting it and enhancing the telemetry accuracy. Finally, the current sampling error rates of the three topologies are compared using experimental results, which verify the theoretical analysis.

Choice of Efficient Sampling Rate for GNSS Signal Generation Simulators

  • Jinseon Son;Young-Jin Song;Subin Lee;Jong-Hoon Won
    • Journal of Positioning, Navigation, and Timing
    • /
    • v.12 no.3
    • /
    • pp.237-244
    • /
    • 2023
  • A signal generation simulator is an economical and useful solution in Global Navigation Satellite System (GNSS) receiver design and testing. A software-defined radio approach is widely used both in receivers and simulators, and its flexible structure for adapting to new signals is ideally suited to testing a receiver and signal-processing algorithm in the signal design phase of a new satellite-based navigation system, before the deployment of satellites in space. The generation of highly accurate delayed sampled codes is essential for generating signals in the simulator, and the sampling rate must be chosen to satisfy constraints such as the Nyquist criterion and integer and non-commensurate properties so as not to distort the original signals. A high sampling rate increases the accuracy of the code delay but decreases the computational efficiency, and vice versa. Therefore, the selected sampling rate should be as low as possible while maintaining a given level of code-delay accuracy. This paper presents lower limits of the sampling rate for GNSS signal generation simulators. In the simulation, two distinct code generation methods, which differ in sampling position, are evaluated in terms of accuracy versus computational efficiency to show the lower limit of the sampling rate for several GNSS signals.
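The two screening constraints named in the abstract, the Nyquist criterion and a non-integer (non-commensurate) ratio between the sampling rate and the code chip rate, can be sketched as a simple filter over candidate rates. The function and tolerance below are our own illustrative assumptions, not the paper's procedure.

```python
# Sketch: screen a candidate sampling rate for a GNSS signal generator.
# An integer fs/chip-rate ratio makes sample instants repeat at the same
# code-phase offsets every chip, which distorts the generated code.
def acceptable_sampling_rate(fs_hz, bandwidth_hz, chip_rate_hz):
    if fs_hz < 2.0 * bandwidth_hz:           # Nyquist criterion
        return False
    ratio = fs_hz / chip_rate_hz
    if abs(ratio - round(ratio)) < 1e-6:     # integer ratio: commensurate
        return False
    return True

# GPS L1 C/A example: chip rate 1.023 MHz, ~2.046 MHz main-lobe bandwidth
ok = acceptable_sampling_rate(5.000e6, 2.046e6, 1.023e6)
bad_integer = acceptable_sampling_rate(4.092e6, 2.046e6, 1.023e6)  # ratio = 4
bad_nyquist = acceptable_sampling_rate(3.000e6, 2.046e6, 1.023e6)
```

The paper's contribution is the lower limit on such acceptable rates for a target code-delay accuracy; this sketch only encodes the admissibility checks.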

A New Statistical Sampling Method for Reducing Computing time of Machine Learning Algorithms (기계학습 알고리즘의 컴퓨팅시간 단축을 위한 새로운 통계적 샘플링 기법)

  • Jun, Sung-Hae
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.21 no.2
    • /
    • pp.171-177
    • /
    • 2011
  • Accuracy and computing time are major issues in machine learning. In general, the computing time for data analysis increases in proportion to the size of the given data, so a sampling approach is needed to reduce the size of the training data. However, shrinking the data also decreases the accuracy of the constructed model. To solve this problem, we propose a new statistical sampling method whose performance is similar to that of using the total data, and we suggest a rule for selecting the optimal sampling technique according to the structure of the given data. This paper presents a sampling method that reduces computing time while retaining most of the accuracy, using cluster sampling, stratified sampling, and systematic sampling. We verify the improved performance of the proposed method by comparing the accuracy and computing time of the sampled data and the total data on standard machine learning data sets.
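Of the three techniques the abstract combines, stratified sampling is the easiest to sketch. The helper below is hypothetical, not the paper's code; it simply keeps each class's population proportion in the subsample so a model trained on the smaller set sees the same class balance.

```python
# Sketch: stratified subsample of a labeled training set, preserving
# class proportions, to cut training time with limited accuracy loss.
import numpy as np

def stratified_sample(X, y, frac, seed=0):
    """Return (X_sub, y_sub) with each class kept at fraction `frac`."""
    rng = np.random.default_rng(seed)
    idx = []
    for cls in np.unique(y):
        members = np.flatnonzero(y == cls)
        take = max(1, int(round(frac * members.size)))
        idx.extend(rng.choice(members, size=take, replace=False))
    idx = np.array(idx)
    return X[idx], y[idx]

X = np.arange(1000).reshape(500, 2)
y = np.repeat([0, 1], [400, 100])      # imbalanced 80/20 classes
Xs, ys = stratified_sample(X, y, frac=0.1)
```

A 10% stratified sample of the 80/20 data above keeps 40 and 10 examples per class, the same 80/20 ratio as the full set.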

High precision Gating Algorithm for Predictive Current Control of Phase Controlled Rectifier (위상제어 정류기의 예측전류제어를 위한 새로운 고정밀 게이팅 알고리즘)

  • 정세종;송승호
    • The Transactions of the Korean Institute of Electrical Engineers B
    • /
    • v.53 no.3
    • /
    • pp.206-211
    • /
    • 2004
  • In phase-controlled rectifiers, predictive current control is known to achieve a fast response without any overshoot. A short sampling period is essential for improving the firing accuracy in conventional predictive current control. However, it is difficult to improve the firing accuracy by shortening the sampling period efficiently, because current sampling and predictive current control are carried out once every period and the on-off current control is performed by comparing the two resulting currents. To improve the firing accuracy of predictive current control, the calculated firing angle is loaded into a high-accuracy hardware timer, so exact calculation of the crossing point between the predicted and actual currents is essential. In this paper, a flow chart for the proposed firing-angle calculation algorithm is derived for the fastest current-control performance in the transient state. The performance of the proposed algorithm is verified through simulations and experiments.

A Cost Effective Reference Data Sampling Algorithm Using Fractal Analysis (프랙탈 분석을 통한 비용효과적인 기준 자료추출알고리즘에 관한 연구)

  • 김창재
    • Spatial Information Research
    • /
    • v.8 no.1
    • /
    • pp.171-182
    • /
    • 2000
  • A random sampling or systematic sampling method is commonly used to assess the accuracy of classification results. In remote sensing, these sampling methods require much time and tedious work to acquire sufficient ground truth data, so a more effective sampling method that can retain the characteristics of the population is needed. In this study, fractal analysis is adopted as an index for reference sampling. The fractal dimensions of the whole study area and of its sub-regions are calculated to choose the sub-regions whose dimensionality is most similar to that of the whole area. The whole area's classification accuracy is then compared with those of the sub-regions, and it is verified that the accuracies of the selected sub-regions are similar to that of the whole area. A new reference sampling method using this procedure is proposed. The results show that it is possible to reduce the sampling area and sample size while keeping the same accuracy as existing methods; thus, the proposed method proves cost-effective for reference data sampling.


Particle Swarm Optimization Using Adaptive Boundary Correction for Human Activity Recognition

  • Kwon, Yongjin;Heo, Seonguk;Kang, Kyuchang;Bae, Changseok
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.8 no.6
    • /
    • pp.2070-2086
    • /
    • 2014
  • As a kind of personal lifelog data, activity data have been considered among the most compelling information for understanding a user's habits and calibrating diagnoses. In this paper, we propose an algorithm for human activity recognition that is robust to the sampling rate, identifying a user's activity from the accelerations of a triaxial accelerometer in a smartphone. Although a high sampling rate is required for high accuracy, it is undesirable for actual smartphone usage, battery consumption, and storage occupancy. Activity recognition with well-known algorithms, including MLP, C4.5, and SVM, suffers a loss of accuracy when the accelerometer sampling rate decreases. We therefore start from particle swarm optimization (PSO), which tolerates declines in sampling rate relatively well, and propose PSO with an adaptive boundary correction (ABC) approach. PSO with ABC is tolerant of various sampling rates in that it identifies all data by adjusting the classification boundaries of each activity. The experimental results show that PSO with ABC tolerates changes in the accelerometer sampling rate better than PSO without ABC and other methods. In particular, PSO with ABC is 6%, 25%, and 35% better than PSO without ABC for sitting, standing, and walking, respectively, at a sampling period of 32 seconds, and it is the only algorithm that guarantees at least 80% accuracy for every activity at sampling periods of 8 seconds or less.
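For readers unfamiliar with the baseline, a minimal generic PSO (without the paper's adaptive boundary correction) minimizing a test function looks like the sketch below; the inertia and acceleration coefficients are conventional defaults, not values from the paper.

```python
# Sketch: generic particle swarm optimization. Each particle is pulled
# toward its own best position and the swarm's global best; here the
# swarm minimizes the sphere function, whose minimum is at the origin.
import numpy as np

def pso(f, dim, n_particles=30, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # positions
    v = np.zeros_like(x)                         # velocities
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()       # global best
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

best, best_val = pso(lambda p: float(np.sum(p ** 2)), dim=2)
```

The paper's ABC variant additionally adjusts per-activity classification boundaries during the search, which is what gives it its tolerance to low sampling rates.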