• Title/Summary/Keyword: Systematic error

496 search results

A STUDY ON THE GROSS ERROR DETECTION AND ELIMINATION IN BUNDLE BLOCK ADJUSTMENT (번들블럭조정에 있어서 과대오차 탐색 및 제거에 관한 연구)

  • 유복모;조기성;신성웅
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.9 no.1
    • /
    • pp.47-54
    • /
    • 1991
  • In this study, the accuracy of three-dimensional positioning in photogrammetry was improved by the self-calibration bundle method with additional parameters, which corrects systematic error through detection and elimination of gross errors based on the updated reference variance of the observations. A comparison of the accuracy of each method showed that correcting systematic error is more effective after gross error detection, and that when the observations contain more than two gross errors, the point with the maximum correlation value is detected because of the masking effect of the least-squares adjustment.
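The gross error detection step described above can be sketched with a Baarda-style data-snooping test on standardized residuals. This is a minimal illustration under assumed unit weights and an assumed rejection threshold, not the paper's implementation:

```python
import numpy as np

def detect_gross_error(A, l, sigma0, threshold=3.29):
    """One round of Baarda-style data snooping: fit by least squares,
    then flag the observation with the largest standardized residual.
    A: design matrix (n x u), l: observation vector, sigma0: a priori
    standard deviation of an observation (unit weights assumed)."""
    x, *_ = np.linalg.lstsq(A, l, rcond=None)
    v = l - A @ x                                  # residuals
    # redundancy numbers: diagonal of R = I - A (A^T A)^-1 A^T
    r = 1.0 - np.diag(A @ np.linalg.inv(A.T @ A) @ A.T)
    w = np.abs(v) / (sigma0 * np.sqrt(r))          # standardized residuals
    i = int(np.argmax(w))
    return i, bool(w[i] > threshold)
```

Applied iteratively, the flagged observation is removed and the adjustment repeated until no statistic exceeds the threshold; systematic error correction with additional parameters then follows, which is the ordering the abstract recommends.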


A Comparison of Systematic Sampling Designs for Forest Inventory

  • Yim, Jong Su;Kleinn, Christoph;Kim, Sung Ho;Jeong, Jin-Hyun;Shin, Man Yong
    • Journal of Korean Society of Forest Science
    • /
    • v.98 no.2
    • /
    • pp.133-141
    • /
    • 2009
  • This study was conducted to support the determination of an efficient sampling design, in terms of statistical efficiency, for forest resource assessments in South Korea. For this objective, different systematic sampling designs were simulated and compared on an artificial forest population built from field sample data and satellite data in Yang-Pyeong County, Korea. Using the k-NN technique, two thematic maps (growing stock and forest cover type per pixel unit) across the test area were generated; field data (n=191) and Landsat ETM+ imagery were used as source data. Four sampling designs (systematic sampling, systematic sampling with post-stratification, systematic cluster sampling, and stratified systematic sampling) were employed as candidate designs. Error variance was computed by Monte Carlo simulation (k=1,000), and sampling error and relative efficiency were then compared. When the objective of an inventory is to obtain estimates for the entire population, systematic cluster sampling was superior to the other designs; when the objective is to obtain estimates for each sub-population, post-stratification gave better estimates. To perform post-stratification successfully, clear definitions of the strata of interest are required for each field observation unit.
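The Monte Carlo comparison described above can be illustrated in miniature: draw the sample k times, and take the standard deviation of the resulting mean estimates as the sampling error. The 1-D population below is a toy assumption; the paper used a simulated 2-D forest population:

```python
import numpy as np

def mc_sampling_error(pop, n, k=1000, seed=0):
    """Monte Carlo estimate of the sampling error (standard deviation
    of the mean estimator) for 1-D systematic sampling: n units taken
    at a fixed step from a random start, repeated k times."""
    rng = np.random.default_rng(seed)
    step = len(pop) // n
    means = [pop[rng.integers(0, step)::step][:n].mean() for _ in range(k)]
    return float(np.std(means))
```

On a population with a spatial trend, systematic sampling typically yields a smaller sampling error than simple random sampling of the same size; simulations of this kind quantify that relative efficiency.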

A systematic approach to the control logic design and PLC programming of an industrial lift (산업용 리프트의 제어로직 설계 및 PLC 프로그래밍을 위한 체계화 연구)

  • 박노억
    • Proceedings of the Korean Society of Machine Tool Engineers Conference
    • /
    • 1999.10a
    • /
    • pp.452-457
    • /
    • 1999
  • Recent control systems have shifted to PLC (Programmable Logic Controller) control. Until now, however, no systematic approach to control logic design or PLC programming has been suggested. In this study, the design process of the control logic is systematized, and a concrete procedure for each step is suggested. This systematized approach makes it convenient for a developer to implement a control system. When an error occurs in the system, the approach enables the developer to analyze the cause of the error rapidly and to amend the system according to the systematic information from that analysis. An example of a control system implementation following this approach is introduced.


Co-Simulation for Systematic and Statistical Correction of Multi-Digital-to-Analog-Convertor Systems

  • Park, Youngcheol;Yoon, Hoijin
    • Journal of electromagnetic engineering and science
    • /
    • v.17 no.1
    • /
    • pp.39-43
    • /
    • 2017
  • In this paper, a systematic and statistical calibration technique was implemented to calibrate a high-speed signal-converting system containing multiple digital-to-analog converters (DACs). The systematic error (especially the imbalance between DACs) in the current-combining network of the multi-DAC system was modeled and corrected by calculating the path coefficients for the individual DACs with wideband reference signals. Furthermore, by applying a Kalman filter to suppress noise from quantization and clock jitter, accurate coefficients with minimum noise were identified. To correct an arbitrary waveform generator with two DACs, a co-simulation platform was implemented to estimate the system degradation and the corrected performance. Simulation results showed that after correction with a 4.8-Gbps QAM signal, the signal-to-noise ratio improved by approximately 4.5 dB and the error vector magnitude improved from 4.1% to 1.12% over a 0.96-GHz bandwidth.
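The Kalman-filtered coefficient estimation can be illustrated with a scalar sketch. The model below is an assumption for illustration (one unknown path gain observed through a known reference signal with additive noise); the paper estimates wideband path coefficients per DAC:

```python
import numpy as np

def kalman_estimate_gain(ref, meas, q=1e-6, r=1e-2):
    """Scalar Kalman filter: estimate a slowly varying path gain g
    from measurements meas[k] = g * ref[k] + noise.
    q: process-noise variance, r: measurement-noise variance."""
    g, p = 0.0, 1.0                  # initial estimate and covariance
    for x, y in zip(ref, meas):
        p += q                       # predict (gain modeled as random walk)
        h = x                        # observation model: y = h * g + v
        k = p * h / (h * h * p + r)  # Kalman gain
        g += k * (y - h * g)         # update with the innovation
        p *= (1.0 - k * h)
    return g
```

The filter averages out quantization-like noise while still tracking slow drift of the coefficient, which is the role the abstract assigns to it.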

Effect of Swirl Flow Disturbance on Uncertainty of Flow Rate Measurement by Venturi (선회유동 교란에 따른 벤투리 유량측정의 불확실성 해석)

  • Lee, Jung-Ho;Yoon, Seok-Ho;Yu, Cheong-Hwan;Park, Sang-Jin;Chung, Chang-Hwan
    • The KSFM Journal of Fluid Machinery
    • /
    • v.12 no.6
    • /
    • pp.18-25
    • /
    • 2009
  • The Venturi has long been an attractive method of measuring flow rate in a variety of engineering applications, since its pressure loss is relatively small compared with other measuring methods. The current study focuses on making detailed estimates of how upstream flow disturbance affects the uncertainty of the flow rate measurement. The upstream flow disturbance was produced by 9 different swirl generators. The measurement uncertainty of the flow rate was estimated by a quantitative uncertainty analysis based on the ANSI/ASME PTC 19.1-2005 standard. The results show that the case with systematic error has a higher flow rate uncertainty than the case without it. In particular, the results with systematic error show that the uncertainty of the flow rate increased gradually with swirl flow disturbance. The uncertainty of the flow rate measurement is mainly affected by the differential pressure and the discharge coefficient. Flow disturbance can also be reduced by increasing the upstream straight length of the Venturi.
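A PTC 19.1-style combination of the kind used in such an analysis can be sketched as a root-sum-square of systematic (b) and random (s) standard uncertainties, weighted by sensitivity coefficients. The input values and the two-term model below are illustrative assumptions, not the paper's budget:

```python
import math

def flow_uncertainty(rel_b_cd, rel_s_cd, rel_b_dp, rel_s_dp, k=2.0):
    """Expanded relative uncertainty (PTC 19.1 style) of a Venturi flow
    rate Q ~ Cd * sqrt(dP): combine systematic (b) and random (s)
    relative standard uncertainties of Cd and dP by root-sum-square.
    The sensitivity of Q to dP is 1/2 because of the square root."""
    b = math.sqrt(rel_b_cd**2 + (0.5 * rel_b_dp)**2)   # systematic part
    s = math.sqrt(rel_s_cd**2 + (0.5 * rel_s_dp)**2)   # random part
    return k * math.sqrt(b**2 + s**2)                  # expanded (k=2, ~95%)
```

Because the systematic terms enter the root-sum-square directly, any bias left uncorrected raises the combined uncertainty, consistent with the abstract's comparison of the cases with and without systematic error.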

Development of accuracy for the statical inclinometer by error analysis (다축 수준기의 오차분석을 통한 측정 정밀도 향상)

  • Lee J.K.;Park J.J.;Cho N.G.
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2005.06a
    • /
    • pp.1797-1802
    • /
    • 2005
  • In this study, we improved the accuracy of the proposed two-dimensional statical inclinometer, which uses a position-sensitive detector (PSD), through an error analysis. The inclinometer consists of a laser source, a mass, an optical fiber, and a PSD. The gravity direction on the base platform of the inclinometer is changed by an unknown inclination angle, and the laser spot moves from the origin to another position on the PSD following the variation of the optical path caused by gravity. These processes enable the inclinometer to estimate the inclination angle from the distance information of the moving spot. A design methodology based on a sensitivity analysis was applied to improve measurement performance such as the full measuring range and the resolution. However, error factors remained, so we analyzed the uncertainty of the inclinometer to evaluate the systematic errors from alignment, assembly error, and so on. The performance with respect to the design objectives, measuring range and resolution, was evaluated experimentally, and the validity and feasibility of the design process were confirmed. Systematic errors were eliminated, improving the accuracy of the inclinometer, by a corrected measuring model obtained from a calibration between the inclination angle and the PSD position, used instead of the nominal measuring model. An ANOVA (analysis of variance) confirmed the effect of eliminating the systematic errors. With these methodologies, the proposed inclinometer was able to measure with a high resolution (35.14 sec) and a wide range (from $-15^{\circ}$ to $15^{\circ}$).
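The corrected measuring model mentioned above amounts to replacing the nominal model with a fit of calibration data; a minimal sketch, in which the polynomial form, degree, and calibration values are all assumptions:

```python
import numpy as np

def corrected_model(psd_pos, ref_angle, deg=2):
    """Fit a corrected measuring model angle = f(PSD position) from
    calibration data by least squares, absorbing systematic scale and
    offset errors that a nominal (idealized) model would leave in place."""
    return np.poly1d(np.polyfit(psd_pos, ref_angle, deg))
```

Evaluating the fitted polynomial at a measured PSD position then returns an angle with the calibrated systematic error removed.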


MEG Measurement Using a 40-channel SQUID System (40 채널 SQUID 시스템을 이용한 뇌자도 측정)

  • Kwon, H.;Lee, Y.H.;Kim, J.M.;Kim, K.W.;Park, Y.K.
    • Progress in Superconductivity
    • /
    • v.4 no.1
    • /
    • pp.19-26
    • /
    • 2002
  • We have previously developed a 40-channel SQUID system. An important figure of merit of an MEG system is the localization error, within which the underlying current source can be localized. With this system, we investigated the localization error in terms of the standard deviation of the coordinates of the equivalent current dipoles (ECDs) and the systematic error due to inadequate modeling. To do this, we localized single current dipoles from the tangential components of auditory evoked fields. ECDs at the N1m peak were estimated based on a locally fitted spherical conductor model. In addition, we made skull-phantom and simulation measurements to investigate the contribution of various errors to the localization error. It was found that the background noise was the main source of error and could account for the observed standard deviation. Furthermore, the systematic error introduced by modeling the head as a spherical conductor was much smaller than the standard deviation due to the background noise. We also demonstrated the performance of the system by measuring the evoked fields elicited by grammatical violations in sentence comprehension.


Dynamic Localization of a Mobile Robot Using a Rotating Sonar and a Map (회전 초음파 센서와 지도를 이용한 이동 로보트의 동적 절대 위치 추정)

  • 양해용;정학영;이장규
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1997.10a
    • /
    • pp.544-547
    • /
    • 1997
  • In this paper, we propose a dynamic localization method using a rotating sonar and a map. The proposed method is implemented with an extended Kalman filter. The state equation is based on the encoder propagation model and the encoder error model, and the measurement equation is a map-based measurement equation using a rotating sonar sensor. By exploiting the sonar beam characteristics, map-based measurements are updated while the AMR is moving continuously. By modeling and estimating the systematic errors of the differential encoder, the position is successfully estimated even in the intervals between map-based measurements. A Monte Carlo simulation shows that the proposed global position estimator achieves a position error on the order of a few millimeters, a heading error of a few tenths of a degree, and good compensation of the systematic errors of the differential encoder.
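The core idea, estimating a systematic encoder error in the filter state while intermittent map-based fixes arrive, can be sketched in one dimension. Everything below (the scalar model, noise levels, measurement schedule) is an assumption, far simpler than the paper's rotating-sonar setup:

```python
import numpy as np

def ekf_localize(odo, z, z_idx, q=1e-6, r=1e-4):
    """1-D sketch: state = [position, encoder scale error k].
    Propagation: pos += (1 + k) * odo_step; absolute (map-based)
    position fixes arrive only at the steps listed in z_idx."""
    x = np.array([0.0, 0.0])              # state: [position, scale error]
    P = np.diag([1e-4, 1e-2])             # initial covariance
    H = np.array([[1.0, 0.0]])            # map fixes observe position only
    zi = 0
    for t, d in enumerate(odo):
        F = np.array([[1.0, d], [0.0, 1.0]])        # propagation Jacobian
        x = np.array([x[0] + (1.0 + x[1]) * d, x[1]])
        P = F @ P @ F.T + np.diag([q, q])
        if zi < len(z_idx) and t == z_idx[zi]:
            y = z[zi] - x[0]                        # innovation
            S = H @ P @ H.T + r                     # innovation covariance
            K = P @ H.T / S                         # Kalman gain, (2, 1)
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
            zi += 1
    return x
```

Because the scale error is part of the state, dead reckoning between fixes uses the corrected step length, which is what lets the position stay accurate in the intervals between map-based measurements.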


Output Behavior of Build-Up Force Measuring System (BUILD-UP 힘측정 시스템의 출력거동)

  • 강대임;송후근;홍창선
    • Transactions of the Korean Society of Mechanical Engineers
    • /
    • v.19 no.9
    • /
    • pp.2194-2205
    • /
    • 1995
  • In order to reduce the systematic error of a build-up system, we have proposed a new test procedure in which all force transducers in the build-up system are rotated by $90^{\circ}$ with the base platen fixed on a force standard machine. The effects of the setting positions of the force transducers on the output of the build-up system were investigated using an orthogonal array, taking into account the parallelism of the build-up system and the bending-moment sensitivity of the force transducers. The experimental results show that the setting position of the base platen hardly affects the output of the build-up system, whereas the setting positions of the force transducers affect it strongly. They also reveal that the new test procedure effectively reduces the systematic error of a build-up system.

Task Load Analysis of KTX Operation by Using NASA-TLX Method (NASA-TLX 방법에 의한 KTX 운전 직무부하 분석)

  • Jung, Won-Dea;Ko, Jong-Hyun;Park, Jin-Kyun;Kwak, Sang-Log;Lim, Seoung-Su
    • Proceedings of the KSR Conference
    • /
    • 2006.11b
    • /
    • pp.1082-1087
    • /
    • 2006
  • Human factors still play a significant role in railway accidents, which often result from multiple causes combining hardware failures and human errors. To ensure the safety of railway operations, human error should therefore be effectively prevented and managed. Among the factors influencing human performance, task load (or task complexity) is well known as a major contributor to human error. In order to reduce the potential for human error, a systematic analysis should be undertaken to evaluate task load and to reduce it by modifying the task process and/or education and training. In this paper, we propose a systematic framework for the railway industry to perform task analysis and evaluate task load, and we apply it to KTX operational tasks. In the application study, we identified 14 generic task types of KTX operation, and we present the quantitative task load of those generic tasks as analyzed by the NASA-TLX method.
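The NASA-TLX overall workload score combines ratings on six subscales with weights obtained from 15 pairwise comparisons; a short sketch (the ratings and weights used in the example are illustrative, not the paper's data):

```python
def nasa_tlx(ratings, weights):
    """Overall NASA-TLX workload score: ratings (0-100) on the six
    subscales (mental, physical, temporal demand, performance, effort,
    frustration); weights = number of times each subscale was chosen
    in the 15 pairwise comparisons. Weighted mean = sum(r*w) / 15."""
    assert len(ratings) == len(weights) == 6
    assert sum(weights) == 15
    return sum(r * w for r, w in zip(ratings, weights)) / 15.0
```

Computing this score per generic task type yields the kind of quantitative task load comparison the abstract reports.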
