• Title/Summary/Keyword: Process-error model

Search Results: 1,158

Radio Propagation Model and Spatial Correlation Method-based Efficient Database Construction for Positioning Fingerprints (위치추정 전자지문기법을 위한 전파전달 모델 및 공간상관기법 기반의 효율적인 데이터베이스 생성)

  • Cho, Seong Yun;Park, Joon Goo
    • Journal of Institute of Control, Robotics and Systems / v.20 no.7 / pp.774-781 / 2014
  • This paper presents a fingerprint database construction method for WLAN RSSI (Received Signal Strength Indicator)-based indoor positioning. When RSSI is used for indoor positioning, the fingerprint method can achieve more accurate positioning than trilateration and centroid methods. However, an FD (Fingerprint Database) must be constructed before positioning, and this step is a very laborious process. To reduce this drawback of the fingerprint method, a radio propagation model-based FD construction method is presented, in which the FD can be constructed by a simulator. Experimental results show that positioning based on the constructed FD has a 3.17 m (CEP) error. This paper also presents a spatial correlation method to estimate the NLOS (Non-Line-of-Sight) error included in the FD constructed by the simulator. As a result, the NLOS error of the FD is reduced, and the performance of positioning based on the error-compensated FD is improved. The experimental results show that positioning based on the enhanced FD has a 2.58 m (CEP) error, which is reasonable performance for indoor LBS (Location-Based Services).
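
The paper's simulator itself is not reproduced here. As a minimal sketch of the general idea, the code below builds an FD from the standard log-distance path-loss model and performs nearest-neighbor fingerprint matching in signal space; the AP layout, transmit power, path-loss exponent, and grid are hypothetical assumptions, not values from the paper.

```python
import numpy as np

# Sketch: radio propagation model-based fingerprint DB construction using the
# standard log-distance path-loss model. All parameters below are illustrative
# assumptions, not values from the paper.

def rssi_at(point, ap, tx_power=-40.0, path_loss_exp=3.0):
    """Predicted RSSI (dBm) at `point` from access point `ap`."""
    d = max(float(np.linalg.norm(np.subtract(point, ap))), 1.0)
    return tx_power - 10.0 * path_loss_exp * np.log10(d)

aps = [(0.0, 0.0), (20.0, 0.0), (0.0, 20.0)]             # hypothetical AP layout
grid = [(x, y) for x in range(0, 21, 2) for y in range(0, 21, 2)]
fdb = {p: np.array([rssi_at(p, ap) for ap in aps]) for p in grid}  # the FD

def locate(measured):
    """Fingerprint positioning: nearest reference point in signal space."""
    return min(fdb, key=lambda p: float(np.linalg.norm(fdb[p] - measured)))

true_pos = (7.0, 9.0)
measured = np.array([rssi_at(true_pos, ap) for ap in aps])
print(locate(measured))   # a grid point near (7, 9)
```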

A Case Study on the Cross-Well Travel-Time Tomography Regulated by the Error in the Measurement of the First Arrival Time (초동 주시 측정 오차로 제어된 공대공 주시 토모그래피 사례연구)

  • Lee, Doo-Sung
    • Geophysics and Geophysical Exploration / v.12 no.3 / pp.233-238 / 2009
  • An inversion method regulated by the error in the measurement of the first arrival time was developed, and we conducted a feasibility study by applying the method to real cross-well seismic data. The inversion is a two-step regulation process: 1) derive the measurement error bound based on the resolution of the velocity image to be derived, and exclude the records whose picking error is larger than the error bound; 2) set the travel-time residual to zero if the residual is less than the measurement error. This process prevents trivial residuals from accumulating and contributing to the velocity-model update. A comparison of two velocity images, one obtained using all records and the other using the regulated inversion method, shows that the latter exhibits fewer numerical artifacts; it also indicates that, according to Fermat's principle, the latter image is a more feasible velocity model.
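
The two-step regulation described above can be sketched compactly. The function below assumes picked and predicted travel times plus per-record picking errors as inputs; the error bound and the sample values are illustrative, not data from the study.

```python
import numpy as np

# Sketch of the two-step regulation: exclude poorly picked records, then zero
# residuals below the measurement error so that trivial residuals do not
# accumulate into the velocity-model update. Values are illustrative.

def regulate_residuals(picked, predicted, pick_error, error_bound):
    picked, predicted, pick_error = map(np.asarray, (picked, predicted, pick_error))
    # Step 1: drop records whose picking error exceeds the bound derived
    # from the target resolution of the velocity image.
    keep = pick_error <= error_bound
    residual = picked[keep] - predicted[keep]
    # Step 2: zero out residuals smaller than the measurement error.
    residual[np.abs(residual) < pick_error[keep]] = 0.0
    return keep, residual

keep, r = regulate_residuals(
    picked=[0.1010, 0.2005, 0.3100],      # measured first-arrival times (s)
    predicted=[0.1000, 0.2000, 0.3000],   # times from the current model (s)
    pick_error=[0.0020, 0.0010, 0.0500],  # per-record picking errors (s)
    error_bound=0.0100,
)
print(keep, r)   # third record excluded; first two residuals zeroed
```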

AGAPE-ET: A Predictive Human Error Analysis Methodology for Emergency Tasks in Nuclear Power Plants (원자력발전소 비상운전 직무의 인간오류분석 및 평가 방법 AGAPE-ET의 개발)

  • Kim, Jae-Whan;Jung, Won-Dea
    • Journal of the Korean Society of Safety / v.18 no.2 / pp.104-118 / 2003
  • Conventional human reliability analysis (HRA) methodologies for probabilistic safety assessment (PSA) have been criticized for focusing on the quantification of human error probability (HEP) without a detailed analysis of the human cognitive processes, such as situation assessment and decision-making, that are critical to a successful response in emergency situations. This paper introduces a new HRA methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), focused on the qualitative error analysis of emergency tasks from the viewpoint of the performance of human cognitive functions. The AGAPE-ET method is based on a simplified cognitive model and a taxonomy of performance influencing factors (PIFs). For each cognitive function, error causes or error-likely situations are identified by considering the characteristics of that function's performance and the influencing mechanism of the PIFs on it. The overall human error analysis process is then designed considering the cognitive demand of the required task. An application to an emergency task shows that the proposed method is useful for identifying task vulnerabilities associated with the performance of emergency tasks.
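
AGAPE-ET is a qualitative procedure rather than an algorithm, but its taxonomy-driven screening can be pictured as a lookup from cognitive functions to error-likely situations. A minimal sketch follows; the taxonomy entries are invented placeholders, not the actual AGAPE-ET taxonomy.

```python
# Sketch of a taxonomy-driven screening step: cognitive functions mapped to
# error-likely situations. Entries are invented placeholders for illustration.

taxonomy = {
    "situation_assessment": ["misleading indication", "masked alarm"],
    "decision_making": ["incorrect procedure transfer", "goal conflict"],
    "response_execution": ["control selection slip", "timing error"],
}

def screen(task_functions):
    """Return candidate error-likely situations for the functions a task demands."""
    return {f: taxonomy.get(f, []) for f in task_functions}

print(screen(["situation_assessment", "decision_making"]))
```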

Development of a Multiple Linear Regression Model to Analyze Traffic Volume Error Factors in Radar Detectors

  • Kim, Do Hoon;Kim, Eung Cheol
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.39 no.5 / pp.253-263 / 2021
  • Traffic data collected using advanced equipment are highly valuable for traffic planning and efficient road operation. However, the reliability of analysis results is undermined by equipment defects, errors in the data aggregation process, and missing data. Unlike other detectors installed in each vehicle lane, radar detectors can yield different error types because they detect all traffic volume on multilane, two-way roads from a single installation outside the roadway. For the traffic data of a radar detector to be representative and reliable, the detector's error factors must be analyzed. This study presents a field survey of variables that may cause errors in traffic volume collection, targeting the points where radar detectors are installed. Video traffic data are used to determine the errors in traffic volume measured by a radar detector. The study classifies radar detector traffic errors into three types: artificial, mechanical, and complex. For complex errors, it is difficult to determine the cause because several factors act together. To address this problem, this study developed a radar detector traffic volume error analysis model using multiple linear regression. The results indicate that characteristics of the detector, road facilities, geometry, and other traffic environment factors affect errors in traffic volume detection.
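
A minimal sketch of fitting such a multiple linear regression by ordinary least squares is shown below. The predictors are hypothetical stand-ins for the detector, road-facility, geometry, and traffic-environment factors named above, and the data are synthetic, for illustration only.

```python
import numpy as np

# Sketch: multiple linear regression error model fitted by ordinary least
# squares. Predictor names are hypothetical stand-ins; data are synthetic.

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.integers(2, 7, n),        # number of lanes covered
    rng.uniform(1.0, 10.0, n),    # detector-to-roadside distance (m)
    rng.uniform(0.0, 0.3, n),     # heavy-vehicle ratio
])
beta_true = np.array([1.5, 0.8, 20.0])
y = 2.0 + X @ beta_true + rng.normal(0.0, 1.0, n)   # volume error (veh/h)

A = np.column_stack([np.ones(n), X])                # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print("intercept and coefficients:", coef.round(2))
```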

A Segmented Model with Upside-Down Bathtub Shaped Failure Intensity (Upside-Down 욕조 곡선 형태의 고장 강도를 가지는 세분화 모형)

  • Park, Woo-Jae;Kim, Sang-Boo
    • Journal of the Korean Society of Industry Convergence / v.23 no.6_2 / pp.1103-1110 / 2020
  • In this study, a segmented model with an upside-down bathtub shaped failure intensity for a repairable system is proposed, under the assumption that the failures of the repairable system occur according to a Non-Homogeneous Poisson Process. The proposed segmented model is a compound of S-PLP and LIP (Segmented Power Law Process and Logistic Intensity Process), fitting a separate failure intensity function on each segment of the time interval. Maximum likelihood estimation is used to estimate the parameters of the S-PLP and LIP model. A case study of system A shows that the S-PLP and LIP model fits better than the other models when compared by AICc (Akaike Information Criterion corrected) and MSE (Mean Squared Error). It also implies that the S-PLP and LIP model can be useful for explaining the failure intensities of similar systems.
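
A sketch of a segmented intensity in this spirit appears below: a power law process (PLP) intensity on the first segment and a log-logistic-type unimodal (upside-down bathtub shaped) intensity on the second. The exact LIP functional form used in the paper is not reproduced; the forms, the change time, and the parameter values here are assumptions.

```python
import numpy as np

# Sketch: segmented failure intensity, PLP on [0, tau) and a log-logistic-type
# hazard (unimodal for kappa > 1) on [tau, T]. Forms/values are assumptions.

def plp_intensity(t, beta, theta):
    return (beta / theta) * (t / theta) ** (beta - 1.0)

def logistic_intensity(t, lam, kappa):
    return lam * kappa * (lam * t) ** (kappa - 1.0) / (1.0 + (lam * t) ** kappa)

def segmented_intensity(t, tau, plp_par, lip_par):
    t = np.asarray(t, dtype=float)
    return np.where(t < tau,
                    plp_intensity(t, *plp_par),
                    logistic_intensity(t, *lip_par))

t = np.linspace(0.1, 100.0, 5)
print(segmented_intensity(t, tau=40.0, plp_par=(0.7, 10.0), lip_par=(0.05, 3.0)))
```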

A comparative study on learning effects based on the reliability model depending on Makeham distribution (Makeham분포에 의존한 신뢰성모형에 근거한 학습효과 특성에 관한 비교 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.9 no.5 / pp.496-502 / 2016
  • In this study, we investigated comparative NHPP software reliability models based on learning techniques that operators can apply in the process of software testing and that can serve as a software test tool in product development. The lifetime distribution applied was the Makeham distribution, based on a finite-fault NHPP. Although software error detection techniques are known in advance, we compare models that consider both an automatic error-detection factor and a learning factor gained from prior experience, so that the testing manager can set the error factors precisely. As a result, models in which the learning factor is larger than the automatic error factor could generally be well established. In this paper, reliability is characterized using the times between failures, with parameters approximated by maximum likelihood estimation; after the effectiveness of the data was checked through trend testing, model selection was carried out using the mean squared error and $R^2$. The results suggest that software operators should consider the lifetime distribution, using basic knowledge of the software, to help confirm the failure modes.
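
For a finite-fault NHPP, the mean value function takes the form m(t) = θF(t), with F the lifetime CDF. The sketch below uses a Gompertz-Makeham CDF as one plausible parameterization; the hazard form h(t) = λ + αe^(βt) and all parameter values are illustrative assumptions, not the paper's.

```python
import numpy as np

# Sketch: finite-fault NHPP mean value function m(t) = theta * F(t) with a
# Gompertz-Makeham lifetime CDF. Parameterization and values are assumptions.

def makeham_cdf(t, lam, alpha, beta):
    """CDF for hazard h(t) = lam + alpha * exp(beta * t)."""
    return 1.0 - np.exp(-lam * t - (alpha / beta) * (np.exp(beta * t) - 1.0))

def mean_value(t, theta, lam, alpha, beta):
    """Expected number of faults detected by time t."""
    return theta * makeham_cdf(t, lam, alpha, beta)

t = np.linspace(0.0, 50.0, 6)
print(mean_value(t, theta=100.0, lam=0.02, alpha=0.005, beta=0.05))
```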

The Comparative Study for the Property of Learning Effect based on Delayed Software S-Shaped Reliability Model (지연된 소프트웨어 S-형태 신뢰성모형에 의존된 학습효과 특성에 관한 비교 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal / v.11 no.6 / pp.73-80 / 2011
  • In this study, the learning effects that software managers can exploit while testing software products under development, viewed from the perspective of effective testing, were studied using NHPP software reliability models. The delayed software S-shaped reliability model was applied, based on a finite-failure NHPP. Although software error detection techniques are known in advance, we compare models that consider both an automatic error-detection factor and a learning factor gained from prior experience, so that the testing manager can set the error factors precisely. As a result, it could be confirmed that models in which the learning factor is greater than the automatic error factor are generally efficient. In the numerical example, the times between failures are used, with parameters estimated by the maximum likelihood method; after the efficiency of the data was checked through trend analysis, model selection was carried out using the mean squared error and $R^2$ (coefficient of determination).
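
The delayed S-shaped NHPP model has the well-known mean value function m(t) = a(1 - (1 + bt)e^(-bt)) and intensity λ(t) = ab²te^(-bt). A sketch with illustrative parameter values follows; a and b are not values from the paper.

```python
import numpy as np

# Sketch: delayed S-shaped NHPP mean value function and failure intensity.
# Parameter values are illustrative, not estimates from the paper.

def m(t, a, b):
    """Expected cumulative failures by time t: a * (1 - (1 + b*t) * exp(-b*t))."""
    return a * (1.0 - (1.0 + b * t) * np.exp(-b * t))

def intensity(t, a, b):
    """Failure intensity dm/dt = a * b**2 * t * exp(-b*t) (unimodal)."""
    return a * b**2 * t * np.exp(-b * t)

t = np.linspace(0.0, 100.0, 6)
print(m(t, a=80.0, b=0.1))          # cumulative expected failures
print(intensity(t, a=80.0, b=0.1))  # rises then decays: the S-shape's source
```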

The Study of NHPP Software Reliability Model from the Perspective of Learning Effects (학습 효과 기법을 이용한 NHPP 소프트웨어 신뢰도 모형에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal / v.11 no.1 / pp.25-32 / 2011
  • In this study, the learning effects that software managers can exploit in software testing and in test tools, viewed from the perspective of effective testing, were studied using NHPP software reliability models. The Weibull distribution was applied, based on a finite-failure NHPP. Although software error detection techniques are known in advance, we compare models that consider both an automatic error-detection factor and a learning factor gained from prior experience, so that the testing manager can set the error factors precisely. As a result, it could be confirmed that models in which the learning factor is greater than the automatic error factor are generally efficient. In this paper, a numerical example is given using the times between failures, with parameters estimated by the maximum likelihood method; after the efficiency of the data was checked through trend analysis, model selection was carried out using the mean squared error and $R^2$.
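
The model-selection step named here, comparing fitted mean value functions against observed cumulative failures by mean squared error and R², can be sketched directly; the observed counts and fitted values below are illustrative, not data from the paper.

```python
import numpy as np

# Sketch: model selection by MSE and R^2 between observed cumulative failure
# counts and a model's fitted mean value function. Values are illustrative.

def mse(observed, fitted):
    observed, fitted = np.asarray(observed), np.asarray(fitted)
    return np.mean((observed - fitted) ** 2)

def r_squared(observed, fitted):
    observed, fitted = np.asarray(observed), np.asarray(fitted)
    ss_res = np.sum((observed - fitted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

observed = [1, 3, 6, 10, 13, 15, 16]                 # cumulative failures
weibull_fit = [1.2, 3.1, 5.7, 9.6, 12.8, 15.2, 16.1] # hypothetical fit
print(mse(observed, weibull_fit), r_squared(observed, weibull_fit))
```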

An adaptive predictive control of distillation process using bilinear model (쌍일차 모델을 이용한 증류공정의 적응예측제어)

  • Lo, Kyun;Yeo, Yeong-Koo;Song, Hyung-Keun;Yoon, En-Sup
    • Institute of Control, Robotics and Systems Conference Proceedings / 1991.10a / pp.99-104 / 1991
  • An adaptive predictive control method for SISO and MIMO plants is proposed. In this method, future predictions of the process output based on a bilinear CARIMA model are used to calculate the control input. A classical recursive adaptation algorithm, the equation error method, is also used to decrease the uncertainty of the process model. Application to a distillation process shows that the set-point tracking and disturbance rejection performance is acceptable for industrial distillation processes.
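
The equation error adaptation named above is essentially recursive least squares. The sketch below shows one RLS update on a linear regressor; for a bilinear CARIMA model, the regressor vector would also carry input-output product terms. The plant and all values are illustrative assumptions.

```python
import numpy as np

# Sketch: equation error (recursive least squares) parameter adaptation.
# The plant y = 0.8*y_prev + 0.5*u_prev and all values are assumptions.

def rls_update(theta, P, phi, y, lam=0.99):
    """One RLS step; returns updated parameter estimate and covariance."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)        # gain vector
    err = y - (phi.T @ theta).item()             # equation (prediction) error
    theta = theta + k * err
    P = (P - k @ phi.T @ P) / lam
    return theta, P

theta = np.zeros((2, 1))                         # initial parameter estimate
P = np.eye(2) * 1000.0                           # large initial covariance
true = np.array([[0.8], [0.5]])                  # assumed "true" plant params
y_prev, u_prev = 0.0, 1.0
for _ in range(100):
    phi = np.array([y_prev, u_prev])             # regressor: past output, input
    y = (phi @ true).item()                      # noise-free plant output
    theta, P = rls_update(theta, P, phi, y)
    y_prev, u_prev = y, np.random.uniform(-1, 1) # persistently exciting input
print(theta.ravel())                             # -> approx [0.8, 0.5]
```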

Change Point Estimators in Monitoring the Parameters of an AR(1) plus an Additional Random Error Model

  • Lee, Jae-Heon;Lee, Ho-Yun
    • Journal of the Korean Data and Information Science Society / v.18 no.4 / pp.963-972 / 2007
  • When a control chart signals that a special cause is present, process engineers must initiate a search for and identification of the special cause. Knowing the time of the process change could lead to identifying the special cause more quickly and to taking the appropriate actions immediately to improve quality. In this paper, we propose the maximum likelihood estimator (MLE) of the process change point when a control chart is used to monitor the parameters of a process in which the observations can be modeled as a first-order autoregressive (AR(1)) process plus an additional random error.
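
The MLE change point idea can be sketched as an argmax over candidate change times. For brevity, the sketch below profiles the likelihood of a mean shift under independent normal errors rather than the paper's AR(1)-plus-additional-error model; the data are synthetic.

```python
import numpy as np

# Sketch: MLE of a change point via a profile log-likelihood search over
# candidate change times. Independent-normal mean-shift version; synthetic data.

def mle_change_point(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    best_tau, best_ll = None, -np.inf
    for tau in range(1, n - 1):          # change occurs after observation tau
        a, b = x[:tau], x[tau:]
        sse = np.sum((a - a.mean()) ** 2) + np.sum((b - b.mean()) ** 2)
        ll = -0.5 * n * np.log(sse / n)  # profile log-likelihood (up to const.)
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 60), rng.normal(1.5, 1.0, 40)])
print(mle_change_point(x))               # -> near 60, the true change point
```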
