• Title/Summary/Keyword: sampling strategy


A Study on Workers' Exposure to Organic Solvents in Petroleum Refinery (원유정제업 작업자들의 유기용제에 대한 노출 평가)

  • Choi, Sang Jun; Paik, Nam Won; Kim, Jin Kyoung; Choi, Yeon Ki; Jung, Hyun Hee; Heo, Sung-Min
    • Journal of Korean Society of Occupational and Environmental Hygiene / v.15 no.1 / pp.27-35 / 2005
  • This study was carried out to evaluate the characteristics of petroleum refinery workers' exposure to organic solvents. Exposure was assessed with two strategies: full-shift, long-term personal sampling (TWA-P) and task-based, short-term personal sampling (STEL-P). The organic solvents to which workers can be exposed range widely from C3 to C12; this study focused on 11 of them, including benzene, selected for their toxicity and concentration levels. Comparing the two sets of results, STEL-P showed significantly (p<0.001) higher exposure levels than TWA-P. Because the potential risk index calculated for benzene is 16, benzene should be given the highest control priority in the petroleum refinery industry. The tasks with the highest benzene exposure levels were de-watering (AM 99.8 ppm), draining (AM 19.6 ppm), sampling (AM 16.2 ppm), and manual gauging (AM 15.02 ppm). Petroleum refinery workers' exposure patterns to organic solvents differ by task, and some tasks carry a high risk of brief, extreme exposures. The traditional 8-hour TWA sampling strategy may therefore underestimate the exposure of petroleum refinery workers.
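A minimal sketch of why task-based peaks vanish in a full-shift average: a short high-exposure task contributes little to the 8-hour TWA even when it far exceeds the limit. The risk-index formula (task mean divided by the OEL) and the OEL value are assumptions for illustration, not taken from the paper.

```python
# Compute an 8-hour TWA from task-based short-term samples and compare
# it with an assumed task-level risk index (task AM / OEL).

def twa_8h(samples):
    """samples: list of (concentration_ppm, duration_min); rest of shift = 0."""
    return sum(c * t for c, t in samples) / 480.0

BENZENE_OEL_PPM = 1.0  # hypothetical occupational exposure limit

tasks = [(16.2, 15), (19.6, 10), (0.3, 455)]  # illustrative task profile
twa = twa_8h(tasks)
risk_index = max(c for c, _ in tasks) / BENZENE_OEL_PPM  # assumed definition
print(f"8-h TWA = {twa:.2f} ppm, peak-task risk index = {risk_index:.0f}")
```

Even with task peaks near 20 ppm, the full-shift TWA here lands near 1 ppm, which is the underestimation the abstract warns about.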

Sequential Feasible Domain Sampling of Kriging Metamodel by Using Penalty Function (벌칙함수 기반 크리깅메타모델의 순차적 유용영역 실험계획)

  • Lee Tae-Hee; Seong Jun-Yeob; Jung Jae-Jun
    • Transactions of the Korean Society of Mechanical Engineers A / v.30 no.6 s.249 / pp.691-697 / 2006
  • Metamodels, models of models, have been widely used to improve the efficiency of optimization in engineering. However, in a constrained optimization problem, global metamodels of the constraints require good accuracy in the neighborhood of the optimum. To satisfy this requirement, more sampling points must be located on the boundary and in the interior of the feasible region. Therefore, a new sampling strategy capable of identifying the feasible domain is needed to select sampling points for constraint metamodels. In this research, we suggest sequential feasible domain sampling, which uses a penalty function method to place sampling points that are likely to lie within the feasible domain. To demonstrate the merit of feasible domain sampling, we compare the optimum results from the proposed method with those from conventional global space-filling sampling on a variety of optimization problems, and discuss the advantages of feasible domain sampling further.
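A minimal sketch of the penalty-guided idea, assuming a quadratic exterior penalty on predicted constraint violations. The `predict_constraints` surrogate and the penalty weight are illustrative stand-ins, not the paper's exact Kriging formulation.

```python
# Rank space-filling candidates by a penalty on predicted constraint
# violation, and keep the least-penalized points as the next samples.
import numpy as np

def predict_constraints(x):
    """Stand-in for a Kriging constraint surrogate; g(x) <= 0 is feasible."""
    return np.array([x[0] ** 2 + x[1] ** 2 - 1.0])  # unit-disk constraint

def penalty(x, weight=10.0):
    g = predict_constraints(x)
    return weight * np.sum(np.maximum(g, 0.0) ** 2)

rng = np.random.default_rng(0)
candidates = rng.uniform(-2.0, 2.0, size=(200, 2))  # space-filling candidates
scores = np.array([penalty(x) for x in candidates])
new_points = candidates[np.argsort(scores)[:10]]    # likely-feasible picks
print(new_points)
```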

A Loss Minimization Control Strategy for Direct Torque Controlled Interior Permanent Magnet Synchronous Motors

  • Siahbalaee, Jafar; Vaez-Zadeh, Sadegh; Tahami, Farzad
    • Journal of Power Electronics / v.9 no.6 / pp.940-948 / 2009
  • The main objective of this paper is to improve the efficiency of permanent magnet synchronous motors (PMSMs) through an improved direct torque control (DTC) strategy. The basic idea behind the proposed strategy is to predict, at each sampling period, the impact of a small change in the stator flux amplitude on the electrical loss before the change is applied. Accordingly, at every sampling time, a voltage vector is predicted and applied to the machine to realize the flux change. Motor drive simulations confirm a significant improvement in efficiency as well as a very fast and smooth response under the proposed strategy.
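A minimal sketch of the predict-then-apply idea: at each sampling period, test a small flux perturbation against a loss model and keep the change only if the predicted loss drops. The loss model below is a toy stand-in, not the paper's PMSM loss formulation.

```python
# Perturb the stator flux reference each sampling period; apply only
# changes that a loss model predicts will reduce electrical loss.

def predicted_loss(flux, torque):
    """Toy model: copper loss falls and iron loss rises with flux."""
    return (torque / flux) ** 2 * 0.05 + flux ** 2 * 0.8

def update_flux_reference(flux_ref, torque, delta=0.01):
    base = predicted_loss(flux_ref, torque)
    for candidate in (flux_ref + delta, flux_ref - delta):
        if predicted_loss(candidate, torque) < base:
            return candidate  # keep the loss-reducing change
    return flux_ref           # otherwise hold the current reference

flux = 0.9
for step in range(50):        # one update per DTC sampling period
    flux = update_flux_reference(flux, torque=2.0)
print(f"converged flux reference: {flux:.3f}")
```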

Study on the Effect of Training Data Sampling Strategy on the Accuracy of the Landslide Susceptibility Analysis Using Random Forest Method (Random Forest 기법을 이용한 산사태 취약성 평가 시 훈련 데이터 선택이 결과 정확도에 미치는 영향)

  • Kang, Kyoung-Hee; Park, Hyuck-Jin
    • Economic and Environmental Geology / v.52 no.2 / pp.199-212 / 2019
  • In machine learning, the sampling strategy used to build the training data affects the performance of the prediction model, including its generalization ability as well as its prediction accuracy. In landslide susceptibility analysis in particular, data sampling is an essential step in assembling the training data because non-landslide points greatly outnumber landslide points. Previous studies, however, did not consider various sampling methods for the training data; they selected training data randomly. In this study, therefore, the authors proposed several different sampling methods and assessed the effect of the training-data sampling strategy on landslide susceptibility analysis. Six scenarios were set up based on the sampling strategies for landslide and non-landslide points. A Random Forest model was then trained under each of the six scenarios, and the attribute importance of each input variable was evaluated. Landslide susceptibility maps were subsequently produced using the input variables and their attribute importances. The AUC values of the susceptibility maps obtained from the six sampling strategies showed high prediction rates, ranging from 70% to 80%. This means that the Random Forest technique shows appropriate predictive performance and that the attribute importances it produces can be used as weights for landslide conditioning factors in the susceptibility analysis. In addition, the results obtained with specific training-data sampling strategies show higher prediction accuracy than results using the previous random sampling method.
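A minimal sketch of one such scenario, assuming scikit-learn: balance landslide and non-landslide points by undersampling, train a Random Forest, and read off the AUC and attribute importances. The feature names and the 1:1 balancing rule are illustrative assumptions, not the paper's six scenarios.

```python
# Train a Random Forest under one training-data sampling scenario and
# report AUC plus per-variable attribute importance.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_slide, n_stable = 300, 3000                  # landslide points are rare
X = rng.normal(size=(n_slide + n_stable, 4))   # e.g. slope, aspect, soil, wetness
y = np.r_[np.ones(n_slide), np.zeros(n_stable)]

# Scenario: random undersampling of non-landslide points to a 1:1 ratio.
keep = np.r_[np.arange(n_slide),
             rng.choice(np.arange(n_slide, len(y)), size=n_slide, replace=False)]
X_tr, X_te, y_tr, y_te = train_test_split(X[keep], y[keep], random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
print(f"AUC = {auc:.2f}, importances = {rf.feature_importances_.round(3)}")
```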

Adaptive Sampling for ECG Detection Based on Compression Dictionary

  • Yuan, Zhongyun; Kim, Jong Hak; Cho, Jun Dong
    • JSTS: Journal of Semiconductor Technology and Science / v.13 no.6 / pp.608-616 / 2013
  • This paper presents an adaptive sampling method for electrocardiogram (ECG) signal detection. First, by applying a string-matching process against a compression dictionary, each ECG segment with distinct characteristics is recognized. Then, following a non-uniform sampling strategy, the sampling rate is determined adaptively. As the simulation results indicate, our approach reconstructs the ECG signal at an optimized sampling rate while guaranteeing ECG integrity. Compared with an existing adaptive sampling technique, our approach acquires the ECG signal at a 30% lower sampling rate. Finally, experiments demonstrate its superiority in energy efficiency and memory capacity.
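A minimal sketch of the non-uniform sampling step. A real implementation would classify segments by matching them against a compression dictionary; here a simple slope threshold stands in for that classifier, and the per-segment rates are assumed values.

```python
# Downsample quiet ECG segments aggressively and keep steep (QRS-like)
# segments at full rate.
import numpy as np

RATES_HZ = {"qrs": 500, "baseline": 100}  # assumed per-segment rates

def classify(segment):
    """Stand-in for dictionary matching: steep segments ~ QRS complexes."""
    return "qrs" if np.max(np.abs(np.diff(segment))) > 0.05 else "baseline"

def adaptive_sample(signal, fs=500, win=50):
    kept = []
    for start in range(0, len(signal) - win, win):
        seg = signal[start:start + win]
        step = fs // RATES_HZ[classify(seg)]   # larger step = lower rate
        kept.extend(range(start, start + win, step))
    return np.array(kept)

t = np.arange(0, 2, 1 / 500)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63        # crude spiky ECG surrogate
idx = adaptive_sample(ecg)
print(f"kept {len(idx)} of {len(ecg)} samples ({100 * len(idx) / len(ecg):.0f}%)")
```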

Feedwater Flowrate Estimation Based on the Two-step De-noising Using the Wavelet Analysis and an Autoassociative Neural Network

  • Gyunyoung Heo; Park, Seong-Soo; Chang, Soon-Heung
    • Nuclear Engineering and Technology / v.31 no.2 / pp.192-201 / 1999
  • This paper proposes an improved signal processing strategy for accurate feedwater flowrate estimation in nuclear power plants. Thermal power errors of ∼2% are known to occur due to fouling phenomena in feedwater flowmeters. In the proposed strategy, the noise in the feedwater flowrate signal is classified into rapidly varying noise and gradually varying noise according to its frequency-domain characteristics. Estimation precision is enhanced by introducing a wavelet-based low-pass filter against the rapidly varying noise and an autoassociative neural network that corrects only the gradually varying noise. A modified multivariate stratified sampling scheme, using time stratification and the MAXIMIN criterion, is developed to overcome the shortcomings of general random sampling. In addition, a multi-stage robust training method is developed to increase the quality and reliability of the training signals. Validations using simulated data from a micro-simulator were carried out. In these tests, the proposed methodology removed the rapidly varying and gradually varying noise in the respective de-noising steps, and the 5.54% root mean square error of the initial noisy signals decreased to 0.674% after de-noising. These results indicate that the reactor thermal power can be estimated more precisely by adopting this strategy.
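A minimal sketch of the first de-noising step, assuming PyWavelets: a wavelet low-pass that suppresses rapidly varying noise by zeroing the detail coefficients. The wavelet family, decomposition level, and the synthetic flowrate signal are illustrative choices, not the paper's settings.

```python
# Wavelet low-pass: decompose, zero the detail bands, reconstruct.
import numpy as np
import pywt

def wavelet_lowpass(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs[1:] = [np.zeros_like(d) for d in coeffs[1:]]  # drop detail bands
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

t = np.linspace(0.0, 10.0, 1000)
flow = 100.0 + 0.5 * np.sin(0.3 * t)               # slowly drifting flowrate
noisy = flow + np.random.default_rng(1).normal(0.0, 2.0, t.size)
smooth = wavelet_lowpass(noisy)
print(f"rms error after low-pass: {np.sqrt(np.mean((smooth - flow) ** 2)):.3f}")
```

The gradually varying residual that survives this filter is what the autoassociative network in the paper is responsible for correcting.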


Use of Dynamic Reliability Method in Assessing Accident Management Strategy

  • Jae, Moosung
    • International Journal of Reliability and Applications / v.2 no.1 / pp.27-36 / 2001
  • This paper proposes a new methodology for assessing the reliability of accident management, based on reliability physics and a dynamic event tree generation scheme. The methodology consists of three main steps: screening, uncertainty propagation, and probability estimation. Sensitivity analysis is used to screen the significant variables, Latin hypercube sampling and the MAAP code are used for uncertainty propagation, and the dynamic event tree generation method is used to estimate the non-success probability of implementing an accident management strategy. This approach is applied to assess the non-success probability of implementing a cavity flooding strategy, in which water is supplied to the reactor cavity using emergency fire systems during a station blackout sequence at the reference plant.
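A minimal sketch of the Latin hypercube sampling used for uncertainty propagation: each variable's range is split into n equal-probability strata, one draw is taken per stratum, and the strata are shuffled independently per dimension. (scipy.stats.qmc.LatinHypercube packages the same idea; the three-input example is illustrative.)

```python
# Latin hypercube sample on the unit cube; map columns to physical input
# distributions before feeding them to a code such as MAAP.
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    u = rng.uniform(size=(n_samples, n_dims))   # jitter within each stratum
    strata = np.stack([rng.permutation(n_samples) for _ in range(n_dims)],
                      axis=1)
    return (strata + u) / n_samples             # values uniform on [0, 1)

rng = np.random.default_rng(7)
samples = latin_hypercube(100, 3, rng)          # e.g. 3 uncertain inputs
print(samples.min(axis=0), samples.max(axis=0))
```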


SPATIAL AND TEMPORAL INFLUENCES ON SOIL MOISTURE ESTIMATION

  • Kim, Gwang-seob
    • Water Engineering Research / v.3 no.1 / pp.31-44 / 2002
  • The effects of the diurnal cycle, intermittent satellite overpasses, sensor installation, partial coverage of remote sensing, and the heterogeneity of soil properties and precipitation on soil moisture estimation error were analyzed to inform a global soil moisture sampling strategy. Three models were used to construct sufficient two-dimensional soil moisture data under different scenarios: a theoretical soil moisture model, the WGR model proposed by Waymire et al. (1984) to generate rainfall, and the Turning Bands Method to generate two-dimensional fields of soil porosity, active soil depth, and loss coefficient. The sampling error is dominated by the sampling interval and the design scheme; the heterogeneity of soil properties and rainfall contributes less to the sampling error than the temporal and spatial gaps do. Choosing a small sampling interval can dramatically reduce the sampling error caused by other factors such as heterogeneity of rainfall, soil properties, topography, and climatic conditions. If the annual mean coverage is about 90%, the effect of partial coverage on the sampling error can be disregarded. The water retention capacity of a field is very important to the sampling error: the smaller the capacity (small soil porosity and thin active soil depth), the greater the error, indicating that the sampling error is very sensitive to water retention capacity. Block-random installation of soil moisture gauges yields more accurate data than purely random installation. The Walnut Gulch soil moisture data show that the diurnal variation of soil moisture causes a sampling error of 1 to 4% in daily estimation.
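A minimal sketch of the dominant effect the abstract reports: subsampling a synthetic soil moisture series at widening revisit intervals and measuring the reconstruction error. The synthetic series and interpolation rule are purely illustrative, not the study's three-model setup.

```python
# Show how temporal sampling interval drives estimation error on a
# synthetic seasonal soil moisture series with wet/dry spells.
import numpy as np

rng = np.random.default_rng(3)
days = np.arange(365)
truth = 0.25 + 0.05 * np.sin(2 * np.pi * days / 365)        # seasonal cycle
truth += rng.normal(0.0, 0.02, days.size).cumsum() * 0.01   # wet/dry spells

for interval in (1, 3, 7, 14, 30):                           # revisit period, days
    est = np.interp(days, days[::interval], truth[::interval])
    rmse = np.sqrt(np.mean((est - truth) ** 2))
    print(f"interval {interval:2d} d: RMSE = {rmse:.4f}")
```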


Measuring stratification effects for multistage sampling (다단추출 표본설계의 층효율성 연구)

  • Taehoon Kim; KeeJae Lee; Inho Park
    • The Korean Journal of Applied Statistics / v.36 no.4 / pp.337-347 / 2023
  • Sampling designs often use stratified sampling, in which the elements or clusters of the study population are divided into strata and an independent sample is drawn from each stratum. A stratification strategy consists of stratification and sample allocation, two issues that arise repeatedly in survey sampling. Although stratified multistage sample designs are common in practice, the literature tends to discuss stratum effects or stratum efficiency only for simple sampling. This study examines an existing stratum efficiency measure for two-stage sampling and proposes additional stratum efficiency measures using the design effect model. The proposed measures are used to evaluate the stratification strategy of the sample design for high school students in the 4th Korean National Environmental Health Survey (KoNEHS).
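A minimal sketch of the design-effect idea underlying such efficiency measures: Kish's approximation deff = 1 + (b - 1)ρ for two-stage (clustered) samples, where b is the average cluster take and ρ the intraclass correlation. The numeric values below are illustrative, not KoNEHS figures.

```python
# Design effect: variance inflation of a clustered sample relative to
# simple random sampling of the same size.

def design_effect(avg_cluster_size, rho):
    return 1.0 + (avg_cluster_size - 1.0) * rho

for b, rho in [(10, 0.02), (10, 0.10), (30, 0.02), (30, 0.10)]:
    deff = design_effect(b, rho)
    print(f"b={b:2d}, rho={rho:.2f}: deff={deff:.2f}, "
          f"effective n per 1000 = {1000 / deff:.0f}")
```

A stratification strategy is more efficient when it drives the design effect, and hence the effective sample size loss, down for the same cost.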

Determination of Soil Sample Size Based on Gy's Particulate Sampling Theory (Gy의 입자성 물질 시료채취이론에 근거한 토양 시료 채취량 결정)

  • Bae, Bum-Han
    • Journal of Soil and Groundwater Environment / v.16 no.6 / pp.1-9 / 2011
  • A bibliographical review of Gy's sampling theory for particulate materials was conducted to give readers useful means of reducing errors in soil contamination investigations. According to Gy's theory, the errors caused by the heterogeneous nature of soil include the fundamental error (FE), caused by physical and chemical constitutional heterogeneity; the grouping and segregation error (GE), arising from gravitational force; the long-range heterogeneity fluctuation error ($CE_2$); the periodic heterogeneity fluctuation error ($CE_3$); and the materialization error (ME), generated during the physical processing of samples. However, $CE_2$ and $CE_3$ cannot be estimated easily, and only increasing the number of sampling locations can reduce their magnitude. In addition, incremental sampling is the only way to reduce GE, while grab sampling should be avoided because it introduces uncertainty and error into the sampling process. Correct preparation and operation of sampling tools are important for reducing the increment delimitation error (DE) and extraction error (EE), which result from the physical processes of sampling. Gy's sampling theory can therefore be used efficiently in planning soil investigations of non-volatile, non-reactive samples.
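A minimal worked example of the fundamental error, using Gy's estimate sigma_FE^2 = C d^3 (1/Ms - 1/ML) with sampling constant C = f g c l (shape, granulometric, mineralogical, and liberation factors). The factor values below are illustrative defaults, not soil-specific data from the paper.

```python
# Estimate the relative standard deviation of the fundamental error (FE)
# as a function of sample mass, for a given top particle size.

def fundamental_error_rsd(d_cm, sample_g, lot_g, f=0.5, g=0.25, c=100.0, l=1.0):
    """RSD of FE for top particle size d (cm); factors are assumed values."""
    C = f * g * c * l                                     # sampling constant
    var = C * d_cm ** 3 * (1.0 / sample_g - 1.0 / lot_g)
    return var ** 0.5

# How does FE shrink as the sample mass grows, for a 2 mm top size?
for mass in (10.0, 50.0, 250.0, 1000.0):
    rsd = fundamental_error_rsd(d_cm=0.2, sample_g=mass, lot_g=1e6)
    print(f"{mass:6.0f} g sample: FE RSD = {100 * rsd:.1f}%")
```

This is the quantity the review treats as controllable through sample mass and particle size, unlike $CE_2$ and $CE_3$, which only more sampling locations can address.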