• Title/Summary/Keyword: point sampling

Search results: 821

A Permanent Magnet Pole Shape Optimization for a 6MW BLDC Motor by using Response Surface Method (II) (RSM을 이용한 6MW BLDC용 영구자석의 형상 최적화 연구 (II))

  • Woo, Sung-Hyun;Chung, Hyun-Koo;Shin, Pan-Seok
    • Proceedings of the KIEE Conference / 2008.07a / pp.701-702 / 2008
  • An adaptive response surface method (RSM) with a Latin Hypercube sampling strategy is employed to optimize the magnet pole shape of a large-scale BLDC motor so as to minimize the cogging torque. The proposed algorithm combines multi-objective Pareto optimization with a (1+λ) evolution strategy to find the global optimum with relatively few sampling points. In the adaptive RSM, an adaptive sampling-point insertion method is developed that uses design sensitivities computed by the finite element method to obtain a reasonable response surface from a relatively small number of sampling points. The developed algorithm is applied to the shape optimization of the PM poles of a 6 MW BLDC motor, and the cogging torque is reduced to 19% of its initial value.

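The Latin Hypercube strategy the abstract above relies on can be sketched in a few lines. This is an illustrative stand-in, not the authors' implementation; the function name is ours. Each dimension is split into equal strata, one point falls in each stratum, and the strata order is shuffled independently per dimension:

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Draw a Latin Hypercube sample on the unit hypercube [0, 1]^d.

    Each dimension is divided into n_samples equal strata and exactly
    one point lands in each stratum, giving better space coverage than
    plain random sampling for the same budget.
    """
    rng = np.random.default_rng(rng)
    # One uniform offset inside each stratum, per dimension.
    u = rng.random((n_samples, n_dims))
    strata = (np.arange(n_samples)[:, None] + u) / n_samples
    # Independently shuffle the strata order in every dimension.
    for d in range(n_dims):
        rng.shuffle(strata[:, d])
    return strata

pts = latin_hypercube(8, 2, rng=0)
```

Mapping the unit-cube points to the actual design-variable ranges is a per-dimension affine rescaling.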

Application of Bayesian Computational Techniques in Estimation of Posterior Distributional Properties of Lognormal Distribution

  • Begum, Mun-Ni;Ali, M. Masoom
    • Journal of the Korean Data and Information Science Society / v.15 no.1 / pp.227-237 / 2004
  • In this paper we present a Bayesian inference approach for estimating the location and scale parameters of the lognormal distribution using the iterative Gibbs sampling algorithm. We also present estimation of the location parameter by two non-iterative methods, importance sampling and the weighted bootstrap, assuming the scale parameter is known. The estimates obtained by the non-iterative techniques do not depend on the specification of hyperparameters, which is optimal from the Bayesian point of view. The estimates obtained by the more sophisticated Gibbs sampler vary slightly with the choice of hyperparameters. The objective of this paper is to illustrate these tools in a simple setup that may be essential in more complicated situations.

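The non-iterative importance-sampling estimator described above can be sketched as follows, under assumptions of ours: the scale σ is known, the proposal equals a broad normal prior on the location μ (so the weights reduce to the likelihood), and the data and function names are invented for illustration:

```python
import numpy as np

def is_posterior_mean(log_y, sigma, prior_mean=0.0, prior_sd=10.0,
                      n_draws=20000, rng=None):
    """Importance-sampling estimate of the posterior mean of the
    lognormal location parameter mu, with the scale sigma known.

    log_y are the log-observations (normal with mean mu, sd sigma).
    Draws come from the prior, used as the proposal, so the importance
    weight reduces to the likelihood.
    """
    rng = np.random.default_rng(rng)
    mu = rng.normal(prior_mean, prior_sd, n_draws)   # proposal = prior
    # Log-likelihood of all observations under each candidate mu.
    ll = -0.5 * ((log_y[None, :] - mu[:, None]) / sigma) ** 2
    log_w = ll.sum(axis=1)
    w = np.exp(log_w - log_w.max())                  # stabilise the weights
    return np.sum(w * mu) / np.sum(w)

rng = np.random.default_rng(1)
log_y = rng.normal(2.0, 0.5, 100)   # synthetic data, true mu = 2
est = is_posterior_mean(log_y, sigma=0.5, rng=2)
```

With a broad prior the posterior mean is close to the sample mean of the log-observations, which the weighted estimate recovers.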

Adaptive kernel method for evaluating structural system reliability

  • Wang, G.S.;Ang, A.H.S.;Lee, J.C.
    • Structural Engineering and Mechanics / v.5 no.2 / pp.115-126 / 1997
  • Importance sampling methods have been developed with the aim of reducing the computational cost inherent in Monte Carlo methods. This study proposes a new algorithm, the adaptive kernel method, which combines and modifies concepts from adaptive sampling and the simple kernel method to evaluate the structural reliability of time-variant problems. The essence of the resulting algorithm is to select an appropriate starting point from which the importance sampling density can be generated efficiently. Numerical results show that the method is unbiased and substantially more efficient than other methods.
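The core idea — draw from an importance density centred near the design point and re-weight by the density ratio — can be illustrated with a toy limit state. This is our own minimal example, not the paper's adaptive kernel construction:

```python
import numpy as np

def failure_prob_is(g, center, n=50000, rng=None):
    """Importance-sampling estimate of P[g(X) <= 0] for X ~ N(0, I).

    Samples are drawn from a standard normal shifted to `center`
    (ideally near the design point) and re-weighted by the ratio of
    the true density to the sampling density.
    """
    rng = np.random.default_rng(rng)
    d = len(center)
    x = rng.standard_normal((n, d)) + center
    # log of phi(x) / phi(x - center) for independent standard normals
    log_w = -0.5 * np.sum(x**2 - (x - center)**2, axis=1)
    fail = g(x) <= 0.0
    return float(np.mean(np.exp(log_w) * fail))

# Toy limit state: failure when x1 + x2 > 5, i.e. beta = 5/sqrt(2).
g = lambda x: 5.0 - x[:, 0] - x[:, 1]
pf = failure_prob_is(g, center=np.array([2.5, 2.5]), rng=0)
```

The exact failure probability here is Φ(−5/√2) ≈ 2.0 × 10⁻⁴; crude Monte Carlo would need millions of samples for a comparable estimate.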

Monitoring Benthic Algal Communities: A Comparison of Targeted and Coefficient Sampling Methods

  • Edwards, Matthew S.;Tinker, Martin T.
    • ALGAE / v.24 no.2 / pp.111-120 / 2009
  • Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that the two methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; the Targeted Sampling requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limiting, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.

Machining Tool Path Generation for Point Set

  • Park, Se-Youn;Shin, Ha-Yong
    • International Journal of CAD/CAM / v.8 no.1 / pp.45-53 / 2009
  • As point sampling technology evolves rapidly, there has been an increasing need to generate tool paths from dense point sets without creating intermediate models such as triangular meshes or surfaces. In this paper, we present a new tool path generation method for point sets using Euclidean distance fields based on Algebraic Point Set Surfaces (APSS). Once a Euclidean distance field of the target shape is obtained, it is fairly easy to generate tool paths. To compute the distance from a point in 3D space to the point set, we locally fit an algebraic sphere using the moving least squares (MLS) method for accurate and simple calculation, repeating the fit until it converges. The main advantages of our approach are: (1) tool paths are computed directly from the point set without building triangular meshes or surfaces and their offsets, and (2) unlike methods based on triangular mesh or surface models, local interference at concave regions needs no special handling. Experimental results show that our approach can generate sufficiently accurate tool paths from a point set robustly and efficiently.
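The distance-field idea can be illustrated with a much simpler stand-in than the paper's APSS/MLS sphere fit: a brute-force nearest-neighbour distance to the point set. An offset surface for a ball-end tool of radius r is then the level set dist(x) = r. The example point set and names are ours:

```python
import numpy as np

def distance_field(query, points):
    """Unsigned Euclidean distance from each query point to a point set.

    A brute-force nearest-neighbour stand-in for the APSS/MLS distance
    described above; a tool-path offset is the level set dist(x) = r.
    """
    diff = query[:, None, :] - points[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)

# Point set sampled on the unit circle; distances from the origin and
# from (2, 0) to the set are both 1.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t)], axis=1)
d = distance_field(np.array([[0.0, 0.0], [2.0, 0.0]]), circle)
```

For dense point sets a spatial index (k-d tree) replaces the brute-force pairwise computation; the MLS sphere fit additionally smooths out sampling noise, which the raw nearest-neighbour distance does not.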

A Hybrid Algorithm for Online Location Update using Feature Point Detection for Portable Devices

  • Kim, Jibum;Kim, Inbin;Kwon, Namgu;Park, Heemin;Chae, Jinseok
    • KSII Transactions on Internet and Information Systems (TIIS) / v.9 no.2 / pp.600-619 / 2015
  • We propose a cost-efficient hybrid algorithm for online location updates that efficiently combines feature point detection with an online trajectory-based sampling algorithm. Our algorithm is designed to minimize the average trajectory error with a minimal number of sample points. The algorithm is composed of three steps. First, we choose corner points from the map as sample points because they are the most likely to yield fewer trajectory errors. Second, by employing the online trajectory sampling algorithm, our algorithm detects several missing but important sample points to prevent unwanted trajectory errors. The final step improves cost efficiency by eliminating redundant sample points on straight paths. We evaluate the proposed algorithm with real GPS trajectory data for various bus routes and compare it with the existing algorithm. Simulation results show that our algorithm decreases the average trajectory error by 28% and is 29% more cost-efficient than the existing algorithm on real GPS trajectory data.
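The final step — dropping redundant sample points on straight paths — amounts to a collinearity test on consecutive points. A minimal sketch of that step alone, with our own function name and a cross-product test as the straightness criterion (the paper's exact criterion may differ):

```python
def drop_straight_points(path, tol=1e-6):
    """Remove interior sample points that lie on a straight segment.

    A point is redundant when the 2D cross product of the segments
    before and after it is (near) zero, i.e. the heading is unchanged.
    """
    kept = [path[0]]
    for prev, cur, nxt in zip(path, path[1:], path[2:]):
        ax, ay = cur[0] - prev[0], cur[1] - prev[1]
        bx, by = nxt[0] - cur[0], nxt[1] - cur[1]
        if abs(ax * by - ay * bx) > tol:   # direction changes: keep it
            kept.append(cur)
    kept.append(path[-1])
    return kept

route = [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2), (4, 3)]
simplified = drop_straight_points(route)
```

Only the endpoints and the two corners survive, so a straight bus-route segment costs a single pair of sample points regardless of how many GPS fixes fall on it.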

A Study on Effectiveness Enhancement of Organization thru Service Quality of Service Desk (Service Desk의 서비스 품질이 조직의 업무 효율성 증대에 미치는 영향에 관한 연구)

  • Kim, Dong-Chul;Gim, Gwang-Yong;Rim, Seong-Taek
    • Journal of Information Technology Services / v.8 no.4 / pp.17-40 / 2009
  • This study provides guidance for estimating service quality in ITSM operations based on ITIL (IT Infrastructure Library). ITIL v2 and v3 provide a framework for integrating IT services with business processes and are widely used as the basis of ITSM operations. As a recent IT trend, ITSM covers hardware, software, SaaS, networks, call centers, helpdesks, ASP portals and IT operations. The service desk was selected as the target area because valid sampling is feasible there and service change is rapid. The traditional helpdesk focused on technical support, solving internal IT issues passively, but it has evolved into the service desk, which focuses on process and provides integrated service from the customer's business viewpoint, preventively and proactively. Such outsourced services are normally performed by teams of professionals, and service quality is measured under an SLA (Service Level Agreement). This study used the SERVQUAL model, a service-quality measurement tool developed by Parasuraman, Zeithaml and Berry, to find the critical factors for customer satisfaction, and tested the effectiveness of the IT organization from the customer's viewpoint through sampling. Although the valid parameters may vary across ITSM areas under the SERVQUAL model, they can be accepted as an index of service-quality measurement after a sampling test with acceptable significance. We recommend following this study as preparation before establishing an official SLA.

Implementation of Digital Signal Processing Board Suitable for a Semi-active Laser Tracking to Detect a Laser Pulse Repetition Frequency and Optimization of a Target Coordinates (반능동형 레이저 유도 추적에 적합한 레이저 펄스 반복 주파수 검출을 위한 디지털 신호처리 보드 구현 및 표적 좌표 최적화)

  • Lee, Young-Ju;Kim, Yong-Pyung
    • The Transactions of The Korean Institute of Electrical Engineers / v.64 no.4 / pp.573-577 / 2015
  • In this paper, we propose a signal processing board suitable for semi-active laser tracking that detects the optical signal generated by a laser target designator, using an analog trigger signal, a quadrant photodetector, and a high-speed ADC (analog-to-digital converter) sampling technique. We improved stability by applying an averaging method that minimizes the measurement error of the Gaussian pulse. To evaluate the performance of the proposed methods, we implemented a prototype board and performed experiments. As a result, we implemented a frequency counter with an error of 14.9 ns over 50 ms. The PRF error code has a stability of better than 1.5% relative to the NATO standard. Applying three-point averaging to the ADC samples improved stability by 28% on the X-axis and 22% on the Y-axis compared with one-point sampling.
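The three-point averaging mentioned above is a simple moving average over each ADC sample and its two neighbours. A minimal sketch (our own code, not the board's firmware), which smooths single-sample noise on the Gaussian pulse before the peak position is located:

```python
def three_point_average(samples):
    """Average each interior ADC sample with its two neighbours.

    A sketch of three-point averaging: it attenuates single-sample
    noise on the pulse while leaving the endpoints untouched.
    """
    out = list(samples)
    for i in range(1, len(samples) - 1):
        out[i] = (samples[i - 1] + samples[i] + samples[i + 1]) / 3.0
    return out

smoothed = three_point_average([0.0, 1.0, 5.0, 1.0, 0.0])
```

The same idea extends to the quadrant-photodetector channels individually before the X/Y target coordinates are computed from the four averaged amplitudes.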

A Recurrent Neural Network Training and Equalization of Channels using Sigma-point Kalman Filter (시그마포인트 칼만필터를 이용한 순환신경망 학습 및 채널등화)

  • Kwon, Oh-Shin
    • Proceedings of the KIEE Conference / 2007.04a / pp.3-5 / 2007
  • This paper presents decision feedback equalizers based on a recurrent neural network trained with the extended Kalman filter (EKF) and the sigma-point Kalman filter (SPKF). The EKF propagates the state analytically through a first-order linearization of the nonlinear system, which can introduce large errors in the true posterior mean and covariance of the Gaussian random variable. The SPKF addresses this problem by using a deterministic sampling approach. We examine the features of the proposed recurrent neural equalizer and compare the bit error rate (BER) obtained with the EKF and the SPKF.

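The deterministic sampling at the heart of sigma-point Kalman filters generates 2n+1 weighted points that capture the mean and covariance of the state exactly. A standard unscented-transform sketch (generic textbook construction, not this paper's specific equalizer training loop):

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Generate the 2n+1 sigma points and weights of the unscented
    transform used by sigma-point Kalman filters.

    Returns (points, mean weights, covariance weights); propagating
    the points through the identity recovers mean and cov exactly.
    """
    n = len(mean)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * cov)   # scaled matrix square root
    pts = np.vstack([mean, mean + S.T, mean - S.T])
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha**2 + beta)
    return pts, wm, wc

pts, wm, wc = sigma_points(np.array([1.0, -1.0]), np.eye(2))
```

In an SPKF the network weights play the role of the state: the sigma points are pushed through the nonlinear equalizer and their weighted statistics replace the EKF's first-order linearization.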

Application of Bootstrap Method for Change Point Test based on Kernel Density Estimator

  • Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society / v.15 no.1 / pp.107-117 / 2004
  • The change point testing problem is considered. Kernel density estimators are used to construct the proposed change point test statistic. The proposed method can be used to test not only parameter changes but also distributional changes. The bootstrap method is applied to obtain the sampling distribution of the proposed test statistic. Small-sample Monte Carlo simulations were also conducted to demonstrate the performance of the proposed method.

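The bootstrap step above — resampling the data to approximate the null sampling distribution of a change-point statistic — can be sketched as follows. We substitute a simple standardized mean-difference statistic for the paper's kernel-density-based one, so this is a structural illustration only:

```python
import numpy as np

def change_stat(x):
    """Maximum standardized mean difference over candidate split points
    (a simple stand-in for the kernel-density-based statistic)."""
    n = len(x)
    best = 0.0
    for k in range(5, n - 5):
        d = abs(x[:k].mean() - x[k:].mean()) / np.sqrt(1.0 / k + 1.0 / (n - k))
        best = max(best, d)
    return best

def bootstrap_pvalue(x, n_boot=500, rng=None):
    """Approximate the null distribution of the statistic by resampling
    the pooled data with replacement, then report the bootstrap p-value."""
    rng = np.random.default_rng(rng)
    obs = change_stat(x)
    null = [change_stat(rng.choice(x, size=len(x), replace=True))
            for _ in range(n_boot)]
    return float(np.mean([s >= obs for s in null]))

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])  # shift at 50
p = bootstrap_pvalue(x, rng=1)
```

Resampling destroys the ordering of the data, so the bootstrap replicates behave like the no-change null; a mean shift of two standard deviations at the midpoint is then rejected decisively.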