• Title/Summary/Keyword: point sampling

A Low Bit Rate Speech Coder Based on the Inflection Point Detection

  • Iem, Byeong-Gwan
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.15 no.4
    • /
    • pp.300-304
    • /
    • 2015
  • A low bit rate speech coder based on a non-uniform sampling technique is proposed. The non-uniform sampling is based on the detection of inflection points (IPs). A speech block is processed by the IP detector, and the detected IP pattern is compared with entries of an IP database. The address of the closest member of the database is transmitted together with the energy of the speech block. In the receiver, the decoder reconstructs the speech block using the received address and the energy information. As a result, the coder has a fixed data rate, in contrast to existing speech coders based on non-uniform sampling. The usefulness of the proposed technique is shown through computer simulation: the SNR performance of the proposed method is approximately 5.27 dB at a data rate of 1.5 kbps.
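The abstract does not specify the IP detector or the database search, so the following is only a minimal sketch under common assumptions: an inflection point is taken as a sign change of the signal's second difference, and the "closest member" lookup is a toy set-difference distance. Both function names and the distance measure are illustrative, not the paper's method.

```python
def inflection_points(block):
    """Return indices where the second difference of the signal
    changes sign, i.e. candidate inflection points (IPs)."""
    # second difference d2[n] = x[n+1] - 2*x[n] + x[n-1]
    d2 = [block[n + 1] - 2 * block[n] + block[n - 1]
          for n in range(1, len(block) - 1)]
    ips = []
    for i in range(1, len(d2)):
        if d2[i - 1] * d2[i] < 0:       # strict sign change
            ips.append(i + 1)           # index in the original block
    return ips

def closest_pattern(pattern, database):
    """Return the address (index) of the database entry whose IP
    pattern is closest to `pattern` under a crude set distance."""
    def dist(a, b):
        m = max(len(a), len(b), 1)
        return len(set(a).symmetric_difference(set(b))) / m
    return min(range(len(database)), key=lambda k: dist(pattern, database[k]))
```

Only the transmitted address and block energy would go over the channel; the decoder would index the same database to rebuild the block.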

A Fixed Rate Speech Coder Based on the Filter Bank Method and the Inflection Point Detection

  • Iem, Byeong-Gwan
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.16 no.4
    • /
    • pp.276-280
    • /
    • 2016
  • A fixed rate speech coder based on a filter bank and a non-uniform sampling technique is proposed. The non-uniform sampling is achieved by the detection of inflection points (IPs). A speech block is band-passed by the filter bank, the subband signals are processed by the IP detector, and the detected IP patterns are compared with entries of an IP database. For each subband signal, the address of the closest member of the database and the energy of the IP pattern are transmitted over the channel. In the receiver, the decoder recovers the subband signals using the received addresses and the energy information, and reconstructs the speech via filter bank summation. As a result, the coder has a fixed data rate, in contrast to existing speech coders based on non-uniform sampling. The usefulness of the proposed technique is confirmed through computer simulation: the signal-to-noise ratio (SNR) performance of the proposed method is comparable to that of uniformly sampled pulse code modulation (PCM) below a 20 kbps data rate.
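The analysis–synthesis structure described here (split into subbands, process each, reconstruct by summation) can be hinted at with a minimal two-band sketch. The abstract does not give the actual filters, so the moving-average lowpass and complementary highpass below are purely illustrative; their sum reconstructs the block exactly, which is the property filter bank summation relies on.

```python
def two_band_split(x, taps=3):
    """Split x into a crude lowpass band (moving average) and the
    complementary highpass band; their sum reconstructs x exactly."""
    half = taps // 2
    low = []
    for n in range(len(x)):
        window = x[max(0, n - half):n + half + 1]
        low.append(sum(window) / len(window))
    high = [x[n] - low[n] for n in range(len(x))]
    return low, high

def filter_bank_sum(bands):
    """Reconstruct the block by summing the subband signals."""
    return [sum(vals) for vals in zip(*bands)]
```

In the coder each subband would be replaced by its quantized IP pattern before the summation step.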

Reliability Estimation Using Two-Staged Kriging Metamodel and Genetic Algorithm (2단 크리깅 메타모델과 유전자 알고리즘을 이용한 신뢰도 계산)

  • Cho, Tae-Min;Ju, Byeong-Hyeon;Jung, Do-Hyun;Lee, Byung-Chai
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.30 no.9 s.252
    • /
    • pp.1116-1123
    • /
    • 2006
  • In this study, an effective method for reliability estimation is proposed using a two-staged kriging metamodel and a genetic algorithm. A kriging metamodel is determined by an appropriate sampling range and number of sampling points. The first kriging metamodel is built from the proposed sampling points. The advanced first-order reliability method is applied to the first kriging metamodel to determine the reliability and the most probable failure point (MPFP) approximately. Then, the second kriging metamodel is constructed using additional sampling points near the MPFP. These points are selected, using a genetic algorithm, to have the maximum mean squared error. Monte Carlo simulation is applied to the second kriging metamodel to estimate the reliability. The proposed method is applied to numerical examples, and the results agree closely with the reference reliability.
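The final step, Monte Carlo simulation on a cheap surrogate, can be sketched as follows. The kriging construction and the GA-based point selection are beyond a short example, so `limit_state` here is a toy stand-in for the second metamodel and all names are illustrative.

```python
import random

def monte_carlo_reliability(limit_state, sample, n=100_000, seed=0):
    """Estimate reliability P[g(X) > 0] by Monte Carlo sampling.
    `limit_state` plays the role of the (second) kriging metamodel:
    a cheap surrogate evaluated many times instead of the true model."""
    rng = random.Random(seed)
    safe = sum(1 for _ in range(n) if limit_state(sample(rng)) > 0)
    return safe / n

# toy limit state g(x) = 3 - x with X ~ N(0, 1):
# reliability P[g > 0] = P[X < 3] = Phi(3), about 0.99865
rel = monte_carlo_reliability(lambda x: 3.0 - x,
                              lambda rng: rng.gauss(0.0, 1.0))
```

Because the surrogate is cheap, the large sample size needed for small failure probabilities becomes affordable, which is the motivation for metamodel-based reliability analysis.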

Candidate Points and Representative Cross-Validation Approach for Sequential Sampling (후보점과 대표점 교차검증에 의한 순차적 실험계획)

  • Kim, Seung-Won;Jung, Jae-Jun;Lee, Tae-Hee
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.31 no.1 s.256
    • /
    • pp.55-61
    • /
    • 2007
  • Recently, simulation models have become essential tools for the analysis and design of systems, but as a model becomes more complicated, achieving reliable results is often expensive and time consuming. Therefore, a high-fidelity simulation model needs to be replaced by an approximate model, the so-called metamodel. Metamodeling techniques comprise three components: sampling, the metamodel, and validation. Cross-validation approaches have been proposed to provide new sample points sequentially based on cross-validation error, but they are very expensive because the cross-validation must be re-evaluated at each stage. To reduce this cost, a sequential sampling method using candidate points and representative cross-validation is proposed in this paper. The candidate-point and representative cross-validation approach to sequential sampling is illustrated for a two-dimensional domain. To verify the performance of the suggested sampling technique, we compare the accuracy of the metamodels for various mathematical functions with that obtained by conventional sequential sampling strategies such as maximum distance, mean squared error, and maximum entropy sequential sampling. Through this research we find that the proposed approach is computationally inexpensive and provides good prediction performance.
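The cross-validation criterion underlying such sequential sampling can be sketched in one dimension. The paper's actual candidate-point and representative cross-validation scheme is not specified in the abstract, so the leave-one-out error and the "add a candidate near the worst point" rule below are a generic, illustrative version; the nearest-neighbour predictor merely stands in for a real metamodel.

```python
def loo_cv_errors(xs, ys, fit_predict):
    """Leave-one-out cross-validation error at each sample point.
    `fit_predict(train_x, train_y, x)` stands in for the metamodel."""
    errs = []
    for i in range(len(xs)):
        tx = xs[:i] + xs[i + 1:]
        ty = ys[:i] + ys[i + 1:]
        errs.append(abs(fit_predict(tx, ty, xs[i]) - ys[i]))
    return errs

def next_sample(candidates, xs, errs):
    """Pick the candidate closest to the existing point with the
    largest cross-validation error (a crude sequential criterion)."""
    worst = xs[max(range(len(errs)), key=errs.__getitem__)]
    return min(candidates, key=lambda c: abs(c - worst))

def nn_predict(tx, ty, x):
    """1-D nearest-neighbour predictor used as a toy metamodel."""
    i = min(range(len(tx)), key=lambda j: abs(tx[j] - x))
    return ty[i]
```

Note that `loo_cv_errors` refits the model once per sample at every stage; this repeated refitting is exactly the cost the paper aims to avoid.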

Sediment Discharge Based on a Time-Integrated Point Sample (연속점 채취를 이용한 유사량 계산)

  • 정관수
    • Water for future
    • /
    • v.29 no.2
    • /
    • pp.129-141
    • /
    • 1996
  • A procedure for computing total suspended sediment load is presented based on a single point-integrated sample, a power velocity distribution, and Laursen's sediment concentration distribution equation. The procedure was tested with field data from the Rio Grande River. Computed concentrations agreed well with depth-integrated measurements corrected for unmeasured load using nominal values of $\beta$, $\kappa$ and w. Even better agreement was obtained when site-specific data were used to define the x and z exponents of the velocity and concentration distributions. The difference between the total suspended load computed from a single measurement with this procedure and that from conventional computations based on depth-integrated measurements is well within sampling error. There are major advantages to estimating total suspended load from a single time-integrated suspended-sediment point sample: less field time is required, sampling costs are greatly reduced, and sampling can be more frequent and better timed to measure the changing sediment load. Single-point sampling also makes automatic sampling procedures more feasible.
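The generic shape of such a computation can be sketched numerically: extend the single point concentration over the depth with a Rouse-type profile, scale a power-law velocity to the mean velocity, and integrate their product. This is not Laursen's equation itself; the exponent values, the profile forms, and all names are illustrative assumptions standing in for the site-specific x and z exponents the abstract mentions.

```python
def suspended_load_per_width(c_point, y_point, depth, u_mean,
                             x_exp=0.15, z_exp=0.6, n=1000):
    """Integrate c(y) * u(y) over the depth to get suspended load per
    unit width, extending a single point sample c_point taken at
    height y_point above the bed.

    u(y) follows a power law (y/depth)**x_exp scaled so its depth
    average equals u_mean; c(y) follows a Rouse-type profile
    ((depth - y) / y)**z_exp anchored at the sample point."""
    # scale the velocity power law to the given mean velocity
    u_scale = u_mean * (1.0 + x_exp)

    def u(y):
        return u_scale * (y / depth) ** x_exp

    def c(y):
        ref = ((depth - y_point) / y_point) ** z_exp
        return c_point * ((depth - y) / y) ** z_exp / ref

    # midpoint rule over the depth (avoids the singular endpoints)
    dy = depth / n
    return sum(c((i + 0.5) * dy) * u((i + 0.5) * dy) * dy
               for i in range(n))
```

With both exponents set to zero the profiles are uniform and the integral reduces to concentration times mean velocity times depth, a useful sanity check.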


Variables Sampling Plans for the Weibull Distribution under Progressive Failure Censoring (점진적 정수 중단 하에서의 와이블분포에 대한 계량형 샘플링검사)

  • Lee, Sang-Ho;Jeon, Chi-Hyeok;Balamurali, S.
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 2005.05a
    • /
    • pp.922-926
    • /
    • 2005
  • Progressively censored variables sampling plans are proposed for the lot acceptance of parts whose life follows a Weibull distribution with known shape parameter. Progressive type-II censoring yields not only times to failure but also degradation information, so more flexible and more cost-effective sampling plans can be constructed. The design parameters of our sampling plan are determined by the usual two-point approach.
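The design of the plan parameters via the two-point (producer and consumer risk) approach is the paper's contribution and is not reproduced here; the sketch below only simulates the progressively type-II censored Weibull data such a plan would observe. The removal scheme and all names are illustrative.

```python
import random

def progressive_type2_sample(n, removals, shape, scale, seed=0):
    """Simulate a progressively type-II censored Weibull sample.
    After the i-th observed failure, removals[i] surviving units are
    withdrawn; the test ends when len(removals) failures are seen."""
    rng = random.Random(seed)
    lifetimes = sorted(rng.weibullvariate(scale, shape) for _ in range(n))
    observed = []
    remaining = lifetimes[:]
    for r in removals:
        t = remaining.pop(0)            # next (smallest) failure time
        observed.append(t)
        # withdraw r randomly chosen survivors from the test
        for _ in range(min(r, len(remaining))):
            remaining.pop(rng.randrange(len(remaining)))
    return observed
```

A proper scheme would satisfy n = m + sum(removals) for m observed failures; the `min` guard merely keeps the toy simulation safe.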


Precision servo control of a computer hard disk (컴퓨터 하드 디스크의 정밀 서보 제어)

  • 전도영
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1996.10b
    • /
    • pp.286-289
    • /
    • 1996
  • Two servo control algorithms are suggested to reduce the tracking error of a computer hard disk drive. One is repetitive control, which reduces the repeatable tracking error that is not explicitly taken into account in the design of a conventional controller. This algorithm was successfully applied to a commercial disk drive using a fixed-point DSP. The other is multi-rate sampling control, which generates control outputs between sampling instants, since the sampling rate of hard disk drives is limited. Both algorithms were shown to reduce tracking errors effectively.
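The repetitive-control idea can be sketched on a trivial plant: the controller replays the correction learned one disturbance period earlier, so a periodic error shrinks cycle by cycle. A real drive servo involves plant dynamics, filtering, and fixed-point arithmetic that this toy omits; the plant model y = u + d, the gain, and all names are illustrative assumptions.

```python
def repetitive_control(disturbance, period, cycles, gain=0.5):
    """Repetitive control on a trivial plant y = u + d, where d is
    periodic with the given period. The control for the next period
    is updated from this period's error: u[k] <- u[k] - gain * e[k],
    so the error contracts by (1 - gain) each cycle."""
    u = [0.0] * period
    errors = []                       # peak |error| per cycle
    for _ in range(cycles):
        cycle_err = []
        new_u = []
        for k in range(period):
            e = u[k] + disturbance[k]       # tracking error, reference 0
            cycle_err.append(abs(e))
            new_u.append(u[k] - gain * e)   # correction for next period
        u = new_u
        errors.append(max(cycle_err))
    return errors

errs = repetitive_control([1.0, -0.5, 0.25, -0.125], 4, 10)
```

With gain 0.5 the peak error halves every period, illustrating why repetitive control suits the repeatable (once-per-revolution) component of tracking error.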


Bayesian Multiple Change-Point for Small Data (소량자료를 위한 베이지안 다중 변환점 모형)

  • Cheon, Soo-Young;Yu, Wenxing
    • Communications for Statistical Applications and Methods
    • /
    • v.19 no.2
    • /
    • pp.237-246
    • /
    • 2012
  • Bayesian methods have recently been used to identify multiple change points; however, studies for small data sets are limited. This paper proposes a Bayesian noncentral t distribution change-point model for small data and applies the Metropolis-Hastings-within-Gibbs sampling algorithm to the proposed model. Numerical results on simulated and real data show the performance of the new model in terms of the quality of the resulting estimates of the number and positions of change points for small data.
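The Metropolis-Hastings idea can be illustrated on a deliberately simplified problem: a single change point in the mean of normal data with known variance, sampled by a discrete random walk over the change location. This is only a toy stand-in for the paper's noncentral-t, multiple change-point sampler; the likelihood, proposal, and names are illustrative.

```python
import math
import random

def mh_change_point(data, iters=5000, seed=0):
    """Metropolis-Hastings sampler for a single change point in the
    mean of normal data with unit variance and a flat prior."""
    rng = random.Random(seed)
    n = len(data)

    def log_lik(tau):
        # profile log-likelihood: each segment at its own mean
        ll = 0.0
        for seg in (data[:tau], data[tau:]):
            m = sum(seg) / len(seg)
            ll -= 0.5 * sum((x - m) ** 2 for x in seg)
        return ll

    tau = n // 2
    samples = []
    for _ in range(iters):
        # random-walk proposal, clamped to keep both segments nonempty
        prop = min(max(tau + rng.choice([-1, 1]), 1), n - 1)
        if math.log(rng.random()) < log_lik(prop) - log_lik(tau):
            tau = prop
        samples.append(tau)
    return samples
```

The posterior mode of the sampled locations then estimates the change point; a Gibbs wrapper would alternate such updates with draws of the segment parameters.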

Monitoring with VSR Charts and Change Point Estimation

  • Lee, Jae-Heon;Park, Chang-Soon
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2005.05a
    • /
    • pp.191-196
    • /
    • 2005
  • Knowing the time of a process change can lead to quicker identification of the responsible special cause and less process downtime, and it can help reduce the probability of incorrectly identifying the special cause. In this paper, we propose an MLE of the process change point for the case where control charts with a fixed sampling rate (FSR) or variable sampling rate (VSR) scheme monitor a process to detect changes in the process mean and/or variance of a normal quality variable.
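For the mean-shift case with known variance, the standard form of such an MLE can be sketched directly: at a chart signal, choose the candidate change time whose post-change observations deviate most, in aggregate, from the in-control mean. The FSR/VSR-specific details of the paper are not reproduced; this is the generic fixed-rate version.

```python
def mle_change_point(x, mu0):
    """MLE of the time of a step change in the mean of normal data
    with known in-control mean mu0 and known variance: the index t
    (first post-change observation) maximizing
    (n - t) * (mean(x[t:]) - mu0)**2."""
    best_t, best_val = 0, float("-inf")
    for t in range(len(x)):           # candidate first changed index
        tail = x[t:]
        xbar = sum(tail) / len(tail)
        val = len(tail) * (xbar - mu0) ** 2
        if val > best_val:
            best_t, best_val = t, val
    return best_t
```

In practice this is evaluated over the observations collected up to the signal, so the estimate is available immediately when the chart alarms.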



A Study on Medial Surface Extraction from Point Samples on 3D Closed Surfaces in Shell Shapes (셸 형상의 3차원 폐곡면상에서 추출된 점데이터군으로부터 중립곡면 계산에 관한 연구)

  • Woo, Hyuck-Je
    • Korean Journal of Computational Design and Engineering
    • /
    • v.15 no.1
    • /
    • pp.33-42
    • /
    • 2010
  • In this study, new medial surface calculation methods using Voronoi diagrams are investigated for point samples extracted on closed surface models. The medial surface is defined as the closure of all points having more than one closest point on the shape boundary. It is an essential piece of geometric information in 3D and can be used in many areas such as 3D shape analysis, dimension reduction, freeform shape deformation, image processing, computer vision, and FEM analysis. Industrial parts frequently use idealized solids and shell shapes that include sharp edges and vertices. Existing medial surface extraction methods based on Voronoi diagrams have inherent separation and branch problems, so they are not appropriate for sharp-edged objects and are difficult to apply to industrial parts. In addition, the surfaces that branch at sharp edges in shell shapes should be eliminated to obtain representative medial shapes. To avoid the separation and branch problems, a new approach has been developed that analyzes the shapes and applies special sampling on the surfaces.
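The defining criterion, more than one closest boundary point, can be sketched by brute force in 2D: a query point is flagged as (near-)medial when its two smallest distances to the boundary samples are almost equal. This is not the paper's Voronoi-based method; the tolerance and names are illustrative. Notably, the naive criterion also fires between adjacent samples on the same wall, which is exactly the kind of spurious branch the paper's special sampling is designed to suppress.

```python
def medial_points(boundary, grid, tol=1e-6):
    """Brute-force medial detection: a query point is (near-)medial
    when its two smallest squared distances to the boundary samples
    are (almost) equal, i.e. it has more than one closest point."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    medial = []
    for g in grid:
        dists = sorted(d2(g, b) for b in boundary)
        if len(dists) > 1 and dists[1] - dists[0] < tol:
            medial.append(g)
    return medial
```

For two parallel walls of samples, the midline between them is detected as medial while off-center points are not, matching the closure-of-equidistant-points definition above.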