• Title/Summary/Keyword: Global/Local Branch History

Search results: 3

Branch Prediction with Speculative History and Its Effective Recovery Method (분기 정보의 추측적 사용과 효율적 복구 기법)

  • Kwak, Jong-Wook
    • The KIPS Transactions: Part A / v.15A no.4 / pp.217-226 / 2008
  • Branch prediction accuracy is critical to system performance in modern microprocessor architectures. Speculatively updating the branch history provides a substantial improvement in prediction accuracy. However, a speculatively updated history reflects uncommitted branch instructions, so it can compromise program correctness when the speculation turns out to be wrong. Speculative history updates therefore require a suitable recovery mechanism that preserves program correctness while retaining the performance benefit. In this paper, we propose recovery logic for speculatively updated branch history, covering both the global history and the local history. In simulation, our solution improves performance by up to 5.64% while guaranteeing program correctness, and it reduces the additional hardware overhead by almost 90% compared to previous works.
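
The recovery problem described above can be illustrated with a short, generic sketch: the global history register (GHR) is updated with the predicted outcome at prediction time, a checkpoint of the pre-update history is kept per in-flight branch, and the checkpoint (plus the correct outcome) is restored when a misprediction resolves. This is a minimal illustration under assumed names and a 12-bit history width, not the paper's reduced-overhead recovery logic for global and local histories.

```python
# Generic sketch of speculative global-history update with checkpoint-based
# recovery. Names and the 12-bit history width are assumptions; this is not
# the paper's reduced-overhead recovery logic.

HIST_BITS = 12
MASK = (1 << HIST_BITS) - 1


class SpeculativeGHR:
    def __init__(self):
        self.ghr = 0           # speculatively updated global history register
        self.checkpoints = {}  # branch tag -> GHR value saved at predict time

    def on_predict(self, tag, predicted_taken):
        # Checkpoint the pre-update history, then shift in the *predicted*
        # outcome so younger branches see up-to-date speculative history.
        self.checkpoints[tag] = self.ghr
        self.ghr = ((self.ghr << 1) | int(predicted_taken)) & MASK

    def on_resolve(self, tag, predicted_taken, actual_taken):
        saved = self.checkpoints.pop(tag)
        if actual_taken != predicted_taken:
            # Misprediction: restore the checkpoint and insert the correct
            # outcome. Younger branches are squashed by the pipeline, so
            # their speculative history bits (and checkpoints) are discarded.
            self.ghr = ((saved << 1) | int(actual_taken)) & MASK
            self.checkpoints.clear()
        # On a correct prediction the speculative update already matches.
```

A hardware design would keep such checkpoints in a small per-branch buffer rather than a dictionary; storage of that kind is the sort of additional overhead the paper aims to reduce.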

Design of a G-Share Branch Predictor for EISC Processor

  • Kim, InSik; Jun, JaeYung; Na, Yeoul; Kim, Seon Wook
    • IEIE Transactions on Smart Processing and Computing / v.4 no.5 / pp.366-370 / 2015
  • This paper proposes a method for improving the branch predictor of the extendable instruction set computer (EISC) processor. The original EISC branch predictor has several shortcomings: a small branch target buffer, no global history, only a one-bit local branch history, and no prediction for branches that follow LERI, a special instruction used to extend an immediate value. We adopt a G-share branch predictor and eliminate these shortcomings. We verified the new branch predictor on a field-programmable gate array using the Dhrystone benchmark; the proposed EISC branch predictor achieves higher branch prediction accuracy than the conventional predictor.
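
A G-share predictor of the kind adopted here is straightforward to sketch: the branch address is XORed with a global history register to index a table of two-bit saturating counters, and the resolved outcome is shifted into the history. The following is a minimal sketch with assumed, illustrative sizes (12-bit history, 4K-entry table), not the configuration used in the EISC implementation.

```python
# Minimal gshare sketch: branch PC XOR global history indexes a table of
# two-bit saturating counters. Sizes and names are illustrative assumptions,
# not the EISC predictor's actual configuration.

HIST_BITS = 12
TABLE_SIZE = 1 << HIST_BITS


class GsharePredictor:
    def __init__(self):
        self.ghr = 0                 # global history register
        self.pht = [1] * TABLE_SIZE  # 2-bit counters, start weakly not-taken

    def _index(self, pc):
        # Fold the branch address and the global history into one table index.
        return (pc ^ self.ghr) & (TABLE_SIZE - 1)

    def predict(self, pc):
        # Counter values 2 and 3 mean "predict taken".
        return self.pht[self._index(pc)] >= 2

    def update(self, pc, taken):
        i = self._index(pc)
        self.pht[i] = min(3, self.pht[i] + 1) if taken else max(0, self.pht[i] - 1)
        # Shift the resolved outcome into the global history.
        self.ghr = ((self.ghr << 1) | int(taken)) & (TABLE_SIZE - 1)
```

Typical use is a call to predict(pc) at fetch time and update(pc, taken) once the branch resolves; the XOR of address and history is what lets the same table capture both global and per-branch behavior.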

Optimization of Gaussian Mixture in CDHMM Training for Improved Speech Recognition

  • Lee, Seo-Gu; Kim, Sung-Gil; Kang, Sun-Mee; Ko, Han-Seok
    • Speech Sciences / v.5 no.1 / pp.7-21 / 1999
  • This paper proposes an improved training procedure for speech recognition based on the continuous density Hidden Markov Model (CDHMM). Of the three parameters governing the CDHMM (the initial state distribution, the state transition probabilities, and the output probability density function (p.d.f.) of each state), we focus on the third and propose an efficient algorithm for determining the p.d.f. of each state. It is known that the resulting CDHMM converges to a local maximum of the parameter estimate under the iterative Expectation-Maximization procedure. Specifically, we propose two independent algorithms that can be embedded in the segmental K-means training procedure by replacing its relevant key steps: adaptation of the number of mixture Gaussian p.d.f.s, and initialization using previously estimated CDHMM parameters. The proposed adaptation algorithm searches for the optimal number of mixture Gaussian components so that the p.d.f. is consistently re-estimated, enabling the model to converge toward the global maximum. The optimal number of components is determined by applying an appropriate threshold to the collective change of the weighted variances. The initialization algorithm exploits the previously estimated CDHMM parameters as the basis for the current initial segmentation subroutine; it captures the trend of the previous training history, which uniform segmentation discards. The recognition performance of the proposed adaptation procedure, together with the suggested initialization, is verified to be consistently better than that of the existing training procedure with a fixed number of mixture Gaussian p.d.f.s.
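
The mixture-count adaptation described in this abstract (stop splitting once the collective change in weighted variances falls below a threshold) can be roughed out as follows. This sketch is a loose analogue only: it uses scikit-learn's GaussianMixture in place of the paper's segmental K-means embedding, and the function names, threshold value, and diagonal-covariance choice are all illustrative assumptions.

```python
# Loose sketch of threshold-based selection of the number of mixture Gaussians
# for one HMM state. This is NOT the paper's segmental K-means procedure:
# scikit-learn's GaussianMixture stands in for the state-level re-estimation,
# and the threshold and diagonal covariances are illustrative assumptions.
import numpy as np
from sklearn.mixture import GaussianMixture


def weighted_variance(gmm):
    # Collective weighted variance: mixture weights times per-component
    # (diagonal) variances, summed over components and feature dimensions.
    return float(np.sum(gmm.weights_[:, None] * gmm.covariances_))


def adapt_mixture_count(frames, max_mixtures=8, threshold=1e-2):
    """frames: (N, D) array of feature vectors assigned to one HMM state."""
    prev = None
    for k in range(1, max_mixtures + 1):
        gmm = GaussianMixture(n_components=k, covariance_type="diag",
                              random_state=0).fit(frames)
        wv = weighted_variance(gmm)
        # Stop splitting once adding a component barely changes the
        # collective weighted variance.
        if prev is not None and abs(prev - wv) / prev < threshold:
            return k - 1
        prev = wv
    return max_mixtures
```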
