• Title/Summary/Keyword: asymptotic performance

Search Results: 264

Performance of a Bayesian Design Compared to Some Optimal Designs for Linear Calibration (선형 캘리브레이션에서 베이지안 실험계획과 기존의 최적실험계획과의 효과비교)

  • 김성철
    • The Korean Journal of Applied Statistics
    • /
    • v.10 no.1
    • /
    • pp.69-84
    • /
    • 1997
  • We consider a linear calibration problem, $y_i = \alpha + \beta(x_i - x_0) + \epsilon_i$, $i = 1, 2, \ldots, n$, and $y_f = \alpha + \beta(x_f - x_0) + \epsilon$, where we observe the $(x_i, y_i)$'s from controlled calibration experiments and later make inference about $x_f$ from a new observation $y_f$. The objective of the calibration design problem is to find the optimal design $x = (x_1, \ldots, x_n)$ that gives the best estimate of $x_f$. We compare Kim (1989)'s Bayesian design, which minimizes the expected value of the posterior variance of $x_f$, with some optimal designs from the literature. Kim proposed the Bayesian optimal design based on an analysis of the characteristics of the expected loss function and on numerical results, which indicate that the average of the design points should equal the prior mean of $x_f$ and that the sum of squares should be as large as possible. The designs to be compared are (1) Buonaccorsi (1986)'s AV-optimal design, which minimizes the average asymptotic variance of the classical estimator, (2) the D-optimal and A-optimal designs for the linear regression model, which optimize functions of $M(x) = \sum x_i x_i'$, and (3) the reference design of Hunter and Lamboy (1981). To compare these designs, each of which is optimal in some sense, we use two criteria: first, the expected posterior variance, and second, a Monte Carlo simulation in which we obtain HPD intervals and compare their lengths. If the prior mean of $x_f$ is at the center of the finite design interval, then the Bayesian, AV-optimal, D-optimal, and A-optimal designs are identical: the equally weighted end-point design. If the prior mean is not at the center, however, they are not expected to be identical; in this case, we demonstrate that the nearly Bayesian-optimal design is slightly better than the approximate AV-optimal design. We also investigate the effect of the prior variance of the parameters, and the solution for the case in which the number of experiments is odd. (A small numerical sketch of this calibration setting follows this entry.)

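As a companion to the abstract above, here is a minimal Monte Carlo sketch of the classical controlled-calibration setting it describes: the estimator $\hat{x}_f = x_0 + (y_f - \hat{\alpha})/\hat{\beta}$ is simulated under an equally weighted end-point design and a uniform design, and the spread of the estimates is compared. All numerical values (alpha, beta, sigma, the design interval, x_f) are illustrative assumptions, not taken from the paper, and the Bayesian design itself is not implemented.

```python
import numpy as np

# Illustrative Monte Carlo comparison of two calibration designs (not the paper's code).
rng = np.random.default_rng(0)
alpha, beta, sigma = 1.0, 2.0, 0.5   # assumed true parameters and noise sd
x0, n = 0.0, 10                      # centering point and number of calibration runs
x_f_true = 0.9                       # assumed unknown quantity of interest

designs = {
    "end-point": np.array([-1.0] * (n // 2) + [1.0] * (n // 2)),  # equally weighted end points
    "uniform":   np.linspace(-1.0, 1.0, n),
}

def classical_estimate(x, rng):
    """Fit y = alpha + beta*(x - x0) by least squares, then invert at a new y_f."""
    y = alpha + beta * (x - x0) + rng.normal(0, sigma, size=x.size)
    X = np.column_stack([np.ones_like(x), x - x0])
    a_hat, b_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    y_f = alpha + beta * (x_f_true - x0) + rng.normal(0, sigma)
    return x0 + (y_f - a_hat) / b_hat   # classical calibration estimator

for name, x in designs.items():
    est = np.array([classical_estimate(x, rng) for _ in range(5000)])
    print(f"{name:10s}  mean={est.mean():+.3f}  sd={est.std():.3f}")
```

With these assumed values the end-point design yields a slightly smaller spread for $\hat{x}_f$ than the uniform design; the difference is modest because the noise in $y_f$ itself dominates the estimator's variance.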

A Test for Nonlinear Causality and Its Application to Money, Production and Prices (통화(通貨)·생산(生産)·물가(物價)의 비선형인과관계(非線型因果關係) 검정(檢定))

  • Baek, Ehung-gi
    • KDI Journal of Economic Policy
    • /
    • v.13 no.4
    • /
    • pp.117-140
    • /
    • 1991
  • The primary purpose of this paper is to introduce a nonparametric statistical tool developed by Baek and Brock for detecting a unidirectional causal ordering between two economic variables, and to apply it to macroeconomic relationships among money, production, and prices. The tool can be applied to other causal structures as well, for instance, defense spending and economic performance, or a stock market index and market interest rates. A key building block of the test for nonlinear Granger causality used in this paper is the correlation integral. The main emphasis is on nonlinear causal structure rather than linear structure, because the conventional F-test already provides high power against linear causal relationships. The nonlinear causality test is derived from the asymptotic normality of the test statistic, and the size of the test is reported for selected parameter values. When the test is applied to a money, production, and prices model, some evidence of nonlinear causality is found after correcting the size of the test. For instance, nonlinear causal relationships between production and prices are demonstrated in both directions, whereas these relationships are missed by the conventional F-test. Similar results between money and prices are obtained at higher lags. (A sketch of the correlation integral follows this entry.)

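As a rough illustration of the building block named in the abstract above, the following sketch computes a correlation integral (correlation sum) of a scalar series: the fraction of pairs of m-histories lying within a tolerance eps of each other in the sup norm. The embedding dimension m, the tolerance eps, and the simulated series are illustrative assumptions; this is not the authors' test statistic.

```python
import numpy as np

def correlation_integral(x, m=2, eps=0.5):
    """Fraction of distinct pairs of m-histories of x within eps of each other (sup norm)."""
    x = np.asarray(x, dtype=float)
    n = x.size - m + 1
    hist = np.column_stack([x[i:i + n] for i in range(m)])            # all m-histories
    dist = np.abs(hist[:, None, :] - hist[None, :, :]).max(axis=2)    # pairwise sup-norm distances
    iu = np.triu_indices(n, k=1)                                      # distinct pairs only
    return (dist[iu] < eps).mean()

rng = np.random.default_rng(1)
series = rng.standard_normal(500)      # assumed i.i.d. series for illustration
print(correlation_integral(series, m=2, eps=0.5))
```

A causality test of the Baek-Brock type compares such correlation sums computed with and without conditioning on the lagged values of the candidate causal variable; the sketch above only shows the underlying quantity.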

Adaptive Service Mode Conversion to Minimize Buffer Space Requirement in VOD Server (주문형 비디오 서버의 버퍼 최소화를 위한 가변적 서비스 모드 변환)

  • Won, Yu-Jip
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.28 no.5
    • /
    • pp.213-217
    • /
    • 2001
  • The excessive memory buffer required for continuous media playback is a serious impediment to the widespread use of on-line multimedia services. The skewed access frequency of the available video files provides an opportunity to re-use data blocks loaded by one session for later sessions. We present a novel algorithm that minimizes the buffer requirement across multiple concurrent multimedia playback sessions. In continuous media playback from disk, a certain amount of memory buffer is required to synchronize the asynchronous disk read operation with the synchronous playback operation. As the aggregate playback bandwidth increases, a larger buffer must be allocated for this synchronization. The focus of this work is to study the asymptotic behavior of the synchronization buffer requirement and to develop an algorithm that copes with this excessive buffer requirement under bandwidth congestion. We argue that in a large-scale continuous media server it may not be necessary to read the blocks for every session directly from disk. The strength of our approach is that it dynamically adapts to the disk utilization of the server and finds the optimal way of servicing the individual sessions while minimizing the overall buffer space requirement. The optimality of the proposed algorithm is proven, and its effectiveness and performance are examined via simulation. (A simplified mode-assignment sketch follows this entry.)

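The abstract above argues for adaptively deciding whether each session reads from disk or re-uses blocks loaded by an earlier session of the same file. The following is a simplified greedy sketch of that idea, not the paper's algorithm; the per-stream synchronization buffer, the playback rate, and the disk-stream cap are illustrative assumptions.

```python
# Greedy, simplified sketch: serve each session of one file either in DISK mode
# (one disk stream plus a fixed synchronization buffer) or in REUSE mode
# (buffer proportional to the time gap behind the previous session), while
# keeping the number of disk streams under an assumed bandwidth cap.
SYNC_BUFFER_MB = 4.0      # assumed per-stream synchronization buffer
RATE_MB_PER_S = 0.75      # assumed playback rate
DISK_STREAM_CAP = 2       # assumed disk bandwidth cap (concurrent streams)

def assign_modes(start_times):
    """start_times: sorted start times (seconds) of sessions playing one file."""
    modes, total_buffer_mb, disk_streams = [], 0.0, 0
    for i, t in enumerate(start_times):
        reuse_mb = (t - start_times[i - 1]) * RATE_MB_PER_S if i else float("inf")
        if i == 0 or (disk_streams < DISK_STREAM_CAP and SYNC_BUFFER_MB <= reuse_mb):
            modes.append("DISK")
            disk_streams += 1
            total_buffer_mb += SYNC_BUFFER_MB
        else:
            modes.append("REUSE")
            total_buffer_mb += reuse_mb
    return modes, total_buffer_mb

print(assign_modes([0, 5, 40, 42, 90]))
```

With the assumed cap of two disk streams, later sessions are forced into REUSE mode even when the gap (and hence the buffer) is large, which mirrors the bandwidth-congestion behavior the abstract discusses.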

The Asymptotic Throughput and Connectivity of Cognitive Radio Networks with Directional Transmission

  • Wei, Zhiqing;Feng, Zhiyong;Zhang, Qixun;Li, Wei;Gulliver, T. Aaron
    • Journal of Communications and Networks
    • /
    • v.16 no.2
    • /
    • pp.227-237
    • /
    • 2014
  • Throughput scaling laws for two coexisting ad hoc networks with m primary users (PUs) and n secondary users (SUs) randomly distributed in a unit area have been widely studied. Early work showed that the secondary network performs as well as a stand-alone network; namely, the per-node throughput of the secondary network is $\Theta(1/\sqrt{n \log n})$. In this paper, we show that by exploiting directional spectrum opportunities in the secondary network, its throughput can be improved. If the beamwidth of the secondary transmitter (TX)'s main lobe is $\delta = o(1/\log n)$, SUs can achieve a per-node throughput of $\Theta(1/\sqrt{n \log n})$ with directional transmission and omni reception (DTOR), which is $\Theta(\log n)$ times higher than the throughput without directional transmission. In contrast, if $\delta = \omega(1/\log n)$, the throughput gain of the SUs under DTOR is $2\pi/\delta$ compared with the throughput without directional antennas. We derive the throughput for the other cases of directional transmission similarly. Connectivity is another critical metric for evaluating the performance of random ad hoc networks. The relation between the number of SUs n and the number of PUs m is assumed to be $n = m^{\beta}$. We show that with the HDP-VDP routing scheme, which is widely employed in the analysis of throughput scaling laws of ad hoc networks, the connectivity of a single SU can be guaranteed when $\beta > 1$, and the connectivity of a single secondary path can be guaranteed when $\beta > 2$. With circumventing routing, which can improve the connectivity of a cognitive radio ad hoc network, we verify that the connectivity of a single SU as well as of a single secondary path can be guaranteed when $\beta > 1$. Thus, to achieve connectivity of the secondary network, the density of SUs should be asymptotically larger than that of PUs. (A small evaluation of the quoted scalings follows this entry.)
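The scaling expressions quoted in the abstract above can be evaluated numerically to see the trends they imply. The sketch below plugs a few values of n and the beamwidth delta into the leading-order terms; since $\Theta(\cdot)$ hides constants, only the relative trends are meaningful, and the chosen values of n and delta are illustrative assumptions.

```python
import math

def baseline_per_node_throughput(n):
    """Leading-order term of the Theta(1/sqrt(n log n)) per-node throughput."""
    return 1.0 / math.sqrt(n * math.log(n))

def dtor_gain(delta):
    """Throughput gain 2*pi/delta quoted for DTOR when delta = omega(1/log n)."""
    return 2.0 * math.pi / delta

for n in (10**3, 10**4, 10**5):
    for delta in (math.pi / 6, math.pi / 18):   # assumed beamwidths (30 and 10 degrees)
        base = baseline_per_node_throughput(n)
        print(f"n={n:>6}  delta={delta:.3f}  base~{base:.2e}  DTOR~{base * dtor_gain(delta):.2e}")
```

The output shows the baseline per-node throughput shrinking as n grows, while narrowing the beamwidth multiplies it by the quoted factor of $2\pi/\delta$.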