• Title/Summary/Keyword: conditional variance

Search Results: 83

Determination of Conversion Weight of Convertible Bonds Using Mean/Value-at-Risk Optimization Models (평균/VaR 최적화 모형에 의한 전환사채 주식전환 비중 결정)

  • Park, Koohyun
    • Korean Management Science Review / v.30 no.3 / pp.55-70 / 2013
  • In this study we suggest two optimization models to determine the conversion weight of convertible bonds. The problem is the same as that of Park and Shim [1], but this study uses Value-at-Risk (VaR) as the risk measure instead of Conditional Value-at-Risk (CVaR). In comparison with conventional Markowitz portfolio models, which use the variance of return, our models use VaR. In 1996, the Basel Committee on Banking Supervision recommended VaR for portfolio risk measurement. However, optimization models that include VaR are difficult to solve; Benati and Rizzi [5] proved the NP-hardness of general portfolio optimization problems including VaR. We adopted their approach, but developed efficient algorithms with time complexity O(n log n) or less for our models. We applied our models to a convertible bond issued by the semiconductor company Hynix.
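
As a hedged illustration of the VaR measure used in the entry above (not the paper's Mean/VaR optimization model or its O(n log n) algorithm), the sketch below estimates the historical 99% VaR of a simple two-asset portfolio as the conversion weight varies; the return distributions and weights are hypothetical.

```python
# Historical VaR of a hypothetical two-asset portfolio as the conversion weight
# varies. Illustrative only; the return distributions below are made up.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
bond_like = rng.normal(0.0002, 0.002, n)    # hypothetical bond-like daily returns
stock_like = rng.normal(0.0005, 0.02, n)    # hypothetical stock-like daily returns

def portfolio_var(weight_stock, alpha=0.99):
    """Historical VaR: the loss exceeded with probability 1 - alpha."""
    returns = weight_stock * stock_like + (1 - weight_stock) * bond_like
    return -np.quantile(returns, 1 - alpha)

for w in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"conversion weight {w:.2f}: 99% VaR = {portfolio_var(w):.4f}")
```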

Particle filter for model updating and reliability estimation of existing structures

  • Yoshida, Ikumasa;Akiyama, Mitsuyoshi
    • Smart Structures and Systems / v.11 no.1 / pp.103-122 / 2013
  • It is essential to update the model by reflecting observation or inspection data when estimating the reliability of existing structures. The authors propose an updated reliability analysis using a Particle Filter. We discuss how to apply the proposed method through numerical examples on reinforced concrete structures, after verifying the method on a hypothetical linear Gaussian problem. Reinforced concrete structures in a marine environment deteriorate with time due to chloride-induced corrosion of the reinforcing bars. For existing structures, it is essential to monitor the current condition, such as chloride-induced corrosion, and to reflect it in rational maintenance while accounting for the uncertainty. In this context, updated reliability estimation of a structure provides useful information for rational decision making. Accuracy estimation is also an important issue when a Monte Carlo approach such as the Particle Filter is adopted; in particular, the Particle Filter has a problem known as degeneracy. The effective sample size is introduced to predict the variance of the limit-state exceedance probabilities calculated by the Particle Filter, and its validity is shown by numerical experiments.
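
The sketch below is a generic bootstrap particle filter for a linear Gaussian state-space model, showing the effective-sample-size criterion mentioned above; it is an illustration only, not the authors' reliability-updating code, and all model parameters are hypothetical.

```python
# Generic bootstrap particle filter for a linear Gaussian state-space model,
# with the effective sample size (ESS) used to monitor degeneracy and trigger
# resampling. Illustration only; not the authors' reliability-updating code.
import numpy as np

rng = np.random.default_rng(1)
T, N = 50, 1000             # time steps, number of particles
a, q, r = 0.9, 0.5, 1.0     # hypothetical transition coefficient, state and obs. noise variances

# simulate a true state path and noisy observations
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x_true[t] + rng.normal(0, np.sqrt(r))

particles = rng.normal(0, 1, N)
weights = np.full(N, 1.0 / N)
for t in range(1, T):
    particles = a * particles + rng.normal(0, np.sqrt(q), N)   # propagate
    weights *= np.exp(-0.5 * (y[t] - particles) ** 2 / r)      # reweight by likelihood
    weights /= weights.sum()
    ess = 1.0 / np.sum(weights ** 2)                           # effective sample size
    if ess < N / 2:                                            # resample when degenerate
        idx = rng.choice(N, size=N, p=weights)
        particles = particles[idx]
        weights = np.full(N, 1.0 / N)

print("filtered mean at the final step:", np.sum(weights * particles))
print("true state at the final step   :", x_true[-1])
```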

ON THE FLUCTUATION IN THE RANDOM ASSIGNMENT PROBLEM

  • Lee, Sung-Chul;Su, Zhong-Gen
    • Communications of the Korean Mathematical Society / v.17 no.2 / pp.321-330 / 2002
  • Consider the random assignment (or bipartite matching) problem with iid uniform edge costs t(i, j). Let $A_n$ be the optimal assignment cost. Only recently did Aldous [2] give a rigorous proof that $E A_n \longrightarrow \zeta(2)$. In this paper we establish upper and lower bounds for $\operatorname{Var} A_n$: there exist two strictly positive, finite constants $C_1$ and $C_2$ such that $C_1 n^{-5/2} (\log n)^{-3/2} \leq \operatorname{Var} A_n \leq C_2 n^{-1} (\log n)^2$.
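
As a quick numerical companion to the entry above, the sketch below solves random assignment instances with iid Uniform(0,1) costs and reports the empirical mean of $A_n$ (which should approach $\zeta(2) = \pi^2/6 \approx 1.645$) and its empirical variance; the instance size and replication count are arbitrary choices, and the paper's bounds are analytic, not simulated.

```python
# Monte Carlo companion: solve random assignment instances with iid Uniform(0,1)
# edge costs and report the empirical mean and variance of the optimal cost A_n.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(2)
n, reps = 100, 200
costs = np.empty(reps)
for k in range(reps):
    C = rng.random((n, n))                   # iid uniform edge costs t(i, j)
    rows, cols = linear_sum_assignment(C)    # optimal assignment
    costs[k] = C[rows, cols].sum()           # A_n

print("mean of A_n    :", costs.mean(), " (zeta(2) =", np.pi ** 2 / 6, ")")
print("variance of A_n:", costs.var(ddof=1))
```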

Importance sampling with splitting for portfolio credit risk

  • Kim, Jinyoung;Kim, Sunggon
    • Communications for Statistical Applications and Methods / v.27 no.3 / pp.327-347 / 2020
  • We consider a credit portfolio with highly skewed exposures, in which a small number of obligors have very high exposures compared with the others. For the Bernoulli mixture model with highly skewed exposures, we propose a new importance sampling scheme to estimate the tail loss probability over a threshold and the corresponding expected shortfall. We stratify the sample space of the default events into two subsets, one of which consists of the events in which the obligors with heavy exposures default simultaneously; we expect typical tail loss events to belong to this set. In the proposed scheme, the tail loss probability and the expected shortfall corresponding to this type of event are estimated by conditional Monte Carlo, which results in variance reduction. We analyze the properties of the proposed scheme mathematically. In a numerical study, the performance of the proposed scheme is compared with an existing importance sampling method.
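
For context, the sketch below gives a crude Monte Carlo baseline for the tail loss probability and expected shortfall under a one-factor Bernoulli mixture portfolio with a few heavy exposures; it is not the authors' importance sampling with splitting, and the exposures, default probability, factor loading, and threshold are hypothetical.

```python
# Crude Monte Carlo baseline for the tail loss probability P(L > x) and the
# expected shortfall under a one-factor Bernoulli mixture with skewed exposures.
# Not the authors' importance sampling with splitting; all parameters are made up.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
m, reps = 500, 20_000
exposures = np.ones(m)
exposures[:10] = 50.0                 # a small number of obligors with heavy exposures
p, rho = 0.01, 0.2                    # marginal default probability, factor loading

z = rng.normal(size=(reps, 1))        # common systematic factor
eps = rng.normal(size=(reps, m))      # idiosyncratic factors
x = np.sqrt(rho) * z + np.sqrt(1 - rho) * eps
defaults = x < norm.ppf(p)            # Bernoulli mixture default indicators
losses = defaults @ exposures         # portfolio loss in each scenario

threshold = 100.0
tail = losses > threshold
print("tail loss probability estimate:", tail.mean())
print("expected shortfall estimate   :", losses[tail].mean() if tail.any() else float("nan"))
```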

Estimation of Gini-Simpson index for SNP data

  • Kang, Joonsung
    • Journal of the Korean Data and Information Science Society / v.28 no.6 / pp.1557-1564 / 2017
  • We consider genomic sequence data with high dimension and low sample size (HDLSS) and without an ordering of the response categories. When constructing an appropriate test statistic in this setting, the classical multivariate analysis of variance (MANOVA) approach might not be useful owing to the very large number of parameters and the very small sample size. For these reasons, we present a pseudo marginal model based on the Gini-Simpson index estimated via a Bayesian approach. In view of the small sample size, we consider the permutation distribution over all n! (equally likely) permutations of the joined sample observations across G groups of sizes $n_1, \ldots, n_G$. We simulate data and apply the false discovery rate (FDR) and positive false discovery rate (pFDR) procedures with the associated proposed test statistics. We also analyze real SARS data and compute the FDR and pFDR. The FDR and pFDR procedures, along with the associated test statistics for each gene, control the FDR and pFDR, respectively, at any level $\alpha$ for the set of p-values by using exact conditional permutation theory.
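
The sketch below shows the Gini-Simpson index, $1 - \sum_i p_i^2$, computed from genotype counts at a single SNP, together with a simple permutation p-value for a two-group difference; the genotype frequencies are hypothetical, and this is not the paper's Bayesian estimator or FDR procedure.

```python
# Gini-Simpson index for genotype counts at one SNP, plus a simple permutation
# p-value for a two-group difference. Hypothetical genotype frequencies only.
import numpy as np

rng = np.random.default_rng(4)

def gini_simpson(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

# hypothetical genotype calls (minor-allele counts 0/1/2) for two groups of 30
g1 = rng.choice([0, 1, 2], size=30, p=[0.6, 0.3, 0.1])
g2 = rng.choice([0, 1, 2], size=30, p=[0.3, 0.4, 0.3])
observed = abs(gini_simpson(g1) - gini_simpson(g2))

pooled = np.concatenate([g1, g2])
perm_stats = np.empty(5000)
for i in range(perm_stats.size):                 # permute the joined sample
    rng.shuffle(pooled)
    perm_stats[i] = abs(gini_simpson(pooled[:30]) - gini_simpson(pooled[30:]))

print("observed difference :", observed)
print("permutation p-value :", np.mean(perm_stats >= observed))
```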

Bivariate Dagum distribution

  • Muhammed, Hiba Z.
    • International Journal of Reliability and Applications / v.18 no.2 / pp.65-82 / 2017
  • Camilo Dagum proposed several variants of a new model for the size distribution of personal income in a series of papers in the 1970s. He traced the genesis of the Dagum distributions in applied economics and pointed out parallel developments in several branches of the applied statistics literature. The main aim of this paper is to define a bivariate Dagum distribution so that the marginals have Dagum distributions. It is observed that the joint probability density function and the joint cumulative distribution function can be expressed in closed form. Several properties of this distribution, such as the marginals, conditional distributions, and product moments, are discussed. The maximum likelihood estimates of the unknown parameters and their approximate variance-covariance matrix are obtained. Some simulations are performed to examine the performance of the MLEs, and one data analysis is presented for illustrative purposes.
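
As a small companion to the entry above, the sketch below implements the univariate Dagum distribution, $F(x) = (1 + (x/b)^{-a})^{-p}$, with inverse-CDF sampling; the paper's bivariate construction couples two such marginals and is not reproduced here, and the parameter values are hypothetical.

```python
# Univariate Dagum distribution, F(x) = (1 + (x/b)^(-a))^(-p), with inverse-CDF
# sampling. Hypothetical parameter values; the bivariate coupling is not shown.
import numpy as np

rng = np.random.default_rng(5)
a, b, p = 3.0, 2.0, 0.7      # shape, scale, shape parameters

def dagum_cdf(x):
    return (1.0 + (x / b) ** (-a)) ** (-p)

def dagum_ppf(u):
    # invert F: x = b * (u^(-1/p) - 1)^(-1/a)
    return b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)

sample = dagum_ppf(rng.random(100_000))
print("empirical P(X <= 2):", np.mean(sample <= 2.0), " theoretical:", dagum_cdf(2.0))
print("empirical median   :", np.median(sample), " theoretical:", dagum_ppf(0.5))
```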


Bootstrap-Based Test for Volatility Shifts in GARCH against Long-Range Dependence

  • Wang, Yu;Park, Cheolwoo;Lee, Taewook
    • Communications for Statistical Applications and Methods / v.22 no.5 / pp.495-506 / 2015
  • Volatility is a measure of the variation in the returns of a financial instrument over time. GARCH models have been a popular tool for analyzing the volatility of financial time series since Bollerslev (1986), and volatility is said to be highly persistent when the sum of the estimated coefficients of the squared lagged returns and the lagged conditional variance terms in a GARCH model is close to 1. Numerous methods have been proposed to test whether such persistence is due to volatility shifts in the market or to natural fluctuation explained by stationary long-range dependence (LRD). Recently, Lee et al. (2015) proposed a residual-based cumulative sum (CUSUM) test statistic for volatility shifts in GARCH models against LRD. We propose a bootstrap-based approach for the residual-based test and compare the sizes and powers of our bootstrap-based CUSUM test with those of the test in Lee et al. (2015) through simulation studies.
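
To make the persistence notion above concrete, the sketch below simulates a GARCH(1,1) process with conditional variance $h_t = \omega + \alpha r_{t-1}^2 + \beta h_{t-1}$ and persistence $\alpha + \beta$ close to 1; the parameter values are hypothetical, and the residual-based CUSUM test of Lee et al. (2015) and its bootstrap version are not implemented here.

```python
# Simulated GARCH(1,1) returns with conditional variance
# h_t = w + alpha * r_{t-1}^2 + beta * h_{t-1}, persistence alpha + beta ~ 1.
# Simulation sketch only; no CUSUM or bootstrap testing is performed.
import numpy as np

rng = np.random.default_rng(6)
T = 2000
w, alpha, beta = 0.05, 0.08, 0.90        # hypothetical coefficients, persistence 0.98

r = np.zeros(T)
h = np.zeros(T)
h[0] = w / (1 - alpha - beta)            # start from the unconditional variance
for t in range(1, T):
    h[t] = w + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    r[t] = np.sqrt(h[t]) * rng.normal()

print("persistence alpha + beta       :", alpha + beta)
print("implied unconditional variance :", w / (1 - alpha - beta))
print("sample variance of the returns :", r.var())
```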

Local Uncertainty of Thickness of Consolidation Layer for Songdo New City (송도신도시 압밀층 두께의 국부적 불확실성 평가)

  • Kim, Dong-Hee;Ryu, Dong-Woo;Chae, Young-Ho;Lee, Woo-Jin
    • Journal of the Korean Geotechnical Society / v.28 no.1 / pp.17-27 / 2012
  • Since geologic data are often sampled at sparse locations, it is important not only to predict attribute values at unsampled locations but also to assess the uncertainty attached to the prediction. In this study, the local uncertainty of the prediction of the consolidation layer thickness was assessed using the indicator approach. A conditional cumulative distribution function (ccdf) was first modeled, and then E-type estimates and the conditional variance were computed for the spatial distribution of the thickness of the consolidation layer. These results can be used to estimate the spatial distribution of secondary compression and to assess its local uncertainty for Songdo New City.
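
The sketch below shows how an E-type estimate and a conditional variance follow from a discretized ccdf at one unsampled location, which is the post-processing step described in the entry above; the thickness thresholds and probabilities are hypothetical, not the Songdo New City data, and the indicator kriging that would produce the ccdf is not implemented.

```python
# Post-processing a discretized ccdf at one unsampled location: probability mass
# per thickness class, E-type estimate, and conditional variance.
# Hypothetical thresholds and ccdf values only.
import numpy as np

thresholds = np.array([5.0, 10.0, 15.0, 20.0, 25.0])   # thickness thresholds (m)
ccdf = np.array([0.10, 0.35, 0.70, 0.90, 1.00])        # F(z_k) = Prob{Z <= z_k}

pmf = np.diff(np.concatenate([[0.0], ccdf]))           # probability mass per class
mids = np.concatenate([[thresholds[0] / 2],            # class-midpoint thickness values
                       (thresholds[:-1] + thresholds[1:]) / 2])

e_type = np.sum(pmf * mids)                            # E-type estimate
cond_var = np.sum(pmf * (mids - e_type) ** 2)          # conditional variance
print("E-type estimate (m) :", e_type)
print("conditional variance:", cond_var)
```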

Integration of Kriging Algorithm and Remote Sensing Data and Uncertainty Analysis for Environmental Thematic Mapping: A Case Study of Sediment Grain Size Mapping (지표환경 주제도 작성을 위한 크리깅 기법과 원격탐사 자료의 통합 및 불확실성 분석 -입도분포지도 사례 연구-)

  • Park, No-Wook;Jang, Dong-Ho
    • Journal of the Korean Geographical Society / v.44 no.3 / pp.395-409 / 2009
  • The objective of this paper is to illustrate that kriging can provide an effective framework both for integrating remote sensing data and for uncertainty modeling, through a case study of sediment grain size mapping. Landsat TM data, which show reasonable relationships with the grain size values, are used as secondary information for sediment grain size mapping near the eastern part of Anmyeondo and Cheonsuman Bay. The case study results showed that the uncertainty attached to predictions at unsampled locations was significantly reduced by integrating the remote sensing data, as revealed by the analysis of the conditional variance of the conditional cumulative distribution functions. It is expected that the kriging-based approach presented in this paper would serve as an efficient integration and analysis methodology not only for sediment grain size mapping but for any environmental thematic mapping that uses secondary information.
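
As a toy illustration of the uncertainty-reduction point above, the sketch below shows how conditioning on a correlated secondary variable (standing in here for Landsat TM reflectance) shrinks the conditional variance under a bivariate Gaussian model, where $\mathrm{Var}(Z \mid \text{secondary}) = \sigma_Z^2 (1 - \rho^2)$; the standard deviation and correlations are hypothetical, and this is not the paper's kriging workflow.

```python
# Toy uncertainty-reduction calculation: under a bivariate Gaussian model,
# conditioning on a secondary variable with correlation rho gives
# Var(Z | secondary) = sigma_Z**2 * (1 - rho**2). Hypothetical values only.
sigma_z = 1.5       # hypothetical prior std. dev. of grain size at a location
for rho in (0.0, 0.3, 0.6, 0.9):
    cond_var = sigma_z ** 2 * (1 - rho ** 2)
    print(f"correlation with secondary data rho = {rho:.1f}: "
          f"conditional variance = {cond_var:.3f}")
```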

A study on the Linkage of Volatility in Stock Markets under Global Financial Crisis (글로벌 금융위기하에서 주식시장 변동성의 연관성에 대한 연구)

  • Lee, Kyung-Hee;Kim, Kyung-Soo
    • Management & Information Systems Review / v.33 no.1 / pp.139-155 / 2014
  • This study examines the linkage of volatility between the stock market of India and those of other countries in the context of the integration of the world economy. The results were as follows. First, autocorrelation or serial correlation did not exist in the classic RS model, but long-term memory was present in the modified RS model. Second, no unit root was found in the unit root tests for all periods, and in the ARFIMA model the series showed stable explanatory power and long-term memory under normal conditions. Third, in the multivariate asymmetric BEKK and VAR models before the financial crisis, the conditional mean equation showed strong own-market influences for Taiwan and the UK, and strong one-way spillover effects from Japan to India, from Taiwan to China (Korea, US), and from the US (Japan) to the UK. In the conditional variance equation, the GARCH coefficients showed strong spillover effects in the same direction as indicated by the ARCH coefficients of the markets themselves. Asymmetric effects existed in three home markets and between markets. Fourth, after the financial crisis, only the domestic market of Taiwan showed strong influence in the conditional mean equation, and strong one-way spillover effects existed from India to the US, from Taiwan to Japan, and from Korea to Germany. In the conditional variance equation, the strong spillover effects were the same as in the pre-crisis period, an asymmetric effect was present in the domestic market of the UK, and a one-way asymmetric effect existed from Taiwan to Germany. Therefore, the results of this study present the linkage between the volatility of the Indian stock market and that of other countries under the integration of the world economy, confirming the asymmetric reactions and return (volatility) spillover effects between the Indian stock market and other markets.
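
Finally, as a simplified picture of the volatility spillover idea in the entry above, the sketch below simulates two markets whose conditional variances depend on their own and on each other's lagged squared returns; this toy cross-GARCH form is not the asymmetric BEKK-VAR model estimated in the paper, and all coefficients are hypothetical.

```python
# Simplified two-market volatility spillover: each market's conditional variance
# depends on its own lagged squared return, the other market's lagged squared
# return, and its own lagged conditional variance. Hypothetical coefficients only.
import numpy as np

rng = np.random.default_rng(7)
T = 1500
w = np.array([0.05, 0.05])               # constants
own_arch = np.array([0.06, 0.06])        # own ARCH effects
cross_arch = np.array([[0.00, 0.04],     # spillover from market 2 to market 1
                       [0.01, 0.00]])    # weaker spillover from market 1 to market 2
garch = np.array([0.88, 0.88])           # own GARCH effects

r = np.zeros((T, 2))
h = np.full((T, 2), 0.5)
for t in range(1, T):
    h[t] = (w + own_arch * r[t - 1] ** 2
            + cross_arch @ (r[t - 1] ** 2) + garch * h[t - 1])
    r[t] = np.sqrt(h[t]) * rng.normal(size=2)

print("correlation of the two conditional variance paths:",
      np.corrcoef(h[:, 0], h[:, 1])[0, 1])
```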
