• Title/Summary/Keyword: Value-at-Risk (VaR)

The GARCH-GPD in market risks modeling: An empirical exposition on KOSPI

  • Atsmegiorgis, Cheru;Kim, Jongtae;Yoon, Sanghoo
    • Journal of the Korean Data and Information Science Society / v.27 no.6 / pp.1661-1671 / 2016
  • Risk analysis is the systematic study of the uncertainties and risks we encounter in business, engineering, public policy, and many other areas. Value at Risk (VaR) is one of the most widely used risk measures in risk management. In this paper, Korean Composite Stock Price Index data are used to model VaR with classical ARMA(1,1)-GARCH(1,1) models under normal, t, generalized hyperbolic, and generalized Pareto distributed errors. The aim of this paper is to compare the performance of each model in estimating VaR. The performance of the models is compared in terms of the number of VaR violations and the Kupiec exceedance test. The GARCH-GPD model's unconditional likelihood ratio test statistic was found to have the smallest value among the models.
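The ARMA-GARCH VaR pipeline the abstract describes can be sketched in miniature. Below is a minimal pure-Python illustration of the GARCH(1,1) part: it runs the conditional-variance recursion over a return series and turns the one-step volatility forecast into a parametric VaR. The parameter values and the normal quantile are illustrative assumptions; the paper's GARCH-GPD variant would replace the normal quantile with one from a fitted generalized Pareto tail.

```python
import math

def garch_var(returns, omega, alpha, beta, z=1.645):
    """One-day 95% VaR from a GARCH(1,1) one-step volatility forecast.

    omega/alpha/beta are assumed pre-estimated (hypothetical values below);
    z is the normal 95% quantile, which the GARCH-GPD variant would replace
    with a GPD tail quantile.
    """
    # conditional-variance recursion: sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}
    sigma2 = sum(r * r for r in returns) / len(returns)  # start at the sample second moment
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return z * math.sqrt(sigma2)  # VaR reported as a positive loss quantile

daily_returns = [0.010, -0.020, 0.015, -0.005, 0.020, -0.030]
var95 = garch_var(daily_returns, omega=1e-5, alpha=0.10, beta=0.85)
```

In practice the parameters would come from maximum likelihood estimation on the full series rather than being supplied by hand.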

A study on synthetic risk management on market risk of financial assets (focus on VaR model) (시장위험에 대한 금융자산의 종합적 위험관리(VaR모형 중심))

  • 김종권
    • Journal of Korean Society of Industrial and Systems Engineering / v.22 no.49 / pp.43-57 / 1999
  • Risk management has grown increasingly important in recent years. Nevertheless, risk management in Korea remains underdeveloped: most banks rely on gap and duration analysis in ALM, and the development and operation of the VaR models stressed by the BIS are still at an elementary level. In the work of Fallon, Pritsker, and Marshall, the gamma model is superior to the delta model, Monte Carlo simulation results improve as the sample size increases, and nonparametric models outperform parametric models. For a Korean stock portfolio, the VaR from Monte Carlo simulation and the full variance-covariance model is smaller than that of the diagonal model, because the full variance-covariance model measures VaR more precisely. For interest rates, however, the Monte Carlo result is smaller than the delta-gamma result at the 95% confidence level, while the ordering reverses at 99%, so neither method dominates. This implies two things for forecasting the volatility of stock and interest rate portfolios. First, in both the delta-gamma method and Monte Carlo simulation, the distributional assumption affects the Value at Risk. Second, the Value at Risk depends on the test method, and if option prices are included, the two approaches will produce different results. Therefore, once interest rate futures and options markets open, Korea's findings should resemble those of other advanced countries, and every bank should try to develop its own internal model.
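The Monte Carlo approach compared in the abstract rests on the basic simulation estimate of VaR: draw P&L scenarios and read off an empirical quantile of the losses. A minimal sketch, assuming normal P&L purely for illustration (the abstract's point is that the estimate improves as the sample size grows):

```python
import random

def mc_var(mu, sigma, alpha=0.99, n=100_000, seed=42):
    """Monte Carlo VaR: simulate normally distributed P&L and take the
    empirical alpha-quantile of the simulated losses."""
    rng = random.Random(seed)
    losses = sorted(-(mu + sigma * rng.gauss(0.0, 1.0)) for _ in range(n))
    return losses[int(alpha * n)]

# for standard-normal P&L the 99% VaR should approach the quantile 2.326
var99 = mc_var(0.0, 1.0)
```

A delta-gamma variant would replace the simulated P&L with a second-order Taylor approximation of the portfolio value; the quantile step is unchanged.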

Value at Risk with Peaks over Threshold: Comparison Study of Parameter Estimation (Peaks over threshold를 이용한 Value at Risk: 모수추정 방법론의 비교)

  • Kang, Minjung;Kim, Jiyeon;Song, Jongwoo;Song, Seongjoo
    • The Korean Journal of Applied Statistics / v.26 no.3 / pp.483-494 / 2013
  • The importance of financial risk management has been highlighted by several recent global financial crises. One of the issues in financial risk management is how to measure risk; currently, the most widely used risk measure is the Value at Risk (VaR). We can consider estimating VaR using extreme value theory when financial data have heavy tails, as in the recent market trend. In this paper, we study the estimation of VaR using Peaks over Threshold (POT), a common method of modeling fat-tailed data with extreme value theory. To use POT, we first estimate the parameters of the Generalized Pareto Distribution (GPD). Here, we compare three different methods of estimating the GPD parameters by comparing the performance of the estimated VaR based on KOSPI 5-minute data. In addition, we simulate data from normal inverse Gaussian distributions and examine two GPD parameter estimation methods. We find that recent methods of GPD parameter estimation work better than maximum likelihood estimation when the kurtosis of the KOSPI return distribution is very high; the simulation experiment shows similar results.
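Once the GPD parameters are estimated over a threshold u, the standard POT quantile formula gives VaR in closed form, which is what the compared estimation methods ultimately feed into. A small sketch of that formula (the fitted values in the example are hypothetical, not from the paper):

```python
def pot_var(u, xi, beta, n, n_exceed, p=0.99):
    """VaR at level p from a GPD(xi, beta) fitted to exceedances over threshold u.

    Standard POT quantile (for xi != 0):
        VaR_p = u + (beta/xi) * (((n/n_exceed) * (1-p)) ** (-xi) - 1)
    where n is the sample size and n_exceed the number of exceedances.
    """
    return u + (beta / xi) * (((n / n_exceed) * (1.0 - p)) ** (-xi) - 1.0)

# hypothetical fit: threshold at a 2% daily loss, 50 of 1000 days exceed it
var99 = pot_var(u=0.02, xi=0.2, beta=0.01, n=1000, n_exceed=50, p=0.99)
```

Different estimators of (xi, beta) — maximum likelihood or the more recent alternatives the paper compares — change only the inputs to this formula, not the formula itself.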

Value at Risk of portfolios using copulas

  • Byun, Kiwoong;Song, Seongjoo
    • Communications for Statistical Applications and Methods / v.28 no.1 / pp.59-79 / 2021
  • Value at Risk (VaR) is one of the most common risk management tools in finance. Since a portfolio of several assets, rather than a single asset, is advantageous for risk diversification, VaR for a portfolio of two or more assets is often used. In such cases, multivariate distributions of asset returns are considered in calculating the VaR of the corresponding portfolio. Copulas are one way of generating a multivariate distribution by identifying the dependence structure of asset returns while allowing many different marginal distributions. However, they are used mainly for bivariate distributions and are not widely used in modeling joint distributions of many variables in finance. In this study, we examine the performance of various copulas for high-dimensional data and several different dependence structures. This paper compares copulas such as elliptical, vine, and hierarchical copulas in computing the VaR of portfolios, in order to find appropriate copula functions for various dependence structures among asset return distributions. In the simulation studies under various dependence structures and in real data analysis, the hierarchical Clayton copula shows the best performance in the VaR calculation using four assets. For the marginal distributions of single-asset returns, the normal inverse Gaussian distribution was used to model asset return distributions, which are generally high-peaked and heavy-tailed.
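The copula idea above — model the dependence structure separately, then apply marginals — can be illustrated with the simplest case: a bivariate Gaussian copula with normal marginals, which reduces to correlating the underlying normal factors. This is a deliberately simplified stand-in for the paper's hierarchical Clayton copula and NIG marginals; the weights, volatilities, and correlation below are hypothetical.

```python
import math
import random

def copula_portfolio_var(rho, sig1, sig2, w=(0.5, 0.5), alpha=0.99, n=50_000, seed=1):
    """99% portfolio VaR via a bivariate Gaussian copula with normal marginals."""
    rng = random.Random(seed)
    losses = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        # correlate the second factor via the Cholesky factor of [[1, rho], [rho, 1]]
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        portfolio_return = w[0] * sig1 * z1 + w[1] * sig2 * z2
        losses.append(-portfolio_return)
    losses.sort()
    return losses[int(alpha * n)]

v_indep = copula_portfolio_var(0.0, 1.0, 1.0)
v_dep = copula_portfolio_var(0.9, 1.0, 1.0)
```

Stronger dependence raises the portfolio VaR (v_dep > v_indep), which is exactly why the choice of copula matters for the tail.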

A rolling analysis on the prediction of value at risk with multivariate GARCH and copula

  • Bai, Yang;Dang, Yibo;Park, Cheolwoo;Lee, Taewook
    • Communications for Statistical Applications and Methods / v.25 no.6 / pp.605-618 / 2018
  • Risk management has been a crucial part of the daily operations of the financial industry over the past two decades. Value at Risk (VaR), a quantitative measure introduced by JP Morgan in 1995, is the most popular and simplest quantitative measure of risk. VaR has been widely applied to risk evaluation across all types of financial activities, including portfolio management and asset allocation. This paper uses implementations of multivariate GARCH models and copula methods to illustrate a one-day-ahead VaR prediction modeling process for high-dimensional portfolios. Many factors, such as the interaction among the included assets, enter the modeling process. Additionally, empirical data analyses and backtesting results are demonstrated through a rolling analysis, which helps capture the instability of parameter estimates. We find that our modeling approach is relatively robust and flexible.
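The rolling one-day-ahead backtest described above can be sketched as a loop that refits on a moving window, forecasts VaR, and counts violations. The sketch below uses a plain normal VaR in place of the paper's multivariate GARCH-copula model, purely to show the rolling mechanics; the window length and quantile are common conventions, not taken from the paper.

```python
def rolling_var_violations(returns, window=250, alpha=0.99, z=2.326):
    """Rolling one-day-ahead normal VaR backtest: refit mean/std on a moving
    window, forecast VaR, and count days whose loss exceeds the forecast."""
    violations = 0
    for t in range(window, len(returns)):
        hist = returns[t - window:t]
        mu = sum(hist) / window
        sd = (sum((r - mu) ** 2 for r in hist) / (window - 1)) ** 0.5
        var = z * sd - mu          # one-day-ahead 99% VaR of the loss -r
        if -returns[t] > var:      # realized loss exceeded the forecast
            violations += 1
    return violations
```

Re-estimating inside the loop is what exposes the parameter instability the rolling analysis is designed to capture.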

A numerical study on portfolio VaR forecasting based on conditional copula (조건부 코퓰라를 이용한 포트폴리오 위험 예측에 대한 실증 분석)

  • Kim, Eun-Young;Lee, Tae-Wook
    • Journal of the Korean Data and Information Science Society / v.22 no.6 / pp.1065-1074 / 2011
  • Over the past several decades, many researchers in finance have studied Value at Risk (VaR) as a measure of market risk. VaR indicates the worst loss over a target horizon such that there is a low, pre-specified probability that the actual loss will be larger (Jorion, 2006, p. 106). In this paper, we compare the conditional copula method with two conventional VaR forecasting methods, based on a simple moving average and an exponentially weighted moving average, for measuring the risk of a portfolio consisting of two domestic stock indices. Through real data analysis, we conclude that the conditional copula method can improve the accuracy of portfolio VaR forecasting in the presence of high kurtosis and strong correlation in the data.
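Of the two conventional benchmarks the abstract mentions, the exponentially weighted moving average is the standard RiskMetrics-style recursion. A minimal sketch (lambda = 0.94 is the usual RiskMetrics default, assumed here rather than taken from the paper):

```python
def ewma_vol(returns, lam=0.94):
    """RiskMetrics-style EWMA volatility:
    sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2."""
    sigma2 = returns[0] ** 2  # seed the recursion with the first squared return
    for r in returns[1:]:
        sigma2 = lam * sigma2 + (1.0 - lam) * r * r
    return sigma2 ** 0.5

# a parametric 95% one-day VaR would then be roughly 1.645 * ewma_vol(returns)
```

The simple-moving-average benchmark replaces the recursion with an equally weighted standard deviation over a fixed window; the conditional copula method replaces this univariate forecast with a joint model of the two indices.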

Estimation and Performance Analysis of Risk Measures using Copula and Extreme Value Theory (코퓰러과 극단치이론을 이용한 위험척도의 추정 및 성과분석)

  • Yeo, Sung-Chil
    • The Korean Journal of Applied Statistics / v.19 no.3 / pp.481-504 / 2006
  • VaR, a tail-related risk measure, is now widely used as a tool for the measurement and management of financial risks. For more accurate measurement of VaR, the approach based on extreme value theory has recently drawn more attention than the traditional method based on the assumption of a normal distribution. However, many studies of approaches using extreme value theory have been done only for the univariate case. In this paper, we discuss portfolio risk measurement by modelling multivariate extreme value distributions that combine copulas and extreme value theory. We also discuss the estimation of ES together with VaR as portfolio risk measures. Finally, we investigate the relative superiority of the EVT-copula approach over the variance-covariance method through backtesting on empirical data.
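ES, discussed alongside VaR above, is the average loss beyond the VaR threshold, and both have simple empirical estimators. A minimal sketch on a generic loss sample (the confidence level is an illustrative default):

```python
def hist_var_es(losses, alpha=0.95):
    """Empirical VaR and ES from a sample of losses:
    VaR is the empirical alpha-quantile; ES averages the losses at or
    beyond that quantile."""
    s = sorted(losses)
    k = int(alpha * len(s))
    var = s[k]
    es = sum(s[k:]) / len(s[k:])
    return var, es
```

Because ES averages the whole tail rather than reading a single quantile, it is the measure that most benefits from the EVT-based tail modelling the paper advocates.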

Performance Analysis of Volatility Models for Estimating Portfolio Value at Risk (포트폴리오 VaR 측정을 위한 변동성 모형의 성과분석)

  • Yeo, Sung Chil;Li, Zhaojing
    • The Korean Journal of Applied Statistics / v.28 no.3 / pp.541-559 / 2015
  • VaR is now widely used as an important tool to evaluate and manage financial risks. In particular, it is important to select an appropriate volatility model for the rate of return of financial assets. In this study, both univariate and multivariate models are considered for evaluating the VaR of a portfolio composed of the KOSPI, Hang Seng, and Nikkei indexes, and their performance is compared through backtesting techniques. Overall, multivariate models are shown to be more appropriate than univariate models for estimating the portfolio VaR; in particular, the DCC and ADCC models are shown to be superior to the others.
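The backtesting referred to above typically checks whether the observed number of VaR violations matches the nominal level; the Kupiec unconditional-coverage test, also used elsewhere in these results, is the standard statistic for that check. A minimal sketch:

```python
import math

def kupiec_lr(n, x, p):
    """Kupiec unconditional-coverage LR statistic for x VaR violations in n
    days at nominal violation rate p; ~ chi-square(1) under correct coverage.
    (Assumes 0 < x < n so both log-likelihood terms are finite.)"""
    phat = x / n
    def loglik(q):
        return (n - x) * math.log(1.0 - q) + x * math.log(q)
    return -2.0 * (loglik(p) - loglik(phat))
```

A statistic above the chi-square(1) critical value (about 3.84 at the 5% level) rejects the model's coverage; for example, 5 violations in 250 days against a 5% target is a clear rejection.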

Performance of VaR Estimation Using Point Process Approach (점과정 기법을 이용한 VaR추정의 성과)

  • Yeo, Sung-Chil;Moon, Seoung-Joo
    • The Korean Journal of Applied Statistics / v.23 no.3 / pp.471-485 / 2010
  • VaR is used extensively as a tool for risk management by financial institutions. For convenience, the normal distribution is usually assumed in measuring VaR, but methods using extreme value theory have recently attracted attention for more accurate VaR estimation. So far, the GEV and GPD models have been used as the probability models of EVT for VaR estimation. In this paper, the PP (point process) model is suggested for improved VaR estimation compared to traditional EV models such as the GEV and GPD models. From the viewpoint of stochastic processes, the PP model can be regarded as a generalized model that includes the GEV and GPD models. In the empirical analysis, the PP model is shown to be superior to the GEV and GPD models in VaR estimation performance.

A Study on VaR Stability for Operational Risk Management (운영리스크 VaR 추정값의 안정성검증 방법 연구)

  • Kim, Hyun-Joong;Kim, Woo-Hwan;Lee, Sang-Cheol;Im, Jong-Ho;Cho, Sang-Hee;Kim, Ah-Hyoun
    • Communications for Statistical Applications and Methods / v.15 no.5 / pp.697-708 / 2008
  • Operational risk is defined as the risk of loss resulting from inadequate or failed internal processes, people, and systems, or from external events. The advanced measurement approach proposed by the Basel Committee uses the loss distribution approach (LDA), which quantifies operational loss based on a bank's own historical data and measurement system. LDA involves two distribution fittings (frequency and severity) and then generates the aggregate loss distribution by mathematical convolution. An objective validation of operational risk measurement is essential because the measurement allows flexibility and subjective judgement in calculating regulatory capital. However, the methodology to verify the soundness of operational risk measurement has not been fully developed, because internal operational loss data have been extremely sparse and modeling the extreme tail is very difficult. In this paper, we propose a methodology for the validation of operational risk measurement based on bootstrap confidence intervals of operational VaR (value at risk). We derive two methods to generate confidence intervals of operational VaR.
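The bootstrap confidence intervals the paper proposes can be illustrated at their simplest: resample the loss data with replacement, recompute the empirical VaR on each resample, and take percentiles of the resampled statistics. This percentile-bootstrap sketch is a generic illustration, not either of the paper's two derived methods, and the levels shown are hypothetical defaults.

```python
import random

def bootstrap_var_ci(losses, alpha=0.95, n_boot=1000, ci=0.90, seed=0):
    """Percentile-bootstrap confidence interval for the empirical VaR of a
    loss sample: resample with replacement, recompute VaR, take percentiles
    of the bootstrap distribution."""
    rng = random.Random(seed)
    n = len(losses)
    stats = sorted(
        sorted(rng.choice(losses) for _ in range(n))[int(alpha * n)]
        for _ in range(n_boot)
    )
    lo = stats[int((1.0 - ci) / 2.0 * n_boot)]
    hi = stats[min(n_boot - 1, int((1.0 + ci) / 2.0 * n_boot))]
    return lo, hi
```

A wide interval signals an unstable VaR estimate, which is exactly the stability check the paper's validation methodology formalizes for sparse operational loss data.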