• Title/Summary/Keyword: Real variance estimation


Validity assessment of VaR with Laplacian distribution (라플라스 분포 기반의 VaR 측정 방법의 적정성 평가)

  • Byun, Bu-Guen; Yoo, Do-Sik; Lim, Jongtae
    • Journal of the Korean Data and Information Science Society / v.24 no.6 / pp.1263-1274 / 2013
  • VaR (value at risk), which represents the expected worst loss that may occur over a period of time within a given confidence level, is currently used by various financial institutions for risk management. In the majority of previous studies, the distribution of returns has been modeled with the normal distribution. Recently, Chen et al. (2010) measured VaR with the asymmetric Laplacian distribution. However, it is difficult to estimate the mode, skewness, and degree of variance that determine the shape of an asymmetric Laplacian distribution from the limited data available in real-world markets. In this paper, we show that VaR estimated with a (symmetric) Laplacian distribution model is more accurate than VaR estimated with the normal or asymmetric Laplacian distribution models, using real-world stock market data and various statistical measures.
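The symmetric-Laplace VaR idea above can be sketched numerically. The snippet below is not the authors' code; it is a minimal parametric-VaR comparison assuming a median/MAD fit for the Laplace model and a moment fit for the normal model:

```python
import numpy as np
from scipy import stats

def parametric_var(returns, alpha=0.01, dist="laplace"):
    """One-day parametric VaR: the loss exceeded with probability alpha."""
    if dist == "laplace":
        # Laplace MLE: location = sample median, scale = mean absolute deviation
        loc = np.median(returns)
        scale = np.mean(np.abs(returns - loc))
        q = stats.laplace.ppf(alpha, loc=loc, scale=scale)
    else:
        q = stats.norm.ppf(alpha, loc=np.mean(returns), scale=np.std(returns))
    return -q  # report VaR as a positive loss

# Heavy-tailed simulated returns: the Laplace fit typically yields a
# deeper (larger) 1% VaR than the normal fit on the same data.
rng = np.random.default_rng(0)
r = rng.laplace(0.0, 0.01, size=5000)
var_lap = parametric_var(r, 0.01, "laplace")
var_norm = parametric_var(r, 0.01, "normal")
```

Because the Laplace density has fatter tails than the normal at the same variance, the two fits disagree most in exactly the tail region that VaR measures, which is the effect the paper evaluates on real market data.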

Activity-based Approaches for Travel Demand Modeling: Reviews on Developments and Implementations (교통수요 예측을 위한 활동기반 접근 방법: 경향과 적용현황 고찰)

  • Lim, Kwang-Kyun; Kim, Sigon; Chung, SungBong
    • KSCE Journal of Civil and Environmental Engineering Research / v.33 no.2 / pp.719-727 / 2013
  • Four-step travel demand modeling based on individual trips has been widely used for many decades. However, there has been wide variance between forecasted and real travel demand, which makes the model implications less reliable. A primary reason is that a person's real travel behavior is not properly captured during model development. The activity-based modeling (ABM) approach was proposed and developed in the U.S. in the 1990s to better capture the accuracy and reality of personal travel behavior, and it stands as a good alternative to the existing trip-based approach. This paper contributes to the understanding of how ABM approaches differ from the trip-based approach in terms of estimation units, estimation process, and their pros and cons. We examined three activity-based travel demand model systems (DaySim, CT-Ramp, and CEMDAP) that are most commonly applied by metropolitan planning organizations (MPOs). We found that the ABM approach can effectively explain multi-dimensional travel decision-making and can be expected to increase predictive accuracy. Overall, the ABM approach can be a good substitute for existing travel demand methods whose forecasts have proven unreliable.

Generation of radar rainfall data for hydrological and meteorological application (II) : radar rainfall ensemble (수문기상학적 활용을 위한 레이더 강우자료 생산(II) : 레이더 강우앙상블)

  • Kim, Tae-Jeong; Lee, Dong-Ryul; Jang, Sang-Min; Kwon, Hyun-Han
    • Journal of Korea Water Resources Association / v.50 no.1 / pp.17-28 / 2017
  • A recent increase in extreme weather events and flash floods associated with enhanced climate variability has led to an increase in climate-related disasters. For these reasons, various studies based on high-resolution weather radar systems have been carried out. Weather radar can provide estimates of precipitation in real time over a wide area, whereas ground-based rain gauges provide only point estimates in space. Weather radar is thus capable of identifying changes in rainfall structure as a storm moves through an ungauged basin. However, the usefulness of weather radar rainfall estimates has been limited by a variety of sources of uncertainty in the radar reflectivity process, including systematic and random errors. In this study, we developed an ensemble radar rainfall estimation scheme using a multivariate copula method. The results presented in this study confirm that the proposed ensemble technique can effectively reproduce rainfall statistics such as the mean, variance, and skewness (and, more importantly, the extremes) as well as the spatio-temporal structure of rainfall fields.
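The ensemble idea above, generating many equally plausible rainfall fields that share the observed marginals and inter-site dependence, can be sketched with a generic Gaussian copula. This is an illustration only; the paper's particular multivariate copula scheme and radar processing are not reproduced here:

```python
import numpy as np
from scipy import stats

def gaussian_copula_ensemble(field, n_members, seed=0):
    """Generate ensemble rainfall fields (rows = time, cols = sites) that
    preserve each site's marginal distribution and the inter-site
    rank-correlation structure, via a Gaussian copula."""
    rng = np.random.default_rng(seed)
    n, d = field.shape
    # 1) map each site's record to normal scores through its ranks
    u = (stats.rankdata(field, axis=0) - 0.5) / n
    z = stats.norm.ppf(u)
    corr = np.corrcoef(z, rowvar=False)   # dependence in normal-score space
    # 2) draw correlated normals, map back through the empirical marginals
    members = []
    for _ in range(n_members):
        g = rng.multivariate_normal(np.zeros(d), corr, size=n)
        p = stats.norm.cdf(g)
        members.append(np.column_stack(
            [np.quantile(field[:, j], p[:, j]) for j in range(d)]))
    return np.stack(members)

rng = np.random.default_rng(2)
field = rng.gamma(2.0, 1.0, size=(300, 3))   # skewed toy "rainfall" at 3 sites
ens = gaussian_copula_ensemble(field, n_members=20)
```

Because each member is drawn through the empirical quantiles, skewed statistics such as the mean, variance, and extremes of the observed field carry over to the ensemble, which is the reproduction property the abstract emphasizes.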

Detection Method for Bean Cotyledon Locations under Vinyl Mulch Using Multiple Infrared Sensors

  • Lee, Kyou-Seung; Cho, Yong-jin; Lee, Dong-Hoon
    • Journal of Biosystems Engineering / v.41 no.3 / pp.263-272 / 2016
  • Purpose: Damage to pulse crops from wild birds is a serious problem, to the extent that the rate of damage between seeding and the cotyledon stage reaches 45.4% on average. This study investigated a method of fundamentally blocking birds from eating crops by mulching with vinyl after seeding and identifying the growing locations of the beans so that punching can be performed. Methods: Infrared (IR) sensors that can measure temperature without contact were used to recognize the locations of soybean cotyledons below the vinyl mulch. To expand the measurable range, 10 IR sensors were arranged in a linear array, and a sliding mechanical device was used to reconstruct two-dimensional spatial variance information about the targets. Spatial interpolation was applied to the two-dimensional temperature distribution measured in real time to improve the resolution of the bean coleoptile locations. The temperature distributions above the vinyl mulch were analyzed for five soybean varieties over a period of six days from the appearance of the cotyledon stage. Results: During the experimental period, cases where bean cotyledons did and did not come into contact with the underside of the vinyl mulch were both observed, depending on the degree of cotyledon growth. Although cotyledon locations could be estimated through temperature distribution analysis when they contacted the underside of the mulch, the estimates showed somewhat large errors depending on the time elapsed since the cotyledon stage. The detection results were similar for similar types of crops, so this method could be applied to crops with similar growth patterns. According to the results of the 360 experiments conducted (five bean varieties × six days × four speed levels × three repetitions), location detection had an accuracy of 36.9%, with location errors in the range 0-4.9 cm (RMSE = 3.1 cm). During the period 3-5 days after the cotyledon stage, location detection had an accuracy of 59% (RMSE = 3.9 cm). Conclusions: To fundamentally solve the problem of bird damage to beans in the early stage after seeding, a working method was proposed in which punching is carried out after seeding, breaking away from the existing method in which seeding is carried out after punching. Methods for accurately detecting soybean growing locations were studied so that punching could promote the continuous growth of soybeans that had reached the cotyledon stage. Through experiments using multiple IR sensors and a sliding mechanical device, it was found that crop locations could be partially identified 3-5 days after the cotyledon stage regardless of the kind of pulse crop. Additional studies of detection methods robust to environmental and crop-growth factors are necessary.
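As an illustration of the spatial-interpolation step described above, the sketch below upsamples a coarse IR temperature grid with a bivariate spline and picks the warmest point as the presumed cotyledon location. The grid size, temperature values, and spline choice are assumptions for illustration, not the authors' implementation:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

def locate_hotspot(temps, upsample=10):
    """Upsample a coarse IR temperature grid with a bivariate spline and
    return the (row, col) position of the warmest point, in grid units."""
    r = np.arange(temps.shape[0], dtype=float)
    c = np.arange(temps.shape[1], dtype=float)
    spline = RectBivariateSpline(r, c, temps)
    rf = np.linspace(r[0], r[-1], temps.shape[0] * upsample)
    cf = np.linspace(c[0], c[-1], temps.shape[1] * upsample)
    fine = spline(rf, cf)   # interpolated grid, upsample^2 more points
    i, j = np.unravel_index(np.argmax(fine), fine.shape)
    return rf[i], cf[j]

# Synthetic 10x10 grid with one warm spot (a cotyledon touching the mulch)
R, C = np.meshgrid(np.arange(10), np.arange(10), indexing="ij")
temps = 20.0 + 5.0 * np.exp(-((R - 6.2) ** 2 + (C - 3.7) ** 2) / 4.0)
row, col = locate_hotspot(temps)
```

Interpolation lets the estimated hotspot fall between sensor positions, which is how a sparse linear sensor array can localize a target more finely than its raw spacing.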

Linear programming models using a Dantzig type risk for portfolio optimization (Dantzig 위험을 사용한 포트폴리오 최적화 선형계획법 모형)

  • Ahn, Dayoung; Park, Seyoung
    • The Korean Journal of Applied Statistics / v.35 no.2 / pp.229-250 / 2022
  • Since the publication of Markowitz's (1952) mean-variance portfolio model, research on portfolio optimization has been conducted in many fields. The existing mean-variance portfolio model forms a nonlinear convex problem; applying Dantzig's linear programming method converts it to a linear form, which can effectively reduce computation time. In this paper, we proposed a Dantzig perturbation portfolio model that can reduce management and transaction costs by constructing a portfolio from a stable and small (sparse) set of assets. The average return and risk were adjusted to the investor's purpose by applying a perturbation method in which a certain fraction is invested in the existing benchmark and the rest in the assets selected by the portfolio optimization model. For covariance estimation, we proposed a Gaussian kernel weight covariance that reflects time-series characteristics through time-dependent weights. The performance of the proposed model was evaluated against the benchmark portfolio on five real data sets. Empirical results show that the proposed portfolios provide higher expected returns or lower risks than the benchmark, and that sparse and stable asset selection was obtained.
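The "Gaussian kernel weight covariance" can be illustrated as a time-weighted sample covariance in which recent observations receive larger weights. The exact weighting and bandwidth choice in the paper may differ, so treat this as an assumption-level sketch:

```python
import numpy as np

def gaussian_kernel_cov(returns, bandwidth):
    """Time-weighted covariance: observation t gets weight
    exp(-0.5*((T-t)/bandwidth)^2), so recent data dominate the estimate."""
    n = returns.shape[0]
    t = np.arange(n)
    w = np.exp(-0.5 * ((n - 1 - t) / bandwidth) ** 2)
    w /= w.sum()
    mu = w @ returns                              # weighted mean return
    centered = returns - mu
    return (centered * w[:, None]).T @ centered   # weighted covariance matrix

rng = np.random.default_rng(1)
X = rng.normal(0.0, 0.01, size=(250, 4))  # ~1 trading year, 4 assets
S = gaussian_kernel_cov(X, bandwidth=60.0)
```

As the bandwidth grows, the weights flatten and the estimator approaches the ordinary sample covariance; a small bandwidth tracks recent market regimes at the cost of higher estimation variance.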

Towards high-accuracy data modelling, uncertainty quantification and correlation analysis for SHM measurements during typhoon events using an improved most likely heteroscedastic Gaussian process

  • Qi-Ang Wang; Hao-Bo Wang; Zhan-Guo Ma; Yi-Qing Ni; Zhi-Jun Liu; Jian Jiang; Rui Sun; Hao-Wei Zhu
    • Smart Structures and Systems / v.32 no.4 / pp.267-279 / 2023
  • Data modelling and interpretation of structural health monitoring (SHM) field data are critical for evaluating structural performance and quantifying the vulnerability of infrastructure systems. To improve data modelling accuracy and extend the application range from data regression to out-of-sample forecasting, an improved most likely heteroscedastic Gaussian process (iMLHGP) methodology is proposed in this study by incorporating an out-of-sample forecasting algorithm. The proposed iMLHGP method overcomes the constant-variance limitation of the standard Gaussian process (GP) and can be used to estimate non-stationary typhoon-induced response statistics with high volatility. A first attempt at performing data regression and forecasting analysis on structural responses using the proposed iMLHGP method is presented by applying it to real-world field SHM data from an instrumented cable-stayed bridge during typhoon events. Uncertainty quantification and correlation analysis were also carried out to investigate the influence of typhoons on bridge strain data. Results show that the iMLHGP method has high accuracy in both regression and out-of-sample forecasting. The iMLHGP framework accounts for data heteroscedasticity while replacing the full analytical treatment of the noise variance with a point estimate at its most likely value, avoiding intensive computational effort. According to the uncertainty quantification and correlation analysis results, the uncertainty of the strain measurements is affected by both traffic and wind speed. The overall change in bridge strain is driven by temperature, and the local fluctuation is strongly affected by wind speed under typhoon conditions.

Reliability Analysis of Plane Stress Element According to Limit State Equations (한계상태방정식에 따른 평면응력요소의 신뢰성해석)

  • Park, Seok Jae; Choi, Wae Ho; Kim, Yo Suk; Shin, Yeong-Soo
    • Journal of Korean Society of Steel Construction / v.13 no.5 / pp.567-575 / 2001
  • The conventional approach to structural safety, based on safety factors drawn from past experience, cannot properly account for the statistical properties of the random variables used in structural analysis. Real structures can only be analyzed with errors in the estimation of loads, material properties, and member dimensions, and these errors should be treated systematically; the safety of a structure cannot be precisely appraised under the traditional design concept. Recently, a new approach based on probability theory has been applied to the assessment of structural safety using the reliability concept. Accordingly, a computer program based on the probabilistic finite element method (PFEM) was developed by incorporating probabilistic concepts into the conventional FEM. This paper estimates the reliability of a plane stress structure by the advanced first-order second-moment (AFOSM) method using the von Mises, Tresca, and Mohr-Coulomb failure criteria. The reliability index and failure probability obtained by Monte Carlo simulation with the von Mises criterion matched those of the PFEM, but the simulation was very time-consuming. The parameter analysis showed that, among the design variables, the variance of member thickness and load influences the reliability and failure probability most sensitively. A failure criterion appropriate to the material characteristics must be used for safe design.
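A crude Monte Carlo check of the kind compared against the PFEM can be sketched as follows. The distributions and their moments below are illustrative assumptions, not the paper's data:

```python
import numpy as np

def von_mises_plane_stress(sx, sy, txy):
    """Equivalent von Mises stress for a plane stress state (MPa)."""
    return np.sqrt(sx**2 - sx * sy + sy**2 + 3.0 * txy**2)

def mc_failure_probability(n=200_000, seed=0):
    """Crude Monte Carlo estimate of P[g < 0] for the limit state
    g = yield strength - von Mises stress, with normal random variables."""
    rng = np.random.default_rng(seed)
    sx = rng.normal(180.0, 30.0, n)    # in-plane normal stress, MPa
    sy = rng.normal(90.0, 15.0, n)
    txy = rng.normal(40.0, 10.0, n)    # shear stress, MPa
    fy = rng.normal(250.0, 12.0, n)    # yield strength, MPa
    g = fy - von_mises_plane_stress(sx, sy, txy)
    return np.mean(g < 0.0)

pf = mc_failure_probability()
```

The sample counts needed to resolve small failure probabilities illustrate why the paper found plain Monte Carlo simulation so time-consuming compared with the AFOSM-based PFEM.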


Development of the RP and SP Combined using Error Component Method (Error Component 방법을 이용한 RP.SP 결합모형 개발)

  • 김강수; 조혜진
    • Journal of Korean Society of Transportation / v.21 no.2 / pp.119-130 / 2003
  • SP (stated preference) data have been widely used in assessing new transport policies and transport-related plans. However, one criticism of using SP data is that respondents may react differently in hypothetical experiments than in real life. To overcome this problem, combining SP and RP (revealed preference) data has been suggested, and combined methods are still being developed. The purpose of this paper is to propose and verify a new SP-RP combination method using the error component approach. The error component method decomposes the IID extreme value error into non-IID error component(s) and an IID error component, and estimates both the component parameters and the utility parameters in order to obtain the relative variance of the SP and RP data. Artificial SP and RP data were created by simulation and used for the analysis, and the estimation results of the error component method were compared with those of existing SP-RP combination methods. The results show that, regardless of data size, the parameters of the error component models are much closer to the assumed parameters than those of the existing combined models, indicating the usefulness of the method. The values of time from the error component method are also closer to the assumed values than those of the existing combined models. We can therefore conclude that the error component method is useful in combining SP and RP data and is more efficient than the existing methods.

A comparison of imputation methods using nonlinear models (비선형 모델을 이용한 결측 대체 방법 비교)

  • Kim, Hyein; Song, Juwon
    • The Korean Journal of Applied Statistics / v.32 no.4 / pp.543-559 / 2019
  • Data often include missing values for various reasons. If the missing data mechanism is not MCAR, analysis based only on fully observed cases may cause estimation bias and decrease the precision of the estimates, since partially observed cases are excluded. When data include many variables, missing values cause even more serious problems. Many imputation techniques have been suggested to overcome this difficulty. However, imputation methods using parametric models may not fit real data well when the model assumptions are not satisfied. In this study, we review imputation methods using nonlinear models such as kernel, resampling, and spline methods, which are robust to violations of model assumptions. In addition, we suggest utilizing imputation classes to improve imputation accuracy, and adding random errors to correctly estimate the variance of the estimates in nonlinear imputation models. The performance of imputation methods using nonlinear models is compared under various simulated data settings. The simulation results indicate that performance varies as the data settings change, but imputation based on kernel regression or the penalized spline performs better in most situations. Utilizing imputation classes or adding random errors improves the performance of imputation methods using nonlinear models.
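The kernel-regression imputation with added random errors described above can be sketched as a minimal Nadaraya-Watson version with residual resampling. The bandwidth and toy data are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def kernel_impute(x, y, missing, bandwidth=0.05, add_noise=True, seed=0):
    """Impute missing y via Nadaraya-Watson kernel regression on x, optionally
    adding a resampled residual so the variance of the completed data is not
    understated."""
    rng_local = np.random.default_rng(seed)
    obs = ~missing
    xo, yo = x[obs], y[obs]

    def nw(x0):
        # Gaussian-kernel weighted average of the observed responses
        w = np.exp(-0.5 * ((x0 - xo) / bandwidth) ** 2)
        return np.dot(w, yo) / w.sum()

    resid = yo - np.array([nw(v) for v in xo])  # donor pool of residuals
    y_imp = y.copy()
    for i in np.where(missing)[0]:
        y_imp[i] = nw(x[i]) + (rng_local.choice(resid) if add_noise else 0.0)
    return y_imp

# Nonlinear toy data with ~20% of responses missing
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 300)
y_true = np.sin(2.0 * np.pi * x)
miss = rng.uniform(size=300) < 0.2
y_obs = np.where(miss, np.nan, y_true + rng.normal(0.0, 0.1, 300))
y_imp = kernel_impute(x, y_obs, miss)
```

Imputing the conditional mean alone would understate the spread of the completed data; resampling an observed residual for each imputed value is one simple way to restore it, in the spirit of the random-error addition the abstract recommends.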

Estimation of drift force by real ship using multiple regression analysis (다중회귀분석에 의한 실선의 표류력 추정)

  • AHN, Jang-Young; KIM, Kwang-il; KIM, Min-Son; LEE, Chang-Heon
    • Journal of the Korean Society of Fisheries and Ocean Technology / v.57 no.3 / pp.236-245 / 2021
  • In this study, a drifting test using a experimental vessel (2,966 tons) in the northern waters of Jeju was carried out for the first time in order to obtain the fundamental data for drift. During the test, it was shown that the average leeway speed and direction by GPS position were 0.362 m/s and 155.54° respectively and the leeway rate for wind speed was 8.80%. The analysis of linear regression modes about leeway speed and direction of the experimental vessel indicated that wind or current (i.e. explanatory variable) had a greater influence upon response variable (e.g. leeway speed or direction) with the speed of the wind and current rather than their directions. On the other hand, the result of multiple regression model analysis was able to predict that the direction was negative, and it was demonstrated that predicted values of leeway speed and direction using an experimental vessel is to be more influential by current than wind while the leeway speed through variance and covariance was positive. In terms of the leeway direction of the experimental vessel, the same result of the leeway speed appeared except for a possibility of the existence of multi-collinearity. Then, it can be interpreted that the explanatory variables were less descriptive in the predicted values of the leeway direction. As a result, the prediction of leeway speed and direction can be demonstrated as following equations. Ŷ1= 0.4031-0.0032X1+0.0631X2-0.0010X3+0.4110X4 Ŷ2= 0.4031-0.6662X1+27.1955X2-0.6787X3-420.4833X4 However, many drift tests using actual vessels and various drifting objects will provide reasonable estimations, so that they can help search and rescue fishing gears as well.