• Title/Summary/Keyword: Return Decomposition


A return mapping algorithm for plane stress and degenerated shell plasticity

  • Liu, Z.; Al-Bermani, F.G.A.
    • Structural Engineering and Mechanics, v.3 no.2, pp.185-192, 1995
  • A numerical algorithm for plane stress and shell elasto-plasticity is presented in this paper. The proposed strain decomposition (SD) algorithm is an elastic predictor/plastic corrector algorithm and, in the context of operator splitting, is a return mapping algorithm. However, it differs significantly from other return mapping algorithms in that only the necessary response functions are used without invoking their gradients, and the stress increment is updated only at the end of the time step. This makes the proposed SD algorithm more suitable for materials with complex yield surfaces and guards against error accumulation during the time step. Comparative analyses of structural systems using the proposed SD algorithm and the iterative radial return (IRR) algorithm are presented. The results demonstrate the accuracy and usefulness of the proposed algorithm.
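
A minimal sketch of the general elastic predictor/plastic corrector (return mapping) structure that the abstract refers to, written for 1D J2 plasticity with linear isotropic hardening. This is not the paper's strain decomposition (SD) algorithm; the material parameters and the closed-form corrector are illustrative assumptions.

```python
# Generic elastic predictor / plastic corrector step (return mapping) for
# 1D J2 plasticity with linear isotropic hardening. Illustrative only; it is
# NOT the paper's SD algorithm. E, H, sigma_y are assumed material parameters.

def return_mapping_1d(strain_inc, sigma_n, eps_p_n, alpha_n,
                      E=200e3, H=10e3, sigma_y=250.0):
    """One load step: trial elastic stress, then radial return if yielding."""
    # Elastic predictor: freeze plastic flow and update stress elastically.
    sigma_trial = sigma_n + E * strain_inc
    f_trial = abs(sigma_trial) - (sigma_y + H * alpha_n)   # yield function

    if f_trial <= 0.0:
        return sigma_trial, eps_p_n, alpha_n               # step stays elastic

    # Plastic corrector: closed-form consistency for linear hardening.
    dgamma = f_trial / (E + H)
    sign = 1.0 if sigma_trial > 0 else -1.0
    sigma = sigma_trial - E * dgamma * sign
    eps_p = eps_p_n + dgamma * sign
    alpha = alpha_n + dgamma
    return sigma, eps_p, alpha


# Usage: drive the model with a monotonic strain history.
sigma, eps_p, alpha = 0.0, 0.0, 0.0
for _ in range(10):
    sigma, eps_p, alpha = return_mapping_1d(5e-4, sigma, eps_p, alpha)
print(sigma, eps_p, alpha)
```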

Financial Flexibility on Required Returns: Vector Autoregression Return Decomposition Approach

  • YIM, Sang-Giun
    • The Journal of Industrial Distribution & Business, v.11 no.5, pp.7-16, 2020
  • Purpose: Prior studies empirically examine how financial flexibility is related to required returns by using realized returns and treating cash holdings as negative debt, but they fail to find consistent results. Conjecturing that an inappropriate proxy for required returns and the aggregation of cash and debt caused the inconsistent results, this study revisits the topic using a refined proxy of required returns and separating cash holdings from debt. Research design, data and methodology: This study uses a multivariate regression model to investigate the relationship of required returns with cash holdings and financial leverage. The required returns are estimated with the return decomposition method based on a vector autoregression model. Empirical tests use US stock market data from 1968 to 2011. Results: Empirical results reveal that both cash holdings and leverage are positively related to required returns. The positive relation is stronger in economic downturns than in economic upturns. Conclusions: Three major findings are drawn. First, risky firms prefer large cash balances. Second, information shocks in realized returns caused the failure of prior studies to find a consistent positive relationship between leverage and realized returns. Third, cash and leverage are related to required returns in the same direction; therefore, cash cannot be treated as negative debt.
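
A hedged sketch of a Campbell (1991)-style VAR return decomposition, which is one common way to construct the expected-return and news terms this line of research relies on. The state variables, the log-linearization constant, and the synthetic data are illustrative assumptions, not the paper's exact specification.

```python
# VAR(1) return decomposition sketch: estimate the companion matrix, then
# split unexpected returns into discount-rate news and cash-flow news.
import numpy as np

rng = np.random.default_rng(0)
T = 600
# Assumed state vector z_t = [excess return, dividend yield]; synthetic data.
z = np.zeros((T, 2))
for t in range(1, T):
    z[t] = np.array([0.02, 0.90]) * z[t - 1] + rng.normal(0, [0.04, 0.01])

# OLS estimate of the VAR(1) matrix A and residuals u_t.
X, Y = z[:-1], z[1:]
A = np.linalg.lstsq(X, Y, rcond=None)[0].T
U = Y - X @ A.T

rho = 0.96                                  # assumed log-linearization constant
e1 = np.array([1.0, 0.0])                   # selects the return equation
lam = e1 @ (rho * A @ np.linalg.inv(np.eye(2) - rho * A))

dr_news = U @ lam                           # discount-rate (expected-return) news
cf_news = U @ e1 + dr_news                  # cash-flow news from the return identity
expected_ret = X @ (A.T @ e1)               # conditional expected-return proxy
print(expected_ret.mean(), dr_news.std(), cf_news.std())
```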

East Asian five stock market linkages (A Study on the Comovement of Asian Stock Returns)

  • Jung, Heon-Yong
    • Management & Information Systems Review, v.27, pp.131-147, 2008
  • This study examines the common components in five Asian stock markets from 1991 to 2007, using the daily stock market indices of Korea, Malaysia, Thailand, Indonesia, and the Philippines. Using a vector autoregressive model, the paper analyzes causal relations and dynamic interactions among the five Asian stock markets. The findings indicate that the level of stock return linkage among the five markets is low. First, pair-wise Granger causality tests reveal Granger-causal relationships between Korea and Indonesia and between Malaysia and Indonesia. Second, the impulse response functions and variance decompositions show that shocks to the Korean stock market have only weak effects on the Malaysian, Indonesian, Thai, and Philippine stock market returns. The results indicate increased Asian stock market linkages, but the level remains very low, implying that the benefits of diversification within the five Asian stock markets still exist.
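
An illustrative sketch of the kind of VAR-based analysis described above: pairwise Granger causality tests and forecast-error variance decomposition on daily index returns. The column names, the two-lag choice, and the synthetic returns are assumptions; real index data would replace them.

```python
# Pairwise Granger causality and variance decomposition with statsmodels' VAR.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(1)
markets = ["KR", "MY", "TH", "ID", "PH"]
returns = pd.DataFrame(rng.normal(0, 0.01, size=(1000, 5)), columns=markets)

model = VAR(returns)
res = model.fit(2)                      # VAR with 2 lags (assumed order)

# Pairwise Granger causality: does KR help predict ID?
gc = res.test_causality(caused="ID", causing="KR", kind="f")
print("KR -> ID Granger causality p-value:", gc.pvalue)

# Forecast-error variance decomposition over a 10-day horizon: share of each
# market's forecast-error variance explained by shocks from the other markets.
res.fevd(10).summary()
```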


Waveform Decomposition of Airborne Bathymetric LiDAR by Estimating Potential Peaks (잠재적 피크 추정을 통한 항공수심라이다 웨이브폼 분해)

  • Kim, Hyejin; Lee, Jaebin; Kim, Yongil; Wie, Gwangjae
    • Korean Journal of Remote Sensing, v.37 no.6_1, pp.1709-1718, 2021
  • The waveform data of an Airborne Bathymetric LiDAR (ABL; LiDAR: Light Detection And Ranging) system provide improved accuracy, resolution, and reliability compared to discrete-return data, increase the user's control over data processing, and allow additional information to be extracted from the return signal. Waveform decomposition is a technique that separates each echo from the received waveform, which is a mixture of water-surface and seabed reflections, water-body backscattering, and various noise sources. In this study, a new waveform decomposition technique based on a Gaussian model was developed to improve the point extraction performance from ABL waveform data. In existing waveform decomposition techniques, the number of decomposed echoes and the decomposition performance depend on the peak detection results because waveform peaks are used as initial values. In this study, however, we improved the approximation accuracy of the decomposition model by adding estimated potential peak candidates to the initial peaks. In an experiment using waveform data acquired over the East Coast with the Seahawk system, the precision of the decomposition model was improved by about 37% in terms of RMSE compared to the conventional Gaussian decomposition method.
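
A minimal sketch of Gaussian waveform decomposition in the spirit of the abstract: detect peaks in a received waveform, use them as initial values, and fit a sum of Gaussians by nonlinear least squares. The paper's potential-peak augmentation is not reproduced here; the waveform and all parameters are synthetic assumptions.

```python
# Gaussian decomposition of a synthetic bathymetric LiDAR waveform.
import numpy as np
from scipy.signal import find_peaks
from scipy.optimize import curve_fit

t = np.linspace(0, 100, 400)                          # time bins (ns, assumed)
truth = (1.0 * np.exp(-0.5 * ((t - 30) / 3) ** 2)     # water-surface echo
         + 0.4 * np.exp(-0.5 * ((t - 62) / 5) ** 2))  # seabed echo
wave = truth + np.random.default_rng(2).normal(0, 0.02, t.size)

def gauss_sum(t, *p):
    """Sum of Gaussians; p = (A1, mu1, s1, A2, mu2, s2, ...)."""
    y = np.zeros_like(t)
    for A, mu, s in zip(p[0::3], p[1::3], p[2::3]):
        y += A * np.exp(-0.5 * ((t - mu) / s) ** 2)
    return y

# Initial guesses from detected peaks (amplitude, position, nominal width).
idx, _ = find_peaks(wave, height=0.1, prominence=0.1)
p0 = []
for i in idx:
    p0 += [wave[i], t[i], 4.0]

popt, _ = curve_fit(gauss_sum, t, wave, p0=p0)
rmse = np.sqrt(np.mean((wave - gauss_sum(t, *popt)) ** 2))
print("fitted components:", len(popt) // 3, "RMSE:", rmse)
```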

Risk Characteristic on Fat-tails of Return Distribution: An Evidence of the Korean Stock Market

  • Eom, Cheoljun
    • Asia-Pacific Journal of Business, v.11 no.4, pp.37-48, 2020
  • Purpose - This study empirically investigates whether the risk property contained in the fat tails of return distributions is systematic or unsystematic, based on the devised statistical methods. Design/methodology/approach - This study devised empirical designs based on two traditional methods: principal component analysis (PCA) and a test of the portfolio diversification effect. The fatness of the tails in return distributions is quantitatively measured by statistical probability. Findings - According to the results, the risk property in the fat tails of return distributions corresponds, through PCA, to components with eigenvalues greater than 1, and to systematic risk that cannot be removed through portfolio diversification. In other words, the fat tails of return distributions have the properties of common factors that may explain the changes of stock returns. Meanwhile, the fatness of the tails in portfolio return distributions shows an asymmetric relationship with the common factors: the negative tail of the portfolio return distribution has a much closer relation to the common factors than the positive tail. Research implications or Originality - This empirical evidence may complement existing studies on tail risk, which is utilized in pricing models as a common factor.
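
An illustrative sketch that only loosely follows the abstract's two designs: (1) PCA on a return correlation matrix with the eigenvalue-greater-than-1 rule, and (2) a diversification test that tracks a tail-fatness measure as more stocks enter an equal-weighted portfolio. The Student-t returns, the common factor, and the 2-standard-deviation threshold are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
n_stocks, T = 50, 2000
factor = rng.standard_t(df=4, size=T)                   # fat-tailed common factor
returns = 0.6 * factor[:, None] + rng.standard_t(df=4, size=(T, n_stocks))

# (1) PCA via eigenvalues of the correlation matrix; count eigenvalues > 1.
eigvals = np.linalg.eigvalsh(np.corrcoef(returns, rowvar=False))
print("eigenvalues > 1:", np.sum(eigvals > 1.0))

# (2) Tail fatness = probability mass beyond 2 standard deviations, measured
# on equal-weighted portfolios of increasing size.
def tail_prob(x, k=2.0):
    z = (x - x.mean()) / x.std()
    return np.mean(np.abs(z) > k)

for n in (1, 5, 20, 50):
    port = returns[:, :n].mean(axis=1)
    print(f"portfolio of {n:2d} stocks, tail prob beyond 2 sd: {tail_prob(port):.3f}")
```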

Complexity Estimation Based Work Load Balancing for a Parallel Lidar Waveform Decomposition Algorithm

  • Jung, Jin-Ha; Crawford, Melba M.; Lee, Sang-Hoon
    • Korean Journal of Remote Sensing, v.25 no.6, pp.547-557, 2009
  • LIDAR (LIght Detection And Ranging) is an active remote sensing technology which provides 3D coordinates of the Earth's surface by performing range measurements from the sensor. Early small-footprint LIDAR systems recorded multiple discrete returns from the back-scattered energy. Recent advances in LIDAR hardware now make it possible to record full digital waveforms of the returned energy. LIDAR waveform decomposition involves separating the return waveform into a mixture of components which are then used to characterize the original data. The most common statistical mixture model used for this process is the Gaussian mixture. Waveform decomposition plays an important role in LIDAR waveform processing, since the resulting components are expected to represent reflection surfaces within waveform footprints; hence the decomposition results ultimately affect the interpretation of LIDAR waveform data. Computational requirements in the waveform decomposition process result from two factors: (1) estimation of the number of components in a mixture and the resulting parameter estimates, which are inter-related and cannot be solved separately, and (2) parameter optimization, which does not have a closed-form solution and thus must be solved iteratively. The current state-of-the-art airborne LIDAR system acquires more than 50,000 waveforms per second, so decomposing this enormous number of waveforms is challenging on a traditional single-processor architecture. To tackle this issue, four parallel LIDAR waveform decomposition algorithms with different work-load balancing schemes - (1) no weighting, (2) decomposition-results-based linear weighting, (3) decomposition-results-based squared weighting, and (4) decomposition-time-based linear weighting - were developed and tested with varying numbers of processors (8-256), and the results were compared in terms of efficiency. Overall, the decomposition-time-based linear weighting approach yielded the best performance among the four approaches.
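
A small sketch of the work-load balancing idea in the abstract: rather than splitting waveforms evenly across processors, give each waveform a weight (here, an assumed complexity proxy such as expected decomposition time) and assign waveforms greedily to the currently least-loaded worker. This is a generic longest-processing-time heuristic, not the paper's exact weighting schemes.

```python
import heapq
import numpy as np

rng = np.random.default_rng(4)
n_waveforms, n_workers = 10_000, 8
# Assumed complexity proxy per waveform (e.g., expected decomposition time).
est_cost = rng.gamma(shape=2.0, scale=1.0, size=n_waveforms)

def balance(costs, workers):
    """Greedy assignment: give the next-heaviest job to the lightest worker."""
    heap = [(0.0, w, []) for w in range(workers)]      # (load, worker id, jobs)
    heapq.heapify(heap)
    for job in np.argsort(costs)[::-1]:                # heaviest jobs first
        load, w, jobs = heapq.heappop(heap)
        jobs.append(job)
        heapq.heappush(heap, (load + costs[job], w, jobs))
    return heap

assignment = balance(est_cost, n_workers)
loads = sorted(load for load, _, _ in assignment)
print("max/min worker load ratio:", loads[-1] / loads[0])
```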

Packet Scheduling in Interactive Satellite Return Channels for Mobile Multimedia Services Using Hybrid CDMA/TDMA

  • Lee Ki-Dong; Kim Ho-Kyom; Lee Ho-Jin
    • Proceedings of the Korean Operations and Management Science Society Conference, 2003.05a, pp.744-748, 2003
  • Developing an interactive satellite multimedia system, such as a digital video broadcasting (DVB) return channel via satellite (RCS) system, is gaining popularity around the world. To accommodate the increasing traffic demand, we are motivated to investigate an alternative for improving return channel utilization. We develop an efficient method for optimal packet scheduling in an interactive satellite multimedia system using hybrid CDMA/TDMA channels. We formulate the timeslot-code assignment problem as a binary integer programming (BIP) problem, where throughput maximization is the objective, and decompose this BIP problem into two sub-problems for the purpose of solution efficiency. With this decomposition, we improve the computational efficiency of finding the optimal solution of the original BIP problem. Since 2001, ETRI has been involved in a development project in which we have successfully completed an initial system integration test on broadband mobile Internet access via Ku-band channels using the proposed resource allocation algorithm.
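
A hedged sketch of a tiny timeslot/code assignment cast as a binary integer program with a throughput-maximizing objective, in the spirit of the formulation the abstract describes. The demands, rates, and problem size are invented, and the model is solved directly rather than through the paper's two-subproblem decomposition.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

n_term, n_slot, n_code = 3, 4, 2
demand = np.array([3, 2, 4])              # requested timeslots per terminal (assumed)
rate = np.array([1.0, 1.5, 0.8])          # throughput per assigned slot (assumed)

n = n_term * n_slot * n_code              # x[t, s, c] flattened to length n
idx = lambda t, s, c: (t * n_slot + s) * n_code + c

# Each (timeslot, code) resource can serve at most one terminal.
A_res = np.zeros((n_slot * n_code, n))
for s in range(n_slot):
    for c in range(n_code):
        for t in range(n_term):
            A_res[s * n_code + c, idx(t, s, c)] = 1

# Each terminal receives no more slots than it requested.
A_dem = np.zeros((n_term, n))
for t in range(n_term):
    for s in range(n_slot):
        for c in range(n_code):
            A_dem[t, idx(t, s, c)] = 1

obj = -np.repeat(rate, n_slot * n_code)   # milp minimizes, so negate throughput
res = milp(
    c=obj,
    constraints=[LinearConstraint(A_res, 0, 1), LinearConstraint(A_dem, 0, demand)],
    integrality=np.ones(n),
    bounds=Bounds(0, 1),
)
print("max throughput:", -res.fun, "assignments:", int(res.x.sum()))
```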


The Pricing of Accruals Quality with Expected Returns: Vector Autoregression Return Decomposition Approach

  • YIM, Sang-Giun
    • The Journal of Industrial Distribution & Business, v.11 no.3, pp.7-17, 2020
  • Purpose: This study reexamines the test of the pricing of accruals quality. Theory suggests that information risk is a priced risk factor. Using accruals quality as the proxy for information risk, researchers have tested the pricing of information risk, but the results are inconsistent, potentially because of the information shocks contained in the realized returns used as the proxy for expected returns. Based on this argument, this study revisits the issue using a measure of expected returns that excludes information shocks. Research design, data and methodology: This study estimates expected returns using a vector autoregression model. This method extracts information shocks more thoroughly than the methods in prior studies; therefore, the concern regarding information shocks is minimized. As risk premiums are larger in recession periods than in expansion periods, recession and expansion subsamples were used to confirm the robustness of the main findings. For the pricing test, this study uses two-stage cross-sectional regression. Results: The empirical results provide evidence that accruals quality is a priced risk factor. Furthermore, this study finds that the pricing of accruals quality is observed only in recession periods. Conclusions: This study supports the argument that accruals quality, and thus information risk, is a priced risk factor.
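
A minimal sketch of a two-stage cross-sectional pricing test of the kind the abstract mentions: stage 1 estimates each asset's loading (beta) on a candidate factor from time-series regressions; stage 2 regresses average returns on those betas to check whether the factor carries a premium. The "accruals quality factor" here is synthetic and every parameter is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
T, N = 360, 25                                   # months, test portfolios (assumed)
aq_factor = rng.normal(0.3, 1.0, T)              # assumed accruals-quality factor
true_beta = rng.uniform(0.2, 1.5, N)
returns = 0.4 + aq_factor[:, None] * true_beta + rng.normal(0, 2.0, (T, N))

# Stage 1: time-series regression per portfolio, r_it = a_i + b_i * f_t + e_it.
X1 = np.column_stack([np.ones(T), aq_factor])
betas = np.linalg.lstsq(X1, returns, rcond=None)[0][1]   # slope per portfolio

# Stage 2: cross-sectional regression of mean returns on the estimated betas.
X2 = np.column_stack([np.ones(N), betas])
gamma = np.linalg.lstsq(X2, returns.mean(axis=0), rcond=None)[0]
print("estimated risk premium on the factor:", gamma[1])
```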

Decomposition and Super-efficiency in the Korean Life Insurance Industry Employing DEA

  • Lee, Hyung-Suk; Kim, Ki-Seog
    • International Journal of Contents, v.4 no.3, pp.1-9, 2008
  • The Korean life insurance industry has undergone profound changes, such as the introduction of variable insurance in July 2001 and the enforcement of bancassurance in August 2003. However, little empirical research has analyzed data that include the bancassurance of life insurance companies operating in Korea. In response to this lack of research, this paper applies DEA (data envelopment analysis) models to measure and decompose their efficiency. We find that life insurance companies operating in Korea differ somewhat in the composition ratio of their inputs and outputs, owing to the increased variety of distribution channels and new products. We provide efficiency scores, returns to scale, and reference frequencies, and decompose CCR, BCC, and SBM efficiency into scale efficiency and MIX efficiency. We then investigate whether the sources of inefficiency are the inefficient operation of the DMU, disadvantageous conditions, differences in the composition ratio of inputs and outputs relative to the reference set, or a combination of these. Most companies in the sample display either constant or decreasing returns to scale. The efficiency rankings were less consistent among models and efficient DMUs; in response, we use the super-efficiency model to rank them and then compare the rankings of the DMUs among the various models. We also conclude that the availability of panel data, rather than cross-sectional data, would greatly improve the validity of the efficiency estimates.
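
A compact sketch of an input-oriented CCR DEA model, the basic building block that the abstract decomposes into scale and MIX efficiency, solved as a linear program for each decision-making unit (DMU). The input/output data are invented, and the BCC/SBM/super-efficiency variants are not shown.

```python
import numpy as np
from scipy.optimize import linprog

# Assumed data: 5 insurers, 2 inputs (labor, capital), 1 output (premiums).
X = np.array([[20, 300], [30, 250], [40, 400], [25, 280], [50, 500]], float)
Y = np.array([[100], [120], [130], [95], [160]], float)
n, m, s = X.shape[0], X.shape[1], Y.shape[1]

def ccr_efficiency(k):
    """Input-oriented CCR score for DMU k; variables are [theta, lambda_1..n]."""
    c = np.r_[1.0, np.zeros(n)]                      # minimize theta
    # Inputs: X.T @ lam - theta * x_k <= 0
    A_in = np.c_[-X[k].reshape(m, 1), X.T]
    # Outputs: -Y.T @ lam <= -y_k  (i.e., Y.T @ lam >= y_k)
    A_out = np.c_[np.zeros((s, 1)), -Y.T]
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[k]]
    bounds = [(None, None)] + [(0, None)] * n        # theta free, lambdas >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun

for k in range(n):
    print(f"DMU {k}: CCR efficiency = {ccr_efficiency(k):.3f}")
```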

A forensic study of the Lubbock-Reese downdraft of 2002

  • Holmes, J.D.; Hangan, H.M.; Schroeder, J.L.; Letchford, C.W.; Orwig, K.D.
    • Wind and Structures, v.11 no.2, pp.137-152, 2008
  • This paper discusses engineering aspects of the rear-flank downdraft that was recorded near Lubbock, Texas on 4 June 2002, and produced a gust wind speed nearly equal to the design value (50-year return period) for the region. The general characteristics of the storm, and the decomposition of the time histories into deterministic 'running mean' and random turbulence components are discussed. The fluctuating wind speeds generated by the event can be represented as a dominant low-frequency 'running mean' with superimposed random turbulence of higher frequencies. Spectral and correlation characteristics of the residual turbulence are found to be similar to those of high-frequency turbulence in boundary-layer winds. However, the low-frequency components in the running-mean wind speeds are spatially homogeneous, in contrast to the low-frequency turbulence found in synoptic boundary-layer winds. With respect to transmission line design, this results in significantly higher 'span reduction factors'.
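
A short sketch of the decomposition the abstract describes: splitting a non-stationary wind-speed record into a slowly varying "running mean" and a residual turbulence component using a centered moving average. The record here is synthetic, and the 40 s averaging window is an assumed choice, not the value used in the paper.

```python
import numpy as np

fs = 10.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 600, 1 / fs)               # 10-minute record
# Synthetic downdraft-like record: slow gust-front ramp plus random turbulence.
mean_part = 8 + 20 * np.exp(-0.5 * ((t - 300) / 60) ** 2)
wind = mean_part + np.random.default_rng(6).normal(0, 1.5, t.size)

window = int(40 * fs)                       # 40 s centered moving average
kernel = np.ones(window) / window
running_mean = np.convolve(wind, kernel, mode="same")
turbulence = wind - running_mean            # residual fluctuating component

print("peak running mean:", running_mean.max())
print("residual turbulence std:", turbulence.std())
```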