• Title/Summary/Keyword: utility estimation


Estimation and Prediction of Financial Distress: Non-Financial Firms in Bursa Malaysia

  • HIONG, Hii King;JALIL, Muhammad Farhan;SENG, Andrew Tiong Hock
    • The Journal of Asian Finance, Economics and Business / v.8 no.8 / pp.1-12 / 2021
  • Altman's Z-score is used to measure a company's financial health and to predict the probability that it will collapse within two years. It has proven highly accurate in forecasting bankruptcy across a wide variety of contexts and markets. The goal of this study is to use Altman's Z-score model to forecast insolvency in non-financial publicly traded enterprises. Non-financial firms are a significant industry in Malaysia, and current trends of consolidation and long-term government subsidies make assessing the financial health of such businesses critical not just for the owners but also for other stakeholders. The sample of this study comprises 84 companies listed on the Kuala Lumpur Stock Exchange, of which 52 are considered high-risk and 32 low-risk. Secondary data for the analysis were gathered from the chosen companies' financial reports. The findings show that the Altman model can be used to forecast a company's financial collapse, dispelling reservations about the model's legitimacy and its utility for predicting the likelihood of bankruptcy. These findings have significant consequences for investors, creditors, and corporate management: portfolio managers who understand the variables that contribute to corporate distress can make better selections by avoiding companies shown to be in danger of failing.
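The classic Z-score for publicly traded manufacturing firms combines five accounting ratios with fixed coefficients. A minimal sketch in Python; the input figures are illustrative, and the conventional 1.81/2.99 zone cut-offs are used here, which may differ from the sample-specific cut-offs in the paper:

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Classic Altman Z-score for publicly traded manufacturing firms."""
    x1 = working_capital / total_assets
    x2 = retained_earnings / total_assets
    x3 = ebit / total_assets
    x4 = market_value_equity / total_liabilities
    x5 = sales / total_assets
    return 1.2 * x1 + 1.4 * x2 + 3.3 * x3 + 0.6 * x4 + 1.0 * x5

def zone(z):
    # Conventional cut-offs: distress below 1.81, safe above 2.99.
    if z < 1.81:
        return "distress"
    if z > 2.99:
        return "safe"
    return "grey"
```

A firm with, say, a Z-score in the grey zone would warrant closer scrutiny before investment.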

Optimal threshold using the correlation coefficient for the confusion matrix (혼동행렬의 상관계수를 이용한 최적분류점)

  • Hong, Chong Sun;Oh, Se Hyeon;Choi, Ye Won
    • The Korean Journal of Applied Statistics / v.35 no.1 / pp.77-91 / 2022
  • Optimal threshold estimation is considered in order to discriminate between the components of a mixture distribution in fields such as biostatistics and credit evaluation. There exist various well-known accuracy measures for examining discriminant power. Recently, the Matthews correlation coefficient and the F1 statistic have been studied for estimating optimal thresholds. In this study, we explore whether these accuracy measures are appropriate for choosing the optimal threshold to discriminate a mixture distribution. It is found that some accuracy measures that depend on the sample size are not appropriate when the two sample sizes differ greatly. Moreover, an alternative method for finding the optimal threshold is proposed using the correlation coefficient defined from the ratios of the confusion matrix, and the usefulness and utility of this method are also discussed.
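The Matthews correlation coefficient and F1 statistic mentioned above are both computed from the four cells of a confusion matrix, and an optimal threshold can be chosen by scanning candidate cut-offs. A minimal sketch; the scoring data and threshold grid are illustrative, not the paper's procedure:

```python
import math

def mcc(tp, fp, fn, tn):
    """Matthews correlation coefficient from confusion-matrix counts."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

def f1(tp, fp, fn, tn):
    """F1 statistic (harmonic mean of precision and recall)."""
    return 2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0

def optimal_threshold(scores_pos, scores_neg, measure, thresholds):
    """Pick the threshold that maximizes a confusion-matrix measure."""
    best_t, best_v = None, -2.0
    for t in thresholds:
        tp = sum(s >= t for s in scores_pos)
        fn = len(scores_pos) - tp
        fp = sum(s >= t for s in scores_neg)
        tn = len(scores_neg) - fp
        v = measure(tp, fp, fn, tn)
        if v > best_v:
            best_t, best_v = t, v
    return best_t, best_v
```

Note that F1 ignores the true negatives entirely, which is one reason measures can disagree when the two sample sizes differ greatly.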

Ground Deformation Evaluation during Vertical Shaft Construction through Digital Image Analysis

  • Woo, Sang-Kyun;Woo, Sang Inn;Kim, Joonyoung;Chu, Inyeop
    • KEPCO Journal on Electric Power and Energy / v.7 no.2 / pp.285-293 / 2021
  • The construction of underground structures such as power supply lines, communication lines, and utility tunnels has increased significantly worldwide to improve urban aesthetics, ensure citizen safety, and use underground space efficiently. These underground structures are usually constructed along with vertical cylindrical shafts that facilitate their construction and maintenance. When a vertical shaft is built by the open-cut method, the walls are mostly designed to be flexible, allowing a certain level of displacement. The earth pressure applied to the flexible walls acts as an external force, and its accurate estimation is essential for reasonable and economical structural design. This earth pressure is closely related to the displacement of the surrounding ground. This study simulated stepwise excavation for constructing a cylindrical vertical shaft through a centrifugal model experiment. One quadrant of the axisymmetric vertical shaft and the ground were modeled, and ground excavation was simulated by shrinking the vertical shaft. The deformation occurring over the entire ground during excavation was evaluated continuously through digital image analysis, which captured complex ground deformation varying with wall displacement, distance from the wall, and ground depth. As ground deformation data accumulate through this method, they can be used to develop shaft wall models for analyzing the earth pressure acting on such walls.

A MULTI-OBJECTIVE OPTIMIZATION FOR CAPITAL STRUCTURE IN PRIVATELY-FINANCED INFRASTRUCTURE PROJECTS

  • S.M. Yun;S.H. Han;H. Kim
    • International conference on construction engineering and project management / 2007.03a / pp.509-519 / 2007
  • Private financing is playing an increasing role in public infrastructure construction projects worldwide. However, private investors/operators are exposed to the financial risk of low profitability due to inaccurate estimation of facility demand, operation income, maintenance costs, etc. From the operator's perspective, a sound and thorough financial feasibility study is required to establish the appropriate capital structure of a project. Operators tend to reduce the equity amount to minimize their risk exposure, while creditors push to raise it in an attempt to secure a sufficient level of financial involvement from the operators. Therefore, it is important for creditors and operators to reach agreement on a balanced capital structure that jointly considers both profitability and repayment capacity. This paper presents an optimal capital structure model for successful private infrastructure investment. The model finds the point where profitability is balanced with repayment capacity, using the concept of a utility function and multi-objective GA (Genetic Algorithm)-based optimization. A case study is presented to validate and verify the model. The research provides a proper capital structure for privately-financed infrastructure projects through the proposed multi-objective model.
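The balancing idea can be illustrated with a simple weighted-utility search over the equity ratio. This is only a sketch: the paper uses a multi-objective genetic algorithm, and the two objective curves below are hypothetical stand-ins for profitability (favoring low equity) and repayment capacity (favoring high equity):

```python
import math

def utility(profitability, repayment, w=0.5):
    # Weighted-sum scalarization of the two competing objectives.
    return w * profitability + (1 - w) * repayment

def optimal_equity_ratio(step=0.01):
    """Grid search over the equity ratio; both curves are illustrative only."""
    best_e, best_u = 0.0, -1.0
    steps = int(round(1.0 / step))
    for i in range(steps + 1):
        e = i * step
        profitability = 1.0 - e ** 2   # operators: more debt, higher return on equity
        repayment = math.sqrt(e)       # creditors: more equity, safer debt service
        u = utility(profitability, repayment)
        if u > best_u:
            best_e, best_u = e, u
    return best_e, best_u
```

With equal weights, the optimum lands strictly between the extremes, which is the "balanced capital structure" intuition; a GA plays the same role when the objectives are not smooth closed-form curves.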


PRICE ESTIMATION VIA BAYESIAN FILTERING AND OPTIMAL BID-ASK PRICES FOR MARKET MAKERS

  • Hyungbin Park;Junsu Park
    • Journal of the Korean Mathematical Society / v.61 no.5 / pp.875-898 / 2024
  • This study estimates the true price of an asset and finds the optimal bid/ask prices for market makers. We provide a novel state-space model based on the exponential Ornstein-Uhlenbeck volatility and Heston models with Gaussian noise, where the traded price and volume are available but the true price is not observable. One objective of this study is to use Bayesian filtering to estimate the posterior distribution of the true price given the traded price and volume. Because the posterior density is intractable, we employ the guided particle filtering algorithm, in which adaptive rejection Metropolis sampling is used to generate samples from the density function of an unknown distribution. Given a simulated sample path, the posterior expectation of the true price outperforms the traded price as an estimate of the true price in terms of both the mean absolute error and root-mean-square error metrics. Another objective is to determine the optimal bid/ask prices for a market maker. The profit-and-loss (PnL) of the market maker is the difference between the true price and its bid/ask prices, multiplied by the traded volume or the market maker's bid/ask size. The market maker maximizes the expected utility of the PnL under the posterior distribution. We numerically calculate the optimal bid/ask prices using the Monte Carlo method, finding that the spread widens as the market maker becomes more risk-averse and as the bid/ask size and the level of uncertainty increase.
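A bootstrap particle filter on a toy random-walk price model illustrates the filtering step. This is not the paper's guided particle filter with adaptive rejection Metropolis sampling on a Heston-type model; the state and observation noise levels below are assumptions:

```python
import math, random

def bootstrap_particle_filter(observations, n_particles=500,
                              state_sigma=0.05, obs_sigma=0.1, seed=0):
    """Estimate the latent 'true price' behind noisy traded prices.

    Toy model (not the paper's Heston/exp-OU model):
        x_t = x_{t-1} + N(0, state_sigma^2)   latent true price
        y_t = x_t + N(0, obs_sigma^2)         observed traded price
    """
    rng = random.Random(seed)
    particles = [observations[0]] * n_particles
    estimates = []
    for y in observations:
        # Propagate particles, then weight by the Gaussian observation likelihood.
        particles = [x + rng.gauss(0.0, state_sigma) for x in particles]
        weights = [math.exp(-0.5 * ((y - x) / obs_sigma) ** 2) for x in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        # Posterior mean of the true price at this step.
        estimates.append(sum(w * x for w, x in zip(weights, particles)))
        # Multinomial resampling to avoid weight degeneracy.
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return estimates
```

The posterior mean sequence plays the role of the "estimated true price" that the paper compares against the raw traded price.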

Estimation of Fire Experiment Prediction by Utility Tunnels Fire Experiment and Simulation (지하공동구 화재 실험 및 시뮬레이션에 의한 화재 설칠 예측 평가)

  • 윤명오;고재선;박형주;박성은
    • Fire Science and Engineering / v.15 no.1 / pp.23-33 / 2001
  • Utility tunnels are important national infrastructure underpinning recent developments in communications. However, a utility tunnel is difficult to deal with in case of a fire accident. When a cable burns, black smoke containing poisonous gas is produced; this smoke fills the tunnel and makes it difficult to extinguish the fire. A fire in a utility tunnel can therefore paralyze critical national functions, causing property damage and communication interruption in addition to inconvenience for people. This paper reenacts fires that occurred in the past using a real-scale utility tunnel model. Its aim is the scientific analysis of the characteristics of the fire and the verification of whether each fire protection system works as intended after installation in the utility tunnel. The fire experiment was conducted in a utility tunnel equipped with a linear heat detector, a fire door, a connection water spray system, and a ventilation system. A fixed portion of an electric power supply cable was coated with a fire retardant coating, and a heating tube was covered with fireproofing. The results showed that the highest temperature was 932°C; the linear heat detector operated at its rated temperature and indicated the location of the fire on the receiving board. The fire retardant coating on the fixed portion of the power supply cable did not function as fireproofing, while the fireproofing covering the heating tube held for about 30 minutes.


The Cost and Adjustment Factors Estimation Method from the Perspective of Provider for Information System Maintenance Cost (공급자 관점의 정보시스템 유지보수 비용항목과 조정계수 산정방안)

  • Lee, ByoungChol;Rhew, SungYul
    • KIPS Transactions on Software and Data Engineering / v.2 no.11 / pp.757-764 / 2013
  • Estimation of information system maintenance costs has so far been conducted from the perspective of the ordering body, leaving unsolved the problem that the provider must absorb costs when the estimate is small relative to the amount of work. This study is a base study for estimating information system maintenance costs from the provider's perspective: it deduces cost items for maintenance and suggests adjustment factors for closing the gap between the ordering body and the provider regarding the maintenance cost. To deduce the cost items, this study adds the provider's maintenance activities to prior work on cost factors for maintenance activities, then divides and classifies them into fixed and variable costs. To adjust the gap between the ordering body and the provider, this study identified adjustment factors, such as code, utilities, and components created by automatic tools, that were not included when estimating maintenance costs from the ordering body's perspective. Examining and analyzing three years of Company K's maintenance performance data confirmed that the gap attributable to the adjustment factors was about 13% in Company K's case.

Causal inference from nonrandomized data: key concepts and recent trends (비실험 자료로부터의 인과 추론: 핵심 개념과 최근 동향)

  • Choi, Young-Geun;Yu, Donghyeon
    • The Korean Journal of Applied Statistics / v.32 no.2 / pp.173-185 / 2019
  • Causal questions are prevalent in scientific research: for example, how effective a treatment was in preventing an infectious disease, how much a policy increased utility, or which advertisement would give the highest click rate for a given customer. Causal inference theory in statistics interprets such questions as inferring the effect of a given intervention (treatment or policy) in the data generating process. Causal inference has been used in medicine, public health, and economics; in addition, it has recently received attention as a tool for data-driven decision making. Many recent datasets are observational rather than experimental, which makes causal inference more complex. This review introduces key concepts and recent trends of statistical causal inference in observational studies. We first introduce the Neyman-Rubin potential outcome framework to formalize causal questions as average treatment effects, and discuss popular methods for estimating treatment effects, such as propensity score approaches and regression approaches. As recent trends, we briefly discuss (1) conditional (heterogeneous) treatment effects and machine learning-based approaches, (2) the curse of dimensionality in the estimation of treatment effects and its remedies, and (3) Pearl's structural causal model for dealing with more complex causal relationships and its connection to the Neyman-Rubin potential outcome model.
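For the propensity score approaches mentioned above, the simplest estimator is inverse-probability weighting (IPW) of the observed outcomes. A minimal sketch assuming the propensity score e(x) is already known for each unit (in practice it would be estimated, e.g. by logistic regression):

```python
def ipw_ate(data):
    """Inverse-probability-weighted average treatment effect.

    data: list of (treatment in {0,1}, outcome, propensity e(x)) tuples.
    ATE_hat = mean(T*Y/e) - mean((1-T)*Y/(1-e))
    """
    n = len(data)
    treated = sum(t * y / e for t, y, e in data) / n
    control = sum((1 - t) * y / (1 - e) for t, y, e in data) / n
    return treated - control
```

Weighting each unit by the inverse of its treatment probability rebalances the observational sample so that, under the usual unconfoundedness assumption, the weighted difference in means estimates the average treatment effect.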

Estimation of Overall Household Utility for Heavy Metal Reduction in Shrimp (새우류 중금속 저감에 대한 전체가구의 효용 추정)

  • Hyun Joung Jin;Ye Jin We
    • Journal of Food Hygiene and Safety / v.38 no.4 / pp.255-263 / 2023
  • The standards for heavy metal levels in crustaceans are 0.5 mg/kg or lower for lead and 1.0 mg/kg or lower for cadmium. Further, the contamination levels of arsenic, mercury, methyl mercury, and tin are being continuously investigated in light of their current exposure levels. Shrimp are potentially exposed to heavy metals because they inhabit areas with abundant organic matter, such as sandy or muddy shores, places with a lot of seaweed, and estuaries. This study measured the monetary value of reducing consumer anxiety and increasing consumer confidence if the government were to prohibit the sale of shrimp species that exceed the threshold for specific heavy metals, and of the top shrimp species for which no heavy metal threshold is specified. We derived consumer willingness-to-pay (WTP). Combining the estimated WTP with the number of households in the country, the total value of the benefits was estimated at 363.9 billion won. The results provide an important empirical finding, showing the extent to which specific policies on heavy metals in seafood can alleviate consumer anxiety and provide psychological reassurance.
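The aggregation step described above is a straightforward multiplication of the mean household WTP by the number of households; the figures in the sketch below are hypothetical, not the paper's estimates:

```python
def aggregate_benefit(wtp_per_household_won, n_households):
    """Total benefit = mean household WTP x number of households (in won)."""
    return wtp_per_household_won * n_households
```

For instance, a hypothetical mean WTP of 10,000 won across 1,000,000 households would aggregate to 10 billion won; the paper's 363.9 billion won figure follows the same arithmetic with its estimated WTP and the national household count.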

Analysis of Climate Characteristics Observed over the Korean Peninsula for the Estimation of Climate Change Vulnerability Index (기후변화 취약성 지수 산출을 위한 한반도 관측 기후 특성 분석)

  • Nam, Ki-Pyo;Kang, Jeong-Eon;Kim, Cheol-Hee
    • Journal of Environmental Impact Assessment / v.20 no.6 / pp.891-905 / 2011
  • The climate vulnerability index is usually defined as a function of climate exposure, sensitivity, and adaptive capacity, which requires adequate selection of proxy variables for each component. We selected 9 proxy variables related to climate exposure from the literature and diagnosed their adequacy for application to the Korean peninsula. The selected proxy variables are: four from temperature, three from precipitation, one from wind speed, and one from relative humidity. We collected climate data for both the past period (1981~2010) and a future climate scenario (the A1B scenario of the IPCC SRES) for 2020, 2050, and 2100. We introduced spatial and temporal diagnostic statistical parameters and evaluated both spatial and temporal variabilities on a relative scale. Of the 9 proxy variables, effective humidity was the most temporally sensitive to climate change, with the largest spatial variability, implying that it is a good proxy variable for diagnosing climate change vulnerability in Korea. The second most sensitive variable was the frequency of strong wind, which showed a decreasing trend, suggesting that it should be used carefully or may not be of broad utility as a proxy variable in Korea. The A1B scenario for 2020, 2050, and 2100 matches well with the extension of the linear trends of the observed variables during 1981~2010, indicating that, except for strong wind speed, the selected proxy variables can be used effectively in calculating the vulnerability index for both past and future climate over Korea. Other local variabilities of the climate exposure variables for the past and future climate are also discussed.
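A common additive form of such an index combines normalized exposure, sensitivity, and adaptive-capacity components. The min-max normalization and the weights below are illustrative assumptions, not the paper's formula:

```python
def normalize(values):
    """Min-max scale a list of proxy-variable values to [0, 1]."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def vulnerability_index(exposure, sensitivity, adaptive_capacity,
                        weights=(1.0, 1.0, 1.0)):
    """One common additive form: V = w1*E + w2*S - w3*AC, components in [0, 1].

    Higher exposure and sensitivity raise vulnerability; higher adaptive
    capacity lowers it.
    """
    w1, w2, w3 = weights
    return w1 * exposure + w2 * sensitivity - w3 * adaptive_capacity
```

Each component would itself be an aggregate of normalized proxy variables, such as the nine exposure proxies evaluated in the study.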