• Title/Summary/Keyword: Probabilistic Prediction

Search Results: 280

Bayesian Network Analysis for the Dynamic Prediction of Financial Performance Using Corporate Social Responsibility Activities (베이지안 네트워크를 이용한 기업의 사회적 책임활동과 재무성과)

  • Sun, Eun-Jung
    • Management & Information Systems Review, v.34 no.5, pp.71-92, 2015
  • This study analyzes the impact of Corporate Social Responsibility (CSR) activities on financial performance using a Bayesian Network. The research aims to overcome the uniform assumption of a linear relationship between financial performance and CSR activities made in the multiple regression analyses widely used in previous studies. What is required is to infer the causal relationships between the CSR activities that affect financial performance. Identifying these relationships would empower firms to improve their financial performance by informing decision makers about which CSR activities influence it. This research proposes a General Bayesian Network (GBN) and presents the Markov Blanket induced from the GBN. All the propositions presented in this study are empirically demonstrated to be statistically significant using data collected by the Korean Economic Justice Institute (KEJI) under the Citizen's Coalition for Economic Justice (CCEJ), which investigated approximately 200 companies in Korea based on the KEJI index from 2005 to 2011. The Bayesian Network makes it possible to infer the properties affecting financial performance through probabilistic causal relationships. Moreover, I found causal relationships among the CSR activity variables: Environment protection is related to Customer protection, Employee satisfaction, and firm size, and Soundness is related to Total CSR Evaluation Score and Debt-Assets Ratio. Through what-if analysis, I identify the most sensitive factors among the explanatory variables.
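
As a rough illustration of the kind of what-if inference a Bayesian Network supports, the sketch below builds a tiny discrete network by hand and queries it by enumeration. The variable names and probabilities are hypothetical placeholders, not the KEJI data or the GBN structure estimated in the paper.

```python
# Minimal what-if inference on a tiny discrete Bayesian network (illustrative only).
from itertools import product

# Binary variables: 1 = high, 0 = low (placeholder probabilities)
p_env = {1: 0.4, 0: 0.6}                      # P(EnvironmentProtection)
p_sound = {1: 0.5, 0: 0.5}                    # P(Soundness)
# P(FinancialPerformance = high | EnvironmentProtection, Soundness)
p_fin_high = {(1, 1): 0.70, (1, 0): 0.55, (0, 1): 0.45, (0, 0): 0.30}

def joint(env, sound, fin):
    """Joint probability under the assumed factorization P(env)P(sound)P(fin|env,sound)."""
    p_f = p_fin_high[(env, sound)] if fin == 1 else 1.0 - p_fin_high[(env, sound)]
    return p_env[env] * p_sound[sound] * p_f

def query(fin_value, evidence):
    """P(FinancialPerformance = fin_value | evidence) by brute-force enumeration."""
    num = den = 0.0
    for env, sound, fin in product((0, 1), repeat=3):
        assignment = {"env": env, "sound": sound, "fin": fin}
        if any(assignment[k] != v for k, v in evidence.items()):
            continue
        p = joint(env, sound, fin)
        den += p
        if fin == fin_value:
            num += p
    return num / den

# What-if analysis: probability of high financial performance given high vs. low
# environment protection.
print(query(1, {"env": 1}))
print(query(1, {"env": 0}))
```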

Influence of Modelling Approaches of Diffusion Coefficients on Atmospheric Dispersion Factors (확산계수의 모델링방법이 대기확산인자에 미치는 영향)

  • Hwang, Won Tae;Kim, Eun Han;Jeong, Hae Sun;Jeong, Hyo Joon;Han, Moon Hee
    • Journal of Radiation Protection and Research, v.38 no.2, pp.60-67, 2013
  • A diffusion coefficient is an important parameter in the prediction of atmospheric dispersion using a Gaussian plume model, and its modelling approach varies. In this study, the dispersion coefficients recommended by the U.S. Nuclear Regulatory Commission's (U.S. NRC's) regulatory guide and the Canadian Nuclear Safety Commission's (CNSC's) regulatory guide, and those used in the probabilistic accident consequence analysis codes MACCS and MACCS2, were investigated. Based on the atmospheric dispersion model for a hypothetical accidental release recommended by the U.S. NRC, their influence on the atmospheric dispersion factor was discussed. It was found that diffusion coefficients are basically predicted from the Pasquill-Gifford curves, but various curve-fitting equations are recommended or used. The lateral dispersion coefficient is corrected for the additional spread due to plume meandering in all models; however, the modelling approaches differ distinctly. Moreover, the vertical dispersion coefficient is corrected for the additional plume spread due to surface roughness in all models except the U.S. NRC's recommendation. For a specified surface roughness, the atmospheric dispersion factors differed by up to approximately a factor of 4 depending on the modelling approach for the dispersion coefficient. For the same model, the atmospheric dispersion factors differed by a factor of 2 to 3 depending on surface roughness.
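
For context, the sketch below evaluates the standard Gaussian-plume, ground-level centerline atmospheric dispersion factor (chi/Q) with power-law dispersion coefficients of the Pasquill-Gifford type. The coefficient values are placeholders, not the U.S. NRC, CNSC, or MACCS/MACCS2 parameterizations compared in the paper.

```python
import math

def sigma_power_law(x_m, a, b):
    """Power-law fit sigma = a * x**b (x in metres); the coefficients here are
    placeholders, not an actual regulatory or code parameterization."""
    return a * x_m ** b

def dispersion_factor(x_m, u_ms, release_height_m,
                      ay=0.22, by=0.80, az=0.06, bz=0.85):
    """Ground-level, centerline atmospheric dispersion factor chi/Q (s/m^3)
    for a continuous release, per the standard Gaussian plume formula."""
    sigma_y = sigma_power_law(x_m, ay, by)
    sigma_z = sigma_power_law(x_m, az, bz)
    return (1.0 / (math.pi * sigma_y * sigma_z * u_ms)
            * math.exp(-release_height_m ** 2 / (2.0 * sigma_z ** 2)))

# Example: chi/Q at 800 m downwind for a 10 m release height and 2 m/s wind speed.
print(f"{dispersion_factor(800.0, 2.0, 10.0):.3e} s/m^3")
```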

Risk-Targeted Seismic Performance of Steel Ordinary Concentrically Braced Frames Considering Seismic Hazard (지진재해도를 고려한 철골 보통중심가새골조의 위험도기반 내진성능)

  • Shin, Dong-Hyeon;Hong, Suk-Jae;Kim, Hyung-Joon
    • Journal of the Computational Structural Engineering Institute of Korea, v.30 no.5, pp.371-380, 2017
  • The risk-targeted seismic design concept was first included in ASCE/SEI 7-10 to address problems with the uniform-hazard-based seismic design concept, which was constructed without explicitly considering probabilistic uncertainties in the collapse capacities of structures. However, this concept is not yet reflected in the current Korean building code (KBC) because of insufficient strong-earthquake data recorded on the Korean peninsula and limited information on the collapse capacities of structures. This study evaluates the risk-targeted seismic performance of steel ordinary concentrically braced frames (OCBFs). To do this, the collapse capacities of prototype steel OCBFs are assessed with various analysis parameters, including building location, building height, and soil condition. The seismic hazard curves are developed using an empirical spectral shape prediction model that is capable of reflecting the characteristics of earthquake records. The collapse probabilities of the prototype steel OCBFs located in major Korean cities are then evaluated using the risk integral concept. The results show that the analysis parameters considerably influence the collapse probabilities of steel OCBFs. The collapse probabilities of taller steel OCBFs exceed the target seismic risk of 1 percent in 50 years, which suggests that a height limitation on steel OCBFs should be considered for the future KBC.
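
The risk integral concept mentioned above can be sketched as convolving a lognormal collapse fragility with a seismic hazard curve to obtain the mean annual collapse frequency, and then the 50-year collapse probability. The hazard curve and fragility parameters below are hypothetical, not those developed for the prototype OCBFs in the study.

```python
import math

def lognormal_cdf(x, median, beta):
    """Collapse fragility: probability of collapse given spectral acceleration Sa = x."""
    return 0.5 * (1.0 + math.erf(math.log(x / median) / (beta * math.sqrt(2.0))))

def hazard(sa, k0=1.0e-4, k=2.5):
    """Hypothetical power-law hazard curve: mean annual frequency of exceeding Sa."""
    return k0 * sa ** (-k)

def annual_collapse_frequency(median=1.2, beta=0.5, sa_min=0.01, sa_max=5.0, n=2000):
    """Risk integral lambda_c = integral of P(collapse|Sa) * |d lambda / d Sa| dSa,
    evaluated with a simple trapezoidal scheme."""
    dsa = (sa_max - sa_min) / n
    lam = 0.0
    for i in range(n):
        sa0, sa1 = sa_min + i * dsa, sa_min + (i + 1) * dsa
        slope = abs(hazard(sa1) - hazard(sa0)) / dsa          # |d lambda / d Sa|
        frag = 0.5 * (lognormal_cdf(sa0, median, beta) + lognormal_cdf(sa1, median, beta))
        lam += frag * slope * dsa
    return lam

lam_c = annual_collapse_frequency()
p50 = 1.0 - math.exp(-50.0 * lam_c)            # Poisson assumption over 50 years
print(f"annual collapse frequency = {lam_c:.2e}, 50-year collapse probability = {p50:.3%}")
```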

Analysis and Prediction for Spatial Distribution of Functional Feeding Groups of Aquatic Insects in the Geum River (금강 수계 수서곤충 섭식기능군의 공간분포 분석 및 예측)

  • Kim, Ki-Dong;Park, Young-Jun;Nam, Sang-Ho
    • Journal of the Korean Association of Geographic Information Studies, v.15 no.1, pp.99-118, 2012
  • The aim of this study is to define the correlation between the spatial distribution characteristics of FFGs (Functional Feeding Groups) of aquatic insects and related environmental factors in the Geum River, based on the theory of the RCC (River Continuum Concept). For that objective, we used the SMRA (Stepwise Multiple Regression Analysis) method to analyze the relationship between the distribution of aquatic insects and the physical and chemical factors that may affect their habitat in the study area. Then, a probabilistic method, the Frequency Ratio Model (FRM), and the spatial analysis functions of GIS were applied to produce a predictive distribution map of the biotic community, considering its distribution characteristics with the environmental factors as related variables. In the SMRA results, the coefficients of determination for elevation, stream width, flow velocity, conductivity, temperature, and percentage of sand were higher than 0.5, so these six environmental factors were considered the major factors affecting the distribution characteristics of aquatic insects. Finally, we calculated the RMSE (Root Mean Square Error) between the predicted distribution map and prior survey data from other studies to verify the results. The RMSE values ranged from 0.1892 to 0.4242 across the FFGs, indicating high reliability of the results. The results of this study may be used to develop a new assessment method for aquatic ecosystems based on the macroinvertebrate community and may also serve as preliminary data for the conservation and restoration of stream habitats.
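
A minimal sketch of the Frequency Ratio Model follows: the frequency ratio of a factor class is the share of occurrence records in that class divided by the share of the study area in that class, and a cell's prediction index is the sum of the ratios of its classes across factors. The class counts below are made up for illustration, not the Geum River survey data.

```python
# Frequency Ratio Model (FRM) sketch for one environmental factor (e.g., elevation class).

# occurrences[c]: number of aquatic-insect occurrence records in class c (placeholder values)
# cells[c]: number of grid cells in the study area belonging to class c (placeholder values)
occurrences = {"low": 120, "mid": 60, "high": 20}
cells = {"low": 4000, "mid": 5000, "high": 3000}

total_occ = sum(occurrences.values())
total_cells = sum(cells.values())

frequency_ratio = {
    c: (occurrences[c] / total_occ) / (cells[c] / total_cells)
    for c in occurrences
}
print(frequency_ratio)   # FR > 1 means the class is favourable for the group

# A grid cell's prediction index is the sum of the FR values of its classes
# across all factors (elevation, stream width, flow velocity, conductivity, ...).
def prediction_index(cell_classes, fr_tables):
    """cell_classes: {factor: class}; fr_tables: {factor: {class: FR}}."""
    return sum(fr_tables[f][c] for f, c in cell_classes.items())

fr_tables = {"elevation": frequency_ratio}      # a single factor kept for brevity
print(prediction_index({"elevation": "low"}, fr_tables))
```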

Simulation-Based Stochastic Markup Estimation System (S²ME) (시뮬레이션을 기반(基盤)으로 하는 영업이윤율(營業利潤率) 추정(推定) 시스템)

  • Yi, Chang-Yong;Kim, Ryul-Hee;Lim, Tae-Kyung;Kim, Wha-Jung;Lee, Dong-Eun
    • Proceedings of the Korean Institute of Building Construction Conference, 2007.11a, pp.109-113, 2007
  • This paper introduces a system, the Simulation-based Stochastic Markup Estimation System (S²ME), for estimating the optimum markup for a project. The system was designed and implemented to better represent the real-world system involved in construction bidding. Findings obtained from an analysis of the assumptions used in previous quantitative markup estimation methods were incorporated to improve the accuracy and predictability of S²ME. The existing methods rest on four categories of assumptions: (1) the number and identity of competitors are known; (2) a fictitious "typical competitor" is assumed for ease of computation; (3) the ratio of bid price to cost estimate (B/C) is assumed to follow a normal distribution; and (4) the deterministic output obtained from the probabilistic equations of existing models is assumed to be acceptable. However, these assumptions compromise the accuracy of prediction, since in practice the bidding patterns of bidders in competitive bidding are random. To compensate for the inaccuracy introduced by these assumptions, a bidding project is randomly selected from the pool of the historical bidding database in each simulation run. The probability of winning the bid is computed using the profiles of the competitors appearing in the selected bidding record. The expected profit and the probability of winning the bid are calculated by randomly selecting a bidding record in each iteration of the simulation experiment, under the assumption that the bidding patterns retained in the historical bidding DB will recur. The existing deterministic computation was converted into a stochastic model using simulation modeling and analysis techniques as follows: (1) estimating the probability distribution functions of competitors' B/C ratios from the historical bidding DB; (2) analyzing the sensitivity of expected profit to markup increments using both the normal distribution and the actual probability distribution estimated by distribution fitting; and (3) estimating the maximum expected profit and the optimum markup range. In the case study, the best-fitting probability distribution was estimated from the historical bidding DB, which retains the competitors' bidding behavior, improving the reliability of the simulation output.
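
A minimal sketch of the simulation idea follows: sample competitors' B/C ratios from fitted distributions, estimate the probability of submitting the lowest bid at a given markup, and scan markups for the one that maximizes expected profit (markup × win probability). The competitor distributions and values below are placeholders, not those fitted from the historical bidding DB.

```python
import random

# Hypothetical (mean, std) of normal B/C distributions for three competitors.
# In the paper these come from the historical bidding DB via distribution fitting.
competitor_bc = {
    "A": (1.05, 0.04),
    "B": (1.08, 0.06),
    "C": (1.03, 0.05),
}

def win_probability(markup, n_runs=20000, rng=random.Random(42)):
    """Estimate P(lowest bid) when we bid cost * (1 + markup) against sampled competitors."""
    wins = 0
    for _ in range(n_runs):
        our_bc = 1.0 + markup
        competitor_bids = [max(rng.gauss(mu, sd), 0.5) for mu, sd in competitor_bc.values()]
        if our_bc < min(competitor_bids):
            wins += 1
    return wins / n_runs

# Expected profit (as a fraction of cost) = markup * P(win); scan markups for the optimum.
best = max(((m / 100.0, (m / 100.0) * win_probability(m / 100.0)) for m in range(0, 16)),
           key=lambda t: t[1])
print(f"optimum markup ~ {best[0]:.2%}, expected profit ~ {best[1]:.3%} of cost")
```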

A study on reliability analysis model of the repair and replacement cycle of a building which utilizes Monte Carlo Simulation (몬테카를로 시뮬레이션을 활용한 건축물 수선교체주기 신뢰성 분석 모델에 관한 연구)

  • Kim, Jong-Rok;Jung, Young-Han;Son, Jae-Ho
    • Journal of the Korea Institute of Building Construction, v.10 no.2, pp.41-50, 2010
  • This study presents a model that enables a reliability analysis of the repair and replacement cycle of a building by using background repair and replacement data and expert opinion as foundation data and applying Monte Carlo Simulation. The model provides the timing of repair and replacement of building elements on a yearly basis, and supports advance prediction of repair and replacement demand and the associated expenses when planning the maintenance of a building. In addition, the model can significantly reduce the risks to the building owner with regard to maintenance decisions. It also provides an engineering basis for a person in charge of maintaining large-scale building assets who has difficulty making repair and replacement decisions because of a lack of background data to support a long-term repair and replacement policy. In summary, the study results can be divided into three parts. First, a method of estimating the repair and replacement cycle that can keep pace with the development of construction systems was developed. Second, a probabilistic methodology that can quantify the risk of the repair and replacement cycle was proposed. Third, the proposed model can be used to support designers and constructors in making decisions for the life cycle plan of a building during a construction project.
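
As a rough sketch of the Monte Carlo idea, the code below samples a repair/replacement cycle from a triangular distribution standing in for background data plus expert opinion, then reports the probability that replacement is needed within a planning horizon along with a few cycle percentiles. The distribution parameters are placeholders, not values from the paper.

```python
import random

def simulate_cycles(low, mode, high, n=50000, rng=random.Random(0)):
    """Sample repair/replacement cycles (years) from a triangular distribution
    representing background data plus expert opinion (placeholder parameters)."""
    return [rng.triangular(low, high, mode) for _ in range(n)]

cycles = simulate_cycles(low=8.0, mode=12.0, high=20.0)   # e.g., a roof waterproofing layer

horizon = 10.0                                            # planning horizon in years
prob_within_horizon = sum(c <= horizon for c in cycles) / len(cycles)

cycles.sort()
p10, p50, p90 = (cycles[int(q * len(cycles))] for q in (0.10, 0.50, 0.90))

print(f"P(replacement needed within {horizon:.0f} years) = {prob_within_horizon:.2%}")
print(f"replacement cycle percentiles: 10% {p10:.1f} y, 50% {p50:.1f} y, 90% {p90:.1f} y")
```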

Long Range Forecast of Garlic Productivity over S. Korea Based on Genetic Algorithm and Global Climate Reanalysis Data (전지구 기후 재분석자료 및 인공지능을 활용한 남한의 마늘 생산량 장기예측)

  • Jo, Sera;Lee, Joonlee;Shim, Kyo Moon;Kim, Yong Seok;Hur, Jina;Kang, Mingu;Choi, Won Jun
    • Korean Journal of Agricultural and Forest Meteorology, v.23 no.4, pp.391-404, 2021
  • This study developed a long-term prediction model for the potential yield of garlic based on a genetic algorithm (GA), utilizing global climate reanalysis data. The GA is used to extract the inherent signals in the global climate reanalysis data that are directly and indirectly connected with garlic yield potential. Our results indicate that both the deterministic and probabilistic forecasts reasonably capture the inter-annual variability of crop yields, with temporal correlation coefficients significant at the 99% confidence level and superior categorical forecast skill, with a hit rate of 93.3% for 2 × 2 and 73.3% for 3 × 3 contingency tables. Furthermore, the GA method, which considers both linear and non-linear relationships between predictors and predictands, shows superior forecast skill in terms of both stability and skill scores compared with a linear method. Since the model can predict the potential yield before the start of farming, it is expected to help establish long-term plans to stabilize the demand and price of agricultural products and to prepare countermeasures for possible problems in advance.
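
A compact sketch of GA-based predictor selection follows: binary chromosomes mark which climate indices enter a linear yield model, and the fitness is the temporal correlation of the fit minus a small complexity penalty (a crude stand-in for proper cross-validation). The data are synthetic, not the reanalysis predictors or garlic yields used in the study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: 30 years x 12 candidate climate indices, only a few carrying signal.
n_years, n_feat = 30, 12
X = rng.normal(size=(n_years, n_feat))
true_coef = np.zeros(n_feat)
true_coef[[1, 4, 7]] = [0.8, -0.6, 0.5]
y = X @ true_coef + rng.normal(scale=0.5, size=n_years)

def fitness(mask):
    """Temporal correlation of the least-squares fit, penalised by predictor count."""
    if mask.sum() == 0:
        return -1.0
    Xs = X[:, mask.astype(bool)]
    coef, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    corr = float(np.corrcoef(y, Xs @ coef)[0, 1])
    return corr - 0.02 * int(mask.sum())

def evolve(pop_size=40, generations=60, p_mut=0.05):
    pop = rng.integers(0, 2, size=(pop_size, n_feat))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_feat)                         # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_feat) < p_mut                     # bit-flip mutation
            child[flip] = 1 - child[flip]
            children.append(child)
        pop = np.array(children)
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()], scores.max()

best_mask, best_fit = evolve()
print("selected predictor indices:", np.flatnonzero(best_mask), "penalised fitness:", round(best_fit, 3))
```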

A study on prediction method for flood risk using LENS and flood risk matrix (국지 앙상블자료와 홍수위험매트릭스를 이용한 홍수위험도 예측 방법 연구)

  • Choi, Cheonkyu;Kim, Kyungtak;Choi, Yunseok
    • Journal of Korea Water Resources Association, v.55 no.9, pp.657-668, 2022
  • When localized heavy rain occurs while river flows are already high, both the flow and the rainfall cause flood damage along rivers. As the degree of damage varies with the level of social and economic impact, sufficient forecast lead time for flood response must be secured in areas with high population and asset density. In this study, the author established a flood risk matrix using ensemble rainfall-runoff modeling and evaluated its applicability, with the goal of increasing the damage-reduction effect by securing the time required for flood response. The flood risk matrix constructs the flood damage impact level (X-axis) from flood damage data and predicts the likelihood of flood occurrence (Y-axis) from the results of ensemble rainfall-runoff modeling using LENS rainfall data and probabilistic forecasting. The author therefore introduced a method for determining the impact level of flood damage using historical flood damage data and quantitative flood damage assessment methods. The approach was compared with existing flood warning data and the damage situation at flood warning points in the Taehwa River Basin and the Hyeongsan River Basin in the Nakdong River Region. The analysis showed that it was possible to predict the timing and degree of flood risk up to three days in advance, which will aid damage-reduction activities by securing the lead time needed for flood response.
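
The flood risk matrix idea can be sketched as follows: the likelihood axis comes from the fraction of ensemble members whose simulated peak flow exceeds a flood threshold, and the matrix maps that likelihood class against a damage-based impact class. The thresholds, ensemble values, and matrix categories below are placeholders, not the LENS-based configuration used in the paper.

```python
# Flood risk matrix sketch: likelihood (Y) from ensemble exceedance probability,
# impact (X) from a damage-based classification. All numbers are placeholders.

def likelihood_level(ensemble_peaks, threshold):
    """Fraction of ensemble members whose simulated peak flow exceeds the flood
    threshold, mapped to a 4-level likelihood class."""
    p = sum(q >= threshold for q in ensemble_peaks) / len(ensemble_peaks)
    if p >= 0.6:
        return 4, p
    if p >= 0.4:
        return 3, p
    if p >= 0.2:
        return 2, p
    return 1, p

# RISK_MATRIX[likelihood][impact] -> advisory category
RISK_MATRIX = {
    1: {1: "very low", 2: "very low", 3: "low",    4: "low"},
    2: {1: "very low", 2: "low",      3: "medium", 4: "medium"},
    3: {1: "low",      2: "medium",   3: "medium", 4: "high"},
    4: {1: "low",      2: "medium",   3: "high",   4: "high"},
}

ensemble_peaks = [850, 1230, 990, 1410, 1120, 760, 1330, 1050, 1270, 1180]  # m3/s, 10 members
impact = 3                                          # impact class from damage assessment
likelihood, p = likelihood_level(ensemble_peaks, threshold=1100.0)
print(f"exceedance probability {p:.0%} -> likelihood {likelihood}, "
      f"risk = {RISK_MATRIX[likelihood][impact]}")
```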

Estimation of freeze damage risk according to developmental stage of fruit flower buds in spring (봄철 과수 꽃눈 발육 수준에 따른 저온해 위험도 산정)

  • Kim, Jin-Hee;Kim, Dae-jun;Kim, Soo-ock;Yun, Eun-jeong;Ju, Okjung;Park, Jong Sun;Shin, Yong Soon
    • Korean Journal of Agricultural and Forest Meteorology, v.21 no.1, pp.55-64, 2019
  • Flowering seasons can be advanced by climate change, which can cause abnormally warm winters. Such warm winters increase the frequency of crop damage resulting from sudden low-temperature events before and after the vegetative growth stages, e.g., the period from germination to flowering. The degree and pattern of freeze damage differ with the development stage of each individual fruit tree, even within an orchard. A critical temperature, e.g., the killing temperature, has been used to predict freeze damage under low-temperature conditions, on the assumption that such damage is associated with the development stage of the fruit flower bud. However, it is challenging to apply the critical temperature to a region where the spatial variation in temperature is considerably high. In the present study, a phenological model was used to estimate the major bud development stages, which is useful for predicting regional freeze damage risk. We also derived a linear function to calculate a probabilistic freeze risk in spring, which can quantitatively evaluate the risk level based solely on forecast weather data. We calculated the dates of freeze damage occurrence and the spatial risk distribution for the main production areas by applying the spring freeze risk function to apple, peach, and pear crops in 2018. It was predicted that the most extensive low-temperature freeze damage could have occurred on April 8. It was also found that the risk function was useful for identifying the main production areas where the greatest damage to a given crop could occur. These results suggest that freeze damage associated with low-temperature events could be reduced by providing early warnings that help growers respond to abnormal weather conditions on their farms.
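
A minimal sketch of a linear freeze-risk function follows: for a bud development stage with assumed 10% and 90% kill temperatures, the risk rises linearly from 0 to 1 as the forecast minimum temperature falls between them. The kill temperatures are illustrative placeholders, not the crop-specific values derived in the paper.

```python
def freeze_risk(t_min_forecast, t_10pct_kill, t_90pct_kill):
    """Linear freeze-risk function: 0 above the 10%-kill temperature, 1 below the
    90%-kill temperature, linear in between. The kill temperatures depend on the bud
    development stage estimated by a phenological model (placeholder values here)."""
    if t_min_forecast >= t_10pct_kill:
        return 0.0
    if t_min_forecast <= t_90pct_kill:
        return 1.0
    return (t_10pct_kill - t_min_forecast) / (t_10pct_kill - t_90pct_kill)

# Hypothetical kill temperatures (deg C) by development stage for a pome fruit.
KILL_TEMPS = {"green tip": (-7.0, -12.0), "full bloom": (-2.2, -3.9)}

for stage, (t10, t90) in KILL_TEMPS.items():
    print(stage, round(freeze_risk(-3.0, t10, t90), 2))   # forecast minimum of -3 deg C
```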

A Study on Interactions of Competitive Promotions Between the New and Used Cars (신차와 중고차간 프로모션의 상호작용에 대한 연구)

  • Chang, Kwangpil
    • Asia Marketing Journal, v.14 no.1, pp.83-98, 2012
  • In a market where new and used cars compete with each other, we run the risk of obtaining biased estimates of the cross elasticity between them if we focus only on new cars or only on used cars. Unfortunately, most previous studies on the automobile industry have focused only on new car models, without taking into account the effect of used cars' pricing policies on new cars' market shares and vice versa, resulting in inadequate prediction of reactive pricing in response to competitors' rebates or price discounts. There are some exceptions. Purohit (1992) and Sullivan (1990) looked at both new and used car markets at the same time to examine the effect of new car model launches on used car prices, but their studies are limited in that they employed the average used car prices reported in the NADA Used Car Guide instead of actual transaction prices; some of their conflicting results may be due to this problem in the data. Park (1998) recognized this problem and used actual prices in his study. His work is notable in that he investigated the qualitative effect of new car model launches on the pricing policy of used cars in terms of reinforcement of brand equity. The current work also uses actual prices, as in Park (1998), but explores the quantitative aspect of competitive price promotion between new and used cars of the same model. In this study, I develop a model that assumes that the cross elasticity between new and used cars of the same model is higher than that between new and used cars of different models. Specifically, I apply the nested logit model, which assumes car model choice at the first stage and the choice between new and used cars at the second stage. This proposed model is compared to the IIA (Independence of Irrelevant Alternatives) model, which assumes that there is no decision hierarchy and that new and used cars of different models are all substitutable at the first stage. The data for this study are drawn from the Power Information Network (PIN), an affiliate of J.D. Power and Associates. PIN collects sales transaction data from a sample of dealerships in the major metropolitan areas of the U.S. These are retail transactions, i.e., sales or leases to final consumers, excluding fleet sales and including both new car and used car sales. Each observation in the PIN database contains the transaction date, the manufacturer, model year, make, model, trim and other car information, the transaction price, consumer rebates, the interest rate, term, amount financed (when the vehicle is financed or leased), etc. I used data for compact cars sold during the period from January to June 2009. The new and used cars of the top nine selling models are included in the study: Mazda 3, Honda Civic, Chevrolet Cobalt, Toyota Corolla, Hyundai Elantra, Ford Focus, Volkswagen Jetta, Nissan Sentra, and Kia Spectra. These models accounted for 87% of category unit sales. Empirical application of the nested logit model showed that the proposed model outperformed the IIA model in both the calibration and holdout samples. The other comparison model, which assumes the choice between new and used cars at the first stage and car model choice at the second stage, turned out to be mis-specified, since the dissimilarity parameter (i.e., the inclusive or category value parameter) was estimated to be greater than 1.
Post hoc analysis based on the estimated parameters was conducted employing the modified Lanczos iterative method. This method is intuitively appealing. For example, suppose a new car offers a certain amount of rebate and gains market share at first. In response to this rebate, the used car of the same model keeps decreasing its price until it regains the lost market share and maintains the status quo; the new car settles down to a lower market share due to the used car's reaction. The method enables us to find the amount of price discount needed to maintain the status quo and the equilibrium market shares of the new and used cars. In the first simulation, I used the Jetta as the focal brand to see how its new and used cars set prices, rebates, or APR interactively, assuming that reactive cars respond to price promotions to maintain the status quo. The simulation results showed that the IIA model underestimates cross elasticities, and therefore suggests a less aggressive used car price discount in response to a new car's rebate than the proposed nested logit model. In the second simulation, I used the Elantra to reconfirm the result for the Jetta and came to the same conclusion. In the third simulation, I had the Corolla offer a $1,000 rebate to see what the best response would be for the Elantra's new and used cars. Interestingly, the Elantra's used car could maintain the status quo by offering a lower price discount ($160) than the new car ($205). In future research, it would be worth exploring the plausibility of the alternative nested logit model. For example, the NUB model, which assumes the choice between new and used cars at the first stage and brand choice at the second stage, remains a possibility even though it was rejected in the current study because of mis-specification (a dissimilarity parameter greater than 1). The NUB model may have been rejected due to true mis-specification or due to the data structure generated by typical car dealerships, where both new and used cars of the same model are displayed. Because of this, the BNU model, which assumes brand choice at the first stage and the choice between new and used cars at the second stage, may have been favored in the current study, since customers first choose a dealership (brand) and then choose between new and used cars in this market environment. However, if there were dealerships that carried both new and used cars of various models, the NUB model might fit the data as well as the BNU model; which model better describes the data is an empirical question. In addition, it would be interesting to test a probabilistic mixture of the BNU and NUB models on a new data set.
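
To illustrate the nested logit structure described above (car model choice at the first stage, new versus used at the second stage), the sketch below computes choice probabilities for two hypothetical models and shows how a rebate on a new car draws share from the used car in the same nest more than from any single alternative in the other nest when the dissimilarity parameter is below 1. The utilities and parameters are illustrative, not estimates from the PIN data.

```python
import math

def nested_logit_probs(utilities, dissimilarity):
    """Choice probabilities for a two-level nested logit.
    utilities: {nest: {alternative: systematic utility V}}; dissimilarity: {nest: lambda}.
    The values used below are illustrative, not parameters estimated from the PIN data."""
    inclusive = {m: math.log(sum(math.exp(v / dissimilarity[m]) for v in alts.values()))
                 for m, alts in utilities.items()}
    denom = sum(math.exp(dissimilarity[m] * inclusive[m]) for m in utilities)
    probs = {}
    for m, alts in utilities.items():
        p_nest = math.exp(dissimilarity[m] * inclusive[m]) / denom
        within_denom = sum(math.exp(v / dissimilarity[m]) for v in alts.values())
        for j, v in alts.items():
            probs[(m, j)] = p_nest * math.exp(v / dissimilarity[m]) / within_denom
    return probs

# Car-model choice at the first stage, new vs. used at the second stage (BNU structure).
utilities = {
    "Jetta":   {"new": 1.0, "used": 0.6},
    "Elantra": {"new": 0.9, "used": 0.7},
}
# A dissimilarity parameter in (0, 1] implies new and used cars of the same model are
# closer substitutes than cars of different models; a value above 1 signals mis-specification.
lam = {"Jetta": 0.5, "Elantra": 0.5}
for key, p in nested_logit_probs(utilities, lam).items():
    print(key, round(p, 3))

# A what-if in the spirit of the paper's rebate simulations: raising the utility of the new
# Jetta (e.g., via a rebate) takes more share from the used Jetta than from either Elantra
# alternative, reflecting the high within-nest cross elasticity.
utilities["Jetta"]["new"] += 0.3
print({k: round(p, 3) for k, p in nested_logit_probs(utilities, lam).items()})
```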
