• Title/Summary/Keyword: Stochastic analysis


A Study On The Economic Value Of Firm's Big Data Technologies Introduction Using Real Option Approach - Based On YUYU Pharmaceuticals Case - (실물옵션 기법을 이용한 기업의 빅데이터 기술 도입의 경제적 가치 분석 - 유유제약 사례를 중심으로 -)

  • Jang, Hyuk Soo;Lee, Bong Gyou
    • Journal of Internet Computing and Services
    • /
    • v.15 no.6
    • /
    • pp.15-26
    • /
    • 2014
  • This study focuses on the economic value of big data technologies, using a real options model with the stock price of a company adopting big data technology to price the incremental assessed value. The Generalized Method of Moments (GMM) is used to estimate the stochastic process of the company's stock price and to extract the incremental share attributable to the big data technology. The option value is obtained from the Black-Scholes partial differential equation, which is solved by finite difference numerical methods to estimate the economic value of introducing the technology. As a result, the option value of the big data technology investment is 38.5 billion won under the assumption that the investment cost is 50 million won, and the time value is about 1 million won. Thus, the introduction of big data technology creates a substantial effect on corporate profits, is valuable, and carries an additional time value. Sensitivity analysis shows that a lower underlying asset value decreases the option value, while a lower investment cost increases it. The option value is not sensitive to volatility, owing to characteristics of big data technology adoption such as low stock volatility and its introduction period.
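The finite-difference treatment of the Black-Scholes PDE described in this abstract can be sketched as follows. This is a minimal explicit scheme for a European call, with the investment cost playing the role of the strike; all parameter values are illustrative placeholders, not the figures from the paper.

```python
import numpy as np

def bs_call_fdm(s_max=200.0, strike=50.0, r=0.05, sigma=0.2, horizon=1.0,
                n_s=200, n_t=2000):
    """Explicit finite-difference solution of the Black-Scholes PDE
    for a European call (hypothetical parameters)."""
    dt = horizon / n_t
    s = np.linspace(0.0, s_max, n_s + 1)
    v = np.maximum(s - strike, 0.0)            # terminal payoff
    j = np.arange(1, n_s)                      # interior node indices
    a = 0.5 * dt * (sigma**2 * j**2 - r * j)   # explicit-scheme weights
    b = 1.0 - dt * (sigma**2 * j**2 + r)
    c = 0.5 * dt * (sigma**2 * j**2 + r * j)
    for n in range(1, n_t + 1):                # march backward from maturity
        v[1:-1] = a * v[:-2] + b * v[1:-1] + c * v[2:]
        v[0] = 0.0                             # option worthless at S = 0
        v[-1] = s_max - strike * np.exp(-r * n * dt)  # deep in-the-money bound
    return s, v

s, v = bs_call_fdm()
price = float(np.interp(50.0, s, v))           # value at the current asset level
```

Note that the explicit scheme is only stable when the time step is small relative to the spatial grid; the choice of `n_t` above respects that constraint for these parameters.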

Time-series Mapping and Uncertainty Modeling of Environmental Variables: A Case Study of PM10 Concentration Mapping (시계열 환경변수 분포도 작성 및 불확실성 모델링: 미세먼지(PM10) 농도 분포도 작성 사례연구)

  • Park, No-Wook
    • Journal of the Korean Earth Science Society
    • /
    • v.32 no.3
    • /
    • pp.249-264
    • /
    • 2011
  • A multi-Gaussian kriging approach extended to the space-time domain is presented for uncertainty modeling as well as time-series mapping of environmental variables. Within a multi-Gaussian framework, normal-score-transformed environmental variables are first decomposed into deterministic trend and stochastic residual components. After local temporal trend models are constructed, the parameters of the models are estimated and interpolated in space. Space-time correlation structures of the stationary residual components are quantified using a product-sum space-time variogram model. The conditional cumulative distribution function (ccdf) is modeled at all grid locations using this space-time variogram model and space-time kriging. Finally, E-type estimates and conditional variances are computed from the ccdf models for spatial mapping and uncertainty analysis, respectively. The proposed approach is illustrated through a case study of time-series particulate matter ($PM_{10}$) concentration mapping in Incheon Metropolitan City using monthly $PM_{10}$ concentrations at 13 stations over 3 years. It is shown that the proposed approach generates reliable time-series $PM_{10}$ concentration maps with less mean bias and better prediction capability than conventional spatial-only ordinary kriging. It is also demonstrated that the conditional variances and the probability of exceeding a certain threshold value are useful information sources for interpretation.
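The product-sum space-time variogram mentioned in this abstract combines marginal spatial and temporal variograms as gamma(hs, ht) = gamma_s(hs) + gamma_t(ht) - k*gamma_s(hs)*gamma_t(ht). A minimal sketch with spherical marginal models follows; the sills, ranges, and interaction parameter `k` are hypothetical, not fitted values from the paper.

```python
import numpy as np

def spherical(h, sill, rng):
    """Spherical variogram model: rises to `sill` at range `rng`."""
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h < rng, g, sill)

def product_sum(hs, ht, gs_sill=1.0, gs_rng=20.0,
                gt_sill=1.0, gt_rng=6.0, k=0.5):
    """Product-sum space-time variogram:
    gamma(hs,ht) = gamma_s(hs) + gamma_t(ht) - k*gamma_s(hs)*gamma_t(ht)."""
    gs = spherical(hs, gs_sill, gs_rng)     # spatial component
    gt = spherical(ht, gt_sill, gt_rng)     # temporal component
    return gs + gt - k * gs * gt
```

In a kriging system, this function would supply the space-time (co)variance entries for pairs of station-month observations; the interaction term `k` controls how much space and time interact beyond a purely additive model.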

Robust Design Method for Complex Stochastic Inventory Model

  • Hwang, In-Keuk;Park, Dong-Jin
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1999.04a
    • /
    • pp.426-426
    • /
    • 1999
  • There are many sources of uncertainty in a typical production and inventory system. There is uncertainty as to how many items customers will demand during the next day, week, month, or year, and there is uncertainty about delivery times of the product. Uncertainty exacts a toll from management in a variety of ways: a spurt in demand or a delay in production may lead to stockouts, with the potential for lost revenue and customer dissatisfaction. Firms typically hold inventory to provide protection against uncertainty; a cushion of inventory on hand allows management to face unexpected demands or delays in delivery with a reduced chance of incurring a stockout. The proposed strategies are used for the design of a probabilistic inventory system. In the traditional approach to the design of an inventory system, the goal is to find the best setting of the various inventory control policy parameters, such as the reorder level, review period, and order quantity, that minimizes the total inventory cost. The goals of the analysis need to be defined so that robustness becomes an important design criterion, and one has to conceptualize and identify appropriate noise variables. There are two main goals for the inventory policy design: one is to minimize the average inventory cost and the stockouts; the other is to minimize the variability of the average inventory cost and the stockouts. The total average inventory cost is the sum of three components: the ordering cost, the holding cost, and the shortage cost. The shortage cost includes the cost of lost sales, loss of goodwill, customer dissatisfaction, and so on. The noise factors for this design problem are identified as the mean demand rate and the mean lead time; both the demand and the lead time are assumed to be normal random variables.
Robustness for this inventory system is therefore interpreted as insensitivity of the average inventory cost and the stockouts to uncontrollable fluctuations in the mean demand rate and mean lead time. To design this inventory system for robustness, the concept of utility theory is used. Utility theory is an analytical method for making a decision about an action to take, given a set of multiple criteria on which the decision is to be based. It is appropriate for designs involving attributes of different scales, such as demand rate and lead time, because it maps the attributes onto a common zero-to-one scale, with higher preference modeled by a higher rank. Using utility theory, three design strategies for the robust inventory system are developed: a distance strategy, a response strategy, and a priority-based strategy.
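The stochastic inventory setting described above can be sketched by simulating a continuous-review (Q, r) policy with normally distributed demand and lead time, accumulating the three cost components. The policy parameters and cost coefficients below are hypothetical placeholders, not values from the paper.

```python
import numpy as np

def simulate_qr(q=400.0, r=150.0, mu_d=20.0, sd_d=5.0, mu_l=5.0, sd_l=1.0,
                days=10000, hold_c=0.1, short_c=5.0, order_c=50.0, seed=0):
    """Monte Carlo sketch of a continuous-review (Q, r) inventory policy;
    demand and lead time are normal noise factors, costs are hypothetical."""
    rng = np.random.default_rng(seed)
    inv, on_order = r + q, 0.0
    pipeline = []                        # (days-until-arrival, amount)
    hold = short = orders = 0.0
    for _ in range(days):
        pipeline = [(d - 1, amt) for d, amt in pipeline]
        arrived = [x for x in pipeline if x[0] <= 0]
        pipeline = [x for x in pipeline if x[0] > 0]
        for _, amt in arrived:           # receive outstanding orders
            inv += amt
            on_order -= amt
        demand = max(0.0, rng.normal(mu_d, sd_d))
        if demand > inv:
            short += demand - inv        # unmet demand is lost
            inv = 0.0
        else:
            inv -= demand
        hold += hold_c * inv
        if inv + on_order <= r:          # inventory position hits reorder point
            lead = max(1, int(round(rng.normal(mu_l, sd_l))))
            pipeline.append((lead, q))
            on_order += q
            orders += 1
    avg_cost = (hold + short_c * short + order_c * orders) / days
    return avg_cost, short / days
```

Rerunning this simulation while perturbing `mu_d` and `mu_l` gives exactly the kind of cost and stockout responses over the noise space that the robust design strategies evaluate.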


An empirical study for a better curriculum reform of statistical correlation based on an abduction (중등학교 상관관계 지도 내용 개선을 위한 가추적 실증 연구)

  • Lee, Young-Ha;Kim, So-Hyun
    • Journal of Educational Research in Mathematics
    • /
    • v.22 no.3
    • /
    • pp.371-386
    • /
    • 2012
  • This research rests on two premises: first, that the 2007 mathematics curriculum reform in Korea would have been better as a revision rather than a deletion; and second, that every school curriculum should support the sound development of all six types of logical concepts that appear in Piaget's theory of cognitive development. What our mathematics curriculum introduced as correlation is not the concept among those six that Piaget had in mind. To find the reason for this difference, we examine how the term is used across academic disciplines such as pedagogy, psychology, and statistics through their college textbooks, because we suppose that the mismatch between Piaget's and the curriculum's notions of correlation is due to miscommunication among scholars of different disciplines. Drawing on this analytical study based on abduction, and to gather ideas for a potential future school statistics curriculum when correlation is restored (as we believe it should be), we also briefly examine two foreign high school mathematics textbooks. We found that the concept of correlation in pedagogy encompasses all kinds of relations, whereas in statistics it is much narrower. A main result: a careful distinction among similar concepts, such as linear relationship (correlation), stochastic change along conditions (dependence), and central comparison (other relations), is needed for the potential future curriculum. And if the new curriculum includes linear correlation, then we strongly recommend including the regression line so as to connect it with the chapter on linear functions.
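The recommended connection between linear correlation and the regression line can be shown concretely. The paired data below are hypothetical classroom-style values invented for illustration.

```python
import numpy as np

# Hypothetical paired data (e.g., study hours vs. test score)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([52.0, 55.0, 61.0, 60.0, 68.0, 71.0])

r = np.corrcoef(x, y)[0, 1]             # linear correlation coefficient
slope, intercept = np.polyfit(x, y, 1)  # least-squares regression line
```

The correlation coefficient measures the strength of the linear relationship, while the regression line gives the linear function itself, so the two topics connect directly to a linear-functions chapter; the fitted line always passes through the point of means.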


Response scaling factors for nonlinear response analysis of MDOF system (다층건물의 비선형 반응해석을 위한 반응수정계수)

  • 한상환;이리형
    • Computational Structural Engineering
    • /
    • v.8 no.3
    • /
    • pp.103-111
    • /
    • 1995
  • Evaluating the nonlinear response of a MDOF system under dynamic stochastic loads such as seismic excitation usually requires excessive computational effort. To alleviate this difficulty, an approximation is developed in which the inelastic MDOF system is replaced by a simple equivalent nonlinear system (ENS). The ENS retains the most important properties of the original system, such as the dynamic characteristics of the first two modes and the global yielding behavior of the MDOF system. The system response is described by the maximum global (building) and local (interstory) drifts. The equivalency is achieved by two response scaling factors, a global response scaling factor $R_G$ and a local response scaling factor $R_L$, applied to the responses of the ENS to match those of the original MDOF system. These response scaling factors are obtained as functions of ductility and the mass participation factors of the first two modes of structures through extensive regression analyses based on the responses of the MDOF system and the ENS to actual ground accelerations recorded in past earthquakes. To develop the ENS with the two response scaling factors, special moment-resisting steel frames are considered. The response scaling factors are then applied to the response of the ENS to obtain the nonlinear response of the MDOF system.
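The cheap nonlinear analysis at the heart of such an equivalent-system approach is the time-history response of a simple yielding oscillator. Below is a minimal central-difference solver for an elastoplastic SDOF system, a stand-in for the ENS; the mass, stiffness, yield force, and the harmonic base motion are all illustrative assumptions, not the paper's frames or ground motions.

```python
import numpy as np

def elastoplastic_response(ag, dt, m=1.0, k=400.0, fy=15.0, zeta=0.05):
    """Central-difference displacement response of an elastoplastic
    SDOF oscillator to base acceleration ag (illustrative parameters)."""
    c = 2.0 * zeta * np.sqrt(k * m)      # viscous damping coefficient
    n = len(ag)
    u = np.zeros(n)
    fs = 0.0                             # elastoplastic restoring force
    a0 = (-m * ag[0] - fs) / m           # initial acceleration (at rest)
    u_prev = 0.5 * dt**2 * a0            # fictitious u_{-1} (u0 = v0 = 0)
    for i in range(n - 1):
        u_next = (dt**2 * (-m * ag[i] - fs) + 2.0 * m * u[i]
                  - (m - 0.5 * c * dt) * u_prev) / (m + 0.5 * c * dt)
        # incremental elastic force update, capped at the yield force
        fs = float(np.clip(fs + k * (u_next - u[i]), -fy, fy))
        u_prev, u[i + 1] = u[i], u_next
    return u

t = np.arange(0.0, 4.0, 0.005)
ag = 8.0 * np.sin(2.0 * np.pi * 3.0 * t)   # synthetic harmonic base motion
u = elastoplastic_response(ag, dt=0.005)
```

Running many such analyses over recorded accelerograms, and regressing the ratio of MDOF to ENS peak drifts against ductility and modal mass participation, is how scaling factors like $R_G$ and $R_L$ are calibrated.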


Evaluation of Response Variability of Functionally Graded Material Beam with Varying Sectional Area due to Spatial Randomness in Elastic Modulus along Axial Direction (기능경사재료 변단면 보에서 축방향 탄성계수의 공간적 불확실성에 의한 응답변화도 평가)

  • Noh, Hyuk Chun
    • Journal of the Computational Structural Engineering Institute of Korea
    • /
    • v.27 no.3
    • /
    • pp.199-206
    • /
    • 2014
  • In this paper, a scheme to evaluate the response variability of a functionally graded material (FGM) beam with varying sectional area is presented. The randomness is assumed to appear in the elastic modulus, in a spatial domain along the beam axis. Functionally graded materials are categorized as composite materials, but without the drawbacks of conventional layered composites, namely delamination and cracking caused by abrupt changes in material properties between layers. An FGM is produced by gradual solidification through the thickness direction, which endows it with continuously varying material properties and hence smooth behavior. However, because of difficulties in tailoring the gradients, uncertainty in the material properties is unavoidable. The elastic modulus at the center section is assumed to be random in the spatial domain along the beam axis. Introducing random variables defined in terms of stochastic integration, the first and second moments of the responses are evaluated. The proposed scheme is verified by Monte Carlo simulation based on random samples generated with the spectral representation scheme. The response variability as a function of correlation distance, and the effects of material and geometrical parameters on the response variability, are investigated in detail. The efficiency of the proposed scheme is also addressed by comparing its analysis time with that of MCS.
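The spectral representation scheme used for the Monte Carlo verification generates realizations of a random field as a sum of cosines with random phases. A minimal 1D sketch follows, which could model the elastic-modulus fluctuation along the beam axis; the standard deviation, correlation length, and exponential correlation model are assumptions for illustration, not the paper's settings.

```python
import numpy as np

def spectral_field(x, sigma=0.1, corr_len=2.0, n_w=256, kappa_max=20.0, seed=1):
    """1D zero-mean Gaussian random field via the spectral representation
    method, with exponential correlation R(xi) = sigma^2 exp(-|xi|/corr_len)."""
    rng = np.random.default_rng(seed)
    dk = kappa_max / n_w
    kappa = (np.arange(n_w) + 0.5) * dk
    # one-sided spectral density of the exponential correlation model
    g = 2.0 * sigma**2 * corr_len / (np.pi * (1.0 + (corr_len * kappa)**2))
    phi = rng.uniform(0.0, 2.0 * np.pi, n_w)   # random phase angles
    amp = np.sqrt(2.0 * g * dk)
    return np.sum(amp[:, None] * np.cos(kappa[:, None] * x[None, :]
                                        + phi[:, None]), axis=0)

x = np.linspace(0.0, 10.0, 21)       # points along the beam axis
f = spectral_field(x)                # one modulus-fluctuation realization
```

Each realization would perturb the elastic modulus, e.g. E(x) = E0*(1 + f(x)), and repeated beam analyses over such samples give the Monte Carlo estimates of response mean and variance against which the analytical scheme is checked.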

Modeling Virtual Ecosystems that Consist of Artificial Organisms and Their Environment (인공생명체와 그들을 둘러싸는 환경으로 구성 되어지는 가상생태계 모델링)

  • Lee, Sang-Hee
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.12 no.2
    • /
    • pp.122-131
    • /
    • 2010
  • This paper introduces the concept of a virtual ecosystem and reports three mathematical approaches that could be widely used to construct such an ecosystem, along with examples: (1) a molecular dynamics simulation approach for animal flocking behavior, (2) a stochastic lattice model approach for termite colony behavior, and (3) a rule-based cellular automata approach for biofilm growth. The ecosystem considered in this study consists of artificial organisms and their environment. Each organism in the ecosystem is an agent that interacts autonomously with the dynamic environment, including the other organisms within it. Each of the three types of model successfully accounted for its corresponding ecosystem. In order to accurately mimic a natural ecosystem, a virtual ecosystem needs to take many ecological variables into account; however, doing so is likely to introduce excess complexity and nonlinearity into the analysis of the virtual ecosystem's dynamics. Nonetheless, the development of a virtual ecosystem is important, because it can provide possible explanations for various phenomena such as environmental disturbances and disasters, and can also give insights into ecological functions from the individual to the community level from a synthetic viewpoint. As an example of how lower and higher levels in an ecosystem can be connected, this paper also briefly discusses the application of the second model to the simulation of a termite ecosystem and the influence of climate change on that ecosystem.
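A rule-based stochastic cellular automaton of the kind listed as approach (3) can be sketched in a few lines. The growth rule below (an empty site adjacent to an occupied site colonizes with a fixed probability per step) is a generic illustration, not the paper's actual biofilm rules.

```python
import numpy as np

def grow_biofilm(size=61, steps=20, p=0.3, seed=42):
    """Stochastic cellular automaton sketch of biofilm growth: an empty
    site with an occupied von Neumann neighbor colonizes with probability p."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=bool)
    grid[size // 2, size // 2] = True          # single initial colonizer
    for _ in range(steps):
        occ = grid.astype(int)
        nb = (np.roll(occ, 1, 0) + np.roll(occ, -1, 0)
              + np.roll(occ, 1, 1) + np.roll(occ, -1, 1))
        colonize = (~grid) & (nb > 0) & (rng.random(grid.shape) < p)
        grid |= colonize                       # update the lattice in sync
    return grid

colony = grow_biofilm()
```

Each agent-like site responds only to its local neighborhood, yet the lattice as a whole develops a rough, compact colony, which is how such models connect individual-level rules to community-level pattern.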

Nonlinear Autoregressive Modeling of Southern Oscillation Index (비선형 자기회귀모형을 이용한 남방진동지수 시계열 분석)

  • Kwon, Hyun-Han;Moon, Young-Il
    • Journal of Korea Water Resources Association
    • /
    • v.39 no.12 s.173
    • /
    • pp.997-1012
    • /
    • 2006
  • We present a nonparametric stochastic approach for the SOI (Southern Oscillation Index) series that uses a nonlinear methodology called Nonlinear AutoRegressive (NAR) modeling, based on a conditional kernel density function and CAFPE (Corrected Asymptotic Final Prediction Error) lag selection. The fitted linear AR model exhibits heteroscedasticity, and in addition the BDS (Brock-Dechert-Scheinkman) test rejects it; hence, we applied the NAR model to the SOI series. Lags 1, 2, and 4 are identified as appropriate, and the conditional mean function is estimated. The Portmanteau test shows no autocorrelation in the residuals; however, the null hypotheses of normality and of no heteroscedasticity are rejected by the Jarque-Bera test and the ARCH-LM test, respectively. CAFPE lag selection for the conditional standard deviation function yields lags 3, 8, and 9. After the conditional standard deviation analysis, all i.i.d. assumptions on the residuals are accepted; in particular, the BDS statistic is accepted at the 95% and 99% significance levels. Finally, we split the SOI data into a sample for estimating the model and a sample for out-of-sample prediction, conducting one-step-ahead forecasts for the last 97 values (15%). The NAR model shows an MSEP of 0.5464, which is 7% lower than that of the linear model. These results support the relevance of the NAR model, and the nonparametric NAR model is more encouraging than a linear one for reflecting the nonlinearity of the SOI series.
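The kernel-based conditional mean underlying a NAR model can be sketched with a Nadaraya-Watson estimator. The example below estimates E[X_t | X_{t-1} = x] from a synthetic AR(1)-like series; the single lag, the Gaussian kernel, and the bandwidth are simplifying assumptions for illustration, not the paper's CAFPE-selected specification.

```python
import numpy as np

def nw_conditional_mean(x_lag, x_now, grid, h=0.3):
    """Nadaraya-Watson kernel estimate of the conditional mean
    E[X_t | X_{t-1} = x] with a Gaussian kernel of bandwidth h."""
    d = (grid[:, None] - x_lag[None, :]) / h
    w = np.exp(-0.5 * d**2)                     # Gaussian kernel weights
    return (w * x_now[None, :]).sum(axis=1) / w.sum(axis=1)

# synthetic series with a known conditional mean 0.7 * x
rng = np.random.default_rng(0)
x = np.zeros(2000)
for t in range(1, 2000):
    x[t] = 0.7 * x[t - 1] + rng.normal()

m_hat = nw_conditional_mean(x[:-1], x[1:], np.array([-1.0, 0.0, 1.0]))
```

On this synthetic data the estimator recovers a conditional mean close to the true 0.7x without assuming any parametric form; the same construction, with more lags and a conditional standard deviation function, gives the full NAR model.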

Development of Statistical Downscaling Model Using Nonstationary Markov Chain (비정상성 Markov Chain Model을 이용한 통계학적 Downscaling 기법 개발)

  • Kwon, Hyun-Han;Kim, Byung-Sik
    • Journal of Korea Water Resources Association
    • /
    • v.42 no.3
    • /
    • pp.213-225
    • /
    • 2009
  • A stationary Markov chain model is a stochastic process with the Markov property: given the present state, future states are independent of the past states. The Markov chain model has been widely used as a main tool in water resources design. A key assumption of the stationary Markov model is that its statistical properties remain the same for all times; hence, the stationary Markov chain model cannot account for changes in mean or variance. In this regard, the primary objective of this study is to develop a model that can make use of exogenous variables. Regression-based link functions are employed to dynamically update the model parameters given the exogenous variables, and the model parameters are estimated by canonical correlation analysis. The proposed model is applied to the daily rainfall series at the Seoul station, with 46 years of data from 1961 to 2006. The model shows a capability to reproduce daily and seasonal characteristics simultaneously. Therefore, the proposed model can be used as a short- or mid-term prediction tool if elaborate GCM forecasts are used as predictors. The nonstationary Markov chain model can also be applied to climate change studies if GCM-based climate change scenarios are provided as inputs.
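The idea of updating Markov chain parameters through a regression-based link can be sketched with a two-state (dry/wet) occurrence chain whose transition probabilities depend on an exogenous predictor through a logistic link. The coefficients and the predictor are hypothetical, and the paper's canonical-correlation estimation step is omitted.

```python
import numpy as np

def transition_prob(z, beta0, beta1):
    """Logistic link: transition probability driven by exogenous predictor z."""
    return 1.0 / (1.0 + np.exp(-(beta0 + beta1 * z)))

def simulate_occurrence(z_series, p01_par=(-1.5, 0.8), p11_par=(0.2, 0.8), seed=0):
    """Nonstationary two-state Markov chain for rainfall occurrence:
    dry->wet and wet->wet probabilities are updated from z_t each day."""
    rng = np.random.default_rng(seed)
    state = 0                             # start on a dry day
    out = []
    for z in z_series:
        pars = p11_par if state == 1 else p01_par
        p_wet = transition_prob(z, *pars)
        state = int(rng.random() < p_wet)
        out.append(state)
    return np.array(out)

# higher predictor values (e.g., a GCM forecast anomaly) raise wet-day frequency
wet_frac = simulate_occurrence(np.full(5000, 1.0)).mean()
```

Because the transition matrix is re-evaluated every day from `z_t`, the simulated occurrence series inherits the nonstationarity of the predictor, which a stationary chain with fixed probabilities cannot do.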

Reliability Assessment Based on an Improved Response Surface Method (개선된 응답면기법에 의한 신뢰성 평가)

  • Cho, Tae Jun;Kim, Lee Hyeon;Cho, Hyo Nam
    • Journal of Korean Society of Steel Construction
    • /
    • v.20 no.1
    • /
    • pp.21-31
    • /
    • 2008
  • The response surface method (RSM) is widely used to evaluate extremely small probabilities of occurrence and to analyze the reliability of very complicated structures. Although the Monte Carlo simulation (MCS) technique can evaluate any system, the processing time of MCS depends on the reciprocal of the probability of failure. The stochastic finite element method could overcome this limitation; however, it is restricted to specific programs in which the mean and coefficient of variation of the random variables are programmed by a perturbation or a weighted integral method, so it is not applicable when such prerequisite programming is unavailable. With only a small number of staged analyses, RSM can construct a regression model from the response of the complicated structural system, thus saving time and effort significantly. However, the accuracy of RSM depends on the distance of the axial points and on the linearity of the limit state functions. To improve convergence to the exact solution regardless of the linearity of the limit state functions, an improved adaptive response surface method is developed. The analyzed results have been verified using linear and quadratic forms of response surface functions in two examples. As a result, the best combination of the improved RSM techniques is determined and programmed in a numerical code. The developed linear adaptive weighted response surface method (LAW-RSM) shows the closest converged reliability indices, compared with quadratic-form, non-adaptive, or non-weighted RSMs.
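The basic RSM workflow this abstract builds on (evaluate the limit state at a few designed points, fit a polynomial surrogate, then do cheap reliability analysis on the surrogate) can be sketched as follows. The limit state function, the axial-point design, and the cross-term-free quadratic form are illustrative assumptions; the paper's adaptive weighting is not reproduced here.

```python
import numpy as np

def fit_quadratic_rs(g, center, delta=1.0):
    """Fit a quadratic response surface (no cross terms) to g around
    `center`, using the center plus 2n axial points (classic RSM design)."""
    n = len(center)
    pts = [np.array(center, dtype=float)]
    for i in range(n):
        for s in (+delta, -delta):
            p = np.array(center, dtype=float)
            p[i] += s
            pts.append(p)
    pts = np.array(pts)
    y = np.array([g(p) for p in pts])          # the only "expensive" analyses
    design = np.hstack([np.ones((len(pts), 1)), pts, pts**2])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    def surrogate(x):
        x = np.atleast_2d(x)
        return np.hstack([np.ones((len(x), 1)), x, x**2]) @ coef
    return surrogate

# hypothetical limit state in standard normal space: failure when g < 0
g = lambda x: 3.0 - x[0] - 0.2 * x[1]**2
surr = fit_quadratic_rs(g, center=[0.0, 0.0])

rng = np.random.default_rng(0)
u = rng.standard_normal((200_000, 2))
pf_surr = float((surr(u) < 0).mean())          # cheap MCS on the surrogate
pf_true = float((3.0 - u[:, 0] - 0.2 * u[:, 1]**2 < 0).mean())
```

Only five limit-state evaluations feed the fit, yet the surrogate supports a full Monte Carlo failure-probability estimate; adaptive schemes such as LAW-RSM recenter and reweight this design so the fit stays accurate near the design point even for strongly nonlinear limit states.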