• Title/Summary/Keyword: Levy process

General Laws of the Iterated Logarithm for Levy Processes

  • Wee, In-Suk;Kim, Yun-Kyong
    • Journal of the Korean Statistical Society
    • /
    • v.17 no.1
    • /
    • pp.30-45
    • /
    • 1988
  • Let $\{X(t) : 0 \leq t < \infty\}$ be a real-valued process with stationary independent increments. In this paper, we obtain a necessary and sufficient condition for there to exist a positive, nondecreasing function $\beta(t)$ such that $0 < \limsup |X(t)|/\beta(t) < \infty$ a.s., both as $t$ tends to zero and as $t$ tends to infinity. When no such $\beta(t)$ exists, we give a simple integral test for whether $\limsup |X(t)|/\beta(t)$ is zero or infinity for a given $\beta(t)$.

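As a point of reference for this result, the classical case is worth recalling: for standard Brownian motion a normalizer of exactly this kind exists, given by Khinchin's law of the iterated logarithm (a standard fact, not taken from the paper).

```latex
% Classical reference point: for standard Brownian motion B(t),
% Khinchin's LIL exhibits a normalizer \beta(t) explicitly:
\[
  \limsup_{t \to 0^{+}} \frac{|B(t)|}{\sqrt{2t\log\log(1/t)}} = 1
  \quad \text{a.s.},
  \qquad
  \limsup_{t \to \infty} \frac{|B(t)|}{\sqrt{2t\log\log t}} = 1
  \quad \text{a.s.}
\]
```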

Ontology Alignment by Using Discrete Cuckoo Search (이산 Cuckoo Search를 이용한 온톨로지 정렬)

  • Han, Jun;Jung, Hyunjun;Baik, Doo-Kwon
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.3 no.12
    • /
    • pp.523-530
    • /
    • 2014
  • Ontology alignment is the way to share and reuse ontology knowledge. Because concepts can be ambiguous, most ontology alignment systems combine a set of various similarity measures through complete enumeration to produce a satisfactory result. However, the calculation becomes more complex and the required time grows exponentially as the number of concepts increases, and more errors can appear as well. Lately, the focus has been on meta-matching using heuristic algorithms. Existing meta-matching systems tune extra parameters, which complicates the calculation; as a consequence, they do not perform well on varied data from specific domains. In this paper, we propose a high-performance algorithm using discrete cuckoo search (DCS) that solves ontology alignment through a simple process. It provides an efficient search strategy based on the Levy flight distribution. To evaluate the approach, benchmark data from the OAEI 2012 campaign are employed, and the quality of the alignments produced with DCS is compared with that of state-of-the-art ontology matching systems.
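
The abstract leans on the Levy-flight search step; the sketch below shows only the generic step-length generator (Mantegna's algorithm) that cuckoo-search variants commonly use. It is a minimal sketch, not the authors' system: the discrete mapping of such steps onto ontology correspondences is the paper's contribution and is not reproduced here, and the `alpha=0.01` scale factor is a conventional choice, not theirs.

```python
import numpy as np
from math import gamma, sin, pi

def levy_flight_step(beta=1.5, size=1):
    """Step lengths from Mantegna's algorithm for Levy flights.

    beta is the stability index (1 < beta <= 2). The resulting steps are
    heavy-tailed: mostly small moves with occasional long jumps, which is
    the exploration behaviour cuckoo search relies on.
    """
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2)
               / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, size)
    v = np.random.normal(0.0, 1.0, size)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_move(x, best, alpha=0.01):
    """One continuous cuckoo-search move around the current best solution.

    A discrete variant for ontology alignment would instead map the step
    onto changes in a permutation of candidate correspondences.
    """
    return x + alpha * levy_flight_step(size=x.shape[0]) * (x - best)
```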

A study of the application of marketing to library management (도서관경영에 있어서 마아케팅의 도입에 관한 연구)

  • 권은경
    • Journal of Korean Library and Information Science Society
    • /
    • v.14
    • /
    • pp.99-120
    • /
    • 1987
  • This paper tries to apply the concept and process of marketing, developed in the profit sector, to library management. Since the end of the 1960s, marketing researchers, among them Kotler, Levy, and Shapiro, have advanced the thesis that marketing is not just for business organizations but for non-profit organizations as well. Recently, libraries have become interested in marketing also. Marketing is a concept of sensitively serving and satisfying human needs through voluntary exchanges of value. A library is a value exchange system in which library service is exchanged for the community's patronage. In order for library users to take part in this value exchange system voluntarily, the library should analyze users' needs and offer products satisfying those needs. To do this, the library should understand marketing. In this paper, the author introduces marketing concepts and processes and shows how to apply the key concepts and process to public library management. The need for marketing in the library sector and the effectiveness of, and barriers to, applying marketing to libraries are also discussed.

ON CONSISTENCY OF SOME NONPARAMETRIC BAYES ESTIMATORS WITH RESPECT TO A BETA PROCESS BASED ON INCOMPLETE DATA

  • Hong, Jee-Chang;Jung, In-Ha
    • The Pure and Applied Mathematics
    • /
    • v.5 no.2
    • /
    • pp.123-132
    • /
    • 1998
  • Let F and G denote the distribution functions of the failure times and the censoring variables in a random censorship model. Susarla and Van Ryzin (1978) verified consistency of $F_{\alpha}$, the NPBE of F with respect to the Dirichlet process prior $D(\alpha)$, under the assumption that F and G are continuous. Assuming that A, the cumulative hazard function, is distributed according to a beta process with parameters $c$, $\alpha$, Hjort (1990) obtained the Bayes estimator $A_{c,\alpha}$ of A under a squared error loss function. By the theory of the product-integral developed by Gill and Johansen (1990), the Bayes estimator $F_{c,\alpha}$ is recovered from $A_{c,\alpha}$. The continuity assumption on F and G is removed in our proof of the consistency of $A_{c,\alpha}$ and $F_{c,\alpha}$. Our result extends Susarla and Van Ryzin (1978), since a particular transform of a beta process is a Dirichlet process and the class of beta processes is much larger than the class of Dirichlet processes.

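For readers unfamiliar with the product-integral step the abstract relies on, the standard relation between the cumulative hazard and the distribution function (a known identity, not a quotation from the paper) is:

```latex
% Standard identity (Gill & Johansen, 1990): the survival function is
% the product-integral of the cumulative hazard A over (0, t],
\[
  1 - F(t) \;=\; \prod_{s \in (0,t]} \bigl(1 - \mathrm{d}A(s)\bigr),
\]
% so plugging in the Bayes estimator A_{c,\alpha} recovers F_{c,\alpha}:
\[
  1 - F_{c,\alpha}(t) \;=\; \prod_{s \in (0,t]} \bigl(1 - \mathrm{d}A_{c,\alpha}(s)\bigr).
\]
```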

Multifractal Stochastic Processes and Stock Prices (다중프랙탈 확률과정과 주가형성)

  • Rhee, Il-King
    • The Korean Journal of Financial Management
    • /
    • v.20 no.2
    • /
    • pp.95-126
    • /
    • 2003
  • This paper introduces multifractal processes and presents an empirical investigation of multifractal asset pricing. The multifractal stock price process has long tails, in the spirit of Levy-stable distributions. The process also exhibits long-range dependence, the characteristic feature of fractional Brownian motion. Multifractality introduces a new source of heterogeneity through time-varying local regularity in the price path. This paper investigates multifractality in stock prices. After finding evidence of multifractal scaling, the multifractal spectrum is estimated via the Legendre transform. The distinguishing feature of a multifractal process is multiscaling of the return distribution's moments under time-rescaling. More intensive study of estimation techniques and inference procedures is required.

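The "multiscaling of the return distribution's moments" has a compact standard statement, given below in the Mandelbrot-Calvet-Fisher convention (assumed here; the paper's exact normalization may differ):

```latex
% Moment multiscaling: over some range of t,
\[
  \mathbb{E}\,|X(t)|^{q} \;=\; c(q)\, t^{\tau(q)+1},
\]
% with \tau(q) nonlinear in q for a multifractal process; the spectrum
% f(\alpha) is then obtained via the Legendre transform
\[
  f(\alpha) \;=\; \inf_{q} \bigl( q\alpha - \tau(q) \bigr).
\]
```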

THREE-DIMENSIONAL VOLUME RECONSTRUCTION BASED ON MODIFIED FRACTIONAL CAHN-HILLIARD EQUATION

  • CHOI, YONGHO;LEE, SEUNGGYU
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.23 no.3
    • /
    • pp.203-210
    • /
    • 2019
  • We present a three-dimensional volume reconstruction model using the modified Cahn-Hilliard equation with a fractional Laplacian. From two-dimensional cross-section images, such as computed tomography or magnetic resonance imaging slice data, we suggest an algorithm to reconstruct the three-dimensional volume surface; a sketch of the standard equation structure follows below. By replacing the Laplacian operator with the fractional one, the dynamics changes to the macroscopic limit of a Levy process. We initialize between the two cross sections with linear interpolation and then smooth and reconstruct the surface by solving the modified Cahn-Hilliard equation. We perform various numerical experiments and compare with previous research.
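
For orientation, a commonly used fractional Cahn-Hilliard form is sketched below; this is only the standard structure, and the paper's "modified" equation may differ in detail:

```latex
% A common fractional Cahn-Hilliard form (sketch of the structure only):
\[
  \frac{\partial \phi}{\partial t}
    = \Delta \Bigl( F'(\phi) - \epsilon^{2} (-\Delta)^{\alpha/2} \phi \Bigr),
  \qquad
  F(\phi) = \tfrac{1}{4}\bigl(\phi^{2} - 1\bigr)^{2},
  \quad 0 < \alpha \le 2,
\]
% where \alpha = 2 recovers the classical equation and \alpha < 2 gives
% the generator of a symmetric \alpha-stable Levy process -- the
% "macroscopic limit of a Levy process" noted in the abstract.
```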

Narrative Strategies for Learning Enhanced Interface Design "Symbol Mall"

  • Uttaranakorn, Jirayu;McGregor, Donna-Lynne;Petty, Sheila
    • Proceedings of the IEEK Conference
    • /
    • 2002.07a
    • /
    • pp.417-420
    • /
    • 2002
  • Recent works in the area of multimedia studies focus on a wide range of issues, from the impact of multimedia on culture to its impact on economics and everything in between. The interconnectedness of the issues raised by this new practice is complicated by the fact that media are rapidly converging: in a very real way, multimedia is becoming a media prism that reflects the way in which media continually influence each other across disciplines and cultural borders. Thus, the impact of multimedia reflects a complicated crossroads where media, human experience, culture and technology converge. An effective design is generally based on shaping aesthetics for function and utility, with an emphasis on ease of use. However, in designing for cyberspace, it is possible to create narratives that challenge the interactor by encoding in the design an instructional aspect that teaches new approaches and forms. Such a design offers an equally aesthetic experience for interactors as they explore the meaning of the work. This design approach has been used constructively in many applications. The crucial concern is to determine how little or how much information must be presented for the interactor to achieve a suitable level of cognition. This is always a balancing act: too much difficulty will result in interactor frustration and the abandonment of the activity, and too little will result in boredom, leading to the same negative result. In addition, it can be anticipated that interactors will bring their own level of experiential cognition and/or accretion to the experience, providing reflective cognition and/or restructuring the learning curve. If the design of the application is outside their present experience, interactors will begin with established knowledge in order to explore the new work. Thus, it may be argued that the interactor explores, learns and cognizes simultaneously, based on primary experiential cognition. Learning is one of the most important keys to establishing a comfort level in a new media work. Once interactors have learned a new convention, they apply this cognitive knowledge to other new media experiences they may have. Pierre Levy would describe this process as a "new nomadism" that creates "an invisible space of understanding, knowledge, and intellectual power, within which new qualities of being and new ways of fashioning a society will flourish and mutate" (Levy xxv 1997). Thus, navigation itself offers interactors the opportunity to both apply and learn new cognitive skills. This suggests that new media narrative strategies are still in the process of developing unique conventions and, as a result, have not reached a level of coherent grammar. This paper intends to explore the cognitive aspects of new media design and, in particular, will explore issues related to the design of new media interfaces. The paper will focus on the creation of narrative strategies that engage interactors through learning curves, thus enhancing interactivity.

Unit Mass Estimation and Analysis from a Textile Spinning/Weaving Manufacturing Facility near the Nakdong River Basin (낙동강 수계에서 제사방적제조 업체에 대한 공정별 원단위산정 및 분석)

  • Lee, Hongshin;Son, Gontae;Gu, Jungeun;Konboonraksa T.;Lee, Hongtae;Lee, Seunghwan
    • Journal of Korean Society of Water and Wastewater
    • /
    • v.22 no.5
    • /
    • pp.541-550
    • /
    • 2008
  • In this investigative study, the unit mass discharge of major water quality parameters such as flowrate, SS, BOD, CODmn, CODcr, TN, and TP from the textile spinning/weaving industry near the Nakdong river basin was estimated. To represent the respective industries, three companies from the hundreds of textile spinning/weaving plants located in the Nakdong river basin were carefully selected, based on their manufactured goods, flowrate, and location, for the estimation of unit mass discharge by unit operation and process. There was a drastic decrease in the unit mass discharge estimates between influent and effluent for the water quality parameters, which represents the removal capacity of the wastewater treatment plant. With the advent of a new regulation imposing payment proportional to the total amount of pollutants discharged into the water body, the concept of cleaner production technology should be employed in the unit operations/processes of the wastewater treatment plant, as well as in the textile manufacturing procedure, to minimize the levy on pollutant discharge. Unit mass discharge estimates by unit process (estimated in this study) for the major water quality parameters (SS, BOD, COD, TN and TP) on a land basis were similar to those of the composite process (estimated by the National Institute of Environmental Research). However, the unit process estimates for BOD and CODmn on a total-sales basis were much higher than the composite ones, while those for SS, TN and TP were similar. For detailed estimation of the imposed payment, unit mass estimation based on unit process should be further emphasized.
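
The unit-mass (원단위) arithmetic underlying such estimates is simple load normalization; the sketch below is a minimal illustration with entirely hypothetical numbers, not data from the study:

```python
def unit_mass_discharge(conc_mg_per_L, flow_m3_per_day, basis):
    """Unit mass discharge: pollutant load normalized by a production basis.

    load [kg/day] = concentration [mg/L] * flow [m^3/day] * 1e-3,
    since 1 mg/L * 1 m^3 = 1 g = 1e-3 kg.
    basis: the normalizer, e.g. site area [m^2] or total sales.
    """
    load_kg_per_day = conc_mg_per_L * flow_m3_per_day * 1e-3
    return load_kg_per_day / basis

# Entirely hypothetical figures: BOD 150 mg/L, 2,000 m^3/day, 50,000 m^2 site
print(unit_mass_discharge(150, 2000, 50_000))  # -> 0.006 kg/day per m^2
```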

Option Pricing Models with Drift and Jumps under Lévy Processes: Beyond the Gerber-Shiu Model (Lévy과정 하에서 추세와 도약이 있는 경우 옵션가격결정모형: Gerber-Shiu 모형을 중심으로)

  • Cho, Seung-Mo;Lee, Phil-Sang
    • The Korean Journal of Financial Management
    • /
    • v.24 no.4
    • /
    • pp.1-43
    • /
    • 2007
  • The traditional Black-Scholes model for option pricing is based on the assumption that the log-return of the underlying asset follows a Brownian motion, but this assumption has been criticized as unrealistic. Thus, over the last 20 years, many attempts have been made to adopt different stochastic processes and derive new option pricing models. Option pricing models based on Lévy processes have been actively studied since the Gerber-Shiu model, derived by H. U. Gerber and E. S. W. Shiu in 1994. In 2004, G. H. L. Cheang derived an option pricing model under multiple Lévy processes, which makes it possible to add drift and jumps to the Gerber-Shiu model, whereas Gerber and Shiu derived their model under a single Lévy process. We derive a Gerber-Shiu model that includes drift and jumps under Lévy processes. By adopting a Gamma distribution, we extend the Heston model, derived in 1993, to include jumps. Then, using KOSPI200 index option data, we analyze the price-fitting performance of our model compared with that of the Black-Scholes model. The results show that our model has better price-fitting performance.

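The Esscher-transform machinery at the heart of the Gerber-Shiu approach can be stated compactly (a standard statement assuming no dividends, not a quotation from this paper):

```latex
% Gerber-Shiu (1994): pricing is done under an Esscher transform of the
% log-price Levy process X. With density f of X(1) and M(z) = E[e^{zX(1)}],
% the transformed density is
\[
  f(x; h) \;=\; \frac{e^{hx} f(x)}{M(h)},
\]
% and the risk-neutral parameter h^{*} is pinned down by requiring the
% discounted stock price to be a martingale:
\[
  e^{\,r} \;=\; \frac{M(h^{*} + 1)}{M(h^{*})}.
\]
```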

Numerical studies on approximate option prices (근사적 옵션 가격의 수치적 비교)

  • Yoon, Jeongyoen;Seung, Jisu;Song, Seongjoo
    • The Korean Journal of Applied Statistics
    • /
    • v.30 no.2
    • /
    • pp.243-257
    • /
    • 2017
  • In this paper, we compare several methods of approximating option prices: the Edgeworth expansion, A-type and C-type Gram-Charlier expansions, a method using the normal inverse Gaussian (NIG) distribution, and an asymptotic method using nonlinear regression. We used two different types of approximation. The first (called the RNM method) approximates the risk-neutral probability density function of the log return of the underlying asset and then computes the option price. The second (called the OPTIM method) finds an approximate option pricing formula and then estimates its parameters to compute the option price. For simulation experiments, we generated underlying asset data from the Heston model and the NIG model, a well-known stochastic volatility model and a well-known Levy model, respectively. We also applied the above approximation methods to KOSPI200 call option prices as a real-data application. We found that the OPTIM method performs better on average than the RNM method. Within OPTIM, the A-type Gram-Charlier expansion and the asymptotic method using nonlinear regression performed relatively better; within RNM, the method using the NIG distribution was relatively better than the others.
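
The RNM idea described above (approximate the risk-neutral density of the log return, then integrate the payoff) can be sketched in a few lines; the density used below is a plain normal stand-in, not one of the paper's expansions:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def rnm_call_price(S0, K, r, T, density):
    """RNM-style call price: integrate the payoff against an approximate
    risk-neutral density of the log return x = log(S_T / S_0), then discount.

    density(x) is whichever approximation is chosen (Edgeworth,
    Gram-Charlier, NIG, ...); a plain normal is used below as a stand-in.
    """
    integrand = lambda x: max(S0 * np.exp(x) - K, 0.0) * density(x)
    price, _ = quad(integrand, -10.0, 10.0)
    return np.exp(-r * T) * price

# Normal stand-in with risk-neutral drift: reproduces the Black-Scholes price
S0, K, r, T, sigma = 100.0, 100.0, 0.03, 0.5, 0.2
f = lambda x: norm.pdf(x, loc=(r - 0.5 * sigma**2) * T, scale=sigma * np.sqrt(T))
print(rnm_call_price(S0, K, r, T, f))  # approx. 6.38
```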