• Title/Summary/Keyword: exponential weights


SPECIFIC EXAMPLES OF EXPONENTIAL WEIGHTS

  • Jung, Hee-Sun;Sakai, Ryozi
    • Communications of the Korean Mathematical Society / v.24 no.2 / pp.303-319 / 2009
  • Let $Q \in C^2 : \mathbb{R} \rightarrow [0,\infty)$ be an even function. Then we consider the exponential weights $w(x) = \exp(-Q(x))$ in the weight class from [2]. In this paper, we give some relations among the exponential weights in this class and introduce a new weight subclass. In addition, we investigate some properties of typical and specific weights in these weight classes.
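
A standard member of such a weight class (a textbook example from the exponential-weight literature, not necessarily the specific subclass introduced in the paper) is the Freud-type weight obtained by taking $Q(x) = |x|^{\alpha}$:

\[
  Q(x) = |x|^{\alpha}, \quad \alpha > 0,
  \qquad
  w(x) = \exp(-Q(x)) = \exp(-|x|^{\alpha}), \quad x \in \mathbb{R}.
\]

Erdős-type weights, for which $Q$ grows faster than any polynomial, are another commonly studied family.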

ESTIMATES OF CHRISTOFFEL FUNCTIONS FOR GENERALIZED POLYNOMIALS WITH EXPONENTIAL WEIGHTS

  • Joung, Hae-Won
    • Communications of the Korean Mathematical Society / v.14 no.1 / pp.121-134 / 1999
  • Generalized nonnegative polynomials are defined as products of nonnegative polynomials raised to positive real powers, and the generalized degree can be defined in a natural way. We extend some results on infinite-finite range inequalities, Christoffel functions, and Nikolskii-type inequalities for the weights $W_{\alpha}(x) = \exp(-|x|^{\alpha})$, $\alpha > 0$, to generalized nonnegative polynomials.
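
For reference, the Christoffel function associated with a measure $d\mu$ is the standard extremal quantity below (the general definition, given for context rather than quoted from the paper); for an exponential weight one typically takes $d\mu(t) = W_{\alpha}(t)^{2}\,dt$:

\[
  \lambda_n(d\mu, x) = \min_{\deg P \le n-1,\; P(x) = 1} \int_{\mathbb{R}} |P(t)|^{2}\, d\mu(t).
\]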


A Study for Improving the Positioning Accuracy of DGPS Based on Multi-Reference Stations by Applying Exponential Modeling on Pseudorange Corrections

  • Kim, Koon-Tack;Park, Kwan-Dong;Lee, Eunsung;Heo, Moon Beom
    • Journal of Positioning, Navigation, and Timing / v.2 no.1 / pp.9-17 / 2013
  • In this paper, a pseudorange correction regeneration algorithm was developed to improve the positioning accuracy of DGPS based on multiple reference stations, and the optimal minimum number of reference sites was determined by trying different numbers of stations. The study used two to five sites, and positioning errors of less than 1 m were obtained when pseudorange corrections were collected from at least four reference stations and interpolated into the pseudorange correction at the rover. After the optimal minimum number of reference stations was determined, the developed regeneration algorithm was tested against other algorithms. Our approach is based on an exponential model: when pseudorange corrections are regenerated with an exponential model, the effect of a small difference in baseline distance is enlarged, so weights respond sensitively even when baseline distances differ only slightly. In addition, weights on the baseline distance were assigned differently depending on the difference between the longest and shortest baselines. With this method, the positioning accuracy improved by 19% compared to previous studies.
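
As an illustration of the weighting idea, a minimal Python sketch follows; the abstract does not give the exact exponential model, so the decay form, the use of the longest/shortest baseline spread as the scale, and all function and variable names are assumptions.

# Hypothetical sketch of exponentially weighting pseudorange corrections
# from multiple reference stations by baseline distance. The exponential
# form and the use of the longest/shortest baseline spread as the decay
# scale are illustrative assumptions, not the paper's exact model.
import math

def regenerate_correction(corrections, baselines):
    """corrections: per-station pseudorange corrections (m).
    baselines: rover-to-station distances (m), same order.
    Returns a single interpolated correction for the rover."""
    d_min, d_max = min(baselines), max(baselines)
    spread = max(d_max - d_min, 1e-9)            # guard against zero spread
    # Exponential weights: nearer stations dominate, and small differences
    # in baseline distance are amplified relative to a linear weighting.
    raw = [math.exp(-(d - d_min) / spread) for d in baselines]
    total = sum(raw)
    weights = [r / total for r in raw]
    return sum(w * c for w, c in zip(weights, corrections))

# Example with four reference stations (synthetic values).
print(regenerate_correction([2.1, 1.8, 2.4, 1.9],
                            [15e3, 22e3, 38e3, 27e3]))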

Mechanical degradation kinetics of poly(ethylene oxide) in a turbulent flow

  • Sung, Jun-Hee;Lim, Sung-Taek;Kim, Chul-Am;Heejeong Chung;Park, Hyoung-Jin
    • Korea-Australia Rheology Journal / v.16 no.2 / pp.57-62 / 2004
  • The turbulent drag reduction (DR) efficiency of water-soluble poly(ethylene oxide) (PEO) with two different molecular weights was studied as a function of polymer concentration and temperature in a turbulent flow produced by a rotating disk system. Its mechanical degradation behavior as a function of time in the turbulent flow was also analyzed using both a simple exponential decay function and a fractional exponential decay equation. The fractional exponential decay equation was found to fit the experimental data better than the simple exponential decay function. The thermal degradation results further showed that the susceptibility of PEO to degradation increases dramatically with increasing temperature.
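
A minimal sketch of the comparison described, assuming the standard simple and stretched (fractional) exponential forms and synthetic data, since the paper's equations and measurements are not reproduced in the abstract:

# Compare a simple exponential decay with a fractional (stretched)
# exponential decay, as in the abstract. Functional forms are the
# standard ones and the data below is synthetic, not the paper's.
import numpy as np
from scipy.optimize import curve_fit

def simple_decay(t, tau):
    return np.exp(-t / tau)

def fractional_decay(t, tau, beta):
    return np.exp(-(t / tau) ** beta)

np.random.seed(0)
t = np.linspace(0, 60, 30)                      # time grid (synthetic)
dr = np.exp(-(t / 20.0) ** 0.6) + 0.01 * np.random.randn(t.size)

p1, _ = curve_fit(simple_decay, t, dr, p0=[20.0])
p2, _ = curve_fit(fractional_decay, t, dr, p0=[20.0, 1.0])

rss_simple = np.sum((dr - simple_decay(t, *p1)) ** 2)
rss_fractional = np.sum((dr - fractional_decay(t, *p2)) ** 2)
print(f"simple exponential     RSS = {rss_simple:.4f}")
print(f"fractional exponential RSS = {rss_fractional:.4f}")  # typically smaller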

On the comparison of cumulative hazard functions

  • Park, Sangun;Ha, Seung Ah
    • Communications for Statistical Applications and Methods / v.26 no.6 / pp.623-633 / 2019
  • This paper proposes two distance measures between two cumulative hazard functions, obtained by comparing their difference and their ratio, respectively. We then estimate the measures and present goodness-of-fit test statistics. Since the proposed test statistics are expressed in terms of the cumulative hazard functions, we can easily place more weight on earlier (or later) departures in the cumulative hazards if we wish to emphasize earlier (or later) departures. We also show that these test statistics perform comparably to other well-known test statistics based on the empirical distribution function under an exponential null distribution. The proposed test statistic is an omnibus test applicable to many distributions other than the exponential.
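
As background, and not as a reproduction of the paper's measures: the cumulative hazard of a distribution function $F$ and its exponential-null counterpart are

\[
  H(x) = -\log\bigl(1 - F(x)\bigr), \qquad H_{0}(x) = \lambda x,
\]

so difference-based ($\hat{H} - H_{0}$) and ratio-based ($\hat{H}/H_{0}$) comparisons of an estimated cumulative hazard against the null can be weighted toward earlier or later departures.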

Exponential Smoothing Temporal Association Rules for Recommendation of Temporal Products (시간 의존적인 상품 추천을 위한 지수 평활 시간 연관 규칙)

  • Jeong Kyeong Ja
    • Journal of the Korea Society of Computer and Information / v.10 no.1 s.33 / pp.45-52 / 2005
  • We propose a product recommendation algorithm that combines the temporal association rule with the exponential smoothing method. The temporal association rule adds a temporal concept to the conventional commercial association rule. In this paper, we propose an exponential smoothing temporal association rule that gives higher weights to recent data than to past data. Through simulation and a case study on temporal data sets, we confirm that it is more precise than existing temporal association rules, at the cost of additional running time.
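
A minimal sketch of the underlying idea, assuming a simple exponential-smoothing update over per-period supports; the rule mining formulation, parameter names, and data below are illustrative assumptions rather than the paper's exact method.

# Exponentially smoothed support of an itemset over data partitioned by
# time, so recent periods weigh more than older ones (illustrative only).
def smoothed_support(itemset, partitions, alpha=0.5):
    """partitions: lists of transactions, ordered oldest -> newest.
    alpha: smoothing constant in (0, 1]; larger alpha favors recent data."""
    supports = []
    for transactions in partitions:
        count = sum(1 for t in transactions if itemset <= set(t))
        supports.append(count / max(len(transactions), 1))
    s = supports[0]
    for x in supports[1:]:
        s = alpha * x + (1 - alpha) * s      # s_t = a*x_t + (1-a)*s_{t-1}
    return s

parts = [
    [["milk", "bread"], ["milk"], ["bread"]],             # oldest period
    [["milk", "bread"], ["milk", "bread"], ["eggs"]],
    [["milk", "bread"], ["milk", "bread"], ["milk"]],      # most recent period
]
print(smoothed_support({"milk", "bread"}, parts, alpha=0.5))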


Temporal Association Rules with Exponential Smoothing Method (지수 평활법을 적용한 시간 연관 규칙)

  • Byon, Lu-Na;Park, Byoung-Sun;Han, Jeong-Hye;Jeong, Han-Il;Leem, Choon-Seong
    • The KIPS Transactions: Part D / v.11D no.3 / pp.741-746 / 2004
  • As electronic commerce progresses, temporal association rules are derived from data sets partitioned by time to offer personalized services matching customer interests. In this paper, we propose a temporal association rule with an exponential smoothing method that gives higher weights to recent data than to past data. Through simulation and a case study, we confirm that it is more precise than existing temporal association rules, at the cost of additional running time.

Magnetization of Magnetite Ferrofluid Studied by Using a Magnetic Balance

  • Jin, Daeseong;Kim, Hackjin
    • Bulletin of the Korean Chemical Society / v.34 no.6 / pp.1715-1721 / 2013
  • Magnetic properties of a magnetite ferrofluid are studied by measuring magnetic weights under different magnetic fields with a conventional electronic balance. Magnetite nanoparticles of 11 nm diameter are synthesized to make the ferrofluid. The magnetization calculated from the magnetic weight reveals hysteresis and deviates from the Langevin function at high magnetic fields. The magnetic weight shifts instantaneously with changes in the magnetic field via the Néel and Brown mechanisms. When a high magnetic field is applied to the sample, a slower change of magnetic weight accompanies the instantaneous shift through agglomeration of the nanoparticles. This slow change of the magnetic weight follows stretched exponential kinetics. The temporal change of the magnetic weight and the magnetization of the ferrofluid at high magnetic fields suggest that the superparamagnetic sample turns into a superspin glass through strong magnetic interparticle interactions.
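
For context, two standard forms referred to in the abstract (included as general background, not the paper's fitted expressions) are the Langevin magnetization of an ideal superparamagnet and a stretched exponential relaxation law:

\[
  M(H) = M_{s}\left(\coth\xi - \frac{1}{\xi}\right), \quad \xi = \frac{\mu H}{k_{B}T},
  \qquad
  \frac{\Delta m(t)}{\Delta m_{\infty}} = 1 - \exp\!\bigl[-(t/\tau)^{\beta}\bigr], \quad 0 < \beta < 1.
\]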

Fuzzy multi-objective optimization of the laminated composite beam (복합재 적층 보의 퍼지 다목적 최적설계)

  • 이강희;구만회;이종호;홍영기;우호길
    • Proceedings of the Korean Society For Composite Materials Conference / 2000.04a / pp.143-148 / 2000
  • In this article, we present multi-objective design optimization of a laminated composite beam using a fuzzy programming method. The two design objectives are minimizing the structural weight and maximizing the buckling load. The fuzzy multi-objective optimization problem is formulated from the results of the single-objective optimizations. Because the design objectives differ in relative importance, the membership functions are constructed with exponential parameters serving as the objectives' weights. Finite element analysis of the buckling behavior of the composite beam is carried out with the natural mode method proposed by J. Argyris, which reduces the computational time of the analysis. With this scheme, a designer can conveniently obtain a compromise optimal solution of a multi-objective optimization problem simply by providing exponential parameters corresponding to the importance of the objective functions.
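
A sketch of the kind of construction described, assuming one common exponential membership form (the paper's exact membership functions and parameters are not given in the abstract): each objective $f_i$ is normalized between its single-objective best value $f_i^{\min}$ and worst value $f_i^{\max}$, an exponential shape parameter $p_i$ encodes its importance, and the compromise design maximizes the smallest membership:

\[
  \mu_i(f_i) = \frac{\exp\!\left(-p_i\,\frac{f_i - f_i^{\min}}{f_i^{\max} - f_i^{\min}}\right) - \exp(-p_i)}{1 - \exp(-p_i)},
  \qquad
  \max_{\mathbf{x}} \ \min_{i} \ \mu_i\bigl(f_i(\mathbf{x})\bigr).
\]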


Sub-Exponential Algorithm for 0/1 Knapsack (0/1 Knapsack에 대한 서브-지수 함수 알고리즘)

  • Rhee, Chung Sei
    • Convergence Security Journal / v.14 no.7 / pp.59-64 / 2014
  • We investigate a $p(n) \cdot 2^{O(\sqrt{n})}$ algorithm for the 0/1 knapsack problem, where x is the total bit length of a list of sizes of n objects. The algorithm adapts a method that achieves a similar complexity for the Partition and Subset Sum problems, and the method can be applied to other optimization or decision problems based on a list of numeric sizes or weights. The 0/1 knapsack problem can be used to solve NP-complete problems with a pseudo-polynomial time algorithm. We try to apply this technique to a bioinformatics problem that has pseudo-polynomial time complexity.
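
For context, the pseudo-polynomial time algorithm mentioned in the abstract is the classic dynamic program for 0/1 knapsack. A minimal sketch of that standard DP follows; it is not the paper's sub-exponential algorithm.

# Standard pseudo-polynomial dynamic program for 0/1 knapsack.
# Runs in O(n * capacity) time, pseudo-polynomial because the capacity
# enters as a numeric value rather than through its bit length.
def knapsack(weights, values, capacity):
    best = [0] * (capacity + 1)        # best[c] = max value within capacity c
    for w, v in zip(weights, values):
        for c in range(capacity, w - 1, -1):   # descending: each item used once
            best[c] = max(best[c], best[c - w] + v)
    return best[capacity]

print(knapsack([3, 4, 5], [30, 50, 60], capacity=8))   # -> 90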