• Title/Summary/Keyword: optimazation

Search Results: 24

A Study on Load Distribution of Gaming Server Using Proximal Policy Optimization (Proximal Policy Optimization을 이용한 게임서버의 부하분산에 관한 연구)

  • Park, Jung-min;Kim, Hye-young;Cho, Sung Hyun
    • Journal of Korea Game Society / v.19 no.3 / pp.5-14 / 2019
  • Gaming servers are typically built on a distributed architecture. To distribute workloads, distributed gaming servers apply algorithms that divide the workload evenly among the servers and, as a result, efficiently manage the response time and availability of the servers requested by clients. In this paper, we propose a load balancing agent using PPO (Proximal Policy Optimization), a policy-gradient method from reinforcement learning. The proposed load balancing agent is compared with previous approaches through simulation.
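The idea of a policy-driven load balancer can be sketched in a greatly simplified form. The snippet below is not the paper's PPO agent: it stands in a fixed softmax policy over negative server loads for the learned policy, and all names and numbers are illustrative assumptions.

```python
import math
import random

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def assign_jobs(num_servers, jobs, temperature=0.5, seed=0):
    """Assign each job to a server sampled from a softmax policy over
    negative current loads, so lightly loaded servers are preferred.
    (A learned PPO policy would replace this hand-written rule.)"""
    rng = random.Random(seed)
    loads = [0.0] * num_servers
    for cost in jobs:
        probs = softmax([-load / temperature for load in loads])
        server = rng.choices(range(num_servers), weights=probs)[0]
        loads[server] += cost
    return loads

# Illustrative workload: 200 jobs with random costs between 1 and 10.
rng_jobs = random.Random(1)
jobs = [rng_jobs.uniform(1, 10) for _ in range(200)]
loads = assign_jobs(4, jobs)
```

Because the policy strongly favors the least-loaded server, the final per-server loads stay within roughly one job's cost of each other, whereas a load-blind policy could drift much further apart.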

A Study on the Optimazation of the Hotel Room Rate Pricing Policy (호텔 객실가격정책(客室價格政策)의 합리화(合理化)에 관한 연구(硏究))

  • Han, Seung-Yeop
    • Korean Business Review / v.6 / pp.135-152 / 1993
  • The optimal market-segmentation pricing policy for hotel rooms is investigated under the assumption of a linear demand function, for four different situations: (1) a single-price market, (2) optimal segmentation of the unused capacity of a single-price market, (3) optimal segmentation for all rooms, and (4) optimal segmentation with infiltration from higher-priced to adjacent lower-priced segments. The purpose of this study is to show that, with a proper pricing policy, it is possible to increase profits considerably. Such a profit increase might be achieved by market segmentation coupled with product differentiation, where the different market segments are identified and separated, and a different price per room is set in each segment. The different prices are determined from the price elasticity typical of each market segment and the relevant costs. The pricing model used in this study is based on basic economic pricing theory and optimization techniques. While somewhat complex in its mathematical solution, it can easily be programmed for use by practitioners, avoiding the need to cope with the technical aspects of the solution. In section II-1, the optimal single-market, single-price policy is evaluated. The optimal strategy under the constraint that only the previously unutilized rooms are segmented is analyzed in section II-2, while the optimal strategy without this constraint is determined in section II-3. In section II-4, the optimal market-segmentation pricing policy is derived for the case in which market separation is allowed for all rooms, under the assumption of customer infiltration from each market segment to the adjacent lower-priced segment. Finally, some considerations relating to the practicality of the model as a decision-support tool and the requirements for its implementation are discussed in section III.

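The single-price case described above has a closed-form solution. The sketch below works it out for a linear demand function; the demand and cost figures are illustrative assumptions, not values from the paper.

```python
def optimal_single_price(a, b, c):
    """Profit-maximizing single room rate under linear demand q = a - b*p
    with marginal cost c per occupied room: maximize (p - c) * (a - b*p).
    Setting the derivative a - 2*b*p + b*c to zero gives p* = (a + b*c) / (2*b)."""
    return (a + b * c) / (2 * b)

def profit(p, a, b, c):
    """Profit at price p; demand is floored at zero."""
    q = max(a - b * p, 0.0)
    return (p - c) * q

# Hypothetical example: demand q = 200 - 2*p, marginal cost 20 per room.
p_star = optimal_single_price(200, 2, 20)   # 60.0
best_profit = profit(p_star, 200, 2, 20)    # 3200.0
```

The segmented cases of sections II-2 through II-4 apply the same first-order condition per segment, each with its own elasticity, which is why segmentation can only raise the total profit relative to the single-price optimum.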

Composite Stock Cutting using Distributed Simulated Annealing (분산 시뮬레이티드 어닐링을 이용한 복합 재료 재단)

  • Hong, Chul-Eui
    • Journal of KIISE: Software and Applications / v.29 no.1_2 / pp.20-29 / 2002
  • The composite stock cutting problem is to allocate rectangular and/or irregular patterns onto a large composite stock sheet of finite dimensions in such a way that the resulting scrap is minimized. In this paper, distributed simulated annealing with a new cost-error-tolerant spatial decomposition is applied to the composite stock cutting problem in an MPI environment. The cost-error-tolerant scheme relaxes synchronization and chooses small perturbations of states asynchronously, with a dynamically adjusted stream length, to preserve the convergence property of sequential annealing. This paper proposes efficient data structures for representing patterns and their affinity relations, and shows how to determine move generation, the annealing parameters, and a cost function. The spatial decomposition method is addressed in detail. The results show that the final solution quality is not degraded while an almost linear speedup is achieved. Composite stock shapes are not constrained to convex polygons or even regular shapes, but rotations are limited to 2 or 4 orientations due to the composite nature of the material.
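The propose/accept loop at the heart of the annealer can be shown on a much smaller problem. The sketch below is a sequential stand-in (one process, no spatial decomposition, no MPI) applied to two-way number partitioning rather than stock cutting; every parameter value is an illustrative assumption.

```python
import math
import random

def anneal_partition(items, t0=10.0, cooling=0.995, steps=5000, seed=0):
    """Sequential simulated annealing for two-way number partitioning:
    split `items` into sets A and B minimizing |sum(A) - sum(B)|.
    The move/accept structure mirrors the paper's annealer in miniature."""
    rng = random.Random(seed)
    side = [rng.randint(0, 1) for _ in items]   # current state: side of each item

    def cost(s):
        return abs(sum(x if b else -x for x, b in zip(items, s)))

    cur = cost(side)
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(items))
        side[i] ^= 1                            # propose: move one item across
        new = cost(side)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new
        else:
            side[i] ^= 1                        # reject: undo the move
        t *= cooling                            # geometric cooling schedule
    return cur

best = anneal_partition(list(range(1, 41)))     # items 1..40, total 820 (even)
```

The distributed version in the paper runs many such chains over spatially decomposed regions and tolerates temporarily stale cost values between processes; the sequential loop above is the baseline whose convergence behavior that scheme is designed to preserve.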

Evaluation of a Sample-Pooling Technique in Estimating Bioavailability of a Compound for High-Throughput Lead Optimazation (혈장 시료 풀링을 통한 신약 후보물질의 흡수율 고효율 검색기법의 평가)

  • Yi, In-Kyong;Kuh, Hyo-Jeong;Chung, Suk-Jae;Lee, Min-Haw;Shim, Chang-Koo
    • Journal of Pharmaceutical Investigation / v.30 no.3 / pp.191-199 / 2000
  • Genomics is providing targets faster than we can validate them, and combinatorial chemistry is providing new chemical entities faster than we can screen them. Historically, the drug discovery cascade has been established as a sequential process initiated with a potency screening against a selected biological target. In this sequential process, pharmacokinetics was often regarded as a low-throughput activity. Typically, limited pharmacokinetic studies would be conducted prior to acceptance of a compound for safety evaluation, and, as a result, compounds often failed to reach clinical testing due to unfavorable pharmacokinetic characteristics. A new paradigm in drug discovery has emerged in which the entire sample collection is rapidly screened using robotized high-throughput assays at the outset of the program. Higher-throughput pharmacokinetics (HTPK) is being achieved through the introduction of new techniques, including automation for sample preparation and new experimental approaches. A number of in vitro and in vivo methods are being developed for HTPK. In vitro studies, in which many cell lines are used to screen absorption and metabolism, are generally faster than in vivo screening, and in this sense in vitro screening is often considered the real HTPK. Despite the elegance of the in vitro models, however, in vivo screening is always essential for final confirmation. Among the in vivo methods, the cassette dosing technique is believed to be applicable to screening the pharmacokinetics of many compounds at a time. The widespread use of liquid chromatography (LC) interfaced with mass spectrometry (MS) or tandem mass spectrometry (MS/MS) has made the cassette dosing technique feasible. Another approach to increasing the throughput of in vivo pharmacokinetic screening is to reduce the number of samples to be analyzed. Two common approaches are used for this purpose. First, samples from identical study designs but containing different drug candidates can be pooled to produce a single set of samples, reducing the number of samples to be analyzed. Second, for a single test compound, serial plasma samples can be pooled to produce a single composite sample for analysis. In this review, we examined whether the second method can be applied to practical screening of in vivo pharmacokinetics, using data from seven of our previous bioequivalence studies. For a given drug, equally spaced serial plasma samples were pooled to obtain a 'pooled concentration' for the drug. An area under the plasma drug concentration-time curve (AUC) was then calculated theoretically from the pooled concentration, and the predicted AUC value was statistically compared with the traditionally calculated AUC value. The comparison revealed that the sample pooling method generated reasonably accurate AUC values compared with those obtained by the traditional approach. It is especially noteworthy that this accuracy was obtained by analyzing only one sample instead of a number of samples, which requires significant manpower and time. Thus, we propose the sample pooling method as an alternative in vivo pharmacokinetic approach for the selection of potential lead(s) from combinatorial libraries.

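The arithmetic behind the pooling shortcut is simple: mixing equal aliquots of equally spaced samples yields a pooled concentration equal to the mean of the individual concentrations, so AUC can be approximated as that mean times the total sampling interval. The sketch below compares this estimate with the conventional trapezoidal AUC on a hypothetical concentration-time profile (the numbers are made up, not data from the seven studies).

```python
def auc_trapezoid(times, concs):
    """Conventional AUC: trapezoidal rule over the full concentration-time profile."""
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(zip(times, concs),
                                             zip(times[1:], concs[1:])))

def auc_pooled(times, concs):
    """Pooled-sample estimate: the pooled concentration of equal aliquots of
    equally spaced samples is the mean concentration, so
    AUC ~ mean concentration x total sampling interval (one assay per subject)."""
    pooled_conc = sum(concs) / len(concs)
    return pooled_conc * (times[-1] - times[0])

# Hypothetical absorption/elimination profile, sampled hourly for 8 h.
times = [0, 1, 2, 3, 4, 5, 6, 7, 8]
concs = [0.0, 8.5, 10.0, 8.0, 6.0, 4.4, 3.2, 2.3, 1.6]
```

On this profile the two estimates agree to within about 10%, which illustrates the review's point: one pooled assay can stand in for nine individual assays when only a ranking of candidates' exposure is needed.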