• Title/Summary/Keyword: Caching Strategy


An Empirical Study on the Construction Strategy of Web-caching Network (효과적인 웹-캐싱 네트웍 구축전략에 관한 실증 연구)

  • 이주헌;조병룡
    • The Journal of Information Technology and Database / v.8 no.2 / pp.41-60 / 2001
  • Despite the growth in Internet users, the demand for multimedia, large data files, and the resulting explosive growth in data traffic, there has been a lack of investment in the Middle Mile, the interconnection of the various networks, resulting in a worsening bottleneck effect. One strategy to overcome such network bottlenecks is the Content Delivery Network (CDN). A CDN achieves efficient delivery of large files not through physical improvement or expansion of network capacity, but by delivering the large content files that cause the bottlenecks from distributed servers. Since it is impracticable to physically expand network capacity to accommodate the growth in Internet traffic, a CDN, by storing CPs' contents at cache servers deployed in major ISPs' networks, is able to deliver requested contents to the requesting Web clients without data loss or long latency.
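
As a rough illustration of the delivery model described in this abstract, the sketch below (all class and function names are assumptions for illustration, not the paper's implementation) shows an ISP-side edge cache serving a client request locally on a hit and pulling from the distant origin server only on a miss.

```python
# Minimal sketch of an ISP-side CDN edge cache (hypothetical example): serve
# content locally when possible and fetch from the origin only on a miss.
class EdgeCache:
    def __init__(self, fetch_from_origin, capacity=1000):
        self.fetch_from_origin = fetch_from_origin  # callable: url -> bytes
        self.capacity = capacity
        self.store = {}                             # url -> cached content

    def get(self, url):
        if url in self.store:                       # hit: no Middle-Mile traffic
            return self.store[url]
        content = self.fetch_from_origin(url)       # miss: one trip to the origin
        if len(self.store) >= self.capacity:        # naive eviction, only for the sketch
            self.store.pop(next(iter(self.store)))
        self.store[url] = content
        return content
```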


Caching Replacement Strategy for mobile Information Service Environment (이동 정보서비스 환경에서의 캐슁 대체 전략)

  • 최인선;조기환
    • Proceedings of the Korean Information Science Society Conference / 2001.10c / pp.283-285 / 2001
  • 최근 급변하는 정보통신 산업과 인터넷 사용인구의 증가에 따라 다양한 응용 서비스들이 무선 인터넷상에서도 제공되고 있다. 하지만 무선 네트워크는 낮은 대역폭, 높은 지연과 트래픽 그리고 잦은 연결의 재설정 등은 이동사용자에게 커다란 장애요소로 인식되고 있다. 따라서 한번 검색되고 정보를 재활용하는 캐슁기법의 적용이 다양한 형태로 고려되고 있다. 본 논문에서는 정보의 사용 빈도수를 고려하여 일정한 비율 이하의 캐슁 정보들 중에서 사용자가 이용하고자 하는 반대 방향의 가장 먼 거리에 있는 정보를 대체하는 캐쉬 운용 전략을 제시한다. 그 결과로 사용자가 자주 이동하는 이동 정보서비스 환경에서 캐쉽 정보의 활용도를 높이고 이동 네트워크의 접속을 최소화하는 기초를 제공한다.
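
A minimal sketch of the replacement rule described in the abstract; the one-dimensional location model, field names, and threshold handling are assumptions for illustration: among cached items whose usage frequency falls below a threshold ratio, evict the one farthest behind the user's direction of travel.

```python
# Hedged sketch of the replacement rule described above (the 1-D road model and
# all names are illustrative assumptions, not the paper's code).
def choose_victim(cache, user_pos, user_direction, freq_ratio_threshold):
    """cache: dict item_id -> {'freq': float, 'pos': float}.
    user_direction: +1 or -1 along a one-dimensional road model."""
    # Candidates: items whose usage frequency falls below the threshold ratio.
    candidates = [(item_id, info) for item_id, info in cache.items()
                  if info['freq'] <= freq_ratio_threshold]
    if not candidates:
        candidates = list(cache.items())
    # Positive values lie behind the user, i.e. opposite to the travel direction.
    def behind_distance(info):
        return (user_pos - info['pos']) * user_direction
    # Evict the candidate lying farthest behind the direction of travel.
    return max(candidates, key=lambda kv: behind_distance(kv[1]))[0]
```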


A Caching Strategy based on Analysis of Access Pattern in Mobile Computing Environments (이동 컴퓨팅 환경에서 요구 패턴 분석에 기반한 캐쉬 대체 전략)

  • 이윤장;신동천;김도일
    • Proceedings of the Korean Information Science Society Conference / 2001.10a / pp.235-237 / 2001
  • In mobile computing environments, the cache replacement strategy has a large impact on system performance. Previous cache replacement strategies were mainly proposed on top of scheduling techniques that use the Broadcast Disk method in push-based environments, so they have difficulty performing well in pull-based environments that reflect diverse user requests. In this paper, we propose an efficient cache replacement strategy applicable to pull-based systems, based on an analysis of user request patterns. The proposed strategy considers not only the hit ratio but also the miss cost.


Multi-layer Caching Scheme Considering Sub-graph Usage Patterns (서브 그래프의 사용 패턴을 고려한 다중 계층 캐싱 기법)

  • Yoo, Seunghun;Jeong, Jaeyun;Choi, Dojin;Park, Jaeyeol;Lim, Jongtae;Bok, Kyoungsoo;Yoo, Jaesoo
    • The Journal of the Korea Contents Association / v.18 no.3 / pp.70-80 / 2018
  • Due to the recent development of social media and mobile devices, graph data have come to be used in various fields. In addition, caching techniques for reducing I/O costs in the processing of large-scale graph data have been studied. In this paper, we propose a multi-layer caching scheme that considers the connectivity of the graph, a characteristic of its topology, and the history of past sub-graph usage. The proposed scheme divides the cache into a Used Data Cache and a Prefetched Cache. The Used Data Cache maintains data with weights according to frequently used sub-graph patterns, while the Prefetched Cache maintains the not-yet-used neighbor data of recently used data. Past history information is used to extract the sub-graph patterns. Since frequently used sub-graphs have a high probability of being reused, they are cached; when memory is full, the data least likely to be used are replaced with new data. Through performance evaluation, we show that the proposed caching scheme outperforms existing cache management schemes.
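
A minimal sketch of the two-layer organization described in this abstract; the class structure, simple frequency weights, and prefetch handling are assumptions for illustration: frequently used vertices are kept in a weighted Used Data Cache, while unused neighbors of recently accessed vertices are pulled into a separate Prefetched Cache.

```python
# Rough sketch of the two-layer cache described above (names and the simple
# frequency weighting are illustrative assumptions, not the paper's code).
class MultiLayerGraphCache:
    def __init__(self, used_capacity, prefetch_capacity, graph):
        self.graph = graph              # dict: vertex -> list of neighbor vertices
        self.used = {}                  # vertex -> usage-pattern weight
        self.prefetched = {}            # vertex -> placeholder for prefetched data
        self.used_capacity = used_capacity
        self.prefetch_capacity = prefetch_capacity

    def access(self, vertex):
        # Promote from the prefetched layer, or record a fresh access.
        self.prefetched.pop(vertex, None)
        self.used[vertex] = self.used.get(vertex, 0) + 1     # simple frequency weight
        if len(self.used) > self.used_capacity:
            victim = min(self.used, key=self.used.get)        # evict least-weighted entry
            del self.used[victim]
        # Prefetch neighbors that are not cached in either layer yet.
        for nbr in self.graph.get(vertex, []):
            if nbr not in self.used and nbr not in self.prefetched:
                if len(self.prefetched) >= self.prefetch_capacity:
                    self.prefetched.pop(next(iter(self.prefetched)))  # drop oldest prefetch
                self.prefetched[nbr] = True
```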

A Cache Replacement Strategy based on the Analysis of Request Patterns in Mobile Computing Environments (이동 컴퓨팅 환경에서 요구 패턴 분석을 기반으로 하는 캐쉬 대체 전략)

  • 이윤장;신동천
    • Journal of KIISE:Software and Applications / v.30 no.7_8 / pp.780-791 / 2003
  • Caching is a useful technique for improving response time by reducing contention among requests in mobile computing environments with narrow bandwidth. In traditional cache-based systems, improving the hit ratio has usually been the main concern. In mobile computing environments, however, it is necessary to consider the cost of a cache miss as well as the hit ratio. In this paper, we propose a new cache replacement strategy for pull-based data dissemination systems and evaluate its performance through simulation. The proposed strategy considers popularity and waiting time together, so the page with the smallest value of popularity multiplied by waiting time is selected as the victim.
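
The victim-selection rule stated in the abstract can be sketched directly; the data layout below is an assumption for illustration: the page minimizing popularity multiplied by waiting time is evicted, so pages that are both unpopular and cheap to re-fetch leave the cache first.

```python
# Sketch of the victim selection described above (data layout assumed for
# illustration): evict the page with the smallest popularity * waiting_time.
def select_victim(cache):
    """cache: dict page_id -> {'popularity': float, 'waiting_time': float},
    where waiting_time approximates the cost of a miss on that page."""
    return min(cache, key=lambda p: cache[p]['popularity'] * cache[p]['waiting_time'])

# Tiny usage example with made-up numbers:
cache = {'a': {'popularity': 0.4, 'waiting_time': 2.0},
         'b': {'popularity': 0.1, 'waiting_time': 5.0}}
assert select_victim(cache) == 'b'   # 0.1 * 5.0 = 0.5 < 0.8, so 'b' is evicted
```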

Measurement of Short-term Temporal Locality Based on Request Interarrival Time (상호참조시간을 고려한 단기간 임시지역성 측정)

  • Kim, Yeong-Ill;Shim, Jae-Hong;Choi, Kyung-Hee;Jung, Gi-Hyun
    • The KIPS Transactions:PartC / v.11C no.1 / pp.63-74 / 2004
  • Temporal locality of Web server references is one of the important characteristics to consider in the design of a Web caching strategy, and it is important to measure it accurately. Various methods for estimating temporal locality have been proposed, but Web server designers still have trouble measuring it with tools that do not reflect the interarrival time of document requests. In this paper, we propose a measurement tool for short-term temporal locality based on request interarrival time, and discuss simulation results based on traces from the NLANR and NASA Web sites. The results show that the proposed tool estimates short-term temporal locality more accurately than a stack-based approach.
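
The abstract does not give the paper's exact metric, so the following is only a hedged illustration of an interarrival-time-based view of temporal locality: for each re-referenced document it records the elapsed time since the previous request to that document, and reports the fraction of re-references falling within a short-term horizon.

```python
# Hedged illustration only: the paper's actual measure is not stated in the
# abstract. Short-term temporal locality here is the fraction of repeated
# requests whose interarrival time falls within a short-term horizon.
def short_term_locality(trace, horizon_seconds=60.0):
    """trace: iterable of (timestamp_seconds, document_id) request records."""
    last_seen = {}
    rereferences = 0
    short_term = 0
    for t, doc in trace:
        if doc in last_seen:
            rereferences += 1
            if t - last_seen[doc] <= horizon_seconds:
                short_term += 1
        last_seen[doc] = t
    return short_term / rereferences if rereferences else 0.0
```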

An Efficient Caching Strategy in Data Broadcasting (데이터 방송 환경에서의 효율적인 캐슁 정책)

  • Kim, Su-Yeon;Choe, Yang-Hui
    • Journal of KIISE:Computer Systems and Theory / v.26 no.12 / pp.1476-1484 / 1999
  • Recently, many television broadcasters have tried to disseminate digital multimedia data in addition to the traditional content (the audio-visual stream). The broadcast data need to be cached by the client system to provide a reasonable response time for user requests. Previous studies assumed the dissemination of a fixed set of items, so their results are not suitable when the broadcast items change frequently. In this paper, we propose a cache management scheme for the client system that unconditionally admits received data into the cache and, when replacement is needed, chooses as the victim the page with the shortest remaining time until its next broadcast instance. The proposed scheme improves response time in situations where accesses are unlikely to be repeated and the probability distribution of user accesses is hard to predict: it significantly reduces the expected response time by minimizing the expected cache miss penalty, and it can be applied without difficulty in environments where broadcast providers use different scheduling algorithms.
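
A minimal sketch of the replacement rule described in this abstract; the broadcast-schedule lookup is an assumed interface, not the paper's code: every received item is admitted, and when space is needed the victim is the cached page whose next broadcast instance is nearest in time, since missing it incurs the smallest wait.

```python
# Sketch of the policy described above (the schedule lookup is an assumed
# interface): admit every received item and evict the page that will be
# rebroadcast soonest, because a miss on it costs the least waiting time.
def admit(cache, capacity, page_id, data, next_broadcast_time, now):
    """cache: dict page_id -> data.
    next_broadcast_time(page_id, now) -> absolute time of the page's next broadcast."""
    if page_id in cache:
        cache[page_id] = data
        return
    if len(cache) >= capacity:
        victim = min(cache, key=lambda p: next_broadcast_time(p, now))
        del cache[victim]
    cache[page_id] = data
```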

A study on the ways for differentiation of domestic car sharing service (국내 카셰어링(Car Sharing) 서비스의 차별화를 위한 방향 연구)

  • Kim, So-Hyeon;Lee, Dong-Min
    • Journal of the Korea Convergence Society / v.9 no.3 / pp.181-186 / 2018
  • Car-sharing services are one of the successful models of the sharing economy. A car-sharing service is a type of car rental in which the cost is charged by the minute and a car can be booked and returned easily at any time through a smartphone. Experts predict that car-sharing services are likely to dominate the auto market in the future by reducing consumers' car purchase costs and alleviating the environmental problems caused by vehicles. Therefore, a differentiated service strategy is needed for car-sharing companies to remain competitive. In this study, we surveyed differentiation cases by comparing and analyzing domestic car-sharing companies, along with customized services tailored to the user's situation and in-vehicle infotainment suited to the vehicle type. Based on the results, the proposed changes to the service method, provision of customized services, and application of new platforms are expected to be developed in more detail and depth.

Resource Allocation for Heterogeneous Service in Green Mobile Edge Networks Using Deep Reinforcement Learning

  • Sun, Si-yuan;Zheng, Ying;Zhou, Jun-hua;Weng, Jiu-xing;Wei, Yi-fei;Wang, Xiao-jun
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.7 / pp.2496-2512 / 2021
  • The requirements of emerging services for powerful computing capability, high capacity, low latency, and low energy consumption pose severe challenges to the fifth-generation (5G) network. As a promising paradigm, mobile edge networks can provide services in proximity to users by deploying computing components and caches at the edge, which can effectively decrease service delay. However, the coexistence of heterogeneous services and the sharing of limited resources lead to competition among services for multiple resources. This paper considers two typical heterogeneous services, computing services and content delivery services; to configure resources properly, it is crucial to develop effective offloading and caching strategies. Considering the high energy consumption of 5G base stations, this paper adopts a hybrid energy supply model combining the traditional power grid and green energy, so it is necessary to design a reasonable association mechanism that allocates more service load to base stations rich in green energy and thereby improves green energy utilization. The paper formulates the joint optimization of computing offloading, caching, and resource allocation for heterogeneous services, with the objective of minimizing on-grid power consumption under the constraints of limited resources and QoS guarantees. Since the joint optimization problem is a mixed-integer nonlinear program that cannot be solved directly, a deep reinforcement learning method is used to learn the optimal strategy through extensive training. Extensive simulation experiments show that, compared with other schemes, the proposed scheme can allocate resources to heterogeneous services according to the green energy distribution, effectively reducing traditional energy consumption.
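
The abstract describes the problem only at a high level, so the following is a hedged sketch of how one environment step of such a formulation might look; all variable names and the simple power model are assumptions, and a real implementation would train a deep Q-network or actor-critic policy over an environment like this rather than use it directly.

```python
# Hedged sketch only: the paper's actual MDP is not given in the abstract.
# Illustrative state/action/reward for joint association, caching and offloading,
# rewarding low on-grid power use (green energy at the base station is spent first).
def step(state, action, request_load, green_harvest):
    """state: dict bs_id -> {'green_energy': float, 'cached': set of item ids}.
    action: {'assoc': bs_id, 'cache_item': item_id or None, 'offload': bool}."""
    bs = state[action['assoc']]
    # Serving the request costs energy; a cached item avoids backhaul cost.
    energy_needed = request_load * (0.5 if action['cache_item'] in bs['cached'] else 1.0)
    if action['offload']:
        energy_needed *= 1.2                      # extra compute energy at the edge
    green_used = min(bs['green_energy'], energy_needed)
    on_grid = energy_needed - green_used          # shortfall drawn from the power grid
    bs['green_energy'] += green_harvest - green_used
    if action['cache_item'] is not None:
        bs['cached'].add(action['cache_item'])
    reward = -on_grid                             # minimize traditional (on-grid) power
    return state, reward
```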

NOD Caching Strategy using User-Preference Pattern for Time-Window (구간별 사용자 요구 패턴을 이용한 NOD에서의 캐싱 방법)

  • 최태욱;박용운;김영주;정기동
    • Proceedings of the Korea Multimedia Society Conference / 1998.04a / pp.71.1-75 / 1998
  • NOD (News on Demand) data have a shorter life cycle than VOD data, are accessed more heavily, and their access patterns can change over time. As with VOD data, accesses to NOD news articles concentrate on particular articles, and these popular articles can change with the time of day. In this paper, the exponential smoothing method, a time-series analysis technique, is used to predict such changes in popularity. Time is divided into windows, and the access patterns of previous windows are analyzed to predict accesses in the next window. A caching policy is then built on this prediction: articles are placed in the cache in descending order of predicted popularity. For real-time multimedia data, the sheer volume of data makes the computational overhead large, so a static caching strategy is used in which the cache is rearranged only once per window. Traditional block-level caching is not suitable for multimedia data, so an article-level cache structure is proposed: because users issue requests at the article level, data must be cached at the article level to be reused.
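
A minimal sketch of the per-window prediction and placement described in this abstract; the smoothing constant and data layout are assumptions for illustration: each article's popularity forecast is updated by exponential smoothing from the finished window's access counts, and the cache for the next window is statically filled with the highest-forecast articles.

```python
# Sketch of the per-window policy described above (alpha and the data layout
# are illustrative assumptions): forecast each article's popularity with
# exponential smoothing, then statically cache the top articles for the window.
def end_of_window(forecasts, window_counts, cache_slots, alpha=0.3):
    """forecasts: article_id -> previous smoothed popularity.
    window_counts: article_id -> access count observed in the finished window."""
    for article, count in window_counts.items():
        prev = forecasts.get(article, 0.0)
        forecasts[article] = alpha * count + (1 - alpha) * prev   # exponential smoothing
    # Rearrange the cache once per window: keep the articles predicted to be hottest.
    ranked = sorted(forecasts, key=forecasts.get, reverse=True)
    return set(ranked[:cache_slots])
```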
