Published May 28, 2023 | Version v1
Conference paper

Optimistic Online Caching for Batched Requests


Description

In this paper we study online caching problems where predictions of future requests, e.g., provided by a machine learning model, are available. We consider different optimistic caching policies which are based on the Follow-The-Regularized-Leader algorithm and enjoy strong theoretical guarantees in terms of regret. These new policies have a higher computational cost than classic ones like LRU and LFU, as each update of the cache state requires solving a constrained optimization problem. We then study their performance when the cache is updated less frequently, in order to amortize the update cost over time or over multiple requests.
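As a rough illustration of the mechanism described above, the sketch below implements a fractional, FTRL-style cache with an L2 regularizer: request gradients are accumulated over a batch, an optional prediction of upcoming requests is added (the "optimistic" step), and the cache state is recovered by projecting onto the capacity constraint. This is a minimal sketch under stated assumptions, not the paper's implementation: the class name, the learning rate `eta`, and the bisection-based projection routine are all illustrative choices.

```python
import numpy as np

def project_capped_simplex(x, capacity, tol=1e-9):
    # Euclidean projection of x onto {y in [0,1]^N : sum(y) = capacity},
    # found by bisection on the shift tau with sum(clip(x - tau, 0, 1)) = capacity.
    lo, hi = x.min() - 1.0, x.max()
    while hi - lo > tol:
        tau = 0.5 * (lo + hi)
        if np.clip(x - tau, 0.0, 1.0).sum() > capacity:
            lo = tau  # shifted mass still exceeds capacity: push tau up
        else:
            hi = tau
    return np.clip(x - 0.5 * (lo + hi), 0.0, 1.0)

class OptimisticFTRLCache:
    # Fractional cache state updated by optimistic FTRL with an L2 regularizer.
    # catalog_size, capacity, and eta are illustrative parameters; the paper's
    # policies may use a different regularizer and tuning (assumption).
    def __init__(self, catalog_size, capacity, eta=0.1):
        self.grad_sum = np.zeros(catalog_size)  # cumulative request gradients
        self.capacity = capacity
        self.eta = eta
        # Start from the uniform feasible state.
        self.state = np.full(catalog_size, capacity / catalog_size)

    def update(self, batch, prediction=None):
        # Serve a batch of request indices, then recompute the state once.
        # Batching amortizes the single constrained projection (the expensive
        # step) over len(batch) requests. `prediction` is an assumed estimate
        # of the next batch's aggregate gradient, e.g., from an ML model.
        hits = float(sum(self.state[i] for i in batch))  # fractional hits
        for i in batch:
            self.grad_sum[i] += 1.0  # gradient of the linear hit reward
        target = self.grad_sum + (prediction if prediction is not None else 0.0)
        # Optimistic FTRL step: project eta times the prediction-augmented
        # cumulative gradient onto the capacity-constrained set.
        self.state = project_capped_simplex(self.eta * target, self.capacity)
        return hits

# Example: batches of 20 requests over a catalog of 100 items, cache size 10.
rng = np.random.default_rng(0)
cache = OptimisticFTRLCache(catalog_size=100, capacity=10)
requests = rng.zipf(1.3, size=1000) % 100
total_hits = sum(cache.update(requests[t:t + 20]) for t in range(0, 1000, 20))
print(f"fractional hit ratio: {total_hits / 1000:.2f}")
```

The batch size trades off computation against reactivity: larger batches mean fewer projections but a cache state that lags the request stream, which is the trade-off the paper evaluates.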

Audience

International

Additional details

Created:
January 5, 2024
Modified:
January 5, 2024