No-Regret Caching with Noisy Request Estimates - Inria - Institut national de recherche en sciences et technologies du numérique
Conference Paper, Year: 2023

No-Regret Caching with Noisy Request Estimates

Abstract

Online learning algorithms have been successfully used to design caching policies with regret guarantees. Existing algorithms assume that the cache knows the exact request sequence, but this may not be feasible in high-load and/or memory-constrained scenarios, where the cache may have access only to sampled requests or to approximate request counters. In this paper, we propose the Noisy-Follow-the-Perturbed-Leader (NFPL) algorithm, a variant of the classic Follow-the-Perturbed-Leader (FPL) algorithm for the case where request estimates are noisy, and we show that the proposed solution has sublinear regret under specific conditions on the request estimator. The experimental evaluation compares the proposed solution against classic caching policies and validates the proposed approach on both synthetic and real request traces.
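The FPL-style caching idea the abstract describes can be sketched in a few lines: maintain (noisy) cumulative request counts per file, add a random perturbation at each step, and cache the files with the largest perturbed counts. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: the Gaussian perturbation schedule, the noise model on the counters, and the toy file popularities are all choices made here for the sketch.

```python
import random

def nfpl_cache(noisy_counts, k, eta, rng):
    """One NFPL-style step (sketch): add a scaled random perturbation to
    each noisy cumulative request count and cache the k files with the
    largest perturbed counts."""
    perturbed = {f: c + eta * rng.gauss(0, 1) for f, c in noisy_counts.items()}
    return set(sorted(perturbed, key=perturbed.get, reverse=True)[:k])

# Toy run: 3 files, cache of size 1, file 0 is the most popular.
rng = random.Random(0)
noisy_counts = {0: 0.0, 1: 0.0, 2: 0.0}
hits, T = 0, 2000
for t in range(T):
    # Perturbation scale growing like sqrt(t) is a common FPL choice
    # (an assumption here, not the paper's exact schedule).
    cache = nfpl_cache(noisy_counts, k=1, eta=(t + 1) ** 0.5, rng=rng)
    req = rng.choices([0, 1, 2], weights=[0.7, 0.2, 0.1])[0]  # hidden popularity
    hits += req in cache
    # The cache observes only a noisy, unbiased estimate of each request
    # (a modelling assumption for this sketch, not the paper's estimator).
    noisy_counts[req] += 1 + 0.5 * rng.gauss(0, 1)

print(f"hit rate: {hits / T:.2f}")
```

Because the perturbation grows slower than the gap between the most popular file's count and the others, the policy settles on caching the true leader, and the hit rate approaches that file's request probability.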
Main file: onlineCaching.pdf (631.29 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-04318435 , version 1 (01-12-2023)
hal-04318435 , version 2 (24-12-2023)

Identifiers

  • HAL Id : hal-04318435 , version 2

Cite

Younes Ben Mazziane, Francescomaria Faticanti, Giovanni Neglia, Sara Alouf. No-Regret Caching with Noisy Request Estimates. IEEE VCC 2023 - IEEE Virtual Conference on Communications, Nov 2023, New York (ONLINE), United States. ⟨hal-04318435v2⟩