LeadCache: Regret-Optimal Caching in Networks

Debjit Paria, Abhishek Sinha

We consider a set-valued online prediction problem in the context of network caching. Assume that users are connected to a number of caches via a bipartite network. At each time slot, every user requests some file from a large catalog. A user's request is met if the requested file is cached in at least one of the caches connected to that user. The objective is to predict and optimally store files on the caches so as to maximize the total number of cache hits. We propose $\texttt{LeadCache}$, an online caching policy based on the Follow-the-Perturbed-Leader paradigm. We show that the policy is regret-optimal up to a factor of $\tilde{O}(n^{3/8})$, where $n$ is the number of users. We implement the policy by designing a new linear-time Pipage rounding algorithm. Under an additional Strong-Law-type assumption, we show that the total number of file fetches under $\texttt{LeadCache}$ remains almost surely finite. We also derive a tight regret lower bound using results from graph coloring. We conclude that the proposed learning-based caching policy decisively outperforms classical policies both theoretically and empirically.
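To make the Follow-the-Perturbed-Leader idea concrete, below is a minimal illustrative sketch for a single cache: cumulative request counts are perturbed with Gaussian noise and the highest-scoring files are cached. This is only a toy version, not the paper's actual $\texttt{LeadCache}$ policy, which couples many caches through the bipartite network and rounds a fractional solution via Pipage rounding; the function name, learning-rate schedule, and parameters here are illustrative assumptions.

```python
import numpy as np

def ftpl_cache(cumulative_requests, capacity, eta, rng):
    """Choose which files to cache via a Follow-the-Perturbed-Leader step.

    Illustrative single-cache simplification: perturb the cumulative
    request counts and keep the top-`capacity` files. (The full LeadCache
    policy optimizes over the bipartite user-cache network instead.)
    """
    noise = rng.standard_normal(len(cumulative_requests))
    perturbed = cumulative_requests + eta * noise
    # Cache the `capacity` files with the largest perturbed scores.
    return set(np.argpartition(-perturbed, capacity)[:capacity].tolist())

# Toy usage: a catalog of 10 files and a cache of size 3.
rng = np.random.default_rng(0)
counts = np.zeros(10)
hits = 0
for t in range(100):
    cached = ftpl_cache(counts, capacity=3, eta=np.sqrt(t + 1), rng=rng)
    req = int(rng.integers(0, 10))  # a user requests a file
    hits += req in cached           # hit if the file is cached
    counts[req] += 1                # update cumulative request counts
```

The noise magnitude `eta` grows like $\sqrt{t}$ here, a standard FTPL choice that trades off following the empirical leader against stability of the cached set.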
