#### The Capacity of Cache Aided Private Information Retrieval

##### Ravi Tandon

The problem of cache-aided private information retrieval (PIR) is considered, in which a user wishes to privately retrieve one out of $K$ messages, each of size $L$ bits, from $N$ distributed databases. The user has a local cache of storage $SL$ bits, which can be used to store any function of the $K$ messages. The main contribution of this work is the exact characterization of the capacity of cache-aided PIR as a function of the storage parameter $S$. In particular, for a given cache storage parameter $S$, the information-theoretically optimal download cost $D^{*}(S)/L$ (the inverse of capacity) is shown to be equal to $(1- \frac{S}{K})\left(1+ \frac{1}{N}+ \ldots + \frac{1}{N^{K-1}}\right)$. Special cases of this result correspond to the setting $S=0$, for which the optimal download cost was shown by Sun and Jafar to be $\left(1+ \frac{1}{N}+ \ldots + \frac{1}{N^{K-1}}\right)$, and the setting $S=K$, i.e., a cache large enough to store all messages locally, for which the optimal download cost is $0$. The intermediate points $S\in (0, K)$ can be readily achieved through a simple memory-sharing based PIR scheme. The key technical contribution of this work is the converse, i.e., a lower bound on the download cost as a function of storage $S$, which shows that memory sharing is information-theoretically optimal.
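As a quick numerical illustration of the stated closed form, the sketch below (function name and parameter choices are illustrative, not from the paper) evaluates $D^{*}(S)/L = (1-\frac{S}{K})(1+\frac{1}{N}+\ldots+\frac{1}{N^{K-1}})$ and checks the two special cases mentioned in the abstract:

```python
def download_cost(S: float, K: int, N: int) -> float:
    """Optimal normalized download cost D*(S)/L for cache-aided PIR:
    (1 - S/K) * sum_{i=0}^{K-1} N^{-i}, per the closed form in the abstract."""
    return (1 - S / K) * sum(N ** -i for i in range(K))

# Special cases (example values K=3 messages, N=2 databases):
# S = 0 recovers the Sun-Jafar download cost 1 + 1/N + 1/N^2 = 1.75,
# S = K means the whole library is cached, so no download is needed.
print(download_cost(0, K=3, N=2))  # 1.75
print(download_cost(3, K=3, N=2))  # 0.0
```

Note that the expression is linear in $S$, which is exactly why memory sharing between the $S=0$ scheme and the $S=K$ (full caching) scheme achieves every intermediate point.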
