John C. Duchi, Michael I. Jordan, Martin J. Wainwright

We study statistical risk minimization problems under a privacy model in which the data is kept confidential even from the learner. In this local privacy framework, we establish sharp upper and lower bounds on the convergence rates of statistical estimation procedures. As a consequence, we exhibit a precise tradeoff between the amount of privacy the data preserves and the utility, as measured by convergence rate, of any statistical estimator or learning procedure.
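As a concrete (and purely illustrative) instance of the local privacy setting described above, consider randomized response: each participant perturbs their own datum before the learner ever sees it, and the learner must estimate from the noisy reports alone. The sketch below is an assumption-laden toy example, not the estimators or mechanisms analyzed in the paper; it only shows qualitatively how estimation error grows as the privacy parameter epsilon shrinks.

```python
import numpy as np

def randomized_response(bits, epsilon, rng):
    """Each participant flips their private bit with probability 1/(1+e^eps),
    so the learner only ever observes the perturbed reports (epsilon-local privacy)."""
    flip_prob = 1.0 / (1.0 + np.exp(epsilon))
    flips = rng.random(bits.shape) < flip_prob
    return np.where(flips, 1 - bits, bits)

def debiased_mean(reports, epsilon):
    """Unbiased estimate of the true mean, correcting for the known flip probability."""
    flip_prob = 1.0 / (1.0 + np.exp(epsilon))
    return (reports.mean() - flip_prob) / (1.0 - 2.0 * flip_prob)

# Toy experiment (hypothetical parameters): estimate a Bernoulli mean
# from locally privatized data at several privacy levels.
rng = np.random.default_rng(0)
true_p, n = 0.3, 100_000
data = (rng.random(n) < true_p).astype(int)

for eps in [0.1, 0.5, 1.0, 4.0]:
    reports = randomized_response(data, eps, rng)
    est = debiased_mean(reports, eps)
    print(f"eps={eps:4.1f}  estimate={est:.4f}  abs error={abs(est - true_p):.4f}")
```

Running this, smaller epsilon (stronger privacy) yields noisier estimates for the same sample size, the kind of privacy-utility tradeoff the paper quantifies sharply via convergence rates.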
