Part of Advances in Neural Information Processing Systems 32 (NeurIPS 2019)
Matthew Joseph, Janardhan Kulkarni, Jieming Mao, Steven Z. Wu
We study a basic private estimation problem: each of $n$ users holds a single sample drawn i.i.d. from an unknown Gaussian distribution $N(\mu,\sigma^2)$, and the goal is to estimate $\mu$ while guaranteeing local differential privacy for each user. As minimizing the number of rounds of interaction is important in the local setting, we provide adaptive two-round solutions and nonadaptive one-round solutions to this problem. We match these upper bounds with an information-theoretic lower bound showing that our accuracy guarantees are tight up to logarithmic factors for all sequentially interactive locally private protocols.
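For intuition about the problem setup, below is a minimal sketch of a generic one-round (noninteractive) locally private mean estimator: each user clips their sample to an assumed a priori range $[-B, B]$ and adds Laplace noise calibrated to $\varepsilon$-local differential privacy, and the server averages the noisy reports. The clipping bound $B$, the Laplace baseline, and all function names are illustrative assumptions; this is not the protocol proposed in the paper.

```python
import numpy as np

def local_report(x, epsilon, B):
    """One user's local randomizer: clip to [-B, B], then add Laplace noise.

    The clipped value has sensitivity 2B, so Laplace noise with scale
    2B / epsilon yields epsilon-local differential privacy for this user.
    """
    clipped = np.clip(x, -B, B)
    return clipped + np.random.laplace(scale=2 * B / epsilon)

def estimate_mean(samples, epsilon, B):
    """Server-side aggregation: average the users' independent noisy reports."""
    reports = [local_report(x, epsilon, B) for x in samples]
    return float(np.mean(reports))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mu, sigma, n = 1.5, 1.0, 100_000
    samples = rng.normal(mu, sigma, size=n)
    # B is an assumed loose a priori bound covering the data range.
    print(estimate_mean(samples, epsilon=1.0, B=10.0))
```

This baseline needs a conservative bound $B$ known in advance, which inflates the noise; adaptive two-round protocols of the kind studied in the paper can first use part of the privacy budget to localize $\mu$ and then estimate it more accurately in a second round.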