Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
This paper proposes “oblivious differential privacy” (ODP), a notion of privacy-preserving computation that accounts for the leakage that memory access patterns can reveal about the input data when a differentially private mechanism is run inside a trusted execution environment (TEE). The paper presents algorithms for several basic tasks in this model; most of these algorithms use ideas and building blocks developed elsewhere. The main value of this paper is proposing a robust definition of privacy capable of taking into account particularities of the computing architecture where the mechanism is executed, and showing that several basic computations can be performed efficiently and privately in this model. ODP has the potential to become a standard model for research in privacy-preserving computation inside TEEs.
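To illustrate the kind of leakage the definition targets, consider the contrast between a data-dependent and a data-independent memory access pattern. This is a minimal sketch, not taken from the paper; the function names are hypothetical and the "oblivious" variant is only a toy model of the idea, ignoring real side channels such as timing.

```python
def histogram_leaky(data, num_bins):
    """Data-dependent writes: the sequence of addresses touched
    reveals each element's bin to an adversary observing memory."""
    counts = [0] * num_bins
    for x in data:
        counts[x] += 1  # the address accessed depends on the secret value x
    return counts

def histogram_oblivious(data, num_bins):
    """Toy oblivious variant: every bin is touched for every element,
    so the access pattern is independent of the input values."""
    counts = [0] * num_bins
    for x in data:
        for b in range(num_bins):
            # same sequence of addresses is touched regardless of x
            counts[b] += 1 if b == x else 0
    return counts
```

Both functions compute the same histogram, but only the second has an access pattern independent of the input; ODP, as proposed, would then additionally require the released output itself to satisfy differential privacy.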