
Differential Privacy for Free? Harnessing the Noise In Approximate HE

Hosted By
Rand H. and 3 others

Details

Abstract

In this meetup, we'll make a connection between two important ideas in the Privacy Enhancing Technologies ecosystem: Homomorphic Encryption and Differential Privacy. While Homomorphic Encryption ensures that sensitive data is not exposed during computation, Differential Privacy guarantees that each data subject can maintain their privacy when we share the result of that computation.

During this talk, we'll look at noise growth in Homomorphic Encryption (HE), and investigate the possibility that this inherent noise can give Differential Privacy (DP) "for free". We will recap what we mean by HE, noise, and DP, before examining new results on the DP guarantees of the Approximate HE setting. We'll finish by applying our results to a case study: Ridge Regression Training via Gradient Descent.
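To make the case study concrete, here is a minimal sketch of ridge regression training via gradient descent, with Gaussian noise added to each step to stand in for the approximation noise an Approximate HE scheme would introduce. This is purely illustrative: the noise scale `sigma` is a hypothetical parameter, not a calibrated DP guarantee, and the actual analysis in the talk and paper is more involved.

```python
import numpy as np

def noisy_ridge_gd(X, y, lam=0.1, lr=0.1, iters=200, sigma=0.01, seed=0):
    """Ridge regression via gradient descent with per-step Gaussian noise.

    The noise mimics (very loosely) the inherent approximation noise of
    an Approximate HE scheme; `sigma` here is an illustrative assumption,
    not a value derived from any HE parameter set or DP accounting.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        # Gradient of the ridge objective (1/2n)||Xw - y||^2 + (lam/2)||w||^2
        grad = X.T @ (X @ w - y) / n + lam * w
        # Noisy update: the Gaussian term plays the role of scheme noise
        w -= lr * (grad + rng.normal(0.0, sigma, size=d))
    return w

# Toy demonstration on synthetic data
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)
w_hat = noisy_ridge_gd(X, y)
```

Despite the injected noise, the recovered weights land close to the true ones (up to the shrinkage the ridge penalty introduces); the question the talk examines is when such inherent noise is already enough to certify a formal DP guarantee.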

About the speaker

Tabitha Ogilvie is a PhD student in the Information Security Group at Royal Holloway, University of London, and has just completed a year-long internship at Intel as part of the Security and Privacy Research Group within Intel Labs. Her research areas are Privacy Enhancing Technologies and Privacy Preserving Machine Learning, with a focus on Homomorphic Encryption.

Relevant paper / poster
📄 https://eprint.iacr.org/2023/701

Online event
🔗 https://zama-ai.zoom.us/j/81793675423

Never miss an update

  1. The newsletter where we post community announcements: https://fheorg.substack.com/
  2. The discord server where you can discuss FHE related topics with the community: https://discord.fhe.org

Make sure to join either (or both) of these to stay informed about future events!

FHE.org - Homomorphic Encryption & Secure Computation