In 2019 Google open-sourced its first differential privacy library, enabling developers and organizations to learn from the majority of their data while ensuring that the results do not allow any individual's data to be distinguished or re-identified. In this codelab you'll learn how to produce statistics that preserve users' privacy by using differentially private aggregations.
This event is suitable for developers, data scientists, business analysts, and product managers who analyze personally identifiable data to improve their product offerings, or who plan to publish statistics based on datasets that require a robust anonymization technique to protect their users' privacy and prevent data leaks.
Participants with some familiarity with Go, the open-source programming language, and Apache Beam will have an easier time following the codelab and understanding the computational models, which use differentially private aggregations. No technical experience is required beyond the ability to read and write Go.
- Christiane Ahlheim, Data Scientist: Christiane is a data scientist in gTech, helping Google’s customers to reach peak marketing performance. One of her focus areas is privacy-first data science, where she’s developing new solutions to help clients thrive.
- Ehsaan Qadir, Customer Solutions Engineer: Ehsaan is a Customer Solutions Engineer at Google. He has worked with many customers across various industries to solve their most pressing technical challenges. He is fascinated by good design, both in software and user experiences.