Academic Keynote: Differentially Private Covariance-Adaptive Mean Estimation, Adam Smith (BU)  @GoogleTechTalks
Google TechTalks | Uploaded February 2022 | Updated October 2024.
A Google TechTalk, presented by Adam Smith, 2021/11/9
ABSTRACT: Differentially Private Covariance-Adaptive Mean Estimation

Covariance-adaptive mean estimation is a fundamental problem in statistics, where we are given n i.i.d. samples from a d-dimensional distribution with mean $\mu$ and covariance $\Sigma$ and the goal is to find an estimator $\hat\mu$ with small error $\|\hat\mu-\mu\|_{\Sigma}\leq \alpha$, where $\|\cdot\|_{\Sigma}$ denotes the Mahalanobis distance. (We call this "covariance-adaptive" since the accuracy metric depends on the data distribution.)
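To make the accuracy metric concrete, here is a minimal sketch (not from the talk) of the Mahalanobis error $\|\hat\mu-\mu\|_{\Sigma}$; the function name and example values are illustrative:

```python
import numpy as np

def mahalanobis_error(mu_hat, mu, Sigma):
    """Mahalanobis distance ||mu_hat - mu||_Sigma = sqrt((mu_hat-mu)^T Sigma^{-1} (mu_hat-mu))."""
    diff = mu_hat - mu
    return float(np.sqrt(diff @ np.linalg.solve(Sigma, diff)))

# Example: an estimate off by one standard deviation along the first axis
# has Mahalanobis error 1, regardless of the raw (Euclidean) scale.
Sigma = np.diag([4.0, 1.0])
mu = np.zeros(2)
mu_hat = np.array([2.0, 0.0])          # Euclidean error 2, but sigma_0 = 2
print(mahalanobis_error(mu_hat, mu, Sigma))  # 1.0
```

This is why the metric is "covariance-adaptive": the same Euclidean deviation counts for less in directions where the distribution has larger variance.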

It is known that the empirical mean of the dataset achieves this guarantee if we are given at least $n=\Omega(d/\alpha^2)$ samples. Unfortunately, the empirical mean and other statistical estimators can reveal sensitive information about the samples of the training dataset. To protect the privacy of the individuals who participate in the dataset, we study statistical estimators which satisfy differential privacy, a condition that has become a standard criterion for individual privacy in statistics and machine learning.
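The non-private baseline can be checked empirically. The following sketch (illustrative; constants and the seed are arbitrary) draws $n = O(d/\alpha^2)$ samples from an anisotropic Gaussian and measures the empirical mean's Mahalanobis error:

```python
import numpy as np

rng = np.random.default_rng(0)
d, alpha = 20, 0.5
n = int(4 * d / alpha**2)               # n = O(d / alpha^2) samples

# Anisotropic Gaussian: the guarantee is in the Mahalanobis, not Euclidean, norm.
A = rng.standard_normal((d, d))
Sigma = A @ A.T + np.eye(d)
mu = rng.standard_normal(d)
X = rng.multivariate_normal(mu, Sigma, size=n)

diff = X.mean(axis=0) - mu
err = float(np.sqrt(diff @ np.linalg.solve(Sigma, diff)))
print(err)  # concentrated around sqrt(d/n) = alpha/2, so below alpha w.h.p.
```

Since $\mathbb{E}\,\|\bar X - \mu\|_{\Sigma}^2 = d/n$, taking $n \gtrsim d/\alpha^2$ makes the typical error at most $\alpha$.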

We present two new differentially private mean estimators for d-dimensional (sub)Gaussian distributions with unknown covariance whose sample complexity is optimal up to logarithmic factors and matches the non-private one in many parameter regimes. Previous estimators with the same guarantee either require strong a priori bounds on the covariance matrix or require $\Omega(d^{3/2})$ samples.
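For contrast with the covariance-adaptive approach, here is a sketch of the classical Gaussian-mechanism baseline (this is *not* the paper's algorithm): it needs an a priori clipping bound on the data, and its noise is spherical rather than shaped to $\Sigma$. The function name and parameter choices are illustrative:

```python
import numpy as np

def gaussian_mech_mean(X, clip_norm, eps, delta, rng):
    """Baseline (eps, delta)-DP mean: clip each sample to Euclidean norm
    <= clip_norm, average, then add spherical Gaussian noise calibrated
    to the L2 sensitivity of the clipped mean."""
    n, d = X.shape
    norms = np.maximum(np.linalg.norm(X, axis=1, keepdims=True), 1e-12)
    Xc = X * np.minimum(1.0, clip_norm / norms)   # per-sample clipping
    sens = 2.0 * clip_norm / n                    # L2 sensitivity of the mean
    sigma = sens * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return Xc.mean(axis=0) + rng.normal(0.0, sigma, size=d)

rng = np.random.default_rng(1)
X = rng.normal(0.0, 1.0, size=(10_000, 5))        # true mean is 0
print(gaussian_mech_mean(X, clip_norm=10.0, eps=1.0, delta=1e-5, rng=rng))
```

The baseline's accuracy degrades when `clip_norm` is a loose a priori bound on the covariance scale; removing that requirement, while keeping near-optimal sample complexity, is the point of the new estimators.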

Based on the paper arxiv.org/pdf/2106.13329.pdf, which will appear as a spotlight paper at NeurIPS 2021 and is joint work with Gavin Brown, Marco Gaboardi, Jonathan Ullman, and Lydia Zakynthinou.

About the Speaker: Adam Smith, Boston University
Adam Smith is a professor of computer science at Boston University. From 2007 to 2017, he served on the faculty of the Computer Science and Engineering Department at Penn State. His research interests lie in data privacy and cryptography, and their connections to machine learning, statistics, information theory, and quantum computing. He obtained his Ph.D. from MIT in 2004 and has held postdoc and visiting positions at the Weizmann Institute of Science, UCLA, Boston University and Harvard. He received a Presidential Early Career Award for Scientists and Engineers (PECASE) in 2009; a Theory of Cryptography Test of Time award in 2016; the Eurocrypt 2019 Test of Time award; and the 2017 Gödel Prize.

For more information about the workshop: events.withgoogle.com/2021-workshop-on-federated-learning-and-analytics/#content
