5. Continuized Accelerations of Deterministic and Stochastic Gradient Descents, and of Gossip Algorithms, by Mathieu Even, @RaphalBerthier1, @BachFrancis, Nicolas Flammarion, Hadrien Hendrikx, Pierre Gaillard, Laurent Massoulié, Adrien Taylor
New paper on arXiv: "Efficient Mean Estimation with Pure Differential Privacy via a Sum-of-Squares Exponential Mechanism," with @Samuel_BKH and @mahbodm_.
Finally resolves an open problem on my mind since April 2019 (~2.5 years ago).
We give the 1st algorithm for mean estimation which is simultaneously:
-(ε,0)-differentially private
-O(d) sample complexity
-poly time
The fact that we didn't have such an algorithm before suggests that something was missing in our understanding of multivariate private estimation. 2/n
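For intuition, here is the textbook one-dimensional baseline: (ε,0)-DP mean estimation via the Laplace mechanism. This is NOT the paper's Sum-of-Squares exponential mechanism, just a minimal sketch of what pure DP means for mean estimation; the function name and parameters are illustrative. The hard part the paper addresses is getting O(d) sample complexity in d dimensions, where naive noise addition falls short.

```python
import numpy as np

def private_mean_laplace(x, eps, lo=0.0, hi=1.0, rng=None):
    """(eps, 0)-DP estimate of the mean of values clipped to [lo, hi].

    Classic Laplace-mechanism baseline, not the paper's SoS mechanism:
    in d dimensions, approaches like this do not achieve the O(d)
    sample complexity that the paper obtains.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(x, dtype=float), lo, hi)
    n = len(x)
    # Changing a single record moves the clipped mean by at most (hi - lo) / n.
    sensitivity = (hi - lo) / n
    # Laplace noise calibrated to sensitivity / eps gives pure (eps, 0)-DP.
    noise = rng.laplace(scale=sensitivity / eps)
    return x.mean() + noise

# Illustrative usage with a fixed seed so the output is reproducible.
est = private_mean_laplace(np.full(1000, 0.5), eps=1.0, rng=np.random.default_rng(0))
```

With n = 1000 points in [0, 1] and ε = 1, the noise scale is 0.001, so the private estimate is very close to the true mean of 0.5.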
This algorithm is an instance of a broader framework which employs Sum-of-Squares for private estimation. This is the first application of SoS to DP that I am aware of. We apply this framework to two sub-problems; I'm sure there are more applications lurking. 3/n
(Thread) How can I say no to this? My course will be made public! Stay tuned.
I was pulling your leg here. Here's my (hot? cold?) take: academics have a *moral obligation* to make as much of their material public as possible. Not just papers, but code, talks, and lectures. 1/n
95% of the work for most content is in preparing it. So why not present it so that the whole world can appreciate it? arXiv, GitHub, and video sites.
All academics should learn how to record videos in high quality, say using @OBSProject (not just Zoom). It's imperative for the future. 2/n