Bio: Sara Hooker is a research scientist at Google Brain working on training models that fulfill multiple desiderata. Her main research interests gravitate towards interpretability, model compression, and security. In 2014, she founded Delta Analytics, a non-profit dedicated to building technical capacity so that non-profits around the world can use machine learning for good.
Draft title: Going Beyond Top-Line Metrics: Characterizing the Generalization Trade-offs Incurred by Compression
Draft abstract: Compression techniques such as pruning and quantization are widely used to enable ML at the edge. It is possible to prune or heavily quantize a network with a negligible decrease in test-set accuracy. However, top-line metrics obscure critical differences in generalization between compressed and non-compressed networks. In this talk, I will go beyond test-set accuracy and discuss my recent work measuring the trade-offs between compression, robustness, and algorithmic bias. Characterizing these trade-offs provides insight into how capacity is used in deep neural networks: the majority of parameters are used to memorize the long tail of the data distribution.
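For readers unfamiliar with pruning, the sketch below illustrates unstructured magnitude pruning, one common compression technique of the kind the abstract refers to. It is a minimal illustration under simplifying assumptions, not the talk's experimental code; the `magnitude_prune` helper and its signature are invented here for exposition.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Zero out the fraction `sparsity` of weights with the smallest magnitude.

    Illustrative only: real pipelines typically prune gradually during
    training and fine-tune afterwards to recover test-set accuracy.
    """
    # Magnitude below which a `sparsity` fraction of the weights fall.
    threshold = np.quantile(np.abs(weights), sparsity)
    # Keep large-magnitude weights; set the rest to exactly zero.
    return np.where(np.abs(weights) < threshold, 0.0, weights)

# Example: prune 90% of a random weight matrix.
w = np.random.randn(256, 256)
w_pruned = magnitude_prune(w, sparsity=0.9)
print(f"sparsity: {np.mean(w_pruned == 0):.2%}")  # ~90.00%
```

Even at such high sparsity levels, top-line test-set accuracy can remain nearly unchanged; the talk's point is that this single number hides disparate effects on the long tail of the data distribution.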