Title
Robust Mean Estimation in Nearly-Linear Time

Author(s)
Hopkins, Samuel
Abstract
Robust mean estimation is the following basic estimation question: given i.i.d. copies of a random vector X in d-dimensional Euclidean space, a small constant fraction of which are corrupted, how well can you estimate the mean of the distribution? This is a classical problem in statistics, going back to the 1960s and 70s, and it has recently found application to many problems in reliable machine learning. However, in high dimensions, classical algorithms for this problem were either (1) computationally intractable or (2) lost poly(d) factors in their accuracy guarantees. Recently, polynomial-time algorithms have been developed for this problem that achieve (nearly) optimal error guarantees. However, the running times of these algorithms were at least quadratic in either the dimension or 1/(desired accuracy), an overhead which renders them ineffective in practice. In this talk we give the first truly nearly-linear-time algorithm for robust mean estimation which achieves nearly optimal statistical performance. Our algorithm is based on the matrix multiplicative weights method. Based on joint work with Yihe Dong and Jerry Li, to appear in NeurIPS 2019.
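For context, a minimal sketch of the setup described in the abstract, written in LaTeX; the exact contamination model and update used in the talk may differ. An adversary is assumed to replace an arbitrary epsilon-fraction of n i.i.d. samples drawn from a distribution with mean mu, and the goal is to output an estimate minimizing the Euclidean error. The second display is the standard matrix multiplicative weights update (a density matrix built from the accumulated gain matrices M_s), shown only to illustrate the general technique named in the abstract, not the authors' specific instantiation.

% Assumed formalization: strong contamination of an epsilon-fraction of samples.
\[
  \text{error}(\hat{\mu}) \;=\; \|\hat{\mu} - \mu\|_2 .
\]
% Standard matrix multiplicative weights update after rounds 1, ..., t-1:
\[
  \rho_t \;=\; \frac{\exp\!\big(\eta \sum_{s < t} M_s\big)}
                    {\operatorname{Tr}\exp\!\big(\eta \sum_{s < t} M_s\big)} .
\]
% The density matrix rho_t emphasizes directions of large accumulated gain and
% can be used to score and down-weight samples that inflate the empirical
% second moment in those directions.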
Date Issued
2019-12-02
Extent
56:43 minutes
Resource Type
Moving Image
Resource Subtype
Lecture