Title:
Ten Steps of EM Suffice for Mixtures of Two Gaussians

Author(s)
Daskalakis, Constantinos
Abstract
The Expectation-Maximization (EM) algorithm is a widely used method for maximum likelihood estimation in models with latent variables. For estimating mixtures of Gaussians, its iteration can be viewed as a soft version of the k-means clustering algorithm. Despite its wide use, there are essentially no known convergence guarantees for this method. We provide global convergence guarantees for mixtures of two Gaussians with known covariance matrices. We show that EM converges geometrically to the correct mean vectors, and we provide simple, closed-form expressions for the convergence rate. As a simple illustration, we show that, in one dimension, ten steps of the EM algorithm initialized at infinity yield estimates of the means with less than 1% error. (Joint work with Christos Tzamos and Manolis Zampetakis.)
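The abstract's setting can be illustrated with a minimal sketch: sample EM for a balanced mixture of two one-dimensional Gaussians with known unit variance, run for ten iterations from a far-out initialization (standing in for the talk's "initialized at infinity"). The true means (±2), sample size, and starting point are illustrative choices, not taken from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Draw from a balanced mixture of two unit-variance Gaussians
# with (illustrative) true means -2 and +2.
true_means = np.array([-2.0, 2.0])
n = 100_000
labels = rng.integers(0, 2, size=n)
x = rng.normal(true_means[labels], 1.0)

def em_step(x, mu):
    """One EM iteration for a balanced two-component mixture with
    known unit variance: E-step computes posterior responsibilities,
    M-step re-estimates the two means as responsibility-weighted averages."""
    d0 = (x - mu[0]) ** 2
    d1 = (x - mu[1]) ** 2
    # P(component 0 | x); clip the exponent to avoid overflow for
    # extreme initializations.
    r0 = 1.0 / (1.0 + np.exp(np.clip((d0 - d1) / 2.0, -500, 500)))
    r1 = 1.0 - r0
    return np.array([np.sum(r0 * x) / np.sum(r0),
                     np.sum(r1 * x) / np.sum(r1)])

# A large finite starting point stands in for "initialized at infinity".
mu = np.array([-50.0, 50.0])
for _ in range(10):
    mu = em_step(x, mu)
print(mu)  # close to the true means (-2, +2)
```

With this symmetric, well-separated instance the estimates land within a fraction of a percent of the true means after ten iterations, consistent with the geometric convergence the abstract describes.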
Date Issued
2017-03-06
Extent
63:35 minutes
Resource Type
Moving Image
Resource Subtype
Lecture