Title:
Learning Tree Models in Noise: Exact Asymptotics and Robust Algorithms

Author(s)
Tan, Vincent Y. F.
Abstract
We consider the classical problem of learning tree-structured graphical models, but with the twist that the observations are corrupted by independent noise. For the case in which the noise is identically distributed, we derive the exact asymptotics using probabilistic tools from the theory of strong large deviations. Our results strictly improve those of Bresler and Karzand (2020) and Nikolakakis et al. (2019) and show close agreement with experimental results for sample sizes as small as a few hundred. When the noise is non-identically distributed, Katiyar et al. (2020) showed that although the exact tree structure cannot be recovered, one can recover a "partial" tree structure; that is, one that belongs to the equivalence class containing the true tree. We propose Symmetrized Geometric Averaging (SGA), a statistically robust algorithm for partial tree recovery. We provide error exponent analyses and extensive numerical results on a variety of trees to show that the sample complexity of SGA is significantly better than that of the algorithm of Katiyar et al. (2020). SGA can be readily extended to Gaussian models and is shown via numerical experiments to be similarly superior.
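As background for the abstract, the classical (noiseless) approach to learning a tree-structured graphical model is the Chow-Liu procedure: estimate pairwise empirical mutual informations and take a maximum-weight spanning tree. The sketch below is that classical baseline for binary data, not the talk's SGA algorithm (whose details are not given here); the function names `empirical_mi` and `chow_liu_tree` are illustrative choices, not from the talk.

```python
# Minimal sketch of Chow-Liu tree learning for binary samples
# (classical baseline; NOT the SGA algorithm described in the talk).
import math
from itertools import combinations

def empirical_mi(samples, i, j):
    """Empirical mutual information (in nats) between variables i and j."""
    n = len(samples)
    joint = {}
    for row in samples:
        key = (row[i], row[j])
        joint[key] = joint.get(key, 0) + 1
    marg_i, marg_j = {}, {}
    for (a, b), c in joint.items():
        marg_i[a] = marg_i.get(a, 0) + c
        marg_j[b] = marg_j.get(b, 0) + c
    mi = 0.0
    for (a, b), c in joint.items():
        # p(a,b) * log( p(a,b) / (p(a) p(b)) ), rewritten in counts
        mi += (c / n) * math.log(c * n / (marg_i[a] * marg_j[b]))
    return mi

def chow_liu_tree(samples):
    """Max-weight spanning tree over pairwise empirical MI (Kruskal + union-find)."""
    p = len(samples[0])
    edges = sorted(
        ((empirical_mi(samples, i, j), i, j) for i, j in combinations(range(p), 2)),
        reverse=True,
    )
    parent = list(range(p))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    tree = []
    for _, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:          # greedily add the heaviest edge that avoids a cycle
            parent[ri] = rj
            tree.append((i, j))
    return tree
```

In the noisy settings the talk studies, these empirical mutual informations are computed from corrupted observations, which is what makes the exact-asymptotics analysis (identical noise) and the robust SGA algorithm (non-identical noise) necessary.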
Date Issued
2021-02-10
Extent
54:22 minutes
Resource Type
Moving Image
Resource Subtype
Lecture