Title:
Scalable, Efficient, and Fair Algorithms for Structured Convex Optimization Problems

dc.contributor.advisor Vempala, Santosh S.
dc.contributor.author Ghadiri, Mehrdad
dc.contributor.committeeMember Peng, Richard
dc.contributor.committeeMember Singh, Mohit
dc.contributor.committeeMember Brand, Jan van den
dc.contributor.committeeMember Gupta, Swati
dc.contributor.department Computer Science
dc.date.accessioned 2023-09-22T15:06:18Z
dc.date.available 2023-09-22T15:06:18Z
dc.date.created 2023-12
dc.date.issued 2023-08-24
dc.date.submitted December 2023
dc.date.updated 2023-09-22T15:06:19Z
dc.description.abstract The growth of machine learning and data science has necessitated the development of provably fast and scalable algorithms that incorporate ethical requirements. In this thesis, we present algorithms for fundamental optimization problems with theoretical guarantees on approximation quality and running time. We analyze the bit complexity and stability of efficient algorithms for problems including linear regression, $p$-norm regression, and linear programming by showing that a common subroutine, inverse maintenance, is backward stable and that iterative approaches for solving constrained weighted regression problems can be carried out with bounded-error preconditioners. We also present conjectures regarding the running time of computing symmetric factorizations for Hankel matrices that imply faster-than-matrix-multiplication time algorithms for solving sparse poly-conditioned linear programs. We present the first subquadratic algorithm for solving the Kronecker regression problem, which improves the running time of all steps of the alternating least squares algorithm for the Tucker decomposition of tensors. In addition, we introduce the Tucker packing problem for computing an approximately optimal core shape for the Tucker decomposition problem. We prove this problem is NP-hard and provide polynomial-time approximation schemes for it. Finally, we show that the popular $k$-means clustering algorithm (Lloyd's heuristic) can result in outcomes that are unfavorable to subgroups of data. We introduce the socially fair $k$-means problem, for which we provide a very efficient and practical heuristic. For the more general $(\ell_p,k)$-clustering problem, we provide bicriteria constant-factor approximation algorithms. Many of our algorithms improve the state of the art in practice.
dc.description.degree Ph.D.
dc.format.mimetype application/pdf
dc.identifier.uri https://hdl.handle.net/1853/72860
dc.language.iso en_US
dc.publisher Georgia Institute of Technology
dc.subject Convex Optimization
dc.subject Tensor Decomposition
dc.subject Fairness in Machine Learning
dc.subject Numerical Analysis
dc.subject Bit Complexity
dc.title Scalable, Efficient, and Fair Algorithms for Structured Convex Optimization Problems
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.advisor Vempala, Santosh S.
local.contributor.corporatename College of Computing
relation.isAdvisorOfPublication 08846825-37f1-410b-b338-526d4f79815b
relation.isOrgUnitOfPublication c8892b3c-8db6-4b7b-a33a-1b67f7db2021
thesis.degree.level Doctoral
Files
Original bundle
Name: GHADIRI-DISSERTATION-2023.pdf
Size: 7.98 MB
Format: Adobe Portable Document Format
License bundle
Name: LICENSE.txt
Size: 3.87 KB
Format: Plain Text