Title:
Learning over functions, distributions and dynamics via stochastic optimization

dc.contributor.advisor Song, Le
dc.contributor.author Dai, Bo
dc.contributor.committeeMember Zha, Hongyuan
dc.contributor.committeeMember Boots, Byron
dc.contributor.committeeMember Lan, Guanghui
dc.contributor.committeeMember Gretton, Arthur
dc.contributor.department Computational Science and Engineering
dc.date.accessioned 2018-08-20T15:39:23Z
dc.date.available 2018-08-20T15:39:23Z
dc.date.created 2018-08
dc.date.issued 2018-07-27
dc.date.submitted August 2018
dc.date.updated 2018-08-20T15:39:23Z
dc.description.abstract Machine learning has recently witnessed revolutionary success across a wide spectrum of domains. Learning objectives, model representations, and learning algorithms are the key components of machine learning methods. To construct methods that naturally fit problems with different targets and inputs, one should consider these three components together in a principled way. This dissertation develops a unified learning framework for that purpose. At the heart of the framework is optimization with integral operators in infinite-dimensional spaces. This integral-operator view provides an abstract tool for treating the three components jointly across many machine learning tasks, and it leads to efficient algorithms with flexible representations that achieve better approximation ability, scalability, and statistical properties. We investigate several representative machine learning problems, namely kernel methods, Bayesian inference, invariance learning, and policy evaluation and policy optimization in reinforcement learning, as special cases of the framework under different instantiations of the integral operator. These instantiations yield learning problems whose inputs are functions, distributions, and dynamics. We derive algorithms that handle the corresponding integral operators via efficient, provable stochastic approximation, exploiting the structural properties of the operators. The framework and the derived algorithms are deeply rooted in functional analysis, stochastic optimization, nonparametric methods, and Monte Carlo approximation, and they contribute to several sub-fields of machine learning, including kernel methods, Bayesian inference, and reinforcement learning. We believe the proposed framework is a valuable tool for developing machine learning methods in a principled way and can potentially be applied to many other scenarios.
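
The stochastic-approximation idea in the abstract is easiest to see in the kernel-methods instantiation. Below is a minimal, hypothetical sketch of a doubly stochastic functional gradient method for kernel regression with an RBF kernel: each iteration samples both a random data point and a fresh random Fourier feature, so the functional gradient in the reproducing kernel Hilbert space is approximated by two nested sources of randomness. The synthetic data, hyperparameters, and function names are illustrative assumptions, not taken from the dissertation itself.

import numpy as np

rng = np.random.default_rng(0)

def make_data(n=1000, d=5, noise=0.1):
    # Synthetic regression problem (illustrative only).
    X = rng.normal(size=(n, d))
    y = np.sin(X.sum(axis=1)) + noise * rng.normal(size=n)
    return X, y

def dsgd_fit(X, y, T=1000, sigma=1.0, lam=1e-4):
    # Doubly stochastic functional gradient descent for squared loss.
    # Random Fourier features approximate the RBF kernel
    # k(x, x') = exp(-||x - x'||^2 / (2 sigma^2)).
    n, d = X.shape
    Omega = np.zeros((T, d))  # random feature directions drawn so far
    b = np.zeros(T)           # random feature phases
    alpha = np.zeros(T)       # one coefficient per sampled feature
    for t in range(T):
        i = rng.integers(n)                              # random data point
        Omega[t] = rng.normal(scale=1.0 / sigma, size=d) # fresh random feature
        b[t] = rng.uniform(0.0, 2.0 * np.pi)
        # Features of x_i under all directions drawn so far, including the new one.
        feats_xi = np.sqrt(2.0) * np.cos(Omega[: t + 1] @ X[i] + b[: t + 1])
        f_xi = alpha[:t] @ feats_xi[:t]                  # current prediction f_t(x_i)
        gamma = 1.0 / np.sqrt(t + 1)                     # decaying step size
        alpha[:t] *= 1.0 - gamma * lam                   # shrink old coefficients (regularizer)
        # New coefficient from the stochastic functional gradient of the squared loss.
        alpha[t] = -gamma * (f_xi - y[i]) * feats_xi[t]
    return Omega, b, alpha

def predict(X, Omega, b, alpha):
    return (np.sqrt(2.0) * np.cos(X @ Omega.T + b)) @ alpha

X, y = make_data()
Omega, b, alpha = dsgd_fit(X, y)
print("train MSE:", np.mean((predict(X, Omega, b, alpha) - y) ** 2))

The growing set of coefficients, one per sampled feature, is what lets the nonparametric function class expand with the data while each update stays cheap; this is one concrete way the framework's integral-operator view can translate into a provable stochastic algorithm.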
dc.description.degree Ph.D.
dc.format.mimetype application/pdf
dc.identifier.uri http://hdl.handle.net/1853/60316
dc.language.iso en_US
dc.publisher Georgia Institute of Technology
dc.subject Nonparametric method
dc.subject Stochastic optimization
dc.subject Reproducing kernel Hilbert space (RKHS)
dc.subject Functional gradient
dc.subject Bayesian inference
dc.subject Monte Carlo approximation
dc.subject Fenchel's duality
dc.subject Saddle-point problem
dc.subject Reinforcement learning
dc.subject Markov decision process
dc.subject Bellman equation
dc.title Learning over functions, distributions and dynamics via stochastic optimization
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.advisor Song, Le
local.contributor.corporatename College of Computing
local.contributor.corporatename School of Computational Science and Engineering
relation.isAdvisorOfPublication b279cef1-4f3d-40b1-852c-1ccfe5fbbd26
relation.isOrgUnitOfPublication c8892b3c-8db6-4b7b-a33a-1b67f7db2021
relation.isOrgUnitOfPublication 01ab2ef1-c6da-49c9-be98-fbd1d840d2b1
thesis.degree.level Doctoral
Files
Original bundle (1 of 1)
Name: DAI-DISSERTATION-2018.pdf
Size: 4.67 MB
Format: Adobe Portable Document Format
License bundle (1 of 1)
Name: LICENSE.txt
Size: 3.86 KB
Format: Plain Text