Title: Learning matrix and functional models in high-dimensions

dc.contributor.advisor Lebanon, Guy
dc.contributor.author Balasubramanian, Krishnakumar
dc.contributor.committeeMember Balcan, Maria-Florina
dc.contributor.committeeMember Song, Le
dc.contributor.committeeMember Lafferty, John
dc.contributor.committeeMember Yuan, Ming
dc.contributor.department Computer Science
dc.date.accessioned 2014-08-27T13:38:53Z
dc.date.available 2014-08-27T13:38:53Z
dc.date.created 2014-08
dc.date.issued 2014-06-20
dc.date.submitted August 2014
dc.date.updated 2014-08-27T13:38:53Z
dc.description.abstract Statistical machine learning methods provide a principled framework for extracting meaningful information from noisy high-dimensional data sets. A key feature of such procedures is that the inferences they produce are statistically sound, computationally efficient, and scientifically meaningful. In this thesis we make several contributions to such statistical procedures. Our contributions are twofold. We first address prediction and estimation problems in non-standard situations. We show that even with no access to labeled samples, one can still consistently estimate the error rate of predictors and train predictors with respect to a given (convex) loss function. We next propose an efficient procedure for prediction with large output spaces that scales logarithmically in the dimensionality of the output space. We further propose an asymptotically optimal procedure for sparse multi-task learning when the tasks share a joint support, prove the consistency of the proposed method, and derive its rates of convergence. We then turn to the problem of learning meaningful representations of data. We propose a method for learning sparse representations that takes the structure of the data space into account, demonstrate how it yields meaningful features, and establish sample complexity results for the proposed approach. We also propose a model-free feature selection procedure and establish its sure-screening property in the high-dimensional regime. Finally, we show that, with a slight modification, the approach proposed for sparse multi-task learning also yields sparse representations for multiple related tasks simultaneously.
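For a concrete picture of the joint-support assumption mentioned in the abstract, the standard formulation of sparse multi-task learning with a shared support is the group lasso, which penalizes the Euclidean norms of the rows of the task coefficient matrix. The abstract does not specify the thesis's exact estimator, so the sketch below is only a minimal illustration of that standard formulation, solved by proximal gradient descent; all function names and parameter values are illustrative assumptions, not the thesis's method.

import numpy as np

def group_soft_threshold(W, t):
    """Row-wise group soft-thresholding: shrink each row of W toward the
    origin by t in Euclidean norm, zeroing rows whose norm falls below t."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return scale * W

def multitask_group_lasso(X, Y, lam, n_iters=500):
    """Proximal gradient descent for
        min_W  (1 / 2n) ||X W - Y||_F^2  +  lam * sum_j ||W[j, :]||_2.
    The row-wise penalty forces all tasks (columns of W) to share a
    common set of nonzero rows, i.e. a joint support."""
    n, p = X.shape
    W = np.zeros((p, Y.shape[1]))
    step = n / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of the gradient
    for _ in range(n_iters):
        grad = X.T @ (X @ W - Y) / n
        W = group_soft_threshold(W - step * grad, step * lam)
    return W

# Toy check: 3 tasks whose true coefficients share the same 5 nonzero rows.
rng = np.random.default_rng(0)
n, p, tasks, s = 200, 100, 3, 5
X = rng.standard_normal((n, p))
W_true = np.zeros((p, tasks))
W_true[:s] = rng.standard_normal((s, tasks))
Y = X @ W_true + 0.1 * rng.standard_normal((n, tasks))
W_hat = multitask_group_lasso(X, Y, lam=0.1)
print("recovered support:", np.flatnonzero(np.linalg.norm(W_hat, axis=1) > 1e-3))

Rows of W whose norm is driven exactly to zero correspond to features excluded from every task simultaneously, which is precisely the shared-support structure the abstract refers to.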
dc.description.degree Ph.D.
dc.format.mimetype application/pdf
dc.identifier.uri http://hdl.handle.net/1853/52284
dc.language.iso en_US
dc.publisher Georgia Institute of Technology
dc.subject Statistics
dc.subject Machine learning
dc.subject Matrix
dc.subject Kernel
dc.subject Consistency
dc.title Learning matrix and functional models in high-dimensions
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.corporatename College of Computing
relation.isOrgUnitOfPublication c8892b3c-8db6-4b7b-a33a-1b67f7db2021
thesis.degree.level Doctoral
Files
Original bundle
Name: BALASUBRAMANIAN-DISSERTATION-2014.pdf
Size: 1.47 MB
Format: Adobe Portable Document Format
License bundle
Name: LICENSE.txt
Size: 3.88 KB
Format: Plain Text