Evaluating Visual Classification Models On Out-of-distribution Shifts With Limited Training Data

Author(s)
Singh, Aaditya
Abstract
Deep learning has led to major breakthroughs on several artificial intelligence tasks. As deep models are increasingly deployed in real-world applications, they must perform reliably not only on the training dataset(s) but also under various kinds of distribution shifts. In this thesis, we focus on two problem settings where out-of-distribution (OOD) classification performance is measured with limited (on the order of a thousand images) training data: unsupervised domain adaptation and robustness to natural distribution shifts. Overall, this thesis demonstrates that (1) pretrained visual classification models can be better utilised for unsupervised domain adaptation and (2) conventional wisdom about OOD robustness might not apply when the amount of fine-tuning data is limited. We hope to motivate future researchers to also focus on this setting of practical importance.
Date
2023-05-02
Resource Type
Text
Resource Subtype
Thesis