Quantitative analysis of adaptiveness and consistency of a class of online learning algorithms
Author(s)
Young, Carol C.
Abstract
This thesis models online ensemble learning algorithms to obtain theoretical analyses of various performance metrics. Online ensemble learning algorithms often serve to learn unknown, possibly time-varying probability distributions or to interact with other learning systems. Their simplicity allows flexibility in design choices, leading to variations that balance adaptiveness and consistency and allow for chatter-resistant co-learning. To analyze online ensemble learning algorithms across these variations, this work provides a method for constructing automata by properly selecting states. These automata provide an analytical framework for quantifying the adaptiveness and consistency of online ensemble learning algorithms when they interact with a probability distribution. The resulting Markov chain yields quantitative metrics of adaptiveness and consistency that can be calculated through mathematical formulas rather than by relying on numerical simulations. This analysis shows that the Multi Expert Algorithm (MEA) achieves higher consistency than the more adaptive Weighted Majority Algorithm (WMA), and higher adaptiveness than the more consistent Winnow algorithm, thus striking a balance between these historical algorithms in terms of the adaptiveness and consistency metrics. The automata also provide an analytical framework for identifying chatter, which can occur when an online learning algorithm is used by a robot to predict human intention while interacting with a human. When chatter occurs, the learning algorithm continually changes its prediction without converging to a constant prediction of human intention. Utilizing the Rescorla-Wagner model of human learning, we analyze an expert-based online learning algorithm and determine whether chatter will occur and, if so, what conditions cause it.
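To illustrate the abstract's central point that a Markov-chain model yields performance metrics by formula rather than by simulation, here is a minimal sketch. The two-state chain (states: "predicting correctly" and "predicting incorrectly") and its transition probabilities are invented for illustration and are not taken from the thesis.

```python
# Hedged illustration: a steady-state metric computed in closed form
# from a two-state Markov chain, rather than by simulating the learner.
# The states and probabilities below are illustrative assumptions.

from fractions import Fraction

def stationary_two_state(p_stay_correct, p_recover):
    """Stationary distribution of a two-state Markov chain.

    p_stay_correct: P(correct -> correct)
    p_recover:      P(incorrect -> correct)
    Returns (pi_correct, pi_incorrect), solved from the balance
    equation pi_c * (1 - p_stay_correct) = pi_i * p_recover.
    """
    a = 1 - p_stay_correct   # rate of leaving the "correct" state
    b = p_recover            # rate of returning to the "correct" state
    pi_correct = b / (a + b)
    return pi_correct, 1 - pi_correct

# Long-run fraction of time spent predicting correctly, as a
# consistency-style metric; no numerical simulation is needed.
pi_c, pi_i = stationary_two_state(Fraction(9, 10), Fraction(3, 10))
# pi_c = (3/10) / (1/10 + 3/10) = 3/4
```

The thesis's automata presumably have richer state spaces, but the principle is the same: once the chain is written down, the metric is an exact expression in the transition probabilities.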
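For readers unfamiliar with the expert-based learners being compared, the following is a sketch of the classic Weighted Majority Algorithm of Littlestone and Warmuth, not the thesis's exact variants; Winnow differs mainly in its promotion/demotion update, and the MEA (per the abstract) balances between the two.

```python
# Illustrative sketch (not the thesis's formulation): the classic
# Weighted Majority Algorithm over binary experts.

def weighted_majority(expert_predictions, outcomes, beta=0.5):
    """Run WMA over a sequence of rounds.

    expert_predictions: list of rounds, each a list of 0/1 expert votes.
    outcomes: list of true 0/1 labels, one per round.
    beta: multiplicative penalty (0 < beta < 1) for a wrong expert.
    Returns the prediction made each round and the total mistake count.
    """
    n = len(expert_predictions[0])
    weights = [1.0] * n
    predictions, mistakes = [], 0
    for votes, y in zip(expert_predictions, outcomes):
        # Predict by weighted vote of the experts (ties go to 1).
        w1 = sum(w for w, v in zip(weights, votes) if v == 1)
        w0 = sum(w for w, v in zip(weights, votes) if v == 0)
        pred = 1 if w1 >= w0 else 0
        predictions.append(pred)
        mistakes += (pred != y)
        # Multiplicatively penalize every expert that was wrong.
        weights = [w * beta if v != y else w
                   for w, v in zip(weights, votes)]
    return predictions, mistakes
```

The size of `beta` is one lever behind the adaptiveness/consistency trade-off the abstract quantifies: aggressive penalties track a changing distribution quickly but make the ensemble's prediction more volatile.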
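The Rescorla-Wagner rule the abstract cites as the human-learning model can be stated in one line: associative strength V moves toward the experienced outcome at a rate set by learning-rate parameters. A minimal sketch follows; the parameter values are illustrative assumptions, not taken from the thesis.

```python
# Hedged sketch of the Rescorla-Wagner update. The alpha, beta, and
# lam values below are illustrative, not from the thesis.

def rescorla_wagner(v, lam, alpha=0.3, beta=1.0):
    """One RW update: V <- V + alpha * beta * (lambda - V)."""
    return v + alpha * beta * (lam - v)

# With a fixed outcome lam = 1, V converges geometrically toward 1.
# Chatter is the coupled case where the robot's prediction alters what
# the human experiences, so neither side's estimate ever settles.
v = 0.0
for _ in range(5):
    v = rescorla_wagner(v, lam=1.0)
```

When the outcome `lam` itself depends on the learner's current prediction, as in the human-robot co-learning setting the abstract describes, this fixed-point behavior is no longer guaranteed, which is the chatter condition the thesis characterizes.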
Date
2019-08-21
Resource Type
Text
Resource Subtype
Dissertation