Title:
Efficient Tools For Reliability Analysis Using Finite Mixture Distributions

Author(s)
Cross, Richard J. (Richard John)
Advisor(s)
Armanios, Erian A.
Abstract
The complexity of many failure mechanisms and variations in component manufacture often make standard probability distributions inadequate for reliability modeling. Finite mixture distributions provide the necessary flexibility for modeling such complex phenomena but add considerable difficulty to the inference. This difficulty is overcome by drawing an analogy to neural networks. With appropriate modifications, a neural network can represent a finite mixture CDF or PDF exactly. Training with Bayesian Regularization gives an efficient empirical Bayesian inference of the failure time distribution. Training also yields an effective number of parameters from which the number of components in the mixture can be estimated. Credible sets for functions of the model parameters can be estimated using a simple closed-form expression. Complete, censored, and inspection samples can be considered by appropriate choice of the likelihood function. In this work, architectures for Exponential, Weibull, Normal, and Log-Normal mixture networks have been derived. The capabilities of mixture networks have been demonstrated for complete, censored, and inspection samples from Weibull and Log-Normal mixtures. Furthermore, mixture networks' ability to model arbitrary failure distributions has been demonstrated. A sensitivity analysis has been performed to determine how mixture network estimator errors are affected by mixture component spacing and sample size. It is shown that mixture network estimators are asymptotically unbiased and that errors decay with sample size at least as fast as with maximum likelihood estimation (MLE).
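The abstract's core objects can be illustrated concretely. The sketch below, which is illustrative and not taken from the thesis, evaluates a finite Weibull mixture PDF and CDF and forms the log-likelihood for a right-censored sample, showing how the choice of likelihood term (density for observed failures, survival function for censored units) accommodates censored data. All function names and parameter values are hypothetical.

```python
import numpy as np

def weibull_mixture_pdf(t, weights, shapes, scales):
    """Finite mixture PDF: f(t) = sum_k w_k * f_k(t), with w_k >= 0 and sum w_k = 1."""
    t = np.asarray(t, dtype=float)
    total = np.zeros_like(t)
    for w, k, lam in zip(weights, shapes, scales):
        total += w * (k / lam) * (t / lam) ** (k - 1) * np.exp(-(t / lam) ** k)
    return total

def weibull_mixture_cdf(t, weights, shapes, scales):
    """Finite mixture CDF: F(t) = sum_k w_k * F_k(t)."""
    t = np.asarray(t, dtype=float)
    total = np.zeros_like(t)
    for w, k, lam in zip(weights, shapes, scales):
        total += w * (1.0 - np.exp(-(t / lam) ** k))
    return total

def censored_log_likelihood(t, observed, weights, shapes, scales):
    """Log-likelihood for a right-censored sample:
    observed failures contribute log f(t_i); censored units contribute
    log S(t_i) = log(1 - F(t_i)), since failure is only known to occur after t_i.
    """
    t = np.asarray(t, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    f = weibull_mixture_pdf(t, weights, shapes, scales)
    s = 1.0 - weibull_mixture_cdf(t, weights, shapes, scales)
    return np.sum(np.log(f[observed])) + np.sum(np.log(s[~observed]))
```

For example, a two-component mixture with weights (0.3, 0.7) might model an early-failure ("infant mortality") population alongside a wear-out population; the mixture CDF still rises monotonically from 0 to 1.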
Date Issued
2004-12-02
Extent
1500153 bytes
Resource Type
Text
Resource Subtype
Thesis