Organizational Unit:
H. Milton Stewart School of Industrial and Systems Engineering

Publication Search Results

Now showing 1 - 10 of 38
  • Item
    Analytics-based approaches to improve patient outcomes in healthcare
    (Georgia Institute of Technology, 2022-07-28) Hampapur, Kirthana B.
    Patient outcomes in healthcare are tremendously affected by the efficiency of the care-providing system. This thesis focuses on the application of analytics-based approaches to improve patient outcomes in two healthcare settings: rehabilitation scheduling and deceased-donor kidney transplantation. Appointment scheduling in rehabilitation centers is a complex process that requires patients to be scheduled for a combination of services over multiple days with different providers. In Chapter 2 of the thesis, we develop a mixed-integer programming model and a heuristic to generate appointment schedules that provide timely and ideal care to the patients. We also develop a decision support tool for scheduling the appointments and discuss its implementation at Shepherd Center, a rehabilitation hospital. Kidney transplantation is the preferred option for patients with chronic kidney disease due to lower long-term costs and better quality of life resulting from transplantation. However, in the United States, there is a large gap between the supply and demand of organs. Owing to this organ scarcity, many patients on the transplant waitlist either die or become too sick to receive a transplant. Thus, there is an urgent need to develop and evaluate allocation policies and acceptance practices that improve patient health outcomes and reduce organ wastage. In Chapter 3, we discuss the development and validation of a discrete-event simulation model that mimics the current deceased-donor kidney allocation process. In Chapter 4, we develop an alternative kidney allocation policy that is based on geographical supply-demand trends and present a quantitative comparison to the current policy via the simulation model. Our findings indicate that the alternative allocation policy results in better system- and patient-level outcomes – such as higher number of transplants, lower average wait time, and higher kidney utilization – than the current system. In Chapter 5, we develop post-transplant survival and waitlist survival models to predict a patient's post-transplant survival probability and waitlist survival probability, respectively, using donor and patient characteristics. We then propose two organ offer accept/decline practices in which: (i) a patient accepts the organ offer only if the estimated post-transplant survival probability is above a certain threshold; and (ii) a patient accepts the offer only if the difference of the post-transplant survival and waitlist survival probabilities is above a certain threshold. Using the simulation model, we investigate the potential benefits of using these proposed data-driven predictive methods in kidney offer accept/decline decisions.
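    A minimal illustrative sketch of the two threshold-based organ offer accept/decline rules described in Chapter 5 above; the predictive models, probability values, and thresholds shown here are hypothetical placeholders, not values from the thesis.

        # Sketch of the two accept/decline rules, assuming hypothetical survival
        # models that return probabilities in [0, 1]; thresholds are illustrative.

        def accept_rule_1(post_tx_survival: float, threshold: float = 0.80) -> bool:
            """Rule (i): accept only if the estimated post-transplant survival
            probability exceeds a fixed threshold."""
            return post_tx_survival > threshold

        def accept_rule_2(post_tx_survival: float, waitlist_survival: float,
                          margin: float = 0.10) -> bool:
            """Rule (ii): accept only if the gap between the post-transplant and
            waitlist survival probabilities exceeds a margin."""
            return (post_tx_survival - waitlist_survival) > margin

        # Example: predicted 0.85 post-transplant vs. 0.70 waitlist survival.
        print(accept_rule_1(0.85), accept_rule_2(0.85, 0.70))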
  • Item
    COST-EFFECTIVE MANAGEMENT OF DISEASES: EARLY DETECTION AND INTERVENTIONS FOR IMPROVED HEALTH OUTCOMES
    (Georgia Institute of Technology, 2021-01-26) Yildirim, Fatma Melike
    Physical and mental health conditions have an impact on a person’s daily life. If those conditions are not properly treated and managed, they may affect patients’ overall health. This thesis contributes to the decision-making process for preventive intervention programs for major public health problems such as asthma and depression. Asthma is a lifelong condition, and many factors make it one of the most common and severe chronic diseases in children. Asthma may change over time, with varying severity levels, and cause profound adverse effects financially, physically, and mentally. However, with proper treatment, patients can live for long periods without symptoms or attacks. The 6|18 Initiative of the Centers for Disease Control and Prevention (CDC) has developed intervention strategies to improve patients’ health outcomes. These intervention strategies include: (i) asthma self-management education (AS-ME) for individuals whose asthma is not well controlled, (ii) home visits to improve self-management education and reduce home asthma triggers for individuals whose asthma is not well controlled, and (iii) strategies that enhance access and adherence to asthma medications and devices. In Chapter 2, we estimated the return on investment (ROI) of the AS-ME and Home Visit interventions for Medicaid-enrolled children with asthma. The cost and utilization measures were quantified using claims data from the Medicaid Analytic eXtract (MAX) files. We modeled the progression of pediatric asthma patients using a Markov chain model. Discrete-event simulation was used to estimate healthcare utilization and costs for the no-intervention and intervention scenarios. The main effects of the intervention programs, namely the post-intervention transition probabilities, were obtained from the literature. The ROI calculation was performed for different sub-populations based on characteristics including utilization of services (Emergency Department (ED) or Inpatient (IP) visits), age, Asthma Medication Ratio (AMR), and whether patients lived in geographic regions with higher rates of ED visits for asthma. In Chapter 3, we quantified the effect of a set of interventions, including AS-ME, the influenza vaccine, and asthma devices (spacers and nebulizers), on healthcare utilization and expenditures for Medicaid-enrolled children with asthma in New York and Michigan. We evaluated children aged 0-17 with persistent asthma in 2010 and 2011. Difference-in-differences regression was used to quantify the interventions’ effect on the probability of asthma-related healthcare utilization, asthma medication use, and utilization costs. We estimated the average change in outcome measures from the pre-intervention/intervention period (2010) to the post-intervention period (2011) for the intervention group by comparing it with the average change in the control group over the same time horizon. We utilized patients’ data, asthma-related expenditures, utilizations, and interventions in 2010 and 2011 from the MAX files. In Chapters 4 and 5, we focused on one of the most significant mental health conditions, depression. Depression is a common mental disorder, and it affects a substantial percentage of people in the US. Major depressive disorder (MDD) is a severe form of depression that may lead to increased health services use, functional impairment, disability, and suicide. It is a treatable disease; the combination of psychotherapy and pharmacotherapy is an effective treatment. Minor depression (mD) is another form of depression with fewer symptoms.
Improvement of depressive conditions may be achieved with specific treatments (antidepressants, psychotherapy, etc.) or watchful waiting. Current studies show that mental disorders are underdiagnosed and undertreated in the US population. Untreated mental illnesses may have serious individual and societal consequences. In Chapter 4, we performed a systematic investigation of parameters and calibrations to adapt a natural history model of major depression to the current US adult population. We utilized secondary data collected from laptop computer-assisted personal interviews and a national telephone survey of adults in the US. We derived data for the US adult population (18 and over) from nationally representative cohort samples from the National Comorbidity Survey Replication and the Baltimore Epidemiologic Catchment Area study. The model is feasible if incidence is low and lifetime prevalence is 30.2% (females) or 17.6% (males). Such a natural history model, validated while accounting for recall bias that increases with age, can be used to make informed decisions about interventions and treatments for major depression. In Chapter 5, our primary goal is to understand the potential benefits of routine depression screening for the general US population. We develop a discrete-time nonstationary Markov model with annual transitions that depend on patient history, such as the number of previous episodes, treatment status, and time spent without treatment, based on the available data. The Markov model was simulated for a hypothetical cohort of adults aged 18 and older. We evaluated the cost-effectiveness of screening scenarios with different frequencies. In the general population, all screening strategies were cost-effective compared to the baseline. However, there were differences between age groups in the male and female populations based on cost per quality-adjusted life year (QALY). We showed that routine screening is cost-effective for all female age groups and for young and middle-aged males. Results for the male population are sensitive to higher screening costs.
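    As a rough illustration of the kind of discrete-time Markov cohort model and cost-per-QALY calculation described above, the following sketch uses entirely hypothetical states, transition probabilities, costs, and utility weights; none of these numbers come from the thesis.

        # Hypothetical three-state annual Markov cohort model with cost/QALY tally.
        import numpy as np

        P = np.array([[0.85, 0.12, 0.03],   # well -> (well, symptomatic, severe)
                      [0.30, 0.55, 0.15],   # symptomatic -> ...
                      [0.10, 0.40, 0.50]])  # severe -> ...
        annual_cost = np.array([100.0, 1500.0, 8000.0])  # cost per person-year
        utility = np.array([0.95, 0.75, 0.50])           # QALY weights

        dist = np.array([1.0, 0.0, 0.0])    # cohort starts in the "well" state
        total_cost = total_qaly = 0.0
        for year in range(10):              # 10-year horizon, no discounting
            total_cost += dist @ annual_cost
            total_qaly += dist @ utility
            dist = dist @ P                 # advance the cohort one year

        print(f"cost per QALY over 10 years: {total_cost / total_qaly:.0f}")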
  • Item
    Topics on Multiresolution Signal Processing and Bayesian Modeling with Applications in Bioinformatics
    (Georgia Institute of Technology, 2021-01-14) Yousefi Zowj, Parisa
    Analysis of multi-resolution signals and time-series data has wide applications in biology, medicine, engineering, etc. In many cases, the large-scale (low-frequency) features of a signal, including basic descriptive statistics, trends, and smoothed functional estimates, do not carry useful information about the phenomenon of interest. On the other hand, the study of small-scale (high-frequency) features that look like noise may be more informative, even though extracting such informative features is not always straightforward. In this dissertation we address some of the issues pertaining to high-frequency feature extraction and the denoising of noisy signals. Another topic studied in this dissertation is the integration of genome data with transatlantic voyage data of enslaved people from Africa to determine the ancestral origin of Afro-Americans. Chapter 2. Assessment of Scaling by Auto-Correlation Shells. In this chapter, we utilize the Auto-Correlation (AC) Shell to propose a feature extraction method that can effectively capture small-scale information of a signal. The AC Shell is a redundant, shift-invariant, and symmetric representation of the signal that is obtained by using the Auto-Correlation function of compactly supported wavelets. The small-scale features are extracted by computing the energy of AC Shell coefficients at different levels of decomposition as well as the slope of the line fitted to these energy values using AC Shell spectra. We discuss the theoretical properties and verify them using extensive simulations. We compare the features extracted from AC Shells with those of wavelets in terms of bias, variance, and mean square error (MSE). The results indicate that the AC Shell features tend to have smaller variance and hence are more reliable. Moreover, to show its effectiveness, we validate our feature extraction method in the context of classification to identify patients with ovarian cancer through the analysis of their blood mass spectra. For this study, we use the features extracted by AC Shell spectra along with a support vector machine classifier to distinguish control from cancer cases. Chapter 3. Bayesian Binary Regressions in Wavelet-based Function Estimation. Wavelet shrinkage has been widely used in nonparametric statistics and signal processing for a variety of purposes, including denoising noisy signals and images, dimension reduction, and variable/feature selection. Although the traditional wavelet shrinkage methods are effective and popular, they have one major drawback: the shrinkage process relies only on the information in the coefficient being thresholded, and the information contained in the neighboring coefficients is ignored. Similarly, the standard AC Shell denoising methods shrink the empirical coefficients independently, by comparing their magnitudes with a threshold value. The information in other coefficients has no influence on the behavior of a particular coefficient. However, due to the redundant representation of signals and coefficients obtained by AC Shells, the dependency between neighboring coefficients and the amount of shared information between them increases. Therefore, it is important to propose a new thresholding approach for AC Shell coefficients that considers the information in neighboring coefficients. In this chapter, we develop a new Bayesian denoising approach for AC Shell coefficients that integrates logistic regression, universal thresholding, and Bayesian inference.
We validate the proposed method using extensive simulations with various types of smooth and non-smooth signals. The results indicate that, for all signal types, including the neighboring coefficients improves the denoising process, resulting in lower MSEs. Moreover, we applied our proposed methodology to a case study of denoising Atomic Force Microscopy (AFM) signals measuring the adhesion strength between two materials at the nano-newton scale to correctly identify the cantilever detachment point. Chapter 4. Bayesian Method in Combining Genetic and Historical Records of Transatlantic Slave Trade in the Americas. In the era between 1515 and 1865, more than 12 million people were enslaved and forced to move from Africa to North and Latin America. The shipping documents recorded the origin and disembarkation of enslaved people. Traditionally, genealogy has been studied via the exploration of historical records, family trees, and birth certificates. Due to recent advancements in the field of genetics, genealogy has been revolutionized and become more accurate. Although genetic methods can provide continental differentiation, they have poor spatial resolution, which makes it hard to localize ancestry assignment, as the markers are distributed across different sub-continental regions. To overcome the foregoing drawbacks, in this chapter we propose a hybrid approach that combines the genetic marker results with the historical records of the transatlantic voyages of enslaved people. The addition of the journey data can provide substantially increased resolution in ancestry assignment, using a Bayesian modeling framework. The proposed Bayesian framework uses the voyage data from historical records available in the transatlantic slave trade database as prior probabilities and combines them with the genetic markers of Afro-Americans, treated as the likelihood information, to estimate the posterior (updated) probabilities of their ancestry assignments to geographical regions in Africa. We applied the proposed methodology to 60 Afro-American individuals and show that the prior information increases the assignment probabilities obtained from the posterior distributions for some of the regions.
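    The core Bayesian update in Chapter 4 can be illustrated with a small discrete example; the regions, prior proportions (standing in for the voyage records), and marker likelihoods below are hypothetical and not taken from the thesis.

        # Discrete Bayes' rule: prior from shipping records, likelihood from markers.
        import numpy as np

        regions = ["Senegambia", "Gold Coast", "Bight of Benin", "West Central Africa"]
        prior = np.array([0.15, 0.20, 0.25, 0.40])       # P(region), from voyage data
        likelihood = np.array([0.02, 0.10, 0.30, 0.05])  # P(genetic markers | region)

        posterior = prior * likelihood
        posterior /= posterior.sum()                     # normalize

        for region, p in zip(regions, posterior):
            print(f"{region}: {p:.3f}")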
  • Item
    Interfacing Data Harnessing, Stochastic Modeling and Optimization for Maintenance Decisions for Railways
    (Georgia Institute of Technology, 2020-11-03) De Almeida Costa, Mariana
    The increasing demand for cost-effective and transparent solutions for improving the maintenance decision-making process in railways fuels the development of more sophisticated and flexible models, which largely exploit the use of data analytics and optimization tools. At the same time, recent advancements in technologies for railway condition monitoring and the availability of massive amounts of data allow for more accurate and reliable fault detection. One obstacle, however, is how to deal with the data provided by the monitoring equipment, as well as how to choose suitable methods to translate the data into useful information for maintenance scheduling and prioritization. In light of this, three main stages of the maintenance decision-making process can be identified: i) data acquisition, ii) modeling approach, and iii) implementation of the policy. Deciding which parameter(s) represent the real condition of the asset and measuring them accurately, with appropriate instruments and good measurement precision, concerns data acquisition (step i). Next, step ii) implies the choice of a comprehensive model that can tackle all the constraints and uncertainties associated with the deteriorating system, while providing solutions (in terms of a maintenance policy) in a reasonable amount of time. Finally, step iii) concerns the ease of implementation of the new maintenance policy, guaranteeing its practical applicability within the context of the train operating company under study. This dissertation aims to provide contributions to these three aspects in terms of railway track and wheelset maintenance. For both deteriorating systems, the choice of an appropriate maintenance policy should balance the trade-off between maintenance costs and the costs resulting from a poorly maintained asset, including those arising from potential safety hazards. This is discussed in the context of the three main stages mentioned above. The dissertation is structured in five chapters. Chapter 1 provides the introduction, as well as a brief overview of each of the topics and results presented in the subsequent chapters. Then, chapters 2 and 3 focus on wheelset maintenance, and chapters 4 and 5 focus on railway track maintenance. In chapter 2, the optimization of a railway wheelset maintenance policy is discussed. This policy is developed based on a data-driven model encompassing the estimation of wear rates and the application of a Markov Decision Process (MDP) approach to represent possible discretized wheel states, where the problem of maintenance planning is tackled from the perspective of immediate-action cost optimization. A bidimensional framework considering discrete intervals of wheel diameter along with a quantitative variable (kilometers since last turning/renewal) is used to represent the possible wheel states. In addition, the probability of a defect interfering with the wheel maintenance schedule is modeled by means of survival curves derived from a Cox Proportional-Hazards model. As a secondary goal, a comparison of the optimized policy with another wheel reprofiling policy that is also “easy to implement” is provided. In chapter 3, an investigation of the uncertainty in wheelset inspection data is presented. Previous research has highlighted the relevance of this topic in the decision-making process surrounding wheelset maintenance actions.
In light of this, the investigation aimed to assess the agreement between data acquired from three different inspection devices, namely: i) a manual gauge device, ii) a laser device, and iii) an under-floor wheel lathe. Three main wheelset parameters (flange thickness (Ft), flange height (Fh), and flange slope (qR)) are compared using a Linear Mixed Model (LMM) approach under several real-world limitations, such as those imposed by serially correlated, unbalanced, and unequally replicated data. The findings supported the use of LMMs, showing their ability to capture and account for the differences among the various groups and highlighting statistically significant differences in the performance of the inspection devices. In the context of the railway track, chapter 4 presents a spatiotemporal approach for the modeling and prediction of track geometry faults. Spatial-temporal data from a train operating company is considered through a 5-year inspection database. The track twist, defined as the amount by which the difference in elevation of the rails increases or decreases over a given length of track, is used as the main track quality parameter. The spatiotemporal approach considered two Kriging models with a Gaussian correlation function to study a strategic portion of a track used in heavy-haul transport. A CUSUM (Cumulative Sum) control chart approach is then applied to identify out-of-control track sections, and a Logistic Regression model is used to estimate the probabilities of future out-of-control points based on the adopted thresholds. Finally, a simple MDP model based on out-of-control points is proposed to compare different maintenance policies aimed at cost minimization, for different thresholds of twist standard deviation and different track section grouping strategies. Lastly, chapter 5 explores the use of Wavelet Analysis (WA) in the statistical modeling of railway track irregularities, namely (1) longitudinal level, (2) alignment, (3) crosslevel, and (4) gauge. WA is used to study and reconstruct the four different track geometry irregularity signals. This investigation aimed at finding wavelets that can appropriately describe each track irregularity signal studied, and at investigating whether the presence of some high-amplitude wavelet coefficients at certain frequencies can be associated with higher vertical or lateral forces in the wheel-rail contact. The last step is accomplished by reconstructing the different irregularity signals using wavelet coefficients at various decomposition levels and studying their impact on Nadal’s safety criterion Y/Q (a critical quantity for derailment safety assessments) through vehicle dynamics simulations. The presence of certain wavelets at different decomposition levels allows identifying the wavelets that are most detrimental in terms of the safety criterion.
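    A one-sided CUSUM control chart, in the spirit of the track-twist monitoring in chapter 4, can be sketched as follows; the allowance k, decision interval h, and the simulated twist readings are all hypothetical.

        # Upper one-sided CUSUM on simulated, standardized twist readings.
        import numpy as np

        rng = np.random.default_rng(0)
        twist = np.concatenate([rng.normal(0.0, 1.0, 60),   # in-control sections
                                rng.normal(1.5, 1.0, 20)])  # degraded sections

        k, h = 0.5, 5.0          # allowance and decision interval (std. units)
        s, alarms = 0.0, []
        for i, x in enumerate(twist):
            s = max(0.0, s + x - k)      # CUSUM statistic
            if s > h:
                alarms.append(i)         # flag an out-of-control track section
                s = 0.0                  # restart after the alarm

        print("out-of-control sections flagged at indices:", alarms)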
  • Item
    Computational advances for big data analytics and medical decision making
    (Georgia Institute of Technology, 2020-08-27) Li, Zhuonan
    With the increase in volume and complexity of data and evidence, medical decision making can be a complex process. Many decisions involve timeliness, uncertainties, and tradeoffs, and can have serious consequences for patients and the clinical practice. This dissertation aims to develop computationally efficient methods for big data analytics and medical decision making. We investigate three topics: the double pivot simplex method to advance linear programming solution techniques, the multiple isocenter selection problem in radiation therapy treatment planning, and the multi-objective treatment planning optimization problem. Chapter 1 advances the computational aspects of the double pivot simplex method by improving its computational efficiency and stability. The double pivot simplex method is a recent advancement of the simplex method, which optimally solves linear programs. During any iteration, this algorithm pivots on up to two variables at a time instead of one. An efficient implementation of double pivots is developed in LPSOL, a simplex solver for linear programs. We discuss a procedure to handle double pivots and bounded variables, a strategy to update the basis factorization with two variables simultaneously, and other topics related to numerical instability. On average, this implementation enabled double pivots to solve benchmark problems with nearly 30% fewer pivots and in more than 25% less time than classical single pivots. In Chapter 2, we study the multiple isocenter placement problem in external beam radiation therapy treatment planning. In current treatment strategies, most plans use a single isocenter. Multiple isocenters can improve dose conformity, but their number and locations are difficult to determine. To address this issue, we propose a mathematical model which incorporates the tumor’s geometric characteristics to determine the number of isocenters. An approximation heuristic approach is developed to solve the isocenter selection problem. With the optimized isocenters, the treatment plans achieve better conformity compared to single-isocenter plans. In Chapter 3, we propose a radiation therapy treatment planning framework for stereotactic body radiation treatment (SBRT). Beam angles and the final aperture shapes are critical in developing feasible and deliverable radiation treatment plans. Here we propose the first treatment planning framework that combines the two by integrating warm-start simultaneous beam angle and fluence map optimization (BAFMO) and direct aperture optimization (DAO). Both problems are multiple-objective optimization problems. We introduce a matrix reduction technique to handle the dense dose matrix for BAFMO and an approximation scheme for column generation in DAO. We further investigate the benefit of utilizing the optimized beam angles from the BAFMO and show that the final plans deliver typical radiation doses with 20% fewer total monitor units (MU).
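    For context, the classical single-pivot tableau simplex that the double pivot method builds on can be sketched as below; this is a didactic implementation for small problems in standard form (max c^T x subject to Ax <= b, x >= 0, b >= 0), not the LPSOL code.

        # Didactic single-pivot tableau simplex (Dantzig rule, no anti-cycling).
        import numpy as np

        def simplex_max(c, A, b, tol=1e-9):
            m, n = A.shape
            T = np.zeros((m + 1, n + m + 1))     # tableau [A | I | b]; objective row [-c | 0 | 0]
            T[:m, :n], T[:m, n:n + m], T[:m, -1] = A, np.eye(m), b
            T[-1, :n] = -c
            basis = list(range(n, n + m))        # start from the slack basis
            while True:
                j = int(np.argmin(T[-1, :-1]))   # entering variable (most negative reduced cost)
                if T[-1, j] >= -tol:
                    break                        # optimal
                col = T[:m, j]
                with np.errstate(divide="ignore", invalid="ignore"):
                    ratios = np.where(col > tol, T[:m, -1] / col, np.inf)
                i = int(np.argmin(ratios))       # leaving row by the minimum ratio test
                if not np.isfinite(ratios[i]):
                    raise ValueError("LP is unbounded")
                T[i, :] /= T[i, j]               # pivot
                for r in range(m + 1):
                    if r != i:
                        T[r, :] -= T[r, j] * T[i, :]
                basis[i] = j
            x = np.zeros(n + m)
            x[basis] = T[:m, -1]
            return x[:n], T[-1, -1]

        # Example: max 3x1 + 5x2 s.t. x1 <= 4, 2x2 <= 12, 3x1 + 2x2 <= 18.
        x, z = simplex_max(np.array([3.0, 5.0]),
                           np.array([[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]]),
                           np.array([4.0, 12.0, 18.0]))
        print(x, z)   # expected: x = (2, 6), objective 36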
  • Item
    Resource allocation optimization problems in the public sector
    (Georgia Institute of Technology, 2020-04-25) Leonard, Taylor Joseph
    This dissertation consists of three distinct, although conceptually related, public sector topics: the Transportation Security Administration (TSA), U.S. Customs and Border Protection (CBP), and the Georgia Trauma Care Network Commission (GTCNC). The topics are unified in their mathematical modeling and mixed-integer programming solution strategies. In Chapter 2, we discuss strategies for solving large-scale integer programs, including column generation and the well-known heuristic of particle swarm optimization (PSO). In order to solve problems with an exponential number of decision variables, we employ Dantzig-Wolfe decomposition to take advantage of the special subproblem structures encountered in resource allocation problems. In each of the resource allocation problems presented, we concentrate on selecting an optimal portfolio of improvement measures. In most cases, the number of potential investment portfolios is too large to be expressed explicitly or stored on a computer. We use column generation to effectively solve these problems to optimality, but are hindered by the solution time and large CPU requirements. We explore utilizing multi-swarm particle swarm optimization to solve the decomposition heuristically. We also explore integrating multi-swarm PSO into the column generation framework to solve the pricing problem for entering columns of negative reduced cost. In Chapter 3, we present a TSA problem to allocate security measures across all federally funded airports nationwide. This project establishes a quantitative construct for enterprise risk assessment and optimal resource allocation to achieve the best aviation security. We first analyze and model the various aviation transportation risks and establish their interdependencies. The mixed-integer program determines how best to invest in additional security measures for the best overall risk protection and return on investment. Our analysis involves cascading and interdependency modeling of the multi-tier risk taxonomy and overlaying security measurements. The model selects optimal security measure allocations for each airport with the objectives of minimizing the probability of false clears, maximizing the probability of threat detection, and maximizing the risk posture (ability to mitigate risks) in aviation security. The risk assessment and optimal resource allocation construct is generalizable and is applied to the CBP problem. In Chapter 4, we optimize security measure investments to achieve the most cost-effective deterrence and detection capabilities for the CBP. A large-scale resource allocation integer program was successfully modeled that rapidly returns good Pareto-optimal results. The model incorporates the utility of each measure and the probability of success, along with multiple objectives. To the best of our knowledge, our work presents the first mathematical model that optimizes security strategies for the CBP and is the first to introduce a utility factor to emphasize deterrence and detection impact. The model accommodates different resources, constraints, and various types of objectives. In Chapter 5, we analyze the emergency trauma network problem, first by simulation. The simulation offers a framework of resource allocation for trauma systems and possible ways to evaluate the impact of the investments on the overall performance of the trauma system. The simulation works as an effective proof of concept to demonstrate that improvements to patient well-being can be measured and that alternative solutions can be analyzed.
We then explore three different formulations to model the Emergency Trauma Network as a mixed-integer programming model. The first model is a Multi-Region, Multi-Depot, Multi-Trip Vehicle Routing Problem with Time Windows. This is a known expansion of the vehicle routing problem (VRP) that has been extended to model the Georgia trauma network. We then adapt an Ambulance Routing Problem (ARP) to the previously mentioned VRP. To our knowledge, no ARP of this magnitude has been formulated as an extension of a VRP. One of the primary differences is that many ARPs are constructed for disaster scenarios rather than day-to-day emergency trauma operations. The new ARP also implements additional constraints based on trauma-level limitations for patients and hospitals. Lastly, the Resource Allocation ARP is constructed to reflect the investment decisions presented in the simulation.
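    A generic, single-swarm particle swarm optimization routine (not the multi-swarm variant used in the dissertation) can be sketched as follows on a toy objective; all parameter values are illustrative.

        # Basic PSO minimizing a toy objective; inertia/acceleration values are common defaults.
        import numpy as np

        def pso_minimize(f, dim, n_particles=30, iters=200, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # particle positions
            v = np.zeros_like(x)                             # particle velocities
            pbest = x.copy()
            pbest_val = np.array([f(p) for p in x])
            gbest = pbest[np.argmin(pbest_val)].copy()       # global best position
            w, c1, c2 = 0.7, 1.5, 1.5
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
                x = x + v
                vals = np.array([f(p) for p in x])
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                gbest = pbest[np.argmin(pbest_val)].copy()
            return gbest, float(f(gbest))

        # Toy usage: minimize the 5-dimensional sphere function.
        best, val = pso_minimize(lambda z: float(np.sum(z ** 2)), dim=5)
        print(best, val)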
  • Item
    The War Room Effects Model (WREM): A Parametric Model for the Optimization of Organizationally Supported Decision Making According to the Personality of Decision Makers
    (Georgia Institute of Technology, 2020-04-07) Dickens, James Fredrick
    The War Room Effects Model (WREM) and its accompanying system of situational control are proposed as a concept for the optimization of organizationally supported decisions according to the personality of the decision maker. Concepts and components of the PEN model of personality, the Affect Infusion Model, the Vroom-Yetton model, Situational Strength and the Yerkes-Dodson law provided the theoretical basis for the establishment of WREM as a conceptual model. Two experiments supported the identification of key sources of performance variance in the context of hypothetical decision-making scenarios. The first of these strongly supported acceptance of WREM’s core personality and situational factors as important sources of variance. The second experiment generally confirmed the significance of WREM’s core factors and further indicated that the preponderance of performance variability resulted from key interactions between personality and situational factors. This directly supported the conditional validation of WREM as a parametric model. Response surface analysis and model optimization led to the identification of personality-aligned optimization solutions as a system of situational control. Stochastic simulation of this system indicated dramatic improvements to decision-making performance across the examined ranges of WREM’s personality factors. By practically and holistically accounting for personality and situational factors in an economical theory and model, WREM advances our basic understanding of the dynamic interaction between these factors and their cumulative effects on cognitive performance in decision making. This research concludes by proposing WREM as the subject of further basic and applied research and presents a draft concept for its implementation and application to industry.
  • Item
    Using machine learning to estimate survival curves for transplant patients receiving an increased risk for disease transmission donor organ versus waiting for a standard organ
    (Georgia Institute of Technology, 2019-03-26) Mark, Ethan Joshua
    In 1994, the Centers for Disease Control and Prevention (CDC) and the Public Health Service (PHS) released guidelines classifying donors at risk of transmitting human immunodeficiency virus (HIV) through organ transplantation. In 2013, the guidelines were updated to include donors at risk of transmitting hepatitis B (HBV) and hepatitis C (HCV). These donors are known as increased risk for disease transmission donors (IRD). Even though donors are now universally screened for HIV, HBV, and HCV by nucleic acid testing (NAT), NAT can be negative during the eclipse phase, when the virus is not detectable in blood. In part due to the opioid epidemic, over 19% of organ donors were classified as IRD in 2014. Despite the risks of disease transmission and associated mortality from accepting an IRD organ offer, patients also face mortality risks if they decline the organ and wait for a non-IRD organ. The main theme of this thesis is to build organ transplant and waitlist survival models and to help patients decide between accepting an IRD organ offer or remaining on the waitlist for a non-IRD organ. In chapter one, we introduced background information and the outline of the thesis. In chapter two, we used machine learning to build an organ transplant survival model for the kidney that achieves greater performance than the model currently being used in the U.S. kidney allocation system. In chapter three, we used similar modeling techniques and simulation to compare the survival for patients accepting IRD kidney offers vs. waiting for non-IRD kidneys. We then extend our IRD vs. non-IRD survival comparisons to the liver, heart and lung in chapter four, using different models and parameters. In chapter five, we built a model that predicts how the health of a patient changes from waitlist registration to transplantation. In chapter six, we utilized the transplant and waitlist survival models built in chapters three and four to create an interactive tool that displays the survival curves for a patient receiving an IRD organ or waiting for a non-IRD organ. The tool can also show the survival curve if a patient chooses to receive a non-IRD organ immediately. We then concluded with a discussion and major takeaways in chapter seven.
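    A plain Kaplan-Meier estimator (not the thesis models) illustrates how survival curves for two hypothetical cohorts, e.g. 'accept the IRD organ now' versus 'wait for a non-IRD organ', could be compared; the event times below are made up.

        # Kaplan-Meier survival curve from (time, event) pairs; event 1 = death, 0 = censored.
        import numpy as np

        def kaplan_meier(times, events):
            times, events = np.asarray(times, float), np.asarray(events, int)
            curve, s = [], 1.0
            for t in np.unique(times[events == 1]):        # distinct event times
                at_risk = np.sum(times >= t)
                deaths = np.sum((times == t) & (events == 1))
                s *= 1.0 - deaths / at_risk
                curve.append((float(t), s))
            return curve

        accept_ird = kaplan_meier([1, 3, 4, 6, 8, 9, 12], [1, 0, 1, 1, 0, 1, 0])
        wait_nonird = kaplan_meier([1, 2, 2, 5, 7, 7, 10], [1, 1, 0, 1, 1, 0, 0])
        print("accept IRD:", accept_ird)
        print("wait for non-IRD:", wait_nonird)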
  • Item
    Sequential interval estimation for Bernoulli trials
    (Georgia Institute of Technology, 2018-07-31) Yaacoub, Tony
    Interval estimation of a binomial proportion is one of the most basic problems in statistics, with many important real-world applications. Some classical applications include estimation of the prevalence of a rare disease and accuracy assessment in remote sensing. In these applications, the sample size is fixed beforehand, and a confidence interval for the proportion is obtained. However, in many modern applications, sampling is especially costly and time consuming, e.g., estimating the customer click-through probability in online marketing campaigns and estimating the probability that a stochastic system satisfies a specific property, as in Statistical Model Checking. Because these applications tend to require extensive time and cost, it is advantageous to reduce the sample size while simultaneously assuring satisfactory quality (coverage) levels for the corresponding interval estimates. The sequential version of interval estimation aims at the latter goal by allowing the sample size to be random and, in particular, by formulating a stopping time controlled by the observations themselves. The literature focusing on the sequential setup of the problem is limited compared to its fixed-sample-size counterpart, and sampling procedure optimality has not been established in the literature. The work in this thesis aims to extend the body of knowledge on the topic of sequential interval estimation for Bernoulli trials, addressing both theoretical and practical concerns. In the first part of this thesis, we propose an optimal sequential methodology for obtaining fixed-width confidence intervals for a binomial proportion when prior knowledge of the proportion is available. We assume that there exists a prior distribution for the binomial proportion, and our goal is to minimize the expected number of samples while guaranteeing that the coverage probability is at least a specified nominal coverage probability level. We demonstrate that our stopping time is always bounded from above and below: we need to first accumulate a sufficient amount of information before we start applying our stopping rule, and our stopping time always terminates in finite time. We also compare our method with the optimum fixed-sample-size procedure as well as with existing alternative sequential schemes. In the second part of this thesis, we propose a two-stage sequential method for obtaining tandem-width confidence intervals for a binomial proportion when no prior knowledge of the proportion is available and when it is desired to have a computationally efficient method. By tandem-width, we mean that the half-width of the confidence interval of the proportion is not fixed beforehand; it is instead required to satisfy two different upper bounds depending on the underlying value of the binomial proportion. To tackle this problem, we propose a simple but useful sequential method for obtaining fixed-width confidence intervals for the binomial proportion based on the minimax estimator of the binomial proportion. Finally, we extend the ideas for Bernoulli distributions in the first part of this thesis to interval estimation for arbitrary distributions, with an alternative optimality formulation. Here, we propose a conditional-cost alternative formulation to circumvent certain analytical/computational difficulties. Specifically, we assume that an independent and identically distributed process is observed sequentially, with its common probability density function having a random parameter that must be estimated.
We follow a semi-Bayesian approach in which we assign a cost to the pair (estimator, true parameter), and our goal is to minimize the average sample size while guaranteeing an average cost below some prescribed level. For a variety of examples, we compare our method with the optimum fixed-sample-size procedure and other existing sequential schemes.
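    In the spirit of the first part of the thesis, a simple (heuristic, not the optimal) Bayesian stopping rule for a fixed-width interval can be sketched as follows: sample Bernoulli observations one at a time and stop once the Beta-posterior credible interval is narrower than 2d. All parameter values here are illustrative.

        # Sequential sampling with a Beta posterior; stop when the credible interval
        # half-width falls below d. Heuristic illustration only.
        import numpy as np
        from scipy.stats import beta

        def sequential_interval(p_true, d=0.05, conf=0.95, a=1.0, b=1.0,
                                max_n=100_000, seed=1):
            rng = np.random.default_rng(seed)
            for n in range(1, max_n + 1):
                x = int(rng.random() < p_true)        # one Bernoulli observation
                a, b = a + x, b + 1 - x               # Beta posterior update
                lo = beta.ppf((1 - conf) / 2, a, b)
                hi = beta.ppf(1 - (1 - conf) / 2, a, b)
                if hi - lo <= 2 * d:                  # width criterion met: stop
                    return n, (lo, hi)
            return max_n, (lo, hi)

        n, (lo, hi) = sequential_interval(p_true=0.3)
        print(f"stopped after {n} samples; interval = ({lo:.3f}, {hi:.3f})")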
  • Item
    Topics in the statistical aspects of simulation
    (Georgia Institute of Technology, 2015-08-19) McDonald, Joshua L.
    We apply various variance reduction techniques to the estimation of Asian averages and options and propose an easy-to-use quasi-Monte Carlo method that can provide significant variance reductions with minimal increases in computational time. We also extend these techniques to estimate higher moments of the Asian averages. We then use these estimated moments to efficiently implement Gram-Charlier based estimators for the probability density functions of Asian averages and options. Finally, we investigate a ranking and selection application that uses post hoc analysis to determine how the circumstances of procedure termination affect the probability of correct selection.
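    A minimal quasi-Monte Carlo estimator for an arithmetic Asian call under geometric Brownian motion, using a scrambled Sobol sequence, is sketched below; the market parameters are hypothetical and the sketch is only in the spirit of the methods summarized above.

        # QMC (scrambled Sobol) pricing of an arithmetic-average Asian call option.
        import numpy as np
        from scipy.stats import norm, qmc

        S0, K, r, sigma, T, steps = 100.0, 100.0, 0.05, 0.2, 1.0, 16
        n = 2 ** 12                      # number of Sobol points (a power of two)
        dt = T / steps

        sobol = qmc.Sobol(d=steps, scramble=True, seed=0)
        u = sobol.random(n)              # n x steps points in (0, 1)
        z = norm.ppf(u)                  # map to standard normal increments
        log_inc = (r - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z
        paths = S0 * np.exp(np.cumsum(log_inc, axis=1))
        avg = paths.mean(axis=1)         # arithmetic Asian average along each path
        payoff = np.maximum(avg - K, 0.0)
        price = np.exp(-r * T) * payoff.mean()
        print(f"QMC estimate of the Asian call price: {price:.4f}")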