Organizational Unit:
Aerospace Systems Design Laboratory (ASDL)


Publication Search Results

Now showing 1 - 10 of 215
  • Item
    Accelerated Simulation-Based Analysis of Emergent and Stochastic Behavior in Military Capability Design
    (Georgia Institute of Technology, 2023-07-25) Braafladt, Alexander
In military capability design, the United States Air Force (USAF) is working to modernize to be ready to succeed in future operations. During the process, high-fidelity military simulation is used iteratively to build up understanding of complex military scenarios and consider technology and concept alternatives. While high-fidelity simulation is critical to the analysis, it is often expensive and time-consuming to work with. In addition, the required pace for analysis needs to be accelerated as technology and threats rapidly evolve. In response to these challenges, the research in this thesis focuses on accelerating two central parts of simulation-based analysis in capability design. The first part focuses on improving methods for searching for emergent behavior, which is critical for building up understanding with simulation. The second part focuses on including stochastic responses from simulation in parametric models used during tabletop design exercises, which are critical for comparing alternatives. To accelerate simulation-based analysis of emergent behavior, a specific definition of emergent behavior is synthesized from the literature that allows optimization approaches to be used for searching more quickly than with brute-force Monte Carlo Simulation (MCS). This definition also allows formulation of the new ENFLAME (Exploration of Nonlinear and stochastic Future behavior under Lack of knowledge using simulation-based Analysis to Manage Emergent behavior) framework for structuring activities aimed at managing emergent behavior with simulation-based analysis. Specifically, in this work the new LANTERN (Low-cost Adaptive exploratioN to Track down Extreme, Rare events using Numerical optimization) methodology for searching for emergent behavior as rare, localized, and stochastic extreme events is developed, which accelerates the process using novel Bayesian Optimization (BO) techniques that adaptively query the simulation to find rare events. 
In experiments with test problems based on the behavior expected with an Agent-Based Modeling (ABM) simulation approach, the new BO techniques show significant improvement over MCS. For accelerating analysis of stochastic behavior during tabletop design exercises, the ECDF-ROM surrogate modeling approach that uses Reduced-Order Modeling (ROM) techniques combined with a new field representation is developed. The surrogate modeling approach is shown to work effectively with distributions like those expected with military simulation, allowing parametric, interactive queries of distributions. A final demonstration of the techniques was completed using two scenarios developed in simulation with the Advanced Framework for Simulation, Integration, and Modeling (AFSIM). First, a Suppression of Enemy Air Defenses (SEAD) scenario was used to demonstrate the effectiveness of the new techniques at searching for rare, localized extreme events. Second, a four vs. four air combat scenario was used to demonstrate the effectiveness of the new technique for searching for rare, stochastic extreme events, and to demonstrate the new distribution surrogate modeling approach. The results together show that the LANTERN methodology accelerates the search for emergent behavior effectively for iterative simulation-based analysis of military scenarios and the ECDF-ROM approach enables parametric models of stochastic outcomes.
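The core idea of adaptively querying an expensive simulation to find rare, localized extreme events can be sketched very loosely as follows. This is a toy one-dimensional stand-in, not the thesis's LANTERN methodology: the "simulation," the peak location, and all budgets are invented for illustration, and a simple exploit/explore loop replaces real Bayesian Optimization.

```python
import numpy as np

rng = np.random.default_rng(0)

def sim(x):
    """Toy stand-in for an expensive stochastic simulation: a rare,
    localized extreme event (sharp peak near x = 0.8) plus noise."""
    return np.exp(-((x - 0.8) ** 2) / 1e-3) + 0.05 * rng.standard_normal(np.shape(x))

def adaptive_search(budget=60, n_init=10):
    """Spend an evaluation budget by alternating global exploration with
    local refinement around the current largest observed response."""
    x = rng.uniform(0, 1, n_init)
    y = sim(x)
    for _ in range(budget - n_init):
        best = x[np.argmax(y)]
        # exploit near the incumbent extreme, with occasional global draws
        cand = best + 0.05 * rng.standard_normal() if rng.random() < 0.7 else rng.uniform(0, 1)
        cand = float(np.clip(cand, 0.0, 1.0))
        x = np.append(x, cand)
        y = np.append(y, sim(cand))
    return x, y

x, y = adaptive_search()
print(f"largest response found: {y.max():.2f} at x = {x[np.argmax(y)]:.2f}")
```

Brute-force MCS would spread the same budget uniformly; the adaptive loop concentrates evaluations near candidate extremes, which is the acceleration the abstract describes.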
  • Item
    Methodology for Aircraft Architecture Selection & Design Optimization
    (Georgia Institute of Technology, 2023-04-30) Harish, Anusha
Growing concerns about the environment have led aviation agencies around the world, such as IATA, ICAO, NASA, and ACARE, to set targets to curb noise, emissions, and fuel consumption in the coming years. In order to achieve these goals, several new aircraft technologies and concepts have been proposed in the areas of unconventional airframe configurations (such as blended wing body and truss-braced wing), advanced propulsion systems (e.g., open rotor engines and electrified propulsion), alternative energy sources (such as hydrogen, batteries, etc.), and propulsion-airframe integration concepts (distributed propulsion, boundary layer ingestion, etc.). There are over 100,000 possible combinations of these technologies. However, this vast architecture space has not yet been fully explored. Therefore, there is a need for a lower-order analysis methodology capable of rapidly analyzing different combinations. This research aims to propose a methodology for rapid generation and assessment of architectures in order to identify promising ones that are capable of meeting future environmental goals. There are three key aspects to this problem: generation of alternatives, evaluation of the design space for the architectures, and finally the optimization of the aircraft designs. The first research area focuses on the generation of architecture alternatives using Constraint Programming for every aircraft configuration with a known propulsive-airframe integration concept, given the compatibility between different components. Since there is currently no methodology that automatically generates architecture alternatives, the proposed methodology is validated by comparing its results against known or studied architectures in the literature. The second research area is aimed at developing a “pre-conceptual” design methodology that can quickly evaluate and optimize architecture alternatives with fewer design details and a consistent set of assumptions and requirements. 
Parameters such as the energy and power splits between different components, and the path of power flow from the energy source to the thrust-producing device, both at the sizing points and throughout the mission segments, are proposed and used in the determination of key performance indicators such as global chain efficiency, energy specific air range, and thrust specific power consumption. The objective of the final research question is the optimization of the aircraft design for each generated architecture. A multi-objective optimization algorithm is implemented to optimize each design with aircraft weight and energy consumption as the two objectives, while meeting all aircraft requirements such as range, payload, cruise altitude and speed, mission power requirements, etc. Thus, a complete, generalized, universal architecture enumeration and pre-conceptual design and optimization methodology is proposed. The capability of this methodology is demonstrated in a final use case where architectures with different alternatives in terms of energy sources – jet fuel, batteries (high specific power, high specific energy), and hydrogen – and advanced propulsion system architectures with distributed propulsion – electrified propulsion and hydrogen propulsion hybrids – are generated, evaluated, and optimized for a 2050 Entry-into-Service. Furthermore, the impact of technologies on the aircraft performance is investigated through a technology sensitivity study.
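The architecture-generation step can be illustrated with a plain generate-and-filter sketch. This is not the thesis's Constraint Programming formulation, and the component lists and compatibility rules below are deliberately tiny, hypothetical simplifications of the real taxonomy:

```python
from itertools import product

# Hypothetical, simplified option sets; the real component taxonomy
# and compatibility rules are far richer.
energy_sources = ["jet_fuel", "battery", "hydrogen"]
converters     = ["turbofan", "electric_motor", "fuel_cell_motor"]
integration    = ["podded", "distributed", "boundary_layer_ingestion"]

def compatible(src, conv, integ):
    """Toy compatibility constraints between components."""
    if conv == "turbofan" and src != "jet_fuel":
        return False                      # turbofans burn fuel
    if conv == "electric_motor" and src == "jet_fuel":
        return False                      # motors need electrical energy
    if conv == "fuel_cell_motor" and src != "hydrogen":
        return False                      # fuel cells here assume hydrogen
    return True

architectures = [a for a in product(energy_sources, converters, integration)
                 if compatible(*a)]
print(len(architectures), "feasible of", 3 * 3 * 3, "candidate architectures")
```

A real Constraint Programming solver propagates such rules instead of enumerating the full Cartesian product, which is what makes the approach scale to the 100,000+ combinations the abstract mentions.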
  • Item
    A Techno-Economic Approach to the Evaluation of Hybrid-Electric Propulsion Architectures at the Conceptual/Exploratory Design Phase
    (Georgia Institute of Technology, 2022-07-20) Milios, Konstantinos
Electrified propulsion-based aircraft configurations are a new, revolutionary concept capable of mitigating the impact of global aviation on the climate. However, the introduction of a new electric powertrain to the existing propulsion system has created a series of challenges. Compared to conventional fuel-only vehicles, multiple energy sources are available to meet the system power requirements throughout the flight envelope. Electrified flight segments (eTaxi, takeoff boost, climb boost, etc.) can lead to large variations in total mission fuel burn depending on the amount and duration of electric power provided. Electrified propulsion systems are radical innovations and, as such, entail a high degree of risk in technical and financial performance. Traditional project management methods for new products, such as the stage-gate model, tend to favor more conventional engine advancements whose associated technologies and economics are better understood, leading to promising novel concepts being discarded during the early design phases. With cost overruns and schedule delays being a common theme among new airplane development programs, it is imperative that the most promising electrified propulsion concepts advance to the later stages of product development. The present work proposes a techno-economic approach for evaluating hybrid-electric propulsion architectures. The technical feasibility and financial viability of notional hybrid-electric concepts are concurrently quantified during the conceptual/exploratory design phase, in combination with uncertainty analysis associated with low-maturity technologies and dynamic economic environments. A technical framework was developed based on the Environmental Design Space (EDS) simulation tool, capable of performing sizing, mission, and emission analysis of a hybrid-electric aircraft. 
A comprehensive cost model for hybrid-electric systems was developed and applied for calculating the financial performance of each notional concept. Technical and financial uncertainties associated with hybrid-electric propulsion systems were identified and their impact on the overall business case performance of each concept measured. Finally, the proposed techno-economic framework is demonstrated using a multi-variable scenario-based analysis for determining the impact of external market factors (fuel prices, electricity prices, environmental policies, etc.) on the evaluation of hybrid-electric propulsion systems.
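The uncertainty-propagation idea behind the scenario analysis can be sketched as a small Monte Carlo study. Every number below (prices, energy flows, amortization) is a made-up placeholder, and this single-line operating-cost comparison is a drastic simplification of the thesis's comprehensive cost model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # Monte Carlo draws over uncertain market factors

# Hypothetical market-factor distributions (per-flight-hour economics).
fuel_price    = rng.normal(0.80, 0.15, n)   # $/kg
elec_price    = rng.normal(0.12, 0.04, n)   # $/kWh
fuel_saved    = 120.0   # kg/flight-hour saved by hybridization (assumed)
elec_used     = 350.0   # kWh/flight-hour drawn by the electric chain (assumed)
battery_amort = 55.0    # $/flight-hour battery amortization (assumed)

# Operating saving of the hybrid concept vs. a fuel-only baseline,
# under each sampled market scenario.
savings = fuel_saved * fuel_price - elec_used * elec_price - battery_amort
print(f"mean operating saving: ${savings.mean():.1f}/FH, "
      f"P(saving > 0) = {(savings > 0).mean():.2f}")
```

The point of the exercise is the probability statement: under external-market uncertainty, a concept's business case is a distribution, not a single number.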
  • Item
    A Graph-Based Methodology for Model Inconsistency Identification and Robust Architecture Exploration and Analysis
    (Georgia Institute of Technology, 2022-05-03) Duca, Ruxandra
The rise in complexity in aircraft design and the move towards non-conventional architectures lead to errors being discovered late, when changes are costly. A leading cause is distributed design with isolated but interdependent models, which makes it difficult to maintain a consistent set of assumptions. Several gaps were identified, then a methodology was proposed to (1) define a novel, internally feasible candidate architecture, (2) ensure that external analysis models are consistent with it, and (3) systematically extract cross-tool dependencies for multi-disciplinary analysis setup. In the first step, Model-Based Systems Engineering was leveraged to create a formal descriptive model of a baseline architecture. For this, an interface-based ontology was formulated using rules about component terminals and a standardized set of interactions. Incremental exploration was then enabled by developing a query-and-action process to find elements that must be added or removed after a local component replacement. The process was demonstrated by sequentially electrifying subsystems of a conventional baseline, resulting in numerous changes and restoring the system’s internal feasibility. In the second step, the application of inconsistency detection methods was enabled by automating the search for semantic overlap between analysis models and the central descriptive model. For this, data from the two was encoded into labeled digraphs and an algorithm was used to find the maximum common subgraph. This was demonstrated between the electrified candidate architecture from the first step and a conventional aircraft model as seen by an analysis tool. After finding the equivalent elements, the inconsistency detection method was demonstrated. The last step leveraged the results of the first two: analysis tools linked to a cross-disciplinary descriptive view of the whole system. 
Using the central model as an intermediary, cross-tool constraints were extracted, even when the relevant parameters were not exposed as inputs or outputs. This was demonstrated between the analysis model in the second step and a localized thermal model. With a formal, cross-disciplinary view of the candidate architecture and a set of properly configured tools and cross-tool constraints, this methodology enables the exploration of subsystem architectures during preliminary design with less effort than current methods, and with the prospect of fewer errors being discovered in later stages of design.
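The "semantic overlap" step can be illustrated with two toy tool views encoded as labeled digraphs. The component names are invented, and the greedy label-based intersection below is only a stand-in: true maximum-common-subgraph search is NP-hard and is handled with a dedicated algorithm in the work above.

```python
# Two tool views of an architecture as labeled digraphs:
# nodes keyed by label, edges as (source, target) pairs. All names
# here are hypothetical examples.
descriptive_model = {
    "nodes": {"battery", "bus", "motor", "fan"},
    "edges": {("battery", "bus"), ("bus", "motor"), ("motor", "fan")},
}
analysis_model = {
    "nodes": {"battery", "bus", "motor", "gearbox"},
    "edges": {("battery", "bus"), ("bus", "motor"), ("motor", "gearbox")},
}

def label_overlap(g1, g2):
    """Shared nodes, and the edges present in both graphs that are
    induced on those shared nodes (a simplification of maximum
    common subgraph search)."""
    nodes = g1["nodes"] & g2["nodes"]
    edges = {e for e in g1["edges"] & g2["edges"]
             if e[0] in nodes and e[1] in nodes}
    return nodes, edges

nodes, edges = label_overlap(descriptive_model, analysis_model)
print("semantic overlap:", sorted(nodes), sorted(edges))
```

Once the equivalent elements are matched, inconsistency detection reduces to comparing the attributes attached to the matched nodes and edges in each model.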
  • Item
    TOWARD A ROBUST COMPUTATIONAL SOLUTION FOR FORMAL VERIFICATION AND VALIDATION IN MODEL-BASED SYSTEMS ENGINEERING
    (Georgia Institute of Technology, 2022-05-02) Gharbi, Aroua
The development of reliable, large complex systems depends on a systematic approach with well-established standards and practices. One of these standards is Model-Based Systems Engineering (MBSE), which adopts an approach that places models at the center of designing, analyzing, and maintaining products throughout their life cycle. To maximize the quality and output of these models, multiple verification and validation activities need to be conducted throughout this process. Despite numerous advances, these activities are time-consuming, primarily based on heuristics, and performed in a bottom-up approach that assumes that the validity of subsystems guarantees the correctness of their composite model. Popularized in the 1960s, formal methods rectify the shortcomings of heuristic approaches by using mathematics to provide proof of correctness. Formal verification and validation (V&V) are heavily used in software design and engineering to help generate correct code and identify unforeseen situations. Formal V&V techniques used in MBSE are extrapolated from software engineering practices. They center on model checking, which is a form of verification only. Therefore, a new approach to formal verification and validation in MBSE needs to stem from the characteristics of the discipline itself. A few authors have attempted to provide a rigorous foundation for MBSE. The most notable and comprehensive one is the Tricotyledon Theory of System Design (T3SD), developed by Wayne Wymore in 1993. Founded on set theory, T3SD laid the groundwork for a system design language to rigorously solve design engineering problems. Wymore was the first to coin the term MBSE and establish the tools and mechanisms to adopt it in a design process. However, this theory was a victim of its rigor and exhaustiveness, as the complexity of its mathematical constructs deterred practitioners from using it. 
For almost 30 years, all the concepts, problems, and examples developed by Wymore remained as an abstract proof in his book. Yet, T3SD has the mathematical formalism needed to create a robust verification and validation framework. For instance, the System Design Problem (SDR) provides a concise formulation of the MBSE design problem from which proof-based assertions can be deduced. In this thesis, a methodology is proposed to (1) provide computational implementations of the complex constructs of T3SD and (2) generate an algorithmic solution to SDR. In the first step of the proposed methodology, the theory elements are arranged hierarchically based on their inter-dependency. Next, the SDR statement is decoupled, leading to the identification of practical phases for a formal verification and validation task. These phases are centered on two critical T3SD concepts: the System Coupling Recipe (SCR), which is concerned with the structural composition of systems, and system homomorphisms, a mathematical tool to identify the equivalence between systems. To provide a computational implementation of the SCR, Wymore’s state transition diagrams were proved to be a special case of Finite-State Machines (FSMs). As FSMs are, in essence, mathematical models of computation, an algorithm was developed to support code that calculates the resultant of an SCR. The correctness of this implementation was demonstrated in multiple examples as part of this thesis. For the concept of system homomorphisms, its T3SD definition was reformulated using mathematical logic. The new formulation resulted in an instance of the satisfiability problem (SAT), for which Python code using the Gurobi optimizer was developed. The correctness of the reformulation and implementation was also validated and demonstrated in examples in this thesis. Finally, a holistic postulate for SDR was concluded. 
This postulate proposed a many-objective ordering solution of partially ordered sets (posets) for the formal approach to verification and validation. Aside from being the first extensive investigation of T3SD, the methodology developed as part of this research represents a first down payment toward a practical computational solution for formal verification and validation in MBSE. The algorithms and code developed in this thesis enable the set-up of real-life design problems, where the conformance between a candidate solution and its requirements can be established objectively.
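The system-homomorphism concept can be made concrete on two tiny deterministic state machines. The thesis reformulates this search as a satisfiability problem solved with the Gurobi optimizer; the brute-force enumeration below is only an illustrative substitute, and the machines themselves are invented examples.

```python
from itertools import product

# Two small deterministic state machines over the same input alphabet,
# given as transition maps (state, input) -> next state. A homomorphism
# h maps states of M1 to states of M2 such that
# h(delta1(s, a)) == delta2(h(s), a) for every state s and input a.
M1 = {("s0", "a"): "s1", ("s1", "a"): "s0",
      ("s0", "b"): "s0", ("s1", "b"): "s1"}
M2 = {("t0", "a"): "t0", ("t0", "b"): "t0"}

def homomorphisms(m1, m2):
    """Brute-force search over all candidate state mappings."""
    states1 = sorted({s for s, _ in m1})
    states2 = sorted({s for s, _ in m2})
    inputs = sorted({a for _, a in m1})
    found = []
    for image in product(states2, repeat=len(states1)):
        h = dict(zip(states1, image))
        if all(h[m1[(s, a)]] == m2[(h[s], a)] for s in states1 for a in inputs):
            found.append(h)
    return found

print(homomorphisms(M1, M2))
```

Here M2 is a one-state machine, so the only candidate map collapses both states of M1 onto it, and the check confirms it commutes with every transition.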
  • Item
    A Framework for Integrating Advanced Air Mobility Vehicle Development, Safety and Certification
    (Georgia Institute of Technology, 2022-04-28) Markov, Alexander
As urbanization continues to grow worldwide, cities are experiencing challenges dealing with increases in pollution and congestion and the limited availability of public transportation. A new market in aviation, Advanced Air Mobility, has emerged to address these challenges by engineering novel all-electric aircraft meant to transport people within and between cities quickly and efficiently. The scale of this market and the associated operations means that vehicles will need to fly with increased autonomy. The lack of highly trained and skilled pilots, along with the increased workload for novel aircraft, makes piloted aircraft infeasible at the scale intended for Advanced Air Mobility. While a variety of concepts have been created to meet the performance needs of such operations, the safety and certification requirements of these aircraft remain unclear. The paradigm shift from conventional aircraft to novel, highly integrated, and autonomous aircraft presents many challenges which motivate this work. An emphasis is placed on the safety assessment and the gaps between current regulations and the needs of Advanced Air Mobility. The research objective of this work is to develop a framework for the development and safety assessment of autonomous Advanced Air Mobility aircraft by first examining the existing methods, techniques, and regulations. In doing so, several gaps are identified pertaining to the hazard analysis, the reliability analysis of Integrated Modular Avionics systems, and the inclusion of a Run-Time Assurance architecture for vehicle control. An improved hazard analysis approach is developed to capture functional failures as well as systematic areas that can lead to unsafe system behavior. Systems-Theoretic Process Analysis is used to supplement the Continuous Functional Hazard Assessment so that system behavior and component interactions can be captured. 
Unsafe system and component actions are identified and used to develop loss scenarios which provide context to the specific conditions that lead to loss of critical vehicle functionality. This information is traced back to identified hazards and used to establish constraints to mitigate unsafe behavior. The Functional Hazard Assessment is then applied to applicable scenarios to provide severity and risk information so that quantitative metrics can be used in addition to qualitative ones. The improved approach develops requirements and determines component and system constraints so that requirements can be refined. It also develops a control structure of the system and assigns traceable items at each step to track how unsafe actions, losses, hazards, and constraints are linked. To improve the reliability modeling of complex modular avionics systems utilizing Multi-Core Processing, a Dynamic Bayesian Network modeling method is developed. This method first utilizes the existing methods defined in ARP 4761 for reliability analysis, namely the Fault Tree Analysis. A mapping is identified for converting fault trees to Bayesian networks, before a Dynamic Bayesian Network is developed by defining how component reliability changes with time. The capability to model the reliability of these kinds of systems over time is, by itself, useful for developing and evaluating maintenance schedules. Additionally, the method can handle degradable and repairable components and has the capability to infer failure probabilities using observed evidence. This is useful for identifying weak areas of the system that may be the most likely to cause an overall system failure. A secondary capability is the modeling of uncertainty and the reliability impacts of Multi-Core Processing factors. 
Subject Matter Expert input and test data can be used to develop conditional dependencies between factors like Worst-Case Execution Time, complexity, and partitioning of multi-core systems and their impact on the reliability of the Real-Time Operating System. The added safety challenges of interference and system complexity can be modeled earlier in the design process and can quickly be updated as more information becomes available. Finally, the safe inclusion of autonomy is addressed. To do so, a Simplex architecture is chosen for the development and testing of complex controllers. These controllers are non-deterministic in nature and would otherwise not be certifiable as a result. The Simplex architecture uses an assured backup controller that is triggered when a monitor senses that some predefined safety threshold is breached, and gives control back once the system returns to nominal operations. This architecture enables the use of complex control and functionality while also enabling the overall system to be certified. A model predictive control algorithm is developed using a recursive neural network and a receding horizon control scheme that allows a simple system to be controlled quickly and accurately. A PID controller is used as the assured backup controller, and the monitoring and triggering capability is demonstrated. The architecture successfully triggers the backup when a threshold is exceeded and hands control back over to the complex controller when the system is brought back to nominal conditions. The main contribution of this dissertation is the development of a modified development assurance and safety management framework that is applicable to Advanced Air Mobility aircraft. The modifications made are specifically targeted at the challenges of applying the existing framework to novel, integrated, complex, and autonomous aircraft. 
This supports the objective of this research and provides guidance for how existing well understood and trusted methods can be modified for novel applications.
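The Simplex switching logic described above can be sketched on a one-dimensional toy plant. The gains, thresholds, and the deliberately over-aggressive "complex" controller below are all invented; the thesis uses a neural-network MPC and a PID backup, whereas this sketch uses two proportional laws purely to show the monitor/switch/hand-back cycle.

```python
# Toy run-time assurance (Simplex) loop on a 1-D plant x' = u.
SAFE = 1.0        # monitored safety threshold on |x| (assumed)
RECOVER = 0.3     # hand authority back once |x| falls below this (assumed)

def complex_ctrl(x):   # stand-in for a learned / MPC controller,
    return -5.0 * x    # deliberately over-aggressive: it can diverge

def backup_ctrl(x):    # simple assured proportional controller
    return -0.5 * x

x, dt, mode, peak = 0.9, 0.5, "complex", 0.0
trace = []
for _ in range(20):
    if mode == "complex" and abs(x) > SAFE:
        mode = "backup"          # monitor trips: switch to assured control
    elif mode == "backup" and abs(x) < RECOVER:
        mode = "complex"         # nominal again: return authority
    u = complex_ctrl(x) if mode == "complex" else backup_ctrl(x)
    x = x + dt * u               # simple Euler integration of x' = u
    peak = max(peak, abs(x))
    trace.append(mode)

print(trace)
```

Unattended, the aggressive controller alone would diverge (each step multiplies the state by -1.5); the monitor keeps the excursion bounded by repeatedly handing control to the assured backup.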
  • Item
    Bayesian, Gradient-Free, and Multi-Fidelity Supervised Dimension Reduction Methods for Surrogate Modeling of Expensive Analyses with High-Dimensional Inputs
    (Georgia Institute of Technology, 2022-04-20) Gautier, Raphael H.
    Modern approaches to engineering design rely on decision-support tools such as design space exploration, engineering optimization, or uncertainty quantification, to make better-informed design decisions. Such approaches typically rely on physics-based analyses that model the aspects of the system-of-interest that are relevant to the design task. As they operate by repeatedly evaluating their underlying analyses, carrying out these so-called “many-query applications” may become prohibitively expensive. Surrogate models act as enablers by replacing the online cost of evaluating analyses with a smaller offline cost spent to gather data used to train a cheap-to-evaluate mathematical model. Two current trends however make the generation of surrogate models more challenging and may therefore hinder the application of modern approaches. First, analyses of higher fidelity and greater computational cost are increasingly used to gather more detailed and accurate design knowledge early on in the design process, leading to the availability of fewer training observations under a constant analysis budget. Second, higher-dimensional parameter spaces are being considered, for example motivated by a more thorough exploration of the design space, the investigation of novel vehicle configurations, or the desire to retain design freedom longer, leading to surrogate models with high-dimensional inputs whose training suffers from the curse of dimensionality. In this thesis, we propose to investigate methods that address the impacts of these two trends on the generation of surrogate models: we seek new methods better suited for the creation of surrogate models with high-dimensional inputs and using only relatively few training observations. 
In particular, we focus on three surrogate modeling scenarios that map to the three research areas structuring this thesis: 1) single-fidelity surrogate modeling, 2) multi-fidelity surrogate modeling, and 3) active sampling in the multi-fidelity context. The methods proposed in this thesis rely on approximation by ridge functions to alleviate the curse of dimensionality. This approach consists of first projecting the original high-dimensional inputs onto a low-dimensional feature space, followed by a traditional regression. Accordingly, training such approximations consists of 1) determining a relevant projection, and 2) training the regression model. Multiple contributions are made in this thesis, starting in the single-fidelity context with a fully Bayesian and gradient-free formulation of approximation by ridge functions. Compared to existing approaches, the proposed method enables a full quantification of epistemic uncertainty due to limited training data, in both the regression parameters and the low-dimensional projection. Through a thorough study conducted on multiple datasets originating from science and engineering applications, it is shown to outperform existing state-of-the-art methods. Alternate methods for determining the dimension of the low-dimensional feature space, which aim to address shortcomings of existing methods, are then proposed and assessed. These advancements are then brought to the multi-fidelity context by altering a deep multi-fidelity Gaussian process model to include an initial projection of its inputs and a fully Bayesian approach to its training. Under certain conditions, this approach is shown to make better use of a given analysis budget compared to relying on a single fidelity. The relationship between the projections used for the low- and high-fidelity parts of the model is then investigated. Two approaches to sampling leveraging the feature space are formulated and assessed. 
The proposed approach to experimental design for selecting the location of high-fidelity observations is shown to outperform a traditional design of experiments in the original input space, but the proposed active sampling approach does not yield any additional improvement. Finally, a coherent approach to multi-fidelity modeling is assembled that leverages knowledge of the low-dimensional feature space to assist the selection of expensive, high-fidelity observations; it is shown to outperform the state-of-the-art deep multi-fidelity Gaussian process method.
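The ridge-function idea (project high-dimensional inputs to a low-dimensional feature, then regress) can be shown on synthetic data. This sketch is not the thesis's fully Bayesian, gradient-free inference: the hidden direction is recovered with a crude linear least-squares fit, and the test function and dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n = 20, 500                       # input dimension, training points (assumed)

# Synthetic "expensive analysis" whose output varies along one hidden
# direction w_true -- exactly the structure f(x) ~ g(W^T x) exploits.
w_true = rng.standard_normal(d)
w_true /= np.linalg.norm(w_true)
f = lambda X: (X @ w_true) ** 3

X = rng.standard_normal((n, d))
y = f(X)

# Estimate the active direction via linear least squares (a crude,
# non-Bayesian stand-in for the projection inference).
w_hat = np.linalg.lstsq(X, y, rcond=None)[0]
w_hat /= np.linalg.norm(w_hat)

# One-dimensional feature, then an ordinary cubic regression on it.
t = X @ w_hat
coef = np.polyfit(t, y, deg=3)

X_test = rng.standard_normal((1000, d))
err = np.sqrt(np.mean((np.polyval(coef, X_test @ w_hat) - f(X_test)) ** 2))
print(f"ridge-surrogate test RMSE: {err:.3f} (response std ~ {y.std():.2f})")
```

A direct 20-dimensional regression with 500 points would struggle; fitting a one-dimensional curve along the recovered direction sidesteps the curse of dimensionality, which is the point of the ridge approximation.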
  • Item
    Ship and Naval Technology Trades-Offs for Science And Technology Investment Purposes
    (Georgia Institute of Technology, 2022-01-14) Gradini, Raffaele
Long-term naval planning has always been a challenge, but in recent years the difficulty has increased. The degradation of the security environment is leading toward a more volatile, uncertain, complex, and ambiguous world, heavily affecting the quality of predictions needed in long-term defense technology investments. This work tackles the problem from the perspective of the maritime domain, with a new approach stemming from the state of the art in the defense investment field. Moving away from classic methodologies that rely on well-defined assumptions, it is possible to find investment processes that are broad enough, yet concrete, to support decision making in naval technology trades for science and technology purposes. In fulfilling this objective, this work is divided into two main areas: identifying technological gaps in the security scenario and providing robust technology investment strategies to cover those gaps. The core of the first part is the capability of decomposing maritime assets using modern taxonomies, to map the impact of different technologies on ships. Once technologies are mapped, they can be traded inside assets, and assets inside fleets, to quantitatively evaluate the overall fleet robustness. The first deliverable achieved through this process is called Vulnerable Scenarios, a list of possible conflict scenarios in which a tested fleet would consistently fail. The second deliverable is called Robust Strategies and consists of different technology investments that allow the studied fleet to succeed in the discovered Vulnerable Scenarios. To find the first deliverable, a large set of scenarios was simulated. The results of this simulation were analyzed using the Patient Rule Induction Method to isolate, among the large set of relevant cases, a subgroup of Vulnerable Scenarios. These were identified by highlighting commonalities in shared parameters and variables. 
Once the Vulnerable Scenarios were discovered, an ad-hoc adaptive response system using a “signpost and trigger” mechanism was used to identify different technologies on the ships studied that could enhance the overall robustness of the fleet. In identifying these technologies, the adaptive system was supported by different taxonomies in performing the technological trades that allowed the algorithm to find Robust Technology Strategies. The methodology was completed by a ranking system designed to first check all the Robust Strategies in all the scenarios of interest, and then compare them against ranking metrics defined by decision makers. To test the created methodology, several experiments were conducted across two use cases. The first use case, which involved an anti-submarine warfare (ASW) mission, was used to demonstrate the individual pieces employed in the creation of the methodology. The second use case, involving a large operation made of several tasks, was used to test the overall methodology as one. Both use cases were designed on the same original scenario created in collaboration with former generals and admirals of the US Air Force and the Italian Navy. The primary results of these experiments show that once Vulnerable Scenarios are discovered, it is possible to employ an iterative algorithm that recursively infuses new technologies into the fleet. This process is repeated until Robust Technology Strategies that can support the fleet are selected. The missions designed demonstrated the presence of gaps which had to be covered via technology investment, showing how planners will have to account for new technologies to be able to succeed in future challenges. The methodology created in this thesis provides an innovative way of enhancing the screening of maritime scenarios, reducing the lead time for investment decisions on naval technologies. 
In conclusion, the work done in this thesis helps advance the state of the art of the methodologies used by planners when looking for Vulnerable Scenarios and for new technologies to invest in. The thesis demonstrates that by employing the proposed methodology, Vulnerable Scenarios and relevant technologies can be identified in less time than with current methods. These efforts will support planners and decision makers in reacting faster to new emerging threats in unforeseen naval scenarios and will enable them to rapidly identify the areas where more investment is needed.
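The scenario-discovery step based on the Patient Rule Induction Method can be illustrated with a greedy peeling sketch. The two scenario parameters, the synthetic "fleet fails" region, and the peeling fractions are all invented for illustration; this is a bare-bones PRIM, not the thesis's full workflow.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 2000

# Synthetic scenario ensemble: two scenario parameters and a binary
# "fleet fails" outcome concentrated where both parameters are high.
X = rng.uniform(0, 1, (n, 2))
fail = ((X[:, 0] > 0.7) & (X[:, 1] > 0.6)).astype(float)

def prim(X, y, alpha=0.05, min_support=0.05):
    """Greedy PRIM-style peeling: repeatedly trim the alpha-fraction
    edge (per dimension, per side) that most increases the mean
    outcome inside the box."""
    lo, hi = X.min(axis=0), X.max(axis=0)
    while True:
        inside = np.all((X >= lo) & (X <= hi), axis=1)
        if inside.mean() <= min_support:
            break
        best, best_mean = None, y[inside].mean()
        for j in range(X.shape[1]):
            for side, q in (("lo", alpha), ("hi", 1 - alpha)):
                cut = np.quantile(X[inside, j], q)
                lo2, hi2 = lo.copy(), hi.copy()
                if side == "lo": lo2[j] = cut
                else:            hi2[j] = cut
                m = np.all((X >= lo2) & (X <= hi2), axis=1)
                if m.sum() and y[m].mean() > best_mean:
                    best, best_mean = (lo2, hi2), y[m].mean()
        if best is None:
            break                # no peel improves purity: stop
        lo, hi = best
    return lo, hi

lo, hi = prim(X, fail)
print("vulnerable box:", np.round(lo, 2), np.round(hi, 2))
```

The returned box is the "Vulnerable Scenario" region: a compact description, in the scenario parameters, of where the simulated fleet consistently fails.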
  • Item
    RETROSPECTIVE AND EXPLORATORY ANALYSES FOR ENHANCING THE SAFETY OF ROTORCRAFT OPERATIONS
    (Georgia Institute of Technology, 2021-12-13) Chin, Hsiang-Jui
    According to recent safety reports, the accident rates associated with helicopter operations have reached a plateau and even show an increasing trend. More attention needs to be directed to this domain, and it has been suggested to expand the use of on-board flight data recorders for monitoring operations. With the expected growth of flight data records in the coming years, it is essential to conduct analyses and provide the findings to operators for risk mitigation. In this thesis, a retrospective analysis is proposed to detect potential anomalies in the flight data of rotorcraft operations. In the study, an algorithm is developed to detect the phases of flight, segmenting each flight into homogeneous entities. Anomaly detection is then performed on flight segments within the same flight phase and is implemented through a sequential approach. Aside from the retrospective analysis, the exploratory analysis aims to efficiently find the safety envelope and predict the recovery actions for a hazardous event. To facilitate the exploration of the corresponding operational space, a framework consisting of surrogate modeling and design of experiments is provided for tackling these tasks. In the study, autorotation, a maneuver used to land the vehicle after power loss, is treated as a use case to test and validate the proposed framework.
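The surrogate-modeling-plus-DOE idea above can be sketched in miniature as follows. The "simulation", the entry-condition space, and the 1-nearest-neighbor surrogate are all illustrative placeholders, not the thesis's actual rotorcraft models.

```python
# Minimal sketch: sample an entry-condition space (design of experiments),
# evaluate an expensive "simulation" at those points, then answer further
# queries cheaply with a toy 1-nearest-neighbor surrogate.
import random

random.seed(0)

def simulate(altitude, airspeed):
    # Placeholder for an expensive autorotation simulation: returns 1 if the
    # (altitude, airspeed) entry condition is assumed recoverable, else 0.
    return 1 if altitude * 0.01 + airspeed * 0.5 > 40 else 0

# Design of experiments: random samples over the entry-condition space
# (a real study would use a space-filling design such as Latin hypercube).
design = [(random.uniform(0, 2000), random.uniform(0, 120)) for _ in range(50)]
samples = [(pt, simulate(*pt)) for pt in design]

def surrogate(altitude, airspeed):
    # 1-NN surrogate: predict the outcome of the closest evaluated sample.
    # (A real surrogate would normalize the dimensions before measuring distance.)
    nearest = min(
        samples,
        key=lambda s: (s[0][0] - altitude) ** 2 + (s[0][1] - airspeed) ** 2,
    )
    return nearest[1]
```

Once trained, the surrogate lets the safety envelope be traced by querying many entry conditions without rerunning the simulation.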
  • Item
    A DATA-DRIVEN METHODOLOGY TO ANALYZE AIR TRAFFIC MANAGEMENT SYSTEM OPERATIONS WITHIN THE TERMINAL AIRSPACE
    (Georgia Institute of Technology, 2021-12-10) Corrado, Samantha Jane
    Air Traffic Management (ATM) systems are the systems responsible for managing the operations of all aircraft within an airspace. In the past two decades, global modernization efforts have been underway to increase ATM system capacity and efficiency, while maintaining safety. Gaining a comprehensive understanding of both flight-level and airspace-level operations enables ATM system operators, planners, and decision-makers to make better-informed and more robust decisions related to the implementation of future operational concepts. The increased availability of operational data, including widely accessible ADS-B trajectory data, and advances in modern machine learning techniques provide the basis for offline data-driven methods to be applied to analyze ATM system operations. Further, analysis of ATM system operations of arriving aircraft within the terminal airspace has the highest potential to impact safety, capacity, and efficiency levels, because the arrival flight phases have the highest rates of accidents and incidents. This research is therefore motivated by the question of how offline data-driven methods may be applied to ADS-B trajectory data to analyze ATM system operations at both the flight and airspace levels for arriving aircraft within the terminal airspace, to extract novel insights relevant to ATM system operators, planners, and decision-makers. An offline data-driven methodology to analyze ATM system operations is proposed involving the following three steps: (i) Air Traffic Flow Identification, (ii) Anomaly Detection, and (iii) Airspace-Level Analysis. The proposed methodology is implemented considering ADS-B trajectory data that was extracted, cleaned, processed, and augmented for aircraft arriving at San Francisco International Airport (KSFO) during the full year of 2019, as well as the corresponding extracted and processed ASOS weather data.
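As a rough illustration of the Air Traffic Flow Identification step (i), the sketch below clusters toy trajectory points with a minimal DBSCAN-style density algorithm and a weighted Euclidean distance. It is a simplified stand-in for the thesis's HDBSCAN implementation; the data, weights, and parameters are illustrative.

```python
# Minimal DBSCAN-style clustering with a weighted Euclidean distance, as a
# simplified stand-in for weighted-distance HDBSCAN flow identification.
import math

def wdist(a, b, w):
    # Weighted Euclidean distance: weights emphasize some trajectory features.
    return math.sqrt(sum(wi * (ai - bi) ** 2 for ai, bi, wi in zip(a, b, w)))

def dbscan(points, weights, eps, min_pts):
    labels = [None] * len(points)  # None = unvisited, -1 = noise
    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        neighbors = [j for j in range(len(points))
                     if wdist(points[i], points[j], weights) <= eps]
        if len(neighbors) < min_pts:
            labels[i] = -1  # not a core point: mark as noise for now
            continue
        labels[i] = cluster
        seeds = list(neighbors)
        while seeds:  # expand the cluster from its core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # border point: absorb, do not expand
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = [k for k in range(len(points))
                  if wdist(points[j], points[k], weights) <= eps]
            if len(jn) >= min_pts:
                seeds.extend(jn)
        cluster += 1
    return labels

# Two toy "flows" of aircraft positions (latitude-like, longitude-like):
flows = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
         (5.0, 5.0), (5.1, 5.0), (5.0, 5.1)]
labels = dbscan(flows, weights=(1.0, 1.0), eps=0.5, min_pts=2)
```

In practice the weighted distance would act on full trajectory feature vectors, and HDBSCAN additionally removes the need to fix `eps` by varying the density threshold.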
The Air Traffic Flow Identification step contributes a method to more reliably identify air traffic flows for arriving aircraft trajectories through a novel implementation of the HDBSCAN clustering algorithm with a weighted Euclidean distance function. The Anomaly Detection step contributes the novel distinction between spatial and energy anomalies in ADS-B trajectory data and provides key insights into the relationship between the two types of anomalies. Spatial anomalies are detected leveraging the aforementioned air traffic flow identification method, whereas energy anomalies are detected leveraging the DBSCAN clustering algorithm. Finally, the Airspace-Level Analysis step contributes a novel method to identify operational patterns and characterize operational states of aircraft arriving within the terminal airspace during specified time intervals, leveraging the UMAP dimensionality reduction technique and the DBSCAN clustering algorithm. Additionally, a gradient-boosted decision tree (XGBoost) model, trained on metrics derived from the ASOS weather data, provides the ability to predict a time interval’s operational pattern in advance.
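The energy-anomaly notion can be illustrated with a toy computation: specific energy height, E_s = h + v^2 / (2g), is evaluated for several arrivals at the same trajectory point, and an outlier is flagged. A simple z-score threshold stands in for the thesis's DBSCAN-based detection, and all data here are illustrative.

```python
# Toy energy-anomaly check: compute specific energy height E_s = h + v^2/(2g)
# for several arrivals and flag the one that deviates strongly from the norm.
# The z-score threshold is a simplified stand-in for DBSCAN-based detection.
import math

G = 9.81  # gravitational acceleration, m/s^2

def specific_energy(altitude_m, speed_mps):
    # Specific energy height: potential plus kinetic energy per unit weight.
    return altitude_m + speed_mps ** 2 / (2 * G)

# Illustrative (altitude m, speed m/s) for arrivals at one trajectory point;
# the last aircraft is both high and fast, i.e. a high-energy arrival.
states = [(900, 80), (920, 78), (910, 82), (905, 79), (1500, 120)]
energies = [specific_energy(h, v) for h, v in states]

mean = sum(energies) / len(energies)
std = math.sqrt(sum((e - mean) ** 2 for e in energies) / len(energies))
anomalies = [i for i, e in enumerate(energies) if abs(e - mean) / std > 1.5]
```

A density-based detector like DBSCAN generalizes this idea to full energy profiles along the trajectory rather than a single point.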