Organizational Unit:
Daniel Guggenheim School of Aerospace Engineering


Publication Search Results

Now showing 1 - 10 of 41
  • Item
    Robust aircraft subsystem conceptual architecting
    (Georgia Institute of Technology, 2013-11-19) Jackson, David Wayne
    Aircraft subsystems are key components in modern aircraft, the impact and significance of which have been constantly increasing. Furthermore, the architecture selection of these subsystems has overall system-level effects. Despite these effects, existing methods for determining the architecture, especially early in design, amount to the selection of traditional point solutions. Currently, aircraft subsystems are rarely examined during the conceptual design phase, despite the fact that this phase has a significant influence on aircraft cost and performance. For this reason, there is a critical need to examine subsystem architecture trades and investigate the design space during the conceptual design of an aircraft. Traditionally, after the aircraft conceptual design phase, subsystems are developed in a process that begins with the point selection of the architecture, continues with its development and analysis, and concludes with the detailed development of the subsystems. The choice of the point design of the architecture to be developed can be made using simplified models to explore the design space. This method, known as conceptual architecting, is explored in this dissertation. The dissertation also focuses on bringing actuation subsystem architecture trades into conceptual design because of the significant cost impact of this design phase and the interdependence of vehicle sizing with the subsystems' impact on the aircraft. The extent of these interdependencies is examined and found to be significant. As a result, this coupling must be captured to enable better-informed decision making. A methodology to examine the design space of aircraft subsystem architectures during the conceptual design of aircraft, while incorporating this coupling, is presented herein and applied specifically to actuation architectures.
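
The coupling the abstract highlights, vehicle sizing depending on subsystem choices that in turn scale with vehicle size, can be illustrated with a toy fixed-point sizing loop. The sketch below uses made-up architecture options, weight fractions, and power off-takes; it shows only why the trade must sit inside the sizing iteration, not any model from the dissertation.

```python
# Hypothetical actuation architecture options: (weight fraction of gross
# weight, shaft-power off-take in kW). All coefficients are placeholders.
ARCHITECTURES = {
    "hydraulic":           (0.012, 40.0),
    "electro-hydrostatic": (0.014, 55.0),
    "electromechanical":   (0.013, 60.0),
}

def size_vehicle(arch, payload_kg=5000.0, empty_frac=0.55, fuel_per_kw=2.0):
    """Fixed-point sizing loop: gross weight depends on subsystem weight,
    which depends on gross weight (the coupling noted in the abstract)."""
    subsys_frac, power_kw = ARCHITECTURES[arch]
    w_gross = 20000.0  # initial guess, kg
    for _ in range(100):
        w_subsys = subsys_frac * w_gross
        w_fuel_penalty = fuel_per_kw * power_kw  # fuel spent powering the subsystem
        w_new = (payload_kg + w_subsys + w_fuel_penalty) / (1.0 - empty_frac)
        if abs(w_new - w_gross) < 1e-6:
            break
        w_gross = w_new
    return w_gross

for arch in ARCHITECTURES:
    print(f"{arch:20s} -> sized gross weight {size_vehicle(arch):8.1f} kg")
```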
  • Item
    Requirement analysis framework of naval military system for expeditionary warfare
    (Georgia Institute of Technology, 2013-11-19) Lee, Hyun Seop
    Military systems are becoming more complex due to the demands of various types of missions, rapidly evolving technologies, and budgetary constraints. In order to support complex military systems, there is a need to develop a new naval logistic asset that can respond to global missions effectively. This development is based on requirements, which must be satisfiable within the budgetary constraints, address pressing real-world needs, and allow designers to innovate. This research is conducted to produce feasible and viable requirements for naval logistic assets in complex military systems. The process of finding these requirements involves diverse uncertainties about logistics, environment, and missions. To understand and address these uncertainties, this research includes instability analysis, operational analysis, sea state analysis, and disembarkation analysis. Through adaptive Monte Carlo simulation with maximum entropy, each uncertainty is represented by a corresponding probability distribution. From the Monte Carlo simulation, the concept of Probabilistic Logistic Utility (PLU) was created as a measure of logistic ability. To demonstrate the usability of this research, the procedure is applied to the Medium Exploratory Connector (MEC), an Office of Naval Research (ONR) innovative naval prototype. Finally, the preliminary design and multi-criteria decision-making methods become capable of incorporating requirements under uncertainty.
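
A minimal sketch of the uncertainty treatment described above, under assumed logistics numbers: maximum-entropy distributions are chosen from whatever information is available (uniform given only bounds, exponential given only a mean), and a PLU-style metric is estimated as the fraction of Monte Carlo runs in which delivered tonnage meets demand. The quantities and the utility definition here are illustrative readings, not the thesis's actual models.

```python
import random

def max_entropy_sample(spec):
    """Maximum-entropy draw given limited information: uniform when only
    bounds are known, exponential when only a mean is known."""
    kind, *params = spec
    if kind == "bounds":
        lo, hi = params
        return random.uniform(lo, hi)
    if kind == "mean":
        return random.expovariate(1.0 / params[0])
    raise ValueError(kind)

def probabilistic_logistic_utility(n_runs=20000):
    """PLU estimated as the fraction of runs in which delivered tonnage
    meets demand; all inputs and numbers are invented for the sketch."""
    met = 0
    for _ in range(n_runs):
        sea_state_delay_h = max_entropy_sample(("mean", 6.0))     # only mean known
        payload_t = max_entropy_sample(("bounds", 300.0, 400.0))  # only bounds known
        transit_h = max_entropy_sample(("bounds", 10.0, 14.0))
        sorties = 72.0 / (transit_h + sea_state_delay_h)          # 72 h mission window
        met += (sorties * payload_t >= 1200.0)                    # demand: 1200 t
    return met / n_runs

print(f"PLU ~ {probabilistic_logistic_utility():.3f}")
```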
  • Item
    An efficient approach for high-fidelity modeling incorporating contour-based sampling and uncertainty
    (Georgia Institute of Technology, 2013-11-18) Crowley, Daniel R.
    During the design process for an aerospace vehicle, decision-makers must have an accurate understanding of how each choice will affect the vehicle and its performance. This understanding is based on experiments and, increasingly often, computer models. In general, as a computer model captures a greater number of phenomena, its results become more accurate for a broader range of problems. This improved accuracy typically comes at the cost of significantly increased computational expense per analysis. Although rapid analysis tools have been developed that are sufficient for many design efforts, those tools may not be accurate enough for revolutionary concepts subject to demanding flight conditions such as transonic or supersonic flight and extreme angles of attack. At such conditions, the simplifying assumptions of the rapid tools no longer hold. Accurate analysis of such concepts would require models that do not make those simplifying assumptions, with the corresponding increases in computational effort per analysis. As computational costs rise, exploration of the design space can become exceedingly expensive. If this expense cannot be reduced, decision-makers are forced to choose between a thorough exploration of the design space using inaccurate models and the analysis of a sparse set of options using accurate models. This problem is exacerbated as the number of free parameters increases, limiting the number of trades that can be investigated in a given time. In the face of limited resources, it becomes critically important that only the most useful experiments be performed, which raises two questions: how can the most useful experiments be identified, and how can experimental results be used most effectively? This research effort focuses on identifying and applying techniques to address these questions. The demonstration problem for this effort was the modeling of a reusable booster vehicle, which would be subject to a wide range of flight conditions while returning to its launch site after staging. Contour-based sampling, an adaptive sampling technique, seeks cases that will improve the prediction accuracy of surrogate models for particular ranges of the responses of interest. In the case of the reusable booster, contour-based sampling was used to emphasize configurations with small pitching moments; the broad design space included many configurations that produced uncontrollable aerodynamic moments for at least one flight condition. By emphasizing designs that were likely to trim over the entire trajectory, contour-based sampling improves the predictive accuracy of surrogate models for such designs while minimizing the number of analyses required. The simplified models mentioned above, although less accurate for extreme flight conditions, can still be useful for analyzing performance at more common flight conditions, and they may also offer insight into trends in the response behavior. Data from these simplified models can be combined with more accurate results to produce surrogate models with better accuracy than the simplified models alone, at less cost than if only expensive analyses were used. Of the data fusion techniques evaluated, Ghoreyshi cokriging was found to be the most effective for the problem at hand. Lastly, uncertainty present in the data was found to negatively affect the predictive accuracy of surrogate models. Most surrogate modeling techniques neglect uncertainty in the data and treat all cases as deterministic. This is plausible for data produced by computer analyses, which are often assumed to be perfectly repeatable and thus truly deterministic. However, a number of sources of uncertainty, such as solver iteration or surrogate model prediction accuracy, can introduce noise into the data. If these sources of uncertainty are captured and incorporated when surrogate models are trained, the resulting surrogate models are less susceptible to that noise and correspondingly have better predictive accuracy. This was accomplished in the present effort by capturing the uncertainty information via nuggets added to the kriging model. By combining these techniques, surrogate models could be created that exhibit better predictive accuracy while selecting the most informative experiments possible. This significantly reduced the computational effort expended compared to a more standard approach using space-filling samples and data from a single source. The relative contributions of each technique were identified, and observations were made pertaining to the most effective way to apply the separate and combined methods.
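
One plausible reading of contour-based sampling, sketched with scikit-learn: new cases are chosen where the surrogate predicts the response closest to the contour of interest (here trim, Cm = 0) relative to its own uncertainty, and a WhiteKernel term plays the role of the nugget that absorbs solver noise. The analysis function, acquisition rule, and all settings below are stand-ins, not the dissertation's.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def pitching_moment(x):  # noisy stand-in for an expensive CFD analysis
    return np.sin(3.0 * x[:, 0]) * (1.0 - x[:, 1]) + 0.1 * rng.normal(size=len(x))

X = rng.uniform(0.0, 1.0, size=(8, 2))  # initial space-filling sample
y = pitching_moment(X)

# The WhiteKernel acts as the nugget mentioned in the abstract: it lets the
# kriging model absorb solver noise instead of interpolating it exactly.
gp = GaussianProcessRegressor(RBF(0.3) + WhiteKernel(1e-2), normalize_y=True)

for _ in range(20):  # contour-based adaptive sampling loop
    gp.fit(X, y)
    cand = rng.uniform(0.0, 1.0, size=(500, 2))
    mu, sigma = gp.predict(cand, return_std=True)
    # Favor candidates likely near the Cm = 0 trim contour that are still
    # uncertain; this particular acquisition rule is our illustrative choice.
    score = np.abs(mu - 0.0) / (sigma + 1e-9)
    x_new = cand[np.argmin(score)][None, :]
    X = np.vstack([X, x_new]); y = np.append(y, pitching_moment(x_new))

print(f"{len(X)} samples concentrated near the trim contour")
```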
  • Item
    A hybrid probabilistic method to estimate design margin
    (Georgia Institute of Technology, 2013-11-15) Robertson, Bradford E.
    Weight growth has been a significant factor in nearly every space and launch vehicle development program. In order to account for weight growth, program managers allocate a design margin. However, methods of estimating design margin are not well suited for the task of assigning a design margin for a novel concept. In order to address this problem, a hybrid method of estimating margin is developed. This hybrid method utilizes range estimating, a well-developed method for conducting a bottom-up weight analysis, and a new forecasting technique known as executable morphological analysis. Executable morphological analysis extends morphological analysis in order to extract quantitative information from the morphological field. Specifically, the morphological field is extended by adding attributes (probability and mass impact) to each condition. This extended morphological field is populated with alternate baseline options with corresponding probabilities of occurrence and impact. The overall impact of alternate baseline options can then be estimated by running a Monte Carlo analysis over the extended morphological field. This methodology was applied to two sample problems. First, the historical design changes of the Space Shuttle Orbiter were evaluated utilizing original mass estimates. Additionally, the FAST reference flight system F served as the basis for a complete sample problem; both range estimating and executable morphological analysis were performed utilizing the work breakdown structure created during the conceptual design of this vehicle.
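
The extended morphological field lends itself to a compact sketch: each alternate baseline option carries a probability of adoption and a mass impact, and a Monte Carlo pass over the field yields a distribution of weight growth from which a margin percentile can be read. Rows, probabilities, and masses below are invented for illustration, not taken from the Orbiter or FAST studies.

```python
import random

# Illustrative extended morphological field: each row lists alternate
# baseline options as (name, probability of adoption, mass impact in kg).
FIELD = {
    "thermal protection": [("denser tiles", 0.30, +450.0), ("blankets", 0.20, -120.0)],
    "landing gear":       [("strengthened gear", 0.25, +200.0)],
    "avionics":           [("added redundancy", 0.40, +90.0)],
    "structure":          [("gauge increase", 0.35, +600.0)],
}

def sample_growth():
    """One Monte Carlo pass over the field: each alternate option is adopted
    (or not) with its probability, and adopted mass impacts accumulate."""
    growth = 0.0
    for options in FIELD.values():
        for _name, prob, mass in options:
            if random.random() < prob:
                growth += mass
    return growth

runs = sorted(sample_growth() for _ in range(50000))
p50, p90 = runs[len(runs) // 2], runs[int(0.9 * len(runs))]
print(f"median growth {p50:.0f} kg; a 90th-percentile margin would be {p90:.0f} kg")
```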
  • Item
    Bayesian adaptive sampling for discrete design alternatives in conceptual design
    (Georgia Institute of Technology, 2013-08-23) Valenzuela-Del Rio, Jose Eugenio
    The number of technology alternatives has grown to satisfy the increasingly demanding goals of modern engineering. These technology alternatives are handled in the design process as either concepts or categorical design inputs. Additionally, designers wish to bring increasingly accurate, but also computationally burdensome, simulation tools into early design to obtain better-performing initial designs that are more valuable in subsequent design stages. This constrains the computational budget available to explore the design space. These two factors reveal the need for a conceptual design methodology that uses sophisticated tools efficiently for engineering problems with several concept solutions and categorical design choices. Enhanced initial designs and discrete alternative selection are pursued. Advances in computational speed and the development of Bayesian adaptive sampling techniques have enabled the industry to move from look-up tables and simplified models to complex physics-based tools in conceptual design. These techniques focus computational resources on promising design areas. Nevertheless, the vast majority of the work has been done on problems with continuous spaces, whereas concepts and categories are treated independently. However, observations show that engineering objectives exhibit similar topographical trends across many engineering alternatives. In order to address these challenges, two meta-models are developed. The first borrows the Hamming distance and function-space norms from machine learning and functional analysis, respectively. These distances allow the definition of categorical metrics that are used to build a single probabilistic surrogate whose domain includes not only continuous and integer variables but also categorical ones. The second meta-model is based on a multi-fidelity approach that enhances a concept prediction with previous concept observations. These methodologies leverage similar trends seen in observations and make better use of sample points, increasing the quality of the output in discrete alternative selection and initial designs for a given analysis budget. An extension of stochastic mixed-integer optimization techniques to include the categorical dimension is developed by adding appropriate generation, mutation, and crossover operators. The resulting stochastic algorithm is employed to adaptively sample mixed-integer-categorical design spaces. The proposed surrogates are compared against traditional independent methods for a set of canonical problems and a physics-based rotorcraft model on a screened design space. Next, adaptive sampling algorithms built on the developed surrogates are applied to the same problems. These tests provide evidence of the merit of the proposed methodologies. Finally, a multi-objective rotorcraft design application is performed in a large domain space. This thesis provides several novel academic contributions. The first is the development of new efficient surrogates for systems with categorical design choices. Secondly, an adaptive sampling algorithm is proposed for systems with mixed-integer-categorical design spaces. Finally, previously sampled concepts can be leveraged to construct efficient surrogates of novel concepts. With engineering judgment, the design community could apply these contributions to discrete alternative selection and initial design assessment when similar topographical trends are observed across different categories and/or concepts. These contributions could also be crucial in overcoming the current cost of carrying a set of concepts, and wider design spaces in the categorical dimension, forward into preliminary design.
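
A minimal sketch of a mixed continuous/categorical kriging surrogate in the spirit described above: a squared-exponential kernel over the continuous inputs multiplied by an exponential of the Hamming distance over the categorical ones, so that a single model spans every category. The kernel form, hyperparameters, and toy rotor data are assumptions, not the dissertation's formulation.

```python
import numpy as np

def mixed_kernel(x1, c1, x2, c2, ls=0.5, theta=2.0):
    """Covariance over mixed continuous/categorical points: a squared
    exponential in the continuous part times exp(-theta * Hamming distance)
    in the categorical part (our own simple form)."""
    cont = np.exp(-np.sum((x1 - x2) ** 2) / (2.0 * ls ** 2))
    hamming = sum(a != b for a, b in zip(c1, c2))
    return cont * np.exp(-theta * hamming)

def fit_predict(Xc, Cc, y, xq, cq, noise=1e-8):
    """Plain kriging predictor using the mixed kernel; one surrogate covers
    every category instead of one independent model per category."""
    n = len(y)
    K = np.array([[mixed_kernel(Xc[i], Cc[i], Xc[j], Cc[j]) for j in range(n)]
                  for i in range(n)]) + noise * np.eye(n)
    k = np.array([mixed_kernel(xq, cq, Xc[i], Cc[i]) for i in range(n)])
    return k @ np.linalg.solve(K, y)

# Toy data: a rotor figure of merit vs. one continuous input and a blade concept.
Xc = np.array([[0.1], [0.4], [0.7], [0.2], [0.6]])
Cc = [("rect",), ("rect",), ("rect",), ("taper",), ("taper",)]
y = np.array([0.60, 0.72, 0.65, 0.63, 0.70])
print(fit_predict(Xc, Cc, y, np.array([0.5]), ("taper",)))
```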
  • Item
    A product family design methodology employing pattern recognition
    (Georgia Institute of Technology, 2013-08-23) Freeman, Dane Fletcher
    Sharing components in a product family requires a trade-off between the individual products' performances and overall family costs. It is critical for a successful family to identify which components are similar, so that sharing does not compromise the individual products' performances. This research formulates two commonality identification approaches for use in product family design and investigates their applicability in a generic product family design methodology. Having a commonality identification approach reduces the combinatorial sharing problem and allows more high-quality family alternatives to be considered. The first approach is based on the pattern recognition technique of fuzzy c-means clustering in component subspaces. If components from different products are similar enough to be grouped into the same cluster, then those components could possibly become the same platform. Fuzzy equivalence relations that show the binary relationship from one product's component to a different product's component can be extracted from the cluster membership functions. The second approach builds a Bayesian network representing the joint distribution of a design space exploration. Using this model, a series of inferences can be made based on product performance and component constraints. Finally, the posterior design variable distributions can be processed using a similarity metric such as the earth mover's distance to identify which products' components are similar to one another's.
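
The first commonality approach can be sketched with a small self-contained fuzzy c-means implementation: component descriptors from different products are clustered, and products whose components land in the same cluster become sharing candidates. The descriptors below are hypothetical.

```python
import numpy as np

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    """Small self-contained fuzzy c-means (Bezdek's algorithm): returns the
    membership matrix U (n_points x c) used here as the commonality signal."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))  # random initial fuzzy memberships
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        # Standard membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0)), axis=2)
    return U

# Hypothetical component descriptors (e.g., scaled size and load) for the same
# component type across four products in the family.
components = np.array([[1.0, 1.1], [1.05, 1.0], [2.4, 2.2], [1.1, 0.95]])
U = fuzzy_cmeans(components, c=2)
print("cluster per product:", U.argmax(axis=1))  # shared cluster -> sharing candidates
```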
  • Item
    A representation method for large and complex engineering design datasets with sequential outputs
    (Georgia Institute of Technology, 2013-08-22) Iwata, Curtis
    This research addresses the problem of creating surrogate models of high-level operations and sustainment (O&S) simulations with time sequential (TS) outputs. O&S is a continuous process of using and maintaining assets such as a fleet of aircraft, and the infrastructure to support this process is the O&S system. To track the performance of the O&S system, metrics such as operational availability are recorded and reported as a time history. Modeling and simulation (M&S) is often used as a preliminary tool to study the impact of implementing changes to O&S systems, such as investing in new technologies and changing inventory policies. A visual analytics (VA) interface is useful for navigating the data from the M&S process so that these options can be compared, and surrogate modeling enables some key features of the VA interface such as interpolation and interactivity. Fitting a surrogate model to TS data is difficult because of its size and nonlinear behavior. The Surrogate Modeling and Regression of Time Sequences (SMARTS) methodology was proposed to address this problem. An intermediate domain Z was calculated from the simulation output data such that a point in Z corresponds to a unique TS shape or pattern. A regression was then fit to capture the entire range of possible TS shapes using Z as the inputs, and a separate regression was fit to transform the inputs into Z. The method was tested on output data from an O&S simulation model and compared against other regression methods for statistical accuracy and visual consistency. The proposed methodology was shown to be conditionally better than the other methodologies.
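
One plausible reading of the SMARTS intermediate domain, sketched with PCA: the leading principal-component scores of the time histories serve as Z, one regression maps design inputs to Z, and the PCA basis maps Z back to a full time sequence. The toy O&S output and the linear regressions below are stand-ins for whatever the dissertation actually uses.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for O&S simulation output: availability time histories (rows)
# driven by two design inputs (say, spares level and repair rate).
X = rng.uniform(0.0, 1.0, size=(40, 2))
t = np.linspace(0.0, 1.0, 50)
Y = 0.7 + 0.2 * X[:, :1] - 0.3 * np.outer(X[:, 1], t) + 0.02 * rng.normal(size=(40, 50))

# Step 1: build the intermediate domain Z as the leading PCA scores of the
# time histories, so each point of Z indexes a unique TS shape. (This is our
# reading of the "intermediate domain"; the thesis' construction may differ.)
Ym = Y.mean(axis=0)
U, S, Vt = np.linalg.svd(Y - Ym, full_matrices=False)
Z = U[:, :2] * S[:2]  # low-dimensional shape coordinates

# Step 2: regress inputs -> Z (linear least squares as the simple choice) ...
A = np.hstack([X, np.ones((len(X), 1))])
W, *_ = np.linalg.lstsq(A, Z, rcond=None)

# Step 3: ... and map Z back to a full time history through the PCA basis.
def predict_series(x_new):
    z = np.append(x_new, 1.0) @ W
    return Ym + z @ Vt[:2]

print(predict_series(np.array([0.5, 0.5]))[:5])  # first few time steps
```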
  • Item
    A multidisciplinary framework for mission effectiveness quantification and assessment of micro autonomous systems and technologies
    (Georgia Institute of Technology, 2013-08-21) Mian, Zohaib Tariq
    Micro Autonomous Systems and Technologies (MAST) is an Army Research Laboratory (ARL) sponsored project based on a consortium of revolutionary academic and industrial research institutions working together to develop new technologies in the field of microelectronics, autonomy, micromechanics and integration. The overarching goal of the MAST consortium is to develop autonomous, multifunctional, and collaborative ensembles of microsystems to enhance small unit tactical situational awareness in urban and complex terrain. Unmanned systems are used to obtain intelligence at the macro level, but there is no real-time intelligence asset at the squad level. MAST seeks to provide that asset. Consequently, multiple integrated MAST heterogeneous platforms (e.g. crawlers, flyers, etc.) working together synergistically as an ensemble shall be capable of autonomously performing a wide spectrum of operational functions based on the latest developments in micro-mechanics, micro-electronics, and power technologies to achieve the desired operational objectives. The design of such vehicles is, by nature, highly constrained in terms of size, weight and power. Technologists are trying to understand the impacts of developing state-of-the-art technologies on the MAST systems while the operators are trying to define strategies and tactics on how to use these systems. These two different perspectives create an integration gap. The operators understand the capabilities needed on the field of deployment but not necessarily the technologies, while the technologists understand the physics of the technologies but not necessarily how they will be deployed, utilized, and operated during a mission. This not only results in a major requirements disconnect, representing the difference of perspectives between soldiers and the researchers, but also demonstrates the lack of quantified means to assess the technology gap in terms of mission requirements. This necessitates the quantification and resolution of the requirements disconnect and technology gap leading to re-definitions of the requirements based on mission scenarios. A research plan, built on a technical approach based on the simultaneous application of decomposition and re-composition or 'Top-down' and 'Bottom-up' approaches, was used for development of a structured and traceable methodology. The developed methodology is implemented through an integrated framework consisting of various decision-making tools, modeling and simulation, and experimental data farming and validation. The major obstacles in the development of the presented framework stemmed from the fact that all MAST technologies are revolutionary in nature, with no available historical data, sizing and synthesis codes or reliable physics-based models. The inherently multidisciplinary, multi-objective and uncertain nature of MAST technologies makes it very difficult to map mission level objectives to measurable engineering metrics. It involves the optimization of multiple disciplines such as Aero, CS/CE, ME, EE, Biology, etc., and of multiple objectives such as mission performance, tactics, vehicle attributes, etc. Furthermore, the concept space is enormous with hundreds of billions of alternatives, and largely includes future technologies with low Technology Readiness Level (TRL) resulting in high uncertainty. 
    The presented framework is a cyber-physical design and analysis suite that combines Warfighter mission needs and expert technologist knowledge with a set of design and optimization tools, models, and experiments in order to provide a quantitative measure of the requirements disconnect and technology gap mentioned above. This quantification provides the basis for redefining the requirements so that they are realistic in nature and ensure mission success. The research presents the development of this methodology and framework to address the core research objectives. The developed framework was then implemented on two mission scenarios of interest to the MAST consortium and the Army Research Laboratory, namely, Joppa Urban Dwelling and Black Hawk Down Interior Building Reconnaissance. Results demonstrate the framework's validity and serve as proof of concept for bridging the requirements disconnect between the Warfighter and the technologists. Billions of alternative MAST vehicles, composed of current and future technologies, were modeled and simulated, as part of a swarm, to evaluate their mission performance. In-depth analyses of the experiments conducted as part of the research present quantitative technology gaps that need to be addressed by technologists for successful mission completion. Quantitative values for vehicle specifications and systems' Measures of Performance were determined for an acceptable level of performance for the given missions. The consolidated results were used to define mission-based requirements for MAST systems.
  • Item
    Conceptual design methodology of distributed intelligence large scale systems
    (Georgia Institute of Technology, 2013-06-28) Nairouz, Bassem R.
    Distributed intelligence systems are starting to gain dominance in the field of large-scale complex systems. These systems are characterized by nonlinear behavior patterns that can only be predicted through simulation-based engineering. In addition, the autonomy, intelligence, and reconfiguration capabilities required by certain systems introduce obstacles that add another layer of complexity. However, there exists no standard process for the design of such systems. This research presents a design methodology focusing on distributed control architectures while concurrently considering the systems design process. The methodology has two major components. First, it introduces a hybrid design process based on the infusion of the control architecture and conceptual system design processes. The second component is the development of a control architecture metamodel that places a distinction between control configuration and control methods. This enables a standard representation of a wide spectrum of control architecture frameworks.
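
The configuration/method distinction at the heart of the metamodel can be sketched as a small data model: the configuration captures which controller nodes exist and how information flows between them, while the method captures the law each node runs, so a study can vary one axis while holding the other fixed. All class and field names below are our own, not the dissertation's.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class ControlMethod:
    name: str                      # e.g. "PID", "consensus", "MPC"
    law: Callable[[float], float]  # maps local error -> command

@dataclass
class ControlConfiguration:
    nodes: List[str]               # controller node identifiers
    links: List[Tuple[str, str]]   # directed information flows

@dataclass
class ControlArchitecture:
    config: ControlConfiguration
    methods: Dict[str, ControlMethod] = field(default_factory=dict)

    def is_decentralized(self) -> bool:
        # No single node receives links from every other node.
        fan_in = {n: 0 for n in self.config.nodes}
        for _src, dst in self.config.links:
            fan_in[dst] += 1
        return all(k < len(self.config.nodes) - 1 for k in fan_in.values())

# Same configuration, different methods: the separation lets a design study
# swap one axis while holding the other fixed.
ring = ControlConfiguration(["a", "b", "c"], [("a", "b"), ("b", "c"), ("c", "a")])
arch = ControlArchitecture(ring, {n: ControlMethod("P", lambda e: 0.8 * e) for n in ring.nodes})
print(arch.is_decentralized())
```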
  • Item
    Methodology for global optimization of computationally expensive design problems
    (Georgia Institute of Technology, 2013-06-25) Koullias, Stefanos
    The design of unconventional aircraft requires early use of high-fidelity physics-based tools to search the unfamiliar design space for optimum designs. Current methods for incorporating high-fidelity tools into early design phases for the purpose of reducing uncertainty are inadequate due to the severely restricted budgets that are common in early design as well as the unfamiliar design space of advanced aircraft. This motivates the need for a robust and efficient global optimization algorithm. This research presents a novel surrogate model-based global optimization algorithm to efficiently search challenging design spaces for optimum designs. The algorithm searches the design space by constructing a fully Bayesian Gaussian process model from a set of observations and then using the model to make new observations in promising areas where the global minimum is likely to occur. The algorithm is incorporated into a methodology that reduces failed cases and infeasible designs and provides large reductions in the objective function values of design problems. Results on four sets of algebraic test problems are presented, and the methodology is applied to an airfoil section design problem and a conceptual aircraft design problem. The method is shown to solve more nonlinearly constrained algebraic test problems than state-of-the-art algorithms and obtains the largest reduction in the takeoff gross weight of a notional 70-passenger regional jet versus competing design methods.
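
The search strategy described above is in the family of efficient global optimization: fit a Gaussian process to the observations, then sample where expected improvement is highest. The sketch below substitutes a single maximum-likelihood GP fit for the fully Bayesian treatment in the thesis, on a made-up one-dimensional objective.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(2)

def objective(x):                    # stand-in for an expensive analysis
    return np.sin(5.0 * x) + 0.5 * x # global minimum inside [0, 2]

X = rng.uniform(0.0, 2.0, size=(4, 1))
y = objective(X).ravel()

gp = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True)

for _ in range(15):
    gp.fit(X, y)
    cand = np.linspace(0.0, 2.0, 400)[:, None]
    mu, sigma = gp.predict(cand, return_std=True)
    # Expected improvement over the incumbent best observation (minimization);
    # a single ML fit stands in for the thesis's fully Bayesian GP.
    imp = y.min() - mu
    zscore = imp / np.maximum(sigma, 1e-12)
    ei = imp * norm.cdf(zscore) + sigma * norm.pdf(zscore)
    x_new = cand[[np.argmax(ei)]]
    X = np.vstack([X, x_new]); y = np.append(y, objective(x_new).ravel())

print(f"best found: f = {y.min():.4f} at x = {X[y.argmin()][0]:.3f}")
```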