During the system elements’ requirement-based test approach (RBT&E), it is difficult for both the Systems Engineering (SE) and Test and Evaluation (T&E) communities to objectively identify the corresponding system-of-interest (SOI) capability (i.e., the ability to achieve requirements throughout its intended operational environment) through aggregate interpretation and analysis of available element performance and verification data. This is because traditional RBT&E data are collected under measurement and verification conditions unique to each allocated element requirement (e.g., element A’s demonstrated y1-mph speed under an x1-degree verification condition versus element B’s demonstrated y2-ton payload at an x2-m altitude). Such mismatched variables cannot be quantitatively amalgamated into an objective, comprehensive, and representative forecast of SOI operational capability, forcing analysts and stakeholders to interpret element verification data subjectively and produce imprecise and/or incomplete SOI capability forecasts. In this study, the Scenario-Based Experimental-Design (SBED) test approach has been developed as a supplement to the traditional RBT&E approach to resolve these data-mismatch challenges. Through the implementation of Design of Experiments (DoE) test methods and response surface methodology statistical techniques, SBED models each element’s independent verification data as a response model with respect to (w.r.t.) the “SOI-element capability” variable (i.e., the ability to fulfill the corresponding element requirements), all as influenced by the overall SOI’s intended operational environment; it then objectively forecasts the SOI capability via Boolean-algebraic summation of the (now matching) element-performance response models. This study’s SBED test approach contributes to both SE and T&E through the introduction of two benefits that have been previously unobtainable during the SOI Implementation process.
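The mechanics described above can be sketched in code. In this minimal, hypothetical illustration, each element’s verification data are fit as a quadratic response model over a single shared operational variable, and the (now matching) models are combined with a Boolean conjunction to yield a coarse SOI capability forecast. All data values, requirement thresholds, and the single-variable envelope are invented for illustration and are not drawn from the study.

```python
# Hypothetical sketch of the SBED data-analysis idea: fit each element's
# verification data as a response model over a shared operational variable,
# then combine the element models Boolean-algebraically to forecast SOI
# capability. All numbers below are invented for illustration.
import numpy as np

def fit_quadratic(x, y):
    """Least-squares quadratic response model: y ~ b0 + b1*x + b2*x^2."""
    X = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return lambda xq: coef[0] + coef[1] * xq + coef[2] * xq**2

# Element A: speed (mph) verified at a few temperatures (deg C).
temp_a = np.array([-10.0, 0.0, 20.0, 40.0])
speed_a = np.array([52.0, 58.0, 61.0, 55.0])

# Element B: payload (tons) verified at different temperatures.
temp_b = np.array([-5.0, 10.0, 25.0, 35.0])
payload_b = np.array([2.1, 2.4, 2.3, 1.9])

model_a = fit_quadratic(temp_a, speed_a)
model_b = fit_quadratic(temp_b, payload_b)

# Evaluate both response models over the SOI's full operational
# envelope, resolving the original measurement-condition mismatch.
envelope = np.linspace(-10.0, 40.0, 101)
meets_a = model_a(envelope) >= 55.0   # hypothetical speed requirement
meets_b = model_b(envelope) >= 2.0    # hypothetical payload requirement

# Boolean combination: the SOI is "capable" where every element meets
# its requirement; the capable fraction is a coarse capability forecast.
soi_capable = meets_a & meets_b
forecast = soi_capable.mean()
print(f"SOI capability forecast over the envelope: {forecast:.0%}")
```

In the actual study, the response models span the full multivariate operational environment rather than a single variable, but the core move is the same: re-expressing mismatched element verification data on a common basis so they can be combined objectively.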
First, SBED delivers early, objective, and quantitative SOI capability forecast(s) that support upfront identification and management of SOI design-related risks and uncertainties. Second, it delivers more-descriptive “element capability” response models, which comprehensively characterize each element’s performance throughout the entire SOI operational environment, thus expanding the scope of element performance verification beyond just the requirement-specified conditions. These benefits are demonstrated through a data-analysis example in this dissertation that applies SBED’s data-analysis techniques to an existing United States Air Force flight-test data set and compares the SBED-produced quantitative SOI capability forecasts against the test’s original qualitative conclusions.