A Preliminary Framework for Test and Evaluation of Systems of Systems
Modern systems are becoming exponentially more complex and interconnected, particularly software-intensive systems. Department of Defense (DoD) mandates for net-centric capabilities have greatly complicated the challenges of testing new and legacy systems employed in a system of systems (SoS) environment. Traditional "platform-centric" test methodologies have not fared well in this environment, yet systems must still be tested. Reuse of legacy systems, system complexity, and the potential benefits of interconnectivity have driven systems into an SoS environment that is far more difficult to test than was initially thought. The desire to "test everything" has been recognized as unrealistic and unaffordable, in both cost and time. A case study of the Air Force Modeling and Simulation Training Toolkit, and how it has successfully addressed the challenges of interconnecting with multiple other systems, is presented to illustrate the test and evaluation challenges in an SoS environment. Lessons learned from the program's success are discussed, and a preliminary framework for more general cases of testing in an SoS environment is presented. Comments on that framework were solicited from acquisition professionals working on programs testing in the SoS environment. These comments are presented, and they indicate that the recommended framework is relevant in the opinion of those professionals. Recommendations for SoS T&E execution for both DoD and non-DoD programs are then presented, along with a recommendation for the formulation of an alternative DoD SoS acquisition model.