MARCH 2026 | Volume 47, Issue 1
IN THIS JOURNAL:
- Issue at a Glance
- Chairman’s Message
Professional Development
- ITEA Initiatives Reinvigorating Certified Test Professionals
Technical Articles
- Continuous Integration Test and Evaluation Concept of Operations
- A Practitioner's Perspective on Implementing an Agile-V Hybrid Model in Complex Engineering Systems
- Aircraft Instrumentation Integrates with Naval Test Wing Atlantic
- Building Trust in Autonomous Systems through an Alternative Test Strategy
News
- Association News
- Chapter News
- Corporate Member News
Continuous Integration Test and Evaluation Concept of Operations

Dr. William Fisher
Principal Systems Security Engineer, The MITRE Corporation; Ft. Meade, MD

Ben Jimenez
Digital Engineering Initiative Lead, The MITRE Corporation; McLean, VA

Mackenzie Moss
Lead Model Based Systems Engineer, The MITRE Corporation; Colorado Springs, CO

Natalia Henriquez Sanchez
Senior Systems Engineer, The MITRE Corporation; Los Angeles, CA
The view, opinions, and/or findings contained in this report are those of The MITRE Corporation and should not be construed as an official Government position, policy, or decision, unless designated by other documentation.
This technical data deliverable was developed using contract funds under Basic Contract No. W56KGU-18-D-0004.
Abstract
This paper presents a Concept of Operations (CONOPS) for Continuous Integration Test and Evaluation (CITE), an engineering method designed to address the growing complexity and risk to both system functionality and mission success in the Department of War (DoW) capability lifecycle. Building on advances in digital tools and model-based engineering, CITE elevates continual integration and test as a guiding tenet, enabling iterative development and agile responses to evolving operational needs. The approach leverages composable and executable models to facilitate frequent integration and testing, reduce risk, and accelerate delivery of warfighter capabilities. DoW lifecycle vignettes illustrate how CITE supports cross-domain collaboration, mission engineering, and digital transformation. The paper also discusses roll-out and implementation challenges and provides foundational principles for leadership to realize the benefits of CITE across the capability lifecycle.
Keywords: Continuous Integration Test and Evaluation (CITE), Agile Systems Engineering (Agile SE), Model-Based Systems Engineering (MBSE), Iterative Development, Digital Thread
Introduction
This article is the fourth in a series examining how advances in digital tools can enable rapid improvements in warfighter capability development, targeting transformative gains in cost, schedule, and performance.
The first article covered the overall concept of test and evaluation (T&E) as a continuum (Collins and Senechal 2023). The second covered how the DoW can apply a decision support evaluation framework as a disciplined methodology to inform decisions across the spectrum of technology development (Beers 2024). The third covered how thinking of technology development as a campaign of learning can effectively and efficiently accelerate acquisition (Miller 2025). Building on these foundations, this paper focuses on leveraging modern tools to drive continuous testing, resulting in iterative development that helps mitigate the risk of undesired behaviors in complex systems and allows for agile responses to evolving operational needs.
Problem Statement
Acquisition and development teams struggle to adapt quickly enough to keep pace with changes in threats, system components, and resources, hindering their ability to meet operational needs. Meanwhile, DoW cyber-physical systems, engineered systems that are built from, and depend upon, the seamless integration of computational and physical components, grow more complex each year (NSF 2025). They incorporate more demanding technologies, like AI and machine learning, to meet expanding operational needs and interact with other complex systems in complicated mission architectures. Increasing system complexity tends to increase development time and risks to system functionality, while the pace of technological change and evolving operational needs introduces risk to mission capability. Taken together, these factors create a widening gap between the pace of capability development and the rapidly evolving demands of modern warfare.
Proposed Solution
The goal of CITE is to enable programs to deliver the desired capability on time while tackling the rising complexity of modern systems. As hardware, software, and mission‑level functions become increasingly interwoven, the risk of unpredictable behavior grows. By continuously reducing those risks through frequent, iterative integration and testing, CITE generates timely, relevant data from each test event. This data‑driven insight enables decision‑makers to understand and manage system complexity, providing actionable knowledge that fuels informed iteration and agile responses to evolving operational needs.
Complex systems, by definition, exhibit behaviors that are difficult to predict by understanding the behavior of individual components. Unpredictable system behaviors risk mission failure. To address this challenge, we advocate for early and continuous evaluation, emphasizing deliberate and purposeful testing at each stage of integration. Extended gaps between integration and test events allow risk to grow exponentially. By employing digital models and leveraging iterative development with smaller, higher-frequency test events, teams gain the ability to simulate, analyze, and visualize system interactions throughout the lifecycle. This approach embodies the tenet “integrate and test early, small, and often,” marking a departure from traditional DoW practices that emphasize large-scale test events tied to rigid schedules and milestones. Here, “small” not only refers to the scope of individual tests, but also captures the advantage of driving decision-making down to smaller test teams, empowering them to take ownership and collaborate earlier in the testing process. Integrating T&E seamlessly with system design and mission engineering throughout the lifecycle enables evaluation at the pace of software development, fostering rapid feedback and continual refinement of capabilities to ensure responsiveness to evolving operational needs.
Background
History of Engineering Methods
Charles Perrow’s Normal Accidents is a classic introduction to complexity and its consequences in engineered systems (Perrow 1999). Perrow uses the 1979 Three Mile Island accident and other examples to show how systems with high interactive complexity and tight coupling are vulnerable to undesired behaviors. Perrow’s work, which has become foundational in safety and risk management, argues that, for these systems, breakdowns and accidents are effectively inevitable given their nature. Defense systems are becoming more interconnected and software-driven, pushing them toward the “edge of chaos,” where system behavior can look stable under normal conditions but become prone to surprising and undesired outcomes (Adcock, Sillitto and Sheard 2025).
Iterative methods have been written about since at least the 1930s, with Shewhart's "plan-do-study-act" at Bell Labs, before the invention of software (Shewhart 1939). However, most of the development of iterative methods over the past 50+ years has been for software. Notable milestones include Boehm's "A Spiral Model of Software Development and Enhancement" (Boehm 1988) and, perhaps the true breakout moment, the "Manifesto for Agile Software Development" in 2001 (Beck, et al. 2001). As compute became ubiquitous, it blurred the line between software and hardware, resulting in what we now know as "cyber-physical systems." Industrial DevOps is a synthesis of iterative methods that accounts for the fundamental differences between the hardware and software components (Johnson and Yeman 2023). Johnson and Yeman adapt Lean, Agile, and DevOps principles to create an engineering method flexible enough to support the different development speeds of different components. Finally, Figure 1 shows that there has been significant research into continuous integration and test through the work of Shahin, Elbaum, Stolberg, and Campos, among many others, but again, this work is almost solely for software development (Shahin, Babar, and Zhu 2017; Elbaum, Rothermel, and Penix 2014; Stolberg 2009; Campos, et al. 2014).

Figure 1. Litmap for “Continuous IT&E”.
Complex systems science has become a major field of research in both the mathematical and social sciences, but it has not yet penetrated engineering methods to yield generally accepted best practices. The work of Sheard and Mostashari introduces concepts of complexity to systems engineers and identifies tools to explore complexity in a system but does not propose a holistic methodology to systematically address complexity (Sheard and Mostashari 2008). Pennock and Wade directly question this gap in "The Top 10 Illusions of Systems Engineering: A Research Agenda," pushing us to find methods that do not rely on these illusions (Pennock and Wade 2015).
CITE is an attempt to merge the best of these two branches of research — iterative development for hardware and methods to address complexity — into an engineering method that explicitly addresses complexity in cyber-physical systems.
Continuous Integration Test & Evaluation Definition
The Developmental Test, Evaluation, and Assessments (DTE&A) office within the Office of the Undersecretary of War for Research and Engineering (OUSW(R&E)) proposes an engineering method called CITE to address this problem.
Our approach is inspired by the influential work of Rick Dove, whose vision for Agile Systems Engineering (Agile SE) has significantly informed modern approaches to system development. Agile SE defines itself as “…addressing what needs to be accomplished and why, without constraints or directions on how. How those strategies manifest as operational methods depends on the engineering context” (Dove, Lunney, et al. 2023). CITE is an operational method of Agile SE tailored to the engineering context of DoW and its complex cyber-physical systems. CITE maintains the core principles of systems engineering while leveraging new tools to conduct test events continuously.

Figure 2. Eight core aspects of Agile SE; adapted from Dove, et al. 2023.

Figure 3. The CITE operational method elevates the Agile SE aspect Continual Integration and Test to a tenet.
Agile SE is composed of "…eight agility-supporting strategic aspects…" (Dove, Lunney, et al. 2023). It treats all aspects equally, as a set of tools from which engineers construct an operational method to meet the needs of their engineering context by emphasizing and/or tailoring the aspects, as shown in Figure 2. To implement the tenet "integrate and test early, small, and often," CITE elevates Continual Integration and Test and uses all other aspects to make continuous testing possible, as shown in Figure 3.
Each integration and test event, analogous to a T&E event, creates knowledge from which engineers make decisions about how to proceed in the next iteration. Iterative Incremental Development (hereinafter referred to as Iterative Development) and Being Agile: Operations Concept (hereinafter referred to as Agility) are a CITE Result of decision-making after each integration and test event. The remaining five core aspects of Agile SE are Enablers of Iterative Development and Agility as shown in Table 1.
Table 1. CITE Enablers of Iterative Development and Agility.

Note that CITE changes the prioritization of the Agile SE aspects compared to most iterative development methods. Typically, iterative development methods focus on shortening time between deliveries of value to the customer. Implicit in this choice is that evolving user needs are the largest source of risk. However, in the DoW engineering context, the complexity of systems is typically the largest source of risk. Complexity also stifles agility, meaning that even if user needs are perfectly understood, programs cannot pivot to meet those needs. Thus, CITE seeks to address the risk of system complexity to better adapt to changes in operational need. Rather than promoting iteration for its own sake, and demanding that testing keep up, CITE positions test as the generator of knowledge, after which informed iteration can occur.
Why Now?
Testing earlier and more often has been a stated goal in DoW for decades. Why can CITE succeed now when previous efforts have failed (OUSD(R&E) Defense Science Board 2024)? Continuous testing and iterative development are dominant in software development because software is highly composable and executable. Composable components can, at almost no cost in resources, be integrated into different configurations (Nierstrasz and Meijler 1995). Executable components can be run to reproduce the dynamics of the system being simulated. To achieve our tenet, we must represent the system under design in a software-like format. Recent digital technology advances in modeling tools and standards, big data, cloud computing, the internet of things, co-simulation, and other fields have made digital representations of systems more accurate and accessible and have improved the quality of their predictions. By using model-driven development, we take advantage of composability, executability, and testability to integrate and test continuously and bend the engineering integration risk curve downward.
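As a loose illustration of these two properties, consider a sketch in which component models are plain functions: composable because they snap into different configurations at negligible cost, and executable because any configured system can simply be run. The component names (`sensor`, `tracker`) and their behavior are invented for this example, not drawn from any DoW system.

```python
# Illustrative only: toy "component models" as functions over a system state.

def sensor(state):
    # Each component maps a system state to a new state.
    return {**state, "detected": state["range_km"] < 50}

def tracker(state):
    return {**state, "track": state.get("detected", False)}

def compose(*components):
    """Chain components into one executable configuration."""
    def system(state):
        for component in components:
            state = component(state)
        return state
    return system

# Two configurations built from the same parts, re-composed at near-zero cost:
config_a = compose(sensor, tracker)
config_b = compose(sensor)  # tracker omitted

print(config_a({"range_km": 30.0}))  # → {'range_km': 30.0, 'detected': True, 'track': True}
print(config_b({"range_km": 80.0}))  # → {'range_km': 80.0, 'detected': False}
```

Because each configuration is itself executable, "integrating" a new arrangement of components and "testing" it are the same cheap operation; this is the property CITE seeks to give hardware-bearing systems via executable models.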
CONOPS Scope
This document is a CONOPS, i.e., a verbal or graphic statement that clearly and concisely expresses what the commander intends to accomplish and how it will be done using available resources (DAU 2025). This document includes more detail than an OV-1, but is concise and not a detailed implementation plan, nor does it contain explicit instructions for execution. The goal of this document is to convey the intent, the tenet, and foundational principles and enable leadership to implement the concept using local resources.
CITE CONOPS Foundations to Support Execution
Agile Systems Engineering Lifecycle Model (ASELCM) Pattern
To promote clarity in discussions of systems engineering, digital engineering, and model-based systems engineering, we will leverage the Agile Systems Engineering Lifecycle Model (ASELCM) Pattern as shown in Figure 4 (Dove and Schindel, Agile Systems Engineering Life Cycle Model for Mixed Discipline Engineering 2019).

Figure 4. Adaptation of the ASELCM Pattern based on Dove and Schindel, 2019.
System 1: Engineered System
In the engineering context of DoW, engineers develop a DoW engineered system (or systems of systems), referred to as System 1.
System 2: Engineering and Lifecycle Management Processes
The total environment in which System 1 will experience its lifecycle is System 2. System 2 includes the DoW engineering system that System 1 interacts with during development, the lifecycle management system that manages System 1 instances, and the environment into which System 1 is deployed.
System 3: Process Lifecycle Management Processes
The total environment in which System 2 will experience its lifecycle is System 3. In the engineering context of DoW, System 3 is the larger DoW, which sets the policy, guidance, processes, inspiration, and resources that enable System 2. This article is a System 3 product, based on observations of many System 2s, that seeks to guide deployment into System 2s.
CITE is executed by the people, processes, and tools of System 2. Thus, CITE is a System 2 process whose goal is to improve delivery of System 1s into Environment 1s. System 3, including the authors of this article, will continue to observe the effects of CITE on System 2s and System 1s and iterate the CONOPS as results dictate.
Executable, Composable Models Enable Continual Integration and Test with Hardware
To develop a CONOPS implementing the tenet "integrate and test early, small, and often," we examine what makes it easier for System 2 to integrate and test the components of System 1. Before System 1 is fielded, System 2 must describe System 1 to ensure a common understanding of System 1. Traditionally, systems engineers develop a series of documents that describe System 1. These documents are static descriptions of the final System 1. Documents are not highly composable; creation, maintenance, and use require humans in the loop and significant resources. As we mentioned earlier, hardware is significantly less composable than software. In hardware dominant systems, such as those common to the era in which systems engineering was created, the lack of composability of documents was a good "impedance match" between waterfall development and non-composable hardware, as shown in Figure 5.

Figure 5. Impedance match between development approaches (waterfall vs. iterative) and System 1 representations (documents vs. executable models).
Describing System 1 with non-composable documents provides an “impedance match” with System 2 waterfall development; this can work for hardware dominant systems. Describing System 1 with non-composable documents is an “impedance mismatch” with System 2 Iterative Development. The time and resources required to deploy, integrate, observe, and decide are so great that System 2 cannot establish a short iteration cadence, minimizing the benefits of Iterative Development. Describing System 1 with composable, executable models provides an “impedance match” between System 2 Iterative Development and the hardware portions of a complex cyber-physical system. Executable models provide a “software-like buffer” between Iterative Development processes and the hardware components they describe.
Despite the best efforts of interface standards, Modular Open Systems Approach (MOSA), and advanced manufacturing, hardware remains less composable than software. Simply “doing Agile” does not provide significant advantage in complex cyber-physical systems. When System 1 is composable, it is easy to integrate many components. When System 1 is executable, it is easy to test the integrated components. When System 1 is both composable and executable, like a pure software system, Iterative Development can provide significant advantages in speed of delivery and quality (GAO 2023). System 2 needs System 1 to be composable and executable throughout its lifecycle: we need executable models.
While modeling and simulation have existed for decades, recent changes in big data, cloud computing, and model-based engineering made it possible to create not just models of System 1 but composable and executable models that are therefore integrable and testable. An executable model is more than a drawing; it is a “computer implementation of discrete and/or continuous time models,” enabling the exploration of system behavior (Amissah 2018). With modern tools, the description of a system is becoming more like that of software. To capitalize on this advance, CITE tailors the tools and processes of System 2 to enable integration and testing of System 1 components “early, small, and often”.
System 2 learns from each test, makes decisions about how to proceed, and iteratively develops System 1. In this Iterative Development process, we measure the effectiveness of System 2 and again take advantage of modern tools to scale System 2. These three elements, (1) model-driven development to create integrable and testable descriptions, (2) iterative integration and testing, and (3) scalable engineering tools, are the foundations of CITE.
Integration and Test Events are Risk Reduction Activities
In complex systems, integration risk can dominate due to unexpected behaviors caused by the interaction of the many components. Each integration and test event decreases the engineering integration risk, "bending the risk curve down," as shown in Figure 6. Long durations between integration and test events allow risk to build as engineers design new components. Frequent integration and test events curb growth in the risk curve, provided the amount the curve is bent down is meaningful.

Figure 6. Comparison of engineering integration risk burndown curves for current state development vs. a future state development using CITE.
The amount the engineering integration risk curve is bent down with each integration and test event is proportional to the number of components integrated and the representativeness of the representations of each component involved. The more information gained in a test event, the more the risk curve is bent down. Table 2 shows the Hierarchy of Representativeness of System 1 descriptions.
Table 2. Hierarchy of Representativeness of System 1 representations.

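The risk dynamic described above can be sketched as a toy simulation. This is purely illustrative: the growth rate, event cadence, and per-event risk reduction below are invented parameters chosen to show the qualitative shape of the curves in Figure 6, not values from any program.

```python
# Toy model of the engineering integration risk curve: risk accrues as new
# components are designed; each integration and test event "bends the curve
# down" by a fraction standing in for the information gained at that event.

def risk_curve(weeks, event_interval, reduction_per_event, growth_per_week=1.0):
    """Simulate accumulated integration risk, week by week."""
    risk = 0.0
    history = []
    for week in range(1, weeks + 1):
        risk += growth_per_week                # new design work adds risk
        if week % event_interval == 0:         # integration and test event
            risk *= (1.0 - reduction_per_event)
        history.append(risk)
    return history

# Current state: one large annual event with a big per-event reduction.
current = risk_curve(weeks=52, event_interval=52, reduction_per_event=0.60)
# CITE-style: small events every two weeks with a modest per-event reduction.
cite = risk_curve(weeks=52, event_interval=2, reduction_per_event=0.20)

print(f"peak risk, annual event:    {max(current):.1f}")
print(f"peak risk, biweekly events: {max(cite):.1f}")
```

Even with a much smaller reduction per event, the frequent cadence keeps peak risk far lower, which is the qualitative point of the burndown comparison in Figure 6.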
Recall that System 2 encompasses the engineering and lifecycle management processes that guide the development, integration, and testing of System 1. The CITE CONOPS describes the System 2 process, while referencing the CITE Enablers, to:
1. Integrate and test components early, small, and often.
2. Perform Iterative Development with a disciplined cadence to synchronize efforts across teams.
3. Rapidly advance up the hierarchy of representativeness to improve system fidelity and relevance.
Embodying the Software Acquisition Pathway
In early 2025, Secretary of War Hegseth directed the Department of Defense (DoD) to “adopt the Software Acquisition Pathway (SWP) as the preferred pathway for all software development components of business and weapon system programs in the Department” (OSD 2025). The adoption of the SWP marks a significant shift in how DoW approaches capability development, emphasizing iterative delivery, composable architectures, and rapid responsiveness to operational needs. These priorities directly align with the foundational principles of CITE discussed throughout this paper.
DoW complex cyber-physical systems are engineered systems that are built from, and depend on, the seamless integration of computational and physical components. System 1 includes a significant software component upon delivery to the warfighter. Moreover, as discussed above, using executable models as the descriptions of System 1 during development gives System 1 a software-like description from the beginning. By leveraging executable models and model-driven development, CITE operationalizes the SWP philosophy for both pure software systems and complex cyber-physical systems that integrate both computational and physical components.
CITE embodies the SWP by creating System 1 as software models and then transforming some software models into hardware as the mission requires, as shown in Figure 7. This approach bridges the gap between traditional hardware-centric development and modern software-driven practices, ensuring that risk is reduced and capability delivery is accelerated. CITE allows almost all acquisitions to adopt the SWP for large portions of their scope, extending its advantages across a broad range of DoW programs.

Figure 7. CITE creates a software-like description of System 1 using models, and transforms some models into final software and some models into hardware to create physical effects as needed.
Test and Model-driven Framework Industry Example: Anduril’s Crucible
In 2024, Anduril Industries, Inc., published an article on its website describing its "Secret to Rapid Development at Scale," Project Crucible (Enayet and Subramanian 2024). Crucible consists of three key dimensions that express ideas similar to CITE and other best practices promoted by OUSD(R&E) for DoW capability development, as shown in Table 3.
Table 3. Anduril’s Crucible Framework.

Anduril's Crucible Framework provides an industry example that reinforces the core principles of CITE described earlier in this paper. The dimensions outlined in Table 3 reflect the CITE tenet of "integrate and test early, small, and often." This alignment demonstrates that leading commercial practices are consistent with CITE and validates its operational effectiveness for rapid capability delivery.
By emphasizing mission-driven validation and iterative development, Anduril’s approach exemplifies the same model-driven, agile philosophy that underpins the Software Acquisition Pathway (SWP) discussed in the previous section. This connection illustrates how CITE principles are being successfully applied in industry to manage complexity and accelerate delivery, providing tangible evidence for their adoption in DoW programs.
This example also sets up our next discussion, which explores the application of CITE throughout the DoW capability lifecycle. The practices highlighted in Table 3 show how frequent integration, composable models, and mission context can be leveraged at every phase to reduce risk and enable agile responses to evolving operational needs.
CITE Throughout the Capability Lifecycle
As a System 2 process, CITE interacts with System 1 throughout its lifecycle. The tenet “integrate and test early, small, and often,” iterative development, and model-based engineering change how engineers interact with representations of System 1 and change the products delivered in different phases of the lifecycle. This section provides a series of vignettes of the CITE CONOPS at different phases in the lifecycle. The vignettes are designed to highlight tool and process choices that implement the tenet “integrate and test early, small, and often,” how those choices enable Agile SE aspects, and how those choices differ from standard DoW practice. Figure 8 shows the five phases used to describe the lifecycle.

Figure 8. A generalized view of DoW lifecycle phases across multiple acquisition pathways.
Concept Definition: Mission Engineering with Executable Models Provides Attentive Situational Awareness
The goal of Concept Definition is to define the operational need and provide a common understanding to all teams, including measurements of System 1 fit to operational need, creating Attentive Situational Awareness. DoW laid a cornerstone of CITE with the publication of the Mission Engineering Guide (OUSD(R&E) Mission Capabilities 2023). Mission engineering, and especially digital mission architectures, provide composable, executable descriptions of mission context in the form of behavioral and structural descriptive models and operational mission simulations. To efficiently design System 1, the engineers of System 2 need a common understanding of the current mission context and measurements of System 1 fit to operational need.

Figure 9. Mission engineering and digital mission architectures provide a composable, executable model of mission context, represented by the image of the airplane.
Figure 9 uses the image of an airplane to represent the common mission context for System 1 (the mission engineering process graphic is from the Mission Engineering Guide (OUSD(R&E) Mission Capabilities 2023)). This image provides a framework to which all descriptions of System 1 refer, and the mission model provides an executable reference against which all descriptions of System 1 are tested. Changes to the mission model instantly propagate to all engineers, enabling Attentive Situational Awareness, so System 2 can react to timely knowledge of changes in operational need. Later vignettes in the lifecycle will construct an approximation of the airplane, using the image as a template, out of models of increasing fidelity. The system models trace to the composable, executable mission model, allowing engineers to test at the speed of execution and immediately understand the mission impact of changes to System 1. This contrasts with traditional developmental testing to specifications, which can lead to a myopia in which System 1 is built to meet spec, not to meet a mission.
Early, lightweight models can provide extensive insight into future performance including acting as a live, virtual, constructive (LVC) interface for end users during demonstrations at the end of each Iterative Development cadence. Engineers treat descriptive behavioral model execution and operational mission simulations as tests and store the data in a test format to allow longitudinal studies of changes to System 1. Under an appropriate configuration management approach, engineers receive continuous feedback from end users and mission owners and adapt to changes in operational need without the delay of propagation through a series of documents. Composable, executable mission models provide timely knowledge of emergent risks and opportunities enabling Attentive Situational Awareness.
Technology Maturation: Composable Models Accelerate Innovation through Attentive Decision Making
The goal of Technology Maturation is to reduce the technical risk of new technologies, provide decision-makers with evidence to determine whether technology is ready to move forward in the lifecycle, and facilitate transition into programs of record. CITE improves Technology Maturation by enabling Attentive Decision Making, sharing knowledge via models, including the mission model created in the Concept Development vignette, and empowering decision-making at the point of most knowledge. Composability of models enables reuse of models. Engineers move models, including traceability to mission and test models, from a System 1 within a System 2 into another System 1 in another System 2, as shown in Figure 10.

Figure 10. Composable models transfer from a System 1 within a System 2 into another System 1 in another System 2. Engineers construct their System 1 model from reused models containing information about the mission context in which the model was tested, enabling the greatest acceleration of testing, not retesting.
Engineers move models between Technology Maturation programs, enabling sharing of knowledge about technologies, between Technology Maturation programs and programs of record, facilitating the transition into operations, or combining models from both types of programs to perform mission engineering analysis to decide where to focus resources. This sharing reduces retesting because models come with test data and the mission context in which tests were performed, avoids starting from a blank slate, and makes deliverables composable, executable models so it is easier for follow-on programs to integrate the developed technology.
To ease composability, CITE takes advantage of automation in the model-driven environment. Engineers construct analytics that use quantitative data and metadata to estimate how much to trust a model to represent the final System 1. One example is Model Validation Levels (MVLs), an “objective, automatable framework for quantifying how much trust can be placed in the results of a model to represent the real world” (Provost, Stafford and Jones 2024). The MVLs shown in Table 4 indicate the ability of a model to produce data as trustworthy as a referent in the right column. For example, a model that achieves an MVL of 3 can produce data as trustworthy as a single system component being tested within a controlled environment.
Table 4. Interpretation of MVL in terms of trust placed in different data sources.

During each cadence, engineers determine which MVL the integrated System 1 must achieve in the integration and test event and calculate which MVL the component model must achieve to reach that goal. In this way, MVL is a tool for determining how fast an engineer can push up the Hierarchy of Representativeness. This constrains testing to what is needed, eases integration, and helps establish a thorough model validation plan.
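The per-cadence check described above could be sketched as follows. The MVL framework itself is from Provost, Stafford, and Jones (2024); the data structure and helper function below are illustrative assumptions of ours, not part of that framework.

```python
# Hypothetical sketch: flag component models whose current MVL falls short of
# the level the integrated System 1 must achieve at this cadence's event.

def components_below_target(component_mvls, target_mvl):
    """Given a mapping of component name -> current MVL, return the components
    that must be matured before the integrated test event can hit target_mvl."""
    return sorted(name for name, mvl in component_mvls.items() if mvl < target_mvl)

# Example: the integrated event requires MVL 3 (per Table 4, trust equivalent
# to a single component tested in a controlled environment). Component names
# and levels are invented for illustration.
models = {"radar": 4, "datalink": 2, "autopilot": 3, "seeker": 1}
print(components_below_target(models, target_mvl=3))  # → ['datalink', 'seeker']
```

Because the inputs are quantitative, a check like this can run automatically each cadence, which is what makes MVL usable as a throttle on how fast a team pushes up the Hierarchy of Representativeness.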
Computation time limits the effective composability, or usability, of a model. Computation times that approach or exceed the cadence time make Iterative Development difficult: engineers make little progress with each iteration, and scalability of model-driven development becomes a challenge. As the fidelity of executable models increases, engineers create reduced order models (ROMs) that act as substitute approximations for full order models (FOMs) in certain workflows. Engineers configure the toolchain to allow users to select ROMs vs. FOMs. For example, engineers use ROMs to quickly determine whether changes to models have driven System 1 outside accepted mission performance but execute FOMs in overnight regression tests when time is less of a constraint.
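One way such a ROM/FOM switch might look in a toolchain is sketched below. This is a hedged illustration: the workflow names, the `ModelPair` structure, and the toy model forms are all hypothetical, standing in for whatever the real simulation environment provides.

```python
# Illustrative ROM/FOM selection hook: interactive workflows get the fast
# reduced order model; overnight regression runs get the full order model.

from dataclasses import dataclass
from typing import Callable

@dataclass
class ModelPair:
    rom: Callable[[dict], float]   # fast, reduced order approximation
    fom: Callable[[dict], float]   # high-fidelity, expensive to run

def evaluate(pair: ModelPair, scenario: dict, workflow: str) -> float:
    """Pick the ROM for quick in-iteration checks, the FOM for nightly runs."""
    model = pair.fom if workflow == "nightly_regression" else pair.rom
    return model(scenario)

# Toy stand-ins: a linear ROM approximating a (pretend) expensive FOM.
pair = ModelPair(
    rom=lambda s: 2.0 * s["thrust"],                          # linear approximation
    fom=lambda s: 2.0 * s["thrust"] - s["thrust"] ** 2 / 400.0,  # adds a small nonlinear term
)
scenario = {"thrust": 10.0}
print(evaluate(pair, scenario, workflow="interactive"))         # → 20.0 (ROM)
print(evaluate(pair, scenario, workflow="nightly_regression"))  # → 19.75 (FOM)
```

The point of the pattern is that the same test harness runs both: quick ROM checks keep the iteration cadence short, while the nightly FOM runs catch the fidelity gap the ROM inevitably leaves.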
Engineering & Manufacturing: Collaborative Modeling Builds Common-Mission Teaming for Seamless Integration
The goal of Engineering and Manufacturing is to develop a system that meets the operational need and is ready for full-rate production and deployment. CITE improves Engineering and Manufacturing by increasing parallelization of work while maintaining quality through Common-Mission Teaming via the shared models from the previous two vignettes. Using Figure 11 as a template, engineers construct an approximation of the airplane out of models of increasing fidelity. A cadence of iterations synchronizes short and long lead-time deliveries. Iteration 1 is effectively Concept Definition. Mission Engineers create executable descriptive models of the mission context to provide measurements of System 1 fit for the remainder of the lifecycle. At the end of the first iteration, mission engineers validate the mission model with end users and periodically repeat this validation to incorporate changes in operational need. These changes propagate automatically to all integrated models. Early integration and test events confirm descriptive models of System 1 behave as desired in the mission context.

Figure 11. Model-driven Iterative Development increases the coverage and fidelity of models, verified and validated at the end of each iteration.
Iteration X shows increasing system model coverage and increasing fidelity with the appearance of computational models. At the end of iteration X, engineers integrate and test descriptive and computational system models in the context of a computational mission model. Quantitative measurements of increasing fidelity reduce the risk System 1 will fail to meet operational needs.
Around Iteration Y, engineers develop physical models, engineering samples, and reduced order models. Descriptive and computational models fill gaps in the integrated system by representing hardware components. This is especially valuable for hardware components with long lead-times or interfaces with significant user interaction. Model representations of user interfaces allow users to validate system capabilities very early in the lifecycle. The user interface, through integration with system models for behavior and mission models for context, becomes a key test asset supporting both developmental and operational testing of System 1 during each iteration. When engineering samples become available, models connect to the physical hardware for hardware-in-the-loop testing. When computational models become burdensome, engineers create reduced order models that can verify and validate changes and provide real-time capability for the user interface.
Iteration Z represents operational testing prior to deployment. System 1 is fully physically instantiated. At the end of the iteration, System 1 flies in an operationally representative scenario on a test range. Almost every aspect of the operational test has been executed previously via digital and physical models allowing the operational test to focus on measures of effectiveness and measures of suitability in a system of systems context rather than measures of performance.
CITE’s model-based engineering core can assist in important, and often overlooked, analyses during the Engineering and Manufacturing Development (EMD) phase. Comprehensive system modeling provides a foundation for conducting early studies such as Diminishing Manufacturing Sources and Material Shortages (DMSMS), Human Systems Integration, Environment Safety and Occupational Health (ESOH), and other assessments. Because CITE enforces collaboration between mission engineering, systems engineering, and T&E, it also helps maintain traceability between mission-, system-, and program-oriented objectives. By this phase, numerous activities are being performed by a multitude of organizations, necessitating collaboration to ensure effective information exchanges between them. CITE emphasizes cross-functional teams not only between technical disciplines, but also across these organizations.
Production & Deployment: Product Line Architectures Ensure Reliable Delivery with Shared-Knowledge Management
The goal of Production and Deployment is to produce System 1 en masse and effectively integrate it into the operational environment. CITE improves Production and Deployment by delivering the composable, executable models from the previous vignettes along with the cyber-physical System 1, enabling Shared-Knowledge Management. Introducing a production-representative complex integrated system is the first opportunity to validate the benefits of upstream risk mitigation efforts. Historically, the primary benefits from using modeling and simulation during the Production & Deployment (P&D) phase were advertised as reduced manufacturing costs and schedule improvements (DAU 2023). Yet schedule delays and cost overruns are common, as highlighted in the U.S. Government Accountability Office (GAO) Weapon Systems Annual Assessment (GAO 2025). As of 2024, the estimated average cycle time for the DoW’s 79 Major Defense Acquisition Programs (MDAPs) was 142 months, an increase of about 15% from 2023, with a total cost of about $2.3 trillion. Adoption of leading digital practices across DoW programs remains inconsistent according to GAO analysis of program questionnaire responses, as the excerpt in Table 5 shows. These leading practices are consistent with establishing CITE. While a causal linkage between leading digital practices and MDAP outcomes cannot be established, GAO has identified 14 leading product development companies and, through thorough investigation and synthesis, determined key product development structures and activities contributing to their success (GAO 2023).
Table 5. GAO’s 2025 Weapon Systems Annual Assessment; Adapted from “Most Programs GAO Reviewed Do Not Fully Implement Leading Practices, Including Future Efforts That Are Newer and Have Opportunities to Do So”.

CITE’s use of models and Iterative Development can help circumvent common failure modes identified in P&D. According to the 2025 GAO report, few programs either document or initiate all six leading product development practices. Throughout the report, program offices cite system integration as high risk. CITE promotes a design philosophy that reduces the risk posed by large, late, milestone-driven system integrations, instead favoring small, early, and frequent system integrations. In some cases, GAO reports that the Services are directing program offices to use leading product development practices to help control timelines and manage cost growth.
Several other P&D manufacturing benefits can be achieved using models. Models are a more precise way to describe System 1 than text requirements. Executable models also provide testability to ensure that System 1 meets operational needs. These advantages are not limited to system development. Models and executable models are a more precise and testable way to monitor quality assurance and compliance with contract terms. Engineers tag models with metadata indicating key metrics, test conditions, and other parameters for Defense Contract Management Agency (DCMA) to clearly understand the desired output from manufacturing.
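Model metadata of the kind described might look like the following sketch. Every field name, value, and the tolerance-check rule are hypothetical assumptions made for illustration; no DoW or DCMA schema is implied.

```python
# Hypothetical QA metadata tagged onto an executable model so a quality
# reviewer can see key metrics and test conditions. All fields are
# illustrative assumptions, not an actual DoW or DCMA schema.
import json

model_metadata = {
    "model_id": "system1.propulsion.v12",          # assumed naming scheme
    "key_metrics": {
        "thrust_kN": {"target": 120.0, "tolerance": 2.5},
    },
    "test_conditions": {"altitude_m": 10000, "mach": 0.85},
    "mvl_achieved": 3,
}

def meets_metric(measured: float, spec: dict) -> bool:
    """Check a manufacturing measurement against the tagged metric."""
    return abs(measured - spec["target"]) <= spec["tolerance"]

print(json.dumps(model_metadata, indent=2))
print(meets_metric(121.0, model_metadata["key_metrics"]["thrust_kN"]))  # True
```

Because the expected output is machine-readable and attached to the model itself, a delivered unit can be checked against the same metric the engineers designed to, rather than against a separately maintained text requirement.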
Operations & Support: Digital Twins Empower Continuous Improvement through Shared-Knowledge Management
The goal of Operations and Support is to ensure necessary logistics, maintenance, and support to maintain and improve operational effectiveness. CITE improves Operations and Support by providing composable, executable models of the system enabling continued iterative improvement. The models developed in the previous vignettes, combined with data collected from deployed instances of System 1, form the basis of digital twins. Digital twins are “a digital representation of a specific real-world system of interest that bi-directionally sends and receives updates between itself and its real-world counterpart at a frequency and fidelity befitting the use case” (Banerjee, et al. 2024). In short, digital twins are executable models that receive data from a physical counterpart to explicitly mirror that instance, as shown in Figure 12. Usability of a digital twin requires the System 1 physical asset to include instrumentation, self-monitoring, or other capabilities to maintain the digital asset.

Figure 12. Digital Twins are a bi-directional data exchange between executable models developed via CITE and an instance of delivered System 1.
Digital twins used in Iterative Development during operations and support (O&S) ease upgrades, software updates, and maintenance fixes to System 1 following Production & Deployment. Engineers can take feedback from operators and maintainers to continue Iterative Development, embodying the philosophy of “every jet a test jet”, an expression coined during the 40th Annual International T&E Symposium by Maj. Gen. Michael T. Rawls, Commander, Air Force Operational Test and Evaluation Center. “Every jet a test jet” means continuously validating System 1 models and digital twins. System 1 models and digital twins enable system of systems integration and testing across multiple System 1s in operation or development. This could include weapons, sensors, or other equipment as part of a complex system. Collecting and sharing vehicle data back to engineers enables predictive maintenance, mission planning, wargaming and tactics development, identification of future upgrades, and provides data for cost and schedule estimation of future developments.
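The bi-directional exchange in Figure 12 can be reduced to a minimal loop: telemetry flows from the physical asset into the twin, and an advisory (here, a predictive-maintenance flag) flows back. The class, field names, and wear threshold below are all illustrative assumptions.

```python
# Minimal sketch of the bi-directional exchange of a digital twin.
# Telemetry fields, the wear threshold, and tail numbers are hypothetical.

class DigitalTwin:
    def __init__(self, tail_number: str):
        self.tail_number = tail_number
        self.engine_wear = 0.0  # twin's mirrored state

    def ingest(self, telemetry: dict) -> None:
        """Receive data from the physical counterpart (physical -> digital)."""
        self.engine_wear = telemetry["engine_wear"]

    def advise(self) -> dict:
        """Push an update back toward the asset (digital -> physical),
        e.g. a predictive-maintenance flag from the mirrored state."""
        return {"tail_number": self.tail_number,
                "maintenance_due": self.engine_wear > 0.8}

twin = DigitalTwin("AF-1701")
twin.ingest({"engine_wear": 0.85})
print(twin.advise())  # {'tail_number': 'AF-1701', 'maintenance_due': True}
```

In practice the frequency and fidelity of both directions would be set per use case, as the Banerjee et al. (2024) definition quoted above requires; this loop only shows the structural idea.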
Summary of Vignettes
CITE reduces the risk of failing to meet operational needs in all phases of the lifecycle by focusing on integrating and testing the components of System 1. Beginning with Concept Definition, CITE leverages executable models to provide all teams with a shared, dynamic understanding of operational needs, enabling early and continuous situational awareness. As the process moves into Technology Maturation, composable models and automated validation frameworks accelerate innovation and decision-making, allowing knowledge and test data to be reused and adapted across programs, thus reducing redundant testing and facilitating transitions. In Engineering & Manufacturing, collaborative modeling fosters cross-functional teaming and parallelized development, with iterative integration and testing of increasingly faithful system representations. During Production & Deployment, CITE’s model-based approach ensures reliable delivery and deployment by providing executable representations of System 1 that can be used as standards to which to compare deliveries. Finally, in Operations & Support, digital twins, built from the models and data accumulated throughout earlier phases, enable ongoing iterative improvement, predictive maintenance, and continuous feedback, embodying the philosophy of “every jet a test jet.” The CITE engineering method transforms the lifecycle into a connected, adaptive process, where models and data flow to empower iterative development, integration, and sustainment.
Roll-out Discussion
Achieving widespread adoption of CITE across the DoW will require a deliberate change management effort. A wide variety of programs, several large and unique Service T&E and engineering organizations, a multitude of acquisition pathways, different stages of the capability development lifecycle, and other factors create challenges for any Department-wide change. A roll-out of CITE must therefore take advantage of the lessons learned and infrastructure created by the deployment and scaling of previous acquisition transformations. In this section, we use the DoW’s formal adoption of MOSA for architectures as an exemplar, showing which steps led to its successful Department-wide implementation. A successful CITE roll-out would likely be inspired by, and would benefit from, lessons learned during MOSA’s adoption.
Four major phases sufficiently capture MOSA’s progress toward DoW-wide implementation: 1. Define, 2. Prototype, 3. Codify, and 4. Normalize.
Define (Early 1990s – Mid-2000s):
OUSD(A&T) issued a 1994 charter establishing the Open Systems Joint Task Force (OS-JTF) to accelerate adoption of an open-systems approach, directing it to promote and oversee the policy, identify implementation opportunities, develop training, and coordinate selection of open-systems specifications and standards (OUSD(A&T) 1994). In 1998, the Defense Science Board Task Force on Open Systems concluded that an open-systems process is fundamental to DoD priorities such as force modernization, reduced cycle time, and lower ownership costs (OUSD(A&T) 1998). In May 2003, OSD published DoDD 5000.1, which required that “a modular, open-systems approach shall be employed, where feasible” (OUSD(AT&L) DoDD 5000.01 2003). The five MOSA principles were then formalized in the OS-JTF Program Manager’s Guide: A Modular Open Systems Approach (MOSA) to Acquisition, and they remain largely unchanged today (OUSD(A&T) 2004).
Taken together, CITE’s first step should focus on defining the concept by standing up a dedicated expert group, securing a strong endorsement from an independent body, inserting the concept into a DoW instruction, and then having the expert group turn it into practical guidance for program managers.
Prototype (Mid-2000s – Mid-2010s):
Pilot efforts and standards emerged across the Services in the 2000s. In 2003, the Navy’s Open Architecture Computing Environment (OACE) demonstrated an initial spiral of an open-systems architecture for Aegis and DD(X) applications (Strei 2003). NAVAIR and Army PEO Aviation backed the Future Airborne Capability Environment (FACE) Standard, an open architecture for reusable, interoperable software on DoD airborne platforms (Vanderbilt ISIS 2025). Established in 2009, the FACE Consortium now has over 230 members, publishes the FACE Technical Standard, and operates a formal certification program (FACE 2026).
CITE’s second step should focus on prototyping by running several high-visibility pilot implementations, adopting a shared technical framework across key communities, and standing up a consortium or working group that maintains the framework and certifies solutions.

Figure 13. MOSA Statutory and Regulatory Timeline (OUSD(R&E) SE&A 2025)
Codify (Mid-2010s – Late-2010s)
As summarized in Figure 13, MOSA moved from concept to formal requirement through a series of NDAA provisions and policy updates. Early statutory language in the FY2015 NDAA Sec. 801 (U.S. Congress) was strengthened by FY2017 NDAA Sec. 805 (U.S. Congress), which formally defined a “modular open system approach” and required its use, where practicable, in major weapon systems, with subsequent adjustments in FY2020 NDAA Sec. 840 (U.S. Congress) and FY2021 NDAA Sec. 804 (U.S. Congress). The FY2022 NDAA (U.S. Congress) then recodified these provisions into 10 U.S.C. 4401-4403, embedding MOSA in the reorganized acquisition chapters of Title 10 (10 U.S.C. § 4401 2024). In parallel, DoD’s shift to the Adaptive Acquisition Framework carried MOSA expectations into core acquisition and engineering policy frameworks, building on earlier 5000.02 direction (OUSD(A&S) 2022).
CITE’s third step should focus on codifying the concept by securing statutory language in successive NDAA provisions, aligning DoW acquisition instructions and frameworks with that direction, and using those to make the concept a formal requirement for major programs.
Normalize (Late 2010s – Present)
Figure 13 also demonstrates how normalization is reflected in the growing body of implementation guidance and Service-level doctrine. “MOSA Reference Frameworks in Defense Acquisition Programs” (OUSD(R&E) 2020) gave program managers concrete patterns for applying modular designs and open business models, and 2022 systems engineering guidance (e.g., the Engineering of Defense Systems Guidebook (OUSD(R&E) 2022), Systems Engineering Guidebook (OUSD(R&E) 2021), and Systems Engineering Plan guidance (OUSD(R&E) 2023)) further integrated MOSA into standard systems engineering practice. Service-specific documents such as the “Air Force Materiel Command Guidebook for implementing MOSA in Weapon Systems v2.0” (AFMC 2023) and the “Naval Modular Open System Approach Guidebook v1.0” (ASN(RD&A) 2025), together with “Implementing a Modular Open Systems Approach in DoD Programs” (OUSD(R&E) SE&A 2025), operationalize these statutory and policy expectations into day-to-day implementation. Finally, Defense Acquisition University (DAU) training, such as the CLE 019 MOSA course, helps normalize these concepts and scales them across the acquisition workforce (DAU 2019).
CITE’s fourth and final step should focus on normalization by publishing implementation frameworks and systems engineering guidance, issuing Service-level guidebooks, and reinforcing the approach through training so that it becomes part of day-to-day implementation.
Challenges and Risks
The authors acknowledge that while advances in big data, cloud computing, internet of things, co-simulation, modeling standards, and other digital technologies make CITE viable as an engineering method, barriers to implementation remain. These barriers include:
Multi-level Security: System 1 of a DoW complex cyber-physical system typically spans multiple security classifications. To operate efficiently, model-driven engineering creates a “digital thread,” i.e., “an extensible and configurable analytical framework that seamlessly expedites the controlled interplay of technical data, software, information, and knowledge in the digital engineering ecosystem, based on the established requirements, architectures, formats, and rules for building digital models. It is used to inform decision makers throughout a system’s life cycle by providing the capability to access, integrate, and transform data into actionable information” (OUSD(R&E) 2023). Current security policy, procedures, and cross-domain solutions make working across multiple security classifications difficult. Digital threads and visibility into an authoritative source of truth lead to challenges with aggregation classification, in which it is unclear how to display multiple pieces of data or information that, when combined, elevate the classification beyond the user’s clearance.
Intellectual Property: DoW often contracts work to private companies that desire to keep their intellectual property secure. The digital thread and authoritative source of truth in model-driven engineering require sharing information among the many domains and performers on a lifecycle effort. DoW must invest in architectures, tools, and contractual structures that allow sharing of information between the government and many contractors while preserving intellectual property rights.
Infrastructure: DoW uses multiple, disconnected model-based engineering infrastructures. Network domains, firewalls, tool configuration, licensing, and other features of this distributed architecture prevent digital threads from forming. DoDI 5000.97 (OUSD(R&E) 2023) attempts to resolve this issue by directing the Test Resource Management Center (TRMC) to “develop and maintain a core DoW-wide digital engineering infrastructure capability, methodologies, and practices to support acquisition programs.”
Culture: Engineering domains evolved in response to specific technological challenges and solutions. Each has specialized tools, methodologies, lexicons, prioritizations, and even cultures resulting in silos of execution. Complex systems require tight integration between domains. Engineering tools now integrate at the technical level, but cultural differences remain barriers to process integration.
Incentives: Much of the benefit of CITE and model-based engineering comes from reuse across multiple System 1s. CITE improves speed of delivery and quality of outcomes for the “first” System 1 developed, but the “second” System 1 developed takes advantage of the models to get a significant head start. DoW funding flows at the program level where program managers are incentivized to meet cost, schedule, and performance of their own System 1. Until there are explicit incentives to create models that benefit other System 1s, we cannot expect as much improvement due to CITE and model-based engineering alone.
Conclusion
CITE offers a transformative approach to addressing the growing complexity and risk in DoW capability development. Several digital native companies have already begun using these methodologies and tools. By leveraging advances in technology, CITE enables rapid, informed decision-making and continuous risk reduction throughout the system lifecycle. CITE elevates Continual Integration and Test as a guiding tenet, harnesses composable and executable models for accelerated learning, and supports cross-domain collaboration to adapt to evolving operational needs. While challenges persist in implementation, the foundational principles and operational methods outlined in this CONOPS offer guidance and support for leadership as they consider adopting CITE, ultimately delivering more agile, effective, and resilient warfighter capabilities.
Acknowledgements
This work was supported by OUSD(R&E) DTE&A. The authors would like to thank reviewers from the following partner organizations: Naval Air Systems Command, Johns Hopkins University Applied Physics Laboratory, Institute for Defense Analyses, and the Air Force Institute of Technology Scientific Test and Analysis Techniques Center of Excellence.
The authors also wish to thank the MITRE staff for their technical leadership and support, including Dr. Natalie Kautz, Dr. Suzanne Beers, Dr. Jason Daly, Zoe Henscheid, Sharon Fitzsimmons, Carol Pomales, Brad Young, and Hans Miller.
Finally, the authors are grateful to the ITEA editorial team for their invaluable review of this article.
References
10 U.S.C. § 4401. 2024. “Requirement for modular open system approach in major defense acquisition programs; definitions.”
Adcock, Rick, Hillary Sillitto, and Sarah Sheard. 2025. Guide to the Systems Engineering Body of Knowledge (SEBoK). November 17. Accessed February 12, 2026. https://sebokwiki.org/wiki/Complexity.
AFMC. 2023. “Guidebook for Implementing Modular Open Systems Approaches in Weapon Systems.” Dayton, OH.
Amissah, Matthew. 2018. “A Framework for Executable Systems Modeling.” PhD Diss. Norfolk, VA: Old Dominion University, August. Accessed February 11, 2026. https://digitalcommons.odu.edu/emse_etds/31/.
ASN(RD&A). 2025. Naval Modular Open System Approach Guidebook v1.0. Department of the Navy.
Banerjee, Bonny, Kishor Chakravarthy, William Fisher, Jeremy Werner, Robert Riley, Erwin Sabile, James Sabino, et al. 2024. “Digital Twin: A Quick Overview.” The ITEA Journal of Test and Evaluation 45 (1). Accessed February 11, 2026. https://itea.org/journals/volume-45-1/digital-twin-a-quick-overview/.
Beck, Kent, Mike Beedle, Arie van Bennekum, Alistair Cockburn, Ward Cunningham, Martin Fowler, and James Grenning. 2001. “Manifesto for Agile Software Development.” Agile Alliance. Accessed February 12, 2026. https://agilemanifesto.org/.
Beers, Suzanne. 2024. “Decision-Supporting Capability Evaluation throughout the Capability Development Lifecycle.” The ITEA Journal of Test and Evaluation 45 (1). Accessed February 11, 2026. https://itea.org/journals/volume-45-1/decision-supporting-capability-evaluation-throughout-the-capability-development-lifecycle/.
Boehm, Barry W. 1988. “A Spiral Model of Software Development and Enhancement.” Computer (IEEE) 21 (5): 61-72. Accessed February 12, 2026. doi:10.1109/2.59.
Campos, José, Andrea Arcuri, Gordon Fraser, and Rui Abreu. 2014. “Continuous test generation: enhancing continuous integration with automated test generation.” ASE ’14: Proceedings of the 29th ACM/IEEE International Conference on Automated Software Engineering. Västerås, Sweden: ACM. 55-66. Accessed February 12, 2026. https://dl.acm.org/doi/10.1145/2642937.2643002.
Collins, Christopher, and Kenneth Senechal. 2023. “Test and Evaluation as a Continuum.” The ITEA Journal of Test and Evaluation 44 (1). Accessed February 11, 2026. https://itea.org/journals/volume-44-1/test-and-evaluation-as-a-continuum/.
DAU. 2019. CLE019 Modular Open Systems Approach. Accessed February 12, 2026. https://www.dau.edu/courses/cle-019-0.
—. 2025. CONOPS. May. Accessed February 11, 2026. https://www.dau.edu/acquipedia-article/concept-operations-conops.
DAU. 2023. “Defense Acquisition Guidebook.” 3–2.4.2. Accessed February 11, 2026. https://www.dau.edu/sites/default/files/2023-09/DAG-CH-3-Systems-Engineering.pdf.
Dove, Rick, and Bill Schindel. 2019. “Agile Systems Engineering Life Cycle Model for Mixed Discipline Engineering.” 29th Annual INCOSE International Symposium. Orlando, FL. Accessed February 11, 2026. https://www.parshift.com/s/ASELCM-05Findings.pdf.
Dove, Rick, Kerry Lunney, Michael Orosz, and Mike Yokell. 2023. “Agile Systems Engineering – Eight Core Aspects.” 33rd Annual INCOSE International Symposium. Honolulu, HI: INCOSE. Accessed February 11, 2026. https://www.researchgate.net/publication/373092973_Agile_Systems_Engineering_-Eight_Core_Aspects.
Elbaum, Sebastian, Gregg Rothermel, and John Penix. 2014. “Techniques for improving regression testing in continuous integration development environments.” Proceedings of the 22nd ACM SIGSOFT International Symposium on Foundations of Software Engineering. Hong Kong. 235-245. Accessed February 12, 2026. doi:10.1145/2635868.2635910.
Enayet, Nabil, and Gokul Subramanian. 2024. Inside the Crucible: Anduril’s Secret to Rapid Development at Scale. March 20. Accessed February 12, 2026. https://www.anduril.com/news/anduril-project-crucible.
FACE. 2026. Accessed February 12, 2026. https://www.opengroup.org/face.
GAO. 2023. “Leading Practices: Iterative Cycles Enable Rapid Delivery of Complex, Innovative Products.” Report to Congressional Committees, Washington, D.C. Accessed February 11, 2026. https://www.gao.gov/assets/gao-23-106222.pdf.
GAO. 2025. “Weapon Systems Annual Assessment: DOD Leaders Should Ensure That Newer Programs Are Structured for Speed and Innovation.” Report to Congressional Committees, Washington, D.C. Accessed February 11, 2026. https://www.gao.gov/assets/gao-25-107569.pdf.
Johnson, Suzette, and Robin Yeman. 2023. Industrial DevOps: Build Better Systems Faster. Portland, OR: IT Revolution Press.
Miller, Hans. 2025. “Then What? The Need for Iterative Assessments to Achieve Successful Operational Capabilities.” The ITEA Journal of Test and Evaluation 46 (2). Accessed February 11, 2026. doi:10.61278/itea.46.2.1004.
Nierstrasz, Oscar, and Theo Dirk Meijler. 1995. “Research Directions in Software Composition.” ACM Computing Surveys 27 (2): 262-264. Accessed February 11, 2026. https://dl.acm.org/doi/abs/10.1145/210376.210389.
NSF. 2025. Cyber-Physical System Foundations and Connected Communities (CPS). July 25. Accessed August 2025. https://www.nsf.gov/funding/opportunities/cps-cyber-physical-system-foundations-connected-communities.
OSD. 2025. “Directing Modern Software Acquisition to Maximize Lethality.” March 6. Accessed February 11, 2026. https://media.defense.gov/2025/Mar/07/2003662943/-1/-1/1/DIRECTING-MODERN-SOFTWARE-ACQUISITION-TO-MAXIMIZE-LETHALITY.PDF.
OUSD(A&S). 2022. “DoD Instruction 5000.02 Operation of the Adaptive Acquisition Framework.” Washington, D.C.
OUSD(A&T). 1994. “Acquisition of Weapons Systems Electronics Using Open Systems Specifications and Standards.” Washington, D.C., November 29. Accessed February 12, 2026. https://www.dau.edu/sites/default/files/Migrated/CopDocuments/OSJTF%20Memo%20and%20Charter%20November%2029%201994.pdf.
OUSD(A&T). 2004. “OS-JTF Program Manager’s Guide: A Modular Open Systems Approach (MOSA) to Acquisition.” Guidebook. Accessed February 12, 2026. https://www.acqnotes.com/Attachments/Program%20Managers%20Guide%20to%20Open%20Systems,%20Sept%202004.pdf.
OUSD(A&T). 1998. “Report of the Defense Science Board Task Force on Open Systems.” Report, Washington, D.C. Accessed February 12, 2026. https://apps.dtic.mil/sti/tr/pdf/ADA358287.pdf.
OUSD(AT&L) DoDD 5000.01. 2003. “The Defense Acquisition System.” Washington, D.C.
OUSD(R&E) Defense Science Board. 2024. “Test and Evaluation.” Report, Washington, D.C. Accessed February 11, 2026. https://dsb.cto.mil/wp-content/uploads/2024/08/DSB_TE-Report_UNCLASS_FINAL_August-2024_Stamped.pdf.
OUSD(R&E). 2023. DoD Instruction 5000.97 Digital Engineering. Washington, D.C.: DoD. Accessed February 11, 2026. https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodi/500097p.PDF?ver=bePIqKXaLUTK_Iu5iTNREw%3D%3D.
OUSD(R&E). 2023. “DoD Systems Engineering Plan (SEP) Outline.” Washington, D.C.
OUSD(R&E). 2022. “Engineering of Defense Systems Guidebook.” Washington, D.C.
OUSD(R&E) Mission Capabilities. 2023. “DoD Mission Engineering Guide Version 2.0.” Washington, D.C. Accessed February 11, 2026. https://ac.cto.mil/wp-content/uploads/2023/11/MEG_2_Oct2023.pdf.
OUSD(R&E). 2020. Modular Open Systems Approach (MOSA) Reference Frameworks in Defense Acquisition Programs. Washington, D.C.: DDR&E for Advanced Capabilities.
OUSD(R&E) SE&A. 2025. “Implementing a Modular Open Systems Approach in DoD Programs.” Washington, D.C.
OUSD(R&E). 2021. “Systems Engineering Guidebook.” Washington, D.C.
Pennock, Michael J., and Jon P. Wade. 2015. “The Top 10 Illusions of Systems Engineering: A Research Agenda.” 2015 Conference on Systems Engineering Research. Procedia Computer Science. 147-154. Accessed February 12, 2026. doi:10.1016/j.procs.2015.03.033.
Perrow, Charles. 1999. Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press.
Provost, K., C. Stafford, and N. Jones. 2024. Model Validation Levels: Methods and Implementation. Technical report, Wright-Patterson Air Force Base, OH: Air Force Institute of Technology, Scientific Test and Analysis Techniques Center of Excellence.
Shahin, Mojtaba, Muhammad Ali Babar, and Liming Zhu. 2017. “Continuous Integration, Delivery and Deployment: A Systematic Review on Approaches, Tools, Challenges and Practices.” IEEE Access (IEEE) 5: 3909-3943. Accessed February 12, 2026. doi:10.1109/ACCESS.2017.2685629.
Sheard, Sarah A, and Ali Mostashari. 2008. “Principles of complex systems for systems engineering.” The Journal of The International Council on Systems Engineering 12 (4): 295-311. Accessed February 12, 2026. doi:10.1002/sys.20124.
Shewhart, Walter A. 1939. Statistical Method from the Viewpoint of Quality Control. Washington, D.C.: The Graduate School, U.S. Department of Agriculture.
Stolberg, Sean. 2009. “Enabling agile testing through continuous integration.” AGILE ’09: Proceedings of the 2009 Agile Conference. Washington, D.C.: IEEE Computer Society. 369-374. Accessed February 12, 2026. https://dl.acm.org/doi/10.1109/AGILE.2009.16.
Strei, Captain Thomas J. 2003. Open Architecture in Naval Combat System Computing of the 21st Century: Network-Centric Applications. PEO IWS, Washington, D.C.: US Navy. Accessed February 12, 2026. https://apps.dtic.mil/sti/tr/pdf/ADA467321.pdf.
U.S. Congress. 2014. NDAA FY15. Washington, D.C.: Pub. L. No. 113-291, 128 Stat. 3292.
U.S. Congress. 2016. NDAA FY17. Washington, D.C.: Pub. L. No. 114-328, 130 Stat. 2000.
U.S. Congress. 2019. NDAA FY20. Washington, D.C.: Pub. L. No. 116-92, 133 Stat. 1198.
U.S. Congress. 2020. NDAA FY21. Washington, D.C.: Pub. L. No. 116-283, 134 Stat. 3388.
U.S. Congress. 2021. NDAA FY22. Washington, D.C.: Pub. L. No. 117-81, 135 Stat. 1541.
Vanderbilt ISIS. 2025. FACE Open Architecture Standard Support (2009-2025). Accessed February 12, 2026. https://www.isis.vanderbilt.edu/projects/facer-open-architecture-standard-support-2009-2025.
Author Biographies
Dr. William Fisher received a Ph.D. in Applied Physics from the University of Michigan. He is a Principal Systems Security Engineer with The MITRE Corporation supporting efforts to advance model-based engineering in T&E. He has been a system architect and lead systems engineer on large scale cyber-physical systems as well as solving hard problems for over 20 years in superconductivity, materials processing, optics, algorithm development, cyber security, and more in defense and academia.
Ben Jimenez is a Digital Engineering Initiative Lead at The MITRE Corporation, helping the DoW accelerate capability delivery through advanced digital engineering and automated T&E tools. Previously, he was a Project Leader at Boston Consulting Group, advising aerospace and defense executives on digital strategy. Earlier, he supported the U.S. Army Research Laboratory as an aerospace engineering contractor and led engineering teams at Whirlpool Corporation, where he earned a Lean Six Sigma Black Belt and was awarded five patents. He holds a B.S. in Aerospace Engineering from Arizona State University, an M.S. in Aerospace Engineering from the University of Maryland, College Park, and an MBA from Duke University.
Mackenzie Moss is a Lead Model-Based Systems Engineer advising on MBSE and system architecture for Space Force, Navy, and Veterans Affairs systems. He specializes in modernizing legacy systems and optimizing engineering and acquisition processes, leading efforts to reduce lifecycle costs and improve warfighter capabilities. Mackenzie served more than 12 years as an Intelligence and Signals Officer in the U.S. Army, producing joint intelligence products for CENTCOM and INDOPACOM. He holds a B.A. in Mathematics from the University of Montana and is researching advancements in formal methods for model-based engineering for his M.S. at Colorado State University.
Kevin Baliga is a Senior Systems Engineer at The MITRE Corporation, where he advances model-based systems engineering practices to enhance acquisition and development processes within the aerospace and defense sector. His expertise spans system architecture development, reliability engineering, and sustainment engineering, with a focus on supporting the full system lifecycle and improving test and evaluation to deliver enduring capabilities for the warfighter. Kevin holds a B.S. in Aerospace Engineering from the Georgia Institute of Technology and an M.S.E. in Systems Engineering from Johns Hopkins University.
Natalia Henriquez Sanchez is a Senior Systems Engineer at The MITRE Corporation. She conducts systems architecture analyses for the U.S. Air Force and U.S. Space Force and supports model-based systems engineering (MBSE) efforts to develop digital thread capabilities that work toward modernizing the defense enterprise. Natalia holds a B.S. in Physics from the University of California, San Diego, and an M.S. in Systems Architecting and Engineering from the University of Southern California.
Dewey Classification: L 681 12