Reimagining T&E for the Modern Joint Environment | ITEA Journal
SEPTEMBER 2024 | Volume 45, Issue 3
In 2022 and 2023, the Director, Operational Test and Evaluation (DOT&E) published the DOT&E Test and Evaluation (T&E) Strategy Update (DOT&E, 2022) and the DOT&E Strategy Implementation Plan (DOT&E, 2023). These documents introduce the need to reimagine T&E in a way that strongly emphasizes the operational and mission context in which systems under test are expected to perform throughout the capability lifecycle. Through the Acquisition Innovation Research Center (AIRC), DOT&E contracted the Virginia Tech Applied Research Corporation (VT-ARC), in partnership with the Virginia Tech National Security Institute (VTNSI), to develop a Joint Test Concept (JTC) in support of the Implementation Plan.
The JTC effort recognizes that traditional test and evaluation (T&E) capabilities must be stretched further than ever before, shifting the way we think about system performance and how T&E contributes to the overall assessment of measures and outcomes aligned with complex mission webs and Joint systems of systems. Critical to this effort is the collaborative approach to the JTC study design and the T&E stakeholders represented in the community of interest (COI). The JTC recognizes that a system under test is likely part of a larger system of systems, which requires the application of a capability lifecycle campaign of learning that delves into critical layer questions spanning a system functional focus, a service mission focus, a system of systems capability focus, and a Joint all-domain mission focus. The JTC primary functions (design a campaign of learning strategy, develop a campaign of learning data strategy, monitor for decision points, refine the T&E event plan, execute the event, and update strategies) enable an iterative approach to answering key T&E questions and support milestone decisions along the capability lifecycle. Next steps involve leveraging simulation exercises to stress the JTC Pilot and workshops to develop an implementation roadmap, ultimately culminating in the publication of JTC v1.0.
Keywords: Joint Test, Joint Test Concept, Joint Testing, System of Systems, Capability Lifecycle, Capability Portfolio Management, Joint All-Domain, DOT&E, Mission Thread, Mission Web, Test and Evaluation (T&E), Reimagined T&E, Joint Force, Joint Capability, Leading Change, Mission Engineering, Digital Engineering, LVC, Multi-Domain Operations Testing
The rise of peer/near-peer (P/N-P) actors in great power competition and the continued growth of global asymmetric threats indicate the Joint force will be contested in all domains during the execution of distributed, potentially non-contiguous, regular and irregular combat operations. To secure the Joint force advantage posited by the Joint Warfighting Concept 2.0, Joint readiness must stretch traditional T&E agility and capabilities further and faster than ever before. T&E must be reimagined with increased emphasis on the operational and mission context in which the system under test is expected to perform throughout the capability lifecycle. Doing so will shift the way we think about system performance and how T&E contributes to the overall assessment of measures and outcomes aligned with complex mission webs and Joint systems of systems.
The conceptual innovation required to reimagine Joint T&E is a complex problem. It must address the challenges associated with an evolving external operating environment and the internal hurdles posed by a large organizational bureaucracy in which critical stakeholders fiercely defend their metaphorical “rice bowls,” particularly resources, roles, and responsibilities. While reimagination sounds good in theory, it is useless without operational integration and adoption (Nix et al., 2023).
Director, Operational Test and Evaluation (DOT&E) outlined several lines of effort to tackle transformation in the T&E Strategy Update and Implementation Plan. Through the Acquisition Innovation Research Center (AIRC), DOT&E contracted the Virginia Tech Applied Research Corporation (VT-ARC), in partnership with Virginia Tech National Security Institute (VTNSI), to develop a Joint Test Concept (JTC). This multi-year effort, initiated in FY23, developed a JTC community of interest (COI) to serve as the guiding coalition and foundation for concept development. Integrating the COI-generated outcomes during design-based collaborative workshops, the resultant JTC pilot applies an end-to-end capability lifecycle campaign of learning approach, anchored in mission engineering, and supported by a live, virtual, constructive environment to assess material and non-material solutions’ performance, interoperability, and impact to service and Joint mission execution.
Leading change requires a customized approach. This is particularly true when a new construct like the Joint Test Concept, one that will impact a wide range of stakeholders across a complex, dynamic ecosystem, has the potential to create a multitude of organizationally disruptive second- and third-order effects. For this reason, the VT-ARC team created a customized study design that applied John Kotter’s eight-stage “Leading Change” model (Kotter, 2023), the University of St. Gallen’s macro model of Stanford’s design thinking process (Brenner & Uebernickel, 2016), Liberating Structures (Lipmanowicz & McCandless, 2016), and open thinking principles (Pontefract, 2018).
VT-ARC’s design also grew and fostered a cooperative COI across diverse Joint T&E stakeholder groups within the defense ecosystem. This community worked collaboratively over a series of workshops to conceptualize and assess the future of Joint T&E. Participants addressed current structural constraints and restraints (including policies and organizational culture) to conceive of an idealized future where key roadblocks to efficient T&E could be mitigated or removed. The design approach, depicted in Figure 1, is tailored to the complexities and peculiarities of the defense ecosystem.

Figure 1: Joint Test Concept Study Design
Recognizing that T&E data relevant to milestone decision making begins accumulating well before a system becomes a program of record, the COI determined that future T&E must move beyond discrete blocks of system and operational testing toward a connected flow across the entire capability lifecycle. Depicted in Figure 2, the capability lifecycle starts with the identification of a mission or capability need. The more traditional system lifecycle commences with the down-select of a solution approach, followed by a commitment to design, develop, and deliver that solution (or solutions) along a defined acquisition pathway. The capability lifecycle begins before system selection and continues beyond system deployment. It includes ongoing T&E and performance monitoring in the field to support long-term sustainment, operational resilience, and Joint capability assessments.
Capability and system lifecycle activities enable capability portfolio management which shifts away from the system suboptimization commonly practiced today. Capability portfolio management supports capability and system milestone decision making through quality data collection and appropriate access across industry, acquisition, and operational communities. It helps ensure appropriate operational redundancies across the capability portfolio in support of complex, operational mission webs. Additionally, capability portfolio management enables ongoing integration of emergent requirements following fielding and throughout sustainment.

Figure 2: The Multi-System Capability Lifecycle Approach
To support the shift from a system-centric approach toward a system of systems capability lifecycle approach, the COI imagined that T&E layers, depicted in Figure 3, would align with the systems engineering hierarchy. The layers are not applied in a strictly hierarchical fashion, however; instead, they allow for an iterative sequencing in which all layers may be applied multiple times across a capability lifecycle.
Recognizing the importance of both developmental and operational testing (DT & OT), the research team identified DT and OT critical requirements and success metrics but purposefully avoided using the terms to encourage innovative, creative, and critical thinking. As a result, the COI integrated the DT and OT critical requirements into a new framework featuring system performance, capability immersion, and Joint capability demonstration layers, shown in Figure 3.

Figure 3: Joint Test Concept Layers

These layers provide T&E professionals with four focal points, illustrated in the overlapping areas of the Figure 3 Venn diagram. They include a system functional focus, a system of systems capability focus, a service mission focus, and a Joint all-domain mission focus. This allows the new T&E concept to be applied within the current organizational structure and to be flexible enough to shift as the defense ecosystem adapts to the condensed innovation acquisition timeline. The successful implementation of these layers and associated focus areas across the capability lifecycle requires supporting elements, including high-trust data, training, resourcing, organizational structure, and authorities, in addition to networked live, virtual, constructive (LVC) environment access.
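To make the layer and focal-point vocabulary concrete, the following Python sketch (not part of the JTC itself) shows one notional way a test team could tag key T&E questions with the layer and focal point they inform. The enum names come from the text above; the tagging structure, class names, and example question are illustrative assumptions.

```python
from dataclasses import dataclass
from enum import Enum, auto


class JtcLayer(Enum):
    """The three JTC layers shown in Figure 3."""
    SYSTEM_PERFORMANCE = auto()
    CAPABILITY_IMMERSION = auto()
    JOINT_CAPABILITY_DEMONSTRATION = auto()


class FocalPoint(Enum):
    """The four focal points formed where the layers overlap."""
    SYSTEM_FUNCTIONAL = auto()
    SYSTEM_OF_SYSTEMS_CAPABILITY = auto()
    SERVICE_MISSION = auto()
    JOINT_ALL_DOMAIN_MISSION = auto()


@dataclass
class TaggedQuestion:
    """A key T&E question tagged with the layer and focal point it informs (notional)."""
    text: str
    layer: JtcLayer
    focal_point: FocalPoint


# Hypothetical example question, for illustration only.
example = TaggedQuestion(
    text="Does the system exchange track data with the Joint mission web under load?",
    layer=JtcLayer.CAPABILITY_IMMERSION,
    focal_point=FocalPoint.SYSTEM_OF_SYSTEMS_CAPABILITY,
)
```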
Extending cohesive T&E across the entire capability lifecycle enables practitioners to move away from the existing high-risk, binary pass/fail assessment approach and toward a flexible and iterative campaign of learning (see Figure 6). This allows for the natural introduction of emergent requirements, updated mission threads, and changes to existing systems of systems. Additionally, a campaign of learning provides the opportunity for T&E outcomes to be integrated into relevant operational analysis, critical capabilities development, acquisition, and operationalization.
A reimagined T&E construct must help bridge existing organizational gaps and provide a common language that integrates considerations for each stage within a capability lifecycle. Mission Engineering (ME) is defined as “an interdisciplinary process encompassing the entire technical effort to analyze, design, and integrate current and emerging operational needs and capabilities to achieve desired mission outcomes” (MEG 2.0, 2023). ME can provide a connective pathway to a traceable architecture that enables the requirements, development, acquisition, and user communities to communicate effectively and overcome barriers to efficient acquisition and fielding. While Congress incorporated the practice into 2017 legislation (Section 855), the DoD at large, and T&E specifically, have yet to train and grow a constituency of practitioners to realize the benefits.
The complexities of conducting a T&E campaign of learning throughout the capability lifecycle underscore the importance of an established reference architecture. This reference architecture would serve as an authoritative source of information about the T&E campaign of learning and create a common baseline of understanding among stakeholders. The reference architecture can then serve as a guide and a constraint to enable instantiations of multiple system of systems architectures and solutions. An overarching reference architecture framework provides strategic elements and guidance for detailed reference, data, and solution architectures to follow. From a design and engineering perspective, this guides separate design teams to make “compatible design decisions and identify interdependency of key design decisions” (Beck et al., 2023). From a capability and enterprise-wide perspective, the high-level reference architecture enables robust, scalable, interoperable, and repeatable T&E implementations.
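As a minimal, hedged sketch of this "guide and constraint" role, the fragment below assumes a reference architecture can be reduced to a set of required interfaces and data products that any conformant solution architecture must expose. The element names (mission_thread_model, test_event_log, vaultis_metadata) and the conformance check are illustrative assumptions, not JTC artifacts.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ReferenceArchitecture:
    """Notional reference architecture: the interfaces and data products every
    conformant solution architecture is expected to provide (names are illustrative)."""
    required_elements: frozenset = frozenset(
        {"mission_thread_model", "test_event_log", "vaultis_metadata"}
    )

    def conformance_gaps(self, solution_elements: set) -> set:
        """Return required elements a candidate solution architecture does not provide."""
        return set(self.required_elements) - solution_elements


ref = ReferenceArchitecture()
# A hypothetical solution architecture that is missing one required element.
print(ref.conformance_gaps({"mission_thread_model", "test_event_log"}))
# -> {'vaultis_metadata'}
```

Separate design teams could run the same check against a shared reference architecture, which is one concrete way the "compatible design decisions" benefit noted above might be realized.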
A JTC reference architecture should not complicate T&E processes, artifacts, or stakeholder roles and responsibilities. Instead, it must cultivate and organize a cohesive stakeholder understanding for JTC implementation. It seeks to build upon and leverage existing Joint, mission, and capability-based policies, authorities, and lifecycle artifacts to achieve efficiencies. This ensures Joint perspectives and equities are incorporated iteratively and effectively. The COI must continue to identify ways in which the JTC pilot could create such complications through design-based collaboration and simulation events to ensure mitigation.
The COI concluded that in many cases, critical stakeholders do not have the appropriate resources or authorities to implement a capability lifecycle T&E campaign of learning. To address this challenge, the COI recommended the development of a Joint T&E strategy team (J-TEST) that would be resourced and authorized to lead the implementation and to provide support to the various organizations responsible for development, acquisition, fielding, and sustainment of new and legacy systems. The COI envisioned separate but overlapping J-TEST lines of effort, each featuring a director: environment, security, strategy, integration, communication, information, and personnel. The J-TEST would provide the human connectivity across the capability lifecycle to ensure systems integrate Joint mission-centric T&E considerations.
While integrating these findings, the study team developed a JTC pilot that envisions T&E across a campaign of learning through the iterative execution of six primary functions. The first two functions begin early in the capability lifecycle with the formulation of the JTC Capability Lifecycle Campaign of Learning and Data Strategies that guide the rest of the functions. These strategies capture the overarching plans, process steps, implementation requirements (i.e., data, information, resource, timing, and process requirements), and related resourcing needs for a specific JTC campaign of learning. The Learning Strategy and Data Strategy create a pathway for the successful execution of the remaining JTC functions displayed in Figure 4.

Figure 4: The six primary Joint Test Concept Functions
The “Design JTC Capability Lifecycle Campaign of Learning Strategy” function is the first major step that follows initial ME. It should be revisited, updated, and executed as a JTC test program iterates across the capability lifecycle. This first function captures and tracks key T&E questions, related decision points, and associated data/information requirements that will be used to drive, coordinate, and integrate T&E activities and assessments at all three JTC layers and across JTC structural elements. T&E questions should align with, plug into, and ultimately integrate with key program of record artifacts once they are created, and should be captured in decision support tools as appropriate. While not called out as separate functions, risk management and uncertainty quantification should be incorporated into any JTC Capability Lifecycle Campaign of Learning Strategy. A T&E risk management plan will be vital early in the capability lifecycle and will inform the program manager’s risk management plan if the system under test matures into a program of record.
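One notional way to picture what the Learning Strategy captures is a simple record structure that links each key T&E question to the decision point it informs, its data requirements, and the JTC layer at which it is asked. The Python below is an assumption-laden sketch, not a prescribed JTC data model; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class KeyQuestion:
    """A key T&E question and the evidence needed to answer it (notional structure)."""
    question: str
    decision_point: str           # the milestone or decision the answer informs
    data_requirements: List[str]  # data/information needed to answer the question
    layer: str                    # JTC layer at which the question is asked


@dataclass
class LearningStrategy:
    """Notional container for a JTC Capability Lifecycle Campaign of Learning Strategy."""
    capability: str
    key_questions: List[KeyQuestion] = field(default_factory=list)

    def questions_for(self, decision_point: str) -> List[KeyQuestion]:
        """Questions whose evidence feeds a given decision point."""
        return [q for q in self.key_questions if q.decision_point == decision_point]
```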
The second function, “Develop JTC Capability Lifecycle Campaign of Learning Data Strategy”, occurs either concurrently with or directly following the development of the Learning Strategy. This function details the strategy, requirements, and action plan for managing data, information, and knowledge surrounding JTC activities to ensure data remains visible, accessible, understandable, linked, trustworthy, interoperable, and secure (VAULTIS) throughout the entire campaign of learning.
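A hedged illustration of the Data Strategy's VAULTIS goal: the sketch below assumes each dataset in the campaign of learning carries a small metadata record tracking which VAULTIS attributes it satisfies, so gaps can be surfaced early. The record structure, field names, and example dataset identifier are illustrative, not drawn from the JTC.

```python
from dataclasses import dataclass, fields


@dataclass
class VaultisRecord:
    """Tracks whether a T&E dataset meets each VAULTIS attribute (illustrative only)."""
    dataset_id: str
    visible: bool = False
    accessible: bool = False
    understandable: bool = False
    linked: bool = False
    trustworthy: bool = False
    interoperable: bool = False
    secure: bool = False

    def gaps(self) -> list:
        """Return the VAULTIS attributes this dataset does not yet satisfy."""
        return [f.name for f in fields(self)
                if f.name != "dataset_id" and not getattr(self, f.name)]


# Hypothetical dataset that is visible and secure but missing the other attributes.
record = VaultisRecord(dataset_id="live-fire-event-007", visible=True, secure=True)
print(record.gaps())  # ['accessible', 'understandable', 'linked', 'trustworthy', 'interoperable']
```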
Functions 3-6 occur iteratively throughout the capability lifecycle, with some sequential relationships between functions. Function 3, “Monitor for JTC Strategy Decision Points”, carries out the process formulated in the JTC Capability Lifecycle Campaign of Learning Strategy for monitoring decision points and events. The “Refine JTC T&E Event Plan” function is prompted at the entry decision point for a distinct JTC joint-level test event or when the JTC T&E program reaches a planned/known test event within the timetable specified in the JTC Capability Lifecycle Campaign of Learning Strategy; Function 3 can therefore have a sequential relationship with Function 4. The “Execute JTC T&E Event” function occurs when a distinct JTC event is planned. The JTC T&E program will organize and conduct the planned test event and then use the JTC Capability Lifecycle Campaign of Learning Strategy and JTC Capability Lifecycle Campaign of Learning Data Strategy to conduct a comprehensive assessment of outcomes against the key T&E questions, decision points, and related evaluation criteria.
The “Update JTC T&E Strategy” function seeks to integrate new information that informs refinements to any of the strategies, plans, processes, and related planning artifacts created throughout the execution of Functions 1-5. This serves as an important feedback loop to ensure the JTC T&E process remains adaptable and flexible.
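To show how the six functions could iterate, the following control-flow sketch strings them together in the loop described above. The function names follow the text; the orchestration logic, stub behavior, and state dictionary are assumptions made purely for illustration.

```python
# Notional orchestration of the six JTC functions; names follow the text,
# while the control flow, stubs, and state dictionary are illustrative assumptions.

def design_learning_strategy(state):      # Function 1
    state["learning_strategy"] = "initial campaign of learning strategy"

def develop_data_strategy(state):         # Function 2
    state["data_strategy"] = "initial data strategy"

def monitor_for_decision_points(state):   # Function 3: next decision point/event, if any
    return state["pending_events"].pop(0) if state["pending_events"] else None

def refine_event_plan(state, event):      # Function 4
    state["event_plan"] = f"refined plan for {event}"

def execute_event(state, event):          # Function 5: conduct event and assess outcomes
    state["assessments"].append(f"assessment of {event} against key T&E questions")

def update_strategies(state):             # Function 6: feedback into earlier artifacts
    state["learning_strategy"] += " (updated)"
    state["data_strategy"] += " (updated)"


def run_campaign_of_learning(planned_events):
    """Iterate Functions 3-6 for each decision point after Functions 1-2 set the pathway."""
    state = {"pending_events": list(planned_events), "assessments": []}
    design_learning_strategy(state)
    develop_data_strategy(state)
    while (event := monitor_for_decision_points(state)) is not None:
        refine_event_plan(state, event)
        execute_event(state, event)
        update_strategies(state)          # the feedback loop keeps both strategies current
    return state["assessments"]


print(run_campaign_of_learning(["system functional event", "Joint all-domain mission event"]))
```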

Figure 5: The Joint Test Concept Layers and Structural Elements
The JTC Functions loop emphasizes that a JTC capability lifecycle campaign of learning requires iterative execution and feedback between functions to ensure the layers and structural elements (pictured around the layers in Figure 5) are adequately incorporated into T&E activities across the capability lifecycle.
In its entirety, the JTC pilot, or the current instantiation of the JTC, applies an end-to-end capability lifecycle approach, anchored in mission engineering, reinforced by decision support tools, and supported by LVC environments to assess material and non-material solutions’ performance, interoperability, and impact to service and Joint mission execution. Figure 6 visualizes a notional JTC capability lifecycle campaign of learning.

Figure 6: Joint Test Concept Campaign of Learning Approach
The iterative nature of mission engineering and JTC execution throughout the capability lifecycle shifts away from the current binary pass/fail T&E construct. This shift requires that both the JTC Capability Lifecycle Campaign of Learning Strategy and the JTC Capability Lifecycle Campaign of Learning Data Strategy identify and leverage efficiencies that reduce resource commitments, such as Service and Joint operational and training exercises, digital models, LVC environments, and well-written contracts.
Additionally, the JTC recognizes the existing and developing decision support processes that may generate data relevant to JTC activities. As such, the JTC reference architecture must support interoperability with existing or developing decision support processes to leverage efficiencies, reduce resource commitments, and enable timely data-informed decisions.
The Joint force will be contested in all domains during the execution of distributed, potentially non-contiguous, combat operations. Furthermore, the attainment of comprehensive Joint readiness will stretch traditional T&E capabilities further than ever before. The JTC pilot offers a paradigm shift in how the DoD approaches T&E by evolving to a continuously iterative campaign of learning across the capability lifecycle; one that touches the entire T&E enterprise and associated functions, data, and information artifacts for which each stakeholder is responsible.
To change how we approach and execute T&E, it is critical to ensure that stakeholders are speaking a common language and using a common framework for architecture and solution development as well as JTC implementation. The JTC has the potential to make an outsized impact on existing and emerging capability gaps by generating efficiencies throughout the capability lifecycle that ensure DoD provides warfighters with the right tools at the right time with the right training.
The JTC multi-year effort is ongoing. Future JTC workshops at the close of FY24 and throughout FY25 will continue to identify and address unique JTC challenges and areas of discovery. Additionally, close coordination with the JT&E office will expand the COI to ensure representation across all necessary competencies. COI participation is critical for both JTC efficacy and adoption, with COI members serving as innovation change agents who facilitate concept integration. The VT-ARC study team will continue to integrate new organizations and individuals into the COI. The JTC's comprehensive, collaborative innovation process will benefit from COI expansion, ensuring the JTC delivers value across the defense ecosystem.
The JTC COI workshop participants continue to be critical to ensuring JTC outcomes meet the needs of the diverse T&E enterprise and stakeholder population. The reports and workshop design and execution rely on input from a smaller COI working group that includes VTNSI, MITRE, TROIKA, GTRI, JHU-APL, and Hepburn and Sons. The VT-ARC team would like to thank all of these organizations, in addition to government collaborators from DOT&E, USD(R&E), NSWC Dahlgren, ATEC, and OPTEVFOR, for their support and dedication to the project.
Beck, Stephen, Jerry Jackson, and Mark Kasunic. 2023. Reference Architecture, Mission Threads and Software Integration. https://apps.dtic.mil/sti/tr/pdf/AD1085448.pdf (accessed May 2, 2024).
Brenner, Walter, and Falk Uebernickel. Design thinking for innovation: Research and practice. Cham: Springer, 2016.
Department of Defense Office of the Undersecretary of Defense for Research and Engineering, 2023. The Mission Engineering Guide (MEG) 2.0. https://ac.cto.mil/wp-content/uploads/2023/11/MEG_2_Oct2023.pdf (accessed May 2, 2024).
Department of Defense Director, Operational Test and Evaluation, 2022. DOT&E Strategy Update 2022.
Department of Defense Director, Operational Test and Evaluation, 2023. DOT&E Strategy Implementation Plan – 2023.
Joint Warfighting Concept 2.0, May 2022. https://www.jcs.mil/Doctrine/Joint-Concepts/
Kotter, John P. Leading change. Vancouver, B.C.: Langara College, 2023.
Lipmanowicz, Henri, and Keith McCandless. The surprising power of liberating structures: Simple rules to unleash a culture of innovation. Seattle, WA: Liberating Structures Press, 2016.
Nix, Maegen, Christina Houfek, Timothy Crone, Kobie Marsh, Tanushri Roy, and Daniel Wolodkin. Training in Innovation and Emerging Technology Adoption. Washington, DC: Acquisition Innovation Research Center. (2023).
Pontefract, Dan. Open to Think: Slow Down, Think Creatively and Make Better Decisions. Figure 1 Publishing, 2018.
Shanks, Michael. “An Introduction to Design Thinking Process.” Stanford.edu. https://web.stanford.edu/~mshanks/MichaelShanks/files/509554.pdf (accessed May 3, 2024).
Christina Houfek joined Virginia Tech’s Applied Research Corporation in 2022 following her time as an Irregular Warfare Analyst at the Naval Special Warfare Command and as Senior Professional Staff at the Johns Hopkins University Applied Physics Laboratory. She holds an M.A. in Leadership in Education from Notre Dame of Maryland University, and a Graduate Certificate in Terrorism Analysis awarded by the University of Maryland. Her areas of expertise include concept development, curriculum design, requirements and capability analysis, design thinking, and conflict resolution.
Maegen Nix, Ph.D., is a veteran and a former intelligence officer with 25 years of experience in the national security community and academia. She currently serves as the Director of the Decision Science Division at Virginia Tech’s Applied Research Corporation. Her civilian career has focused on the development of portfolios related to irregular warfare and insurgencies, cybersecurity, critical infrastructure security, national communications, autonomous systems, and intelligence. Dr. Nix earned a Ph.D. in government and politics from the University of Maryland, an M.A. in political science from Virginia Tech, and a B.S. in political science from the U.S. Naval Academy.
Natalie Wells is a systems engineer and project manager at Virginia Tech’s Applied Research Corporation. Previously, she was a project manager and analytic methods analyst at the Johns Hopkins University Applied Physics Laboratory, developing streamlined methods for DoD user decision support. She has also worked at the Department of Homeland Security and the Office of Naval Intelligence. She holds an M.S. in Systems Engineering with focuses on Engineering Management and Human Systems Engineering, and a B.A. in Intelligence Studies. Her areas of expertise include systems engineering, intelligence analysis tradecraft, process engineering, and analytic methodology development.