JUNE 2025 | Volume 46, Issue 2
IN THIS JOURNAL:
- Issue at a Glance
- Chairman’s Message
Workforce of the Future
- Encouraging Diversity in AI Test and Evaluation
Technical Articles
- Model Based Test and Evaluation Master Plan Technical Introduction
- Integrating RAG, HCD, and PD in MBSE for Mission Problem Framing
- Then What? The Need for Iterative Assessments to Achieve Successful Operational Capabilities
- Surpass the Adversary: Enhanced Mission Training through Digital Engineering
- Adaptive Algorithms for LIDAR Semantic Segmentation on Edge Devices
- 2025 AI in T&E Forum
- UC UK ITEA Event Summary
- AI and ML Methods in Verification and Validation
News
- Association News
- Chapter News
- Corporate Member News
Then What? The Need for Iterative Assessments to Achieve Successful Operational Capabilities

Hans Miller
Chief Engineer, Research & Advanced Capability Development (N223)
The MITRE Corporation; McLean, VA
This is the third in a series of articles examining how the United States (U.S.) Department of Defense (DoD) can leverage existing methodologies and new capabilities to enable a connected campaign of learning that will help guide technology development and adapt operations and infrastructure to achieve rapid advances in combat capability. This article focuses on some of the information we need to know, and when to collect it, to inform the strategy for transitioning innovative technology into a combat capability.
The first article covered the overall concept of test and evaluation as a continuum (Collins and Senechal, 2023). The second covered how the DoD can apply a decision support evaluation framework as a disciplined methodology to inform decisions across the spectrum of technology development (Beers, 2024).
The view, opinions, and/or findings contained in this report are those of The MITRE Corporation and should not be construed as an official Government position, policy, or decision, unless designated by other documentation. This technical data deliverable was developed using contract funds under Basic Contract No. W56KGU-18-D-0004.
Abstract
This article examines the DoD’s need for continuous iterative assessments of new technology to effectively and efficiently accelerate acquisition and achieve a successful combat capability. It highlights the importance of understanding how new technologies interact with existing systems and whether warfighters can use them. It discusses the necessity of early and continuous data collection and experimentation to inform decision-making and emphasizes the need to grow an understanding of DOTmLPF-P (Doctrine, Organization, Training, Materiel, Leadership, Personnel, Facilities and Policy) equities throughout an experimentation campaign. This ensures not only that new technologies are fielded, but also that the warfighter is postured to successfully integrate those technologies into operations.
Keywords: Transition, speed of relevance, integrated capability, experimentation, campaign of learning, valley of death
Introduction
“Success does not go to the country that develops a new technology first, but rather, to the one that better integrates it and more swiftly adapts its way of fighting.”
– James Mattis (Mattis, 2018)
The current state of global strategic competition has created a clear imperative for the Department of Defense (DoD) to move faster to bring new technologies into combat operations. In response, the DoD has adopted new adaptive acquisition frameworks, initiated efforts to reform requirements development, and written a new science and technology strategy. In addition, the war in Ukraine has shown the asymmetric impact autonomous systems can have against legacy capabilities—one response is the DoD Replicator Initiative to field mature commercial technologies “to supplement” more exquisite systems (Congressional Research Service, 2024).
These efforts trend toward more iterative approaches to getting a delivery to the field, taking risks, and improving capability over time. The language of these initiatives focuses on delivering the technology, but it raises a question—then what? How well do we understand how these technologies are used with other systems? How can we posture the warfighter to use these systems quickly and effectively? How are we collecting and sharing the data to understand the improvements needed for the next iteration?
To efficiently address these questions for iterative acquisition, the definition of success must shift from simply fielding technology to achieving integrated operational capability with the technology. Historically, understanding fully integrated capability has come from analysis processes established prior to requirement approval (such as the Joint Capabilities Integration and Development System (JCIDS)) or from formal operational testing tied to specific program events. For example, supportability and logistics engineering guidance recommends formal testing to understand product support considerations such as maintainability, reliability, and sustainability (U.S. Department of Defense, 2013). However, a formal linear testing process may not be an efficient means of delivering a capability. Formal testing typically occurs late in a system’s development, as part of a linear progression of test events, and therefore often comes too late to have much influence on the system design.
In the same way the DoD recognizes the need to move away from linear event-driven milestones to more iterative acquisition processes, there is a corresponding need for a paradigm shift to more iterative data collection. This data collection, for both rapid development and fielding efforts, should focus on demonstrating operational outcomes that are not just accomplished during formal designated testing (Miller, 2017). Some accelerated acquisition efforts are not formal programs of record and may only have experimentation events to generate the functional and operational understanding of a system’s operational integration and performance. This article focuses on those accelerated acquisition efforts that do not have a formal developmental and operational test strategy, and it looks at the questions DoD acceleration efforts should ask early and continually to enable successful mission outcomes.
Background – Drivers and the Need to Understand Integrated Operational Capability
DoD leadership is driving the need for technology development and acceleration efforts to focus on integrated operational capability. In an August 2023 speech, Deputy Secretary of Defense Kathleen Hicks stated, “we must deliver safe and reliable, combat-credible capabilities at speed and scale.” In September 2023, DoD Directive 7045.20, Capability Portfolio Management, was released “to frame DoD-level capability decisions in a mission context to ensure delivery of integrated and innovative risk-informed solutions to meet strategic objectives.” These statements drive the need not only to deliver capability quickly but also to deliver an integrated operational capability that is risk informed.
The process of developing prototypes to understand how a technology can integrate into a networked force as part of a system of systems is well established, demonstrated in events such as Joint Capability Technology Demonstrations; however, following a linear process increases the overall time it takes to achieve integrated operational capability. Established prototyping processes and subsequent formal testing for programs of record help to address “safe and reliable” and “combat credible,” but they take time, which must be balanced against the direction of “speed and scale.” The push for speed is aided by the discipline of mission engineering (ME), which addresses the question of building the right things: which systems are needed, and at what scale, to have a mission impact. Mission engineering informs system design, integration considerations, and capability development, ensuring that systems are designed and built to meet mission requirements effectively. To address both the suitability and effectiveness of systems and the speed and scale demanded by the current operational environment, disciplined data gathering and analysis must be balanced against learning just enough about a technology’s mission value and its feasibility for integration into combat operations. Part of that balance is achieved not by binning the analysis into formal linear processes but by iteratively weaving the understanding of those equities across development activities.
Early Assessment Focused on Operational Value
“We will accelerate the transition of the technologies that have the most operational value from prototypes into products and prepare for production and acquisition early on.” – 2023 U.S. DoD National Defense Strategy for Science and Technology
Determining the operational value of a proposed technology depends on two main criteria: 1) Is the technology solving an operational problem or adding mission value (this includes questions of suitability, effectiveness, and survivability)? 2) What are the anticipated system integration and operational integration challenges? Figure 1 shows an early assessment decision tree that includes these questions.
Figure 1 – Early Questions for Capability Development Initiatives
The starting point for this decision tree could come either from an innovation out of a technology development effort (a lab, a Federally Funded Research and Development Center (FFRDC), the Defense Advanced Research Projects Agency (DARPA), etc.) or from a mature commercial capability applied to a DoD context. In either case, the first question of this decision tree should be: what is the context? Early research and development should be broad in its exploration of concepts and ideas. However, as technology readiness advances out of the lab toward a concept prototype, or when assessing commercial technology for DoD use, there should be an initial understanding of the type of user and the operational application. Is this something intended to solve a focused tactical problem, or is it a technology intended to be broadly applied, impacting multiple weapon systems? This understanding can be informed by model-based mission analysis, tabletop wargaming scenarios, or other means. This context is not fixed over time and can be updated with iterative acquisition efforts, but an initial understanding is necessary for future decision making.
The second question reflects a DoD Defense Innovation Organization best practice highlighted in a 2023 RAND report (Kotila et al., 2023), which recommends that work “be led by a DoD problem rather than the innovative dual-use technology.” Several efforts today address this issue; one such example is the Rapid Development Experimentation Reserve initiative, which uses mission thread analysis and wargaming to inform technology acquisition decisions. Through these types of initiatives, it is important to capture the critical mission assumptions used in the initial experimentation activities so those assumptions can be continually assessed and validated as the technology matures and becomes operational.
The third question addresses the fact that for a technology to become a combat capability, an early understanding of the feasibility of integrating it into a system or into operations is required. Figure 1 captures this by asking, “What are the system integration and operational integration challenges and are they feasible to overcome?” While transition readiness levels, manufacturing readiness levels, production readiness levels, and integration readiness levels all help provide a broad and informative landscape for understanding transition, missing across these criteria are questions about whether the technology is feasible for operational use. Similarly, DoD Instruction 5000.81 (Urgent Capability Acquisition), DoD Instruction 5000.80 (Operation of the Middle Tier of Acquisition), and DoD Instruction 5000.85 (Major Capability Acquisition) include entry criteria that address validated warfighter issues, acquisition strategies, and scaling, but they do not address the feasibility of a solution for operational use. While these updates made great strides in improving acquisition timelines, they lost the discussion of, and emphasis on, understanding DOTmLPF-P (Doctrine, Organization, Training, Materiel, Leadership, Personnel, Facilities and Policy) equities that can help inform the feasibility of integrating and using a technology in operations.
The point here is that a balance is needed between going fast and gaining an initial understanding of the equities required for a system to be used in combat operations. While there is a valid argument that many updates and lessons can occur in the field, that argument assumes the system can be used by the operators. There can be situations where a technology’s fundamental approach creates a barrier to its operational use; the potential for this situation drives the need for some initial feasibility questions to be understood prior to initial deployment. Similarly, we can learn about improvements in logistics and supportability over time, but there may be fundamental aspects of the technology in a given context that make it infeasible to field. For example, an experimental scope for a Marine rifle addressed specific operational challenges, but the scope required batteries, and the need for individual riflemen to carry spare batteries to use their rifles proved an operational disqualifier (Bermiss, 2025).
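To make the gating logic of Figure 1 concrete, the short sketch below encodes the three early-assessment questions as a simple decision function. This is a minimal illustration, not any DoD process: the question wording is paraphrased from Figure 1, and the names, types, and outcomes are hypothetical.

```python
"""Illustrative sketch of the Figure 1 early-assessment questions.

All names and outcomes are hypothetical; the article defines the
questions, not an implementation.
"""
from dataclasses import dataclass
from enum import Enum, auto

class Decision(Enum):
    CONTINUE = auto()  # proceed with development and transition planning
    RESCOPE = auto()   # reset expectations or reframe the effort
    OFFRAMP = auto()   # stop, or shift to another solution

@dataclass
class EarlyAssessment:
    context_understood: bool    # Q1: type of user and operational application known?
    problem_led: bool           # Q2: led by a DoD problem, not by the technology?
    integration_feasible: bool  # Q3: system/operational integration challenges surmountable?

def assess(a: EarlyAssessment) -> Decision:
    """Walk the three questions in order; any 'no' changes the path."""
    if not a.context_understood:
        return Decision.RESCOPE  # establish context before any other decision
    if not a.problem_led:
        return Decision.RESCOPE  # re-anchor the effort on an operational problem
    if not a.integration_feasible:
        return Decision.OFFRAMP  # e.g., the rifle-scope battery disqualifier above
    return Decision.CONTINUE

# Example: mission value is clear but operational integration is infeasible.
print(assess(EarlyAssessment(True, True, False)))  # Decision.OFFRAMP
```

The only point of the sketch is that the third question acts as a hard gate: feasibility for operational use can off-ramp an otherwise promising technology.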
An initial analysis of potential operational use also helps inform the transition strategy. It provides an offramp to decide whether to continue development, reset expectations, or shift to another commercial solution. If system and operational integration is not well understood, an effort could expend significant resources to get a technology into the field, but it will not deliver a true capability unless it is clear how the technology can be integrated and utilized in operations.
Accelerated acquisition programs rely on experimentation events to address questions about system performance. However, those same events can be used to validate and answer the questions suggested in this section, providing a knowledge base that will help acquisition activities understand how these new systems fit into operations and what risks are associated with adopting the new technology.
Transition Planning
Accelerated acquisition efforts still need to plan for technology transition. As Jon Lazar, the Office of the Secretary of Defense (OSD) Director for DoD Innovation and Modernization, stated in an interview for the RAND report on defense innovation organizations, “Seventy percent [of successful transition] is getting it right from the beginning” (Kotila et al., 2023). Technology transition does not have a single definition, but regardless of the definition, the intent should be to move from a laboratory or commercial technology to an integrated operational capability.
Although the intended application and use case are not fixed over time and may be updated with iterative acquisition efforts, understanding the context for that operational capability at the start of transition planning is necessary for future decision making.
A use case that involves a focused tactical problem, with limited mission integration, limited scale, and small changes to concepts of operations (CONOPS), can significantly narrow the scope of testing required, the number of stakeholders, and the risk involved. An example of this is the transition of a technology that supports a focused special operations unit where tactical teams can make small adaptations quickly. This is very different from a use case in which the introduction of a major new technology advancement needs to integrate with weapons systems and concepts of operation and could have impacts on DOTmLPF-P that require leadership adoption and potential policy changes. There have been situations where a technology development effort has the context of the second use case but only applies the limited assessment and transition strategy of the first use case. This failure to understand the context of mission and system integration up front can lead to improper planning of the transition strategy, key decision points, and experimentation design.
Once the broad context is understood, two critical tools in transition planning are model-based systems engineering (MBSE), to understand the other systems or components that affect functionality, and ME, to understand the impact (or not) of a technology on overall mission outcomes across various mission threads. As noted above, ME addresses whether we are building the right things; it informs integration considerations and capability development, ensuring that systems are designed and built to meet mission requirements effectively. This is complemented by MBSE analysis that looks at whether we are building the system right. Both tools inform technology transition planning by identifying the key engagement points for mission integration and system integration, whether the goal is to quickly get a technology directly to a combatant command or to integrate a technology with an existing program of record.
An Iterative Understanding of DOTmLPF-P
An early understanding of system and operational integration challenges (often captured by understanding DOTmLPF-P) is a means for understanding an overall mission capability. Whether an effort is for technology development or rapid procurement like Replicator (Congressional Research Service, 2024), it is critical that the warfighter not only gets a technology that answers a problem but is also provided information to inform the enabling and integrating changes necessary to make it an integrated operational capability.
A 2018 article, “Promoting Disruptive Military Innovation: Best Practices for DoD Experimentation and Prototyping Programs,” assessed case studies of 17 military technologies that were successfully introduced and significantly changed the character of military operations, rendering obsolescent the military systems and practices that predated them. The author found that in every case, the path from prototype to military capability involved complementary changes to DOTmLPF-P. The assessment concluded that “this [corresponding changes in DOTmLPF-P] is not a success factor so much as a universal requirement” (Dougherty, 2018).
By contrast, DOTmLPF-P is not mentioned in the new 5000-series instructions for DoD acquisition or in the success criteria for several DoD innovation organizations. The JCIDS manual has a DOTmLPF-P annex that was “intended to ensure Sponsors adequately address non-materiel aspects of a capability during requirement definition and capability development” (U.S. Department of Defense, October 2021).
| The “Valley of Death”
The concept of the “Valley of Death” is widely used in discussions of technology transition, and it is often presented as a singular chasm into which technology development efforts fall. It provides a visual that agencies can point to and look to for answers on how to “cross” it. There are many factors and challenges associated with the valley of death, but the fallacy of the term is treating the valley as a singular thing. In reality, it is a series of challenges and decision points along the path of technology development. There is no one right answer. Instead, it is important to know the various challenges that exist and how they affect a given effort’s development and transition strategy. And just as important is understanding where an effort is ultimately headed: what does the rim of the valley look like? |
Even though some acquisition pathways do not require adherence to JCIDS, that does not remove the need to understand these non-materiel aspects of a technology. As the DoD emphasizes speed, with many organizations and policies going after programmatic and bureaucratic hurdles, it should not do so at the expense of understanding DOTmLPF-P equities (National Academies Press, 2004).
Figure 2 below captures the different phases of technology development, the types of decisions needed to advance a technology, and the types of “valleys,” or challenges, a development effort faces along its path. Although this figure is typically associated with traditional acquisition processes, it nevertheless highlights the elements an accelerated acquisition program should consider, albeit in a faster and more condensed way. Understanding the mission context helps identify which challenges apply to a technology development effort and to transition planning. A key point in Figure 2 is what happens if you successfully navigate all of the challenges that collectively form the “valley of death.” In the visual of a valley, what happens when you get to the “rim”? If that rim, or the criterion for success, is just getting a technology to the field, then it may miss the understanding of the system and operational integration challenges needed for a true capability.
Figure 2 – Integrating DOTmLPF-P and CONOPS Understanding into the Development Cycle
Figure 2 depicts not just the decisions, challenges, and activities from concept to fielding; it also includes the goal of an integrated operational capability. It shows how building an understanding of DOTmLPF-P, concepts of employment, and CONOPS is part of the entire experimentation campaign (Alberts and Hayes, 2005). At the top of the diagram are the development stages for a technology. An effort like Replicator aligns to the scaled demonstration development stage (Congressional Research Service, 2024). At each stage of development or commercial technology assessment, there are decisions on continued investment in more advanced prototypes, procurement, scaling, and adoption of the technology into operations.
The analysis and experimentation activities (from mission engineering to large-scale live events) share data and inform the design of other events. They can and should happen in an iterative fashion and do not necessarily occur in a serial progression. For example, a small-scale live event can inform the representation of a system in a large mission virtual environment, or a mission engineering analysis can inform the design and objectives of a large-scale live test. Historically, these events have often been experiments or demonstrations that inform objectives related to the performance of a technology or demonstrate operational value. The large arrow represents how those iterative activities should also support an increased understanding of DOTmLPF-P equities that the intended operational warfighters can use to continuously update their planning for the future technology and accelerate the integrated operational capability (National Academies Press, 2004).
The disciplined process of identifying measures and data collection needs based on those decisions is discussed in detail in the ITEA article “Decision-Supporting Capability Evaluation throughout the Capability Development Lifecycle” (Beers, 2024). Often, these analysis, experimentation, and test activities are disconnected, and the operational warfighter receives fragmented information or is sometimes not informed of results at all. Accelerated acquisition programs in particular need to maximize data and analysis from every collection opportunity available to them. This can help various DoD organizations meet the DoD Directive on Capability Portfolio Management to “frame DoD-level capability decisions in a mission context to ensure delivery of integrated and innovative risk-informed solutions” (U.S. Department of Defense, September 2023) and get timely data to Combatant Command organizations to support their ability to integrate and operate the new technology.
It is important to note that Figure 2 was created from the phases of development, decisions, and challenges associated with new technology, or with integrating a commercial technology, for a complicated system (e.g., a new physical technology with deterministic outputs). The need for an iterative understanding of updates to DOTmLPF-P and CONOPS also applies to a complex system in which the system itself is dynamically changing (e.g., a technology using an advanced machine learning algorithm that adapts and changes continuously). The growing influence of software foundationally forces this paradigm shift to continuous iterative testing and to understanding effects on other parts of the system, or on other systems in the operational environment. Because a complex system is dynamic, the body of knowledge about DOTmLPF-P equities and CONOPS has a short half-life. Software (which is adaptable by design) can be updated quickly as tactics develop and the operational environment changes. In their 2022 article on software defining tactics, Weiss and Patt demonstrate a short cycle between build and operate, where the ability to quickly change software drives a trial, learn, adapt, trial philosophy. This creates an iterative cycle that continues to the right of the integrated operational capability in Figure 2, where “build prototype” is part of the cycle, followed by experimentation and test to validate functionality and gain some understanding of equities, and then ultimately fielding. A key resource necessary to successfully execute that cycle, highlighted in that article, is the creation of environments that allow automated tests to capture not only the functional “-ilities,” like interoperability, but also representations of the operational environment.
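As a rough illustration of that build-operate cycle, the sketch below loops through building a prototype, running automated tests against a represented operational environment, making a fielding decision, and then updating the environment representation itself. Every function, field, and threshold is a hypothetical stand-in assumed for illustration; Weiss and Patt describe the cycle, not this code.

```python
"""Minimal sketch of the trial-learn-adapt cycle paraphrased above from
Weiss and Patt (2022). All names and values are hypothetical stand-ins."""

def build_prototype(version: int) -> dict:
    # stand-in for a software build containing an adaptive component
    return {"version": version, "model": f"policy-v{version}"}

def automated_tests(build: dict, environment: dict) -> dict:
    # capture functional "-ilities" (e.g., interoperability) and behavior
    # against a representation of the operational environment
    return {"interoperable": True,
            "mission_effect": 0.70 + 0.05 * build["version"]}

def update_environment(environment: dict) -> dict:
    # the environment representation itself is iterated as tactics and the
    # operational context change (its knowledge has a short half-life)
    environment["tactics_rev"] += 1
    return environment

environment = {"tactics_rev": 0}
for version in range(3):  # trial, learn, adapt, trial
    build = build_prototype(version)
    results = automated_tests(build, environment)
    if results["interoperable"] and results["mission_effect"] > 0.75:
        print(f"field v{version}; update the DOTmLPF-P knowledge base")
    else:
        print(f"v{version}: adapt the build and re-trial")
    environment = update_environment(environment)
```

The design point is that the test environment is a first-class, versioned artifact updated in the same loop as the software, not a fixed backdrop.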
While software changes may not drive physical equities such as facilities or personnel changes, they can drive doctrine and training, where the scope of the change should be understood before deployment. A tragic example of a minor software change with significant training and safety implications was the Boeing 737 MAX, where a change to the Maneuvering Characteristics Augmentation System (MCAS) was treated as just a software change, but the lack of training and documentation on the change and its implications for the operation of the aircraft was a critical factor in two fatal crashes.
The Application of DOTmLPF-P Equities Across an Iterative Experimentation Campaign
Accelerated acquisition programs need to ensure their focus is not just on technical performance, but also on operational use and the users. DOTmLPF-P is well known and established, and it provides a framework that could help programs and other accelerated efforts ensure better transition to an operational context. A well-designed experimentation campaign can both help inform the value of a capability and provide the understanding of the operational integration equities to effectively prepare operational forces to utilize a given capability (see the callout box for a discussion of experimentation and experimentation campaigns).
Figure 3 below illustrates how a notional experimentation campaign could iteratively assess various DOTmLPF-P equities to inform future operational integration. The intent is to go beyond using operators in a demonstration to help inform operational value and instead deliberately build and adapt DOTmLPF-P equities through iterative interactions with operators as part of a holistic campaign. This does not require a complete understanding of DOTmLPF-P or a full set of data and analysis that could serve to slow an accelerated acquisition effort. Rather, the intent is to include DOTmLPF-P equities in an experimentation campaign, gathering as much knowledge of them as is practical in parallel with other activities instead of in a dedicated event before or after those activities.
| Experimentation and Experimentation Campaign
The 2023 U.S. DoD Mission Engineering Guide defines experimentation as: “Testing a hypothesis, under measured conditions, to explore unknown effects of manipulating proposed warfighting concepts, technologies, or conditions.” Experimentation explores more than technology performance. It is a means to explore and develop doctrine, organization, training, materiel, leadership, personnel, facilities, and policy (DOTmLPF-P) that collectively constitute the mission capability of a military force. It encompasses a spectrum of activities, such as studies and analyses, seminars and conferences, work by subject matter experts, war games, modeling and simulations, and small, focused experiments, as well as large field events with live forces. An experimentation campaign is a series of related activities that explore and mature knowledge about a concept or innovation of interest. The campaign develops the knowledge needed to inform major decisions about future forces and confirms that planned capability development and directions will enable forces to perform as expected (Standard, 2012). |
Figure 3 – Notional Example of DOTmLPF-P Equities Across an Experimentation Campaign
Consider doctrine as an example. It takes the DoD years to develop doctrine, but an initial set of rules of engagement and concepts of operation can be used in early mission analysis and experimentation campaigns and then updated to become an initial set of tactics, techniques, and procedures for operational use. The operational environment, operational constraints, and mission definition are all covered in the DoD Mission Engineering Guide, and these concepts are exercised in the execution of mission engineering. The opportunity space lies in connecting those mission engineering analyses with a broader experimentation campaign and with a connected set of measures and shared data, traceable to operational mission objectives, that both support operations and inform future capabilities.
An initial cadre of operators involved in early demonstrations can support not only the assessment of the technology against an operational problem but also the training, personnel, and organizational equities depicted in Figure 3 through targeted virtual and live experiments. For example, as a rapid fielding organization demonstrates autonomous swarm Unmanned Aerial System (UAS) technology, the initial cadre of operators can not only validate the technology’s operational mission value but also provide valuable feedback on the future training needed to employ the technology effectively, an understanding of the workload and the number of people required for employment, and initial inputs on the organizational changes needed for adoption and utilization of the technology.
The “m” in DOTmLPF-P is the additional materiel needed to support the system being fielded. Understanding what materiel is needed, and improving that knowledge during experimentation, helps posture the Service and Combatant Command to operate the new system. Leadership buy-in and engagement is an important aspect of successful technology transition and integration into operations. “Facilities” includes both the means to improve the capability and the means to store and sustain it. Understanding policy is essential to staying ahead of potential roadblocks to planned employment. Several of these DOTmLPF-P equities can require long lead times to have the targeted operational users postured to use the technology as a capability.
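As a simple illustration of tracking these equities across a campaign, in the spirit of Figure 3, the sketch below keeps a knowledge-maturity score for each DOTmLPF-P equity and flags lagging equities after each event so they can be prioritized in the next event design. The equity names come from the article; the maturity scale, events, and gains are invented for illustration.

```python
"""Hedged sketch of DOTmLPF-P knowledge tracking across a notional
experimentation campaign. Scores, events, and gains are illustrative."""

EQUITIES = ["Doctrine", "Organization", "Training", "materiel",
            "Leadership", "Personnel", "Facilities", "Policy"]

# knowledge maturity per equity: 0.0 = unknown, 1.0 = ready for fielding
knowledge = {e: 0.0 for e in EQUITIES}

# (event, knowledge gained per equity) -- notional campaign events
campaign = [
    ("mission engineering analysis", {"Doctrine": 0.2, "Policy": 0.1}),
    ("virtual experiment with operator cadre", {"Training": 0.3, "Personnel": 0.2}),
    ("small-scale live event", {"materiel": 0.3, "Organization": 0.2, "Training": 0.2}),
]

for event, gains in campaign:
    for equity, gain in gains.items():
        knowledge[equity] = min(1.0, knowledge[equity] + gain)
    # flag equities with little accumulated knowledge for the next event design
    lagging = [e for e, v in knowledge.items() if v < 0.2]
    print(f"after {event}: prioritize {lagging} in the next event design")
```

The point is the bookkeeping, not the numbers: each event both answers technology questions and deliberately grows the equity knowledge base.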
As shown at the top of Figure 3, these equities should be iterated on not just as part of a live experimentation campaign but by leveraging, and predominantly utilizing, digital tools and adaptable digital environments. The operational environment, and to some extent the operational user, can and should be represented in the digital environment, and this representation should be iterated over time as the operational environment changes. As Garcia and Tolk discuss in their 2013 paper on executable architectures enabling fit-for-purpose assessment, to answer the how and why of systems engineering in a digital environment, “the system’s architecture must be observed over time within its executable context.” They observed that while significant time is dedicated to developing an operational scenario for digital modeling, the representation of the operational framework’s context can remain static. As a system progresses toward operational deployment, it is necessary to represent not only the changes in the system but also the dynamic changes in the operational environment and in the approach of the operational user.
A clear challenge arises from this need to continually iterate on the representation of the operational environment, as the resources and time to do this manually do not exist. Thankfully, there are several tools and a growing body of work on leveraging modeling and simulation with artificial intelligence (AI). Tolk, Barry, and Doskey (2022) provide an excellent discussion of this dynamic ecosystem and how AI can be used to “identify endogenous or exogenous requirements for self-modification and self-organization of components, the computational creativity needed to compose such self-modified components into new systems, and the reasoning capability to select the most feasible composition.” Further, they point out that “AI can be used for the continual integration of the simulation environment and the system being developed to ensure tight coupling of the virtual and real systems to ensure accurate simulation results.” To be relevant and successful as a fielded capability, technology assessments should be iterative and should include representation of the operational context to build understanding and evolve DOTmLPF-P equities over time.
All the DOTmLPF-P equities could impact a technology’s ability to be used in combat operations, in terms of both the direction of technology development and the operational posture to receive the technology. Quickly updating combat capability and combat operations requires gaining, growing, and communicating our understanding of these DOTmLPF-P equities as we demonstrate, experiment, and test a new technology.
Conclusion
“Achieving success in these operational areas requires tightly linking our concepts and capabilities for operational forces.” – 2022 U.S. DoD National Defense Strategy
The pace of technical development is urgently driving the DoD to change its approach to requirements and acquisition from linear processes, time-consuming documentation, reviews, and milestones into iterative processes that accept risk and improve over time. Some accelerated acquisition efforts are not formal programs of record and may have only experimentation events to generate a functional and operational understanding of a system’s operational integration and performance. These trends drive the need for a disciplined, iterative approach to experimentation and test design across the development cycle to continuously build knowledge of system and operational integration over time.
To achieve the intent of the National Defense Strategy, the direction of the DoD Capability Portfolio Management and, most importantly, to prepare the warfighter to employ a given technology, the focus needs to be on not just delivery of the technology, but also the knowledge to posture the operational warfighter to accept it as an integrated operational capability. As noted in the ITEA article, Test and Evaluation as a Continuum, “While technology plays an important role in modern conflict, advances in technology do not generally make up for shortfalls in tactics and training, either. Accordingly, the DoD’s acquisition strategies should place a premium on interactions with operators throughout the entire development lifecycle (Collins and Senechal, 2023).”
The growth of complex adaptable systems (such as those driven by machine learning algorithms) will require an iterative approach to operational integration equities, since the operational context is not fixed. Complex systems also create additional challenges in representing that changing operational context well enough to understand those equities. Advances in the science of modeling and simulation, and the rapid development of artificial intelligence tools, make it feasible to quickly adapt the representation of the operational context and obtain relevant data.
When the DoD discusses the term “speed of relevance,” the relevance comes in the form of mission impact. The time frame for speed of relevance is not the time needed to deliver a capability to the field, but the time it takes for that technology to be integrated and to swiftly adapt our way of fighting. Doing that requires gaining, growing, and communicating our understanding of DOTmLPF-P equities as we demonstrate, experiment, and test a new technology. Speed is critical, but without a disciplined approach to collecting data early to close information gaps, focused on mission outcomes, that speed risks spending significant resources on technologies that are not relevant to overall mission success.
References
Beers, Dr. Suzanne. 2024. “Decision-Supporting Capability Evaluation throughout the Capability Development Lifecycle.” International Test and Evaluation Association (ITEA) 45 (1).
Bermiss, Dr. Hassan, interview by Hans Miller. 2025. Logistics Concerns on Fielding New Technologies (June 3).
Kotila, Brodi, Jeffrey A. Drezner, Elizabeth M. Bartels, Devon Hill, Quentin E. Hodgson, Shreya S. Huilgol, Shane Manuel, Michael Simpson, and Jonathan P. Wong. 2023. Strengthening the Defense Innovation Ecosystem. Research Report. Santa Monica, CA: RAND Corporation.
Collins, Christopher, and Kenneth Senechal. 2023. “Test and Evaluation as a Continuum.” International Test and Evaluation Association (ITEA) 44: 14.
Congressional Research Service. 2024. “DoD Replicator Initiative.” March 12. https://crsreports.congress.gov/product/pdf/IF/IF12611/1.
Dougherty, Col George M. 2018. “Promoting Disruptive Military Innovation: Best Practices for Experimentation and Prototyping Programs.” January. https://www.dau.edu/sites/default/files/Migrate/ARJFiles/arj84/ARJ84%20Article%201%20-%2017-782%20Dougherty.pdf.
Garcia, J.J. and Andreas Tolk. 2013. “Executable architectures in executable context enabling fit-for-purpose and portfolio assessment.” Journal of Defense Modeling and Simulation: Applications, Methodology, Technology 1-15.
Alberts, David S., and Richard E. Hayes. 2005. Code of Best Practice: Campaigns of Experimentation. Command and Control Research Program. http://www.dodccrp.org/files/Alberts_Campaigns.pdf.
Hicks, Kathleen. 2023. “The Urgency to Innovate.” Keynote address, Ronald Reagan Institute, Washington, D.C., August 28. https://www.defense.gov/News/Speeches/Speech/Article/3507156/deputy-secretary-of-defense-kathleen-hicks-keynote-address-the-urgency-to-innov/.
Mattis, J.N. 2018. “Remarks by Secretary Mattis on the National Defense Strategy. U.S. Department of Defense.” January 19. https://www.defense.gov/News/Transcripts/Transcript/Article/1420042/remarks-by-secretary-mattis-on-the-national-defense-strategy/.
Miller, Thomas H. 2017. “Requirements Management: The Need to Overhaul JCIDS.” Defense AT&L, January 30. https://dair.nps.edu/handle/123456789/3012.
National Academies Press. 2004. The Role of Experimentation in Building Future Naval Forces. Washington, DC: The National Academies Press. https://nap.nationalacademies.org/read/11125/chapter/2.
Standard, Todd. 2012. “Experimentation Integrated Warfighting Capabilities Task #5.” [PowerPoint slides]. NAVAIR.
Tolk, A., Barry, P. and Doskey, S.C. 2022. “Using modeling and simulation and artificial intelligence to improve complex adaptive systems engineering.” International Journal of Modeling, Simulation, and Scientific Computing 13 (2): 1-19.
U.S. Department of Defense. 2022. “2022 National Defense Strategy.” https://apps.dtic.mil/sti/trecms/pdf/AD1183514.pdf.
—. 2013. “MIL-HDBK-502A Product Support Analysis.” March 8. https://www.ldac.army.mil/api/resources/lec/downloads/standards/MIL-HDBK-502A.pdf.
—. 2023. “DoD Directive 7045.20 Capability Portfolio Management.” September 25. https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodd/704520p.pdf.
—. 2019. “DoD Instruction 5000.80 Operation of the Middle Tier of Acquisition.” December 30. https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodi/500080p.PDF?ver=2019-12-30-095246-043.
—. 2019. “DoD Instruction 5000.81 Urgent Capability Acquisition.” December 31. https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodi/500081p.PDF?ver=2019-12-31-133941-660.
—. 2021. “DoD Instruction 5000.85 Major Capability Acquisition.” November 4. https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodi/500085p.pdf?ver=2020-08-06-151441-153.
—. 2020. “DoD Instruction 5000.87 Operation of the Software Acquisition Pathway.” October 07. https://www.esd.whs.mil/Portals/54/Documents/DD/issuances/dodi/500087p.PDF?ver=virAfQj4v_LgN1JxpB_dpA%3D%3D.
—. 2023. “DoD Mission Engineering Guide.” October 01. https://ac.cto.mil/wp-content/uploads/2023/11/MEG_2_Oct2023.pdf.
—. 2021. “Joint Capabilities Integration and Development System (JCIDS) Manual.” October. https://www.dau.edu/cop/iam/documents/dod-jcids-manual-oct-21.
—. 2023. “National Defense Science and Technology Strategy.” https://media.defense.gov/2023/May/09/2003218877/-1/-1/0/NDSTS-FINAL-WEB-VERSION.PDF.
Weiss, J., and D. Patt. 2022. Software Defines Tactics: Structuring Military Software Acquisitions for Adaptability and Advantage in a Competitive Era. Hudson Institute. https://www.hudson.org/national-security-defense/software-defines-tactics-structuring-military-software-acquisitions.
Author Biographies
Hans Miller, Col, USAF (Ret.), is Chief Engineer for the Research and Advanced Capabilities Department and Senior Principal T&E SME at The MITRE Corporation. He has over 28 years of experience in combat operations, experimental flight test, international partnering, command and control, and transition strategies for defense weapon systems. Prior to MITRE, Mr. Miller was Division Chief of Policy, Programs and Resources at USAF Headquarters for Test and Evaluation. He commanded the 96th (now 406th) Test Group at Holloman AFB and the Global Power Bomber Combined Test Force at Edwards AFB, supporting B-1, B-2, and B-52 testing. He worked with international partners through a NATO assignment and as program manager of the DoD Foreign Comparative Test Program. Mr. Miller graduated from the U.S. Air Force Academy with a B.S. in Aeronautical Engineering and holds a Master’s in Aeronautics and Astronautics from Stanford University.
Dewey Classification: L 681 12

