2024 MDO Tutorial Abstracts

Tuesday, May 14

Morning Sessions
8:00 AM – 12:00 PM

T&E Fundamentals and CTEP Foundations

A one-day course on the fundamentals of Test & Evaluation, focused on the four CTEP domains, to better prepare participants for the ITEA Certified Test & Evaluation Professional (CTEP) exam.

In this high-level one-day course we will review the fundamentals of T&E that are covered in the CTEP Foundational exam. The course includes an introduction with a short history of T&E in the Federal Government and an overview of the DoD Acquisition Process (with a few references to DHS and FAA acquisition processes). The class then focuses on the CTEP Body of Knowledge (BOK), including the four subject domains:

  • Test and Evaluation Planning (Requirements Analysis, T&E Strategy, Evaluation Approach, Test Planning, T&E Cost Management, Contracting for T&E, Organizational Planning, Risk Identification and Management, and Specialized Types of Testing)
  • T&E Design (Test Adequacy and Scientific Test and Analysis Techniques)
  • Test and Evaluation Execution (Test Control Management, Data Management, and Test Safety / Certification)
  • Test Data Analysis, Evaluation and Reporting (Data Verification and Validation, Validation of Test Results, Evaluation, Reporting, Cyber Resilience / Cybersecurity Analysis, Model Validation, and Data Analytics)

The first two domains, Test and Evaluation Planning and T&E Design, will be covered in the morning. Test & Evaluation Execution and Test Data Analysis and Reporting will be covered in the afternoon.

Mission Engineering

Abstract coming soon.

Laser Systems Propagation T&E Challenges

An introduction to the challenges of testing and evaluating the propagation of Laser Systems. An overview of the basic physics and terminology of these systems is included. The unique propagation effects of Laser Systems are also discussed to provide a foundation for test objectives. Test and evaluation needs for Laser System propagation, including diagnostic beam propagation and atmospheric measurements, are briefly examined.

Cognitive Electronic Warfare

This tutorial will present an overview of how AI can be used in EW. Dr. Haigh will describe opportunities for using AI in situation assessment and electronic support (ES), and decision-making techniques for electronic protect (EP), electronic attack (EA), and electronic battle management (EBM). She will present AI techniques from Situation Assessment, Decision Making, and Machine Learning, and discuss tradeoffs.

Dr. Haigh will describe approaches to the important issue of real-time, in-mission machine learning, and evaluation approaches that demonstrate how a cognitive system learns to handle novel environments. The tutorial is intended to be a voice track to the 2021 book Cognitive Electronic Warfare: An Artificial Intelligence Approach (Artech US and Artech UK). The intended audience is RF practitioners (experts in EW, cognitive radio, and/or cognitive radar) who want to learn more about how and where to use AI. Our goal is to help triage and guide EW system designers in choosing and evaluating AI solutions. Cognitive EW is one of the critical advances that will determine the outcomes of future battles.

Digital Engineering

This short course / tutorial will review digital engineering concepts in general and then take a deep dive into specifics for test and evaluation (T&E) in a digital engineering environment. The course will review concepts, methods, tools, and best practices for five Digital Engineering topic areas: models, an authoritative source of truth, technological innovation, innovative infrastructure, and workforce. Each topic area will be addressed in general, followed by a discussion of specific issues and challenges for T&E. Discussion areas will include:

  • How the planning and evaluation components of T&E need to evolve in the DE environment, given Model-Based Systems Engineering, Mission Engineering, and automated testing.
  • The characteristics of T&E tools within the DE environment, and considerations and methods for automated tool selection.
  • Data access, data sharing, and hurdles for building an authoritative source of truth.
  • Special concerns for Cyber T&E in a Digital Engineering environment.
  • Digital Engineering infrastructure and infrastructure providers.
  • T&E workforce within a Digital Engineering ecosystem.
  • Gaps in current infrastructure, capabilities, workforce, etc.

This course is intended for T&E professionals who are new to Digital Engineering or are beginning to implement Digital Engineering in their T&E practices.   

 

Afternoon Sessions
1:00 PM – 5:00 PM

Fundamentals of T&E – Part 2: Test & Evaluation Execution and Test Data Analysis and Reporting

Part 2 will cover Test & Evaluation Execution and Test Data Analysis and Reporting.

Contracting Opportunities in T&E

This tutorial will provide the T&E professional an overview of the process for including T&E equities in acquisition contracting artifacts. The goal of this tutorial is not to make T&E professionals contracting experts, but rather to give them a keen understanding of their key role, responsibilities, and processes so that, as key players in this process, they can ensure T&E equities are included within acquisition contracts.

It is critical that our T&E professionals have a full understanding of their key role within the program contract development process. Without the T&E professional working side by side with the contracting team, there is no guarantee that T&E equities will be clearly articulated and communicated within the contracting documents. The T&E professional is the key to ensuring that T&E is included accurately, effectively, and clearly within program contract actions, thereby reducing confusion, misrepresentation, and unclear requirements.

There is a knowledge gap for our T&E professionals in this area, and it is for that reason that this tutorial is recommended.

Human Machine Integration in T&E

Human Systems Integration (HSI) is a field of systems engineering that is focused on addressing human performance and safety in the design, development, fielding, and operation of technology and systems. To determine if a solution is effective and suitable for use, test and evaluation events must be planned and conducted throughout the lifecycle. This seminar will provide T&E practitioners with an introduction to HSI and the important role it plays in test and evaluation. This will include a tool kit for implementing HSI in programs and will highlight methods and metrics associated with each domain area of HSI. This tool kit can be used to support the development of evaluation frameworks, parameters, thresholds, and objectives necessary for evaluating requirements to ensure HSI considerations are adequately addressed throughout the acquisition process.

JMETC 101

The Test and Training Enabling Architecture (TENA) was developed as a DoD CTEIP project to enable interoperability among ranges, facilities, and simulations in a timely and cost-efficient manner, as well as to foster reuse of range assets and future software systems. TENA provides for real-time software system interoperability, as well as interfaces to existing range assets, C4ISR systems, and simulations. TENA, selected for use in JMETC events, is well designed for its role in prototyping demonstrations and distributed testing.

Established in 2006 under the TRMC, JMETC provides readily available connectivity to the Services' distributed test capabilities and simulations, as well as to testing resources in the Defense industry. Incorporating distributed testing and leveraging JMETC-provided capabilities has repeatedly proven to reduce program risk, cost, and schedule. JMETC is a distributed LVC testing capability developed to support the acquisition community during program development, developmental testing, operational testing, and interoperability certification, and to demonstrate Net-Ready Key Performance Parameter (KPP) requirements in a customer-specific Joint Mission Environment.

JMETC uses a hybrid network architecture: the JMETC Secret Network (JSN), based on the SDREN, is the T&E enterprise network solution for Secret testing, while the JMETC MILS Network (JMN) is the T&E enterprise network solution for all classification levels and for cyber testing. JMETC provides readily available connectivity to the Services' distributed test capabilities and simulations, as well as to industry test resources. JMETC is also aligned with JNTC integration solutions to foster test, training, and experimentation collaboration.

TRMC Enterprise Big Data Analytics (BDA) and Knowledge Management (BDKM) has the capacity to improve acquisition efficiency, keep up with the rapid pace of acquisition technological advancement, ensure that effective weapon systems are delivered to warfighters at the speed of relevance, and enable T&E analysts across the acquisition lifecycle to make better and faster decisions using data that was previously inaccessible or unusable. BDA is the application of advanced tools and techniques to help quickly process, visualize, understand, and report on data. JMETC has demonstrated that applying enterprise-distributed BDA tools and techniques to T&E leads to faster and more informed decision-making that reduces overall program cost and risk.

TRMC has been working with the Joint Staff and Air Force JADC2 Cross-Functional Teams (CFTs) regarding JADC2 and Multi-Domain Operations (MDO) to inform them about TENA/JMETC and other TRMC capabilities that could be leveraged to support the emerging Joint Staff Joint Domain Environment (JDE). Additionally, TRMC has been engaged with Army Futures Command (AFC) throughout the year in a number of areas, including assessing TENA/JMETC support coupled with Big Data Analytics (BDA) and expanding OSD TRMC collaboration and cooperation to other mission areas, including but not limited to cyber, BDA, Knowledge Management (KM), Machine Learning (ML), and Artificial Intelligence (AI).

This tutorial addresses using the well-established TENA and JMETC tools and capabilities, combined with BDA tools and techniques, to reduce risk in an often-uncertain environment, regularly saving ranges time and money in the process.

Test and Evaluation of AI-Enabled Systems

The growth of Artificial Intelligence (AI) and Machine Learning (ML) enabled systems has presented unique challenges to the Test and Evaluation community. This course provides an introduction to AI/ML, discusses important issues for AI/ML, including bias and trust, and defines the unique challenges these systems pose for Test and Evaluation.