JUNE 2025 | Volume 46, Issue 2
IN THIS JOURNAL:
- Issue at a Glance
- Chairman’s Message
Workforce of the Future
- Encouraging Diversity in AI Test and Evaluation
Technical Articles
- Model Based Test and Evaluation Master Plan Technical Introduction
- Integrating RAG, HCD, and PD in MBSE for Mission Problem Framing
- Then What? The Need for Iterative Assessments to Achieve Successful Operational Capabilities
- Surpass the Adversary: Enhanced Mission Training through Digital Engineering
- Adaptive Algorithms for LIDAR Semantic Segmentation on Edge Devices
- 2025 AI in T&E Forum
- UK ITEA Event Summary
- AI and ML Methods in Verification and Validation
News
- Association News
- Chapter News
- Corporate Member News
Editorial – ITEA Journal – June 2025
DIVERSIFYING T&E WORKFORCE AND
DIGITISING T&E PLANNING
Introduction
Since ITEA’s ambitious 2025 event plans were announced, there has been growing uncertainty about whether US Government employees will be funded to attend educational symposia and workshops. Notwithstanding this uncertainty, locals in Washington were able to gather for the first ITEA event of 2025, on the testing of AI and ML, on 19 March, followed by the Defense and Aerospace Test and Analysis Workshop (DATAWorks) at Potomac Yard, Alexandria, on 22-24 April. Finally, free of the US Government restrictions, the European Chapter shook off its long winter with its inaugural workshop on ‘Accelerating the Pace of T&E’ in Portsmouth, United Kingdom, on 20-21 May. For those who could not attend, Erwin Sabile has provided an overview of the AI ML T&E event, while Cathy O’Carroll and Andy Cunningham have provided an overview of the UK event on accelerating the pace of T&E. We also have two guest editors curating the technical contributions from the successful DATAWorks for our September edition: Dr Madeline Stricklin of Los Alamos National Laboratory and Vicky Nilsen of NASA Headquarters. They are already peer-reviewing a record number of submissions from the many researchers who presented their developments in defense and aerospace test and analysis.
Technical Articles – Diversifying T&E Workforce
This edition has two important themes. The first, central to ITEA’s mission, is diversifying the T&E workforce. As Western populations age, the entry-level workforce is shrinking, so professions must appeal to every possible entrant in order to thrive. T&E aims to inform decision-making, especially for engineered systems with safety implications in Government, but to do so efficiently it requires entrants interested and competent in Science, Mathematics and Engineering Careers (SMEC), and increasingly in AI and ML. For a myriad of reasons that our authors explore, young women are often reluctant to take up such career pathways, holding myths about these professions and being largely unaware of their benefits. Like a good preview test, faculty and staff from Virginia Tech ran exposure workshops in AI and ML for female graduates, writing up the reasons and findings in an article titled ‘Workforce Development: Encouraging Diversity in AI Test and Evaluation’. The article was led by Danielle Kauffman, whom we are especially proud to publish in this journal, as she spent many years as the assistant editor before handing the role to Viruben Watson last year. Similarly, staff from the University of New South Wales in Canberra, Australia, led by Dr Olga Zinovieva and Dr Li Qiao, continue to run the Young Women in Engineering (YoWIE) event, writing up their reasons and findings in a second article titled ‘Empowering Future Female Engineers through Early STEM Engagement’. Unfortunately, this second article still requires ethics approval and so has been delayed until our December edition. Imagine the future of T&E if these efforts were replicated at scale across all universities and industry.
Technical Articles – Digitising T&E Planning
The second theme of this edition is an increasingly dominant one for this journal: digital reform of how engineered systems are developed and thus tested and evaluated. Digitisation and digitalisation are multi-faceted, and this edition offers a great sample of those facets. As anyone familiar with digital engineering knows, one constant objective, and cultural challenge, especially in Government bureaucracies, is to replace document-centric decision-making with a digital dashboard of the same stakeholder needs, artifacts, contributions, and consensus. Along with every other document, the shrine of the T&E Master Plan (TEMP) will be digitalised. Our third article, led by Dr Craig Arndt of the Georgia Tech Research Institute, is titled ‘Model Based Test and Evaluation Master Plan Technical Introduction’. It reports significant research into a model-based TEMP reference architecture that integrates and links data from multiple digital models to a standardised set of acquisition, technical, and T&E decisions. We hope to run future articles on the validation of this approach in departments.
Our fourth article is by Rafi Soule, a doctoral student in Systems Engineering at Old Dominion University (Norfolk, Virginia). Her research article is titled ‘Integrating Retrieval-Augmented Generation (RAG), Human-Centered Design (HCD), and Participatory Design (PD) in Model-Based Systems Engineering (MBSE) for Mission Problem Framing.’ Mission engineering is arguably the birthplace of all digital engineering and thus the touchstone by which stories frame the sprints and scrums, end-to-end scenarios shape development, and representative users test and evaluate design iterations. ODU is leading mission engineering in the US, particularly for maritime systems, and we are therefore very proud to feature Rafi’s work assisting Military Sealift Command.
Our fifth article is by Hans Miller of MITRE, an experienced test pilot, program manager, and director of large flight and ground test organisations. Hans’ article is titled ‘Then What? The Need for Iterative Assessments to Achieve Successful Operational Capabilities.’ Modern, highly networked defence force capability sets are systems-of-systems: multi-proprietary and multi-generational. As new technologies are integrated with increasing software functionality and AI, the pace of change and iterative development accelerates. In such a digital engineering environment, just as with T&E Master Planning earlier, it is easy to lose sight of essential supportability considerations. Hans’ article discusses the necessity of early and continuous data collection and experimentation to inform decision-making and emphasizes the need to grow an understanding of Doctrine, Organization, Training, Materiel, Leadership, Personnel, Facilities and Policy (DOTmLPF-P) equities throughout an experimentation campaign. His proposed approach ‘ensures that new technologies are not only fielded but also that the warfighter is postured to successfully integrate these technologies into operations.’
Our sixth article is by the current Assistant Editor, Viruben Watson, a highly experienced Systems, Software, and Flight Test Engineer. ‘Rubes’ has titled his article ‘Surpass the Adversary: Enhanced Mission Training through Digital Engineering.’ In a thread similar to Hans Miller’s article, he focuses on several of the DOTmLPF-P elements by using ‘conceptual modelling to parameterise training systems and mission requirements as objects represented by attributes of performance and function.’ As envisaged by Rafi Soule, he applies ‘digital engineering and mission engineering techniques’ to ‘analyse the attributes in the mission thread design, and build a set of relationships between the training system characteristics and the mission attributes.’ He then uses those threads to relate training system characteristics to the likelihood of fulfilling the mission objectives, allowing him to optimise the mission. Finally, he transforms the support design from a conceptual framework into an applied trainer or procedure. His case study is the ominously topical ‘defence of commercial shipping against an attacking USV swarm.’
Our seventh article is led by Dr Billy Geerhart of the Army Research Laboratory and titled ‘Adaptive Algorithms for LIDAR Semantic Segmentation on Edge Devices.’ Robotic systems increasingly use LIDAR for situational awareness and path planning to avoid obstacles at speed, something they must do with the limited computational resources of ‘edge computing’. The Army Research Laboratory team previously optimised an algorithm to achieve a 3x speedup, enabling real-time inference on LIDAR and image data; they have now produced adaptive algorithms that expose ‘three new hyperparameters – image size, LIDAR resolution, and k-nearest-neighbors (KNN) – from the existing hard-coded network structure.’ They ‘demonstrate that the adaptive hyperparameters can reduce inference time from 90ms to as low as 25ms, while maintaining target performance in computationally intensive scenarios.’ This article should certainly get you in the mood for September’s special edition on the Defense and Aerospace Test and Analysis Workshop (DATAWorks).
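For readers unfamiliar with the idea of adaptive hyperparameters, the following is a minimal, purely illustrative sketch (not the authors’ actual code) of the underlying trade-off: given a profiled table of configurations, pick the most accurate one whose latency fits the current compute budget. The profile numbers and function names here are hypothetical.

```python
# Hypothetical sketch of adaptive hyperparameter selection for edge
# inference. Each entry pairs a (image_px, lidar_points, knn_k)
# configuration with an illustrative profiled latency and accuracy;
# none of these numbers come from the article itself.
PROFILE = [
    # (image_px, lidar_points, knn_k, latency_ms, accuracy)
    (512, 65536, 7, 90.0, 0.92),
    (384, 32768, 5, 55.0, 0.90),
    (256, 16384, 5, 38.0, 0.87),
    (192, 8192, 3, 25.0, 0.83),
]

def select_config(latency_budget_ms):
    """Return the highest-accuracy configuration whose profiled
    latency fits the budget, or the fastest one if nothing fits."""
    feasible = [c for c in PROFILE if c[3] <= latency_budget_ms]
    if not feasible:
        return min(PROFILE, key=lambda c: c[3])
    return max(feasible, key=lambda c: c[4])

# With a 40 ms budget, the 256-pixel configuration is chosen.
print(select_config(40.0))
```

The design choice worth noting is that the selection is driven by measured profiles rather than hard-coded settings, which is the essence of adapting to ‘computationally intensive scenarios’ at runtime.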
Technical Note
Our final contribution is the first Technical Note of my tenure as Chief Editor. Such notes present partially complete work, theoretical or applied, that has not yet been fully connected to the knowledge frontier or validated with peers, such as through a case study. We anticipate publishing these to invite collaboration toward concluding them as completed research papers. Our technical note is by John Frederick, Director of Innovation and Testing Strategies at Veracity Engineering, titled ‘AI and ML Methods in Verification and Validation: Operationalizing Advanced Concepts Through Digital Twin Technologies.’ His work provides verification and validation strategies and solutions to support the FAA and other government missions. He argues, ‘Digital Twin technology, powered by Artificial Intelligence (AI) and Machine Learning (ML), is a game-changing approach that enables real-time decision-making, predictive analytics, automation, and optimization.’ Similar to our articles by Rafi and Viruben, he argues digital twins are the critical wherewithal so ‘stakeholders can simulate, analyze, and refine complex operational strategies within risk-free virtual environments’ for ‘a more robust and efficient approach to system verification.’ Please reach out to John if you are interested in working with him to fully research and report his note.
Closing
This edition marks the half-way point of my three-year term as Chief Editor alongside our dedicated ITEA team. While I lose a portion of every week to this volunteer role, I am enriched and renewed by what I read, and I hope you are too. I am also very proud of the younger minds stepping up to contribute articles and to edit; there are capable volunteers ready to take over in 2027. With that thought, I will cede the next edition’s editorial to our guest editors, Dr Madeline Stricklin and Vicky Nilsen, to cover the contributions from the Defense and Aerospace Test and Analysis Workshop. The peer reviews they are receiving are a heartening voluntary collaboration by many like-minded but diverse researchers, all doing very challenging T&E on complex technologies.
Dewey Classification: L 681 12


