MARCH 2026 | Volume 47, Issue 1

Building Trust in Autonomous Systems through an Alternative Test Strategy

CDR Ryan “PC” Agte, USN, CSEP

Aerospace Engineering Duty Officer, Program Manager, and Systems Engineer in Naval Air Systems Command (NAVAIR)

Jean Paul “JP” Santos, Ph.D.

Chief Innovation Officer (AISD)
Chief Engineer, Point Mugu Sea Range Future Capabilities Office / Lab (FCO/L)
Naval Air Warfare Center Weapons Division (NAWCWD)

Gregory “Bo” Marzolf, Ph.D., Col, USAF (Ret.), PMP

Associate Professor of Systems Engineering, Colorado State University

DOI: 10.61278/itea.47.1.1004

Abstract

Program management personnel inside the Department of War (DoW) are expected to deliver the development, fielding, and sustainment of systems that meet performance requirements while remaining within the schedule and fiscal constraints placed on them by higher echelons of command. This responsibility is a juggling act in which the fiscal element receives great scrutiny and drives program management behaviors that favor proven technologies over experimental opportunities: their dependability in schedule and cost allows for stronger arguments when defending a budget. Program Managers want to trust the technology they are funding. This is not an incoherent methodology, as it allows for progress even in fiscally strict periods and consistently moves a new system through the acquisition process, but it does deprioritize technological advancement. This paper discusses an approach, and flight experimentation, that works with the current acquisition system by increasing the technological readiness of an autonomous safety system via a low-cost alternative test strategy, allowing Program Managers to include edge technologies in acquisition planning.

Keywords: Trust; Autonomous Systems; Safety; Program Management.

Introduction

Program Managers (PMs) in the Department of War have a unique and daunting responsibility to convert dollars into military capability. The term Program Manager here refers to any decision authority in the acquisition chain of command, not just those with the formal title. The word capability is used broadly, as it could refer to anything required to meet our fighting force’s needs. When it comes to the sophisticated and complex technologies that surround aircraft or weapons systems, PMs must weigh alternative subsystem risks and opportunities to deliver the desired performance within the expected cost and schedule boundaries. These risk tradeoffs and procurement path decisions drastically affect acquisition timeline uncertainty and determine the long-term performance capabilities of the system. An example that is widely known, at least topically, especially in Navy circles, is the adoption of the Electromagnetic Aircraft Launch System (EMALS) on the new Ford-class aircraft carriers. EMALS is a new technology integrated into the new class of nuclear-powered carriers, performing a function previously allocated to a steam-based system. Unfortunately, it is widely known because of the delays associated with incorporating the technology into the new carrier. At the outset of designing the ship, Program Managers faced a trade-off decision like the one previously described: the traditional, well-known, steam-based system, or the advanced, unproven, electromagnetic system. The steam system would have been as completely and technically informed as a system can be; data on maintainability, supportability, reliability, availability, and manpower requirements would have been well documented. From a cost, schedule, and performance perspective, the steam system would therefore have been highly trusted, yet possibly inadequate for the requirements of the future ship.
The new electromagnetic technology, which General Atomics describes as using “electromagnetic technology to launch aircraft from the deck of naval aircraft carriers” and as offering “significant benefits over current launch systems,” with improvements listed as “increased launch operational ability, flexible architecture to suit different platforms, capable of launching a wide range of aircraft weights, reduced manning and lifecycle costs, [and] reduced thermal signature,” would be an untrusted but potentially more adequate system. [1] The decision space between the two alternatives, explored through “technical risk assessments,” is whether the traditional system meets the needs with known cost, schedule, and performance in hand, or whether it is an acceptable risk to move forward with new technology to capture performance gains along with the associated uncertainty in cost and schedule. [2] It is important to note that the complexity of this particular trade-off is far more in-depth than discussed here; it is used in simplified form to introduce the concept.

This programmatic judgment call is fundamentally driven by a technology’s maturity, often quantified by Technology Readiness Levels (TRL). A low TRL, as was the case with the EMALS system during its initial selection, signifies higher risk in cost and schedule but promises greater performance. Conversely, a high TRL offers reliability at the potential cost of future capability. The core challenge for any PM is navigating this trade-space. This paper directly addresses that challenge within the context of missile telemetry modules and autonomous flight termination systems (TM/AFTS). It presents a methodology designed to significantly reduce programmatic risk by increasing the TRL and creating robust performance documentation for these critical technologies before key acquisition decisions must be made. By maturing the technology early and independently of the missile’s test and evaluation strategy, this approach aims to provide decision-makers with the confidence to adopt advanced capabilities without incurring the prohibitive schedule and cost overruns that have damaged past programs.

Technology Readiness vs Programmatic Risk

TRLs, shown below in Figure 1, are defined by NASA as “a type of measurement system used to assess the maturity level of a particular technology”. [3] TRLs have been in use at NASA since the 1970s but were formally defined in 1989 and have since been normalized across engineering domains as the standard of measure, with slight verbiage adjustments depending on the user. [4] The US Government Accountability Office maintains a manual of best practices for assessing the readiness of technology, and in Appendix IV lists multiple TRL scales depending on the industry that uses them. [5] The measurement helps us understand how far along any particular technology is, with 1 being the lowest level and 9 the highest. There is some subjectivity between levels depending on the technology, the fidelity of any prototypes, and the environments they have been tested in; however, it is relatively straightforward for an informed expert to codify the TRL of a technology. In practice, engineers familiar with a technology can confidently place it at a specific TRL, or at least somewhere in a band of two TRLs, like level 6 or 6/7.

Figure 1: Technology Readiness Levels [6]
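To make the scale concrete, the sketch below encodes one-line paraphrases of the nine NASA TRL definitions (the exact wording varies by agency, as noted above) and renders the informal “band of two TRLs” notation; the paraphrases and helper name are illustrative, not an official codification:

```python
# One-line paraphrases of the nine NASA TRL definitions; agencies adjust
# this verbiage, so treat these strings as illustrative, not authoritative.
TRL_DESCRIPTIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "System/subsystem prototype demonstration in a relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def describe_trl(low, high=None):
    """Render a single TRL, or a two-level band such as '6/7'."""
    high = high if high is not None else low
    levels = range(low, high + 1)
    label = "/".join(str(n) for n in levels)
    details = "; ".join(TRL_DESCRIPTIONS[n] for n in levels)
    return f"TRL {label}: {details}"
```

For example, `describe_trl(6, 7)` yields the “6/7” band assessment mentioned in the text.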

The programmatic risk associated with a TRL is not nearly as easy to define, and it varies greatly depending on the situation. Subsystems that need to be highly reliable, or that set precedent for the integration of other subsystems, will have a higher impact on programmatic risk than others. This type of analysis is nuanced and requires programmatic skill coupled with an understanding of the technology itself and its specific challenges. The overall goal is to gain an understanding of the technologies in question and how they integrate into an overall development plan.

Figure 2 below shows three notional programs and how each is affected differently as TRL increases. Numerous factors determine the programmatic risk associated with a technology, such as the phase of the program, the criticality of the technology, and the risk posture of the program as a whole. For example, a safety system critical to the direct prevention of injury to the personnel that use the system may look like Program A below: the technology is not considered usable until it reaches a high TRL, with little flexibility. A different use case could have a profile like Program A due to the phase of the program, as when the program is near an acquisition milestone and progress will stall without achieving a certain TRL. It is the same curve as the previous example, but for very different reasons. Another important point in the figure is that TRL is mostly objective, with defined boundaries between levels, whereas programmatic risk is largely subjective but still has to be quantified by the program team. The combination of all factors creates the data set the program team uses to inform risk decisions.

Figure 2: Risk vs TRL
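The notional curves in Figure 2 can be mimicked with simple monotone functions of TRL; the logistic shape, steepness values, and thresholds below are invented for illustration and are not drawn from any real program data:

```python
import math

def notional_risk(trl, steepness, threshold):
    """Notional programmatic risk (0..1) that falls as TRL (1..9) rises.

    A logistic drop-off centered on `threshold` mimics a profile like
    'Program A' in Figure 2: risk stays high until a late TRL, then falls
    sharply. A smaller `steepness` gives a gentler, more forgiving curve.
    """
    return 1.0 / (1.0 + math.exp(steepness * (trl - threshold)))

# A safety-critical subsystem (Program A style): little risk relief until TRL ~7.
program_a = [round(notional_risk(t, steepness=3.0, threshold=7.0), 2)
             for t in range(1, 10)]
# A more risk-tolerant effort: risk eases steadily from the mid TRLs.
program_b = [round(notional_risk(t, steepness=1.0, threshold=5.0), 2)
             for t in range(1, 10)]
```

Plotting such curves side by side is one way a program team could make its largely subjective risk judgments explicit enough to compare.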

Figure 3 below shows the Department of War acquisition process broken into phases, depicted by the colored boxes; the Milestone Decisions, depicted by triangles; and the other reviews and decision points throughout. [7] Program management teams use this process to acquire new systems and must remain in compliance throughout the acquisition. That adherence constrains when it is appropriate to take different levels of risk. In general, if the program is in the Technology Maturation and Risk Reduction phase prior to Milestone B, it is easier to accept the risks of lower-TRL technologies, as one purpose of the phase is to identify useful technologies and mature them. [8]

Figure 3: Acquisition Milestones [9]

Determining acceptable risk and adopting a specific acquisition strategy and the technologies it will use is the hard part. For the complex systems that the DoW procures, it requires large teams of experts across all functional areas to build a program plan that can be executed successfully.

Autonomous Safety Systems for Beyond Line-of-Sight Testing

The subject technologies of this effort are the telemetry and autonomous safety systems used on missile systems. These subsystems are integrated onto weapon systems to allow their capabilities to be tested while simultaneously allowing for safe operations. Data collection is acquired via telemetry, and safety is controlled via flight termination in the case of off-nominal trajectories that may endanger non-participants. Like the EMALS example above, the DoW faces a similar situation with flight safety systems. Missile acquisition strategies until now have relied on line-of-sight technologies that are becoming obsolete, given performance requirements that go beyond line of sight and the advent of autonomous safety systems in use on vertical launch systems today. [10] A program management team faces significant constraints when accepting the risks associated with a new technology, particularly when a missile system’s test strategy is completely dependent upon it. This effort was undertaken to address that challenge by simultaneously increasing the system’s TRL through in-flight testing and decreasing the risk for program managers by demonstrating the technology’s capabilities. This strategy of proving the capability of the TM/AFTS provides a feasible alternative for incorporation into a Program Manager’s acquisition and test plan. The TM/AFTS test team’s goal was to execute a test event and document TM/AFTS performance that would allow missile program decision makers to determine that incorporation of this new technology is an acceptable risk for the opportunity it provides.

The Alternative Concept and Flight Test Strategy

The strategy of the autonomous system development team was to find a way to execute a live test flight plan at a dollar amount that was easy to resource but would provide a meaningful data set and a TRL increase for the TM/AFTS. In May of 2024, at the International Test and Evaluation Association event on Test and Evaluation Challenges for Beyond Line of Sight, where one of the authors, Dr. JP Santos, was a Technical Chair, the Technology Transfer Program Officer at NASA’s Armstrong Flight Research Center (AFRC) on Edwards Air Force Base presented the AFRC’s unique capability to get low-TRL technology into flight. NASA leverages its small fleet of highly adaptable aircraft with customizable interfaces and a staff of test experts to allow this. [11] After this presentation, a plan was made to bring the Navy’s technology to AFRC’s aircraft and demonstrate the TM and AFTS capability to send a termination signal any time the aircraft violated a pre-determined mission-planning boundary. This flight occurred on September 13, 2025, at a cost more than 50% less than that of similar tests performed via traditional methods.

The central concept of this effort is an application similar to the gated process put forth in the DoD 5000 process discussed earlier; however, it is purposefully decoupled from a traditional, linear acquisition program and timeline. In conventional defense programs, milestone reviews are critical events where a subsystem failure can trigger cascading schedule delays and cost overruns across the entire program. Systemic pressures within the acquisition process to pass milestone reviews can create harmful incentives for programs to avoid risk even when new technology offers superior performance. [12] The approach detailed here mitigates this risk by creating an independent, iterative test loop that provides the freedom to discover failures, redesign, and re-test a developing system without the immense pressure of holding up dependent programmatic milestones. It is a method to advance the technology of a critical subsystem without forcing the program manager to make the acquisition plan dependent on that subsystem unless desired. The novel instantiation of this concept was demonstrated through a test plan to mature a missile’s TM/AFTS by using a crewed, fixed-wing aircraft as a dynamic surrogate testbed. This plan moved the system beyond the laboratory and into a relevant flight environment. The test involved installing the complete TM/AFTS onto a NASA aircraft and executing flight profiles designed to intentionally approach and violate pre-defined virtual boundaries, or “geofences,” simulating a missile deviating from its intended course. The primary technical objective was to verify the system’s end-to-end logic: its ability to correctly detect a boundary breach, generate the corresponding termination command, and successfully transmit that command. This live demonstration, conducted on a recoverable asset, provides invaluable performance data, increases the system’s TRL, and validates this as a repeatable, cost-effective strategy for lowering the subsystem risk associated with a burgeoning safety system technology.
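The boundary-breach half of that end-to-end logic can be illustrated with a textbook point-in-polygon test; the corridor coordinates and function below are a minimal sketch for illustration, not the actual mission-planning boundaries or the flight-safety code used in the test:

```python
def inside_geofence(point, polygon):
    """Ray-casting point-in-polygon test.

    `point` is an (x, y) pair and `polygon` a list of (x, y) vertices,
    e.g. projected lat/lon coordinates. Returns True when the point lies
    inside the fence; a real AFTS would also handle altitude, datum, and
    geometric edge cases far more carefully.
    """
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Count edge crossings of a ray cast leftward from the point.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# Hypothetical rectangular flight corridor: a breach occurs when the
# vehicle's reported position leaves it.
corridor = [(0.0, 0.0), (10.0, 0.0), (10.0, 4.0), (0.0, 4.0)]
assert inside_geofence((5.0, 2.0), corridor)      # on course
assert not inside_geofence((5.0, 6.0), corridor)  # boundary violated
```

In the flown test, detecting such a violation would be followed by generating and transmitting the termination command, the other two legs of the end-to-end verification.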

Autonomy Specifically

The concepts discussed above are meant to apply broadly to any technology that is a candidate for integration into a larger system. However, the specific testing conducted here was to increase the TRL of, and therefore trust in, an autonomous system; more impactful still, an autonomous safety system. As the Department of War progresses into procurement strategies for autonomous systems, like the US Air Force’s YFQ-42A and YFQ-44A, questions arise about how test teams and forward operating aircrews will interact with these systems and how it can be proven that those interactions will occur safely. [13]

This challenge is central to the field of safety assurance, which focuses on creating a “safety case,” or argument, to formally prove that autonomy-based systems can be trusted in critical applications. [14] Autonomous safety systems for missile and weapons testing are experiencing the same issues, so avenues to increase trust in these systems via innovative pathways will play a critical enabling role in allowing program decision makers to include them in their procurement strategies. Currently, if a DoW Program Manager has to choose between a legacy technology and a new autonomous technology, even at the same TRL, it is easier to choose the legacy option, even with guaranteed lower performance. The test effort discussed below is an example meant to change that paradigm and give those decision makers a more informed risk decision. Legacy Telemetry Module / Flight Termination Systems (TM/FTS) and TM/AFTS systems are at exactly this juncture. As discussed in the December 2025 ITEA Journal article “Modern Beyond Line-of-Sight T&E with Autonomous Systems,” the limited but highly trusted legacy system cannot perform at the level of its autonomous counterpart; however, AFTSs are not yet being incorporated broadly in the DoW. [10]

Technology in Test & Synonymous Interfaces

On a missile system, both the telemetry and flight termination systems are operated separately but must work in close conjunction. Traditionally, the telemetry system is a fixed-function device, requiring its operational parameters like data rate, modulation, and center frequency to be configured in hardware long before a mission, in concert with the spectrum managers at a given test range. The flight termination system traditionally involves a separate set of hardware, including dedicated receivers and safe/arm devices, and relies on a human-in-the-loop to transmit termination commands over a pre-planned, fixed frequency.

This work, however, leverages a modernized architecture that replaces these legacy components with more capable and flexible systems. The flight termination function is handled by an AFTS [15] [16], which utilizes the Space Force’s Core Autonomous Safety Software (CASS). This system from Sagrad Inc. makes termination decisions autonomously based on pre-programmed GPS boundaries and flight rules (e.g., boundary violation, gate crossing), eliminating the need for a human-in-the-loop and the associated ground infrastructure for command transmission. The primary system-level benefit is the significant Size, Weight, and Power (SWaP) savings realized by consolidating functionality.

A critical aspect of this program is the integration of an advanced, government-developed telemeter, which represents a significant technological advancement. This system’s modern architecture is the key enabler for its enhanced flexibility and superior performance, leveraging Software Defined Radio (SDR) technology. [17] [18] [19] This state-of-the-art telemeter provides a versatile platform capable of transmitting complex data streams in a manner both sufficient to meet strict safety requirements, while simultaneously providing design advantages not available on legacy systems. [20] During the live flight test, the System Under Test (SUT) experienced the data flows required in a live operational environment, which validated its overall effectiveness and advanced design.

The combination of an AFTS with an SDR-based telemeter creates a highly integrated and agile system. The SDR provides the platform needed to transmit complex data streams, including the real-time status of the AFTS, while the AFTS reduces SWaP and reliance on legacy ground infrastructure. This modernized architecture moves away from the constraints of fixed-hardware systems and enables a more dynamic, efficient, and future-proof approach to missile and flight test instrumentation. Fundamentally, such technologies and architectures enable the use of common elements in future and parallel platform Test & Evaluation efforts. For this test effort, each of the interfaces that the AFTS and SDR-based telemeter would interact with on a live missile flight was replicated by a synonymous interface on the NASA aircraft. Although mounting the system inside a fixed-wing aircraft differs from installing it inside the body of a missile, the SUT experienced the same data flows it would if it were inside a weapon.

Flight Test Execution and Results

Overall, the flight test was designed to validate the integrated performance and reliability of the SDR-based telemeter and the AFTS in a live, operational environment. The test was executed with hardware on an operational test range, with AFTS boundaries that mimic a true flight corridor. Working with NASA’s Armstrong Flight Research Center on Edwards Air Force Base, CA, the SUTs were demonstrated through flight test on a fixed-wing aircraft. The execution and subsequent results were an unqualified success, proving the viability of this modernized technology suite for complex Test and Evaluation (T&E) applications.

Prior to the flight, both the AFTS and the telemeter were programmed with mission-specific parameters, a process that highlights the flexibility of the architecture. The AFTS was loaded with a mission data file containing the safety rules and operational boundaries relevant to a missile flight profile. These rules include the precise GPS coordinates for the main flight boundary and specific “gate crossing” locations. The unit was programmed to execute specific actions based on its real-time position relative to these zones, such as terminating the flight upon a boundary breach, disabling after a gate crossing, or terminating after a prolonged loss of GPS signal. The SDR telemeter was configured by pre-loading mission data required for the operation, including settings to handle AFTS status and GPS position data in a single cohesive system to execute the flight test.
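The rule behaviors described above (terminate on a boundary breach, disable after a gate crossing, terminate after a prolonged GPS outage) can be caricatured as a small priority-ordered decision function; the names, the ordering, and the 10-second GPS limit are assumptions for illustration and do not reflect the actual CASS mission data file:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    inside_boundary: bool       # within the main flight corridor
    crossed_safe_gate: bool     # passed a pre-defined "gate crossing"
    seconds_without_gps: float  # time since last valid GPS fix

def afts_action(state: VehicleState, gps_loss_limit: float = 10.0) -> str:
    """Evaluate notional AFTS rules in an assumed priority order.

    Mirrors the behaviors described in the text: disable after a gate
    crossing, terminate on a boundary breach, and terminate after a
    prolonged GPS outage. The 10-second limit is an invented placeholder.
    """
    if state.crossed_safe_gate:
        return "DISABLE"    # past the gate: termination capability stands down
    if not state.inside_boundary:
        return "TERMINATE"  # boundary violation
    if state.seconds_without_gps > gps_loss_limit:
        return "TERMINATE"  # prolonged loss of GPS
    return "ARM"            # nominal: stay armed and keep monitoring
```

Loading the mission data file amounts to supplying the boundary coordinates, gate locations, and thresholds that such a function evaluates against real-time position data.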

The telemeter demonstrated an exceptionally stable and robust RF link, with a data capture rate exceeding 99.99% at an operationally representative range and elevation. The near-perfect link stability was maintained throughout all phases of flight, with the few instances of data drops being predictably correlated with aircraft banking maneuvers (antenna shadowing), not system failure. The AFTS performed flawlessly, passing all configured test sets, correctly issuing termination commands after boundary crossings, initiating a termination on simulated GPS loss, and properly disabling its capability upon crossing a pre-defined gate.
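For a sense of what “exceeding 99.99%” implies, capture rate is simply frames received over frames expected; the frame rate and sortie length below are hypothetical numbers chosen for illustration, not parameters of the actual test:

```python
def capture_rate(received, expected):
    """Telemetry data-capture rate as a percentage of expected frames."""
    return 100.0 * received / expected

# Hypothetical: at 1,000 frames per second over a 20-minute sortie, a
# 99.99% capture rate allows at most 120 dropped frames out of 1,200,000.
expected_frames = 1_000 * 60 * 20
rate = capture_rate(expected_frames - 120, expected_frames)
```

At that scale, even brief antenna-shadowing dropouts during banking maneuvers consume a noticeable share of the allowable loss budget.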

Beyond pure technical validation, the data and results gathered from this flight test serve as a critical decision enabler. The unequivocal success and high-fidelity performance data, gathered in a relevant scenario, provide the necessary confidence to engage with key stakeholders. This data facilitates substantive meetings with systems engineers and Program Offices to determine possible candidate Programs of Record for technology transition and to establish potential timelines for integrating this advanced SDR and AFTS architecture into future systems. This test has effectively paved the way for the next generation of range telemetry and safety systems.

Conclusion

The concepts, methodology, and resulting flight test data presented in this paper demonstrate an impactful strategy for introducing technological advancements into acquisition strategies while accounting for present constraints. For Program Managers, tangible test data gathered in a relevant environment enables in-depth feasibility analysis, preventing a promising technology from being dismissed prematurely due to programmatic risk. This approach resulted in a successful test and data capture at less than half the cost of traditional methods, while providing data of comparable applicability. With these results, subsequent discussions with program offices have created new opportunities for this technology to be considered a viable option in future procurement strategies.

Acknowledgements

Michelle Agte
Cheri Hamilton
Amanda Small
CAPT Todd “Toby” Keith, USN
Greg “Crewser” Crewse
Felipe Jauregui, SSTM
Leonard “Lenny” Meuse
John Johnson
Andy Corzine, SSTM
Thomas Dowd, SES

Distribution Statement A: Approved for public release; distribution is unlimited. NAWCWD PR26-0011.

Works Cited

[1] General Atomics, “Aircraft Launch and Recovery Systems,” 2026. [Online]. Available: https://www.ga.com/alre.

[2] Department of Defense, “DOD INSTRUCTION 5000.88,” Office of the Under Secretary of Defense for Research and Engineering, Washington D.C., 2020.

[3] C. G. Manning, “Technology Readiness Levels,” NASA, 27 9 2023. [Online]. Available: https://www.nasa.gov/directorates/somd/space-communications-navigation-program/technology-readiness-levels/. [Accessed 16 1 2026].

[4] G. Salazar and M. N. Russi-Vigoya, “Technology Readiness Level (TRL) as the foundation of Human Readiness Level (HRL),” 29 9 2020. [Online]. Available: https://ntrs.nasa.gov/api/citations/20210000183/downloads/EID%20Abstract%20Summary%20092920_Technology%20Readiness%20Level%20(TRL)%20as%20the%20foundation%20of%20Human%20Readiness%20Level%20(HRL).docx.pdf. [Accessed 16 1 2026].

[5] U.S. Government Accountability Office, “Technology Readiness Assessment Guide, Best Practices for Evaluating the Readiness of Technology for Use in Acquisition Programs or Projects,” U.S. Government Accountability Office, Washington D.C., 2020.

[6] NASA, “Technology Readiness Levels,” NASA, [Online]. Available: https://www.nasa.gov/wp-content/uploads/2023/09/trl.png. [Accessed 16 1 2026].

[7] Department of Defense, “DOD INSTRUCTION 5000.02 OPERATION OF THE ADAPTIVE ACQUISITION FRAMEWORK,” Department of Defense, Washington D.C., 2022.

[8] Defense Acquisition University, “Technology Maturity and Risk Reduction (TMRR) Phase,” Defense Acquisition University, [Online]. Available: https://aaf.dau.edu/aaf/mca/tmrr/. [Accessed 28 Jan 2026].

[9] AcqNotes, “Defense Acquisition Milestone Overview,” AcqNotes, 2 Feb 2024. [Online]. Available: https://acqnotes.com/acqnote/acquisitions/milestone-overview. [Accessed 28 Jan 2026].

[10] R. Agte, J. P. Santos and G. Marzolf, “Modern Beyond Line of Sight T&E with Autonomous Systems,” The ITEA Journal of Test and Evaluation, vol. 46, no. 4, December 2025.

[11] NASA Technology Transfer Program, “Armstrong Flight Research Center,” NASA, [Online]. Available: https://technology.nasa.gov/center/afrc. [Accessed 16 1 2026].

[12] N. C. Smith, E. D. White, J. D. Ritschel and A. E. Thal, “Counteracting Harmful Incentives in DoD Acquisition Through Test and Evaluation and Oversight,” The ITEA Journal of Test and Evaluation, vol. 37, pp. 218-226, 2016.

[13] Secretary of the Air Force Public Affairs, “Air Force designates two Mission Design Series for collaborative combat aircraft,” 3 March 2025. [Online]. Available: https://www.af.mil/News/Article-Display/Article/4092641/air-force-designates-two-mission-design-series-for-collaborative-combat-aircraft/. [Accessed 16 Jan 2026].

[14] A. V. S. Neto, J. B. C. Jr., J. R. A. Jr. and P. S. Cugnasca, “Safety Assurance of Artificial Intelligence-Based Systems: A Systematic Literature Review on the State of the Art and Guidelines for Future Work,” IEEE Access, 2022.

[15] A. Saban-Fosch, E. Diez-Lledo, S. M. and M. Sureda, “Towards the first European autonomous flight safety system-software and hardware design,” Acta Astronautica, vol. 236, pp. 417-431, 2025.

[16] J. B. Bull and R. J. Lanzi, “An Autonomous Flight Safety System,” American Institute of Aeronautics and Astronautics, 2007.

[17] M. Rice, H. Croft, J. Gillis, Z. Hilton, R. Kirkwood, P. Walker, P. Lundrigan and W. Harrison, “A Comparison of Two Software Defined Radios for Aeronautical Telemetry,” International Telemetering Conference Proceedings, 58, 2023.

[18] M. L. Don, “A Low-Cost Software-Defined Telemetry Receiver,” International Foundation for Telemetering, Aberdeen Proving Grounds, 2015.

[19] H. Wang, Y. Qu, S. Liu and G. Zhang, “Airborne reconfigurable data processing and analysis technology based on air-ground bidirectional link,” 5th International Conference on Big Data & Artificial Intelligence & Software Engineering (ICBASE), Wenzhou, 2024.

[20] R. O’Connell, “Advances in Packet Based Bi-Directional Telemetry Solutions.,” International Telemetering Conference Proceedings, vol. 57, 2022.

Author Biographies

CDR Ryan “PC” Agte, USN, CSEP Ryan “PC” Agte is an Aerospace Engineering Duty Officer, a Doctoral Candidate in Systems Engineering at Colorado State University, and a Certified Systems Engineering Professional through the International Council on Systems Engineering. He has experience as a Test Director and Operational Test Pilot in the MH-60S helicopter as well as five deployments around the world throughout his career as a Naval Aviator. He is a Navy Acquisition Professional Member with Test and Evaluation, Program Management, and Engineering expertise on manned, unmanned, and autonomous systems, and is a Subject Matter Expert on the application of Flight Safety Systems on Military Aerospace Systems.

Jean Paul “JP” Santos, Ph.D. Dr. Santos is currently the Chief Innovation Officer at the Airborne Instrumentation Systems Department (AISD) as well as the Chief Engineer for the Point Mugu Sea Range Future Capabilities Office & Lab at the Naval Air Warfare Center Weapons Division (NAWCWD) in Point Mugu, CA. JP earned a B.S. (summa cum laude) with honors in Electrical Engineering from the University of Utah in 2013 and an M.S. and Ph.D. in Electrical Engineering from UCLA in 2015 and 2021, respectively. He has worked with NAWCWD since 2015, starting as an electrical engineer with the Offensive EW Systems Division and serving as the Lead RF Design Engineer for the Airborne Instrumentation Division. He holds various patent filings and has numerous published conference and journal papers in RF systems, wireless communications, and radar design. As the CIO for AISD, he currently manages a multimillion-dollar research portfolio and leads a team tackling complex RF challenges in beyond line-of-sight communications, telemetry, and flight termination, as well as low SWaP-C RF software defined radio modules.

Col Gregory “Bo” Marzolf, USAF (Ret.), Ph.D. Dr. Gregory Marzolf is an Associate Professor of Systems Engineering who, as an active-duty military officer, held diverse assignments, including working with large Department of Defense aviation systems and conducting operational test and evaluation for all Air Force fighter, bomber, unmanned, and high-altitude platforms. He joined Systems Engineering in the summer of 2017 and teaches advanced systems engineering courses, including Foundations of Systems Engineering and Introduction to Systems Test and Evaluation.

ISSN: 1054-0229, ISSN-L: 1054-0229
Dewey Classification: L 681 12
