Interview with Dr. James J. Streilein

DECEMBER 2023 | Volume 44, Issue 4

Dr. James J. Streilein

Memories from a Career in Army T&E: A Conversation with Dr. James J. Streilein

Dr. Streilein is a retired U.S. Army civilian and former Executive Director of the Army Test and Evaluation Command (ATEC).

Interviewed by J. Michael Barton, Ph.D., Parsons Corporation

Q: You were commissioned in the Signal Corps after completing Army Reserve Officers’ Training Corps (ROTC) at Carnegie Mellon University. Did that influence your choice to become an Army civilian instead of taking an industry or academic position as a freshly minted mathematics Ph.D.?

A: I was happy to accept the job offer to work for the Army Materiel Systems Analysis Activity (AMSAA) at Aberdeen Proving Ground (APG). In fact, this was the only permanent job offer I received. I could have taken postdoctoral positions at a number of universities, but I had seen postdocs working hard for little money only to have to look for another postdoctoral position after their initial employment term ended. No one was getting a tenure-track appointment. Seeing what was happening to pure mathematics graduates, I audited operations research and statistics graduate courses. I also assisted at the statistics consulting lab. I suspect my ROTC-based Signal Corps commission and auditing relevant courses helped convince AMSAA to offer me a job. I also thought my Army commission would benefit me at AMSAA since I was familiar with the Army and its systems, organization, and mission.

I was hired at AMSAA to work on the mathematics and statistics of reliability growth. AMSAA was developing statistical reliability growth tracking and prediction techniques in support of Army systems acquisition. I spent the summer of 1974 reading books on reliability when I had free time while attending the Signal Officer Basic Course at Fort Gordon, Georgia. Both turned out to be very helpful when I got to AMSAA.

Almost immediately after reporting to AMSAA, I attended an all-employee meeting where it was announced that AMSAA was getting a new mission responsibility as the Army organization performing developmental Test Design and Evaluation (TD&E) for major and designated high-interest non-major acquisition systems. I was reassigned to work in the ground vehicles section of the vehicles branch of the Reliability, Availability, and Maintainability Division (RAMD).

TD&E was a new mission for AMSAA, and there were no policies, guides, organizational relationships, models, or approaches telling AMSAA workers or supervisors how to do TD&E. Soon AMSAA workers redefined TD&E as Travel, Despair, and Exasperation. TD&E was very different from the standard AMSAA mission of system analysis studies. TD&E included a lot of travel to meetings with project managers (PMs), to test sites, to contractor sites, to Army schools and centers, and to the headquarters of the Army Materiel Command (AMC) and the Army Training and Doctrine Command (TRADOC). Everyone seemed to have concerns or complaints about what AMSAA proposed, for any number of reasons. Once testing was completed and the evaluation was being developed, it seemed everyone in the Army had reached conclusions different from AMSAA’s. Conflict was standard in TD&E.

Part of the concern with TD&E was the interactions with the broader community involved with system acquisition. For instance, when I was a RAMD member of an AMSAA Systems Team (AST), we went to a review meeting on a major system to present our emerging evaluation. When I presented our reliability estimates, I got a question from the audience. The questioner identified himself as an engineer with the system contractor and told me I couldn’t say negative things about their system because all the reliability failures that occurred were going to be fixed. I was told that if I persisted in presenting reliability as being below the requirement, I was going to get sued! I felt better when my boss explained the government would support me.

Other people would lay a guilt trip on me, such as, “If you evaluate that the system I work on doesn’t meet the requirement, the system will get cancelled and I’ll lose my job, my house, and so on.” One of my responsibilities as a RAM analyst was to be a member of the Army reliability scoring conference. Any system reliability failure was attributed by the system contractor and PM to tester error, or to soldier abuse or error in operating or maintaining the system. If a system contractor couldn’t get agreement on blaming the tester or soldier, they would switch to the excuse that the Army had not properly informed them in the requirements documents of the environment or use conditions the system would see. Participating in these scoring conferences got old quickly.

Q: You came to test and evaluation (T&E) as an analyst via AMSAA, which is now part of the U.S. Army Futures Command’s DEVCOM Analysis Center (DAC). What was the relationship of AMSAA to Army T&E organizations?

A: Prior to this 1974 mission realignment, the Army Test and Evaluation Command (TECOM), also at Aberdeen Proving Ground, had responsibility for developmental test (DT) and evaluation for the Army. TECOM would do the planning and coordinate the plans for approval across the Army. TECOM would also produce test reports and evaluations of tests for the systems. AMSAA would support Army organizations with specific system performance studies on topics including ground mobility, air mobility, gun fire control, ammunition lethality, reliability, maintainability, survivability, etc. AMSAA was organized into four commodity divisions (Ground Warfare; Air Warfare; Combat Support; and Command, Control, Communications and Electronics) and two all-systems divisions, RAMD and the Logistics and Readiness Analysis Division (LRAD).

Prior to the TD&E mission responsibility assignment, the Army organizations AMSAA supported with various analyses included TRADOC and its schools and centers; AMC, the home command of AMSAA, and its Readiness and Development Commands; and Department of the Army Headquarters (DAHQ) General Staff (G-1, G-2, etc.) offices. AMSAA had interactions with TECOM. AMSAA would obtain system test data from TECOM and assist TECOM to a limited extent in test design. AMSAA had very limited if any interaction with the Operational Test (OT) and Evaluation Agency (OTEA). AMSAA had very strong connections with the Army Ballistic Research Laboratory (BRL), also located at APG, and other Army labs. (In 1992, the Ballistic Research Laboratory and other Army laboratories were consolidated into the Army Research Laboratory, or ARL.) The labs concentrated on various performance areas, while AMSAA sections worked system-specific performance areas and their mission impacts, such as mobility, gun accuracy, and reliability. The labs worked system research at the basic science and engineering levels; AMSAA worked at system performance estimation; and the Development and Readiness Commands worked on their PMs’ specific system designs.

AMSAA also provided performance estimates on systems that were input data for the Army mission-level operations research conducted at the TRADOC Analysis Center (TRAC) and the Center for Army Analysis (CAA). AMSAA would also perform studies requested by Department of the Army Headquarters G-3/5/7 and G-4.

Q: What was AMSAA’s response to its new TD&E mission?

A: AMSAA developed an approach to TD&E that fit with its experience, expertise, tools, Army connections, and internal structure. AMSAA set up internal teams with a team leader from one of the commodity divisions, a co-leader from RAMD, support from the other commodity divisions and the Logistics and Readiness Analysis Division, and outside-AMSAA contacts: a TECOM headquarters representative and contact points from the Army labs. The commodity divisions were Ground Warfare (GWD), Air Warfare (AWD), Combat Support (CSD), and Command, Control & Communications (C3D). These teams were termed AMSAA Systems Teams, mentioned above.

One of the first systems I was assigned to work on was the XM-1 tank (which eventually became the M-1 Abrams), at that time a competitive development program with several competitors and several international options like the German Leopard tank. I also had involvement with the XM-2/3 (Bradley), the Family of Military Engineer Construction Equipment (FAMECE), the Universal Engineering Tractor (UET), the High Mobility Multi-Purpose Wheeled Vehicle (HMMWV), and numerous other ground wheeled and tracked systems. The XM-1 AST’s leader was from the GWD, with a RAMD co-lead, mobility support from CSD, logistics support from LRAD, and communications support from C3D.

The AMSAA-developed approach to TD&E involved multiple products and steps. The AMSAA System Team would develop an Evaluation Approach/Framework for approval by AMSAA management. This would be followed by a system evaluation plan for approval, followed by test design plans for approval, and finally a system evaluation after testing was complete. All these steps would obtain input and advice from across AMSAA, the Army labs, TECOM, and system PMs, and would be repeated as the system moved through the acquisition T&E steps of DT/OT I, DT/OT II and DT III, and Initial Operational Test & Evaluation (IOT&E). Of course, the steps were seldom neatly followed. Other decision points AMSAA supported included source selection evaluation boards. AMSAA would also provide input into PM trade-off analyses and engineering decisions.

The AMSAA aim with its evaluation plan and reports was to have a Continuous Comprehensive Evaluation (CCE) that would address the big question of mission success for the unit employing the system against potential threats in potential situations and environments, using operational mission Models and Simulations (M&S’s). This CCE would be a complete collection of all the data and information AMSAA gathered on the system.

AMSAA had small-unit operational M&S’s that were used for these mission success estimates. As an example, there was a tank-versus-tank M&S named the Ground Warfare Armor Systems Simulation (GWARS) that would take tank sensor performance, mobility performance, ballistics performance, lethality, survivability, and other estimates of performance (as well as loads of non-system information) as input to a force-on-force M&S and would produce tank-versus-tank exchange ratios. These M&S results would form the basis of the AMSAA evaluation reports and were also input data to the larger unit-level M&S run by TRAC and CAA. The Army used the AMSAA evaluation reports, as well as other inputs, to arrive at acquisition and fielding decisions for combat systems. This approach contrasted with TECOM’s, which consisted of conducting and reporting the results of standard test operating procedures and consolidating a list of system requirements met or not met.

While I was an analyst, I worked on reliability estimates and reliability growth estimates for numerous systems: tanks, personnel carriers, trucks, and engineer equipment. I also led the statistical test design and data analysis for XM-1 main gun ballistics performance, where we compared how well the tank prototypes’ main gun pointing performance matched the gun and ammunition ballistics equations. I also supported the XM-2/3 ballistics and the Improved TOW Vehicle missile performance test designs, among others.

Q: Why was TD&E responsibility assigned to AMSAA? What difficulties were associated with the reassignment of duties?

A: It is my recollection that the responsibility for developmental TD&E was given to AMSAA so it could perform in-depth, statistically sound test designs, develop detailed performance estimates to compare to requirements, and complete mission impact studies. At least these were the areas around which AMSAA designed its TD&E procedures. AMSAA was given some additional personnel resources to accomplish the TD&E mission, but a lot of the TD&E effort was seen as already resourced in the AMSAA systems analysis mission. This created tension within AMSAA and with AMSAA system analysis customers, as a lot of AMSAA studies did not fit directly into the TD&E of systems in acquisition. This resource limitation may have contributed to another Army T&E reorganization in 1996.

This AMSAA T&E mission also conflicted with the OTEA mission. Since system PMs had resource limitations, including a small number of prototype systems, limited money, and tight timelines for T&E, neither developmental test and evaluation nor operational test and evaluation could get all they wanted for their T&E efforts. OTEA back in the old days concentrated on scheduling a test unit of regular Army forces, a test location at a field training site, and a unit that functioned as a threat unit. These forces would conduct force-on-force exercises and perhaps some live training-type operations to gather information such as reliability data and firing range data with Army operators and Army maintainers. These different test locations and approaches led to the DT evaluation from AMSAA and the OT evaluation from OTEA often being in conflict.

Q: Your career with AMSAA was focused on logistics and RAM. Did that come about because of specific needs AMSAA had at the time, or was it a connection to your academic background?

A: Although I served mostly in the AMSAA RAMD and LRAD, I was often working on TD&E of systems or other system-associated work that I connected to system T&E later in my career. While I was in the RAMD from 1974 to 1986, I was heavily involved in TD&E for ground tracked and wheeled systems. Initially I was an analyst assigned to various system teams, and then from 1980 to 1986 I was the chief of the Ground Vehicle Section of the Vehicles Branch. This TD&E work on systems reliability required mission understanding to enable development of failure definitions and scoring criteria, and mission profile development to define mobility test courses, time and distance traveled on each, rounds fired, and other use measures, which drove DT reliability testing. Reliability data was scored by the Army scoring committee, of which AMSAA RAMD was a member, in relation to mission impacts, maintenance, and logistics requirements. Required usage testing was determined statistically, and scored data was used to statistically estimate system reliability during the test, reliability improvement with system fixes applied, and reliability potential with additional testing and fixing.

There are confounding issues with reliability, availability, and maintainability (RAM) testing: only limited test usage can be accomplished because of test time and the number of prototypes available, and testing is conducted at a limited number of locations for a limited time during particular seasons of the year. These limitations led to unknowns and large statistical uncertainties in RAM and durability estimates.
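
To make the scale of those uncertainties concrete, the sketch below shows a standard chi-square lower confidence bound on mean time between failures (MTBF) for a time-truncated test with an assumed exponential failure model. This is a textbook illustration, not the specific AMSAA procedure, and the test hours and failure count are hypothetical.

    # Illustrative only: standard chi-square lower confidence bound on MTBF
    # for a time-truncated test, assuming exponentially distributed failure times.
    # The test hours and failure count are hypothetical.
    from scipy.stats import chi2

    total_test_hours = 2000.0   # hypothetical total test hours across prototypes
    failures = 4                # hypothetical number of scored failures
    confidence = 0.80           # one-sided lower confidence level

    point_mtbf = total_test_hours / failures
    # A time-truncated test uses 2r + 2 degrees of freedom for the lower bound.
    lower_mtbf = 2.0 * total_test_hours / chi2.ppf(confidence, 2 * failures + 2)

    print(f"Point estimate MTBF: {point_mtbf:.0f} hours")
    print(f"{confidence:.0%} lower confidence bound: {lower_mtbf:.0f} hours")

With only a few failures observed, the lower bound sits well below the point estimate, which is the kind of statistical uncertainty that limited testing produces.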

To gather additional RAM data, I proposed initiating a program to collect fielded systems usage and maintenance data during training exercises around the world where Army units were located. I developed the approach and the statistics to estimate the number of locations, the amount of usage data to collect, and the approach to analyze the collected data. This Field Exercise Data Collection (FEDC) experience was useful in later T&E situations I encountered, and I saw it as a natural extension of TD&E as part of the AMSAA CCE philosophy. I also had the opportunity during 1986 to serve a six-month development assignment as the acting chief, Ground Vehicles Mobility Section, Mobility Analysis Branch of the AMSAA Combat Support Division. This expanded my knowledge and experience of ground vehicle mobility performance TD&E and of mobility impacts on mission performance as part of system evaluation.

To get promoted, I applied for and was selected as the Chief of the Logistics Studies Branch (LSB) in LRAD, where I worked from 1986 to 1989. While this branch was not heavily involved in the standard parts of the AMSAA TD&E mission, I learned a lot about the impacts of system RAM parameters, including fuel usage, ammo usage, repair parts usage, system survivability logistics impacts, and more. All these areas relate directly to the broader evaluation of how system performance characteristics affect a system’s mission performance.

While I was chief of LSB, I became very involved with the Live Fire Testing and Evaluation (LFT&E) portion of system T&E. I worked closely with the part of BRL where LFT&E was a major effort. That part of BRL developed and ran the M&S that predicted combat damage from enemy munitions. Working with them, we developed methodologies and processes to use the AMSAA operational mission models and simulations to predict direct and indirect fire damage events, to assist in LFT&E planning, and to incorporate LFT&E results into an evaluation of mission impacts of combat damage and survivability for the TD&E mission. That part of BRL transitioned into the Survivability/Lethality Analysis Directorate in 1992 when ARL was formed. The Army Ordnance Center and School provided combat damage repair teams that developed methods, tools, techniques, etc. for field use in combat damage repair. These experiences in RAMD and LRAD fit nicely into my expanding understanding and expectations for Army T&E.

For six months in 1990-1991, I served as the Acting Chief, Ground Warfare Division, AMSAA. As such, I was the senior manager of the lead analysts of the AMSAA TD&E teams responsible for tanks, infantry fighting vehicles, artillery systems, and small arms and their ammunition. GWD also performed systems analysis studies and developed and conducted M&S for ground warfare systems. During this period GWD supported the Army preparation and conduct of Desert Shield and then Desert Storm with special studies and analyses, providing information directly to deploying and deployed forces on topics such as enemy system vulnerabilities, techniques for attacking enemy systems, and our systems’ mobility in specific types of environments. Specific feedback after the conflict was elicited by Army teams to feed back into TD&E, systems analysis, and M&S efforts.

From 1991 to 1996, I was the SES Division Chief of the RAMD of AMSAA.  In this position I was the lead Division Chief across AMSAA for TD&E methodology and statistics and systems reliability, maintainability, availability, and durability. I expanded my knowledge set to include aviation and air defense systems TD&E, system analysis, and performance/mission M&S. Within resourcing and time constraints, AMSAA worked to implement CCE.

One of my favorite areas I got involved in as RAMD chief was the work of the reliability methodology group. I finally got to spend some time looking at the statistics of reliability growth in testing and evaluation, which is what I thought I would be working on when I started at AMSAA. New statistical techniques were added to what was termed the AMSAA reliability growth model. Techniques included projection and prediction techniques for the reliability of a system progressing through its acquisition cycle of DTs and OTs. System contractors and PMs were always confident that the system was going to demonstrate the system reliability requirement at the final planned test no matter how badly it was doing at the most recent test. The AMSAA statistical techniques gave us the tools to track progress and predict future progress with statistical confidence. These techniques provided an invaluable evaluation tool for the AMSAA evaluation. (A quick web search while I was writing these interview responses showed that the AMSAA Reliability Growth Guide is still available.)
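
For readers unfamiliar with the model, the sketch below shows the power-law (Crow/AMSAA) reliability growth calculation that underlies the tracking approach described in the AMSAA Reliability Growth Guide. The failure times and test length are hypothetical, and this illustrates only the basic tracking estimate, not the full set of projection and prediction techniques.

    # A minimal sketch of the power-law (Crow/AMSAA) reliability growth model.
    # Expected cumulative failures: E[N(t)] = lambda * t**beta; beta < 1 means
    # the failure rate is decreasing, i.e., reliability is growing with test time.
    # Failure times and total test hours are hypothetical.
    import math

    T = 1000.0                                           # total test hours (hypothetical)
    failure_times = [35, 110, 190, 320, 480, 700, 950]   # cumulative hours at each failure

    n = len(failure_times)
    # Maximum-likelihood estimates for a time-truncated test.
    beta_hat = n / sum(math.log(T / t) for t in failure_times)
    lambda_hat = n / T ** beta_hat

    # Instantaneous failure intensity and MTBF at the end of the test.
    rho_T = lambda_hat * beta_hat * T ** (beta_hat - 1.0)
    mtbf_T = 1.0 / rho_T

    print(f"beta = {beta_hat:.2f} (beta < 1 indicates growth)")
    print(f"Instantaneous MTBF at {T:.0f} hours: {mtbf_T:.0f} hours")

Tracking beta and the instantaneous MTBF over successive test phases is what makes it possible to show, with statistical confidence, whether a program is actually on a path toward its reliability requirement.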

Another common system contractor and PM claim was that they understood the reason for every reliability failure that occurred during testing and that a system improvement would be implemented before the next test or before fielding of the system. This often didn’t turn out to be factual in the AMSAA TD&E experience. Several employees of the reliability methodology group were taking graduate courses in reliability engineering at the University of Maryland (UMD) and learned about the field of Physics of Failure (PoF) engineering techniques. AMSAA obtained funding to support research at UMD in PoF relating to microelectronics components of the types commonly used in Army systems. Using conditions-of-use and failure mode data from testing, PoF contributed to better initial component designs and to quicker design improvements that fixed failure mechanisms. Work was also initiated on mechanical component failure mechanisms. I always thought this type of research was very important.
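
As one small example of the kind of relation PoF engineering uses, the sketch below computes an Arrhenius acceleration factor for a temperature-driven failure mechanism in microelectronics. The activation energy and temperatures are hypothetical, and this is a generic textbook relation rather than the specific UMD or AMSAA work.

    # Illustrative only: Arrhenius acceleration factor, a common physics-of-failure
    # relation for temperature-driven failure mechanisms in microelectronics.
    # Activation energy and temperatures are hypothetical.
    import math

    BOLTZMANN_EV_PER_K = 8.617e-5   # Boltzmann constant in eV/K
    activation_energy_ev = 0.7      # hypothetical activation energy of the failure mechanism
    t_field_c = 40.0                # hypothetical field-use temperature (deg C)
    t_test_c = 85.0                 # hypothetical accelerated-test temperature (deg C)

    t_field_k = t_field_c + 273.15
    t_test_k = t_test_c + 273.15

    # How much faster the mechanism progresses at the test temperature than in the field.
    accel_factor = math.exp((activation_energy_ev / BOLTZMANN_EV_PER_K)
                            * (1.0 / t_field_k - 1.0 / t_test_k))
    print(f"Acceleration factor: {accel_factor:.0f}x")

Relations like this are what allow failure data collected under elevated stress in the lab to be related back to expected field conditions of use.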

I was very disappointed when the methodology group did not transfer with the TD&E responsibility when it was removed from AMSAA. Having to go outside the T&E organization for reliability methodology support made it harder to work closely with the methodology experts on individual programs. It also made it harder for the methodology group to stay current on the reliability events occurring during ongoing T&E.

An interesting event occurred when I was the RAMD Chief, and thus the TD&E lead. AMSAA got to ‘evaluate’ proposed requirements in coordination with the Army labs, using AMSAA test experience and the labs’ research and experimentation experience. I remember one major system’s requirements evaluation. The AMSAA evaluation concluded that meeting the proposed requirements would require a big step ahead in technology. I was surprised when I got an invitation to an internal TRADOC requirements review meeting chaired by the Commanding General (CG). When I got to the review meeting, I found that I was the only attendee who was not an Army General Officer (GO). Attendees were GOs from the TRADOC schools and centers involved with the system. As the review progressed through the requirements document, all the attendees except me agreed with the requirements. The TRADOC CG would note that AMSAA had raised concerns, but the CG stated that the Army needed to challenge Program Executive Offices (PEOs), PMs, and contractors to move technology ahead. I cautioned that the Army might end up not getting anything for all the money spent, but the requirements were approved. It wasn’t long until AMSAA was removed from the requirements review process. The system did get cancelled eventually because it was not progressing toward meeting the approved requirements.

Q: The Army undertook a major reorganization of test and evaluation in 1996. What was behind that, and do you think it accomplished its goals?

A: In my personal opinion, the years of TD&E in AMSAA were not without issues, constraints, and problems. AMSAA could only recommend T&E plans and had to work within the PMs’ and PEOs’ resources and timelines. This frequently meant that not all the testing and resulting data AMSAA wanted was actually achieved, so statistical confidence in some system performance estimates and RAM parameters was low. At that time in history, the AMSAA mission M&S was slow to set up, slow to run, limited in the number of systems represented, and hard to understand, let alone visualize what was happening, when it was happening, and why. Only a limited set of runs could be accomplished within time and resource constraints, which limited the locations, environments, threats, and weather conditions represented. There was no statistical design possible by varying input performance parameters (like weapons accuracy, sensor performance, human performance variation, etc.). These constraints limited AMSAA evaluation conclusions.

OTEA was responsible for the system operational test and evaluation. OTEA had the responsibility for and developed T&E plans for acquisition systems and had other resources, such as enemy threat representative systems, for the OT phases of an acquisition program. These separate plans and requirements often conflicted with the AMSAA plans. AMSAA and OTEA evaluations were sometimes in disagreement. These disagreements led to senior leader concerns about both evaluations, with no easy resolution except for senior leaders to apply military “judgement” to reach Army decisions on the acquisition actions to take on the system being evaluated.

In my opinion, neither the AMSAA nor the OTEA evaluations reached strong conclusions on operational mission success. OTEA relied mostly on OT data. An OT, and even the Initial Operational Test (IOT), which was touted as the final exam before deployment, had limitations. An OT was relatively short term, so limited data was collected. It relied on casualty assessment systems, which are M&S based, to determine casualties and damage; it was in one location with one type of environment, within one season of the year, with one unit whose soldiers and leaders were new to the system, against one “representative” threat unit. There wasn’t any approach available to generalize the data and results or to apply statistical confidence to much of the data. AMSAA evaluations also had limitations. AMSAA M&S for performance estimation was limited to a few locations, and the small-unit operational mission M&S could only be run a few times because the setup and run times took a lot of time and effort. So again, the evaluations did not address a lot of potential missions against a lot of threats.

In my view, OTEA and AMSAA evaluations were largely a list of requirements that were clearly not met, requirements that were “just” below, at, or “just” above being met, and some requirements that were clearly met. The mission impacts of that situation were not really addressed, so the evaluation findings were based on “military” judgement.

Moving AMSAA TD&E responsibilities to OPTEC did lessen the conflicts between the developmental and operational organizations working T&E, which certainly was an improvement in the situation. However, in my personal opinion, the reorganization did not provide a mission evaluation based on all the available information looked at through an operational mission M&S tool perspective, and it was certainly not the CCE AMSAA had once envisioned.

Q: For several years, leading the Army Evaluation Center (AEC), you continued as an analyst, or more specifically, as the lead analyst or evaluator for the Army. How did evaluation change over your time there?

A: Army reorganizations of T&E organizations continued. OTEA became the Operational Test and Evaluation Command (OPTEC) with the addition of the test boards and some other TRADOC organizations associated with T&E. Live Fire T&E became a major mandate for selected systems. The Office of the Secretary of Defense, Director of Operational Test and Evaluation (DOT&E), was a major driver of operational tests and LFT&E.

In 1996, the DT TD&E mission was moved from AMSAA to OPTEC. Around one hundred personnel spaces were moved from AMSAA to OPTEC to accomplish the mission. A few spaces from the Army labs were moved to work the LFT&E mission and survivability/lethality issues. A few personnel and spaces were also moved from the TECOM headquarters staff, who had written consolidated test reports for non-designated, non-major systems. These personnel and spaces became the OPTEC Evaluation Analysis Center (EAC). I was selected as the first EAC director. EAC worked closely with the Operational Evaluation Center (OEC) as members of the OPTEC Systems Teams (OSTs). However, there were some difficulties with this structure because of similar expertise between people at EAC and OEC, or because the best person for a team was in the “wrong” center for the standard OST membership role. The developmental tester, TECOM, was still in the Army Materiel Command, with differing command and resourcing lines, which led to some confusion and conflicts. AMSAA retained the role and the bulk of the expertise in system performance estimation and small-size force-on-force M&S. EAC and OEC lacked the personnel, expertise, and tools that AMSAA had been able to apply to TD&E when it had the TD&E mission.

In 1999, the Army combined TECOM and OPTEC to form the Army Test and Evaluation Command (ATEC), which is how the Army is still organized for test and evaluation. I was selected as the first Director of AEC, which was formed by combining EAC and OEC in ATEC. Evaluation in AEC was different than in AMSAA. The intent for test design and evaluation remained the same: design tests using the best available approach to obtain the data needed to conduct a comprehensive evaluation of safety, effectiveness, suitability, and survivability of the system in the context of the mission environment prior to fielding, and evaluate progress toward that goal during system development. AEC could now design testing across DT and OT and consolidate test data into a single evaluation, which limited conflicts in test design and evaluation.

Moving forward in OPTEC and then ATEC, I had the chance to work some efforts based on what I had experienced in AMSAA. While I was AEC Director, DOT&E got a Congressional requirement and funding to conduct cyber readiness assessments of field forces. DOT&E is a small office and generally relies on the service OT&E organizations to conduct field efforts. Based on my AMSAA FEDC efforts, AEC set up field cyber assessment teams. These teams would participate as a cyber aggressor during training exercises and evaluate the unit’s cyber defense readiness. An evaluation would be provided to the unit and DOT&E and would be used by AEC in planning T&E for systems undergoing T&E.

AMSAA had the expertise and M&S tools to develop estimates of multiple performance parameters, which could be used in small-scale force-on-force M&S to address the mission impacts of the evaluation estimates of system performance. Those tools and the personnel with the relevant expertise remained in AMSAA and did not transition to EAC or AEC. EAC and AEC did not have the personnel to conduct such efforts in house or the dollar resources to pay AMSAA or other engineering-level organizations to conduct such work. Thus, AEC evaluations were largely limited to comparisons of system parameter estimates to system requirements, with military judgement applied when requirements were not met to address potential mission impacts.

Some of us in Army T&E and analysis kept the AMSAA idea of CCE alive. When Future Combat Systems (FCS) started into acquisition, it had lots of resources, interest, and complexity. There was a push to establish a Virtual Proving Ground (VPG) to link all the various system contractors, the prime integrating contractor (Boeing), and government organizations into using the same collection of M&S supported by the same input database. The intent was to continually grow and improve the M&S and input database as additional systems engineering and T&E were conducted and evaluated. As the threat changed its systems or tactics, techniques, and procedures during the years of system acquisition from when the requirements were established, the VPG input data could be updated to reflect the changing environment of the mission. Changes in the threat from the requirements baseline were always a point of contention in test planning and evaluation between the T&E organizations and the PMs and system contractors.

There were to be two main hubs for this VPG, one at Aberdeen Proving Ground and another at the contractor facility in California. The intent was for the VPG to be able to run operational mission M&S in modes ranging from personnel, systems, and/or system M&S in the loop to pure constructive M&S runs. The constructive version was for speed, to support design of experiments. It could also function as a wrap-around for field experiments and operational tests. There was going to be an integrated plan to use the VPG construct for FCS activities through the whole acquisition cycle and continuing through fielding to support training and operational mission planning. The data input for and output from the VPG was going to be the official knowledge base for the FCS. This plan seemed to me to be the AMSAA CCE realized. Of course, FCS failed big time, and the FCS VPG never made it into existence. Evaluations continued to be a comparison between the best estimates of system characteristics and requirements, evaluated using military judgement.

Q: I’m guilty of perceiving evaluation as the final step in the T&E process. In a sense it’s the beginning and the end. Do changes need to be made to involve evaluators more explicitly throughout the T&E process?

A: I have been guilty of the same bias since I started to work TD&E in AMSAA back in 1974. In my understanding, the reason for testing is to support an evaluation for decision-making purposes. Evaluation planning needs to drive testing plans. There is no need for a test, or for more testing than required, for an evaluation. In the Army, ATEC drives evaluation planning, which drives test planning, and the data from the test supports the evaluation. AEC is deeply involved explicitly throughout Army T&E from early developmental testing to the final operational test before an acquisition and fielding decision is made by the Army.

I do think that evaluators and testers should be involved sooner in the requirements process and should continue data collection and evaluation after fielding. I think the Army would benefit from an integrated process that begins during requirements determination and continues through the entire acquisition cycle. My understanding of the Army Futures Command Top-Down Requirements Determination Process is that design of experiments is applied to the inputs of a constructive operational mission M&S, the M&S produces runs from those input data sets for statistical analysis to determine a range of input variables for a potential system, and that analysis drives the new requirements. I suggest that this same operational mission M&S be rerun as often as needed during the acquisition cycle for T&E needs, for logistics planning, for Artificial Intelligence (AI) training and T&E, for unit training, for unit mission planning, for engineering trade-off analysis, etc. This integrated process would provide a knowledge base for multiple Army purposes and would be continually improved and updated: as the threat changes; as Army systems are improved; and as new systems or tactics, techniques, and procedures are introduced. The operational mission M&S that is the key to this process needs to be able to run with personnel, real systems, and system emulators or simulations; run in distributed mode; run at multiple levels of classification; provide visualization of actions in the M&S runs; retain data throughout the runs to enable replaying of a specific run; and provide a list of other capabilities.
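
The sketch below illustrates, in very simplified form, what running a design of experiments over the inputs of a constructive mission M&S can look like. The factor names, levels, and the run_mission_simulation() stand-in function are hypothetical assumptions for illustration only; they do not represent any specific Army model or the actual requirements process.

    # Hypothetical sketch: design of experiments over constructive mission M&S inputs.
    # The factors, levels, and run_mission_simulation() stand-in are illustrative only.
    import itertools
    import random
    import statistics

    def run_mission_simulation(sensor_range_km, hit_probability, mtbf_hours, seed):
        """Stand-in for one constructive M&S run; returns a mission-success score in [0, 1]."""
        rng = random.Random(seed)
        score = (0.4 * (sensor_range_km / 10.0)
                 + 0.4 * hit_probability
                 + 0.2 * min(mtbf_hours / 500.0, 1.0))
        return max(0.0, min(1.0, score + rng.gauss(0.0, 0.05)))

    # Full-factorial design over three candidate system performance factors.
    factors = {
        "sensor_range_km": [5.0, 8.0, 10.0],
        "hit_probability": [0.6, 0.75, 0.9],
        "mtbf_hours": [200.0, 400.0],
    }

    results = []
    for design_point in itertools.product(*factors.values()):
        settings = dict(zip(factors.keys(), design_point))
        # Replicate each design point to capture run-to-run variation.
        scores = [run_mission_simulation(**settings, seed=rep) for rep in range(10)]
        results.append((settings, statistics.mean(scores), statistics.stdev(scores)))

    # Rank design points by mean mission-success score to inform requirement ranges.
    for settings, mean_score, sd in sorted(results, key=lambda r: r[1], reverse=True)[:3]:
        print(settings, f"mean={mean_score:.2f}", f"sd={sd:.2f}")

In the integrated process described above, the same kind of experiment would be rerun whenever the threat representation, the system design, or the tactics in the M&S input database changed.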

For years I was not sure such modeling and simulation was possible. I retired from federal service from DOT&E in 2013. I then started working for MITRE as a consultant supporting DOT&E and Customs and Border Protection (CBP) test and evaluation. CBP was developing and acquiring new sensors to deploy along the border. As part of the T&E process, MITRE conducted a Simulation Exercise (SimEx) for CBP in a MITRE lab that utilized One Semi-Automated Forces (One SAF) as an operational mission M&S wrap-around exercise driver. These SimExs could run locally or distributed, with personnel and live systems in the loop or purely as a simulation; could run at various levels of classification; and could be run for various missions and systems of interest to CBP. The CBP SimEx was used to refine sensor requirements; to plan for DT and OT in the field at various locations; and to run with CBP personnel in the M&S for training and to develop tactics, techniques, and procedures. It was going to be used for an evaluation to decide on sensor acquisition and fielding, reported to the headquarters of the Department of Homeland Security and Congress to support decisions, and it was to be provided to CBP in the field to decide on sensor and force locations as the threat reacted and changed through time.

I had been involved with One SAF when it was being developed as a training system for the Army; it was an acquisition system ATEC addressed. At that time, it was capable of handling a few Bradley Fighting Vehicle System trainers against a few threat systems. I was pleasantly surprised at the progress One SAF had made since I had last seen it. It seemed to me that the MITRE SimEx set-up using One SAF as an M&S driver could serve as an example of an integrating tool to support Army requirements, engineering, research, acquisition, T&E, training, logistics planning, mission planning, etc. I did not continue working for MITRE supporting CBP, so I did not see how the CBP plan to use their SimEx border protection M&S turned out. I moved from the DC area to our current home in the Western NC mountains, and I could not work on law enforcement sensitive information from home. It would be interesting to learn if or how CBP followed through on the proposal to have the SimEx-One SAF M&S as an integrating tool.

Q: The wars in Iraq and Afghanistan saw dramatic increases in the demand for materiel and a related increase in T&E requirements. Rapid acquisition was one result, and the Army response, which you led, was the creation of Capabilities and Limitations testing. Tell us how that came about and how it became a standard part of DoD testing.

A: When the wars in Iraq and Afghanistan began, ATEC was already thinking about what might be done to support the efforts. The best I recall, even before anyone asked for improvements, the Aberdeen Proving Ground survivability community was already working on increasing the survivability of systems, both anticipating and after receiving initial reports of side and bottom attacks against U.S. vehicles with rocket-propelled grenades, mines, and Improvised Explosive Devices (IEDs). Armor additions and improvements were quickly developed, tested, and evaluated for the Army.

The Army staff began to hold all-Army Saturday morning videoconferences with Iraq and Afghanistan representatives on how the campaigns were going, what the theaters were experiencing, and what help they needed. ATEC, as a direct reporting command to the Chief of Staff of the Army, had a seat at these videoconferences. I frequently attended these as the ATEC representative. ATEC was already working established channels with PMs, the Research, Development, and Engineering Centers, and ARL on expediting new items to support the theaters’ campaigns. The standard approach to field these items was a safety release and an initial evaluation. However, the field reported that lots of non-standard stuff was in theater already, and more continued to come from well-meaning people and companies directly to people or units they knew in theater. I was asked why ATEC wasn’t involved, and I explained that ATEC was involved in the official items reaching theater. I further explained that my understanding was that no one in the Army should be using an item that did not have a safety release issued by a responsible Army organization, primarily ATEC. The Army staff then commanded the theater not to accept items that did not come through ATEC for a safety release.

Several procedures were established to inform Army organizations and potential suppliers of the requirements for getting items to theater to meet urgent needs. ATEC publicized and formalized the procedures that had always been used for quick safety releases and initial evaluations of items used in demonstrations or experiments conducted by the Army. These procedures led to a standard quick look by ATEC that produced an initial safety release and a quick evaluation addressing what had been learned about the item’s capabilities and limitations (C&L), which the Army used to decide whether to ask theater if it wanted the item shipped.

ATEC also initiated Forward Operational Assessment (FOA) teams. FOA teams were usually led by the Operational Test Command and consisted mostly of military personnel with an occasional civilian from across ATEC. The FOA teams would continue to track items shipped to theater after a safety release and capabilities and limitations evaluation were issued for an item. FOA teams would assist theater in receiving the item and training to use it. The FOA team would get an abbreviated evaluation plan requesting data to be gathered on the item. FOA teams might accompany the receiving units into the field as the item was being used or conduct an after-action review with the using unit.

Another T&E effort contributing to the theater actions was conducted by the Joint T&E teams. While I was ATEC Executive Director, I was the Army representative and, for a year, the chair of the coordinating committee for Joint T&E. I worked 15 Joint T&E quick reaction test efforts. For example, tests were conducted on Joint Contingency Operations Bases, Joint Forward Operations Base Force Protection, Joint Counter Remote Control Improvised Explosive Device Electronic Warfare, etc. These quick reaction test efforts supported urgent theater concerns and were well received by the Joint Commands.

I was surprised one day when I was at the Pentagon at a meeting, and I got called out of the meeting by the Undersecretary of the Army. That was a surprise in itself, but what he wanted was for me to go see the Secretary of the Army. The Secretary had some questions about body armor performance evaluation. He was getting a lot of questions and concerns from Congress, whose members were getting complaints from the public about injuries and deaths of body armor wearers. I admitted that I didn’t remember ATEC working body armor. I had some idea of how body armor worked from my time at AMSAA, since body armor was developed at Aberdeen Proving Ground. I said I would go back to my office and get him answers. Back at the office I found out that body armor was considered an individual piece of equipment, was not addressed by ATEC, and was fielded by the PEO. When I informed the Secretary of this situation, I was asked to look into body armor performance and provide him an update.

An ATEC team was formed to review body armor performance. We reported our findings back to the Secretary, and he decided body armor would be added to the list of systems for which ATEC was responsible for T&E. In response, ATEC developed a body armor testing facility at Aberdeen Test Center, conducted testing of body armor stocks, and reported back to the Secretary on the results. ATEC also had our FOA teams look into combat events involving body armor in theater, gathering data on each event and any injuries suffered by the wearer. The FOA teams also collected samples from body armor stocks in theater for performance testing back at Aberdeen. An ATEC evaluation of performance was reported back to the Secretary, who directed that the PEO and ATEC conduct a press conference on body armor. I attended the press conference, and to my relief I wasn’t asked any questions.

The Secretary of the Army then directed that ATEC would also be responsible for ballistic protection helmet T&E.  ATEC developed a helmet test facility at APG and had the FOA teams collect information from theater on field performance of helmets.

I learned a lot about body armor and ballistic helmets fairly quickly.  These were my only interactions directly with the Secretary of the Army.

Q: Did you ever receive unusual test requests?

A: Not often, but here’s an example. When the services were heavily involved with T&E of urgent requirements for theater, we’d also get proposed items pushed to a service or to the Joint IED Defeat Organization for capabilities and limitations T&E for possible fielding to theater. We got some items that we didn’t think would be of value in theater or that might even have negative impacts. One item I remember well was aluminum underwear. Someone was pushing the idea that aluminum underwear would protect soldiers in theater from the dangers of heat-related injuries. The ATEC quick look was that aluminum underwear would cause people to overheat in environments not even close to the heat in theater. When our initial C&L went to the Department of the Army, I got called in to see the Vice Chief of Staff of the Army. I was told that the aluminum underwear idea had a lot of support amongst some members of Congress whom the Army wanted to keep happy. I was ordered to do a complete C&L of the aluminum underwear.

I went back to ATEC, and we formed an ATEC T&E team which developed a C&L T&E plan. Working with the Human Research Directorate of ARL, we formed a team of volunteers to test the aluminum underwear while performing soldiers’ tasks under heat lamps in an environmental chamber, with their body temperature and other physical conditions monitored. (You can guess where the volunteers had their thermometers inserted.) They also conducted the same tasks without the aluminum underwear. The results were as ATEC predicted. The aluminum underwear was a big negative for soldiers performing tasks in warm theater conditions. The C&L was strong enough that the idea was dropped.

As I was leaving ATEC to go to the office of the DOT&E, the Vice Chief of Staff of the Army (VCSA) asked me how to speed up procurement of urgently needed command-and-control systems. In the VCSA’s experience, DOT&E always said the system didn’t meet requirements and couldn’t progress toward fielding even if the Army felt it was an urgent need for the theater. I suggested that the Army propose procuring a limited quantity for experimentation and limited fielding for feedback from the field. These actions would all be in preparation to write good requirements for an eventual acquisition program. The first step would be an open invitation to potential suppliers of the desired type of command-and-control system to participate in an experiment using the systems to do a C&L. Using these results, the Army would propose a larger experiment using the best systems from the first experiment. Of course, the Army would really need a brigade set of equipment to properly do a good experiment and to be useful to a unit in the field. This approach would be consistent with the urgent need process. Using the experimentation results and theater FOA feedback, ATEC would prepare a C&L report. This could all be funded with urgent needs funds. If the C&L supported the system being of real value, a formal requirements document could be written, and an acquisition could be initiated. This acquisition program could be expedited with the C&L information and the continued urgent need by theater. A couple of years later, when I was in the office of the DOT&E, I got an invitation to attend an Army Network Integration Evaluation event. It reflected what I had discussed with the VCSA, though I certainly claim no credit for these events, which were conducted by the Army for several years and may have continued after I retired.

Q: How did your perception of T&E change as you moved from AMSAA to AEC to Executive Director of ATEC to the Office of the DOT&E?

A: My perception of T&E didn’t really change across my career. I always believed that the purpose of T&E was to assist the field forces by ensuring they had systems that would serve them well toward accomplishing their mission. As I progressed in my career, I was able to do different things at higher-level organizations with more senior people to make a larger impact, ensuring the equipment coming through the T&E community was evaluated as well as we could at the time.

Q: How did your relationship with ITEA begin? I remember serving with you on the Publications Committee and I know that you received the Allen R. Matthews Award in 2010.

A: Best I can recall, I was visited when I was EAC Director by Jim Fasig, who at that time was the Technical Director of the Aberdeen Test Center. I had worked with Jim on various systems during my TD&E activities at AMSAA. Jim was involved with the Francis Scott Key chapter of ITEA. Jim introduced me to ITEA activities and asked me to support the chapter by participating myself and allowing my staff at EAC to participate. I thought the idea of a T&E community organization made a lot of sense. In AMSAA, we were involved with the Army Operations Research Symposium (AORS) and the Military Operations Research Society (MORS). Neither AORS nor MORS had more than a small involvement in T&E activities. ITEA offered the opportunity for the T&E community to come together for learning, sharing, and networking. My exposure and involvement in the broader T&E community was helped by my ITEA participation, such as when I was Chair of the Francis Scott Key chapter and we ran the annual ITEA Symposium in Baltimore. I did serve on the ITEA publications committee, but I don’t remember exactly when that was. I got a lot out of my ITEA participation, and I think my workforce did also.

Q: Over your career, how has technology affected the way we performed and currently perform T&E?

A: Wow, technology has changed immensely since I started in T&E back in 1974. The technologies in the systems under test have gotten more sophisticated and more complicated. This technology growth has led to more T&E challenges. Back in 1974, systems were largely mechanical with some analogue electrical components. Systems were stand-alone, operating independently of other systems and directed by their operators. Operators communicated with others via analogue radio. Testing back in 1974 consisted of operating the systems as they were intended to be used, in representative environments, for set periods of time. Operating performance was largely recorded by analogue devices and/or people with various physical measuring devices. The computers we used for T&E were still running on punch cards. Programming was still a mix of machine instructions and a limited set of pre-programmed instructions. The statistical designs of experiments I worked on had to start with writing the equations from basic principles, and for the data analysis, hand-recorded data sheets had to be entered on punch cards as input to my hand-programmed equations to compute the statistics.

Looking back at the early days, everything was slow and involved a lot of people doing a lot of work by hand, by themselves or working together at a single location. Through the years, the systems under T&E became more complex and sophisticated. Systems incorporated more digital electronic devices, became more connected, and provided more decision-assist information to operators. These changes made T&E change. New test technology was developed to test these new features more quickly and more accurately. Among the new technologies were artificial test environments for areas of performance like tank aiming accuracy against moving targets, and artificial test tracks to test reliability, mobility, and durability without field testing with drivers. Test sensors were developed to be positioned on the systems to automatically record and report data.

M&S improved as computers improved. New languages made M&S much easier to program.  Statistical computer applications made planning and data analysis much easier and quicker. Of course, all the “improvements” introduced new concerns.  System electronics needed to be hardened for the field environments and to survive ballistic attacks. Electronic warfare became a major concern. Cyber security was a whole new field of T&E. Keeping up with all the changes that occurred during my career was certainly challenging but stimulating at the same time.  Luckily there were always great teams of testers and evaluators who understood all the new technologies to keep T&E progressing in the mission of making sure systems being fielded were supportive of mission success.

Q: Have you witnessed a change in attitude about modeling and simulation and its acceptance for program of record evaluation?

A: The acceptance of M&S for T&E is an interesting question with a range of answers. M&S can range from basic science principles to engineering equations to human dynamics to small-scale force-on-force M&S, and up to large-scale force-on-force M&S. Each of these levels of M&S has associated uses and concerns with its use in T&E. I’ve always believed that M&S is a potentially useful tool to generate data to support better evaluations. Each potential M&S to support T&E, though, should be subjected to thorough verification, validation, and accreditation (VV&A) before it is used in support of T&E for a system.

There have always been proponents and skeptics of M&S in support of T&E. I’ve always been willing to consider the use of M&S when I can review the M&S VV&A myself. I try to remember that all our physical operational tests are in actuality an M&S of an actual combat mission. We (hopefully, although we do have accidents) don’t intend to cause the actual death or destruction of personnel or systems. We use M&S to determine what might happen during interactions between systems in combat. There are simulators of firing to which Real Time Casualty Assessment (RTCA) M&S are applied to generate a system event. My understanding from human research types is that people in an RTCA M&S environment may well react totally differently than they might in actual combat. People generally put a lot of trust in OT force-on-force physical simulations.

For as long as I’ve been involved, force-on-force M&S have been used in requirements determination, unit training, and operational tests. I’ve accepted these VV&A’ed M&S results as further input for an evaluation. I’ve long been an advocate of using the same M&S throughout the acquisition process as an integrating feature of developing a body of knowledge about a system and its impact on operational combat missions.

At levels of M&S below force-on-force, the basic science and engineering models also have their uses in T&E, but I’ve seen a lot of M&S predictions of system performance proven wrong in actual physical testing. I’ve always heard that these M&S are so good now that systems will work reliably as intended in all environments. I don’t recall any system that arrived for Army T&E that didn’t have improvements that needed to be made to correct problems discovered in T&E. Probably the systems would have had even more problems when first delivered if the M&S had not been done. I remember many times when contractor personnel would admit that some area wasn’t properly handled in their system M&S efforts, leading to an issue discovered in testing. I’m not a believer in replacing physical testing with a pure M&S T&E regime.

I remain a believer in using M&S when it is VV&A’ed, but we also need to conduct a broad range of physical testing simulating expected field conditions to verify the system works reasonably well.

Q: What were your largest T&E workforce challenges in ATEC?

A: I’d say the biggest workforce challenge was finding highly qualified technical personnel for recently evolving technologies that were being quickly incorporated into systems or testing facilities. I remember looking for people who understood night vision devices, ceramic armors, electronic warfare, cyber security, robotics, etc. Once we found a technically qualified person in the field, they generally would need to learn about the Army and the field environment. Then they’d need to learn about T&E and Army acquisition. Sometimes we’d locate a technically adept employee and send them for additional training in an emerging technical area. Sometimes this would work, and sometimes they would leave the government for higher pay. This happened several times in the cyber area. Sometimes we’d have a technical support contractor provide a technical expert.

Finding and keeping good technical people was a continuous challenge.

Q: What was your role in the office of the DOT&E?

A: ATEC HQ was being relocated from the DC area to Aberdeen Proving Ground in 2010. I was not going to move back to the Aberdeen area. I let it be known that I was going to retire if I did not find another position in the DC area. The then Director of Operational Test and Evaluation offered me the position of Deputy Director of Netcentric, Space, and Missile Defense Systems. I had worked closely with DOT&E from my Army positions in ATEC. I was already the Army lead for T&E in all these areas and had worked multiple joint programs in these areas as well.

The Army was involved with most of the Netcentric Systems DOT&E had under oversight, so I was familiar with the programs and the T&E in this area. The Army was not the lead for many of the space programs, but the Army often had a ground station that interfaced with the space systems. The Army was the lead service for National Ground-Based Mid-Course Missile Defense, and ATEC had an evaluation branch of AEC, which I had set up while I was AEC Director, so I was known in the missile defense community. While I was AEC Director, I also started the ATEC field cyber security assessment teams in the netcentric area. DOT&E had been given a congressional requirement and resources to assess the cyber security of forces in the field. In the pattern of the FEDC and the FOA teams, we at AEC established a cybersecurity team, made up of government cyber personnel and cleared contractors and led by an AEC employee, to deploy in the field and perform cyber events against US forces during field exercises. This field cyber assessment approach was eventually adopted by the other services’ OTAs, and in the office of the DOT&E I oversaw the program, and we prepared a DOT&E annual summary report to Congress.

In the office of the DOT&E, the major difference was the level of detail that was directly managed by the DOT&E versus by the services. The DOT&E was responsible for approving the services’ plans and the conduct of their OT&E. At the reporting level, DOT&E would conduct independent system evaluations based on service-developed data. DOT&E also produced an annual report to Congress and would be called to participate in Congressional hearings on programs of special interest to Congress. The Director would attend these hearings, and I and some DOT&E staff would sit behind him and pass him information as required. The DOT&E would have a representative at acquisition decision meetings for oversight systems, which I had attended as the Army T&E representative in some cases; in the DOT&E, I would attend in support of the Director or as the DOT&E representative, very similarly to what I did in the Army.

Generally, though, I was performing much the same evaluation role on systems as when I was AEC Director, only on systems under DOT&E oversight and systems in my area across all services. I did not have involvement in all the testing issues of facilities, resources, instrumentation, physical security, etc., which were often major issues requiring a lot of management attention and often involving travel to test locations.

Q: How has testing changed as we have lost the military personnel and former military civilians from the workforce?

A: I retired in 2013 from my position in the office of the DOT&E. At that time, I still had a fair number of uniformed staff and a fair number of civilian and contractor staff with military backgrounds. I always valued the experience and expertise associated with time served in the military. My own military background was a commission in the US Army Signal Corps via the ROTC program. I attended the Signal Officer Basic Course and completed my military service as a Captain in the Delaware National Guard, having served as a platoon leader in a Signal Battalion, as a Signal Command staff officer, and as the adjutant (S-1) in a Signal Battalion. My military experience was helpful to me in my positions in T&E.

If the current situation in T&E organizations is a lack of military personnel and of civilians or contractors with military backgrounds, then I think there may be difficulties in properly conducting mission-oriented T&E. The Army used to have “greening” training available to Army civilians without a military background so they could become more familiar with the military environment. Greening experiences might be one way to improve the understanding of a workforce without military experience. Another option might be more information exchange with personnel in units intended to eventually receive a system undergoing T&E: involving them in T&E planning reviews, in observing and commenting on field and proving ground testing, and so on. These participation actions occurred when I was consulting with MITRE in support of CBP. There were few uniformed CBP officers working with us, but there was a fair amount of information exchange and actual visits to and from CBP in the field. These activities helped keep CBP T&E focused on the current field situation.

Q: What advice do you have for people just entering the T&E career field?

A: Good question! I wish I had a good answer. I recommend learning about statistics. Everything varies in its interactions with the environment when in use. A person working in T&E needs to consider what causes variation in which responses, and how big the impact might be. They should use statistical design of experiments in test planning and statistical tools in the analysis of the data. It makes sense to me that they should look at the standard classical T&E areas, build expertise in depth in one area, and develop an understanding of as many other areas of T&E as are relevant to where they are working and eventually where they want to work. They should develop an understanding of the operational mission of the systems they are working on, apply that knowledge in test planning, and make sure to relate test results to mission impact. They should read widely about the history and the predictions for the future (maybe military sci-fi) of the mission area they are working in. They should be sure to make their work relate to the mission. They should look for challenging assignments to take on. They should look for assignments to lead and manage teams. They should participate in the broader T&E community via ITEA or explore other opportunities to learn, network, and share knowledge and expertise with the T&E, military, and technical community associated with the mission area they are working in.
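To make the design-of-experiments point concrete, here is a minimal sketch in Python of a full-factorial test matrix. The factors and levels are hypothetical illustrations, not drawn from any actual Army test plan, and a real design would also weigh randomization, replication, and which interactions matter.

    # Illustrative sketch only: hypothetical factors and levels for a test matrix.
    from itertools import product

    factors = {
        "temperature": ["cold", "ambient", "hot"],
        "terrain": ["paved", "cross-country"],
        "crew_experience": ["novice", "experienced"],
    }

    # Each run is one combination of factor levels (3 x 2 x 2 = 12 runs here);
    # a real test design would also address randomization and replication.
    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]

    for i, run in enumerate(runs, start=1):
        print(f"Run {i:2d}: {run}")

A layout like this makes explicit which combinations of conditions get tested, so that analysis can attribute variation in the results to specific factors rather than to chance.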

Q: Do you have any closing thoughts?

A: I would like to share two more experiences on a personal level. In 2001, ATEC headquarters was in Alexandria, VA, on Ford Road just west of Interstate 395, maybe two miles from the Pentagon. On 9/11, the ATEC staff was meeting in our conference room to work out a plan to locate ATEC employees and account for their status. As we were watching live TV feeds from New York, a plane flew below our 10th-floor window heading toward the Pentagon. As we went to the windows, we heard a thump and then saw smoke rising. Yikes! I was supposed to be at a meeting in the Pentagon that morning but had skipped it because of the ATEC staff meeting.

Finally, as an Army GO/SES in the DC area, I was on the special event assignment list. One 4th of July I was tasked to represent the Army at a dignified return of remains at Dover Air Force Base. The ATEC honor guard and I flew on an Army helicopter to Dover, where we marched the coffin off the plane into the reception area and I presented the American flag to the family of the deceased. Few things could do more to reinforce the urgency and importance of our work.

As a final note, I would like to offer my thanks: my career was made possible by the great people I had the opportunity to work with. In my opinion, T&E people are dedicated, hardworking, highly intelligent, and a lot of fun. I don’t have the space or the memory to name them all, but if we crossed paths working in T&E or analysis since I started back in 1974, you have my sincere thanks. Hooah!

END NOTE: I shared a draft of this interview with several old co-workers who have become senior leaders in Army T&E and analysis. I was pleased to learn that ATEC has established an initial Live, Virtual, and Constructive M&S capability that runs in a distributed manner across ATEC and other sites. I think this is a great advancement, and I congratulate ATEC on this accomplishment.

 

Biographies

James J. Streilein, Ph.D. Jim received a BS in mathematics from Carnegie Mellon University and a PhD from Pennsylvania State University. He started working in 1974 in Army analysis, test, and evaluation at the Army Materiel Systems Analysis Activity (AMSAA) as a federal civil servant. He was promoted to a Senior Executive Service (SES) position at AMSAA in 1991. He worked as a senior leader in the Army Operational Test and Evaluation Command and finished his career of nearly 40 years in the Office of the Director, Operational Test and Evaluation in the Office of the Secretary of Defense as the Deputy Director for Netcentric, Space and Missile Defense Systems, a 3-star protocol position. During his career he was awarded two Presidential Meritorious SES Rank Awards and two Meritorious and two Exceptional Civilian Service Awards. He was recognized by ITEA with the Mathews Award and by the National Defense Industrial Association with the Hollis Award, both for career excellence in test and evaluation. He was elected as a member of the Army Operations Research/Systems Analysis Hall of Fame. He has been recognized as an Outstanding Eagle Scout alumnus by the National Eagle Scout Association.

J. Michael Barton, Ph.D., Parsons Fellow, has worked at Aberdeen Proving Ground since 2001, spending the first 10 years supporting the US Army Developmental Test Command and later the Army Test and Evaluation Command. He joined the Army Research Laboratory Computational and Information Sciences Directorate in April 2015, working in large-scale data analytics, high-performance computing, and outreach to test and evaluation and other ARL stakeholders. Dr. Barton’s entire career has been in physics-based modeling and simulation. He spent 6 years as a consultant in the aerospace industry; 12 years as a contractor supporting the Air Force at the Arnold Engineering Development Center in Tennessee and the National Aeronautics and Space Administration Glenn Research Center in Ohio; and the first 4 years of his career with The Boeing Company in Seattle. He has worked for Parsons Corporation for the past 8 years. He received Bachelor of Science and Ph.D. degrees in engineering science and mechanics from the University of Tennessee-Knoxville and a Master of Engineering degree in aeronautics and astronautics from the University of Washington.
