2013 ITEA Journal Issues

December 2013 – T&E in the Global Marketplace

The theme for this issue is T&E in the Global Marketplace, and the issue is packed with two Guest Editorials, an interview with a key newly appointed T&E leader, Inside the Beltway, a Hot Wash-Up feature, two Historical Perspectives features, and nine technical articles that strongly adhere to the theme. The theme was selected because the test and evaluation (T&E) community recognizes that the economy is global, with a degree of connectedness that has never existed before. Ownership of companies, operating locations, teaming and partnerships for building products and systems, markets, test and evaluation, and professional associations are fluid and worldwide. These inter-relationships also present challenges, including multiple time zones and differences in culture, language, units of measure, legal systems, security, and protection of intellectual property. T&E in the Global Marketplace validates the ‘‘I’’ in ITEA.

Our two Guest Editorials discuss Congressional initiatives related to test ranges and multinational T&E. In their article, ‘‘Congressional Range and Testing Center Caucus,’’ United States House of Representatives Caucus Co-Chairs the Honorable Diane Black from the Sixth Congressional District of Tennessee and the Honorable Derek Kilmer from the Sixth Congressional District of Washington highlight Caucus support for the Major Range and Test Facility Base. The mission of the Caucus is to educate Members of Congress on the strategic value of the ranges, the supporting test centers, and the thousands of personnel who operate them.

For our second Guest Editorial, ‘‘Multinational Test and Evaluation: T&E of the Future,’’ Mr. Dave Duma, Principal Deputy Director, Operational Test & Evaluation, Office of the Secretary of Defense, discusses agreements that collectively have become known as the International Test and Evaluation Program (TEP). Mr. Duma describes the three primary provisions of the TEP that, through sharing of T&E assets and cooperative testing, promote efficiencies on a bilateral basis. He then proposes Multinational TEP (MTEP) agreements that would allow T&E collaboration similar to coalition operations; accelerating multinational T&E agreements will enable greater efficiencies and testing in a true coalition environment.

For our special Interview feature, Dr. C. David Brown, Deputy Assistant Secretary of Defense, Developmental Test & Evaluation, and Director, Test Resource Management Center (TRMC), graciously allowed ITEA to conduct this interview shortly after his appointment to this position on August 29, 2013. He discusses his background, priorities in office, and visions for developmental T&E and the workforce.

In our Inside the Beltway feature, ‘‘An Update on the Joint Mission Environment Test Capability (JMETC) and the Distributed Testing World,’’ Mr. Bernard ‘‘Chip’’ Ferguson, Deputy Director, Interoperability and Cyber Test Capability, TRMC; Director, National Cyber Range; and Program Manager, JMETC Program, discusses the status and future of several programs. Mr. Ferguson covers the establishment of JMETC under TRMC, JMETC capabilities and connectivity, the National Cyber Range (NCR) as a key part of the cyber T&E infrastructure, and maturing NCR capabilities.

In his Hot Wash-Up feature, ‘‘Why we should stop worrying so much, and embrace international collaboration,’’ Dr. Wilson Felder, Distinguished Service Professor, School of Systems and Enterprises, Stevens Institute of Technology, discusses his thoughts on international verification and validation of software. Dr. Felder highlights the advantages of globally seamless technologies, and he tackles the risks commonly expressed when projects are shared across companies, country boundaries, and continents.

In our first Historical Perspective feature, ‘‘The Dummies Did It….and We’re All Safer Today,’’ Dr. Jim Welshans, our ITEA Historian, provides an interesting and meaningful narrative on the entities that represent us in dangerous tests. Dr. Welshans highlights how crash test dummies helped in various tests, evolved to mimic human physical variations, were awarded with international standards, and instrumented to better capture force profiles in crashes.

In our second Historical Perspective feature, ‘‘Realtime Telemetry Processing System Celebrates 40 Years of Operations,’’ Ms. Theresa Hopkins, Atlantic Test Ranges Business Communications Group Team Lead, looks at the evolution and use of this system. The Real-time Telemetry Processing System (RTPS) helped flight test engineers monitor an aircraft under test in ways they never could before, and the system has evolved to RTPS IV, now in its tenth year. Ms. Hopkins discusses the uses and users of the RTPS and then highlights the improvements being undertaken to improve service in its fifth decade of use.

In our first of nine technical articles, ‘‘Perspectives from a Small Corner of the Globe: T&E within the Royal New Zealand Air Force,’’ Wing Commander Daniel ‘‘DJ’’ Hunt, Director of Systems Evaluation for the Royal New Zealand Air Force (RNZAF), highlights the fact that the RNZAF is over halfway through replacing or upgrading the fleet. The scale of the overlapping replacements or improvements for 48 different airframes presents challenges in terms of training and testing concentrated over a relatively short period of time. The author presents the challenges already overcome and those that lie ahead.

In ‘‘Applying High Throughput Testing of Software,’’ Amir Sidek, Custommedia Sdn Bhd (Malaysia), et al., present a software test design called ‘‘all pairs,’’ which evaluates the strengths of main effects and interactions. The nature of this design type is such that the size of the test array is minimized while the amount of information gained about the main parameters and their interactions is maximized. Often for this type of high throughput testing, the test design is orthogonal or nearly orthogonal; when the design is orthogonal, the contribution of each tested parameter can be judged independently of the other parameters. The article includes an example of testing an Online Web Portal.
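The all-pairs idea can be sketched in a few lines. The greedy routine below is a simplified illustration, not the optimized covering-array construction a dedicated tool (or the authors' design) would use, and the factor names and levels are invented for the example:

```python
from itertools import combinations, product

def all_pairs(factors):
    """Greedily keep test cases until every pair of levels, for every
    pair of parameters, appears in at least one kept case.
    `factors` maps parameter name -> list of levels."""
    names = list(factors)
    # Every (param_i, level, param_j, level) pair that must be covered.
    uncovered = {
        (i, a, j, b)
        for i, j in combinations(range(len(names)), 2)
        for a in factors[names[i]]
        for b in factors[names[j]]
    }
    tests = []
    for case in product(*factors.values()):
        newly = {
            (i, case[i], j, case[j])
            for i, j in combinations(range(len(names)), 2)
        } & uncovered
        if newly:  # keep the case only if it adds coverage
            tests.append(dict(zip(names, case)))
            uncovered -= newly
        if not uncovered:
            break
    return tests
```

For three two-level parameters, the routine returns fewer cases than the eight-run full factorial while still exercising every pairwise combination; dedicated tools get closer to the theoretical minimum.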

In ‘‘Statistically Defensible (and Practical) Test and Evaluation,’’ Todd Remund, Mathematical Statistician in the Statistics Department at Edwards Air Force Base, discusses the difference between confidence intervals and tolerance intervals. Mr. Remund highlights the strength of the tolerance intervals for defining requirements, and he concludes that, in many cases, tolerance intervals supply analyses that are more specific, informative, and useful as compared to confidence intervals.
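The distinction can be made concrete with a short sketch: a confidence interval bounds a population parameter (here, the mean), while a tolerance interval bounds a stated fraction of individual outcomes. The tolerance factor below uses Howe's approximation for a two-sided normal tolerance interval; the coverage and confidence levels are illustrative assumptions, not values from the article:

```python
import numpy as np
from scipy import stats

def mean_confidence_interval(x, conf=0.95):
    """Two-sided confidence interval on the population mean."""
    n, m, s = len(x), np.mean(x), np.std(x, ddof=1)
    t = stats.t.ppf((1 + conf) / 2, n - 1)
    half = t * s / np.sqrt(n)
    return m - half, m + half

def normal_tolerance_interval(x, coverage=0.90, conf=0.95):
    """Two-sided normal tolerance interval via Howe's k-factor:
    an interval claimed to contain `coverage` of the population
    with confidence `conf`."""
    n, m, s = len(x), np.mean(x), np.std(x, ddof=1)
    z = stats.norm.ppf((1 + coverage) / 2)
    chi2 = stats.chi2.ppf(1 - conf, n - 1)  # lower-tail quantile
    k = z * np.sqrt((n - 1) * (1 + 1 / n) / chi2)
    return m - k * s, m + k * s
```

Because it must capture individual observations rather than an average, the tolerance interval is always the wider of the two, which is what makes it the natural match for requirements stated about system performance on any given trial.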

In ‘‘Division-Scale Live/Virtual/Constructive Experiment Coupling Network Simulator 3 and One Semi-Automated Forces via Distributed Interactive Simulation,’’ Christopher Kosecki, Lockheed Martin, et al., discuss the unique use of simulation during the annual Network Modernization exercises. At these exercises, high fidelity network modeling augmented the live exercises. Network simulation and emulation allowed modeling of a virtual brigade that was integrated into the command and control network, which, in turn, linked it to live tactical assets. The authors also describe future efforts to improve the Live/Virtual/Constructive environment.

In ‘‘An Unambiguous Taxonomy for Verification Methods,’’ Dr. W. Clifton Baldwin, Senior Systems Engineer at the Federal Aviation Administration, discusses the scope and function of verification. The author discusses the possibility that some new systems engineers may have confusion about the methods to complete verification. Dr. Baldwin offers a matrix format to help reduce this confusion regarding application of differing methods for verification.

In ‘‘International Council on Systems Engineering Systems Engineering Handbook Version 3.2 Integration, Verification, and Validation via Design Structure Matrix Analysis of Context Diagrams Set,’’ Mr. Aditya Akundi and Mr. Eric Smith, both of the University of Texas at El Paso, discuss an analysis of the Systems Engineering Handbook in various ways. The authors evaluated the relationships between inputs (including controls and enablers) and outputs in the requirements analysis process via Design Structure Matrix methods. The authors’ analyses may have located some disconnects in the Systems Engineering Handbook that may indicate future necessary improvements.

In ‘‘Network Capability Assessment and Standardized Measures of Performance (MOP) Framework,’’ Mr. Michael Badger, Senior Network Engineer for the United States Army, et al., describe an analytic measurement framework for capturing the contribution of networked information systems to the execution of command and, in turn, to the operational effectiveness of units and the overall force. The framework establishes a level of consistency, reusability, traceability, and standardization of performance and effectiveness metrics. The authors state that the framework will promote cost avoidance by improving development of test objectives and by streamlining the instrumentation planning process.

In ‘‘Historical Snapshot and Value of Coalition Interoperability Assurance and Validation (CIAV) and Coalition Test and Evaluation Environment (CTE2),’’ Mr. Jeff Phipps, Chief of the Coalition Branch at the Joint Interoperability Test Command, et al., present the CIAV program that uses the CTE2. The CIAV program incorporates the flexibility of reflecting a United States Department of Defense management structure or an international Coalition management structure. The CIAV verifies the correct exchange of data by looking at the data flow from creation to presentation to the commander as a basis for action or decision. The CTE2 was developed to replicate any operational mission environment, and it includes participants across United States Service and Component laboratories and many Coalition partner laboratories. The authors state that CIAV supports rapid discovery, design, development, integration, information assurance, and interoperability of mission-based systems.

In ‘‘Automating the North American Electric Reliability Critical Infrastructure Protection (CIP) Compliance Test Lab to Meet CIP Standards,’’ Mr. Chuck Reynolds, Chief Technical Officer of Technical Systems Integrators Incorporated, and Mr. Alex Henthorn-Iwane, Vice President of Marketing for QualiSystems, stress the need for adequate testing to ensure strong cybersecurity and maintain critical compliance standards. The authors state that manual processes for testing may be producing flawed results and that these manual processes dominate current test laboratory infrastructure management and typical testing processes. The authors propose that automated testing software and a regimen of standardized, rigorous, repeatable, and documented test processes may better ensure protection of the electric utility infrastructures and better compliance with evolving national standards.

I hope you enjoy this last issue of 2013 for The ITEA Journal of Test and Evaluation, and I look forward to receiving your articles related to the next theme ‘‘How Much Testing is Enough?’’ for Issue 35-1. Subthemes would be related to knowing when to stop testing or how to right-size tests with the right sensors, data, and inputs for the evaluations. The deadline for manuscripts for Issue 35-1 is 1 December 2013, and we have posted all themes for 2014 and 2015. Please write early and often. And, I’m still waiting for the first article to be submitted to the Referees.


 

 September 2013 – Truth in Data

The theme for this issue is Truth in Data, and the issue is packed with a Guest Editorial, a Historical Perspectives feature, a new Hot Washup feature, Inside the Beltway, and nine technical articles that strongly or mildly adhere to the theme, depending on the reader’s taste. The theme was selected because the test-and-evaluation (T&E) community is moving to a more rigorous and scientific basis for setup and evaluation, with senior leadership in the Department of Defense championing rigor and objectivity in statistically defensible T&E. Many organizations have embraced statistics-based methods, including design of experiments, for test planning and evaluation. A scientifically based T&E process is naturally rooted in systems engineering and takes an end-to-end view of testing as one way to evaluate the effectiveness of the engineering. Looking beyond rigor in T&E, there are commonalities with the operations-research/systems-analysis community, which uses many of the same fundamental approaches to evaluate mission performance against desired operational outcomes, but during real-world operations. Truth in Data seeks insights into, lessons learned from, and success stories of scientific rigor in T&E and the quest for more efficient and effective testing. It also investigates common ground with the operations-research/systems-analysis community.

Dr. Michael L. Cohen (senior program officer for the Committee on National Statistics), in his Guest Editorial titled ‘‘Towards Re-Creating a Statistical-T&E Synergy,’’ provides his views on the current degree of collaboration between statisticians and the defense T&E community. Dr. Cohen traces this relationship back to the World War II era and identifies time periods when the relationship was strongest and, conversely, when the relationship was much weaker. Paying special attention to operational T&E, Dr. Cohen relates that these types of tests are marked with challenges that include small sample sizes, difficulty in estimating the reliability of systems of systems, and differences in test environments from developmental tests. Dr. Cohen then ends his editorial with several reasons for optimism that the statistical-T&E synergy is growing stronger now.

In Historical Perspectives, Dr. James S. Welshans (our ITEA historian) looks at ‘‘Truths, Torments, and Togas’’ to explore what truth in data really means. Dr. Welshans points out that great thinkers over the years have shared notions of truth. He first explores the notions of essential, fundamental, and unknowable truth. He then looks at the middle ground of socially constructed truth and at the context of this issue’s theme in terms of this theory on truth. He concludes that truth in data is constructed from the interactions between research and test personnel and the phenomena under study.

In a new feature, Hot Washup, we have the pleasure of hearing from former T&E executives who reflect on trends in current T&E. Our first featured article in this vein, ‘‘Data-Driven T&E: An Essential Element in Verification and Validation of Complex Systems,’’ is from Dr. Wilson Felder, the former director of research, development, test, and evaluation for the FAA and now a Distinguished Service Professor in the School of Systems and Enterprises at the Stevens Institute of Technology. Dr. Felder looks at three daunting challenges that face T&E: turning it into a profession, establishing a theoretical basis for verification and validation in particular, and developing a new theory of test that is more effective at addressing the evolving nature of complex distributed systems. He states that we have made excellent progress on the first challenge (T&E as a profession) but are much further behind on the second and third challenges. He concludes that developing a coherent and successful approach to testing complex systems will be based on data, the way we process the data, and the theoretical framework the data allow us to validate.

In the Inside the Beltway feature for this issue, Dr. J. Michael Gilmore (Director of Operational T&E in the Department of Defense) states that he has pushed for a more rigorous approach to T&E since assuming his position as the director in 2009. He addresses his advocacy for the use of design of experiments as a rigorous approach to test planning and analysis of test results. Dr. Gilmore then summarizes the major milestones and developments that have supported and instituted increasing rigor in Department of Defense T&E over the past few years. He ends the article with his bottom line for the necessity of scientifically defensible approaches for adequate testing.

Dr. Gilmore’s feature sets the stage for our nine technical articles. In ‘‘Answering the Question of Likelihood,’’ Mr. Charles Ricke III (senior military analyst for AVW Technologies Incorporated) presents an objective method for risk assessments to determine the likelihood of a consequence and portray it in a risk-reporting matrix. He uses a case-study method to demonstrate the applied method that illustrates a way to generate a likelihood probability-distribution function for risk-reporting matrices. Mr. Ricke also discusses the capabilities and limitations of the applied method, including the benefit that the method permits scalability for use at any resolution level.

In ‘‘Performance Evaluation of Intention Recognition in Human/Robot Collaborative Environments,’’ Mr. Craig Schlenoff (group leader, Cognition and Collaboration Systems Group, Intelligent Systems Division, National Institute of Standards and Technology) et al. propose a novel approach to intention recognition in robots based on the recognition of states in the environment. Mr. Schlenoff and his coauthor team propose a set of metrics to be used to assess the performance of this intention-recognition system. The article also includes use of a modeling-and-simulation environment in which to apply this method. Progress in this area of intention recognition could be significant in providing robots with some improvement in understanding of their environment and some improvement in projection or inference of future consequences of current actions. The ultimate goal is to reduce the physical separation, currently imposed for safety, between robots and the humans working alongside them in operational missions.

In ‘‘Truth in Testing Requires an Effective Team of Teams,’’ Mr. Will Manning (operations-research systems analyst, aviation test director, Operational Test Command, United States Army) provides several suggestions to help T&E teams tackle the most complex problems. One of his key suggestions is for acquisition stakeholders and T&E teams to work together to provide the necessary inputs for test execution and to construct the data-collection methods to efficiently maximize the collection of data. Mr. Manning states that the most complex tests often require an iterative problem-solving approach, with data collected over many test events. These interdependent events depend more heavily on clearly stated objectives, outputs, and inputs to allow discovery of truth in data rather than mere storage of the data gathered.

In ‘‘Understanding the Transition from National Airspace System (NAS) to Next Generation Air Transportation System (NextGen): Radar to Automatic Dependent Surveillance-Broadcast (ADS-B),’’ authors Terence McClain (mathematician and software engineer, Federal Aviation Administration) and W. Clifton Baldwin (senior system engineer, Federal Aviation Administration) discuss how use of the systems-of-systems approach was extremely advantageous for system definition and evaluation of the transition to NextGen. Use of an agent-based modeling-and-simulation environment allowed demonstration of radar-separation rules with differences between aircraft controlled by ADS-B and radar. Mr. McClain and Mr. Baldwin note that the simulation environment indicated significant fuel savings when ADS-B is used for navigation.

In ‘‘A Systems Engineering Process for Modeling & Simulation,’’ Mr. Ji’on Crump (computer specialist, Federal Aviation Administration) et al. make the case that each of the components found within the modeling-and-simulation (M&S) domain is easily paralleled to the technical processes classified in the traditional systems-engineering model. The authors present a comparison of the M&S life-cycle processes to the systems-engineering life-cycle processes (in the classic V diagram), and they conclude that the two processes are clearly parallel and interdependent. One side effect of management of M&S in the standard systems-engineering process would be stronger support for evaluating the validity of simulations, and this linking of the processes portends application of other systems-engineering processes in support of M&S.

In ‘‘Methods for Estimating the Upper 90th Confidence Limit on the 90th Percentile of Target Location Errors in Munitions Testing,’’ Mr. Grant M. Olsen (Raytheon Missile Systems) et al. discuss methods of portraying and augmenting the data to characterize target-miss error and of calculating the upper 90th confidence limit on the 90th percentile of target-location errors (denoted TLE90C). The authors highlight the various ways to determine the most likely probability distribution of the data and to augment the data if more data are likely necessary. They use a case-study approach and supporting simulation environment in order to compare four methods of computing TLE90C, using three commonly used probability distributions and one test distribution to see how the four methods perform when an inappropriate distribution is used.
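A nonparametric version of the TLE90C computation can be sketched with a bootstrap. This is one plausible method, not necessarily among the four the authors compare (parametric approaches would fit a named distribution instead), and the sample data in the usage note are invented:

```python
import numpy as np

def tle90c(errors, n_boot=5000, percentile=90.0, conf=90.0, seed=1):
    """Upper `conf`% confidence limit on the `percentile`th percentile
    of target-location errors, via the nonparametric bootstrap."""
    rng = np.random.default_rng(seed)
    errors = np.asarray(errors, dtype=float)
    boot = np.empty(n_boot)
    for i in range(n_boot):
        # Resample the observed errors with replacement and record the
        # sample percentile of each resample.
        resample = rng.choice(errors, size=errors.size, replace=True)
        boot[i] = np.percentile(resample, percentile)
    # The conf-th percentile of the bootstrap distribution is the
    # one-sided upper confidence limit.
    return np.percentile(boot, conf)
```

Applied to, say, 200 simulated miss distances, the returned limit sits above the sample 90th percentile, quantifying how much worse the true 90th-percentile error could plausibly be given the sampling variability.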

In ‘‘Surveys in Test & Evaluation,’’ Dr. Rebecca A. Grier (Human Systems Integration, research staff member, Institute for Defense Analyses) describes the best practices for incorporating surveys into operational test and evaluation (OT&E). Dr. Grier states that the full benefit of surveys is commonly not leveraged now in OT&E, and she pays special attention to survey design—importantly, defensible surveys. A survey is designed, in some part, by drafting survey items, selecting response options, and formatting the survey instruments, all of which Dr. Grier discusses. Good surveys are developed through understanding the goals of the survey, knowing the common mistakes made by analysts and survey respondents, and designing the right survey instrument to glean the most needed information from the OT&E events.

In ‘‘Addressing Big Data Challenges in Test and Evaluation,’’ Dr. Udaya Ranawake (computational scientist, High Performance Technologies Group, Dynamics Research Group) discusses potential solutions to the issue of massive increases in the amount of test data generated in Department of Defense T&E events, particularly how best to organize, store, selectively retrieve, and analyze large volumes of test data. Dr. Ranawake highlights that the choice of database technique and computer system used for the database analyses is very important, and these choices can be made after careful analysis of the requirements for speed, reliability, security, and ease of use. Dr. Ranawake evaluates the techniques and provides information on applicability, benefits, and drawbacks.

In ‘‘ProDV: A Case Study in Delivering Visual Analytics,’’ Dr. Derek Overby (research associate, Computing and Information Technology Division, Texas Center for Applied Technology, Texas A&M University) and his coauthors discuss and illustrate custom visual analytics. They describe a visual programming environment for configuring various interactive visualization and analysis capabilities within a dynamic environment. This article illustrates ways that tabular data can be visually arrayed to show the hidden information in the data set—the hidden relationships, interactions, and optimum values that can be more readily located and understood through visual analytics. The article also discusses lessons learned about the system after use in training sessions and development of user performance metrics to evaluate the success of the applications used.

I hope you enjoy this issue of the ITEA Journal of Test and Evaluation, and I look forward to the next issue, with the theme ‘‘T&E in the Global Marketplace.’’


 

June 2013 – The Changing Face of Developmental T&E

The theme for this issue is The Changing Face of Developmental Test and Evaluation, and we selected articles and features that touch on many diverse aspects of developmental test and evaluation (T&E). For many readers, developmental T&E may be thought of as predominantly defense-related, but developmental T&E includes tests, analyses, evaluations, inspections, and demonstrations of capability in laboratories, manufacturing plants, inspection sites, and component assembly areas for most of the products and systems we purchase as consumers. But, in many of these cases, the activities may be conducted under a different name than developmental T&E.

Within the Department of Defense (DoD), big changes have taken place in developmental T&E. The Office of the Deputy Assistant Secretary of Defense (DASD), Developmental Test and Evaluation (DT&E), was established effective June 23, 2009, ‘‘…to serve as the principal advisor to the Office of the Secretary of Defense and the Under Secretary of Defense for Acquisition, Technology and Logistics in matters relating to DT&E in defense acquisition programs.’’ The Air Force has restructured its developmental test centers into the Air Force Test Center. The Army Test and Evaluation Command sought efficiencies by reorganizing and incorporating the functions of its Developmental Test Command into higher headquarters. The United States Marine Corps has established a developmental testing organization. After three years of change in DoD developmental T&E, this issue of the ITEA Journal takes a current look at the status of developmental T&E, subsequent changes since the reorganizations, and suggested adjustments in developmental T&E processes. Topics such as the impact of the changes on the Military Services, their contractors, and industry; integrated testing; system of systems testing; joint and developmental evaluation; and new organizational conflict of interest policy were sought for this issue.

In his Guest Editorial, Dr. Steven Hutchison (Acting Deputy Assistant Secretary of Defense, Developmental Test and Evaluation, Director, Test Resource Management Center, OSD) states that much of what the DoD test and evaluation community does to support decision makers is late to need. He proposes correcting the trend of pushing critical test activities to the right on the timeline by, instead, making a concerted effort to Shift Left! His bottom line includes getting development right, verifying development through rigorous developmental T&E before DoD commits to production, and thus, delivering improved capability faster to the warfighter at reduced cost. Dr. Hutchison also discusses new developments in Interoperability, Cybersecurity, and Cybersecurity DT&E Methodology.

In Historical Perspectives, Dr. Jim Welshans, our ITEA Publications Committee Historian, describes his interview with Dr. Dwight Polk, who recently retired after over four decades working in the T&E field in uniform and as a civilian for the United States Air Force. Dr. Polk entered the Air Force as an Operations Research Analyst and was able to employ modeling and simulation to analyze the effectiveness of the new threat ZSU-23/4 and other anti-aircraft artillery. He also worked on the Red Baron project, joint exercise evaluation, CBU-89, and electronic warfare systems evaluations. Dr. Polk also gives his thoughts on how the amount of data generated from tests has increased, yet the current research designs do not seem any better than those of many years ago. He ends his interview with advice to anyone starting out in the test business.

Our selection of technical articles covers topics from modeling and simulation, integrated processes, cluster testing, cyber testing, and analysis of explosive-related materials to test optimization, DT&E strategy development, the Air Force Test Center, and scientific test and analysis. In their article ‘‘Enabling Modeling and Simulation in Support of Test and Evaluation,’’ Stephanie Brown Reitmeier (United States Army AMRDEC, Redstone Arsenal, Alabama) and Robbin Finley (United States Army PEO STRI-PM ITTSTMO, Redstone Arsenal, Alabama) discuss how modeling and simulation plays a significant role in development testing. They also describe how simulations today combine tactical flight software, high fidelity seeker models, multispectral scene imagery, representative target models, and 6-degree-of-freedom missile models to answer critical questions about systems in development.

Then, Arekhandia Patrick Eigbe (Federal Aviation Administration, Atlantic City, New Jersey), in his article titled ‘‘Adopting Integrated Processes to Enhance Test and Evaluation Program Management Communications,’’ highlights that less than optimal communications between T&E functions and Program Management (PM) are a real concern. Using a case study research approach, the author conceptualizes and demonstrates a T&E-PM model, extending the existing research to provide a new framework that provides a systems approach to T&E program management.

In ‘‘Cluster Testing: Reducing Air Weapon Test Execution Cycle Time,’’ authors George Axiotis (Department of Defense, Arlington, Virginia) and Steve Riker (Foxhole Technology Corporation, Fairfax, Virginia) propose a departure from the traditional serial shoot-analyze-shoot approach for air weapon test programs. The authors suggest an approach that slowly increases complexity as a way to reduce overall test timelines without significantly losing test point knowledge. The approach aims to address program risk early and to resolve issues prior to operational test.

For the next article, Brian DeMuth (ManTech International Corporation, Technical Services Group, Stafford, Virginia) and Joel Scharlat (ManTech International Corporation, Technical Services Group, Lexington Park, Maryland) in ‘‘Modeling and Simulation of Cyber Effects in a Degraded Environment’’ discuss a demonstration of quantifying cyber effects within an operational scenario that included maritime, space, and cyber domains. The authors present a high-level review of how a cyber range can simulate a cyber attack in a representative operational environment and show how a degraded network environment can affect sensor networks with operational consequences. They also introduce other potential uses for cyber range environments.

The next article is unique for the ITEA Journal in both the level of technical detail and the topic of analysis of explosive materials. In ‘‘X-Ray Fluorescence Spectroscopy for Analysis of Explosive-Related Materials and Unknowns,’’ Erica R. Valdes, Ph.D., and James L. Valdes, Ph.D. (both from the Department of the Army, RDECOM, Edgewood Chemical Biological Center, Aberdeen Proving Ground, Maryland) discuss development of expanded practices to address threats that are not well-defined, to use instrumentation that is not limited to rapid data-library search and match algorithms, and to analyze solid materials.

Specifically, the authors evaluate wavelength dispersive X-ray fluorescence spectroscopy (WDXRF), an optical approach with some limitations on the materials it can accurately analyze. WDXRF was previously thought to require smooth, flat, homogeneous samples to guarantee accuracy and precision. In the field of detection of explosives, the world is far from delivering these sorts of ideal samples, and the authors test WDXRF against various inhomogeneous samples to develop best-practices approaches for field situations.

In ‘‘Systems-Test Optimization Using Monte Carlo Simulation,’’ Eileen A. Bjorkman, Ph.D. (Air Force Test Center, Edwards Air Force Base, California), et al., describe a general method to develop optimal system T&E strategies. The authors use Monte Carlo simulation to evaluate different T&E approaches through comparison of the amount of residual uncertainty provided by each approach. In the analysis set-up, they establish specific test objectives and postulated test data sets. Then, Monte Carlo simulation generates a residual uncertainty at the end of each proposed test, and the level of uncertainty is used as a measure of test approach value.
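As an illustration of the idea, the sketch below scores a candidate test approach by the residual uncertainty it leaves, estimated as the spread of outcomes across many simulated executions of that approach. The binary-outcome model, the assumed true success probability, and the trial counts are all invented for the example and are far simpler than the postulated data sets the article describes:

```python
import numpy as np

def residual_uncertainty(n_trials, true_p=0.8, n_sims=2000, seed=0):
    """Monte Carlo estimate of residual uncertainty: simulate many
    hypothetical tests of `n_trials` pass/fail outcomes drawn from an
    assumed true success probability, and report the spread of the
    resulting estimates of that probability."""
    rng = np.random.default_rng(seed)
    successes = rng.binomial(n_trials, true_p, size=n_sims)
    estimates = successes / n_trials
    return estimates.std()  # spread of outcomes = residual uncertainty
```

Comparing, say, a 10-trial approach against a 50-trial approach shows the larger test leaving markedly less residual uncertainty, so each candidate can be weighed by this spread against its cost in test resources.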

In ‘‘DT&E Strategy Development: Start with the ‘E’; Then Build the ‘T’,’’ Darlene Mosser-Kerner (Office of DASD DT&E), et al., postulate that thoughtfully executed T&E can save taxpayer dollars through early discovery and remediation of performance deficiencies. For this reason, the authors recommend that the T&E team first define an evaluation framework, then build a test program to generate the data for subsequent evaluation, and finally, evaluate the data to gain the knowledge for informed decisions. To gather the information that is needed, DT&E planning would best begin with activities that define how the system will be evaluated and how information for decisions will be generated from test data.

In the next article of this issue, ‘‘When You Think of Developmental Test, Think of the Air Force Test Center,’’ Darcy S. Painter (Edwards Air Force Base, California) discusses the United States Air Force reorganization of Air Force Materiel Command that designated the Air Force Test Center (AFTC) to include all Air Force developmental test centers. Ms. Painter also interviews Brigadier General Arnold W. Bunch, Jr., the first commander of AFTC. General Bunch describes his vision for AFTC, its mission statement, and the challenges of DT&E for cyber systems.

The final technical article serves to tie together the topics in this issue and provides a bridge to the September issue. In ‘‘Scientific Test and Analysis Techniques (STAT) in Test and Evaluation (T&E) Center of Excellence (CoE): Designing and Implementing Statistical Rigor into Test and Evaluation for the Department of Defense (DoD),’’ Darryl K. Ahner (Scientific Test and Analysis Techniques in Test and Evaluation Center of Excellence, Air Force Institute of Technology, Wright-Patterson AFB, Ohio) highlights the activation of the STAT-T&E-CoE. The goal of this center is to assist acquisition programs in developing improved, efficient, and scientifically rigorous and defensible test strategies, designs, and plans for both developmental and operational defense testing. The STAT-T&E-CoE will be one means to help programs in leveraging mathematical and statistical tools and processes for these purposes.

We hope you enjoy this issue of the ITEA Journal, and your feedback is welcome. We are busily gathering articles by 1 June for our September 2013 issue with the theme ‘‘Truth in Data’’ where we will cover moving to a more rigorous and scientific basis of evaluation. Please feel free to submit an article or solicit others to submit an appropriate article for any upcoming issues. Our December theme will be ‘‘T&E in the Global Marketplace’’ with many articles already solicited.


 


March 2013 – Cultivating the T&E Workforce

Test and evaluation are professions, not academic disciplines, and as such we cannot merely recruit more T&E professionals as needed. We recruit engineers, physicists, computer scientists, mathematicians, chemists, and other degreed professionals and train them in test and evaluation. As technology changes and systems and instrumentation become more complex, test and evaluation (T&E) professionals need to continue formal education as well as improve T&E expertise. In addition, we need to consistently attract young people to the disciplines of science, technology, engineering, and mathematics. Cultivating the T&E workforce requires asking: what should the T&E professional’s background consist of today, and what should it be tomorrow? We need to prepare the future workforce for T&E, and prepare T&E for them. This issue features articles from students in high school and our service academies, from faculty, from T&E leaders, and from our T&E professional society on certifying the T&E workforce.

Dr. Robert McGrath, Director of the Georgia Tech Research Institute, describes workforce challenges from an academic perspective in his Guest Editorial: encourage the current outstanding T&E force to stay as long as possible, recruit adequately prepared new hires from other parts of the government and industry, and hire new STEM graduates in the right academic mix to support T&E of today and the future. Dr. Catherine Warner, Science Advisor to the Director, Operational Test and Evaluation, uses Inside the Beltway to encourage early and continuous engagement between the testing and requirements communities to better craft requirements that are mission oriented, realistic, and testable. In Historical Perspectives, Arnold Air Force Base Historian David Hiebert traces the 60-year history of what was originally the Air Engineering Development Center, one of the country’s premier developmental testing complexes.

The articles begin with a special student section arising from the ITEA Annual Symposium Academia Day paper competition, held in Huntington Beach, California. First place winner Bradley Matheus, a senior at the California Academy of Mathematics and Science in Carson, California, outlines the benefits of numerically controlled machines and 3-D printing for rapid prototyping. Second place winner Zaki Molvi, a senior at Troy High School in Fullerton, California, provides a future professional’s perspective on increasing the number of graduating scientists and engineers through a combination of STEM education, specialized courses, and hands-on training during the undergraduate years. Finally, third place winner Sara Pak, a senior at Diamond Bar High School in Diamond Bar, California, reinforces the necessity of test and evaluation for future product development, to assure consumers that new products are safe and suitable.

Robert Arnold, Chief Technologist of the 96th Test Wing, Eglin Air Force Base, Dr. Eileen Bjorkman, Chief Technologist of the 412th Test Wing, Edwards Air Force Base, and Dr. Edward Kraft, Chief Technologist of the Arnold Engineering Development Complex, Arnold Air Force Base, present the Air Force Test Center’s new human capital strategy which emphasizes investment in technical competence on par with test infrastructure improvements and sustainment. James Gaidry, ITEA Executive Director, rolls out the new ITEA Test and Evaluation Certification Credential and reviews how it came about and its benefits. Dr. Michael Kendra of the Air Force Office of Scientific Research explains workforce development in two components, bringing new science and engineering personnel into T&E, and advancing the professional development of the existing T&E workforce through training, coursework, certification, and advanced degree awards.

Tony Stout et al., speaking for the Joint Interoperability Test Command, encourage shifting focus from managing people and projects to measuring the gap between current skills and future needs and using this knowledge to drive training, hiring, work allocation, contracting, and human resource management decisions. Thomas Simms et al. describe the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation roadmap for workforce development. Dr. Raymond Hill, Professor of Operations Research at the Air Force Institute of Technology, defends specialty training, for example in statistics and design of experiments, as a supplement to formal undergraduate and graduate education, not a substitute for it.

Second Lieutenant Luke Grant presents his senior engineering project from the United States Military Academy, where he used computational fluid dynamics and high-performance computing to study the physics of fluid flow over gas turbine engine blades. Dr. Mehdi Ghoreyshi et al. from the United States Air Force Academy compare reduced-order modeling with full field-equation modeling of fighter aerodynamics and verify its accuracy while improving computational speed by several orders of magnitude. Daniel Carlson and Erich Brownlow of the 412th Electronic Warfare Group, Edwards Air Force Base, apply Bayesian techniques to combine simulation data with limited flight test data to gain a better understanding of aircraft performance. The issue concludes with an article by Dr. Paul Fortier, et al. of the University of Massachusetts-Dartmouth, who offer microsystems for test and evaluation applications requiring reduced power, weight, and space, and for new capabilities that could not exist otherwise.
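The Bayesian combination of simulation and flight-test data mentioned above can be illustrated with a standard conjugate normal-normal update; this sketch is not the authors’ actual method, and the performance metric, variances, and measurements below are hypothetical.

```python
def normal_update(prior_mean, prior_var, data, data_var):
    """Conjugate normal-normal Bayesian update.

    Treats a simulation-based prediction as a Gaussian prior and flight-test
    measurements (with known measurement variance) as the likelihood,
    returning the posterior mean and variance of the quantity of interest.
    """
    n = len(data)
    xbar = sum(data) / n
    # Precisions (inverse variances) add; the posterior mean is a
    # precision-weighted blend of the prior mean and the sample mean.
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / data_var)
    return post_mean, post_var

# Simulation predicts a performance metric of 100 with variance 25;
# three flight-test points (measurement variance 9) refine that estimate.
mean, var = normal_update(100.0, 25.0, [104.0, 106.0, 105.0], data_var=9.0)
```

The posterior lands between the simulation prediction and the flight-test average, with a smaller variance than either source alone, which is the essential payoff of combining limited flight data with simulation.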

Finally, the Publications Committee thanks one of its own, Dr. Steven ‘‘Flash’’ Gordon, who adopted this issue and took a leadership role in defining, shaping and populating it. Thank you, Flash, excellent job.