2017 ITEA Journal Issues

December 2017 – T&E for Enhanced Security

The theme for this issue is “T&E for Enhanced Security,” and the issue includes a Guest Editorial, an Inside the Beltway feature, the President’s Corner, and eight technical articles.

Our Guest Editorial, written by Steve Hutchison, Ph.D., is “Improving Operational Test and Evaluation Outcomes.” In this feature, he attacks the familiar problem of a low overall rate of favorable operational test and evaluation outcomes, this time from a new direction. Dr. Hutchison presents a watch list of factors affecting operational test and evaluation outcomes, and he concludes with a key suggestion for improving the rate of favorable outcomes. We thank Dr. Hutchison for helping us gather many of the articles for this issue.

Our Inside the Beltway article, “Twelve Years of Innovation in Complex System V&V at the FAA,” by Jaime Figueroa, summarizes the most important issues discussed at the Federal Aviation Administration’s twelfth annual Verification and Validation Summit in 2017. Mr. Figueroa links these issues to the theme of the summit, “Achieving Complexity Consciousness,” and he concludes by discussing how closely they parallel the challenges facing the T&E community.

The President’s Corner, written by William T. Keegan, the new President of the ITEA Board of Directors, is his first installment of this feature. It will be the first of many columns in which he discusses upcoming events, news about ITEA, and key themes and articles in The ITEA Journal.

Our first of eight technical articles, “Test and Evaluation Challenges for Advanced Image-Based Security Technologies,” written by Christopher Smith, Ph.D., discusses test and evaluation of increasingly complex, high-performance screening systems for both conventional and homemade explosives detection. He explains that the performance of automatic target recognition algorithms surpasses the discrimination expertise of the best human inspectors, yet presents challenges for testing.

The second technical article in this issue, “TSA Systems Integration Facility (TSIF) Test Program,” written by Mike McCormick and Gregory Miller, states that the TSIF was established in 2007 to thoroughly test and evaluate products intended for live screening operations. The authors describe the facility, where new technologies from various vendors are tested in realistic operational settings, including a baggage handling system in the checked baggage laboratory.

For our next article, “Risk Assessment for Specific M&S Characteristics,” by James Elele, Ph.D., et al., the authors focus their risk assessment on specific characteristics of modeling and simulation (M&S) capability, accuracy, and usability. They demonstrate a method for deriving quantitative and qualitative estimates of risk, and their process can be applied throughout the life of a program that uses M&S.
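
The core idea, combining likelihood and consequence ratings for each M&S characteristic into a risk estimate, can be pictured with a minimal sketch. The ratings, scales, and bins below are hypothetical and are not taken from the authors’ method:

```python
# Illustrative sketch only -- not the authors' method. Combines assumed
# 1-5 likelihood and consequence ratings for three M&S characteristics
# into simple quantitative scores and qualitative bins.

RATINGS = {  # hypothetical assessor inputs: (likelihood, consequence), 1 = low, 5 = high
    "capability": (2, 4),
    "accuracy": (3, 5),
    "usability": (1, 2),
}

def risk_level(score: int) -> str:
    """Map a likelihood x consequence product (1-25) to a qualitative bin."""
    if score >= 15:
        return "high"
    if score >= 6:
        return "moderate"
    return "low"

for characteristic, (likelihood, consequence) in RATINGS.items():
    score = likelihood * consequence
    print(f"{characteristic:>10}: score={score:2d} ({risk_level(score)})")
```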

In our fourth technical article, “Testing and Evaluation of Passive Radiation Detection Equipment for Homeland Security,” David West, Ph.D., et al., review the types of equipment used in homeland security applications of passive radiation detection. The authors cover three unique challenges for this application, and they discuss a series of test and evaluation standards that help enable the test and evaluation process.

In the fifth technical article, “On Becoming a Board-Qualified Chief Development Tester,” Thomas Simms et al. discuss the qualifications and selection process for a Chief Development Tester (CDT), background of the developed requirement, role and professional development of the CDT, specific requirements and the board qualification process, and CDTs in designated Key Leadership Positions.

The sixth technical article, authored by Alejandro Hernandez, Ph.D., is titled “Guiding Construction of Better Test Designs by Modeling Random Latin Hypercube Correlation Values with the Gumbel Distribution.” The author explores ways to better use Design of Experiments for large-dimensional experiments. For these large problem sets, some adjustments must be made to keep the test affordable in terms of size, assets, and time. The author proposes a selection criterion that reduces the correlations between factors and helps testers select a nearly orthogonal random Latin hypercube.
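
The selection idea, generating many random Latin hypercubes and keeping the one whose worst pairwise column correlation is smallest, can be sketched in a few lines. This illustrates the general approach only; it does not reproduce the author’s Gumbel-distribution-based criterion, and the run and factor counts are arbitrary:

```python
# Illustrative sketch: pick a nearly orthogonal random Latin hypercube by
# minimizing the maximum absolute pairwise correlation between columns.
import numpy as np

def random_latin_hypercube(n_runs: int, n_factors: int, rng) -> np.ndarray:
    """Each column is an independent random permutation of the run levels."""
    return np.column_stack([rng.permutation(n_runs) for _ in range(n_factors)])

def max_abs_correlation(design: np.ndarray) -> float:
    """Largest off-diagonal |correlation| between any two factor columns."""
    corr = np.corrcoef(design, rowvar=False)
    off_diag = corr[~np.eye(corr.shape[0], dtype=bool)]
    return float(np.max(np.abs(off_diag)))

rng = np.random.default_rng(1)
candidates = [random_latin_hypercube(n_runs=33, n_factors=7, rng=rng) for _ in range(500)]
best = min(candidates, key=max_abs_correlation)
print(f"best max |correlation|: {max_abs_correlation(best):.3f}")
```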

The seventh technical article, “Hybrid LDPC-Coded Free-Space Transmission Systems” by Zhen Qu, Ph.D., et al., describes a method to improve system performance for optical or radio-frequency carriers transmitting data in free space. The authors show that by optimizing the orbital angular momentum or using a rate-less LDPC-coded modulation, system performance is improved. The method shows promise in improving performance even in the presence of atmospheric turbulence.

For the last technical article for this issue, “Rethinking Design of Experiments for Developmental Test and Evaluation,” the author, George Axiotis, states that application of Design of Experiments (DOE) in developmental test and evaluation (DT&E) needs improvement, and he proposes a revised DOE process to more accurately satisfy the needs of DT&E. His proposed process is a more pragmatic approach that will lower the complexity of the DOE designs and hopefully accelerate the DT&E process.

I hope you enjoy this fourth and final issue of 2017 for The ITEA Journal of Test and Evaluation. By the time you receive issue 38-4 in December, the March 2018 issue, 39-1, will already be in final preparation; its theme will be “Testing Using Facilities Around the World.” For the next issue (the second issue of 2018), 39-2, the deadline for submissions is just after March 1, 2018, and the theme will be “Unmanned and Autonomous Vehicle Testing” (June 2018). We have posted all themes and descriptions for 2018–2021 on the ITEA website. Please provide feedback on the choice of themes, and please write early and often.


September 2017 – Test and Evaluation of Cybersecurity and Readiness

The theme for this issue is “Test and Evaluation of Cybersecurity and Readiness,” and the issue includes two Guest Editorial features, an Inside the Beltway feature, a History feature, the President’s Corner, and nine technical articles.

Our first Guest Editorial, written by Derrick Hinton et al., is “Cyber Warriors: DoD’s Most Advanced Weapons.” This is an update of one of his previous editorials. In this feature, he lists the statutory and regulatory missions of the Test Resource Management Center, and then Mr. Hinton discusses the buildout of the National Cyber Range Complex to address increased demand. He states that the cyber workforce is a make-or-break factor in establishing, maintaining, and providing operationally realistic cyber events in the National Cyber Range infrastructure. He ends his feature by discussing Science, Technology, Engineering, and Mathematics initiatives and the ultimate goal: workforce excellence.

For our second Guest Editorial, Chip Ferguson, in “The Test Resource Management Center National Cyber Range Complex and the Cyber Test and Evaluation Infrastructure,” lists the five strategic goals and implementation objectives of the Department of Defense (DoD) Cyber Strategy. He then describes the need to increase cyber range capacity and to provide a cyberspace test and evaluation infrastructure. Mr. Ferguson then covers the role and future responsibilities of the DoD Executive Agent for Cyber Test Ranges.

Our Inside the Beltway feature is titled “Improving Cybersecurity DT&E through Mission-Based Cyber Risk Assessments.” In it, Sarah Standard and Rhiannon Hutton, Ph.D., state that the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation (DASD(DT&E)) has the authority to assess system performance across Major Defense Acquisition Programs (MDAPs), Major Automated Information Systems (MAIS), and other special interest acquisition programs. As one of several cyber DT&E improvement initiatives intended to help acquisition programs plan for and increase mission context in cybersecurity DT&E, DASD(DT&E) is encouraging the use of mission-based cyber risk assessments (MBCRAs) as a tool for planning mission-focused cybersecurity testing. Specifically, DASD(DT&E) is promoting the Cyber Table Top (CTT) MBCRA methodology, which the authors discuss.

For the Historical Perspectives feature, “70 Years of Scientific Computing in the Army,” Michael Barton, Ph.D., and Raju Namburu, Ph.D., describe the evolution of scientific computing’s powerful tools, driven in part by the miniaturization of electronic components. They state that at Aberdeen Proving Ground (APG), testing, research, and scientific computing are inextricably connected. The authors describe the first four stages of scientific computing, and they highlight innovations at APG: the first general-purpose scientific computer, the first stored-program computer, the first campus area network, the first parallel computer, and more. Finally, they hint at a possible fifth stage of scientific computing at APG.

The President’s Corner, written by Gene Hudgins, notes that this is his final President’s Corner. He expresses his appreciation to all the contributors to The Journal and to all the volunteers and contributors to the ITEA Workshops and Symposia. He then summarizes the articles in this issue and highlights the two final events for 2017: this year’s Symposium in October, with the theme “T&E in a Time of Risk and Change,” and the Disruptive Technology in Test and Evaluation Workshop in November. He also looks ahead to the Symposia in Oxnard, California, in 2018 and Lihue, Kauai, in 2019.

Our first of nine technical articles, “Improving the T&E Workforce’s Understanding of the New Approach to Developing Warfighting System Cyber Requirements,” written by William Rowell, Ph.D., and Darryl Ahner, Ph.D., aims to help the DoD T&E workforce understand the cyber survivability requirements development process. The key is properly engineering cyber-survivable system capabilities through the development of testable and measurable requirements, which reduces the risk of not achieving the desired capabilities to an acceptable level.

The second technical article in this issue, “Cybersecurity Test and Evaluation: A Look Back, Some Lessons Learned, and a Look Forward!” written by Pete Christensen, reviews his 3-year assignment as Director of the National Cyber Range (NCR). He discusses the status at the beginning of his tour, progress made, lessons learned, and then looks to the future. He ends his article stating that the most limiting resource is the Cybersecurity T&E Workforce.

For our next article, “Test Capabilities Directory (TCD) On-Line,” by Denise De La Cruz and Kenneth Wheeler, the authors state that the TCD is an online, web-accessible repository that supports Program Managers and their test planners with information covering the wide range of test capabilities within the Department of Defense (DoD). The authors cover descriptions of the TCD, its search capabilities, use cases, and future versions of the TCD.

In our fourth technical article, “Advancing Developmental Evaluation Frameworks and Statistical Test and Analysis Techniques for DT&E,” George Axiotis calls on the evaluation community to further advance application of the Developmental Evaluation Framework (DEF) and Scientific Test and Analysis Techniques (STAT) for planning effective developmental test and evaluation (DT&E). He encourages updating and adjusting DEF and STAT to better address DoD system maturation and performance across the development life cycle.

In the fifth technical article, “A Pathway to ‘File and Fly’ Operations for Remotely Piloted Aircraft in the National Airspace System,” George Harrison and Wilson Felder, Ph.D., look at opportunities and challenges in the Remotely Piloted Aircraft (RPA) arena, separating the challenges into policy and technical categories. They propose a pathway to enable routine RPA operations in the National Airspace System in months rather than years.

The sixth technical article, authored by Mark London, Ph.D., and Robert Schaller, Ph.D., is titled “Musing of a Manager: Thoughts on Training the Future T&E Workforce.” The authors explore some of the pending challenges, such as demographic and financial pressures, to growing the future workforce of test engineers, and they offer some key trends that seem to portend a bright future for the T&E workforce.

The seventh technical article, “Interface Management in Systems Engineering,” by George Gardner, Ph.D., describes the principal users of interface management (one of the technical management processes in systems engineering) and discusses their approach to the process. Interface management seeks to ensure smooth integration, improve interoperability, and provide for easier and cheaper technical enhancement for programs with large systems of systems.

Our next technical article, “Ridit Analysis for Cooper-Harper & Other Ordinal Ratings for Sparse Data—A Distance-Based Approach,” was written by Arnon Hurwitz, Ph.D., and discusses how probability scoring (ridit) analyses and related methods can be used to handle ordinal categorical data (OCD). The author developed a new method of comparing OCD distributions using a distance-based metric and randomization tests to draw inferences about the distributions of statistical measures.
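
The two building blocks named in the article, ridit scoring against a reference group and randomization (permutation) testing of a distance between ordinal distributions, can be illustrated with a short sketch. The rating scale, sample data, and distance metric below are hypothetical and may differ from the author’s:

```python
# Illustrative sketch: ridit scores for ordinal (Cooper-Harper-style) ratings
# plus a label-permutation test on a simple distribution distance.
import numpy as np

CATEGORIES = list(range(1, 11))  # ordinal rating scale, e.g. 1..10

def ridits(reference: np.ndarray) -> dict:
    """Ridit of category k = P(below k) + 0.5 * P(k), from the reference group."""
    counts = np.array([(reference == k).sum() for k in CATEGORIES], dtype=float)
    props = counts / counts.sum()
    below = np.concatenate(([0.0], np.cumsum(props)[:-1]))
    return {k: b + 0.5 * p for k, b, p in zip(CATEGORIES, below, props)}

def distribution_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Sum of absolute differences between category proportions (illustrative metric)."""
    pa = np.array([(a == k).mean() for k in CATEGORIES])
    pb = np.array([(b == k).mean() for k in CATEGORIES])
    return float(np.abs(pa - pb).sum())

rng = np.random.default_rng(0)
group_a = rng.integers(2, 6, size=12)   # sparse, hypothetical ratings
group_b = rng.integers(4, 9, size=10)

r = ridits(group_a)
print("mean ridit of B vs reference A:", np.mean([r[k] for k in group_b]))

observed = distribution_distance(group_a, group_b)
pooled = np.concatenate([group_a, group_b])
null = []
for _ in range(2000):
    rng.shuffle(pooled)
    null.append(distribution_distance(pooled[:len(group_a)], pooled[len(group_a):]))
p_value = float(np.mean(np.array(null) >= observed))
print(f"distance={observed:.3f}, permutation p-value={p_value:.3f}")
```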

For the last technical article for this issue, “A Case Study in Understanding and Evaluating Live, Virtual, and Constructive (LVC) Command and Control Training Effectiveness,” the authors, Andrew Roberts et al., state that a credible means to evaluate the use of LVC simulation as a training tool for warfighters is lacking. They present an approach for understanding and measuring C2 training effectiveness and trainee performance, and they show how it could be applied to a large-scale Air Force Command and Control LVC training event.

I hope you enjoy this third issue of 2017 for The ITEA Journal of Test and Evaluation. By the time you receive issue 38-3 in September, the December 2017 issue, 38-4, will already be in final preparation; its theme will be “T&E for Enhanced Security.” For the next issue (the first issue of 2018), 39-1, the deadline for submissions is just after December 1, 2017, and the theme will be “Testing Using Facilities Around the World.” We have posted all themes and descriptions for the remainder of 2017–2021 on the ITEA website. Please provide feedback on the choice of themes, and please write early and often.


June 2017 – Training the Future Test and Evaluation (T&E) Workforce

The theme for this issue is “Training the Future Test and Evaluation (T&E) Workforce,” and the issue includes a Guest Editorial, an Inside the Beltway feature, two After the Beltway features, the President’s Corner, an editorial feature, and nine technical articles.

Our Guest Editorial, written by Derrick Hinton, is “Test Resource Management Center: Statutory and Regulatory Missions.” Mr. Hinton discusses the statutory and regulatory missions of the Test Resource Management Center (TRMC) and the most recent missions for TRMC in specific cyber-related areas. He describes the need to attract and retain a skilled cyber workforce to support the National Cyber Range mission. Then, Mr. Hinton covers the ongoing Science, Technology, Engineering, and Mathematics outreach for DoD and for TRMC. He states that the ultimate goal is workforce excellence.

Our Inside the Beltway feature is titled “The State of the Acquisition Test and Evaluation Workforce.” In it, Ken Stefanek states that the T&E community within the Defense Acquisition Workforce is growing slowly and is healthy. He highlights gains in the number of people who hold bachelor’s and graduate degrees and in the percentage of the workforce who have achieved appropriate certification levels. He concludes that the T&E workforce is qualified and capable of meeting the needs of DoD.

For this issue, we are fortunate to have two After the Beltway features. The first is “Development Test and Evaluation: Progress and Potential” from Frank Kendall. He states that the Department of Defense has made great progress over the last several years in Developmental Test and Evaluation, but that there are areas in which more progress could and should be made. He states that Better Buying Power has helped emphasize cost control throughout the system life cycle and that expansion of the use of design of experiments has made testing more efficient. He concludes his feature with thanks to the very professional T&E workforce and those who support it.

Our second After the Beltway feature is from Dave Brown, Ph.D., and Dave Bell, Ph.D., describing “T&E’s Challenging and Exciting Future.” They predict new types of weapon systems and new methodologies for defining and acquiring systems in the future. They also describe a new capability-centric acquisition process and a new way to decide what systems are needed; this new approach, called Mission Engineering, depends on mission threads. Finally, they predict that members of the future T&E workforce will need skillsets beyond those of the current workforce.

The President’s Corner, written by Gene Hudgins, mentions the importance of the theme of this issue and summarizes some of the articles in the issue. He also describes the recent, very successful Cyber Workshop, and he looks ahead to the upcoming Test Instrumentation Workshop and its outstanding line-up.

In Cultivating the T&E Workforce, “Simulation: Helping Develop the Future Workforce,” I discuss a new program for high school and technical school students throughout the nation. The Modeling and Simulation Certification Program had a successful beta test, with more than 100 students having already earned the Modeling and Simulation Certification. The certification is designed to prepare students for internship opportunities, employment after graduation, and postsecondary education.

Our first of nine technical articles, “Approach to a Highly Capable Testing and Evaluating Workforce at US Army Test and Evaluation Command: Attraction—Attention—Advancement/Achievement,” by Major General Daniel Karbler et al., describes how the United States Army Test and Evaluation Command (ATEC) has developed a strategy to ensure its workforce members are prepared to execute the ATEC mission. ATEC has institutionalized a human capital planning process that emphasizes the need to grow a T&E team ready for the future fight and for maintaining readiness.

The second technical article in this issue, “The Test and Evaluation Workforce and a Base of Sand Issue,” written by Raymond Hill, Ph.D., states that the base of the T&E workforce is not as firm as it needs to be in the area of statistical fluency. He suggests that statistical fluency should be a component of every T&E professional’s foundational expertise. Firming up this base will advance the goal of ensuring statistical defensibility of every T&E program.

For our next article, “Directed Energy Test and Evaluation Education,” Sam Blankenship, Ph.D., et al., present the goal of the Directed Energy Professional Society (DEPS) to help educate T&E professionals on Directed Energy (DE) and educate DE professionals in T&E. DEPS has offered many introductory short courses and tutorials on DE topics for thousands of students.

In our fourth technical article, “The Joint AFIT/TPS Program: A Test and Evaluation Partnership,” Donald Kunz, Ph.D., states that 91 Test Pilot School (TPS) students have participated in the Air Force Institute of Technology (AFIT)–TPS program. The students now complete five quarters at AFIT, arrive at TPS, complete TPS, and then are allowed three months to complete their master’s degree thesis. The students graduate from both programs within 30 months. Over the past 35 years, this program has evolved into a winning partnership for TPS, AFIT, and the graduates.

In the fifth technical article, “Faster, Better, Smarter: Applying Big Data Analytics to 5th Generation Acquisition Systems,” Ryan Norman et al. note that the amount of information gathered from increasingly complex, higher-resolution, software-intensive acquisition systems has grown significantly, yet the tools and methods needed to rapidly collect, aggregate, and analyze this information have not kept pace. Evaluations of commercial big data techniques have shown promising results, and big data analytics promises to be a beneficial change in methods for future tests.

In the sixth technical article, “Data-Intensive Computing for Test and Evaluation,” J. Michael Barton, Ph.D., and Raju Namburu, Ph.D., point out that the ability and the requirement to acquire data from a wide variety of instrumentation have grown in ways unimaginable even 20 years ago. They recommend a systematic, scalable, and rapidly configurable computational approach to big data based on systems engineering and on recognition of, and planning for, the data life cycle.

The seventh technical article, “A Review of Automated Tools to Improve Natural Language Requirements,” from George Gardner, Ph.D., discusses natural language requirements scanners and their use in requirements engineering. The article compares three automated text-scanning tools used to improve requirements statements: the Automated Requirements Measurement tool, the Quality Analyzer for Requirements Specifications (QuARS), and the Federal Aviation Administration (FAA) Requirements Quality Tool (FRQT).
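
To make concrete what such scanners look for, here is a toy sketch that flags weak or ambiguous phrasing in a requirement statement. It is not ARM, QuARS, or FRQT; the word list and checks are illustrative only:

```python
# Toy requirements scanner: flags ambiguous terms and missing "shall" language.
# Illustrative only -- not a reimplementation of any of the three tools above.
import re

WEAK_PHRASES = ["as appropriate", "if possible", "etc", "and/or",
                "tbd", "adequate", "user-friendly", "fast"]

def scan_requirement(text: str) -> list:
    """Return a list of findings for a single requirement statement."""
    findings = []
    lowered = text.lower()
    for phrase in WEAK_PHRASES:
        if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
            findings.append(f"ambiguous/weak term: '{phrase}'")
    if " shall " not in f" {lowered} ":
        findings.append("no 'shall' -- may not be a verifiable requirement")
    return findings

requirement = "The system should respond fast and support multiple users, etc."
for finding in scan_requirement(requirement):
    print(finding)
```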

Our next technical article of the issue, “White Sands Missile Range (WSMR) Radio Spectrum Enterprise Testbed: A Spectrum Allocation Solution,” written by Juan Gonzalez et al., provides a possible solution within WSMR to the loss of military spectrum. The project’s objective was to develop an ontology, characterize the spectrum, and generate rules of interaction.

For the last technical article for this issue, “A Case Study in Understanding and Evaluating Live, Virtual, and Constructive (LVC) Command and Control Training Effectiveness,” the authors, Andrew Roberts, et al., state that a credible means to evaluate the use of LVC simulation as a training tool for warfighters is lacking. They present an approach for understanding and measuring C2 training effectiveness and trainee performance, and they show how it could be applied to a large-scale Air Force Command and Control LVC training event.

I hope you enjoy this second issue of 2017 for The ITEA Journal of Test and Evaluation. By the time you receive issue 38-2 in June, the September 2017 issue, 38-3, will already be in final preparation; its theme will be “T&E of Cybersecurity and Readiness.” For the next issue (the last issue of 2017), 38-4, the deadline for submissions is just after September 1, 2017, and the theme will be “T&E for Enhanced Security.” This last theme of 2017 was designed to fit papers from the Department of Homeland Security, the Federal Aviation Administration, other federal and state agencies, and supporting contractors. We have posted all themes and descriptions for the remainder of 2017–early 2021 on the ITEA website. Please provide feedback on the choice of themes, and please write early and often.


March 2017 – Blending Systems Engineering, Life Cycle Support, Reliability, and Testing

The theme for this issue is “Blending Systems Engineering, Life Cycle Support, Reliability, and Testing,” and the issue includes two Guest Editorials, an Inside the Beltway feature, a Historical Perspectives feature, a Guest Editor’s Perspective, the President’s Corner, and eight technical articles.

Our first Guest Editorial, written by William Miller, is “The Engagement of Systems Engineering with Test and Evaluation.” He states that this engagement is necessary because of the complexity and interconnection of today’s systems, and he recommends that systems engineering engage with test and evaluation to improve systems development outcomes.

Our second Guest Editorial, from Laura McGill, is “Model-Based Engineering (MBE): Advancing Engineering Test and Evaluation,” which discusses the contributions of MBE to design and T&E. She states that MBE’s contribution to design starts with an increased understanding of the system being developed. MBE provides a means to transfer knowledge contained in the model to other engineers in the future and facilitates testing.

Our Inside the Beltway is titled “Complexity Consciousness.” In it Shelley Yak discusses the continued modernization of the National Airspace System (NAS) called Next Generation Air Transportation System (NextGen). She states that attaining and maintaining conscious awareness of the system, service, and stakeholder complexity elements of NextGen is a difficult but essential task as the NAS is transformed.

Our Historical Perspectives feature is titled “Modular Design and Standardization: Past, Present, and Future.” Author Andrew Russell, Ph.D., states that the modular approach to systems design has been adopted in a great variety of fields and is historically similar to the development of standards. Modularity is a strategy that can be used to manage complexity.

For this issue, we are fortunate to have a Guest Editor, Wilson N. Felder, Ph.D., who also provided the Guest Editor’s Perspective on the issue. He solicited, guided, and sequenced the featured departments and eight technical articles. He also coordinated this issue with the International Council on Systems Engineering (INCOSE) INSIGHT magazine issue that will explore the same theme. His motivating focus was to show that systems engineering and test and evaluation are distinct disciplines that are essential to an integrated approach toward providing the best systems in every sense. His goal for selecting the features and technical articles was to provide progress reports from the front lines of systems engineering and test and evaluation applied to an integrated system development cycle.

The President’s Corner, written by B. Gene Hudgins, looks ahead to his last year as ITEA President and to the start of the new administration. He discusses the line-up of articles in this issue and previews upcoming ITEA events for 2017.

Our first of eight technical articles, “Cost Effective Verification and Validation,” by Kevin Knudsen, states that there is a trend to blend and integrate systems engineering, reliability, life cycle support, and testing. Related to that trend, cost-effective verification and validation are performed as early as practical to identify, mitigate, and retire program risks earlier. Exposing design issues earlier in the product development life cycle saves resources.

The second technical article in this issue, “Developmental Test and Evaluation: ‘Shifting Left’ to Improve Defense System Reliability,” written by Irvin Boyles, states that a recent Department of Defense study team proposed activities and practices that begin earlier in new acquisition programs to achieve reliability growth and reliability objectives earlier and at lower life cycle cost. He states that investing to address reliability shortfalls is most effectively accomplished during system development activities. He includes recommendations to improve reliability.

In a peer-reviewed article, “Systems Engineering Measurement as a Leading Indicator for Project Performance,” Christian T. Orlowski et al. present the results of a survey of systems engineering professionals conducted to capture an industry perspective on systems engineering measurement. The authors’ goal was to determine whether systems engineering measurement on a project yields statistically significant improvements in project performance.
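
As a purely illustrative sketch of the kind of question being asked, the following compares a hypothetical performance metric between surveyed projects with and without a measurement program using a two-sample test; the data and the choice of test are not the authors’:

```python
# Illustrative sketch (not the authors' analysis): compare a project-performance
# metric between projects that did and did not use SE measurement.
from scipy import stats

# hypothetical normalized performance scores from surveyed projects
with_measurement = [0.82, 0.75, 0.91, 0.68, 0.88, 0.79, 0.85]
without_measurement = [0.61, 0.70, 0.55, 0.73, 0.66, 0.58]

t_stat, p_value = stats.ttest_ind(with_measurement, without_measurement, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")
```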

In our fourth technical article, “Shift Left Testing the Agile Way,” Suzanne Miller and Robert V. Binder present considerations for success and failure in shifting developmental testing and operational testing to the left. They present four ways to shift testing left, and they discuss the synthesis of Agile and shift-left testing. There are impediments to shift-left testing that cannot be solved by Agile alone.

In the fifth technical article, “A Business Case for Using Models as a Life Cycle Design Canvas for Aerospace Systems,” Alexander A. Auguste states that poor or degraded communication is the primary driver of cost from engineering change requests and schedule overruns in the development of highly complex systems. He writes that a key solution is well-normalized communication among the many organizations involved in system development, so that early information is captured correctly.

In the sixth technical article, “Test and Evaluation for Enhanced Security: A Quantitative Method to Incorporate Expert Knowledge into Test Planning Decisions,” Davinia Rizzo and Mark Blackburn, Ph.D., discuss both the importance of testing complex systems to ensure effectiveness and the constraints on testing due to cost, schedule, safety, and feasibility reasons. Their paper proposes a method to assess the full scope of the factors (performance, cost, schedule, risk…) and leverage expert knowledge to provide a decision aid for test planning throughout the program development cycle.
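
One simple way to picture such a decision aid is a weighted multi-criteria score per candidate test, with the weights elicited from experts. The sketch below only illustrates that idea; the factor set, weights, and candidates are hypothetical, and the paper’s actual method may differ:

```python
# Illustrative sketch only: combine expert-elicited weights over performance,
# cost, schedule, and risk into a single score per candidate test.

WEIGHTS = {"performance_value": 0.4, "cost": 0.2, "schedule": 0.2, "risk_reduction": 0.2}

# hypothetical candidate tests scored 0-1 on each factor (higher is better;
# cost and schedule are entered as affordability/timeliness so higher = better)
CANDIDATES = {
    "hardware-in-the-loop run": {"performance_value": 0.9, "cost": 0.4, "schedule": 0.6, "risk_reduction": 0.8},
    "digital-twin simulation":  {"performance_value": 0.6, "cost": 0.9, "schedule": 0.9, "risk_reduction": 0.5},
    "full field test":          {"performance_value": 1.0, "cost": 0.2, "schedule": 0.3, "risk_reduction": 0.9},
}

def score(candidate: dict) -> float:
    """Weighted sum of factor scores for one candidate test."""
    return sum(WEIGHTS[factor] * value for factor, value in candidate.items())

for name, factors in sorted(CANDIDATES.items(), key=lambda kv: score(kv[1]), reverse=True):
    print(f"{name:26s} score = {score(factors):.2f}")
```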

The seventh technical article, “Input Attribution for Statistical Model Checking Using Logistic Regression,” from Jeffery P. Hansen, Ph.D., et al., demonstrates an approach to synthesizing input attributions that are both meaningful to the investigator and predictive of the predicate outcomes. The approach builds a statistical model from a simulation, implements statistical model checking across many simultaneous simulations, and validates the authors’ hypotheses, often yielding new, unexpected insights.
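
The core mechanic, fitting a logistic regression of simulation inputs against a pass/fail predicate and reading the coefficients as attributions, can be sketched briefly. The input names, the synthetic data, and the use of scikit-learn are assumptions for illustration, not the authors’ workflow:

```python
# Illustrative sketch: logistic-regression input attribution for a pass/fail
# predicate observed over many simulation runs (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 500
inputs = rng.uniform(size=(n, 3))            # hypothetical inputs: speed, load, latency
# hypothetical predicate: failures driven mainly by the third input
failure_prob = 1 / (1 + np.exp(-(4 * inputs[:, 2] - 1 * inputs[:, 0] - 1.5)))
failed = rng.uniform(size=n) < failure_prob

model = LogisticRegression().fit(inputs, failed)
for name, coef in zip(["speed", "load", "latency"], model.coef_[0]):
    print(f"{name:8s} attribution (log-odds coefficient): {coef:+.2f}")
```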

Our last technical article of the issue, “Modeling Avionics System Software Stability Using a Five-State Markov Model and Clopper-Pearson Analysis,” written by Erich Brownlow, proposes a Markov model for multi-functional avionics system modeling and uses statistical inference to evaluate avionics system software stability.
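
To illustrate the two techniques named in the title, the sketch below simulates a generic five-state Markov chain and computes a Clopper-Pearson (exact binomial) confidence interval for the probability of observing a degraded or failed state. The state names and transition probabilities are hypothetical and are not taken from the article:

```python
# Illustrative sketch: five-state Markov chain simulation plus a Clopper-Pearson
# interval for the proportion of steps spent in "degraded" or "failed" states.
import numpy as np
from scipy import stats

STATES = ["nominal", "minor anomaly", "degraded", "restart", "failed"]
P = np.array([  # hypothetical transition probabilities (rows sum to 1)
    [0.95, 0.03, 0.01, 0.005, 0.005],
    [0.60, 0.30, 0.07, 0.02,  0.01],
    [0.20, 0.30, 0.35, 0.10,  0.05],
    [0.80, 0.10, 0.05, 0.03,  0.02],
    [0.00, 0.00, 0.00, 0.50,  0.50],
])

rng = np.random.default_rng(3)
state, visits, bad = 0, 0, 0
for _ in range(5000):
    state = rng.choice(5, p=P[state])
    visits += 1
    bad += state in (2, 4)   # degraded or failed

alpha = 0.05  # exact (Clopper-Pearson) binomial interval via beta quantiles
lower = stats.beta.ppf(alpha / 2, bad, visits - bad + 1) if bad > 0 else 0.0
upper = stats.beta.ppf(1 - alpha / 2, bad + 1, visits - bad) if bad < visits else 1.0
print(f"P(degraded or failed) = {bad / visits:.3f}, "
      f"95% Clopper-Pearson interval = [{lower:.3f}, {upper:.3f}]")
```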

I hope you enjoy this first issue of 2017 for The ITEA Journal of Test and Evaluation. By the time you receive issue 38-1 in March, the June 2017 issue, 38-2, will already be in final preparation; its theme will be “Training the Future T&E Workforce.” For the next issue (the third issue of 2017), 38-3, the deadline for submissions is just after June 1, 2017, and the theme will be “T&E of Cybersecurity and Readiness.” We have posted all themes and descriptions for the remainder of 2017–early 2021 on the ITEA website.

Please provide feedback on the choice of themes, and please write early and often. Please remember that a one-year membership in ITEA (and 4 issues of The ITEA Journal of Test and Evaluation) can go to any student on your gift list for just $25!