2011 ITEA Journal Issues

“Test as We Fight” – December 2011 Issue at a Glance

Actual operations aren’t merely realistic; they are real. ‘Test as we fight’ requires every cue that impinges on the senses to be considered, every constraint to be present, and every threat to be represented at the same level of fidelity as the friendly force. It entails operators who are trained as we fight, and it embodies integrated testing and its relationship to the operational environment. Outside the Department of Defense, the counterpart is testing in operationally representative environments, whether for air traffic control, power plant operations, harbormasters in busy commercial ports, or law enforcement.

The Guest Editorial is coauthored by three experienced Department of Defense test and evaluation (T&E) professionals, The Honorable John Krings, The Honorable Thomas P. Christie, and Mr. Pete Adolph, attesting to the need to enforce existing directives and guidance as a primary step in addressing shortcomings in the acquisition process. COL (R) Wilbur Gray describes the origin, history, and use of military wargaming in Historical Perspectives.

December sees a new special section featuring perspectives from three Department of Defense T&E executives and one industry T&E executive. The Honorable Dr. J. Michael Gilmore, Director of Operational Test and Evaluation, summarizes two recent studies that examined acquisition program delays, concluding that none of the reasons cited included T&E. Dr. Steven Hutchison, Defense Information Systems Agency (DISA) T&E Executive, describes the need to design a test to determine whether the product does what the user expects it to do, and the lessons DISA captured from the information technology sector to help craft an improved approach to testing. Ms. Amy Markowich, Senior Executive for Department of the Navy Test and Evaluation, “busts” some of the myths of T&E to dispel commonly held false beliefs and reveal kernels of truth. Finally, Mr. Dennis O’Donoghue, Vice President of Boeing Test & Evaluation, explores test as we fight as a cultural shift in test philosophy necessitated by complex system-of-systems testing. In the 2011 ITEA Technology Review Best Paper, Mr. Michael Curry, Mr. Noe Duarte, and Ms. Nancy Sodano present a T&E infrastructure for data-driven decision making and development of an integrated, end-to-end T&E strategy.

The contributed articles open with MAJ Cornelius Allen, Jr. and his coauthors, who argue for creating a more relevant operational environment by crafting realism and enabling free play in operational testing. Three articles from the Joint Interoperability Test Command (JITC) speak directly to test as we fight. Mr. Richard Delgado, Jr. et al. explain JITC automated test tools and procedures and their value in adding rigor to the testing process and raising confidence in test results. MAJ Lee Brinker discusses the importance of distributed integration and testing in realizing test as we fight. MAJ Robert Houston reviews multinational testing for assessing the operational interoperability of various command, control, communications, and computer systems.

Ms. Jamie Pilar and co-workers introduce modeling and simulation tools that are flexible and network intensive enough to fully investigate the value added by systems during evaluation. Mr. Bill Rearick demonstrates how computer modeling and simulation can elevate testing and training to a new level of realism and fidelity. Dr. Charlie Holman and Ms. Anika Dodds illustrate the use of bioinformatics tools for improving the accuracy of biological warfare agent detection systems. Mr. Todd Remund and Mr. William Kitto apply Monte Carlo techniques to achieve statistical rigor in aircraft T&E. Mr. Craig Schlenoff et al. review lessons learned in evaluating military systems, citing decisions made during the evaluation design stage as possibly the most critical. Mr. Brian Weiss and Prof. Linda Schmidt describe a methodology for automating evaluation design and test plan generation for complex systems. The issue closes with an article by Ms. Nancy Welliver and Ms. Marguerite Shepler that describes two methods for meeting system reliability requirements and reducing life cycle support costs for major weapon systems.
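To give a flavor of the Monte Carlo resampling idea behind the statistical-rigor theme Remund and Kitto address, here is a minimal, hypothetical sketch of a bootstrap confidence interval on a measured performance parameter. The measurement values and the parameter are invented for illustration and are not drawn from their article.

```python
# Hypothetical sketch: Monte Carlo (bootstrap) confidence interval for a
# measured performance parameter. Data values are invented for illustration.
import numpy as np

rng = np.random.default_rng(seed=1)

# Pretend these are repeated flight-test measurements of some parameter,
# e.g., specific excess power in ft/s (values are made up).
measurements = np.array([512.0, 498.5, 505.2, 520.1, 494.7, 509.3, 501.8, 515.6])

# Resample with replacement many times and record the statistic of interest.
n_trials = 10_000
boot_means = np.array([
    rng.choice(measurements, size=measurements.size, replace=True).mean()
    for _ in range(n_trials)
])

# The percentile bootstrap gives a 95% interval without assuming normality.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {measurements.mean():.1f}, 95% CI = ({lo:.1f}, {hi:.1f})")
```

The appeal of this kind of technique in T&E is that the uncertainty statement comes from the data themselves rather than from a distributional assumption that small flight-test samples rarely support.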

“Rigor of the Scientific Method” – September 2011 Issue at a Glance  

Rigor is scrupulous or inflexible accuracy; applied to the scientific method, it produces a body of techniques for investigating phenomena, acquiring new knowledge, or correcting and integrating previous knowledge. Systems engineering is simply the application of the scientific method to the concept, design, development, and building of complex systems. Bringing the rigor of the scientific method to test and evaluation (T&E) requires adopting emerging technology and makes it incumbent upon testers to go beyond acquiring data to digging into metrics, concept of operations, requirements, system design, and mission environment.

Prof. Douglas Montgomery of Arizona State University literally wrote the book on rigor in the design of experiments and offers his perspective in our Guest Editorial. Dr. Michael Gilmore, the Director of Operational Test and Evaluation, writes from Inside the Beltway on rigor and objectivity in test and evaluation, a follow-up to his September 2010 ITEA Journal Guest Editorial. Mr. Robert Stochl provides new insights for managing and storing cryogenic liquids for long-duration missions in TechNotes. The centennial of Naval aviation is celebrated in Historical Perspectives with an interview of Mr. Gary K. Kessler, Executive Director, Naval Air Warfare Center Aircraft Division. Ms. Nanci Newhouse of the Joint Interoperability Test Command (JITC) showcases the new JITC test facility at Fort Meade, MD, in Featured Capability.

Opening the issue is an article by Ms. Eileen Bjorkman and Dr. James Brownlow describing an ongoing Air Force program to expand the use of statistical methods for planning, executing, and analyzing tests. Dr. Laura Freeman and coauthors explain a “4-D” process that, when combined with expert engineering judgment and experience, permits the quantitative assessment of risk for acquisition decision making. Dr. Robert Holcomb presents Bayesian analysis techniques and case study methods as alternative approaches to gain confidence in operational test data for ground combat systems. Mr. Zachary Zimmer et al. employ design of experiments in operational testing as a collaborative process involving evaluators, subject matter experts, statisticians, testers, and regulators. Capt. Casey Haase et al. offer a statistical design approach to address the complexity of live-virtual-constructive-based experiments.
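To make the design-of-experiments idea concrete, here is a minimal, hypothetical sketch of a two-level full-factorial design and its main-effect estimates. The factor names and response values are invented for illustration and are not taken from any of the articles above.

```python
# Hypothetical sketch: a 2^3 full-factorial test design and main-effect
# estimates, in the spirit of design of experiments for operational testing.
# Factor names and response values are invented for illustration.
import itertools
import numpy as np

factors = ["terrain", "time_of_day", "jamming"]   # coded -1 / +1 levels
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Pretend each of the 8 runs produced a measured detection range in km.
response = np.array([8.2, 7.9, 9.1, 8.8, 6.5, 6.1, 7.4, 7.0])

# Main effect of a factor: mean response at +1 minus mean response at -1.
for i, name in enumerate(factors):
    effect = (response[design[:, i] == 1].mean()
              - response[design[:, i] == -1].mean())
    print(f"{name:12s} main effect: {effect:+.2f} km")
```

Even this toy case shows the collaborative point: choosing the factors, their levels, and the response to record is where evaluators, operators, and statisticians must agree before any run is flown.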

Dr. Mark Kiemele offers his Identify-Design-Optimize-Validate methodology as a rigorous and scientific approach to the design and development of new products and systems. Dr. Dean Thomas examines previous operational tests and concludes that though structured approaches captured some facets of design of experiments, a more rigorous application could yield improvements. Dr. Peter Carter and colleagues exploit and advance modeling and simulation through implementation of the general principles of systems engineering, enhancing modeling and simulation as a value multiplier for testing. Dr. Arnon Hurwitz et al. compare parametric, simulation, and Bayesian methods for characterizing and bounding radial target location error distributions, concluding that for small data sets the results are consistent among them.
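As a simple illustration of contrasting a parametric estimate with a simulation-based one for radial error, here is a hypothetical sketch using circular error probable (CEP). The data are synthetic, and the comparison is far simpler than the one Hurwitz et al. report.

```python
# Hypothetical sketch: two ways to estimate circular error probable (CEP)
# from a small sample of 2-D miss distances. All data are invented.
import numpy as np

rng = np.random.default_rng(seed=2)

# Small sample of 2-D miss distances in meters (made up).
x = rng.normal(0.0, 5.0, size=12)
y = rng.normal(0.0, 5.0, size=12)

# Parametric estimate: assume circular bivariate normal errors, where
# CEP = sigma * sqrt(2 ln 2) ~= 1.1774 * sigma.
sigma_hat = np.sqrt((x.var(ddof=1) + y.var(ddof=1)) / 2.0)
cep_parametric = 1.1774 * sigma_hat

# Simulation estimate: draw many synthetic shots from the fitted model and
# take the median radial error.
sim = rng.normal(0.0, sigma_hat, size=(100_000, 2))
cep_simulated = np.median(np.hypot(sim[:, 0], sim[:, 1]))

print(f"parametric CEP ~ {cep_parametric:.2f} m, "
      f"simulated CEP ~ {cep_simulated:.2f} m")
```

With only a dozen shots, the two estimates agree here largely because the simulation is driven by the same fitted sigma; the interesting question the article takes up is how the methods diverge when the distributional assumption is wrong.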

Three articles address wireless technologies for open-air testing. Dr. Theodore Miller describes a maximum likelihood method that makes it possible to assess the accuracy of highly accurate GPS. William Chen et al. perform laboratory experiments to assess a method for reducing multipath fading which, if successful, would improve radar, communications, and telemetry post-test data processing in chamber and open-air tests. Dr. Marilynn Wylie and Glenn Green present a spectrally efficient modulation scheme for telemetry that is also power and bandwidth efficient.
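For readers unfamiliar with maximum likelihood estimation in this setting, here is a minimal, hypothetical sketch, not Miller's method: it fits the spread of GPS position errors from residuals against a truth reference, assuming zero-mean Gaussian errors. All values are invented.

```python
# Hypothetical sketch of a maximum likelihood estimate (not Miller's method):
# fit the standard deviation of GPS position errors from residuals against a
# higher-accuracy truth reference, assuming zero-mean Gaussian errors.
import numpy as np

rng = np.random.default_rng(seed=3)

# Residuals (meters) between a GPS solution and a truth reference (made up).
residuals = rng.normal(0.0, 0.03, size=200)

# For zero-mean Gaussian data, the MLE of the variance is the mean of the
# squared residuals (note: no n-1 correction in the MLE).
sigma_mle = np.sqrt(np.mean(residuals ** 2))
print(f"MLE error sigma ~ {sigma_mle * 100:.1f} cm")
```

The practical difficulty the article speaks to is that when the system under test is nearly as accurate as the reference, the residuals confound the two error sources, which is exactly where a careful likelihood formulation earns its keep.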

Mr. Gregg Beitel and coworkers close the issue by illustrating the benefit of a university-government partnership between the Air Force Arnold Engineering Development Center and the University of Tennessee Space Institute for rapid development and validation of propulsion diagnostics systems, instrumentation, and test techniques.

“Organizing to Accomplish the Mission” – June 2011 Issue at a Glance

Just as we face a rapidly and continuously evolving threat, the Department of Defense test community must adapt to ever-changing test requirements and circumstances: more complex systems, shorter schedules, reduced budgets, a leaner workforce, and rapid acquisition, to name a few. The aging workforce continually challenges us to find critical skills, while more complex systems demand not only a flexible workforce but more “system of systems” engineers. Doing more with less is not new, but it remains a mantra. This issue addresses ways the test community is responding and delivering by organizing to accomplish the test mission.

Dr. David Alberts, the Director of Research in the Office of the Secretary of Defense Chief Information Officer (DoD CIO), uses the Guest Editorial to call for a partnership between the test and evaluation (T&E) and research communities to better understand agility and incorporate it into T&E, thereby improving the agility of the capabilities we acquire. Daniel Laird of the Air Force Flight Test Center analyzes communication link resynchronization, critical information for the performance of new telemetry systems. Randy Herrin of the Joint Interoperability Test Command (JITC) and Chris Watson of the Defense Information Systems Agency present the history and evolution of the JITC for Historical Perspectives.

Commanders of three of the Department of Defense Operational Test Agencies offer perspectives on Organizing to Accomplish the Mission. Major General David Eichhorn, Commander of the Air Force Operational Test and Evaluation Center, discusses the impact of the national debt and budget deficits on acquisition, and how T&E, as the conscience of acquisition, is more critical today than it has ever been. Rear Admiral David Dunaway, Commander, Operational Test and Evaluation Force, describes improved operational test efficiencies and better early involvement through internal process improvements, including use of integrated testing, mission-based test design, and design of experiments. Colonel Joe Puett, Commander, Joint Interoperability Test Command, presents the challenges of balancing warfighters’ demands for increased IT capabilities against strengthening our national security capability while DoD seeks to achieve budget efficiencies.

Two best paper awards from the 2011 LVC Conference appear in this issue. In the Best Undergraduate Student Paper, Marjorie Ingle et al. from the Center for Space Exploration Technology Research at the University of Texas at El Paso design a thrust balance and test micropropulsion thrusters intended for space applications. In the Best Graduate Student Paper, Jose Barajas and colleagues from New Mexico State University use a rigorous systems engineering approach to identify environmentally benign candidate technologies for energy harvesting and scavenging.

Dr. James Valdes et al. of the National Defense University survey requirements and the state of the art in ground truth determination for bioaerosol detector testing. Brian Weiss from the National Institute of Standards and Technology and Dr. Linda Schmidt from the University of Maryland continue work on tools for complex evaluation designs of advanced and intelligent systems. Mark Jones reviews five proposed changes for test and evaluation to improve our ability to identify critical failure modes and design flaws early in a system’s lifecycle.

The issue closes with two articles from the 2010 Technology Review. David Bate of Scientific Research Corporation examines the effects of sleep deprivation on working memory and cognitive performance, concluding that no single metric can characterize the effects. Dr. Joshua Gomer of Human Factor Engineering and Dr. Christopher Pagano of Clemson University demonstrate the utility of the NASA Task Load Index for assessing subjective workload in complex human-system technologies, valuable in operator selection and training as well as in design.
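For context, the weighted NASA Task Load Index combines six subscale ratings using pairwise-comparison weights. A minimal sketch of that standard computation follows; the ratings and weights are invented for illustration and do not come from the Gomer and Pagano article.

```python
# Hypothetical sketch of the standard weighted NASA-TLX computation:
# six subscale ratings (0-100) combined with pairwise-comparison weights.
# There are 15 pairwise comparisons, so the weights sum to 15.
ratings = {
    "mental":      70, "physical":    20, "temporal":    55,
    "performance": 40, "effort":      65, "frustration": 30,
}
weights = {
    "mental":      5,  "physical":    1,  "temporal":    3,
    "performance": 2,  "effort":      3,  "frustration": 1,
}

assert sum(weights.values()) == 15  # each pairwise "win" assigned exactly once

# Overall workload is the weighted mean of the subscale ratings.
overall = sum(ratings[k] * weights[k] for k in ratings) / 15.0
print(f"weighted NASA-TLX workload: {overall:.1f} / 100")
```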

“Software Intensive Systems” – March 2011 Issue at a Glance

Software plays an increasingly pervasive role in our lives and is a multi-billion dollar enterprise. It would be nearly impossible to live an entire day without contacting a software-driven system, from household appliances to automobiles to utilities; air transportation and law enforcement; entertainment, commerce, and education; national defense and health care. Software-intensive systems include large-scale heterogeneous and embedded systems and in the near future will exhibit adaptive behavior. They will process knowledge rather than merely data and will dynamically alter their structure. Two primary challenges in designing and testing software systems are their increasing complexity and the increasing need to adapt them to fast-changing technology and environments. Requirements engineering methods, techniques, and tools help ensure successful customer involvement throughout the system lifecycle. For large military systems this is particularly difficult because requirements can be based on subjective demands, evolve over time, and be troublesome to define precisely. This issue examines software and testing in several manifestations.

Dan Craytor, Chief Architect of the Microsoft Department of Defense sales district and retired Army aviator, uses the Guest Editorial to introduce cloud computing as the next tipping point in technology that will bring new capabilities to the warfighter. Dr. Catherine Warner, new Science Advisor to the Director, Operational Test & Evaluation, introduces her agenda, including integrated testing, increased scientific rigor, and reliability growth and tracking. Dr. James Welshans, chair of the ITEA History Committee, interviews a true software testing pioneer for Historical Perspectives.

Will Manning of the US Army Operational Test Command illustrates with three examples the role operational testing plays in software-intensive systems evaluation. Major Douglas Kaupa and Michael Whelan describe early influence through integrated testing at the Air Force Operational Test and Evaluation Center. Colonel Michael W. Kometer et al. examine the need for a joint system-of-systems test methodology for operational testing, and the intrinsic role distributed testing should play. Kevin Johnson and Dr. Mark Kiemele demonstrate the benefits of design of experiments in testing and its potential for becoming the science of testing.

In a pair of articles from the National Institute of Standards and Technology, Craig Schlenoff and coworkers first present challenges in the design of an effective performance evaluation for intelligent systems. That article is followed by Brian Weiss and Craig Schlenoff’s experience leading the design and implementation of performance evaluations for a speech-to-speech translation system. In another pair of articles, this time from the Information Sciences Institute of the University of Southern California, Dan Davis and colleagues begin with a discussion of General Purpose Graphics Processing Units (GPGPUs) and related accelerator technology and the potential impact for test and evaluation. The companion article by Dr. Ke-Thia Yao et al. argues for the implementation of a scalable data grid for distributed data analysis and data mining, and provides results from an application at the Joint Forces Command. The contributed articles close with Nicholas Lenzi et al.’s flexible test and evaluation framework developed for unmanned and autonomous systems.