Blending Systems Engineering, Reliability, Life Cycle Support and Testing
The theme for this issue is “Blending Systems Engineering, Reliability, Life Cycle Support and Testing,” and the issue includes a Guest Editorial, an Inside the Issue feature, a Historical Perspective, a President’s Corner, a STEM feature, and seven technical articles.
Our Inside the Issue feature, titled “Systems Engineering, Reliability, Life Cycle Support and Testing: A New Theory of Test for Complex Systems,” is authored by Wilson Felder, Ph.D., our Guest Editor for this issue. His feature summarizes the key themes of the articles he helped gather for this issue and lays the groundwork for future issues that will continue to tackle the maturing theory of testing complex systems.
Our Guest Editorial is titled “The Future of Systems Engineering,” and Michael Griffin states that systems engineers must embrace elegance of design as a primary goal, and that systems engineering research must focus on ways to mathematically describe what that elegance means, if the field of systems engineering is to progress.
For our Historical Perspective feature, “Stall/Post-Stall Test Evolution from Recovery to Prevention--The Beginning,” Charles “Pete” Adolph and Patrick Sharp summarize their experience in high angle-of-attack (AOA) Air Force flight testing in the 1970s.
In the President’s Corner, Gene Hudgins writes his first article, “We’ve Turned the Corner,” as the new president of ITEA. It is a positive article of thanks, owed in large part to all of your efforts.
As our last feature of this issue, we include a STEM article “Clemson University’s New Innovation Center Features 21st Century Educational Advancements” that illustrates how the center is designed to help students and the university be ready to lead development of creative solutions to significant technical and social challenges.
The first of our seven technical articles, “Automated Test for NASA cFS” by David McComas, et al, describes the challenges facing the cFS and the results of an effort to apply a new testing approach to cFS applications.
The second technical article in this issue, “How New Test and Evaluation Policy is Being Used to De-Risk Project Approvals through Preview T&E,” written by Keith Joiner, Ph.D., was the first article to go through our new peer review process. Congratulations to Dr. Joiner on his article that describes how preview T&E in Australia is able to capture and mitigate high operational and technical risks wherever possible.
In “Continuous System Monitoring as a Test Tool for Complex Systems of Systems,” Bruce Normann presents his findings that there is a need and an opportunity to extend the T&E discipline deeper into the operational portion of the system life cycle.
In our fourth technical article, “Verification of Software Intensive System Reliability and Availability through Testing and Modeling,” Myron Hecht, Ph.D., suggests an alternative approach to testing software-intensive systems based on testing more accessible test articles at the subsystem and component level and then combining the testing results using a credible system-level reliability model.
For our next technical article in this issue, “Taming the Complexity Beast,” Darren Cofer, Ph.D., describes the expected software-intensive, highly complex systems of the future and suggests one way to tame the complexity beast: moving from test-based verification to analysis-based verification.
In the sixth technical article, “A System Dynamics Approach to Evaluate the Long-Term Effects of Requirement Changes and Unit Quantity Changes During Department of Defense Acquisition,” Daniel Shin, et al, propose a methodology that uses a balance model based on system dynamics principles to evaluate the effects of changes in several variables on program cost.
Our seventh technical article of the issue, “System of Systems Complexity Addressed by Practical Adiabatic Quantum Computing,” written by Robert Lucas, Ph.D., et al, describes the powerful new capability that should be available to the T&E community soon to solve intractably complex problems in near real time.
I hope you enjoy this last issue of 2015 for The ITEA Journal. By the time you receive issue 36-4, the March 2016 issue 37-1 is being finalized. That theme will be “Leveraging Training and Experimentation Infrastructure and Events for T&E.” This theme for March 2016 has not been used before, and we will be open-minded with a very liberal acceptance posture. For the next issue (the second issue of 2016), 37-2, the deadline for submissions is just after March 1, 2016, and the theme will be “Inter-Agency and International T&E.”
We have posted all themes for the remainder of 2016 and the first two issues of 2017. Please provide feedback on the choice of themes, and please write early and often.
T&E of Information Assurance, Information Security,
and Cyber Readiness
The theme for this issue is “T&E of Information Assurance, Information Security, and Cyber Readiness,” and the issue includes a Guest Editorial, an Inside the Beltway feature, a Hot Wash-Up feature, two Historical Perspectives, and six technical articles.
Our Guest Editorial is titled “Cyber Security T&E: Closing the Gap Between Authorities to Operate and Operating Securely,” and Steve Hutchison, Ph.D., states that network-enabled command and control systems have revolutionized the way we plan, process, and share information. Unfortunately, this also presents a cyberrich targeting environment for our enemies. Dr. Hutchison goes on to describe how we can defend the net in somewhat the same way we already defend the ground, and we can think of cyberspace as a contested operational domain.
In our Inside the Beltway feature titled “There is Life Inside the Beltway and Progress on Cyber Security Test and Evaluation,” Pete Christensen reviews a 2009 study that proposed moving information assurance (IA) considerations earlier in program development and establishing operationally realistic IA test environments as part of the T&E process. From that starting point, Mr. Christensen describes the progress toward these goals and the actions that are still being completed.
In his Hot Wash-Up feature titled “Testing Cyber Systems: A Final Exam,” Wilson Felder, Ph.D., describes cyber systems as a perfect storm of difficulties. Dr. Felder offers that cyber systems are very hard systems to test, complicated not least by a fluidly changing cast of participants. He concludes that the test methods best suited to deal with cyber systems may be very useful beyond purely cyber systems.
For our Historical Perspective features, Jim Welshans, Ed.D., provides us with two interviews that provide personal views of cyber history. The original personal view of cyber history as lived in the cyber trenches is a reprint of an article from 2010 (The ITEA Journal of Test and Evaluation issue 31: pages 449-452) titled “History of Cyber Testing and Evaluation—A Voice from the Front Lines.” It covers an interview with Vince Holtmann, a telecommunications engineer who began active work in cyber security in about 1995. The most recent look at personal experiences in cyber is titled “Operationalizing Cyber Warfare,” and in this article, Dr. Welshans interviews Colonel (Retired) Marc “Homer” Jamison, who was one of the original staff members in US Cyber Command.
The first of our six technical articles is “Test and Evaluation of Complex Cyber Security Systems: A Case Study in Using Modeling and Simulation to More Efficiently Understand, Test, and Evaluate the Security of Quantum Key Distribution Systems,” by Major Logan Mailloux, et al, which demonstrates how modeling and simulation (M&S) can be used to more fully understand the threats and vulnerabilities of complex information systems. The authors also suggest a more efficient process for security testers to determine what behaviors need to be tested and how much testing is enough.
The second technical article in this issue, “A Centralized Holistic Approach to Information Assurance Testing,” written by Katherine McGinn, discusses some issues with legacy approaches to IA and recommends a more integrated approach that supports continuous monitoring, validation, and authorization. In the future, software updates and configuration changes may also be implemented in a more controlled environment to manage risk.
In “Mitigating RDT&E Risk: TRMC Addresses Cyber Security Challenges through Common Architecture, Software, and Processes,” Ryan Norman, et al, present a way that the Test and Training Enabling Network Architecture (TENA) provides support to IA. The TENA framework provides strict software development processes and standards that help maintain IA certifications. The article proposes that common architecture, software, and processes promote a test infrastructure that is more secure from cyber vulnerabilities.
In our fourth technical article, “Processing and Analysis of Large Data Sets from High Bandwidth Tactical Networking Experiments Using High Performance Computing,” Ken Renard, et al, propose more scalable methods to process large amounts of recorded data into a form that analysts can use to make critical decisions. The authors describe an approach to implement a parallel framework for processing and reducing large data sets from network testing, improving the turnaround time between testing event and the analysis of test data.
For our next technical article in this issue, “Critical STAT Questions DoD Leadership Should Ask: Injecting Rigor into the Test and Evaluation Process through Effective Inquiry,” Michael Harman, et al, describe the need to apply scientific test and analysis techniques (STAT). The focus of the article is on the leadership aspect of the test planning process and identifying if the proposed testing will be sufficient.
In the last technical article of this issue, “ECC for Heterogeneous Wireless Sensor Networks Using Ancient Indian Vedic Mathematics,” S. Hemalatha, et al, state that the main objectives of the paper are to reduce the energy required for the sensor network to perform cryptographic algorithms and to also reduce the storage space for the network cryptographic key. Elliptic Curve Cryptography (ECC) requires a smaller key size than more commonly used methods. The authors propose using ancient Indian Vedic mathematics to help realize these goals.
I hope you enjoy this third issue of 2015 for The ITEA Journal of Test and Evaluation. By the time you receive issue 36-3, the December 2015 issue 36-4 is being finalized. That theme will be “Blending Systems Engineering, Reliability, Life Cycle Support, and Testing.” For the next issue (the first issue of 2016), 37-1, the deadline for submissions is just after December 1, 2015, and the theme will be “Leveraging Training and Experimentation Infrastructure and Events for T&E.” This theme for March 2016 has not been used before, and we will be open-minded with a very liberal acceptance posture.
We have posted all themes for the remainder of 2016 and the first two issues of 2017. Please provide feedback on the choice of themes, and please write early and often.
Test Methodology Rigor
The theme for this issue is “Test Methodology Rigor,” and the issue includes a Guest Editorial, an Inside the Beltway feature, a Hot Wash-Up feature, a Historical Perspective, and five technical articles.
Our Guest Editorial is titled “Continuing to Advance the Science of Test in Operational Test and Evaluation,” and the Honorable J. Michael Gilmore, Ph.D., states that he has seen substantial improvements in the use of scientific test design and statistical analysis techniques. Through the use of these techniques, the department has been provided with test results that identify what the systems being acquired by the Services can and cannot do in combat. Dr. Gilmore goes on to describe his near-term areas of focus.
In our Inside the Beltway feature titled “How Scientific Test and Analysis Techniques Can Assist the Chief Development Tester,” Terry Murphy, et al, describe strategies executed across the testing continuum to assist the Chief Development Tester (CDT) in identifying problems early, understanding performance and reliability, and ultimately reducing discovery in Operational Testing. The authors explain the role and benefits of applying Scientific Test and Analysis Techniques across a system’s acquisition life cycle.
In his Hot Wash-Up feature titled “A Question of Rigor,” Wilson Felder, Ph.D., discusses the meaning of rigor, describing it from the minimum benchmark to a more complete definition. Dr. Felder offers five practical elements that provide a vision of the path forward in this rapidly maturing discipline – a roadmap for continued growth.
Our Historical Perspective feature titled “Measuring the Enlightenment – Thomas Jefferson and the Near-Adoption of Rationality in Early America” from Jim Welshans, Ed.D., provides a look at Thomas Jefferson’s quest to establish rational, ordered, and consistent measurement and monetary systems based on units of ten. Dr. Welshans discusses Jefferson’s amazing and gifted ways of thinking and tells the story of how he developed his standard measures in about 1790; Congress passed an act to establish Mr. Jefferson’s measurement methods in 1796. Jefferson was an author of the Declaration of Independence, Governor of Virginia for two terms, United States Minister to France, Secretary of State, Vice President of the United States, and our third President, serving from March 4, 1801, to March 4, 1809.
The first of our five technical articles is “Radar-Range Confidence Intervals for Group Comparisons and Specification Compliance Evaluation,” by Todd Remund and Arnon Hurwitz, Ph.D., in which the authors describe confidence interval methods for estimation and analysis of calibration points and differences between calibration points. Application of their delta method is proposed as a useful and practitioner-friendly alternative to other, more complicated methods.
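As general background, the delta method mentioned above can be sketched in a few lines: for a smooth function g of a sample mean, the variance of g(x̄) is approximated by g′(μ)²σ²/n. The data, transform, and confidence level below are hypothetical illustrations, not the authors’ radar-range procedure.

```python
import math
import statistics

def delta_method_ci(samples, g, g_prime, z=1.96):
    """Approximate 95% confidence interval for g(mean) via the delta method.

    Uses Var[g(x_bar)] ~= g'(x_bar)^2 * s^2 / n, a standard first-order
    (delta method) approximation. Illustrative sketch only.
    """
    n = len(samples)
    x_bar = statistics.fmean(samples)
    s2 = statistics.variance(samples)          # sample variance
    se = abs(g_prime(x_bar)) * math.sqrt(s2 / n)
    est = g(x_bar)
    return est - z * se, est + z * se

# Hypothetical example: an interval for the log of a mean measurement.
data = [9.8, 10.1, 10.3, 9.9, 10.0, 10.2, 9.7, 10.4]
lo, hi = delta_method_ci(data, math.log, lambda x: 1.0 / x)
```

The appeal for practitioners is that only the point estimate, a sample variance, and one derivative are needed, rather than a full transformation of the sampling distribution.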
The second technical article in this issue, “A Monte Carlo Approach to Estimating Test Duration and Statistical Power,” written by Bruce Bishop and Lieutenant Colonel Bruce Cox, Ph.D., discusses the use of Monte Carlo simulation to help estimate statistical power for a given amount of testing. The authors include the underlying theory and numerical examples to illustrate the method, its results, and their interpretation. Their method offers an alternative to traditional statistical power analyses for answering the question “How much testing is enough?”
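The general idea of Monte Carlo power estimation can be sketched generically: simulate many experiments under an assumed true effect, run the planned test on each, and count how often the effect is detected. All numbers below are hypothetical, and this two-sample sketch is not the authors’ implementation.

```python
import random
import statistics

def monte_carlo_power(n, effect, sigma=1.0, crit=1.96, trials=2000, seed=7):
    """Estimate the power of a two-sample test by simulation.

    Simulates `trials` experiments with n samples per group and a true
    mean difference of `effect`, and returns the fraction of experiments
    in which the test statistic exceeds the critical value.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a = [rng.gauss(0.0, sigma) for _ in range(n)]
        b = [rng.gauss(effect, sigma) for _ in range(n)]
        se = (statistics.variance(a) / n + statistics.variance(b) / n) ** 0.5
        z = abs(statistics.fmean(b) - statistics.fmean(a)) / se
        hits += z > crit
    return hits / trials

# Power grows with sample size for a fixed effect, which is how the
# simulation answers "how much testing is enough?"
p_small = monte_carlo_power(n=10, effect=1.0)
p_large = monte_carlo_power(n=40, effect=1.0)
```

Running the sketch for several candidate test sizes traces out a power curve, from which the smallest adequate amount of testing can be read off.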
In “Twenty-One Parameters for Rigorous, Robust, and Realistic Operational Testing,” Richard Kass, Ph.D., proposes 21 parameters to supplement the classical seven-step design and analysis approaches used for Design of Experiments. The author points out that enhancing test rigor often constrains the achievable levels of robustness and realism, and vice versa. His 21 parameters may help find the appropriate levels of rigor, robustness, and realism to support acquisition decisions.
In our fourth technical article, “Prioritization of Component-to-Component Links for Testing in Component-Based Systems,” Subash Kafle, et al, propose a method that uses a multi-dimensional dependency analysis to identify link priorities for testing; the method is demonstrated through application to a component-based system and validated through a Markov chain analysis of state transitions. The answers generated by such a method may be key to cost-effective integration testing.
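As generic background, a Markov chain analysis of state transitions can be sketched as follows: given observed transition probabilities between components, the chain’s stationary distribution estimates how often each component is exercised in long-run operation, suggesting which links deserve testing first. The transition matrix and three-component system below are hypothetical, not taken from the article.

```python
def stationary_distribution(P, iters=500):
    """Approximate a Markov chain's stationary distribution by power iteration.

    P is a row-stochastic transition matrix (list of lists). Starting from
    a uniform distribution, repeated multiplication pi <- pi * P converges
    to the long-run visit frequencies for an irreducible, aperiodic chain.
    """
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

# Hypothetical transition probabilities among three components A, B, C,
# as might be observed from an operational usage profile.
P = [
    [0.1, 0.6, 0.3],
    [0.4, 0.2, 0.4],
    [0.5, 0.3, 0.2],
]
pi = stationary_distribution(P)
# Rank components by long-run visit frequency; test links into the
# most-visited components first.
ranking = sorted(range(len(P)), key=lambda j: -pi[j])
```

The same machinery also supports validation: predicted visit frequencies can be compared against those measured during integration testing.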
For our fifth and final technical article in this issue, “Validating the Probability of Raid Annihilation (PRA) Testbed Using a Statistically Rigorous Approach,” Dean Thomas, Ph.D. and Rebecca Dickinson, Ph.D. describe an approach to validate a model with only a small number of live tests. This concept is ingenious, and it answers the need to limit live tests due to resource limitations yet test enough across the operational envelope to have acceptable statistical power. The modeling and simulation must be verified, validated, and accredited, with only a limited number of live tests to validate the simulation. The authors present a straightforward method and illustrate use of the method for the PRA Testbed runs.
I hope you enjoy this second issue of 2015 for The ITEA Journal of Test and Evaluation. By the time you receive issue 36-2, the September 2015 issue 36-3 is being finalized. That theme will be “T&E of Information Assurance, Information Security, and Cyber Readiness.” For the next issue, 36-4, the deadline for submissions is September 1, 2015, and the theme will be “Blending Systems Engineering, Reliability, Life Cycle Support, and Testing.” This theme for December 2015 has not been used before, and we will be open-minded with a very liberal acceptance posture.
We have posted all themes for the remainder of 2015 and for 2016 and the first two issues of 2017. Please provide feedback on the choice of themes, and please write early and often. And, I’m still waiting for the first article to be submitted to the Referees. The first article submitted as a proposed Refereed Article and approved for publication by the Referees will be awarded a free one-year ITEA membership that includes subscription to The ITEA Journal of Test and Evaluation.
March 2015 - The Right Mix of T&E Infrastructure
The theme for this issue is “The Right Mix of T&E Infrastructure,” and the issue includes a Guest Editorial, an Inside the Beltway feature, a Hot Wash-Up feature, a Historical Perspectives feature, and five technical articles.
Our Guest Editorial is titled “What is the Right Mix of T&E Infrastructure – What is Required?,” and Major General Arnold Bunch states that this is an area that the T&E Enterprise has always struggled with. He maintains that we must guard against providing desired capability in one area at the expense of the required capability in another area. General Bunch discusses the Air Force Strategic Guidance, A Call to the Future, and points out that this document, outlining the major Air Force acquisitions and technologies, is one of the guiding documents for future T&E infrastructure needs. He continues his editorial by describing the elements of infrastructure that need to be modernized and supported.
In our Inside the Beltway feature titled “The Test Resource Management Center: Stewards of the Defense T&E Infrastructure,” C. David Brown, Ph.D., describes his responsibilities as the Director of the Test Resource Management Center (TRMC) to oversee our national defense Test and Evaluation (T&E) infrastructure. In his article, Dr. Brown details missions, responsibilities, and inner workings of the TRMC. He discusses the processes and oversight responsibilities of his office; and he describes the TRMC organizational structure, T&E Science and Technology program, Central T&E Investment Program, Joint Mission Environment Test Capability, and the National Cyber Range. TRMC seeks to provide robust and flexible T&E support to warfighters today and tomorrow.
In his Hot Wash-Up feature titled “Managing the National Test and Evaluation Infrastructure: Myths and Opportunities,” Wilson Felder, Ph.D., states that the reality of how we can better manage our national test infrastructure to avoid duplication, increase efficiency, or lower cost is complicated. Dr. Felder discovered, in leading a joint National Aeronautics and Space Administration and Department of Defense study, that there were serious shortfalls in the national inventory in several areas, including airborne test assets. He also describes barriers to change in three areas: organizational structure, funding constraints across agencies, and the short-term nature of T&E funding. His feature then discusses opportunities to consolidate, share, and more effectively use the T&E facilities we maintain.
Our Historical Perspectives feature titled “No Particular Place to Go – On the Road Again” from Jim Welshans, Ed.D., provides a look at the General Motors (GM) Proving Ground that opened in 1924. Dr. Welshans discusses the tests devised as measures of quality and performance. The GM Proving Ground was one contributor to improving the safety and effectiveness of automobiles, which in the industry’s early years averaged only six years on the road and rarely stayed in service beyond 25,000 miles.
Our first of five technical articles is “Wind Tunnel Testing in the Department of Defense” written by Edward Kraft, Ph.D., and Lieutenant David Stebbins. The authors have cataloged the eight world-class wind tunnels that the Department of Defense has in the Major Range and Test Facility Base. These wind tunnels cover flight regimes from subsonic to hypersonic, and they can support aerodynamics, propulsion integration, weapons integration, structural load behavior, heat transfer, and air vehicle platform test insight for current as well as emerging flight systems. The authors recommend that the nation develop a persistent roadmap to assure the viability of critical resources like those described in this article.
For the second technical article in this issue, “Using the Developmental Evaluation Framework to Right Size Cyber T&E Test Data and Infrastructure Requirements,” C. David Brown, Ph.D., et al, state that program managers (PMs) must identify cyber security requirements early in the acquisition lifecycle so they can take advantage of early cyber security testing in Developmental T&E in advance of Operational T&E. The authors argue that cyber security must be addressed in a collaborative and disciplined manner throughout a program’s entire acquisition lifecycle; early cyber security T&E involvement will help PMs deliver resilient systems to the warfighter.
In our third technical article, Mark Radke, et al, deliver an analysis of “Modernizing the Test and Evaluation Infrastructure to Support Next Generation Aeronautical Mobile Telemetry” stating that telemetry networks are a necessary building block to assist in achieving the T&E architecture to support future distributed test events effectively and efficiently. A networked instrumentation system with a complementary wireless component will be a key enabling technology for this capability. Some of the developments proposed may help support large footprint testing required of new and emerging weapon systems even when time and resources are reduced.
In “Applying the Test and Evaluation (T&E) Paradigm to Evaluating the T&E Infrastructure – Wheelchair Test Days at Edwards Air Force Base,” Darcy Painter describes an initiative at Edwards Air Force Base (AFB) to apply T&E paradigms to test wheelchair accessibility. It is one thing to observe situations; it is quite another to sit in the chair and try to get through the workday. Ms. Painter describes how Edwards AFB leaders participate in Wheelchair Test Day (WTD); through direct experience, they realize that something that looks easy is in fact hard, and how challenging Edwards AFB’s vintage facilities are to navigate using four wheels vice two legs. Results of WTD have helped in the prioritization of improvements to facilities, entrances, doors, roads, and crossings.
For our final technical article in this issue, “Testing to Ensure Compliance with 1% UXO Limitations,” David Sparrow, Ph.D., et al, describe preliminary testing of a cluster munition with the addition of a self-destruct fuse (SDF) to ensure that delivery of the submunitions does not result in more than 1% unexploded ordnance (UXO). Testing this capability is required by United States law. The paper discusses the proposed extensive testing of the SDF, setting the evaluation criteria, and selecting appropriate sample sizes to conduct a meaningful test.
I hope you enjoy this first issue of 2015 for The ITEA Journal of Test and Evaluation. By the time you receive issue 36-1, the June 2015 issue 36-2 is being finalized. That theme will be “Test Methodology Rigor.” For the next issue, 36-3, the deadline for submissions is 1 June 2015, and the theme will be “T&E of Information Assurance, Information Security, and Cyber Readiness.”
We have posted all themes for the remainder of 2015 and for 2016. Please provide feedback on the choice of themes, and please write early and often. And, I’m still waiting for the first article to be submitted to the Referees.