2017 Pre-Symposium Tutorials

34th Annual International Test and Evaluation Symposium
T&E in a Time of Risk and Change

Hyatt Regency ~ 1800 Presidents Street ~ Reston, VA  20190


NOTE: These tutorials require a separate fee from the Symposium.

Single 4-hour Tutorial - $205, Two 4-hour Tutorials - $385 (use discount code "Tutorial-Multi" at checkout).


MONDAY - October 2, 2017


THURSDAY - October 5, 2017

Tutorial Abstracts

 


Cybersecurity Test & Evaluation and the National Cyber Range

Instructor: Mr. Pete Christensen, CTEP – Director, National Cyber Range, Test Resource Management Center (TRMC)

This tutorial is intended for managers and practitioners who are required to conduct test and evaluation of systems operating in Cyberspace. The tutorial introduces key concepts associated with Cyberspace and Cyberspace Operations. The material will cover both Offensive Cyber Operations and key avenues of attack, as well as Defensive Cyber Operations and strategies for defending against those attacks. With respect to the DOD 5000 Process, we will discuss approaches for developing and testing systems to ensure mission effectiveness in a contested Cyber Environment. Finally, we will provide an overview of available resources and ongoing initiatives to improve Cyberspace T&E.


Data Science and Its Relationship to Test & Evaluation

Instructor: Mark Kiemele, Ph.D. – President, Air Academy Associates

In a data-driven economy, industry and government leaders rely increasingly on skilled professionals who can see the significance in data and use data analytic techniques to properly collect data, solve problems, create new opportunities, and shape change.  Data science can be defined as the art and science of solving problems and shaping decisions through the precise collection and analysis of data.  This tutorial is intended for executives, leaders, managers, and practitioners who need to know how their critical thinking can be impacted by such things as Big Data, Predictive Analytics, Design of Experiments (DOE), and other tools in the Data Science toolkit.  This tutorial will cover the need for critical thinking as well as a high-level view of a variety of data analytic tools that can be used to enhance critical thinking.  Even participants who never design a test or evaluate its results will be able to explain the uniqueness of DOE and why big data and predictive analytics are needed to generate the analytical capability every organization needs.


Planning and Executing Cyber Table Tops, Facilitator Training

Instructor: Ms. Sarah Standard, Cybersecurity/Interoperability Technical Director, DASD DT&E

The primary objective of the Cyber Table Top (CTT) Facilitator Training Workshop is to build the knowledge, skills, and abilities that will allow trainees to successfully construct, coordinate, organize, and execute a CTT exercise. The primary audience for this training is personnel who will facilitate and moderate CTTs for their program or command. The training will include tips, tools, and resources for CTT facilitators, as well as a practical example of the process and outputs.


Processes for Testing with International Partners

Instructor: Mr. Robert Butterworth – Sustainable Ranges and International Programs, Office of the Director, Operational Test & Evaluation (DOT&E)

Defense budgets are shrinking; requirements for complex systems and systems-of-systems are increasing; and interoperability with allies is becoming the norm by necessity. These are challenges all nations face.  Duplicative testing is inefficient for all nations, so sharing of "test resources" is highly desirable. "Test resources" include test facilities, open-air ranges and operating areas, laboratories, equipment, expertise, methods, data, and funds. Upon making the decision to test, participants must complete certain administrative actions to implement a test program. To test with an international partner, an international agreement must be in force. To test under such an agreement, the partnering nations must negotiate and approve a project arrangement. The laws of sovereign nations govern such activity, and DOD has developed administrative processes to ensure statutory compliance. The Office of the Director, Operational Test and Evaluation (DOT&E) will offer a tutorial to inform members of the test community of the capabilities and limitations of the International Test and Evaluation Program and how to develop project arrangements with an individual partnering nation and with multiple partnering nations.

Speakers will be representatives from the Office of the Director, International Cooperation in the Office of the Undersecretary of Defense for Acquisition, Technology, and Logistics, the International Test and Evaluation team within DOT&E, and international partners with whom the DOD test community has worked for many years.


Real-World DOE and Modern Design and Analysis Methods

Instructor: Thomas A. Donnelly, PhD, CAP

Part 1: Custom DOE – Making Your Design Fit the Problem
This tutorial will present solutions to real-world Design of Experiments (DOE) problems.  You will learn how to treat, separately and in combination, factors of the following types: continuous/quantitative, categorical/qualitative, discrete numeric, mixture, covariate, blocking, and hard-to-change.  It will demonstrate how to constrain design regions and disallow certain factor-level combinations.  It will show how to augment or add onto existing experiments.  By using both augmentation and constraints, it will show how to repair a broken design.  It will show how to design when special knowledge of the model is available.  Algorithmic custom DOE is the most efficient way to develop accurate and useful models of complex real-world processes.
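The candidate-set idea behind algorithmic custom DOE can be sketched in a few lines: enumerate a grid of possible runs, then drop points that violate a region constraint or a disallowed factor-level combination. All factor names, levels, and constraints below are invented for illustration; a point-exchange or coordinate-exchange algorithm would then select optimal runs from the surviving candidates.

```python
from itertools import product

# Hypothetical candidate set: a 5-level grid on two continuous factors
# plus one two-level categorical factor (names and constraints are
# illustrative only).
levels = [-1.0, -0.5, 0.0, 0.5, 1.0]
catalyst = ["A", "B"]

candidates = []
for x1, x2, cat in product(levels, levels, catalyst):
    if x1 + x2 > 1.5:                # linear constraint shrinking the region
        continue
    if cat == "B" and x1 == -1.0:    # disallowed factor-level combination
        continue
    candidates.append((x1, x2, cat))

# An optimal-design algorithm would now pick the N runs from this
# candidate list that maximize a criterion such as D-optimality;
# here we just count the feasible runs.
print(len(candidates))               # 43 of the 50 grid points remain
```

Because the optimizer only ever sees the filtered candidate list, constrained regions and disallowed combinations are handled before design selection begins.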

Part 2: Using Definitive Screening Designs to Get More Information from Fewer Trials
This tutorial is for testers interested in learning to use the new Definitive Screening Design (DSD) method of Design of Experiments. DSDs not only efficiently identify important factors but can often support second-order predictive models. For the same number of factors, three-level DSDs are often smaller than the popular two-level fractional-factorial (FF) designs, yet yield more information, especially about curvature for each factor. When first published in 2011, DSDs worked only with continuous factors; subsequent publications in 2013 and 2015 added support for two-level categorical factors and blocking factors. When the number of significant factors is small, a DSD can collapse into a 'one-shot' design capable of supporting a response-surface model with which to make accurate predictions. A case study will be shown in which a 10-factor process is optimized in just 24 trials. In cases where too many factors are significant and the design can't collapse into a one-shot design, existing trials can economically be augmented to support a response-surface model in the important factors. Comparisons between augmented DSDs and augmented FF designs will show that DSDs yield more information in fewer trials.
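The fold-over structure that gives DSDs these properties can be illustrated with a small, hypothetical 4-factor example: stacking a conference matrix, its negative, and a single center run yields a 9-run, three-level design in which no main effect is aliased with any squared effect. This is a minimal sketch of the idea, not the published construction for every factor count.

```python
import numpy as np

# Order-4 conference matrix: zero diagonal, +/-1 off-diagonal,
# mutually orthogonal columns (C.T @ C = 3 * I).
C = np.array([[ 0,  1,  1,  1],
              [-1,  0,  1, -1],
              [-1, -1,  0,  1],
              [-1,  1, -1,  0]])

# DSD for m = 4 factors: fold-over of C plus a center run,
# giving 2m + 1 = 9 runs at three levels per factor.
dsd = np.vstack([C, -C, np.zeros((1, 4), dtype=int)])

# Key DSD property: every main effect is orthogonal to every squared
# effect, so linear and quadratic terms are estimated free of aliasing.
cross = dsd.T @ (dsd ** 2)
print(dsd.shape)                 # (9, 4)
print(bool(np.all(cross == 0)))  # True: no main/quadratic confounding
```

The orthogonality comes directly from the fold-over: each run and its mirror share the same squared values but opposite signed values, so the cross-products cancel in pairs.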

Part 3: Strategies for Analyzing Modern Screening Design of Experiments
The new Definitive Screening Designs (DSDs) provide clean estimates of all main effects and squared effects for the design factors.  This leads to saturated or nearly saturated models and the potential to falsely identify lower-power squared terms as important.  Effective strategies for analyzing these designs are reviewed to build a consensus model from the data, and a newly developed (2015) method for robustly determining the most likely model will be featured.  In this tutorial we examine several strategies for analyzing DOE data sets.  We start with graphical exploration of the data using interactive distributions and scatterplots.  With an idea of which factors are visually dominant, we move on to conservative modeling approaches, such as looking at first-order effects before moving on to second-order effects, including interactions, guided by the "effect heredity" and "effect sparsity" principles.  Finally, aggressive strategies are used, including stepwise regression with several different stopping criteria to prevent overfitting, and even fitting "all possible models."  Actual vs. predicted plots with checkpoints can be used to help choose models.
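The conservative-then-aggressive sequence can be sketched with synthetic data (the factor count, design, and true model below are invented for illustration): fit main effects first, then, guided by effect heredity, add a squared term only for a factor whose main effect is active, and compare the residual error.

```python
import numpy as np
from itertools import product

# Hypothetical 3-factor, three-level design with a known true model
# (synthetic, noise-free data for illustration only).
X = np.array(list(product([-1.0, 0.0, 1.0], repeat=3)))  # 27 runs
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 4.0 * X[:, 0] ** 2   # true model

def rss(terms):
    """Least-squares fit of intercept + terms; returns residual sum of squares."""
    A = np.column_stack([np.ones(len(X))] + terms)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return float(np.sum((y - A @ coef) ** 2))

# Conservative first pass: main effects only.
rss_main = rss([X[:, 0], X[:, 1], X[:, 2]])

# Effect heredity: factor 0's main effect is active, so its squared
# term is a legitimate candidate for the second-order model.
rss_quad = rss([X[:, 0], X[:, 1], X[:, 0] ** 2])

print(rss_quad < rss_main)   # True: the squared term explains the curvature
print(round(rss_quad, 6))    # 0.0: the true model is recovered exactly
```

In a real analysis the candidate squared and interaction terms would come from the visually dominant factors, and a stopping criterion would guard against overfitting; the heredity principle simply limits which second-order terms are ever considered.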


Software Assurance

Instructor: Mr. Robert Martin – Senior Secure Software & Technology Principal Engineer, MITRE

This tutorial will explore how the activities directed in DoDI 5200.44, DoDI 8510.01 (2014), and DoDI 8500.01 (2014), together with Program Protection Plans, developmental test and evaluation, and systems engineering design and architecture reviews, can be used to gain assurance about DOD software and its resilience to attack.

Improving our assurance that the mission will not be circumvented, undermined, or unnecessarily put at risk through attacks on the software that provides critical mission capabilities requires a shift in focus and integration of many types of assessment activities across the acquisition life cycle.

This tutorial will also cover how public vulnerability information, along with an understanding of the weaknesses in commercial and open-source software, puts the mission at risk. Publicly available information about these weaknesses and the patterns of attack they are susceptible to can be used to test GOTS and custom software, giving insight into how attackable DOD software is and what can be done to address those risks.


T&E 1-2-3, The Fundamentals

Instructor: Mr. Matt Reynolds – Test and Evaluation Consulting

This tutorial is designed to explain the true fundamental concepts and precepts that apply to all testing. The literature on T&E is replete with policies and practices that serve the needs of specific generations of systems, technologies, and acquisition strategies. They have evolved in a reactionary manner. But little has been published that describes the universal principles that underlie those policies and that do not change over time. A good understanding of these timeless principles is critical to the success of complex T&E programs. The primacy of thorough planning, contingency strategies, validation of test support resources, enterprise-level thinking, and a thorough understanding of customer requirements (both stated and unstated) will be explained and reinforced by lessons learned from past programs. Several simple but effective exercises will be included to stimulate thinking and reinforce the teaching points.


 

Test and Evaluation Across the Acquisition Lifecycle

Instructor: Michael Flynn, PhD, CTEP - Defense Acquisition University

This tutorial will focus on the latest DoDI 5000.02 guidance for the defense acquisition process from a Test and Evaluation perspective, with emphasis on T&E's involvement in the systems acquisition lifecycle and its relationship to the systems engineering processes used throughout the lifecycle of major acquisition programs, from requirements generation through post-Milestone C.  Coverage will include the relationship between the Test and Evaluation Master Plan (TEMP) and the Systems Engineering Plan (SEP) as they proceed through each of the major milestone phases.  Focus will be on the major events that occur during each phase of acquisition, required documentation, and the expected entrance and exit criteria for successfully achieving approval.  The intended audience is engineers, program managers, and industry personnel seeking an understanding of DoD acquisition in relation to T&E's involvement.


The Art of Planning Preview T&E: Australian Techniques for Early Test Strategies for Technology Maturation and Risk Reduction

Instructor: Group Captain Keith F. Joiner, Royal Australian Air Force (Ret.), CSC

This four-hour workshop will benefit anyone who is involved in planning or conducting early T&E to de-risk and shape more successful projects. Such participants are likely to have been part of such planning processes before, but this workshop is an opportunity for them to examine a fresh systematic approach and see where their previous processes and personal master test planning skills might be made more robust.

Western governments continue to find that an unacceptable proportion of projects fail to deliver the capability sought, and that inadequate early T&E or trialing is a significant factor in risks not being identified early enough to be mitigated. An Australian Senate inquiry into Defence procurement (2012, especially Ch. 2 & 12) found this to be some ten percent of projects by value. A more recent report on broader Australian government public project failings (Shergold Report, 2015) found a systemic inability to identify and plan early trialing as part of scoping projects.

New Defence T&E policy was implemented in Australia from 2013-14 to systematically plan and conduct de-risk, or preview, T&E (see Dr. Joiner's article in the ITEA Journal, December 2015). Focused workshops ensure preview T&E is driven by significant technical and operational risk into a program of key confirmatory demonstrations, configuration audits, and user trials. Within the U.S. DoD, such early T&E would typically occur during the Technology Maturation and Risk Reduction (TMRR) lifecycle phase and thus would be planned and funded in the Analysis of Alternatives (AOA) phase at Milestone A.

The Australian planning technique has since been confirmed in Defence T&E policy updates (2016) and is taught at the leading Defence university in Australia, the University of New South Wales Canberra, as part of all postgraduate master's programs in systems engineering and project management.

Workshop participants will be given an overview of the workshop process and will use a hypothetical capability requirement to role-play the workshops, determining indicative outcomes for each phase of the hypothetical project. Two Australian examples will then be covered where such planning was used to positive effect, and another where it was comparatively ignored, to contrast the benefits of de-risking projects through such early T&E. At the end of the workshop, participants will have a chance to reflect back to the group on opportunities where they might previously have used such processes.


Using TENA and JMETC to Reduce Risk, Saving Time and Money

Instructor: Mr. Gene Hudgins – TENA and JMETC User Support Lead, KBRWyle

The Test and Training Enabling Architecture (TENA) was developed as a US DoD Central Test and Evaluation Investment Program (CTEIP) project to enable interoperability among ranges, facilities, and simulations in a timely and cost-efficient manner, as well as to foster reuse of range assets and future software systems. TENA provides for real-time software system interoperability, as well as interfaces to existing range assets; Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) systems; and simulations.

TENA, selected for use in Joint Mission Environment Test Capability (JMETC) events, is well-suited to its role in prototyping demonstrations and distributed testing. JMETC is a distributed LVC testing capability developed to support the acquisition community during program development, developmental testing, operational testing, and interoperability certification, and to demonstrate Net-Ready Key Performance Parameter (KPP) requirements in a customer-specific Joint Mission Environment. JMETC uses a hybrid network architecture. The JMETC Secret Network (JSN), based on the SDREN, is the T&E enterprise network solution for secret testing. The JMETC Multiple Independent Levels of Security (MILS) Network (JMN) is the T&E enterprise network solution for all classifications and cyber testing. JMETC provides readily available connectivity to the Services' distributed test capabilities and simulations, as well as industry test resources. JMETC is also aligned with the Joint National Training Capability (JNTC) integration solutions to foster test, training, and experimental collaboration.

TENA provides the architecture, software implementation, and capabilities necessary to quickly and economically enable interoperability among range systems, facilities, and simulations. TENA also fosters range asset reuse for enhanced utilization and provides composability to rapidly assemble, initialize, test, and execute a system from reusable, interoperable elements. Because of its field-proven history and acceptance by the range community, TENA provides a technology already deployed and well tested within the US Department of Defense.

This tutorial will inform the audience of the current impact of TENA and JMETC on the Test, Training, and Evaluation community, as well as their expected future benefits to the range community and the warfighter.

 

 
International Test and Evaluation Association (ITEA) | E-mail: info@itea.org | www.itea.org © 2015 | All Rights Reserved.
4400 Fair Lakes Court, Suite 104, Fairfax, VA 22033-3899 | Phone: 703-631-6220 | Fax: 703-631-6221