Shift Left! Test Earlier in the Life Cycle


Author: Steven J. Hutchison, Ph.D.

To achieve the outcomes of Better Buying Power (BBP) and deploy improved capability to our warfighters in an effective and timely manner, we have to get the development right and verify it through rigorous developmental test and evaluation (DT&E) before we commit to production. In other words, we have to Shift Left!

We have to change the paradigm that encourages testing late in the acquisition life cycle. Our acquisition process has some important test and evaluation (T&E) activities occurring after the decision to begin production. Why does this matter? In short, testing late means finding problems late, when they are most costly to fix. Late discovery then leads either to delayed deployment or to accepting and fielding the system, leaving our warfighters to bear the burden of the development shortcoming.

The Shift Left initiative fundamentally is about improving DT&E to set the conditions for successful production and deployment. Shift Left achieves this goal through earlier identification and correction of failure modes, thereby avoiding the high costs of late-cycle repair and reducing the impact to our warfighters of fielding capabilities that do not satisfy requirements.

There are three key elements of Shift Left: earlier testing for interoperability, earlier testing of cybersecurity, and conducting DT&E in a mission context. While shifting tests of interoperability and cybersecurity earlier in the life cycle forms a more comprehensive set of pre-production developmental test activities and gains test efficiencies, mission context is essential to adequately evaluate (and expose potential failure modes in) the four critical developmental issue areas: performance, reliability, interoperability, and cybersecurity. Bringing mission context into DT&E does not mean program managers (PMs) have to rehearse the initial operational test and evaluation (IOT&E), but getting the system out of the lab to see how it actually will be used always should be an important part of developmental testing (DT).

Interoperability has proven to be a persistent challenge, especially throughout the past decade of combat operations, which suggests we are not finding interoperability issues early enough in DT to fix them before operational urgency demands the system go to the field. There also are considerable data from assessing information assurance during operational exercises that show that fielded systems are vulnerable in the cyber domain. Clearly many of the interoperability issues and cybersecurity vulnerabilities could have, and should have, been found and corrected before the systems were fielded.

The Office of the Deputy Assistant Secretary of Defense for Developmental Test and Evaluation (DASD[DT&E]) and Director, Test Resource Management Center (TRMC) has embarked on a course to work aggressively with chief developmental testers and lead DT&E organizations to help them achieve the objectives of BBP, and more important, to help ensure that a development problem does not become a warfighter problem.

Developmental Testing in the DoD Acquisition Process

Take a look at the array of T&E activities relative to the Milestone C decision as depicted on the acquisition “wall chart” (https://ilc.dau.mil). The wall chart is a detailed systems-engineering-based depiction of activities and critical decisions described in the DoD 5000 series directive and instruction. Figure 1 highlights the main T&E activities. This image illustrates what appears to be a good DT&E strategy as the program moves up the right-hand side of the engineering and manufacturing development (EMD) phase “systems engineering V” in preparation for Milestone C. But it is incomplete.

Joint interoperability certification testing follows Milestone C during the production and deployment phase, and cybersecurity testing, which is not shown on the wall chart but is critically important for today’s Net-enabled capabilities, typically occurs after Milestone C and under the auspices of the Defense Information Assurance Certification and Accreditation Process (DIACAP) (DoD Instruction 8510.01). In terms of informing the Milestone C decision, interoperability and cybersecurity testing are late to need. Interoperability and cybersecurity certification involve critical test activities that should be part of a robust DT&E strategy.

This discussion so far has been based on a chart; the question is: What outcomes are programs achieving in the real world? Where interoperability and cybersecurity are concerned, we have considerable data showing that unresolved issues continue to be discovered in operations. One source of data is the program of interoperability and information assurance assessments during Combatant Command and Service exercises. After almost a decade of these assessments, the program continues to observe … cyber effects caused by unresolved interoperability deficiencies, coupled with low-to-moderate level threats that were sufficient to adversely affect the quality and security of mission critical information in a way that could (and where permitted did) degrade mission accomplishment significantly. (http://www.dote.osd.mil/pub/reports/FY2012/)

Since this program assesses the interoperability and information assurance posture of operational systems, it provides value in process hindsight; in other words, it lets us see what got through the certification processes into the field. While this program is a significant source of vulnerability information, it is subject to real-world limitations when conducting cybersecurity testing on live networks with live data (note use of the phrase “where permitted” in the quote above). We have the means to overcome this limitation in DT&E. We can be certain, though, that the interoperability and information assurance certification processes permit numerous defects to get to the field and it is time to reverse that trend.

Shift Left!

The intent of Shift Left is to set the conditions for improved production readiness and reduce the likelihood that major deficiencies get to the field. Shift Left also is an effort to influence an acquisition culture that today focuses on IOT&E and the full-rate production decision. Since these events are late in the life cycle, PMs frequently trade off testing during DT (“we’ll do that in operational testing [OT]”) for other priorities. Late life-cycle focus also effectively lowers the bar for entry into low-rate initial production (LRIP), a trade that rarely pays off. This is not a new trend; in July 2000, the General Accounting Office (GAO, now the Government Accountability Office) wrote: “Despite good intentions and some progress by the Department of Defense (DoD), weapon system programs still suffer from persistent problems associated with late or incomplete testing” (GAO, “Best Practices: A More Constructive Test Approach Is Key to Better Weapon System Outcomes,” July 2000; http://www.gao.gov/assets/160/156809.pdf).

The Weapon Systems Acquisition Reform Act (WSARA) of 2009 is the most recent attempt by Congress to help DoD acquisition. One of the means to improve acquisition outcomes through this legislation was renewed emphasis on DT&E, implementing several recommendations from the May 2008 Report of the Defense Science Board (DSB) Task Force on Developmental Test and Evaluation (www.acq.osd.mil/dsb/reports/ADA482504.pdf). This included establishment of a DT&E director under the supervision of the Under Secretary of Defense for Acquisition, Technology and Logistics (USD[AT&L]). The WSARA also required component acquisition executives to provide appropriate resources for developmental test organizations to “participate in and oversee the conduct of developmental testing, the analysis of data, and the preparation of evaluations and reports based on such testing.” More legislative support for DT&E appeared in section 835 of the FY2012 National Defense Authorization Act (NDAA), which requires each major defense acquisition program to be supported by a chief developmental tester and a government test agency serving as the lead DT&E organization. Section 904 of the FY2013 NDAA continued the trend and granted additional authorities to the DASD(DT&E).

Unfortunately, these legislative efforts fall short of addressing the late life-cycle emphasis on full-rate production. In fact, legislation drives much of the focus late in acquisition: 10 U.S.C. section 2399 establishes considerations for operational test and evaluation; it offers no similar considerations for DT&E. Moreover, sections 2366 and 2399 both establish conditions for proceeding beyond LRIP; there are no similar conditions for proceeding into LRIP. The GAO reported this finding almost 20 years ago:

Congress may wish to require that all defense acquisition programs (major and nonmajor) conduct enough realistic testing on the entire system or key subsystems to ensure that key performance parameters are met before LRIP is permitted to start. The objective of GAO’s recommendations is to avoid the premature commitment to production and thereby avoid fielding systems that do not meet requirements and need costly and time-consuming retrofits. (GAO, Weapons Acquisition: Low-Rate Initial Production Used to Buy Weapon Systems Prematurely, November 1994; http://www.gao.gov/assets/160/154796.pdf)

In other words, the GAO was recommending a Shift Left. And it hasn’t just been the GAO; there have been countless Blue Ribbon commissions, Defense Science Board panels, National Research Council studies, Inspector General reports, industry reports, etc., that all make the same recommendation: When it comes to testing, earlier is better.

The challenge for us is to overcome process inertia and positively effect change for acquisition programs. Developmental testing has a significant role in accomplishing this objective, but the key is simply to “do better DT&E” to find and fix the problems before entering production. Doing better DT&E requires us to get beyond the notion that DT is just “technical testing.” On the back of the wall chart, for example, is this definition of Developmental Test and Evaluation: “A technical test conducted to provide data on the achievability of critical system performance parameters.” A 2006 National Research Council report, “Testing of Defense Systems in an Evolutionary Acquisition Environment,” recommended revising DoD testing procedures to “explicitly require that developmental tests have an operational perspective (i.e., are representative of real-world usage conditions) in order to increase the likelihood of early identification of operational failure modes and system deficiencies… .” (http://www.nap.edu/catalog.php?record_id=11575). The way I see it, if programs conduct DT&E only to verify compliance with specifications, we will completely miss the sense of whether the capability satisfies the warfighter’s need. However, if programs test in a mission context during DT&E, not only will we be able to answer the technical questions, but we will also obtain that critical user feedback early in the life cycle—that’s a 2-for-1 better buying power bargain!

Robust DT&E should include all of the elements of interoperability and cybersecurity testing and bring the right resources to bear to provide confidence in the decision to enter production. As DoD acquisition programs become increasingly complex, DT&E must leverage all resources and test venues as potential data sources, to include use of modeling and simulation, and where practical, leverage training exercises, experimentation, and operations. DT&E should exploit the power of the network—such as the joint mission environment test capability (JMETC)—as a way to bring test resources together to reduce cost, gain efficiency, and improve realism.

End-to-end testing using joint mission threads, and testing with a realistic cyber threat (in a cyber range suited for that purpose), will provide this confidence.

We are making progress in shifting interoperability testing to the left. The latest version of the Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 6212.01F, Net Ready (NR) Key Performance Parameter (KPP), states:

(A.2.c) DoD Components will ensure the Component Developmental Test and Evaluation (DT&E), Operational Test and Evaluation (OT&E) processes include mission-oriented NR KPP assessments … .

Note the emphasis on “mission oriented” assessments. Additionally, the CJCSI 6212 establishes a relationship between the Joint Interoperability Test Command (JITC) and DASD(DT&E) to ensure more attention to interoperability during DT&E:

(A.7.b) DISA will ensure JITC leverages previous, planned and executed DT&E and OT&E tests and results to support joint interoperability test certification and eliminate test duplication. DASD(DT&E) shall approve Developmental Test and Evaluation plans in support of Joint Interoperability Test Certification as documented in the TEMP [Test and Evaluation Master Plan]. JITC shall advise DASD(DT&E) regarding the adequacy of test planning in support of Joint Interoperability Test Certification.

In meeting with JITC, we determined that the best path forward was not to introduce a burdensome new test plan approval process; rather, we decided to work with chief developmental testers to add, where appropriate, relevant interoperability tests and data collection activities during DT&E, and reflect the interoperability test objectives in the DT&E event descriptions and required resources in the TEMP.

We have a similar effort in the update to the DoD 8500 series directive and instructions for cybersecurity. The 8500 series is under revision to implement the “risk management framework” for cybersecurity. As was highlighted in the discussion of Figure 1, our current process does not adequately incorporate cybersecurity testing as a critical developmental test activity. Security test and evaluation (ST&E) has been all but lost in the current DIACAP process. Security test and evaluation was defined under the former DITSCAP process (DoD Information Technology Security Certification and Accreditation Process, DoDI 5200.40), which preceded DIACAP, as “examination and analysis of the safeguards required to protect an IT system, as they have been applied in an operational environment, to determine the security posture of that system.”

Figure 1. Test and Evaluation in the Defense Acquisition System

The phrase “safeguards required” is interesting. We test to requirements, and all acquisition programs have a set of “mandatory” KPPs that drive major elements of the test strategy. The “mandatory” KPPs (the Manual for the Operation of the Joint Capabilities Integration and Development System, Jan. 19, 2012, uses the word “mandatory” in quotes) include force protection, survivability, sustainment, Net-ready, training, and energy. There is no cybersecurity KPP. One can argue that the survivability KPP can be applied to systems in the cyber domain, but that just diminishes what should today be considered a mandatory requirement in its own right, with well-defined attributes specifically written for network-enabled military capabilities. The department should mandate a cybersecurity KPP, require cybersecurity testing in DT&E, include it in the TEMP, and resource it accordingly. The department also should require all programs to test cybersecurity against a realistic cyber threat, and make use of a cyber range to limit the risk of collateral damage to live networks and data sources.

Shift Left is a priority initiative for the DASD(DT&E) and Director, TRMC. In our engagement with programs, we are assisting PMs, chief developmental testers, and the lead DT&E organizations in developing and executing a comprehensive DT&E strategy that evaluates the system in a mission context and includes early testing of interoperability and cybersecurity. We will help programs craft the wording in TEMPs and other documents to reflect a sound DT&E strategy that will set the conditions for initial production. We will assist programs in provisioning the necessary infrastructure resources, such as a cyber range and the JMETC. We sponsor the Scientific Test and Analysis Techniques Center of Excellence at the Air Force Institute of Technology (http://www.afit.edu/en/stat_te_coe) to assist programs with statistical approaches to test design. Finally, consistent with the 2008 DSB recommendation for the DT&E office to brief an “assessment of DT results” at milestone decision reviews, we will make an internal shift left process change and transition the “assessment of operational test readiness” (AOTR) (note its placement in Figure 1) to a “DT&E Assessment” of performance, reliability, interoperability, and cybersecurity that better supports decision making at each milestone.

Summary

Developmental test and evaluation is a tool PMs use throughout the life cycle to identify areas in need of improvement, manage risks, build confidence, and gain early and continuous feedback; it is the key to informing the decision to begin production, and sets the conditions for improved acquisition outcomes. Developmental testing is the key to acquisition agility and delivering capabilities to the warfighter more effectively and efficiently. To achieve the outcomes of Better Buying Power and deploy improved capability to our warfighters in an effective and timely manner, we have to get the development right and verify it through rigorous DT&E before we commit to production. We have to Shift Left!


Hutchison is the Acting Deputy Assistant Secretary of Defense for Developmental Test and Evaluation/Director of the Test Resource Management Center in the Office of the Under Secretary of Defense for Acquisition, Technology and Logistics.

The author can be contacted at steven.hutchison@osd.mil.

Additional materials can be found in the Acquisition Community Connection, DT&E Community at https://acc.dau.mil/CommunityBrowser.aspx?id=22039. Also, check out my blog on the Defense Acquisition Portal at https://dap.dau.mil/cop/trmcblogs.
