
The Effects of System Prototype Demonstrations on Weapon Systems



Authors: Edward J. Copeland, Thomas H. Holzer, Timothy J. Eveleigh, and Shahryar Sarkani

The inability of Department of Defense (DoD) programs to sufficiently reduce technology risk prior to entering formal systems development contributed, between 2007 and 2012, to a 13 percent cost growth in weapon systems acquisition and a 17 percent increase in cycle time to deliver initial operational capability. With the advent of key legislation and resulting DoD acquisition reform initiatives, weapon systems programs are now required to enforce a technology development strategy that fosters true risk reduction prior to entering systems development. A key enabler to reducing technology risk, and thereby accelerating design maturity, is the use of system prototype demonstrations. The objective of this article is to present research findings on the effects of system prototype demonstrations on weapon systems development for major defense acquisition programs. The results of this research will better inform systems engineers and contribute to improved technology development strategies.


The Department of Defense (DoD) has historically struggled to implement effective risk-mitigation strategies in the development of highly complex weapon systems, as evidenced by increasing cost and schedule growth over the past several decades (General Accounting Office, 1999; Government Accountability Office [GAO], 2006b). The inability of DoD programs to sufficiently reduce technology risk prior to allowing a program to enter formal systems development has, as measured from 2007 to 2012, contributed to a 13% cost growth in weapon systems acquisition and a 17% increase in cycle time to Initial Operational Capability, or IOC (GAO, 2013). Acquisition cycle time is defined as the span of time from program start to deployment of IOC to the warfighter. When compared to First Full Estimates, the DoD major defense acquisition program (MDAP) portfolio total acquisition cost had grown an average of 38%; correspondingly, product cycle time increased an average of 37% (GAO, 2013).

First Full Estimates, as defined by the GAO, are the original total acquisition cost estimates established at program development start (GAO, 2012, p. 36). The GAO estimates for MDAPs and their total acquisition costs are collected from DoD Selected Acquisition Reports (SAR) and consist of research and development, operations and maintenance, and military construction costs (GAO, 2012, p. 171). Clearly, this performance trend has been unacceptable, and further attention is required to manage technology risk effectively.

Today’s economic climate continues to threaten available DoD funds and underscores the need for streamlined but effective systems engineering. Smart application of cost-effective tools and techniques, such as system prototype demonstrations, should be leveraged to ensure maximum payback per dollar toward risk reduction. The cost of using prototypes, balanced against value-added risk-reduction returns, will contribute to program “Should Cost” savings. The phrase Should Cost, institutionalized by DoD as part of Better Buying Power 2.0, refers to an initiative for MDAPs to eliminate inefficiencies and capitalize on cost-saving opportunities (Carter & Mueller, 2011). Writing about today’s economic climate in Proceedings, the U.S. Naval Institute’s flagship magazine, VADM David Dunaway, Commander of the Naval Air Systems Command, observed: “In the face of decreasing budgets, rapidly evolving threats, and a shift in defense strategy, … it’s imperative that every dollar spent increases warfighting capability” (Dunaway, 2013).


Through the use of descriptive statistics and empirical analysis, this article summarizes the comparative performance for MDAPs that did and did not invest in system prototype demonstrations for early risk reduction prior to entering system development, otherwise referred to as Engineering and Manufacturing Development (EMD). Additionally, for those MDAPs that did use prototype demonstrations over this past decade, program performance was examined for any impacts coincident with the adoption of related key systems engineering policy and legislation.

With the Defense Acquisition Management System (DAMS) model as a conceptual framework, key hypotheses were evaluated using empirical analysis of historical evidence and trends to help validate observed system behavior. The effects of pre-EMD system prototype demonstrations on program performance were examined using observed impacts to technology readiness and weapon system design maturity. The data analysis does not highlight any individual program specifics, but applies a macro-level analysis of aggregated data to characterize observed program performance as a function of key predictor variables.

The authors anticipate that the findings of this research will help to (a) better inform program managers and systems engineers on the effects of system prototype demonstrations on weapon systems development; (b) provide insightful knowledge for developing more effective technology development strategies; and (c) implement “true” risk-reduction measures, per DoD guidance (Kendall, 2012), before entering the EMD phase. The context of “true” risk reduction is meant to imply pre-EMD mitigation activities that can indeed reduce the risk of cost and schedule growth and minimize product cycle time to the warfighter. System prototype demonstrations not only validate the state of technology maturity for enabling technologies, but also provide for early mitigation of system/subsystem integration risk. Consonant with DoD’s Better Buying Power goals, this research also provides additional insight into whether perceived gains from pre-EMD prototype demonstrations are actually being realized.

Prototype Demonstrations: A Historical Perspective

As demonstrated in the early 1900s by the Wright brothers’ experimentation leading up to the first successful flight of the Wright Flyer, and by Samuel Langley’s attempts to launch his Aerodrome from a modified houseboat at sea, our nation’s industry has leveraged system prototype demonstrations for over a century. Figure 1 portrays two historical moments in which system prototypes were used to reduce early aviation technology risk.
Prototypes provide the designer a useful tool with which to visualize and transition new ideas into development using an archetype, initial model, or early pattern of the envisioned end product. Industry has leveraged prototypes with great success as a necessary enabler and bridge to introduce new products into the marketplace. Although the value of prototypes may seem obvious, historically the use of prototypes and the perceived return on investment has been a subject of debate. The following chronology highlights DoD’s changing opinion on the use of prototype demonstrations:

  • (Favorable) As early as the 1930s, industry commonly built engine-aircraft combination prototypes as a form of aircraft development risk mitigation (Drezner, 1992).
  • (Favorable) Post-World War II, in the mid to late 1940s, competitive prototype flight testing occurred with the transition from propeller-driven, reciprocating-engine aircraft to jet propulsion (Smith, Barbour, McNaugher, Rich, & Stanley, 1981).
  • (Not Preferred) In the 1950s, because prototypes were not representative of full-scale, integrated development designs, the prevailing opinion was that the practice was wasteful and added no value (Smith et al., 1981).
  • (Not Preferred) With the advent of the digital computer age in the 1960s, a prevailing philosophy existed that theoretical analysis would be sufficient to predict systems design performance without the need for costly prototypes (Smith et al., 1981).
  • (Favorable) Coincident with the first issuance of DoD Directive 5000.1 in 1971, prototyping was re-introduced as a key risk reduction tool as a result of then-Secretary of Defense David Packard’s “Fly-Before-Buy” promulgated policy. Competitive prototypes were encouraged with less dependence on concurrent development and paper studies before entering Full-Scale Development (DoD, 1986).
  • (Favorable) In 1986, the President’s Blue Ribbon Commission on Defense Management, referred to as the Packard Commission, reported the need for rigorous testing of system prototypes prior to Full-Scale Development, again emphasizing a Fly-Before-Buy philosophy (DoD, 1986). Subsequent legislation was introduced in 1987, which mandated that DoD develop and test competitive prototypes for MDAPs before awarding a production contract (Glass, 1988).
  • (Favorable) As a result of a General Accounting Office (1999) study recommendation, in 2001 DoD adopted the use of Technology Readiness Levels (TRL) as a means for MDAPs to manage the maturity of technology entering system development (Technology Readiness, 2010).
  • (Favorable) The National Defense Authorization Act (NDAA) of 2006 established statutory law requiring the Milestone Decision Authority to certify that all critical technologies (i.e., critical technology elements) have been demonstrated in a relevant environment (i.e., TRL 6) before granting an MDAP approval to enter EMD (NDAA, 2006).
  • (Favorable) In 2007, then-Under Secretary of Defense for Acquisition, Technology and Logistics John Young released a memorandum, “Prototyping and Competition,” directing the Services and Defense Agency proponents for MDAPs to “formulate all pending and future programs with acquisition strategies and funding that provide for two or more competing teams producing prototypes through milestone (MS) B” (Young, 2007).
  • (Favorable) The Weapon Systems Acquisition Reform Act of 2009 (WSARA) introduced legislation enforcing specific risk-reduction efforts prior to entering system development, including engagement with industry before EMD for technology maturation; competitive prototyping; and the establishment of a system allocated baseline at a system-level Preliminary Design Review (WSARA, 2009).

Figure 1. Early Examples of Aviation System Prototype Demonstrations


What Constitutes a Prototype?

The term “prototype” has many definitions depending on the context and need. First, it is important to understand the difference between prototyping and a prototype. In general, prototyping is a process to foster creativity and new ideas, visualize novel application and enabling technologies, reduce uncertainty and increase the advancement of knowledge, and highlight the art of the possible. Prototypes provide the mechanism to “uncover truth” (National Research Council, 2013, p. 3) through observed and controlled experiments that allow for the collection of quantifiable data to explore, develop, validate, and improve performance prediction models or theories.

The primary purpose for using a prototype is to mitigate risk (cost, schedule, or performance) to product development and to the timely delivery of an affordable and compliant end-item to the customer. Prototypes focus on high-risk areas considered essential to achieve system performance and are deemed important to achieve market or user introduction. The cost and relative complexity that a prototype can take on will vary depending on the need and the significance of the function being mitigated. From small-scale, relatively simple models for desktop experiments to larger, more complex full-scale integrated system demonstrators, the primary goal for the use of a prototype is to yield insightful knowledge that can be used to reduce end-item risk.

Fundamentally, a prototype is used to demonstrate increasingly integrated system solutions in stages of representative environments to meet expected operational performance in mission-relevant scenarios. When considering the general nature of prototyping, a RAND Corporation study (Drezner, 1992) concluded that a prototype is best defined as:
… a product (hardware and/or software) that allows hands-on testing in a realistic environment. In scope and scale, it represents a concept, subsystem, or production article with potential utility. It is built to improve the quality of decisions, not merely to demonstrate satisfaction of contract specifications. (p. 9)

Criticality of Prototype Demonstrations on Technology Maturity

The term “maturity” or “technology maturity” refers to that period in which an enabling technology translates from instantiation of an idea to the realization of that idea’s fullest potential. The product life cycle therefore transitions from early conceptual and technology development, through systems development (i.e., Developmental Test and Evaluation), operational test, production, market or user introduction, and finally, to disposal or recycle.

Maturity is a relative term, applied by comparison to a predefined end state. When discussing readiness to enter system development, a technology that has not achieved TRL 6 is considered “immature.” According to DoD (DoD, n.d.; Taylor, 2007) and Public Law (NDAA, 2006, 2008), technologies at TRL 6 or better are considered to meet the minimum maturity level acceptable to enter system development (i.e., EMD) at Milestone B. When considering a production decision at Milestone C, DoD best practice requires technologies to be at least TRL 7 to be considered mature enough for production. A similar relationship applies when considering readiness for deployment; technologies not yet at TRL 8 (i.e., fully qualified, specification-compliant, and ready to enter operational test) would not be considered mature enough to enter the capstone Operational Evaluation (OPEVAL). Although GAO and DoD agree that any critical technology below TRL 6 is considered “immature,” GAO recommends TRL 7, not TRL 6, as the appropriate level of technology maturity when entering product development (i.e., EMD or GAO Knowledge Point #2). GAO refers to critical technologies at TRL 6 as “approaching or nearing maturity.” DoD considers TRL 9 as the level at which a critical technology can be considered fully mature (i.e., when the system is considered suitable and effective by the user and deployed to the field). GAO, on the other hand, considers critical technologies “mature or fully mature” at TRL 7 when a production decision at Milestone C is required (i.e., GAO Knowledge Point #3; GAO, 2006a, p. 132).
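
The TRL gates discussed above amount to simple threshold checks at each decision point. The following minimal sketch (in Python) restates the DoD and GAO floors from this section as code; the dictionaries, function, and gate labels are illustrative constructs for this article, not part of any DoD or GAO tool.

```python
# Illustrative only: the TRL floors below are taken from the discussion above;
# the structure and names are hypothetical, not an official DoD/GAO artifact.

DOD_MIN_TRL = {
    "Milestone B (enter EMD)": 6,    # statutory floor per NDAA 2006
    "Milestone C (production)": 7,   # DoD best practice
    "OPEVAL (operational test)": 8,  # fully qualified, specification-compliant
}

GAO_MIN_TRL = {
    "Knowledge Point 2 (enter EMD)": 7,   # GAO's stricter recommendation
    "Knowledge Point 3 (production)": 7,  # "mature or fully mature"
}

def meets_gate(trl: int, gate: str, floors: dict) -> bool:
    """Return True if a critical technology at `trl` clears the gate's TRL floor."""
    return trl >= floors[gate]

# A TRL 6 critical technology clears the DoD Milestone B floor but falls
# short of GAO's recommended TRL 7 for entering product development.
print(meets_gate(6, "Milestone B (enter EMD)", DOD_MIN_TRL))        # True
print(meets_gate(6, "Knowledge Point 2 (enter EMD)", GAO_MIN_TRL))  # False
```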

Figure 2 associates the level of prototype and demonstrations, the venue for those demonstrations, and the technology maturity achieved (as delineated by assigned TRLs) with the applicable dimension of the DoD acquisition life cycle. The diagram shows that as Science and Technology (S&T) progresses from early exploratory development (i.e., basic principles, analytical studies, and early experimentation) to the formulation and test of component/breadboard prototypes in a low-fidelity laboratory environment, the product performance (i.e., demonstrated technology maturity) curve exhibits gradual-to-exponential growth (TRL 1 to TRL 4). After entering Milestone A (i.e., the Technology Maturation and Risk Reduction phase), the curve becomes less steep over an extended period of technology development as competitive prototype solutions are used to demonstrate critical technologies in a relevant environment (i.e., TRL 6). Upon achieving TRL 6 maturity, a gradually inclining plateau results for the duration of EMD. This flatter profile indicates that lower technological risk exists (i.e., related to technology maturity) and that a representative system prototype or model of the end-state product has been achieved. During EMD, there should be no further reliance on S&T; only standard engineering developmental test and evaluation should be applied, both to finish the product design and to build and test a production-representative prototype (i.e., engineering development model) prior to Milestone C. After the actual system has been fielded and the technology eventually approaches end-of-life, the tail of the flattened S-curve dips, reflecting technology aging as well as degradation in both system reliability and supportability.

Figure 2. Level of Prototype Demonstrations, Venue, and Technology Maturity


As shown in Figure 2, the S-curve shape represents a generic depiction of increasing technology maturity and product performance over time while progressing through the acquisition life cycle. Several analogies have been theorized relating technology maturity with the shape and phenomenon of an S-curve (MITRE, n.d.; Nolte, 2008). Although the shape of the curve implies a changing rate of improving maturity or product utility consistent with increasing levels of integrated prototype demonstrations and development progress, the overlaid TRL mapping shown in the figure should be interpreted as discrete threshold attainment points where increasing levels of technology maturity can be claimed. TRL values are assigned only as integer values (i.e., DoD does not recognize a readiness level fraction). Only when enough aggregate demonstration evidence of technology maturation has been collected can the Technology Readiness Assessment (TRA) independent review panel substantiate assignment of the next integer TRL value. The TRL definitions, demonstration criteria, and TRL values, as overlaid onto the S-curve and shown in Figure 2, are consistent with DoD guidance and policy (DoD, n.d.; DoD, 2011).

Key Aspects of Prototype Demonstrations

The applicable venues for demonstrating a prototype depend on the level of information required, the complexity and integration level of the prototype, the relevant environment in which the prototype must operate, performance expectations, and the technology maturity required at the associated stage of the DoD acquisition life cycle. Potential relevant environments in which a critical technology must survive and meet operational performance include physical, logical, data, security, and user environments. The relevant environment is characterized by the critical technology application and its operational performance expectations under worst-case, mission-relatable conditions.


A Critical Technology Element (CTE) is an enabling technology deemed critical to meeting the operational performance of the system to be acquired that also either (a) is a new or novel technology, or a new or novel application of a technology, or (b) represents an area posing significant technological risk during product development (i.e., EMD) (DoD, n.d.; DoD, 2009). A TRA is conducted by an independent review panel to reconcile program CTEs and assign TRLs based on the level and quality of integrated prototype demonstrations accomplished. Figure 3 maps TRL descriptions and definitions to prototype demonstration environment and venue, level of technology, and expected attainment across the DAMS timeline.

Figure 3. Technology Readiness Level Mapping to Prototype Demonstration Attributes


Conceptual Framework

For this study, a research conceptual framework was established to examine the effects that system prototype demonstrations, when applied early in the systems engineering acquisition life cycle, would have on reducing technology risk for system development and production of U.S. military weapon systems. Since the approach leverages event-driven knowledge points (e.g., design reviews) consistent with standard systems engineering practice, the framework, as applied, can be tailored to accommodate other agency or industry product life cycles. The DAMS is a disciplined systems engineering, event-based framework in which acquisition programs proceed through a series of milestone decision reviews for authorization to enter subsequent life-cycle phases of the weapon systems acquisition process (DoD, 2013). Relationships were examined between key variables related to technology maturity, design maturity, and their corresponding impact on program performance.

The DAMS provided the rigorous structure necessary to collect and analyze descriptive statistics on independent variable constituents representing technology and design maturity, as well as on program performance dependent variables (i.e., cost, schedule, and product cycle time). Today’s prevailing best practices endorse the use of system prototype demonstrations as a major contributor to true risk reduction before entering system development (Carter, 2010; Kendall, 2012; Young, 2007). In fact, DoD’s expectations now encompass realization of not only reduced program cost and schedule growth, but also shorter product cycle time to the warfighter. The following questions were used to examine the validity of these assumptions:

  • Do technology development (i.e., pre-EMD) system prototype demonstrations provide a positive return on investment for weapon systems development?
  • Do technology development system prototype demonstrations impacting technology maturity improve weapon systems development program performance?
  • Do technology development system prototype demonstrations have a positive impact on achieving weapon systems design maturity?

Research Population and Sampling Description

The research population, consisting of DoD MDAP portfolios ranging from FY 2002 through FY 2012, was designated Acquisition Category I (ACAT-I) because each program was projected to exceed threshold criteria, in FY 2000 constant dollars, for either Research, Development, Test and Evaluation ($365 million) or Procurement ($2.19 billion) (DoD, 2000, 2008). The latest interim DoDI 5000.02 (DoD, 2013) modified the ACAT-I designation criteria to be relative to FY 2014 constant dollars for subsequently established MDAPs. A mixed-methods research approach was used to collect and analyze historical program performance data and findings from available and relevant literary sources. Data collection focused primarily on MDAPs that were part of the annually published GAO assessments of selected major weapon systems programs. These reports, dating from 2003 to 2013, represent limited case study, knowledge-based program performance assessments provided to the United States Congress.

The actual data contained within these published reports are mostly reflective of the previous year’s program performance, therefore representing MDAP portfolios spanning from 2002 to 2012. MDAP cost, schedule, and performance data were also collected from annual DoD SARs, which are submitted in conjunction with the President’s Budget. The research data population consisted solely of MDAPs and did not include Major Automated Information Systems, or ACAT-IA programs.

After initial data cleansing to ensure validity and reliability, 139 MDAPs were determined to contain enough usable and relevant data for analysis of key research factors of interest. Considerations used for data purification included adequacy of sample size, verification of ACAT assignment, and noting whether programs were canceled or restructured. The research population spread was as follows: 25% Air Force (34 MDAPs), 23% Army (32 MDAPs), 35% Navy and Marines (49 MDAPs), and 17% DoD Joint (24 MDAPs). Product types included aircraft, helicopters, satellites, ships, submarines, ship/ground vehicles, ship/ground stations, sensors and electronic warfare systems, missiles, weapons and munitions, core electronics, and unmanned air vehicles. Hypothesis testing was limited to those MDAPs that were in or had completed EMD. This final cleansed population of 117 MDAPs, from which valid samples were empirically analyzed, included 70 MDAPs that used system prototype demonstrations before entering EMD and 47 that did not.

The MDAP data collected included available initial program baseline dates for systems engineering technical reviews and key decision points along the program acquisition timeline. Planned reviews were compared to actual event dates, and a percentage deviation was calculated to represent either schedule reduction or growth. Data validity and reliability for factors and their constituents were assured for comparative analysis of descriptive statistics, correlation, and regression by using percentage deviation from plan. This approach allowed for findings to be explained by systems engineering progress rather than biased by other potential factors associated with the uniqueness of product type. Care was taken to compare only completed events so as not to skew the empirical analysis results with projected accomplishments.
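
As an illustration of the schedule metric just described, the sketch below computes percentage deviation from plan for a single review event. It assumes deviation is measured on the span from program start to the event, a convention the article does not spell out; the dates and names are hypothetical.

```python
from datetime import date

def percent_schedule_deviation(start: date, planned: date, actual: date) -> float:
    """Percent deviation of the actual event span from the planned span.

    Positive values indicate schedule growth; negative values, schedule reduction.
    """
    planned_span = (planned - start).days
    actual_span = (actual - start).days
    return (actual_span - planned_span) / planned_span * 100.0

# Hypothetical MDAP: CDR planned 24 months after program start, held 3 months late.
growth = percent_schedule_deviation(date(2004, 1, 1), date(2006, 1, 1), date(2006, 4, 1))
print(f"{growth:.1f}% schedule growth to CDR")  # ~12.3%
```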

General Introduction to Findings

A primary assumption in determining which programs applied system prototype demonstrations prior to entering EMD was that all CTEs must have achieved TRL 6. Any program that conducted a TRA and identified CTEs would have shown evidence that at least TRL 6 was achieved by Milestone B, thereby validating that a system-level demonstration had occurred; otherwise, the Milestone Decision Authority would not have been able to certify compliance with Title 10 U.S.C. § 2366 (NDAA, 2006). All programs after the 2006 legislation meet this criterion with certainty. Programs that conducted TRAs post-2001 and before the 2006 legislation also qualify, given the need to be consistent with then-existing DoD 5000.02 policy (DoD, 2000) to perform technology maturity assessments through the application of TRLs, and to adhere to the initial Office of the Secretary of Defense TRA deskbook guidance published in 2003 (DoD, 2003). MDAPs whose acquisition strategies included either a Demonstration and Validation phase or a Technology Demonstration (TD) phase were also counted; these correspond to MDAPs that held a Milestone A event (or the analogous Milestone I event). Also included were older MDAPs that employed Fly-Before-Buy or acknowledged system-level demonstrations and that were still part of the active DoD portfolio in 2002, and therefore were reported by GAO within the relevant data collection window of this research population.

MDAPs counted as not using pre-EMD system prototype demonstrations were those initiated at or after system development start (i.e., Milestone B or the analogous Milestone II event). MDAPs that entered the DAMS at production (i.e., Milestone C or the analogous Milestone IIIA event) were not counted, since their acquisition strategies likely did not include development activity and therefore accepted only fully mature technologies into production.

Results and Findings

Linear Relationships Between Key Factor Constituents

A Pearson correlation analysis was completed for research factor constituents associated with MDAPs using system prototype demonstrations to assess the strength and direction of any linear relationships. The impact that system prototype demonstrations have on technology maturity (e.g., TD span and technology readiness) was examined for relationships with design maturity (e.g., percent drawings released by Critical Design Review [CDR] and percent schedule change to CDR) and program performance (e.g., cost and schedule growth).

The Pearson coefficient is based on the method of covariance and ranges from +1 to -1, where a value of zero indicates no linear correlation between variables. The sign of the coefficient indicates the direction of the linear relationship, positive or negative (Laerd Statistics, 2013). Table 1 summarizes constituent relationships for MDAPs that used system prototype demonstrations prior to EMD. All constituent pairs shown in Table 1 met at least a 0.10 level of significance (i.e., establishing that a relationship exists).
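
For readers who want to reproduce this style of screen on their own data, the sketch below applies SciPy's Pearson test with the article's 0.10 significance threshold. The input values are fabricated placeholders, not the study's dataset, and the strength cutoffs (|r| at least 0.7 strong, at least 0.4 moderate) are conventional rules of thumb rather than criteria stated by the authors.

```python
from scipy.stats import pearsonr

# Placeholder data: percent change in TD span and EMD span for eight notional MDAPs.
td_span_growth  = [5.0, 12.0, 8.0, 20.0, 15.0, 3.0, 18.0, 10.0]
emd_span_growth = [4.0, 14.0, 6.0, 22.0, 13.0, 2.0, 19.0, 9.0]

r, p = pearsonr(td_span_growth, emd_span_growth)
if p <= 0.10:  # the article's significance screen
    strength = "strong" if abs(r) >= 0.7 else "moderate" if abs(r) >= 0.4 else "weak"
    print(f"r = {r:+.2f} ({strength}), p = {p:.3f}: relationship retained")
else:
    print(f"p = {p:.3f} > 0.10: no significant linear relationship")
```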

Table 1. Pearson Product-Moment Correlation Analysis of Key Constituents
(MDAPs with System Prototype Demonstrations Prior to EMD)


Four constituent pairs (AB2, AB3, AB4, and AB5) indicated a high degree of association (i.e., strong correlation) and are characterized as follows: (a) any change in the number of CTEs taken into system development will realize a corresponding change in the time required for TD; and (b) any change in the duration of TD will have a similar schedule impact on system development (i.e., the EMD phase), as well as an opposite impact on percent acquisition cost growth. Therefore, the greater the number of immature CTEs necessary to meet a capability gap, the longer the TD phase required to reduce technology risk prior to entering system development. Additionally, given the increased leverage of enhancing emergent technologies, the EMD phase will likely be longer to accommodate additional systems integration and test. The extended TD phase would, other factors aside, contribute to a reduction in acquisition cost growth.

Two constituent pairs (AB1 and AB6) were identified as having a moderate degree of association and are interpreted as follows: (a) a change in EMD span time corresponds to an opposite change in acquisition cost growth relative to First Full Estimates; and (b) a change in TD span time corresponds to an opposite change in acquisition cycle time growth. Therefore, with longer TD spans to accommodate increased risk mitigation and maturation activities driven by a larger number of CTEs, overall acquisition cycle time can be reduced. Similarly, with longer EMD span times to mitigate the complexities of standard engineering development and complex integration, the percentage of acquisition cost growth can be reduced. Given the direct relationships among key constituent pairs, the Pearson correlation analysis indicates high potential for a positive effect on program performance when effective risk reduction is implemented through system prototype demonstrations.

System Prototype Demonstrations Provide a Positive Return-on-Investment

With the exception of percentage acquisition cost growth since First Full Estimates and percentage cycle time growth from program start to IOC, Figure 4 shows that the remaining program performance factor constituents exhibit a modest improvement when system prototype demonstrations are employed before entering system development. MDAPs that leveraged system prototype demonstrations prior to EMD realized mean acquisition cost growth (2006 to 2011) of 7.82%, versus 17.58% for those that did not, a relative difference of 125%, i.e., [(17.58-7.82)/7.82] · 100 = 125%. Although percentage cycle time growth was relatively equal, with the addition of a TD phase (i.e., system prototype demonstrations), the net cycle time to the warfighter from program start and from EMD start to IOC was reduced by 17% and 21%, respectively, relative to MDAPs that did not use system prototype demonstrations. The average TD phase span for a sample of 41 MDAPs equated to 3.18 years. The noted improvement in percentage acquisition cost growth measured from 2006 to 2011, as compared to no improvement when measured against First Full Estimates (through 2011), coincides with the 2006 Public Law (NDAA, 2006) requirement that all immature critical technologies be demonstrated in a relevant environment (i.e., TRL 6) prior to receiving approval to enter EMD.
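
To make the comparison easy to verify, a few lines of arithmetic reproduce the reported figures from the values in the text (7.82% mean cost growth for prototype MDAPs versus 17.58% without):

```python
proto, no_proto = 7.82, 17.58  # mean % acquisition cost growth, 2006-2011 (from text)

print(f"{(no_proto - proto) / proto * 100:.0f}% relative difference")  # 125%
print(f"{no_proto - proto:.1f} percentage points lower")               # 9.8 (cf. below)
```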

Figure 4. Comparison of MDAP Performance Key Constituents
(With & Without System Prototype Demonstrations Prior to EMD)


Although the empirical analysis depicted in Figure 4 shows minimal difference in percentage cycle time growth from program start to IOC between MDAPs that did and did not use system prototype demonstrations before EMD, the cycle time from program start to IOC is, on average, 1.9 years shorter for MDAPs using prototypes. Correspondingly, programs that used system prototype demonstrations had a mean total acquisition cost growth 9.8 percentage points lower when assessed using 2006 to 2011 data.

When comparing available MDAP performance data coincident with the implementation of key DoD policy and congressional legislation, the benefits gained from pre-EMD system prototype demonstrations are amplified. Since DoD introduced policy in 2001 to adopt TRLs and implement a TRA-like process, a 23-percentage-point reduction in mean total acquisition cost growth, relative to First Full Estimates (through 2011), has been realized (i.e., 26.2% cost growth prior to July 2001 versus 3.64% cost growth post-July 2001). Subsequently, with the enactment of the NDAA of 2006 establishing a TRL 6 certification requirement for all immature technologies prior to entering EMD, a further reduction of 1.63 percentage points is observed (i.e., 3.64% cost growth post-July 2001 versus 2.1% cost growth post-January 2006). Data were binned based on when the MDAP EMD start date occurred relative to the official instantiation of the policy or legislation.
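
A minimal sketch of the binning rule described above follows. The cutoff dates are taken from the eras named in the text (July 2001 for DoD's TRL adoption, January 2006 for the NDAA requirement); the exact dates and names are illustrative assumptions.

```python
from datetime import date

TRL_POLICY_ADOPTED = date(2001, 7, 1)  # DoD adopts TRLs / TRA-like process
NDAA_2006_ENACTED = date(2006, 1, 1)   # TRL 6 certification becomes statutory

def policy_era(emd_start: date) -> str:
    """Assign an MDAP to a policy era by its EMD (Milestone B) start date."""
    if emd_start < TRL_POLICY_ADOPTED:
        return "pre-July 2001"          # mean cost growth reported: 26.2%
    if emd_start < NDAA_2006_ENACTED:
        return "post-July 2001"         # mean cost growth reported: 3.64%
    return "post-January 2006"          # mean cost growth reported: 2.1%

print(policy_era(date(1998, 3, 1)))   # pre-July 2001
print(policy_era(date(2009, 10, 1)))  # post-January 2006
```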

System Prototype Demonstrations Increase Technology Maturity

Technology maturity at Milestone B is a significant factor since it gauges the level of technology risk carried forward into system development. Post-January 2006, the NDAA of 2006 ensured that a minimum acceptable TRL must be achieved before award of a development contract. Just as important, but not currently regulated by DoD or legislated by Congress, is whether there should be a best practice or policy on the total number of CTEs considered reasonable for an MDAP to manage in system development. The number of CTEs can imply the adequacy of requirements and the extent of system design complexity required to meet operational needs. The data show that as cycle time from EMD start to IOC increases, there is a corresponding increase in the number of CTEs carried into EMD. This fact, coupled with the knowledge that EMD span increases with shorter TD spans, implies that the greater the number of immature critical technologies introduced into EMD, the greater the technology risk transferred to system development, and hence the greater the threat of cost and schedule growth (i.e., reduced buying power).

Figure 5 represents the total number of CTEs reported by MDAPs at entry to system development (i.e., Milestone B), independent of whether system prototype demonstrations were used prior to Milestone B.

Figure 5. Percent Critical Technology Elements Per TRL Rating at Development Start (i.e., Milestone B)
(With & Without System Prototype Demonstrations Prior to EMD)


The data show 77.7% of the MDAPs at Milestone B reported CTEs at TRL 6 or greater (47.3% at TRL 6 and 30.4% at ≥ TRL 7). The remaining 22.4% of the MDAPs entered system development with CTEs below TRL 6. Until January 2006, DoD was receptive to accepting and managing technology risk in EMD based on the establishment of a timely and viable risk management plan. The 25 MDAPs that did not meet minimum technology maturity requirements before entering system development held Milestone B prior to TRL 6 becoming statutory law in 2006 (NDAA, 2006). The mean number of CTEs entering system development is four for both system prototype and non-system prototype demonstration programs. MDAPs using system prototype demonstrations showed a 12% reduction in the number of programs entering EMD with three to five CTEs. On the other hand, the data also show a 4.1% increase in the willingness of MDAPs using early system prototypes to carry six to 10 CTEs into EMD, and correspondingly a 2.2% increase for those carrying more than 10 CTEs.

System Prototype Demonstrations Increase Systems Design Maturity

A measure of design maturity is the percentage of engineering drawings available for release to manufacturing at both CDR and the Milestone C production decision point. For the MDAPs sampled (n = 50), independent of whether system prototype demonstrations were employed prior to EMD, only 48% met the DoD best practice goal (DoD, n.d.; DoD, 2011) of 75% to 90% of engineering drawings complete and releasable to manufacturing by CDR. Correspondingly, only 34% of MDAPs met the GAO best practice goal (GAO, 2013) of at least 90% by CDR. The mean percentage of engineering drawings released to manufacturing by CDR for MDAPs that used system prototype demonstrations prior to EMD is significantly greater than for those that did not (i.e., 73.7% versus 51.25%). Although this represents a notable 22.5-percentage-point improvement in completion of engineering drawings for MDAPs using system prototype demonstrations prior to EMD, the mark remains slightly short of the DoD best practice goal and 16.3 percentage points short of GAO’s knowledge point best practice goal. The mean percentage schedule change to CDR (plan versus actual) for MDAPs that conducted system prototype demonstrations prior to EMD is 1.84%, significantly less than the 12.45% realized for programs that did not.

Conclusions

The following quote (Farrell, 2011) appropriately characterizes today’s environment and the need to apply systems engineering tools smartly, such as system prototype demonstrations, to achieve early and effective risk reduction:

“Gentlemen, we have run out of money. Now we have to think.”
—Sir Winston Churchill

With the harsh realities of today’s economics and the need to implement true risk reduction activities through sound systems engineering practice, DoD is looking to leverage the knowledge gained through system prototype demonstrations to reduce technical risk and provide state-of-the-art weapon systems to the warfighter sooner—and at a decidedly reduced acquisition cost.


The application of system prototype demonstrations to improve technology maturity and accelerate design maturity, as evidenced by the findings of this study, does indeed have a profound positive influence on the outcome of weapon systems development performance. The data have also shown that with the implementation of key policy and legislation reinforcing the need to perform system-level prototype demonstrations prior to entering system development, MDAP total acquisition cost growth can be further reduced. Some key findings follow:

  • The greater the number of CTEs entering system development (i.e., EMD), the longer it will take to complete the preceding TD phase. Conversely, the more mature the technology solution used to fill a capability gap (i.e., leveraging proven technology), the less the dependence on TD and the shorter the cycle time to deliver IOC to the warfighter.
  • Increased focus and time invested during TD to mature technology solutions and reduce system development risk contribute positively to reducing both acquisition cost growth and overall product cycle time to the warfighter.
  • Although all MDAP CTEs entering EMD have achieved at least TRL 6 by Milestone B since 2006, the average number of CTEs carried into EMD has remained unchanged. Assuming the MDAP is not a production entry (i.e., Milestone C) or a rapid deployment acquisition, the researchers found no evidence of any policy or directive that would minimize the number of CTEs acceptable for entry into EMD.
  • The average percentage of manufacturing-quality engineering drawings available by CDR is 22 percentage points higher for MDAPs that used system prototype demonstrations prior to EMD. There was insufficient evidence to link the percentage of engineering drawings completed to the number of CTEs entering EMD.
  • MDAPs with system prototype demonstrations that exercised a TD phase realized reduced product cycle time of 17% (1.88 years) from program start to IOC, and 21% (1.87 years) from EMD start to IOC. Based on a sampling of 41 MDAPs, the average span time for a TD phase has been 3.18 years.

The knowledge gained by this study can help the government, in collaboration with industry, formulate more effective risk-mitigation strategy for the transition of influential enabling technologies into system development such that overall cycle time to the warfighter can be reduced.



 References

Carter, A. B. (2010). Better buying power: Mandate for restoring affordability and productivity in defense spending [Memorandum]. Retrieved from http://www.acq.osd.mil/docs/USD_ATL_Guidance_Memo_September_14_2010_FINAL.PDF

Carter, A. B., & Mueller, J. (2011). Should cost management: Why? How? Defense AT&L, 40(5), 14–18. Retrieved from http://www.dau.mil/pubscats/ATL%20Docs/Sep-Oct11/DATL%20Sept_Oct11.pdf

Department of Defense. (n.d.). Defense acquisition guidebook. Retrieved from https://acc.dau.mil/CommunityBrowser.aspx?id=289207&lang=en-US

Department of Defense. (1986). A quest for excellence: Final report to the President by the President’s Blue Ribbon Commission on Defense Management. Retrieved from http://www.ndia.org/Advocacy/AcquisitionReformInitiative/Documents/Packard-Commission-Report.pdf

Department of Defense. (2000). Operation of the Defense Acquisition System (DoDI 5000.2). Retrieved from http://www.marcorsyscom.usmc.mil/Sites/PMIA%20Documents/Resources/Department%20of%20Defense/DoDI%205000-2%20DefAcqSys.pdf

Department of Defense. (2003). Department of Defense technology readiness deskbook. Retrieved from http://www.dtic.mil/dtic/tr/fulltext/u2/a418881.pdf

Department of Defense. (2008). Operation of the Defense Acquisition System (DoDI 5000.02). Retrieved from http://www.acq.osd.mil/asda/docs/dod_instruction_operation_of_the_defense_acquisition_system.pdf

Department of Defense. (2009). Department of Defense technology readiness assessment (TRA) deskbook. Retrieved from http://acqnotes.com/Attachments/Technology%20Readiness%20Assessment%20Deskbook.pdf

Department of Defense. (2011). Technology readiness assessment guidance. Retrieved from http://www.acq.osd.mil/ddre/publications/docs/TRA2011.pdf

Department of Defense. (2013). Operation of the Defense Acquisition System (Interim DoDI 5000.02). Retrieved from http://www.dtic.mil/whs/directives/corres/pdf/500002_interim.pdf

Drezner, J. A. (1992). The nature and role of prototyping in weapon system development (RAND Report No. R-4161-ACQ). Retrieved from http://www.rand.org/pubs/reports/R4161.html

Dunaway, D. (2013). Creating integrated warfighting capabilities. Proceedings, 139(8), 60–65. Retrieved from http://www.usni.org/magazines/proceedings/2013-08/creating-integrated-warfighting-capabilities

Farrell, L. P., Jr. (2011). ‘Gentlemen, we have run out of money; Now we have to think.’ National Defense. Retrieved from http://www.nationaldefensemagazine.org/archive/2011/November/Pages/‘Gentlemen,WeHaveRunOutOfMoney;NowWeHavetoThink’.aspx

General Accounting Office. (1999). Best practices: Better management of technology development can improve weapon system outcomes (Report No. GAO/NSIAD-99-162). Retrieved from http://www.gao.gov/products/GAO/NSIAD-99-162

Glass, G. W. (1988). CBO study: Concurrent weapons development and production. Retrieved from http://www.cbo.gov/sites/default/files/cbofiles/ftpdocs/55xx/doc5543/doc08b-entire.pdf

Government Accountability Office. (2006a). Defense acquisitions: Assessments of selected major weapon programs (Report No. GAO-06-391). Retrieved from http://www.gao.gov/new.items/d06391.pdf

Government Accountability Office. (2006b). Defense acquisitions: Major weapon systems continue to experience cost and schedule problems under DOD’s revised policy (Report No. GAO-06-368). Retrieved from http://www.gao.gov/new.items/d06368.pdf

Government Accountability Office. (2012). Defense acquisitions: Assessments of selected weapon programs (Report No. GAO-12-400SP). Retrieved from http://www.gao.gov/products/GAO-12-400SP

Government Accountability Office. (2013). Defense acquisitions: Assessments of selected weapon programs (Report No. GAO-13-294SP). Retrieved from http://gao.gov/assets/660/653379.pdf

Kendall, F. (2012). Better buying power 2.0: Continuing the pursuit for greater efficiency and productivity in defense spending [Memorandum]. Retrieved from http://www.defense.gov/news/BBPWorkforceMemo.pdf

Laerd Statistics. (2013). Pearson product-moment correlation. Retrieved from https://statistics.laerd.com/statistical-guides/pearson-correlation-coefficient-statistical-guide.php

MITRE. (n.d.). Assessing technical maturity. Systems engineering guide. Retrieved from http://www.mitre.org/publications/systems-engineering-guide/acquisition-systems-engineering/acquisition-program-planning/assessing-technical-maturity

National Defense Authorization Act for Fiscal Year 2006, Pub. L. 109-163 § 801(a)(1) (2006).

National Defense Authorization Act for Fiscal Year 2008, Pub. L. 110-181 § 812, codified as amended at 10 U.S.C. § 2366a (2008).

National Research Council. (2013). Assessment to enhance Air Force and Department of Defense prototyping for the new defense strategy: A workshop summary. Retrieved from http://www.nap.edu/catalog/18580.html

Nolte, W. L. (2008). Did I ever tell you about the whale? or measuring technology maturity? Charlotte, NC: Information Age Publishing.

Smith, G. K., Barbour, A. A., McNaugher, T. L., Rich, M. D., & Stanley, W. L. (1981). The use of prototypes in weapon system development (RAND Report No. R-2345-AF). Retrieved from http://www.rand.org/pubs/reports/R2345.html

Smithsonian Libraries. (n.d. a). Houseboat and launching apparatus. Samuel Langley’s Aerodrome [Photograph]. Galaxy of Images. Retrieved from http://www.sil.si.edu/imagegalaxy/imageGalaxy_enlarge.cfm?id_image=7683

Smithsonian Libraries. (n.d. b). Man’s first flight. Wilbur Wright’s 1st Successful Flight [Photograph]. Galaxy of Images. Retrieved from http://www.sil.si.edu/imagegalaxy/imageGalaxy_enlarge.cfm?id_image=11114

Taylor, J. (2007). DoD TRA policy. Presentation to Industrial College of the Armed Forces at Air Force Research Laboratory Technology Maturity Conference, Virginia Beach, VA, September 2007.

Technology Readiness. (2010). In Acquisition Community Connection [Online forum]. Retrieved from https://acc.dau.mil/CommunityBrowser.aspx?id=148681&lang=en-US

Weapon Systems Acquisition Reform Act of 2009, Pub. L. 111-23 §§ 203(a), 205(a)(3), codified as amended at 10 U.S.C. § 2366b(a)(2) (2009).

Young, J. (2007). Prototyping and competition [Memorandum]. Retrieved from https://acc.dau.mil/adl/en-US/180554/file/31687/Prototyping%20Memo.pdf


Author Biographies

Mr. Edward J. Copeland is technical director for the Avionics, Sensors, and E*Warfare Department at the Naval Air Systems Command with over 31 years of experience in naval aviation RDT&E. Mr. Copeland is currently a PhD candidate pursuing a doctorate in systems engineering through The George Washington University. He received a master’s degree in engineering management from the National Technological University, a bachelor’s degree in electrical engineering from the University of Delaware, is a graduate of the U.S. Navy Test Pilot School, and is a Naval Air Systems Command Fellow.

(E-mail address: copefamily@md.metrocast.net)

Dr. Thomas H. Holzer is an adjunct professor of Engineering Management and Systems Engineering at The George Washington University. He previously served as the director, Engineering Management Office, National Geospatial-Intelligence Agency, with over 35 years of experience in systems engineering and information technology programs. Dr. Holzer holds a doctorate and MS in Engineering Management from The George Washington University and a BS in Mechanical Engineering from the University of Cincinnati.

(E-mail address: holzert@gwu.edu)

Dr. Timothy J. Eveleigh is an adjunct professor of Engineering Management and Systems Engineering at The George Washington University and is an International Council on Systems Engineering Certified Systems Engineering Professional. He has over 30 years’ industry experience working on DoD/Intelligence Community information technology acquisition challenges, research and development, and enterprise architecting. Dr. Eveleigh has a 30-year parallel career as an Air Force Reserve intelligence officer and developmental engineer, focused on command and control integration.

(E-mail address: Eveleigh@gwu.edu)

Dr. Shahryar Sarkani is an adjunct professor in the Department of Engineering Management and Systems Engineering at The George Washington University. He has over 20 years of experience in software engineering. Dr. Sarkani holds a DSc in Systems Engineering from The George Washington University, an MS in Mathematics from University of New Orleans, and a BS in Electrical Engineering from Louisiana State University.

(E-mail address: emseor2003@yahoo.com)

 
