Applying Early Systems Engineering: Injecting Knowledge into the Capability Development Process



Authors: Mark Pflanz, Chris Yunker, Friedrich N. Wehrli, and Douglas Edwards

A common problem in defense acquisition is the difficulty of ensuring that the required capabilities stated in capability development documents are technically feasible, affordable, and available through mature technologies. This problem is driven by a lack of knowledge on the part of both the capability developer and program manager teams. Addressing this knowledge gap requires a new approach to capability development, one in which knowledge gained early in the acquisition process is injected into capability development in a rigorous way. This article describes that technical approach, along with lessons learned on two large acquisition programs. Key tenets include the use of pre-planned knowledge points as a vehicle for expanded collaboration between program managers and capability developers, and early use of systems engineering fundamentals.


The current capability development environment includes a host of challenges: the need for increasingly capable systems, often with greater complexity; time constraints and the resultant pressure for rapid delivery of new capabilities; and increased cost pressures. As new capabilities are developed, the threat and operational environment continues to adapt, often necessitating midstream changes to requirements or other aspects of the capability. Mandatory requirements to satisfy larger DoD policy goals must also be addressed. These challenges are made more difficult by a lack of knowledge on the part of the capability developer and the program management teams regarding technology maturity, technical feasibility, and affordability. This situation makes it difficult to reconcile requirements stated in capability development documents with the ‘state of the possible’ in terms of feasibility and cost. The purpose of this article is to outline a technical approach to addressing these problems. Key tenets of this approach are the use of pre-planned Knowledge Points as a vehicle for expanded collaboration between capability developers and program managers (PMs), and early use of systems engineering fundamentals in the capability development process.

This approach has been demonstrated on the Joint Light Tactical Vehicle (JLTV) program throughout the Technology Development (TD) Phase (over 36 months until its Joint Requirements Oversight Council [JROC] approval), and based on that success is now being implemented on the Marine Corps Amphibious Combat Vehicle program. Although these programs remain in development, the purpose of this article is to describe a technical approach that has shown promise for those PMs opting to apply the techniques and lessons learned described herein to their own programs.

Background

In September 2007, then Under Secretary of Defense for Acquisition, Technology and Logistics (USD[AT&L]) John Young directed that all acquisition programs requiring USD(AT&L) approval include competitive, technically mature prototyping from two or more industry teams through Milestone B. Programs requiring USD(AT&L) approval are typically the largest, most expensive, and most complex (Young, 2007). Competitive prototyping was later incorporated into Department of Defense Instruction 5000.02. Under Secretary Young directed this policy to address the problem of large weapon system programs being initiated with an inadequate understanding of technical risk, without firm requirements, and with a weak foundation for estimating development and procurement costs. This situation results in an unacceptable number of programs not meeting performance, cost, or schedule requirements. The JLTV program was the first ACAT ID program to apply this directive. The competitive prototyping paradigm in the TD phase offers capability developers a unique opportunity, but it also confers a responsibility for a technically sound capability development approach.

As foreshadowed by DoD's implementation of the Joint Capabilities Integration and Development System (JCIDS) in 2003, this new approach to capability development involves early use of systems engineering and technical analyses to supplement the operational analysis techniques currently used in capability development activities. To meet their responsibilities in the acquisition process, capability developers must make capability trade-off decisions based on industry's performance in meeting the requirements, and on cost and risk. The involvement of industry prototypes, built at significant investment, makes close collaboration between the PM and the capability developer essential to understanding the TD phase results and translating that knowledge into decisions that guide the new capability documentation.

As draft requirements are provided to industry to begin design, the capability developer must remain actively engaged in the design reviews to make informed trade-off decisions. Exercising their leadership in establishing the foundational requirements, the capability developers must also remain active in framing and observing the results of early key testing to make informed judgments about industry's success in meeting the requirements.

As design, fabrication, and test take place, the operational relevance, feasibility, and cost of some requirements will become clear, but the best combination will not. Because the design of a system involves a series of trade-offs, indicators and issues bearing on the more critical decisions, such as the best balance of cost versus performance, will not be clear-cut. The indicators will manifest themselves piecemeal at various points in design and test, and the capability developers must use the systems engineering framework to orient and correctly place these indicators in a logical decision series leading to a sound capability statement. Informed capability decision making requires understanding the basics of the technical issues and using that understanding to supplement user expertise in stating a feasible capability. The capability developers own the requirement, but the results of a competitive prototyping TD phase will, by definition, produce changes to the draft Capability Development Document (CDD) used at the start of the TD phase. To truly ‘own’ the CDD, capability developers need to be conversant in the basics of the technical issues uncovered in the TD phase so they can resolve them and state the best expression of feasible and useful capabilities in the draft CDD. Gaining a working understanding of the technical issues involved in capability decisions requires access to technical resources, discussed later in this article.

A “Knowledge Point”-Based Approach

Competitive prototyping provides an immense array of valuable information based on the success of the competing industry teams in meeting the performance, schedule, and cost outlined in the TD phase initiating requirements. The primary goal of the capability developer during this phase is to translate knowledge gained in the TD phase into a technically achievable, operationally relevant, and affordable set of required capabilities documented in a revised CDD. Abstractly, the capability developers could revise the CDD using knowledge of the TD phase in one of two ways: incrementally, or with a ‘big bang’ at the end. The big-bang approach presumes an extremely high level of ability to translate all of this information and get it right in a single change. Alternatively, the capability developers can play an active role in TD activities, incrementally updating the CDD at pre-planned intervals tied to major events in the TD phase where key information elements are expected to be available. The incremental approach is preferred for a number of reasons. First, comprehensively capturing all necessary changes in a single revision is difficult over the course of a long TD phase: organizations often lose focus. Second, the more revisions made at a single point, the more difficult they are to manage; the more potential changes that occur simultaneously, the greater the need for analysis resources, which can be used more efficiently when spread over time. Finally, an incremental approach allows capability developers to identify an issue, establish an analysis team, conduct the analysis, and reflect the recommendation in the CDD in a rigorous manner.

We next introduce a new term, Knowledge Point (KP), as an approach to address these issues. A KP is a pre-determined, event-based CDD review where accumulated knowledge is injected into the CDD, updating the requirements based on analysis or test results. The main idea is to translate information gained at key points during the TD phase into actionable knowledge to refine the CDD and system specification. The incremental approach is event-driven and tied to targeted information gaps. For a major program, the capability developer may conduct four to eight KPs, depending on the depth and complexity of the initiative and the length of the TD phase activities. The number of KPs will be driven by the number of key events triggering a KP and the amount of time available. As time decreases, fewer KPs may be practicable, or multiple key events may be combined into a single KP. Events that trigger a KP review include: industry design reviews (Preliminary Design Review, Critical Design Review, etc.); the conclusion of major test phases (ballistic hull testing, performance testing, etc.); and the conclusion of major analysis activities (Analysis of Alternatives [AoA], Trade Studies, etc.). Figure 1 displays this sequence of events. KPs are capability decision briefs that assess the information available to revise the CDD. The major result of each KP is a revised CDD with associated analysis products supporting the decisions made at that KP. A secondary result of KPs is to initiate analysis activities that address the problems raised at a particular KP. Such analyses and trade studies are then due at a future KP for implementation in the CDD. To reduce confusion and ensure transparency, the capability developers update the CDD only at KPs, not in between.
In planning a KP approach, the capability developers should identify and carefully consider the key knowledge gaps associated with the initiative. Which key requirements are considered high risk? What are the system boundaries? When will cost projections and affordability estimates be available? The program manager has a responsibility to assist the capability developer in identifying these knowledge gaps. In a well-designed program, these knowledge gaps will be addressed by the TD phase events planned by the program manager.

Figure 1. Planning a Knowledge-Based Approach

For example, feasible protection requirements are addressed in live fire testing; feasible reliability is assessed in durability testing; weight is assessed in design reviews and upon prototype arrival at test centers. Where knowledge gaps are not covered, the capability developers must work with the program manager to get them addressed in the planned activities. The capability developers must also consider when this information is available with respect to the CDD development timeline, and work with the testing and cost authorities to ensure that their products are available early enough to influence the CDD refinement activities. Stove-piped delivery of test results and cost estimates that are not available until very late in the TD phase will not support the CDD decision timeline. Collaboration is required to sequence test activities and cost analysis activities to address key concerns early using interim reports.
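As a concrete illustration, the following sketch (in Python, with purely hypothetical gap and event names) shows one way to group knowledge gaps by the TD phase events that address them, with each event-driven grouping forming a candidate KP:

    from dataclasses import dataclass

    @dataclass
    class KnowledgePoint:
        name: str                 # e.g., "KP-3"
        trigger_event: str        # TD phase event that makes new data available
        knowledge_gaps: list      # gaps this KP is expected to close

    # Hypothetical mapping of knowledge gaps to the TD phase events that
    # address them, mirroring the examples in the text.
    GAP_TO_EVENT = {
        "feasible protection level": "ballistic hull / live fire testing",
        "feasible reliability": "durability testing",
        "achievable weight": "design reviews and prototype weigh-in",
        "affordability": "interim cost estimate",
    }

    def plan_knowledge_points(gap_to_event):
        """Group knowledge gaps by triggering event; each event yields one KP."""
        by_event = {}
        for gap, event in gap_to_event.items():
            by_event.setdefault(event, []).append(gap)
        return [KnowledgePoint(f"KP-{i}", event, gaps)
                for i, (event, gaps) in enumerate(sorted(by_event.items()), 1)]

    for kp in plan_knowledge_points(GAP_TO_EVENT):
        print(kp.name, "|", kp.trigger_event, "->", ", ".join(kp.knowledge_gaps))

Grouping this way keeps each KP tied to a specific event at which the needed data first become available, as described above.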

Implementing a KP-based approach to incrementally refining the CDD provides several key benefits. First, it provides a framework upon which PMs can base their own plans, synchronizing the overall effort: specification development activities can build their plans from the KP timeline; the AoA and cost analysis teams can use specific KPs as data cut-off points; and key tests can be scheduled to ensure results are available to inform the CDD. Importantly, this approach ensures transparency in how analysis and test results are used to drive key CDD decisions. Second, all CDD decisions are implemented in an open KP format with key stakeholders present. Transparency eliminates confusion, allowing sequential decisions by the systems engineering, test, or cost organizations to proceed with the best information about the intent of each decision and the constraints under which it was made. Third, a KP-based, incremental approach allows the full impact of a decision to be clarified or revisited as the phase progresses. To summarize, a series of KPs supports a deliberate analytical process in which issues are sequentially identified and framed with assumptions, analyses are conducted, and recommended solutions are presented to leadership for decision and recorded in the newest CDD draft.

Executing a Typical Knowledge Point

The capability developers must own the KP process. Each KP event should be structured as a decision brief with defined decision authority (discussed in more detail later in this article). The Requirements Integrated Product Team meetings leading up to each KP are the place for detailed discussions and development of recommended positions on each issue, allowing the KP itself to focus on the ‘so what’ of key analysis or test results. The agenda of each KP can include updates on ongoing studies, but a KP is most effective when focused on final results briefings of completed analyses ready for decision. While large groups tend to complicate decision making, KP attendees should include all the key stakeholders, ensuring transparency and stable decisions. Two stakeholders, the PM and the lead systems engineer, hold special prominence at KP reviews because they hold the most accurate assessments of technical feasibility, maturity, and cost and schedule risk. The capability developer plans, coordinates, and leads the analysis activities, often relying on the PM or other technical experts to assist in the conduct of each analysis activity. When analysis and test activities are ready for presentation, the capability developer and technical experts collaborate in presenting the material at the KP. Later in this article, we discuss access to technical resources, which is a key enabler of sound capability development decisions.

Each KP should include success criteria to communicate the focus of each KP event to stakeholders. The results of the KP should be summarized in a Memorandum for Record (MFR) stored in a location accessible to those who need to reference it. The capability developers and PMs will reorient their efforts after each KP to ensure they are correctly aligned with the overall direction of the capability development effort based on the decisions made at the KP; therefore, the KP and its results must be accessible. External agencies also seek to minimize disruptions to a program and can use KPs as key interface points with which to engage it.

Well in advance of each KP, the capability developers provide a draft copy of the CDD on which stakeholders are invited to comment. Using a standardized format, such as the existing JROC Knowledge Management/Decision Support Comment Resolution Matrix, is recommended for simplicity. Comments from the stakeholders should be returned with sufficient time (approximately 2 weeks) prior to the KP event to allow background work to be conducted on each comment. The capability development team takes each change recommendation and conducts an impact (traceability) assessment to determine which related CDD attributes would be affected by the change. Each comment is characterized as a ‘non-issue,’ a ‘major analysis,’ a ‘minor analysis,’ or a ‘deferral.’ Changes that do not require analysis (i.e., that can be accepted or rejected without further effort) are characterized as non-issues. Changes for which insufficient data exist, or where the answer will become available at a defined future event (such as a test), are characterized as deferrals and are deferred until the correct data are available. Changes proposed to critical requirements or requiring substantial further analysis are characterized as major analyses. Changes requiring further analysis but proposed to lower tiered, non-KPP requirements are characterized as minor analyses. Once comments requiring analysis are characterized, the study objectives are determined, and guidance and resources are assigned (Figure 2).
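A minimal sketch of this triage logic follows, assuming the four characterizations described above; the function and parameter names are illustrative, not part of any fielded tool:

    from enum import Enum

    class Disposition(Enum):
        NON_ISSUE = "non-issue"            # accept or reject without further effort
        MAJOR_ANALYSIS = "major analysis"  # critical (e.g., KPP-level) requirement
        MINOR_ANALYSIS = "minor analysis"  # lower tiered, non-KPP requirement
        DEFERRAL = "deferral"              # data unavailable until a future event

    def triage(needs_analysis, data_available, is_critical):
        """Characterize one proposed CDD change per the rules above."""
        if not needs_analysis:
            return Disposition.NON_ISSUE
        if not data_available:
            return Disposition.DEFERRAL
        return Disposition.MAJOR_ANALYSIS if is_critical else Disposition.MINOR_ANALYSIS

    # Example: a change to a KPP with test data in hand is a major analysis.
    print(triage(needs_analysis=True, data_available=True, is_critical=True))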

Figure 2. Executing the Knowledge Point Macro Process

Once the background work is completed, the results are published as a read-ahead 3 days prior to the KP. This allows participants to arrive knowing all of the salient issues and understanding the decisions needed at the KP. The background work reflects the recommended ‘going-in’ positions; however, to ensure transparency, no decisions are made anywhere except at the KP. At the KP, each proposed change with its supporting analysis is reviewed, and the CDD decision authority adjudicates the proposed changes after receiving input from key stakeholders. Those changes adjudicated as major analyses, minor analyses, or deferrals are tagged as ‘on-hold’ and tracked in the requirements management database. Decisions on each proposed change are made only at KPs and only when the analysis is complete, not necessarily when the proposed change is first submitted.

KPs should include the use of metrics and culminate in a decision to either publish a revised CDD or publish errata. The metrics provide a quantitative snapshot of requirements uncertainty, detailing which studies have been closed and implemented as well as which are outstanding. They reflect how many change proposals are being submitted at a given time and help assess relative success in dealing effectively with the complete set of proposed changes.
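The following sketch illustrates the kind of quantitative snapshot such metrics can provide, assuming a hypothetical record format for change proposals tracked in the requirements database:

    from collections import Counter

    def kp_metrics(proposals):
        """Quantitative snapshot at a KP. Each proposal record is assumed to
        carry a 'status' of 'implemented', 'rejected', or 'on-hold'."""
        counts = Counter(p["status"] for p in proposals)
        closed = counts["implemented"] + counts["rejected"]
        total = len(proposals)
        return {
            "total proposals": total,
            "closed (implemented or rejected)": closed,
            "outstanding (on-hold)": counts["on-hold"],
            "burn-down percent": round(100.0 * closed / total, 1) if total else 0.0,
        }

    print(kp_metrics([{"status": "implemented"}, {"status": "on-hold"},
                      {"status": "rejected"}, {"status": "on-hold"}]))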

Key to sound decision making is the rigorous use of analysis and test results to underpin every activity and decision. Deferring a decision until sufficient information is available is preferable to changing an attribute or CDD section multiple times. Reliance on test results and technical analyses moderates the influence of any one stakeholder group. While not always easy, making CDD decisions only at the KP is key to maintaining transparency and critical to reinforcing the goal of underpinning every CDD change with analysis or test results. The results of each KP should be communicated throughout the capability developer and PM organizations to ensure everyone understands how these decisions affect their own work. This can be accomplished via an MFR summarizing the KP outcomes. Publishing an MFR ensures a single (vice multiple) interpretation of each decision made, which is especially important if the KP decision is to publish errata rather than an updated draft CDD.

For key requirements, the decision authority for a CDD change is typically the capability development senior leadership. Therefore, following select KPs with a General Officer or Senior Executive Service-level leadership review is useful to validate key decisions. For example, a senior leader review can be used to validate a Key Performance Parameter (KPP) or a key trade-off decision with far-reaching effects. This ensures that the Service leadership remains engaged in the capability development initiative, and it can serve as a forum to reconcile differences that could not be resolved at the action officer level. To preclude schedule slip, however, these reviews should be scheduled in advance. Additionally, the scheduling of senior leader reviews should balance leaders' availability and authority with the substance of the issues being reviewed.

Early Use of Systems Engineering Fundamentals

Early use of systems engineering fundamentals is essential to successfully implementing a KP-based capability development approach. Key tenets include: (a) determining the plan upfront; (b) applying best practices; (c) enterprise-level use of requirements management software; (d) access to technical resources; (e) integration of test results; and (f) early and ongoing cost integration.

A comprehensive technical plan is essential during the TD phase: our warfighters depend on us, and a significant amount of taxpayer money is involved in any TD phase initiative. The plan should address the timing, events, and execution of the various KPs and the knowledge gaps they seek to resolve. It should address the roles and responsibilities of the various organizations in terms of issue identification, analysis, decision authority, and closure. Given that many changes to a CDD and system specification can potentially occur, how will these changes be tracked, managed, and burned down? How will analyses initiated at a given KP be tracked and managed? Decision authority is especially important. At each KP, the lead capability developers should make CDD-relevant decisions after hearing the key points of stakeholders, with special attention paid to the PM and lead systems engineer. Certain key decisions, such as those regarding a KPP, should be validated following select KPs at a senior leader review. All of these decisions are re-validated as the CDD moves through Service and Joint staffing as well as during key acquisition meetings such as the Defense Acquisition Board. Finally, the plan should address how software will be used in the process for key activities like requirements management and test integration, and how any classified aspects will be handled. While many of these decisions are simple, they must be documented to ensure common understanding given the number of people involved in a large program; others are not at all simple and require forethought and planning.

All of these decisions and the resulting plan should be documented in the Requirements Management and Analysis Plan (RMAP) and signed by each of the lead capability developers and PMs. Implementing this plan, including the sections described below, requires an investment of resources by the capability developer in terms of people and funding, and a commitment to the processes it describes. For the capability developer, this may require one to three additional staff members, depending on the status of the program; no additional staff are needed for the PM. To keep the RMAP from growing stale, it can be reviewed at each KP to determine whether changes to the plan are needed. A copy of the technical plan used to execute the KP process on JLTV (Pflanz & Clark, 2009) is available through the Defense Technical Information Center (DTIC) Online Access Controlled as accession SURVIAC-SV-33264.

Best practices in systems engineering, as taught at the Defense Acquisition University, must continue to make their way into capability development activities. This principle aligns with the general guidance of JCIDS as described in the Chairman of the Joint Chiefs of Staff Instruction 3170 series. However, certain aspects are particularly important and worth elaboration. First, the attributes in the CDD should include decomposition and relative prioritization (Figure 3). Decomposition is important because it describes how a top-level capability, such as a KPP, is supported by lower level capabilities. A functional hierarchy can be developed to support decomposition using existing systems engineering techniques. During the KP process, this decomposition supports the impact (traceability) analysis that determines which other requirements are affected by a single attribute change.

Relative prioritization is equally important. It can be used to inform trade-off decisions during the KP process, precluding lower level attributes from imposing undue performance or cost risk on a high-priority capability, such as a KPP. Relative priority can also be flowed down into the system specification. It can be established by assessing an attribute's ‘depth’ in the functional hierarchy and through subject matter expertise, and it can be reflected using a set of tiers, where each tier is clearly defined (Figure 3). Industry must understand the relative priority of the requirements to make sensible trade-off decisions when building prototypes.

While the CDD is important, the warfighter ultimately receives what industry builds, and industry builds to the system specification, not the CDD. Therefore, conducting a series of CDD-to-system specification crosswalks is absolutely essential to success. These crosswalks should verify that each attribute is completely and accurately decomposed, and that there are no requirements in the system specification without a parent in the CDD. The results of this series of crosswalks should be agreed to by senior leadership at a formal review prior to strategic points in the acquisition process. This is critical to ensuring the system specification is a sufficient and accurate representation of the warfighter needs stated in the CDD.

Figure 3. Decomposition and Tiering of the CDD Attributes
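To illustrate how decomposition supports both the impact (traceability) assessment and the crosswalk check, consider the following sketch; the attribute and specification identifiers are entirely hypothetical:

    from dataclasses import dataclass

    @dataclass
    class Attribute:
        ident: str
        tier: int      # 1 = KPP; higher numbers indicate lower relative priority
        parent: str    # parent attribute in the functional hierarchy, or None

    # Hypothetical CDD decomposition: a KPP supported by lower level attributes.
    CDD = {
        "KPP-1 Protection": Attribute("KPP-1 Protection", 1, None),
        "A-1.1 Underbody":  Attribute("A-1.1 Underbody", 2, "KPP-1 Protection"),
        "A-1.2 Side":       Attribute("A-1.2 Side", 2, "KPP-1 Protection"),
    }

    # Hypothetical system specification entries, each claiming a CDD parent.
    SPEC = {
        "S-101": "A-1.1 Underbody",
        "S-102": "A-1.2 Side",
        "S-999": "A-9.9 Unknown",   # orphan: no such CDD attribute
    }

    def impacted(attribute):
        """Impact (traceability) assessment: every CDD attribute whose
        parent chain passes through `attribute`."""
        hits = set()
        for a in CDD.values():
            node = a
            while node.parent is not None:
                if node.parent == attribute:
                    hits.add(a.ident)
                    break
                node = CDD[node.parent]
        return hits

    def crosswalk_orphans():
        """CDD-to-specification crosswalk: spec requirements with no CDD parent."""
        return [s for s, parent in SPEC.items() if parent not in CDD]

    print(impacted("KPP-1 Protection"))   # {'A-1.1 Underbody', 'A-1.2 Side'}
    print(crosswalk_orphans())            # ['S-999']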

Enterprise-level use of requirements management software is a key enabler of rigorous execution of the KP process; IBM's DOORS is one popular software package. The increased performance demands, complexity, and costs of the systems now being developed require tight coupling between the operational requirements stated in the CDD, the system requirements stated in the specification, and the test results. All of the requirements documents, such as the CDD and system specification, need to reside in a single database with controlled access for authorized staff among capability developers, PMs, and testers. Since capability developers and PMs are often geographically separated, this may require a networked tool that allows all data to reside in a single database. The more often the CDD is updated, the more frequently ripple effects will flow down to related and child-level documents. While it may be physically possible to manage a CDD in Microsoft Word, doing so is not recommended: changes will get lost or misapplied, and the program office will have difficulty tracing and decomposing requirements as they change.
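As an illustration of the kind of coupling such a database provides, here is a minimal, hypothetical schema sketched in Python with SQLite standing in for an enterprise tool such as DOORS; the table and column names are assumptions for illustration only:

    import sqlite3

    # Single database coupling CDD attributes, specification requirements,
    # and test results, so a change in one ripples traceably to the others.
    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE cdd_attribute (
        ident TEXT PRIMARY KEY,
        tier  INTEGER NOT NULL,                 -- 1 = KPP
        text  TEXT NOT NULL
    );
    CREATE TABLE spec_requirement (
        ident  TEXT PRIMARY KEY,
        parent TEXT NOT NULL REFERENCES cdd_attribute(ident),
        text   TEXT NOT NULL
    );
    CREATE TABLE test_result (
        spec_ident TEXT NOT NULL REFERENCES spec_requirement(ident),
        kp         TEXT NOT NULL,               -- KP at which the result was reviewed
        verdict    TEXT NOT NULL                -- e.g., 'met', 'not met', 'deferred'
    );
    """)

    # A change to one CDD attribute ripples to every child spec requirement
    # and, through those, to the test results that verify them:
    rows = db.execute(
        "SELECT ident FROM spec_requirement WHERE parent = ?", ("KPP-1",)
    ).fetchall()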

Access to technical resources is also essential to effective execution of the KP process. A large-scale capability development effort will involve a wide variety of technical disciplines, and no one organization or individual can be expected to provide technical expertise across the spectrum. Establishing working relationships with, and ensuring access to, technical experts in the government is essential to success. The government has established centers of excellence in almost every area of science and engineering relevant to weapon systems development, and these should be included where possible. By resourcing these agencies to conduct analyses in support of capability development, capability developers gain access to the best minds in government, who are already ‘past the learning curve’ on the particular issue at hand. Importantly, a capability development effort should establish a standing Whole Systems Trade Study (WSTS) group, which focuses on whether the KPPs and other key requirements are achievable at the whole-system level. An example of one such group is the U.S. Army Tank Automotive Research, Development and Engineering Center, Advanced Concepts Lab (TARDEC ACL). On the JLTV program, the TARDEC ACL served as the WSTS group by building full computer models of a government design for JLTV and by analyzing industry designs as they matured. It analyzed whole-system achievability and manipulated designs to answer ‘what if’ questions. The WSTS group's government designs were also used as alternatives in the AoA, and portions of the WSTS group participated in the AoA. For JLTV, the WSTS group was especially important in the decision to increase the underbody protection requirements and determine which other system requirements had to be traded; here, the computer models proved invaluable in underpinning this key protection decision.

Integrating test results is another key enabler of effectively executing the KP process. The test results must show that the requirements stated in the CDD at Milestone B are achievable or, where they have been modified from what the delivered prototypes demonstrated, that a credible expert authority or analytical modeling result estimates them as achievable. Assessing test results is difficult because it involves a complex mapping from multiple prototypes' test results to the achievability of each requirement, and because requirements often change during execution of the KPs. Collaboration among the testers, systems engineers, and capability developers is required to translate the test information into actionable knowledge that can be applied to the CDD.

Test results are traditionally available at the end of testing, and therefore at the end of the TD phase. This is not compatible with a competitive prototyping TD phase in which the requirements are periodically updated, as described previously. However, a prioritized test schedule can be developed using test phases, with results available at the end of each phase and KPs tied to the timing of each phase. If done in priority order, the most important CDD attributes are verified first, with lower importance attributes verified as the testing progresses. There will be important exceptions to this rule. For example, durability testing typically involves long durations; therefore, reliability and certain sustainment attributes cannot be verified until late in the phase. These exceptions can be accommodated while still verifying as many key attributes as early as possible.
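A simple sketch of priority-ordered test phasing follows; the attribute names and tiers are hypothetical, and long-duration exceptions such as durability testing would be pinned to late phases outside this simple rule:

    def schedule_test_phases(attributes, phases):
        """Assign attributes to test phases in priority (tier) order so the
        most important CDD attributes are verified first and each phase's
        results can feed a KP."""
        ordered = sorted(attributes, key=lambda a: a[1])   # sort by tier
        plan = {p: [] for p in range(1, phases + 1)}
        for i, (ident, _tier) in enumerate(ordered):
            plan[1 + i * phases // len(ordered)].append(ident)
        return plan

    # Hypothetical (identifier, tier) pairs.
    attrs = [("KPP-1 Protection", 1), ("A-2 Payload", 2),
             ("KSA-1 Transportability", 1), ("A-3 Fuel economy", 3)]
    print(schedule_test_phases(attrs, phases=2))
    # {1: ['KPP-1 Protection', 'KSA-1 Transportability'],
    #  2: ['A-2 Payload', 'A-3 Fuel economy']}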

The purpose of the TD phase is to “get the requirements ‘right’”; therefore, a logical consequence of the TD phase is changed requirements. Where possible, the test plans should be modified to reflect changes made to the CDD at a prior KP. For example, if a KPP or the mission profile changes in time to be reflected in testing, then the program will benefit from testing to the new requirement rather than the old one. Failing to meet a modified requirement (to which industry did not design) does not necessarily invalidate the new requirement; however, it does increase the level of uncertainty in the achievability of that attribute. Finally, it is essential for capability developers to be present at certain key test events, both to collect the ‘right’ take-aways from the test and to ensure that the testing is in accordance with the implicit vision and explicit attributes of the CDD.

Early integration of cost estimates is the last key enabler of the KP process. Test results and delivered prototypes may demonstrate achievability, but not affordability. Correlating cost estimates from prototype work with affordability estimates requires early involvement of the cost estimating community, which entails several key activities. The first is establishing a cost threshold beyond which the system is at risk of not being affordable; in the case of JLTV, this meant establishing cost as a Key System Attribute. The second is correlating cost-driving requirements with the relative priority of requirements. Using a cost-informed trade-off assessment, the capability developer must be prepared to make difficult trade-off decisions to ensure that low-priority, cost-driving requirements do not price the capability above the affordability cutline. Typically, these cost trade-off decisions require the participation of senior leaders. This integration of cost analysis is similar in scope to the DoD's Better Buying Power initiative (https://dap.dau.mil/bbp), where cost analyses are used to inform systems engineering trade-off decisions to meet an affordability target.
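The following sketch illustrates a cost-informed screening of trade-off candidates, under the assumption (consistent with the tiering discussed earlier) that KPPs are never traded automatically; the record format and the numbers are hypothetical:

    def flag_cost_trades(requirements, affordability_cutline):
        """Flag low-priority, cost-driving requirements for trade-off review
        when projected unit cost exceeds the affordability cutline. Each
        record is a dict {'ident', 'tier', 'cost_delta'}, where 'cost_delta'
        is the requirement's estimated contribution to unit cost."""
        projected = sum(r["cost_delta"] for r in requirements)
        shortfall = projected - affordability_cutline
        if shortfall <= 0:
            return []
        flagged = []
        # Consider lowest-priority (highest tier), most cost-driving first;
        # KPPs (tier 1) are never traded automatically.
        for r in sorted(requirements, key=lambda r: (-r["tier"], -r["cost_delta"])):
            if shortfall <= 0:
                break
            if r["tier"] > 1:
                flagged.append(r["ident"])
                shortfall -= r["cost_delta"]
        return flagged

    reqs = [{"ident": "KPP-1", "tier": 1, "cost_delta": 150_000},
            {"ident": "A-7 Heated seats", "tier": 3, "cost_delta": 4_000},
            {"ident": "A-4 Export power", "tier": 2, "cost_delta": 12_000}]
    print(flag_cost_trades(reqs, affordability_cutline=155_000))
    # ['A-7 Heated seats', 'A-4 Export power']

Flagged items are candidates for senior leader review, not automatic deletions; the point is to surface low-priority cost drivers before they price the capability above the cutline.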

Conclusions

This article described a new approach to capability development under the DoD's competitive prototyping guidance for the TD phase. It focused on how draft requirements are refined through a series of KPs, enabled by early use of systems engineering fundamentals. Future programs can tailor the application of these ideas to their own peculiarities; however, the common principles described here will endure regardless of scope or application.



Author Biographies

Dr. Mark Pflanz is a lead associate for Booz Allen Hamilton and has supported USMC ground vehicle development since 2004. He is a graduate of the U.S. Military Academy at West Point and a former U.S. Army officer. Dr. Pflanz holds a Master of Science in Systems Engineering from Virginia Polytechnic Institute and State University, and a PhD in Systems Engineering and Operations Research from George Mason University.

(E-mail address: pflanz_mark@bah.com)

Mr. Chris Yunker currently works at the Nevada Automotive Test Center Virginia Office. Formerly, he was the mobility section lead, Fires and Maneuver Integration Division, Deputy Commandant for Combat Development and Integration (CD&I), Headquarters Marine Corps, in Quantico, Virginia. Mr. Yunker was the lead combat developer on multiple Marine Corps ground vehicle initiatives, including the JLTV. He is a graduate of Cornell University and a retired U.S. Marine Corps officer.

(E-mail address: cyunker@natc-ht.com)

Mr. Friedrich N. Wehrli is currently the mobility division chief, Materiel Systems Directorate, U.S. Army Sustainment Center of Excellence, Fort Lee, Virginia. As the mobility chief, he is the U.S. Army lead capabilities developer for the JLTV, Driver Vision Enhancer, and multiple U.S. Army watercraft. Mr. Wehrli is a graduate of the University of Connecticut and a retired U.S. Army officer.

(E-mail address: friedrich.n.wehrli.civ@mail.mil)

Mr. Douglas Edwards is currently a program analyst at the Fires and Maneuver Integration Division, Combat Development Directorate, Headquarters, U.S. Marine Corps (USMC), Combat Development and Integration Command in Quantico, Virginia, where he is the USMC Joint Light Tactical Vehicle capabilities integration officer. Mr. Edwards is a Certified Defense Financial Manager and holds an MS in Management from the U.S. Naval Postgraduate School.

(E-mail address: douglas.w.edwards@usmc.mil)


References

Pflanz, M., & Clark, N. (2009, January 21). Joint Light Tactical Vehicle requirements management and analysis plan (RMAP) (Report No. SURVIAC-SV-33627). Wright-Patterson Air Force Base, OH: Survivability/Vulnerability Information Analysis Center.

Young, J. (2007). Prototyping and competition [Memorandum]. Retrieved from https://acc.dau.mil/CommunityBrowser.aspx?id=180554
