Project management has been a constant challenge for the U.S. Department of Defense (DoD) acquisition community. While most DoD projects are technologically advanced, the tools and methods to manage these projects are the same as for simple, repetitive projects. The authors argue that traditional approaches fail because they only evaluate the relationships between two of the three elements of cost, schedule, and performance. Instead, they have developed a system dynamics model that allows cost, schedule, and performance to interact and influence one another. This model is complementary to other research and intended to be usable by the practicing project manager. The results from model runs will provide consequences for three potential control alternatives in DoD project management.
The search for the ideal form of DoD project management has existed since the dawn of DoD projects and is driven by a desire to reduce the DoD project failure rate. In fact, many external observers as well as quite a few DoD acquisition professionals would probably say that more DoD projects fail than succeed. Many researchers have explored the reasons for general project failure in great detail. They have developed many theories that withstand academic scrutiny, but even a basic understanding of the nature of projects varies depending on project characteristics, the project industry, and where researchers decide to restrict their studies. Ultimately, no one solution makes DoD or any other domain of project management work. That the DoD project management industry has not had more research is somewhat surprising given the large number and size of DoD projects. However, this is a common problem in most project management domains (Love, Edwards, & Irani, 2008). Certainly, DoD could gain more appreciation and insight from an improved understanding of the way it conducts project management, which is different from other industries.
According to the U.S. Government Accountability Office (GAO), half of the major defense acquisition programs are not meeting cost goals, and 80 percent have increasing unit costs (GAO, 2011). GAO further notes that between 2008 and 2010, the budgets of the 98 major defense acquisition projects grew by 9 percent (GAO, 2011). In the Fiscal Year (FY) 2012 budget request, the U.S. Department of Defense (Comptroller) asked for $85.3 billion of its $553.1 billion budget, or approximately 15.4 percent, for major defense acquisition projects (Comptroller, 2011).
Frequently, budget pressure, schedule pressure, or changing user demands are cited as the reasons for both commercial and DoD failures (Meier, 2010). However, these challenges have been present for as long as projects have been undertaken, and the trend within DoD is getting worse instead of better (GAO, 2011). The fact that many of these projects are developmental and have little or no basis for comparison is a fair excuse for why initial cost, schedule, and performance estimates prove to be incorrect. However, DoD project management needs to develop approaches to overcome the current trends. The first step to improvement is a solid understanding of how most DoD project management works. The goal of the authors’ research is to ultimately provide insight into the practical application of alternative decisions within DoD project management from the perspective of the government project manager.
Project management has existed since man began building things. However, many researchers date the beginnings of the formal discipline of project management to the U.S. Polaris missile development and the use of critical path methods in the 1950s (Lyneis, Cooper, & Els, 2001; Pich, Loch, & De Meyer, 2002; Tishler, Dvir, Shenhar, & Lipovetsky, 1996; Williams, 2005). Therefore, a close connection exists between DoD projects and formal project management, but not all project management is the same. Tishler et al. (1996) note that defense projects differ from commercial projects due to a larger, more interdisciplinary design and higher technological risks. Despite this early bond between DoD and project management, many senior U.S. DoD leaders as well as the U.S. Congress have expressed the view that DoD project management needed improvement as early as 1970, and have since changed the acquisition policy guidance nine times (Ferrara, 1996).
Sorenson (2009) provides an in-depth overview of the history of U.S. DoD acquisition. He notes that the “current defense acquisition process is constructed on a foundation of distrust” (Sorenson, 2009). By this he means that the distribution of power, as well as the extensive oversight, is in place to ensure that everyone involved is doing the right thing and to avoid repeating a past history of highly publicized procurement irregularities related to defense acquisition. He notes that there are variations in how projects are executed and in the decisions made on varying projects. Sorenson (2009) further comments that the Secretary of Defense has (though infrequently) terminated acquisition programs, but Congress never has. He highlights many of the problems with defense acquisition, from poor cost estimates to development delays to changing requirements to excessive oversight (Sorenson, 2009).
Other researchers have found that many DoD project managers underestimate cost and schedule due to the failure to understand complexities involved as well as seemingly futile efforts to correct an underperforming project, which often results in blaming exogenous variables as opposed to endogenous ones (Lyneis & Ford, 2007). They further add that a great deal of research exists noting general theories on the need to reduce elements of project management rework cycles, but domain-specific advice or research is limited (Lyneis & Ford, 2007). Ford and Dillard (2009) address this deficiency using a system dynamics model of the JAVELIN missile development program that allowed them to evaluate the strengths and weaknesses of “evolutionary acquisition” by comparing two strategies for system development.
System dynamics also has a long history with DoD. One of the earliest uses of system dynamics within DoD was in the diagnosis and legal support for delay and disruption claims by Ingalls Shipbuilding against the U.S. Navy in the development of amphibious assault ships in the 1970s (Cooper, 1980). This model was effective, but it took the perspective of the contractor. An extensive system dynamics model was later used to evaluate the general field of software development (Abdel-Hamid & Madnick, 1991). Black and Repenning (2001) developed a generic system dynamics model, based on commercial new product development in manufacturing, to evaluate how early failure to apply appropriate resources to a project (or multiple projects) produces a “firefighting” phenomenon that degrades project performance. Taylor and Ford (2006) further reinforce this research with the same phenomenon and additional “tipping point” analysis applied to construction management. These research results are highly valuable, but they focus on the commercial world, which differs from government project management in that the government’s primary role is managing a contractor who performs the development. More recent uses of system dynamics models have been in the actual prosecution of combat operations in the areas of command and control, search and rescue, and irregular warfare (Coyle, Exelby, & Holt, 1999; DoD Announces, 2008). All of these models are valuable and insightful, but they do not provide practicing project managers much specific detail on ways to perform their jobs better.
Ford and Dillard’s JAVELIN system dynamics model, developed in the last 3 years, is one of the most recent and provides good insight into varying acquisition approaches to a project. Both Ford and Dillard and the current authors agree that a DoD project manager must still accomplish a single-block development even within the larger evolutionary acquisition, and that attempting to document and model all external influences on DoD project management may be futile. Therefore, our approach is to develop a historically based, empirical model that produces the final cumulative cost, schedule, and performance results in a manner that allows us to evaluate the consequences of three simple control alternatives within any larger acquisition framework. Thus, our model could one day be incorporated into Ford and Dillard’s to provide additional understanding of the dynamics of DoD project management. We believe our research will help practicing DoD project managers better understand the positive and negative consequences of simple project control alternatives they may consider.
If you were to ask many DoD professionals to describe how acquisition truly works at the strategic level, they would describe a framework similar to that depicted in Figure 1. In this figure, one strategic activity is all or the vast majority of what influences the next downstream strategic activity. Therefore, a given threat (or change in threat) produces a new requirement that changes the budget estimate, which in turn drives the schedule estimate, and the end result is a performance expectation. At the budget event, Congress may intervene and adjust the budget up or down, which impacts the downstream activities. Eventually, the expected performance will have an impact on the threat. While this figure is relatively simple to understand, the problem is that it ignores the differing speeds of change among these subactivities (e.g., budget development occurs every 2 years, while requirements may change every year or less) and ignores other impacts of subactivity interaction. It further assumes that a single change can be controlled or managed with “simple processes” such as the monitoring of a work breakdown structure or earned value management. Previous research has shown these techniques may be effective when there are relatively few or no unknowns, but they prove inadequate when there are many unknowns or when the true state of a variable may not be known for some time (Dvir & Lechler, 2004; Thomas & Mengel, 2008).
Figure 1. DoD Acquisition at the Strategic Level
As an alternative, the authors suggest that DoD professionals adopt a mental framework like that shown in Figure 2. In this figure, all of the same activities as Figure 1 are present. However, Figure 2 illustrates that delays are present and every activity impacts every other activity. This means that an adjustment to the budget will impact schedule and performance over time. That this is a more accurate depiction of the real world is usually not in question, but the major issue is how we deal with this. The authors believe that a system dynamics model will serve as an effective tool to better understand what is going on and to propose an alternative for improved system response. In other words, we hope to find a better way to perform DoD project management for the practicing project manager.
Figure 2. Mental Framework
It is important to first note that the authors subscribe to the belief that DoD project management as a system is poorly understood. Therefore, our objective is to better understand the system and its responses through the use of an empirical model. Because of this objective, our model is relatively simple and strives to show several aspects of the system and its dynamic behavior over time that may help improve the overall results. The model is shown in Figure 3.
Figure 3. Work Flow Process Surrounding System Design/Understanding
The foundation of our model is a work flow process surrounding system design/understanding. Our work flow process is very simple and defines system design/understanding project work of any kind as a percentage in one of three states: work to be done, work in progress, and work completed. The completion of all work would equate to a perfect understanding of the system being acquired, which should be the goal of all projects. While we make a simplifying assumption that all work is equal in priority and execution, we believe that over the entire project life cycle this is the ideal case and appropriate for our objective. With all other influences removed, our work flow completes all activities within 15 years, which is appropriate for most DoD projects. In our baseline model, no unknown work or rework is included. While this is certainly neither accurate nor representative of the real world, our goal is to best understand the ideal case before moving to more intricate situations.
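As a rough illustration, this three-state work flow can be sketched as a simple stock-and-flow simulation. The function name, rate constants, and time step below are our own illustrative assumptions, not values from the authors' model:

```python
def simulate_workflow(years=15, start_rate=0.2, finish_rate=0.4, dt=0.25):
    """Toy three-state work flow: work to be done -> in progress -> completed.

    All stocks are percentages of total project work; rates are per year.
    """
    to_do, in_progress, completed = 100.0, 0.0, 0.0
    for _ in range(int(years / dt)):
        started = start_rate * to_do * dt          # work moved into progress
        finished = finish_rate * in_progress * dt  # work moved to completion
        to_do -= started
        in_progress += started - finished
        completed += finished
    return to_do, in_progress, completed
```

With no control feedback slowing the rates, essentially all work completes within the 15-year horizon, consistent with the ideal case described above.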
To allow for dynamic consequences, control of starting and completing work is done through a comparison of the projected or estimated budget, schedule, or performance to the actual budget, schedule, or performance. In the case of the budget comparison, we have incorporated a 2-year delay due to the DoD Planning, Programming, Budgeting and Execution process where a project budget is submitted, and about 2 years later the actual approved budget is delivered. (We use the terms “budget” and “cost” interchangeably in this model for simplicity, and are only focused on total system design and production costs, not the actual total life-cycle costs that include sustainment and disposal.) When the difference between the estimated budget, schedule, or performance and the actual budget, schedule, or performance is positive (i.e., the project is under cost and/or ahead of schedule and/or less capable than initially desired), then work is allowed to start and be completed at an accelerated rate corresponding to an increase in work execution. However, when the difference in the estimated budget, schedule, or performance and the actual budget, schedule, or performance is negative (i.e., the project is over cost and/or behind schedule and/or more capable than initially desired), then the work is slowed to a decelerated rate corresponding to a slowing of work execution or delaying of work. As an example, if the actual budget is 25 percent over the estimated budget, then the work initiation and completion rates are slowed by 12.5 percent due to the 2-year delay. In this model, budget and schedule are equally important and contribute the same to the work rates. Therefore, a situation with a 25 percent over budget and a 25 percent over schedule would result in a corresponding 37.5 percent reduction to the work rates.
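The slowdown arithmetic above can be captured in a small function. The name is ours, and halving the budget signal is an assumption consistent with the 25 percent/12.5 percent example in the text:

```python
def work_rate_reduction(budget_overrun, schedule_overrun):
    """Fractional slowdown of the work start/completion rates.

    The budget signal is attenuated by half to reflect the 2-year PPBE
    delay; the schedule signal acts at full strength.
    """
    return 0.5 * budget_overrun + schedule_overrun

print(work_rate_reduction(0.25, 0.0))   # 25% over budget alone -> 0.125
print(work_rate_reduction(0.25, 0.25))  # over budget and schedule -> 0.375
```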
Budget and schedule flows are based on the work in progress. The budget flow is the product of work in progress, a cost-per-work constant, the ratio of the number of systems to the desired number of systems, and the ratio of current performance to desired performance. This assumes a cost reduction is associated with fewer systems and less system performance. The schedule flow is the product of the work in progress and a schedule-per-work constant. In principle, either flow could move in a positive or negative direction, allowing for budget or schedule reductions; in practice, however, because some amount of work is in progress at any given time, these flows never go negative. It is also important to note that budget and schedule do not directly influence each other in this model. This reflects the observation that a budget increase does not guarantee a schedule reduction, nor does a schedule increase guarantee a budget reduction (assuming all other factors are the same).
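Stated as code, the two flows look roughly like the following. The names are illustrative paraphrases of the definitions above, not the authors' actual implementation:

```python
def budget_flow(wip, cost_per_work, n_systems, desired_systems,
                performance, desired_performance):
    # Budget accrues with work in progress, scaled down when system
    # quantity or performance targets are cut.
    return (wip * cost_per_work
            * (n_systems / desired_systems)
            * (performance / desired_performance))

def schedule_flow(wip, schedule_per_work):
    # Schedule accrues with work in progress only; budget and schedule
    # do not directly influence each other in this model.
    return wip * schedule_per_work
```

Because some work is always in progress and the constants are positive, both flows stay nonnegative, as noted above.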
Previous project management research has addressed the interactions of cost, schedule, and work. Many earned value management and earned schedule studies have evaluated cost, schedule, and work. However, little previous research has allowed cost, schedule, and work levels to interact with performance levels over time. Our research provides some insight into this interaction. We have incorporated performance in our model in two ways. First, the number of systems is another flow that impacts the total budget. As long as the project is under budget, the number of systems remains the same. Once the project goes over budget, the number of systems is reduced in an effort to curb cost growth. This is a common behavior observed in DoD projects.
Second, performance is evaluated through a percentage of the initial desired performance. The desired performance begins at 100 percent and, like system quantity, will remain at its current level as long as the project is under budget. Also like system quantity, once a project is over budget, the performance level is reduced by a percentage in an effort to reduce the budget and schedule of the system development. Additionally, the amount of performance degradation could also be thought of as a quantifiable estimate of program risk, which may not be a problem or may be tolerable to the project stakeholders. Both of these performance measures can be generically applied and are helpful in gaining a basic understanding of how this model operates. While both performance measures impact the budget flow, only the performance level compared to the desired performance level impacts the work flow. This reflects the assumption that complete system understanding can be gained through one system and that no further insight is gained from additional system production.
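The quantity and performance responses to budget pressure can be sketched as follows. The 5 percent cut sizes are our own illustrative assumptions, not the authors' calibrated values:

```python
def apply_budget_pressure(actual_cost, estimated_cost, n_systems, performance,
                          quantity_cut=0.05, performance_cut=0.05):
    """If the project is over budget, trim the system quantity and the
    performance target to relieve cost growth; otherwise leave both alone."""
    if actual_cost > estimated_cost:
        n_systems *= (1.0 - quantity_cut)
        performance *= (1.0 - performance_cut)
    return n_systems, performance
```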
More detailed aspects such as rebaselining or evaluating which performance elements are reduced have been excluded so that the essential model behavior can be observed. Verification and validation of this model were conducted through two means. First, common system dynamics practices as referenced in Barlas (1996) were successfully applied. Second, the authors used a case study to validate the results of the model against actual system performance. The authors chose the U.S. Army’s Future Combat System as the case study.
Case Study—The Army’s Future Combat System
In 1999, the U.S. Army began designing the Future Combat System (FCS) as a means of preparing itself for what it expected to be the future of warfare. The Army expected to have the first unit equipped in 2011 and the entire Army equipped by 2032. FCS involved multiple air and ground systems that were networked and interoperable. One key tenet of the FCS effort was that information could replace mass, and a second tenet was that FCS components could be deployed rapidly. The ultimate combination was a highly technical and revolutionary system-of-systems that sought to push technology and balance many competing priorities (GAO, 2008). FCS was officially terminated as a program in the summer of 2009.
While many unique and interesting dynamics surround this program, it is used here as a means of verifying and validating our model. In our model, the key FCS inputs were the project cost and schedule estimates, taken from the U.S. Congressional Budget Office (CBO, 2006) report on FCS. The model set 2005 as the base year, based on the availability of fiscal information, and extends until 2025. Our model result of a total system cost in 2025 of $161 billion is consistent with the CBO estimate of $160.6 billion. While the CBO may have been in error in its assessment, the fact that our model achieves similar results instills confidence in our approach. Additionally, our model estimates that when the project was terminated in 2009, the number of systems would have been reduced to 14, with an estimated project completion date of 2021. These results are consistent with the actual results of that time and provide the final validation of our model for general understanding.
Using this model, we now turn to evaluating how varying responses impact the results. We have focused on project manager responses, and these results should hold true as long as DoD projects are evaluated against their initial (or current) estimates. The potential strategies evaluated are to Remove Controls, Ignore Schedule, Ignore Budget, and Improve Estimate. The reduction in total systems remains the same for all strategies so further discussion of it is excluded.
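One way to picture the strategies is as toggles on the comparison signals that feed the work rates. This encoding is hypothetical; the halved budget signal mirrors the 2-year delay example given earlier:

```python
STRATEGIES = {
    # (use_budget_comparison, use_schedule_comparison)
    "Baseline":         (True,  True),
    "Remove Controls":  (False, False),
    "Ignore Schedule":  (True,  False),
    "Ignore Budget":    (False, True),
    "Improve Estimate": (True,  True),  # differs via better initial estimates
}

def work_rate_factor(strategy, budget_overrun, schedule_overrun):
    """Multiplier applied to the work rates under a given control strategy."""
    use_budget, use_schedule = STRATEGIES[strategy]
    slowdown = (0.5 * budget_overrun if use_budget else 0.0) \
             + (schedule_overrun if use_schedule else 0.0)
    return max(0.0, 1.0 - slowdown)
```

Removing controls leaves the work rates untouched regardless of overruns, while the baseline slows work as both comparisons accumulate.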
One potential strategy to improve project success is to remove all project controls. While this model does not account for some of the uncertainties and unknowns that occur within the life of the project, it does help us evaluate a perfect-world scenario. In this perfect world with no reduction in work flow, the project completes all work by 2025, but only attains 67 percent of the desired capability. The total budget is $147 billion, a 9 percent cost reduction from the baseline.
Another potential strategy to improve project success is to ignore the schedule comparison. This alternative operates on the principle that “if you need it bad enough, you will do anything to get it.” Upon first look, this alternative achieves the lowest total cost at $141 billion (a 13 percent reduction from the baseline), with all work completed by 2014 and a performance drop of 21 percent from the initial desired capability.
Positive results were achieved by ignoring the schedule comparison, so the authors were interested in what would happen if the budget comparison were ignored instead. In this case, the total cost was $133 billion, an 18 percent reduction from the baseline, which looks very attractive. However, only 67 percent of the initial desired performance is achieved at system completion, which does not occur until 2032. While this strategy results in the best cost reduction, the performance and timeline are sacrificed.
Another seemingly simple strategy is to focus all efforts on knowing the true cost and schedule up front. This approach is consistent with most systems engineering literature and with what senior DoD acquisition leaders advise. While not the focus of this research, determining how to accomplish this is another challenge entirely. However, this approach results in a total budget of $210 billion, or 130 percent of the baseline, with 86 percent of the desired performance delivered.
| Strategy | Total Cost | Performance Delivered | Work Completed by 2025 | Year All Work Completed |
| --- | --- | --- | --- | --- |
| Baseline | $161B | 69 percent | 100 percent | 2021 |
| Remove Controls | $147B | 67 percent | 100 percent | 2025 |
| Ignore Schedule | $141B | 79 percent | 100 percent | 2014 |
| Ignore Budget | $133B | 67 percent | 77 percent | 2032 |
| Improve Estimate | $210B | 86 percent | 100 percent | 2022 |
These data reveal two general observations. First, in a perfect world, potential opportunities for cost reductions abound; second, they come at the expense of performance level. The largest variation in strategies regarding cost is 30 percent, which could trigger a Nunn-McCurdy breach and require congressional reporting. However, the trade-off is clearly associated with the amount of performance delivered, which varies by at most 19 percent, and with when that performance is delivered, which varies by as much as 18 years. Additional statistical analysis shows that no two factors are highly correlated, but that the most likely relationship is between total cost and performance delivered.
What these results mean to a DoD project manager is that no single strategy is likely to achieve cost, schedule, and performance optimization. Project managers need to evaluate the prioritized objectives of that project’s stakeholders and develop their strategies to meet those priorities. For instance, if a project is needed quickly, then ignoring how the project is comparing to the initial schedule may be the best solution. If a project needs high performance, most likely a good strategy for the project manager is to ensure the systems engineering and analysis is performed early so that the best cost and schedule estimates can be made.
Future Research and Conclusions
While these results are interesting, they are certainly not to be considered rigid rules of DoD project management. In fact, many elements and influences have been excluded in this model in an effort to gain an initial understanding of the total system behavior. The dynamics of how DoD project work is prioritized and executed, the dynamics of varying design and evaluation methods, and the dynamics and value of the three-milestone DoD acquisition gate process are all work-related influences that should be further studied. Another potential future area of study is the combination of this model into Ford and Dillard’s evolutionary acquisition comparison to see if even simple control alternatives affect the results of the research. The dynamics of varying budget delays as well as the impact of congressional budget action should also be further studied. Finally, the dynamics of system quantity changes to the actual system cost is an area that can also be expanded to provide better fidelity in this model.
All of these future areas of study require significant investigation and likely vary from project to project. This further supports the authors’ theory that DoD project management is a highly contextual process requiring dynamic understanding of influences that sometimes do not make themselves known for some time. A system dynamics model such as the one discussed in this article could be used to identify and predict total project behavior so that varying strategies could be evaluated to find the best one for a given situation. The model could certainly be expanded and complemented, and the DoD would do well to invest more resources in exploring why projects fail (or succeed), documenting the circumstances and influences, and distributing the findings for widespread use in the project management arena. However, relying on a single (or even several) causal factor to explain all projects should be avoided, as every project is unique and must be evaluated in its own context.
Mr. Patrick R. Cantwell works as a system engineer with SURVICE Engineering Company. He is currently a PhD candidate in systems engineering at The George Washington University with research focusing on complex systems and system dynamics. Mr. Cantwell holds a BS in Naval Architecture from the U.S. Naval Academy and an MS in Engineering Management from The George Washington University.
(E-mail address: firstname.lastname@example.org)
Dr. Shahram Sarkani, P.E., is professor of Engineering Management and Systems Engineering (EMSE), and director of EMSE Off-Campus Programs at The George Washington University. He designs and administers graduate programs that enroll over 1,000 students across the United States and abroad. In over 150 technical publications and in sponsored research with the National Aeronautics and Space Administration, National Institute of Standards and Technology, National Science Foundation, Agency for International Development, and Departments of Interior, Navy, and Transportation, his research has application to risk analysis, system safety, and reliability. Dr. Sarkani holds a BS and MS in Civil Engineering from Louisiana State University and a PhD in Civil Engineering from
(E-mail address: email@example.com)
Dr. Thomas A. Mazzuchi is professor of Engineering Management and Systems Engineering at The George Washington University. His research interests include reliability, life testing design and inference, maintenance inspection policy analysis, and expert judgment in risk analysis. Dr. Mazzuchi served as research mathematician at Royal Dutch Shell, and has conducted research for the U.S. Air Force, U.S. Army, and U.S. Postal Service, National Aeronautics and Space Administration, and for the Port Authority of New Orleans, among others. Dr. Mazzuchi holds a BA in Mathematics from Gettysburg College, and an MS and DSC in Operations Research from The George Washington University.
(E-mail address: firstname.lastname@example.org)
Abdel-Hamid, T., & Madnick, S. E. (1991). Software project dynamics: An integrated approach. Upper Saddle River, NJ: Prentice Hall.
Barlas, Y. (1996). Formal aspects of model validity and validation in system dynamics. System Dynamics Review, 12(3), 183–210.
Black, L., & Repenning, N. (2001). Why firefighting is never enough: Preserving high-quality product development. System Dynamics Review, 17(1), 33–62.
Cooper, K. (1980). Naval ship production: A claim settled and a framework built. Interfaces, 10(6), 20–36.
Coyle, J., Exelby, D., & Holt, J. (1999). System dynamics in defense analysis: Some case studies. Journal of the Operational Research Society, 50(4), 372–382.
DoD announces winners of annual modeling and simulation awards for excellence. (2008). Defense AT&L, 37(4), 79. Alexandria, VA: Defense Acquisition University.
Dvir, D., & Lechler, T. (2004). Plans are nothing, changing plans is everything: The impact of changes on project success. Research Policy, 33, 1–15.
Ferrara, J. (1996). DoD’s 5000 documents: Evolution and change in defense acquisition policy. Acquisition Review Quarterly, 3(2), 109–130.
Ford, D., & Dillard J. (2009) Modeling the performance and risks of evolutionary acquisition. Defense Acquisition Review Journal, 16(2), 143–158.
Love, P.E.D., Edwards, D. J., & Irani, Z. (2008). Forensic project management: An exploratory examination of the causal behavior of design-induced rework. IEEE Transactions on Engineering Management, 55(2), 234–247.
Lyneis, J. M., Cooper, K. G., & Els, S. A. (2001). Strategic management of complex projects: A case study using system dynamics. System Dynamics Review, 17(3), 237–260.
Lyneis, J. M., & Ford, D. N. (2007). System dynamics applied to project management: A survey, assessment, and directions for future research. System Dynamics Review, 23(2-3), 157–189.
Meier, S. R. (2010). Causal inferences on the cost overruns and schedule delays of large-scale U.S. federal defense and intelligence acquisition programs. Project Management Journal, 41(1), 28–39.
Pich, M. T., Loch, C. H., & De Meyer, A. (2002). On uncertainty, ambiguity, and complexity in project management. Management Science, 48(8), 1008–1023.
Sorenson, D. S. (2009). The process and politics of defense acquisition: A reference handbook. Westport, CT: Praeger Security International.
Taylor, T., & Ford, D. N. (2006, Spring). Tipping point failure and robustness in single development projects. System Dynamics Review, 22(1), 51–71.
Thomas, J., & Mengel, T. (2008, January). Preparing project managers to deal with complexity—Advanced project management education. International Journal of Project Management, 26(3), 304–315.
Tishler, A., Dvir, D., Shenhar, A., & Lipovetsky, S. (1996). Identifying critical success factors in defense development projects: A multivariate analysis. Technological Forecasting and Social Change, 51(2), 151–171.
U.S. Congressional Budget Office. (2006). The Army’s Future Combat Systems program and alternatives (Report No. 2565). Washington, DC: Author.
U.S. Department of Defense (Comptroller). (2011). Overview—FY2012 defense budget. Retrieved from http://comptroller.defense.gov/budget.html
U.S. Government Accountability Office. (2008). 2009 is a critical juncture for the Army’s Future Combat System (Report No. GAO-08-408). Washington, DC: Author.
U.S. Government Accountability Office. (2011). Defense acquisitions: Assessments of selected weapon programs (Report No. GAO-11-233SP). Washington, DC: Author.
Williams, T. (2005). Assessing and moving on from the dominant project management discourse in the light of project overruns. IEEE Transactions on Engineering Management, 52(4), 497–508.