How Full Is the Toolbox? A Look at Product Support Analysis Tools



Author: Michael A. Bayer

We have long known the value of using tools to make our jobs easier, and our guidance and policy actually advocate the use of applicable tools. But when asked to do a task, does the logistician actually have the tools in the toolbox to help in accomplishing the task? Clearly, the Government Accountability Office (GAO) didn’t think so, based upon GAO Report 09-41, Defense Logistics: Improved Analysis and Cost Data Needed to Evaluate the Cost-Effectiveness of Performance Based Logistics (PBL), which stated “. . . although DoD’s guidance recommends that business case analyses be used to guide decision making regarding the implementation of PBL to provide weapon system support, the services are not consistent in their use of such analyses.”

The report went on to say “additionally, most of the services have not established effective internal controls to ensure that the analyses are prepared or that they provide a consistent and comprehensive assessment of weapons system support options.”

The results published by the GAO were further substantiated when the DoD Weapon System Acquisition Reform Product Support Assessment Team (PSAT), a 65-member cadre of DoD and industry members, hypothesized, “If the DoD clarifies and codifies the larger group of analytical tools by which product support Business Case Analyses (BCAs) are conducted, it will improve the effectiveness of the BCA as a decision-making tool.”


The Concern Demands a Response—Research Meets Application

The Office of the Assistant Secretary of Defense for Logistics and Materiel Readiness—ASD(L&MR)—suggested a deeper look into the issues raised by both the GAO and PSAT reports, soliciting the assistance of the Defense Acquisition University (DAU) to research the concerns. However, this was not just any research project where the goal was to support, or not support, a hypothesis and later provide recommendations. Rather, the results of this research were intended for use in spearheading a tangible solution to the crisis in Product Support Analytical Tools.

The Research Piece

Stage One consisted of an extensive literature review in which the DAU research team scoured previously conducted efforts, examining sources of existing analytical tools and ferreting out bits of useful information about the tools, including their applicability, usefulness, ease of use, and accessibility to the workforce. Stage Two consisted of survey research and personal interviews. Program managers, product support managers, and financial managers from across the services, academia, and industry were asked whether they used Product Support Analytical Tools during the course of their work; if so, for what purpose; if not, why not. Those who said they used Product Support Analytical Tools were asked to provide information regarding the specific tools they used.

So What’s the Real Problem?

Typically, all research begins with a problem statement, defining the concern at hand and explaining the reason for the research. We at DAU found, along with the PSAT and GAO, that there had been inconsistent use of Product Support Analytical Tools in BCAs to determine the best product support option. Also, while there were several lists of Product Support Analytical Tools, there was little guidance as to the applicability, appropriateness, and efficacy of the various tools based upon stage in the acquisition life cycle. Oh, and there was no single, central repository with this information. We believed a central database could provide a key enabler in selecting the most cost-effective product support option and achieving greater affordability over the life cycle of a weapon system.

And Why Do We Care?

Another important piece of the research process is the statement of purpose, which tells what the research hopes to achieve. We hoped our project would identify what Product Support Analytical Tools were available and their applicability at various stages of the weapon system acquisition life cycle. To avoid reinventing the wheel, we chose to leverage any work previously conducted in support of ASD(L&MR) in examining various sources for analytical tools in hopes of establishing a body of knowledge/database to support weapon system program offices in their efforts to conduct BCAs, and specifically product support analyses as part of the BCA.

What Do We Really Need to Know?

It was important to isolate exactly what we wanted to know and document our inquiries in the form of research questions. We hoped to answer the following four questions:

  • What Product Support Analytical Tools are available?
  • When in the product support life cycle are the tools used?
  • How “user-friendly” are the tools?
  • Is there an overarching awareness of available tools?

Where We Obtained the Data

The data for this research were gathered through various methods. As intended, this research leveraged previous work by Price Waterhouse and the Logistics Management Institute, as well as earlier Defense Acquisition University efforts. This review revealed numerous tools previously identified for use by the Acquisition, Technology and Logistics (AT&L) workforce, but it also revealed a lack of consistency in funneling information about the tools to that workforce. We then compiled, analyzed, and organized the captured data into a form usable by the research team.

We also developed and distributed a survey to members of industry, academia, and select members of the AT&L workforce. The intent of the survey was to capture data pertaining to the tools used by product support workers, codified by the type of tool (product support, financial, BCA), type of user (Program Manager, Product Support Manager, Financial Manager, etc.), where in the life cycle the tool is used, ease of use, efficacy, and ease of access. The survey was offered to senior-level officials in the fields of Program Management, Product Support Management, Systems Engineering, and Financial Management. Fifty-four individuals responded. While the limited number of responses would have significantly jeopardized a more formal research effort, given the purpose of this research we were able to glean a substantial amount of information from the available respondents.
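
To make that codification concrete, each response can be thought of as a record coded along those dimensions. The sketch below is purely illustrative; the field names are our assumptions, not the survey instrument’s:

  from dataclasses import dataclass

  # Illustrative coding of a single survey response. The dimensions mirror
  # the survey described above; the field names are assumptions.
  @dataclass
  class SurveyResponse:
      tool_name: str
      tool_type: str        # "product support", "financial", or "BCA"
      user_role: str        # e.g., "Program Manager", "Financial Manager"
      lifecycle_phase: str  # phase of the life cycle in which the tool is used
      ease_of_use: int      # respondent rating
      efficacy: int         # respondent rating
      ease_of_access: int   # respondent rating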

What We Found Out

Question 1—What product support tools are available?

The total number of product support tools located while reviewing the previous “product support analytical tool” efforts was 269, which included a previously collected listing of tools in the Business Case Analysis Guidebook and a listing of tools used by a defense industry product support provider. Only 23 were identified in the product support survey. This was an area of concern: some 269 tools were available according to the initial review, but our workforce identified only 23. Perhaps the word wasn’t getting out (See Figure 1. How the Tools Were Identified).

Figure 1. How the Tools Were Identified


Question 2—When in the product support life cycle are the tools used?

There were 11 separate decision-making tools identified, many of which covered multiple phases of the life-cycle framework. Six tools were identified for the Materiel Solution Analysis (MSA) phase, four for the Technology Development (TD) phase, eight for the Engineering and Manufacturing Development (EMD) phase, nine for the Production and Deployment (P&D) phase, and seven for the Operations and Support (O&S) phase.

Similarly, eight technical tools were identified, many covering multiple phases of the life-cycle framework: one was identified for use in MSA, four in TD, six in EMD, five in P&D, and three in O&S. Finally, financial management tools were identified in the survey, and they also covered multiple phases of the life cycle: one was identified for use in MSA, four for TD, eight for EMD, nine for P&D, and seven for O&S (See Figure 2. When the Tools Were Used).

Figure 2. When the Tools Were Used

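In summary, the counts reported above, by tool category and life-cycle phase, were:

  Tool category      MSA   TD   EMD   P&D   O&S
  Decision-making     6     4    8     9     7
  Technical           1     4    6     5     3
  Financial           1     4    8     9     7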

Question 3—How “user-friendly” are the tools?

Each survey respondent was asked what inhibited him or her from using each category of product support tools, along with reasons for choosing certain tools. There were 17 responses regarding why a person did not use decision-making tools, with 17.65 percent citing lack of expertise. This reason can reasonably be translated as not understanding how to use the tool, and perhaps a lack of familiarity. Conversely, there were 27 responses regarding why a decision-making tool was used, and 40.74 percent cited ease of use as a reason for choosing the tool.

As for the technical tools, 13 responses were captured regarding why a person did not use a tool, with 7.69 percent citing lack of expertise as the reason. Twelve responses were captured for reasons that a technical tool was selected, with 25 percent stating ease of use as the reason. Additionally, 14 answers were captured regarding the nonuse of financial tools, with 14.29 percent citing lack of expertise as the reason. Seven responses were captured regarding why a financial tool was chosen, with 28.5 percent citing ease of use.

The previous paragraphs addressed the issues pertaining to the use of the categorized tools (decision-making, technical, and financial); however, some respondents indicated that they chose to use no tools whatsoever. In this case, 33 respondents stated that they had not used any supportability analysis tools, with 15.15 percent citing lack of expertise as the inhibitor (See Figure 3. Why the Tools Were Not Used).

Figure 3. Why the Tools Were Not Used

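Because the response pools in these categories were small, each percentage corresponds to only a handful of respondents. The back-calculation below (an illustrative sketch; counts are rounded to the nearest whole respondent) puts the reported figures in perspective:

  # Back-calculate approximate respondent counts from the reported
  # percentages; counts round to the nearest whole respondent.
  reported = {
      "decision-making, lack of expertise": (17, 17.65),
      "decision-making, ease of use": (27, 40.74),
      "technical, lack of expertise": (13, 7.69),
      "technical, ease of use": (12, 25.0),
      "financial, lack of expertise": (14, 14.29),
      "financial, ease of use": (7, 28.5),
      "no tools used, lack of expertise": (33, 15.15),
  }
  for label, (responses, pct) in reported.items():
      count = round(responses * pct / 100)
      print(f"{label}: about {count} of {responses} responses")

Every figure resolves to between 1 and 11 respondents, underscoring the earlier caution about generalizing from a sample of this size.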

Question 4—Is there an overarching awareness of available tools?

In many instances, respondents indicated that they had not used product support tools in performing their duties. In those cases, a follow-up question asked the reason a tool was not used; one of the available options was “did not know there were applicable tools available.” Thirty-three responses indicated that no supportability analysis tools were used, and approximately 30.3 percent cited not knowing applicable tools were available as the reason. Similarly, when asked the same question regarding the use of decision-making, technical, and financial tools, 35.29 percent, 30.77 percent, and 42.86 percent, respectively, indicated a lack of knowledge regarding availability of the tools.

The Results

This analysis revealed that a number of tools are available for use. However, many of the tools are unknown to the workforce, require expertise, require special access, and/or are cost prohibitive. The DoD Business Case Analysis Guidebook provides a comprehensive list of tools known and at least marginally available to the workforce, but the list does not provide information regarding use or access. The tools identified during this study tended to be applicable in many phases of the product life cycle, and many tools had multiple applications.

During the course of this research, two significant aspects came to light. First, a substantial number of responses cited lack of expertise as a reason for not using existing tools; conversely, where tools were used, a substantial number cited ease of use as the reason. It could easily be inferred (though the external validity is limited by the small number of respondents) that respondents desire, require, and are most likely to use easy-to-use, easy-to-learn tools.

Second, throughout the study, responses overwhelmingly indicated an overarching lack of awareness of the existing product support tools. In sum, the research results suggested there are numerous tools in use by the AT&L community, and confirmed the original assumption that, while the tools are available and in use, the community lacks awareness of which are available and when they should be used.

The Application

Research is most useful when acted upon. The application aspect of this effort was to develop a central repository, accessible by the AT&L workforce, that provides not only a current list of the tools available for use, but also where workforce members can find more information about each tool and how to obtain it for their own use. A “Product Support Analysis e-Toolbox” has been developed and fielded, accessible at https://acc.dau.mil/psa-tools.

Many tools are listed, each codified and filterable by Supportability Analysis Tools, Program Planning/Control Tools, Military Department, Integrated Product Support Element, and Licensing Requirements. When you click on one of the tools, you are immediately provided with information regarding the tool’s purpose, the type of process(es) it supports, the military department(s) currently using the tool, fees associated with the tool, and where to go for more information.
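
As a rough illustration of how such a filterable catalog behaves (a minimal sketch; the record fields, sample entries, and filter function are assumptions for illustration, not the e-Toolbox’s actual schema or data):

  # Sketch of a filterable tool catalog; fields and sample data are
  # illustrative assumptions, not the e-Toolbox's actual schema.
  catalog = [
      {"name": "Tool A", "category": "Supportability Analysis Tools",
       "departments": ["Army"], "licensing": "No fee"},
      {"name": "Tool B", "category": "Program Planning/Control Tools",
       "departments": ["Navy", "Air Force"], "licensing": "License required"},
  ]

  def filter_tools(records, **criteria):
      """Return the records whose fields match every given criterion."""
      def matches(record, key, wanted):
          value = record.get(key)
          return wanted in value if isinstance(value, list) else value == wanted
      return [r for r in records
              if all(matches(r, k, v) for k, v in criteria.items())]

  # Example: every supportability analysis tool in use by the Army.
  print(filter_tools(catalog,
                     category="Supportability Analysis Tools",
                     departments="Army"))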

The Living Database

While this study culminated with a repository of product support tools, the repository must not only be advertised to the workforce but must also be maintained if it is to become and remain effective. The Product Support Analytical Tools database needs to be an ever-maturing repository: living, breathing, and growing. This is our part. We need to use it, add to it, update it, and refine it. In the 3 months since its release, we’ve added more than 220 validated product support analysis tools, and the site has been viewed more than 36,000 times. I would say we’re off to a great start. It’s your toolbox: access it, use it, and add to it!



Bayer is a professor of Life Cycle Logistics Management at the Defense Acquisition University’s Midwest Region in Kettering, Ohio.

The author can be contacted at michael.bayer@dau.mil.
