Calculating Return on Investment for U.S. Department of Defense Modeling and Simulation



Authors: Ivar Oswalt, Tim Cooley, William Waite, Elliot Waite, Steve “Flash” Gordon, Richard Severinghaus, Jerry Feinberg, and Gary Lightner

As budgets decrease, it becomes increasingly important to determine the most effective ways to invest in modeling and simulation (M&S). This article discusses an approach to comparing different M&S investment opportunities using a return on investment (ROI)-like measure. The authors describe methods to evaluate the "benefit" (e.g., increased readiness, more effective training) received from an investment and then use those metrics in a decision analysis framework to evaluate each M&S expenditure. Finally, they conclude by discussing the importance of viewing M&S investments from a Department of Defense (DoD) Enterprise view, evaluating investment over multiple years, measuring well-structured metrics, and using those metrics in a systematic way to produce an ROI-like result that DoD can use to evaluate and prioritize M&S investments.


Successful Department of Defense (DoD) Enterprise modeling and simulation (M&S) investment requires structure, persistence, and common valuation for effective execution. The methodology summarized in this article provides a systematic process, based upon theoretical aspects of capital structure, by which DoD investments in M&S can be compared, evaluated, and directed to achieve the greatest return on investment (ROI) in this “national critical technology” (House Resolution [H. Res.] 487, 2007).

To effectively apply a technology like M&S to a DoD Enterprise, application, or program, it is critical to define and assess rigorous measures of merit and metrics that reflect the results of M&S application across the relevant spectra of management, mission, and system. Such assessments are especially critical as budgets are reduced, opportunities for live tests and exercises are curtailed, and acquisition time lines are shortened. Currently, most M&S value assessments use metrics that are uneven in scope, very case-specific, do not allow consistent aggregation, or are not well structured. Additionally, some measures that are used, like ROI, are actually incorrectly defined; others, however, are undefined, thus making the assertions of value at best vague, and at worst incorrect. Finally, all too often important distinctions are not made between and among terms critical to consistent ROI assessment, such as metrics, measure, scale, quantity, quality, cost, utility, and value.

Prior efforts to characterize the cost-benefits of M&S have included surveys, assessments, and methodological developments. Surveys summarize the results of efforts already conducted (Worley, Simpson, Moses, Aylward, Bailey, & Fish, 1996). Methodological development articles provide insights into how to improve M&S value calculation (Gordon, 2006). Assessments typically provide insights based on one of four approaches: nominal description, case-based, business-oriented, or multi-attribute examination. All four have advanced the state of the art in M&S assessment, but have not yielded an overall, rigorous, and effective approach for placing metrics in a decision analysis framework to allow the evaluation of M&S investment. The methodology developed here (Figure 1) is distinctive insofar as it provides prescriptive guidance while allowing for the comparison of alternative M&S investments (M&S compared to other M&S, or M&S compared to other alternatives [analysis, war games, etc.]) to support a mission or meet a goal.

Figure 1. M&S Investment Methodology

It also facilitates an assessment of an M&S alternative over time (how the capabilities provided change from the initial application to subsequent use). Such time-considerate assessments are especially critical in today's environment of shrinking budgets. By viewing investments from a DoD Enterprise perspective, evaluating investment over a multiyear time line, measuring metrics developed from this viewpoint, and using these metrics in a systematic way to produce an ROI-like result, the DoD can evaluate and prioritize M&S investment.

Market Context and Business Practice

Stand-alone strategies don’t work when your company’s success depends on the collective health of the organizations that influence the creation and delivery of your product. Knowing what to do requires understanding the ecosystem and your organization’s role in it. (Iansiti & Levien, 2004)

This quote from the Harvard Business Review addresses the fundamental premise that commercial businesses exist and thrive (or not) within the context of a business environment much larger than exists within the boundaries of an individual firm. To succeed, individual firms must learn to recognize and create value within “the ecosystem” in which they exist. Translated to the domain of DoD M&S Enterprise management, the quote, as interpreted by the authors, could read:

Stand-alone M&S strategies don’t work when DoD’s success depends on the collective value created across the Enterprise, and its creation and delivery of value derived from its investment in M&S. Knowing what to do requires understanding DoD’s ecosystem and leadership’s role in it.

Within the DoD, many organizations influence the creation of value from M&S investment. On an Enterprise level, the key to maximizing value is understanding who shoulders the costs and who potentially derives value from the allocation of resources to M&S. DoD investment strategies need to address, at a minimum, these aspects of economic valuation:

  • The government (DoD) being the (only) buyer in many parts of its M&S market distinguishes DoD investment from private-sector M&S investment.
  • A lack of “marketplace” from which to gauge economic valuation often complicates DoD’s efforts to make sound, credible valuation judgments.
  • Government must account for intangible benefits as contrasted to monetized benefits or simple revenue.
  • Unlike commercial practice (e.g., corporation- or company-based), when the DoD invests, a misalignment often occurs between the “cost bearer” (the resource sponsor) and the “benefit accruer” (the group that gains an advantage from the investment), especially when the investment creates and returns value to DoD components that exceeds the expected ROI.

The last bullet is particularly significant. In assessing a candidate investment, a practice or methodology does not exist in the DoD to capture and characterize the future and extended value accruing to users beyond the primary recipients of the investment. Having a methodology to capture such extended benefits could change the outcome of an investment decision from "not possible" to "approved," and provide a mechanism for assessing all beneficiaries for their fair share of investment costs. Additional difficulties arise from the fact that in many cases the DoD M&S investment cannot be monetized (translating elements of value to units of dollars) in a manner analogous to commercial business. Placing a monetary amount on lives saved, readiness improved, or warfighters better trained is difficult if not impossible. The DoD's characterization of value must often be in terms that are naturally qualitative, making the calculation of extended benefit (analogous to the time-value-of-money) very different from that in the commercial sector.

Across the DoD, the present practice is to base investment in M&S on a number of methods; at an Enterprise level, however, the practice is neither systematic nor consistent. In Acquisition Review Quarterly, C. David Brown, the Army Developmental Test Command Director for Test and Technology, and co-authors G. Grant, D. Kotchman, R. Reyenga, and T. Szanto observed:

Most program managers justified their M&S investment based on one or more of the following: reducing design cycle time; augmenting or replacing physical tests; helping resolve limitations of funds, assets, or schedules; or providing insight into issues that were impossible or impracticable to examine in other ways. (Brown et al., 2000)

Simply put, program managers (PMs) are under intense pressure to complete their programs on budget and within time lines. They lack an institutional mandate to develop or use M&S tools that may have wider application to other programs, or that will be cheaper to operate and sustain in the long term (Brown et al., 2000). This focus on the program level, while potentially good for the PMs, can be detrimental to the Enterprise at large. When considering an allocation of resources, PMs must consider not only costs, but also explicitly definable benefits. Equally important at the Enterprise level are values (economies of scope), which must be assigned by leadership to complete the process of estimating ROI and other measures of value with respect to M&S assets. The methodology proposed here is a step in accounting for these competing, yet equally important value metrics.

Stakeholder and Community of Practice Specification

Understanding stakeholders and their role-dependent sensitivities within the M&S community of practice provides the context within which to determine M&S metrics. DoD stakeholders operate within a broad M&S market, where “market” includes the full economic landscape over which M&S products and services have impact. DoD M&S stakeholders fall into seven categories:

  1. Consumers/Users—End users of M&S-powered products or of M&S services
  2. Buyers—Expenders of funds for M&S-powered products or of M&S services
  3. Sellers—Providers of M&S tools, data, or services
  4. Investors—Providers/appropriators/deciders on expenditures of funds for M&S products or services
  5. Approvers/Raters—Providers of a “seal of approval” for M&S tools, data, or services
  6. Reviewers—Providers of “advice and consent” on M&S issues, including M&S products or services
  7. Promoters/Advocates—Independent providers of “encouragement” to the development of the M&S market for M&S-powered products or services

Each stakeholder category comes to the M&S market with a role-dependent perspective. These perspectives are designated as: Program, Community, Enterprise, Federal, and/or Society. For DoD M&S investment, the first three perspectives—Program, Community, and Enterprise—are considered to be internal to the DoD. The final two—Federal and Society—are considered to be external to the DoD. Stakeholders provide another dimension that is useful in characterizing DoD M&S investment considerations and elements of value.

  1. Program stakeholders’ concerns focus on applicability, availability, and affordability; on credibility, analytic soundness, user friendliness, and entertainment delivered; and on modularity, interoperability, and portability. These concerns concentrate on systems-of-systems or system-level functionality.
  2. Community stakeholders’ focus is on managing M&S within specific areas such as acquisition, analysis, planning, testing, training, and experimentation, and is oriented toward application-level indicators of success or failure.
  3. Enterprise stakeholders’ concerns focus on M&S capabilities that apply across diverse activities of the Services, combatant commands, and DoD agencies.
  4. Federal stakeholders’ concerns focus on M&S developments across departments and agencies of the U.S. Government.
  5. Society stakeholders’ concerns focus on the role and impact of M&S on governments, cultures, academia, industries, and populations.

These concerns are broad and encompassing, and include standards, policies, management, tools, and people, along with reuse, interoperability, collaboration, interactiveness, and sharing of assets in a defense-wide manner.

Use Case

Developing and understanding use cases, including stakeholder needs and requirements, helps determine, refine, and evaluate the process for defining M&S investment metrics. Use cases illustrate stakeholder issues and role-dependent sensitivities together with investment decision processes, and serve to support and guide the definition, explanation, and evaluation of processes and metric alternatives. We have developed a framework that provides a consistent and complete set of use-case descriptions for use in the analysis of M&S investment metrics; Table 1 lists its parameters. The full report of the study details three use cases from different perspectives (AEgis Technologies Group, 2008). The use cases examine exercise options, Live, Virtual, Constructive (LVC) middleware choices, and conceptual modeling alternatives for the Missile Defense Agency. In each of these, the steps in this process are delineated and discussed, sample data are included, and a decision is recommended based upon the given scenario. Due to space limitations, we were unable to include them in this article.

Table 1. M&S Use Case Framework

Parameter: Selected Values
What/Where: Investment situation, investment goal, investment time line, asset types, asset numbers, other asset information, geographical constraints
Who: Stakeholder market category, stakeholder perspective, stakeholder office
Why: Concerns, issues, forcers, drivers, constraints
When: Near-term investments, mid-term investments, long-term investments, schedule constraints
How: Costs (near term, mid term, long term)
So What: Result, benefit, utility, cost savings
Data Support: Sources, pedigree, availability, timeliness

Assets

To fully understand DoD M&S investment, it is also critical to identify those items that DoD buys. We first define the difference between assets (items for DoD investment) and consumables (in accounting terms: expenses). Then we list the assets and categorize them depending upon the point of view (POV). For example, if one views assets from the DoD POV (Acquisition, Analysis, Planning, Training, Experimentation, and Testing), then the assets are categorized one way. Alternatively, if the POV is that of the DoD Enterprise Community (which from its M&S Vision Statement articulates the categories of Infrastructure, Policies, Management, Tools, and People), then the assets are characterized differently (AEgis Technologies Group, 2008).

From a DoD perspective, an asset is defined as: “Something of monetary value, owned by DoD, that has future benefit.” A consumable, on the other hand, is: “Something capable of being consumed; that may be destroyed, dissipated, wasted, or spent.” The primary difference is the concept of future benefit. “Future” in this sense is typically thought of as more than 12 months in the future. Examples of DoD M&S assets include: F-16 simulators, the Navy’s Battle Stations 21 simulator, and the online game “America’s Army.” Consumables, on the other hand, are items such as paper, pencils, jet fuel, printer ink—all typically used and depleted within 12 months of purchase. In light of this, those types of items that constitute DoD investment assets, using the DoD M&S Vision Statement and the DoD Communities, are shown in Table 2.

Table 2. Assets Listing

Hardware: Computers, Electronic Hardware, Hardware in the Loop, Mock-ups, Spares
Software: Models, Simulation, Tools (CAD/CAM)*, Data/Databases, Repositories
Networks: Communication Lines, Architecture, Transaction Protocols
Facilities: Buildings, Labs, Ranges, Physical Models
People: Expertise, Experience, Skills/Education, Operational Knowledge
Products & Procedures: Plans/Policies, Standards, Analysis Results, Conceptual Models, Management Process

*Computer-Aided Design/Computer-Aided Manufacturing

By cross-mapping this asset list with the DoD Communities (from both mission and organizational views) and with the DoD M&S Vision Statement categories, we noted some interesting relationships. To start, every listed asset correlates to more than one major DoD Community. For example, every DoD M&S Community invests in the asset "Models." While this is not surprising, it shows that there may be efficiencies gained by studying the Enterprise view and how the DoD invests in models, since that investment is widespread. Also, the assets vary widely, from tangible items to the esoteric. This means that some assets are easy to value, making the determination of the cost of the investment relatively straightforward, and some are extremely difficult. Finally, it is difficult to place assets neatly into bins. All assets cross functional, mission, organizational, M&S Community, and DoD M&S vision category lines, meaning an investment in any one of these assets affects multiple commands, agencies, and perhaps Services. All categories and sub-categories invest in multiple assets. Because of this, to be most effective and achieve the highest ROI, M&S investment needs to be viewed at the Enterprise level, not at an individual Community level. A true measure of investment effectiveness cannot be achieved unless one considers all the costs and benefits.

Asset Costs

A decision to purchase or modify an M&S asset should be based on the needs of the customer(s) and the cost of the purchase or modification. That cost and associated decisions are best understood within the context of multiyear fiscal calculations. In looking at costs and the ROI of those costs, it is important to again acknowledge that business and government operate differently. If a business were to purchase an asset, the business owner would likely evaluate the impact of the asset on the bottom line: profit. The owner would likely predict the changes in profit and the costs to purchase or modify the asset over the useful lifetime of the asset, and then compute (“discount”) all those changes in profit and asset costs back to the current year (today’s) decision point. Different options, such as “purchase asset A” or “modify asset B,” can be compared in this way, even if these options have different payoff and cost streams over varying numbers of years. The comparison of the options in terms of current-year dollars at the time of the decision gives a standard metric that allows a fair evaluation of the alternatives.

Government and industry cost comparisons differ in that while government generally does not compute profit, it does compute changes in expenses. Additionally, in government the changes one stakeholder or one PM makes can have cost impacts on another PM, so one PM can show cost savings while others have the burden of increased costs because of a change in an asset. This shows once again that considering the Enterprise perspective across all impacted programs is essential to calculating an accurate and complete value of M&S investment.

Typically, cost elements for M&S assets can be grouped into useful classifications (Office of the Director, 2007) for the evaluation of alternatives through the calculation of current-year metrics:

  • Infrastructure: standards, architectures, networks, and environments
  • Policies at the Enterprise level (including interoperability and reuse)
  • Management processes for models, simulations, and data
  • Tools in the form of models, simulations, and authoritative data
  • People (including well-trained and experienced users)

The overall study illustrated how an increased level of granularity for these classifications could be tailored to the project and asset particulars, and could be used to facilitate the calculation of costs by year (AEgis Technologies Group, 2008). The following example illustrates the type of M&S alternatives that could be evaluated using a cost element structure to characterize costs of several alternatives over several years.

Using the Cost Element Structure to Compare Alternative M&S Courses of Action

A simulation professional was directed to establish an annual experiment in Alaska to evaluate capabilities such as the combat benefits of a new system for position determination of friendly ground forces. The simulationist will need to evaluate alternative simulations for use in this annual experiment. Could a different simulation be used each year depending on what systems are being evaluated, or would it be acceptable and cheaper to use a standard core simulation over the next 5 years? The cadre of simulation operators is limited in Alaska, so the simulationist must also evaluate distribution of the simulation environment from other locations.

In this first year, the position determination system may need to be simulated or assumed. Databases for geography and other environmental factors may need to be purchased with requisite lead time. Connectivity and simulation architecture costs will have to be evaluated. The estimated cost of conducting the experiment, using all live forces, would be the most costly option, and could be used to estimate cost avoidance for the other LVC options.

Depending on the alternatives evaluated, some may be more costly in the current year and cheaper in the out-years, while others may be cheaper in the current year but carry a high stream of out-year costs. Hence, the cost comparison of the alternatives is based on the sum of the discounted costs across the entire 5 years of the experiment.
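To make the discounting concrete, the following minimal Python sketch compares two hypothetical courses of action by the sum of their discounted (present-value) costs over the 5-year experiment. The cost streams, their dollar figures, and the 7 percent discount rate are illustrative assumptions, not values from the study.

```python
# Minimal sketch: compare two hypothetical M&S alternatives by the sum of
# their discounted (present-value) costs over a 5-year experiment.
# All cost figures and the discount rate are illustrative assumptions.

def present_value(costs, rate):
    """Discount a stream of yearly costs back to the decision year (year 0)."""
    return sum(cost / (1 + rate) ** year for year, cost in enumerate(costs))

# Year-by-year costs (in $K) for two notional courses of action.
standard_core_sim = [900, 150, 150, 150, 150]   # high initial buy, cheap out-years
new_sim_each_year = [400, 400, 400, 400, 400]   # cheaper now, costly out-years

rate = 0.07  # assumed discount rate
for name, costs in [("Standard core simulation", standard_core_sim),
                    ("New simulation each year", new_sim_each_year)]:
    print(f"{name}: PV = ${present_value(costs, rate):,.0f}K")

# The alternative with the lower total present value is cheaper in
# current-year dollars, even when raw 5-year totals suggest otherwise.
```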

Results

To understand the ROI of M&S, it is necessary to accurately characterize the results of its application: the "return" in ROI. Such results need to be rigorously described in a manner that accounts for both qualitative and monetary dimensions. The approach developed and detailed in this section describes the metrics required for such analyses, including types, variability, and application particularities. The development of such metrics is especially important in M&S, where the impact of investment and application is not exclusively monetary, naturally quantitative, or sometimes even intuitively obvious. Where the word "results" appears, its use reflects the outcome of M&S; includes both positive and negative outcomes; encompasses terms like value, utility, contribution, benefit, and return; and allows for both monetary and qualitative effects.

The results calculation methodology begins with a series of assumptions and definitions. It is assumed that decision makers in a governmental agency are rational actors who seek to optimize relevant outcomes, and that outcomes can be characterized using terms that reflect the investment value of alternatives (meaning no hidden agendas or overriding private concerns). It is further assumed that the metrics can be accurately quantified (whether inherently numeric, like money, or subjectively assessed). For this effort, we define three organizing principles or perspectives that can be consistently applied: Program, Community, and Enterprise. It is then important to understand the scope of the results determination: will they be used to compare alternatives in meeting a goal (M&S to M&S, or M&S to other options), or to assess the evolution of an M&S capability over time? In calculating results metrics, it is also important to define the term "metric" in context (Table 3). The subsequent step of results metric calculation is measurement or assessment; the focus here is on qualitative or subjective judgments that can be numerically characterized and on indices that are naturally quantitative. Finally, it is often very important to aggregate, calculate, or derive an overall measure using a decision theoretic approach.

Table 3. Results Metrics Context

Relationships: Example
First are the classes/categories: e.g., Technical
Associated with each group are characteristics/terms describing features: Maintainability, Design
Associate these with more specific properties: Mean Time Between Failures, Type
Decompose these into metrics, standards of measurement, like variables: Hours, Days / Compiled, Interpreted
Metric values are relative to a scale (a specified graduated reference used to measure) and may be nominal, ordinal, interval, or ratio in type: 1-2-3-4-5-6-7-8-9-10
Values may range from 0 (no representation) to X, where X represents a complete implementation of the area: continuous for interval and ratio data
Metrics are assigned values, based on the features of the M&S (the act of measurement) or mission requirement: e.g., 9, Compiled
Values can be combined into aggregate measures of merit: C = 2*9, I = 1, Value = 18
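As a worked instance of the chain in Table 3, read the rows top to bottom: the Technical class contains the Maintainability characteristic, whose Mean Time Between Failures property is measured in hours and scored 9 on a 1-10 scale; applying a criticality weight of 2 (an assumed weighting, matching the table's illustration) yields an aggregate measure of merit of 18. A minimal sketch:

```python
# Worked instance of the Table 3 chain (values follow the table's illustration):
# Technical -> Maintainability -> Mean Time Between Failures, measured in hours,
# is scored 9 on a 1-10 scale; a criticality weight of 2 is assumed.
weight = 2      # relative importance assigned to the Maintainability metric
score = 9       # assessed value on the 1-10 scale
aggregate = weight * score
print(aggregate)  # 18, the aggregate measure of merit for this single metric
```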

Three perspectives apply within the DoD to the derivation of relevant M&S results metrics and the calculation of their ROI: the Program perspective, which includes both M&S programs and programs or activities that use M&S (Oswalt & Kasputis, 2006); the Community perspective, as described in the Application Area Descriptions (Oswalt, 2005) (i.e., the "Surfboard Chart"); and the Enterprise perspective, as articulated in the Strategic Vision for DoD M&S. Acknowledging these three perspectives is critical, since the results metrics applicable to each are different (Figure 2). However, due to space constraints and the desirability of viewing M&S investments from an Enterprise aspect, only Enterprise metrics are summarized here.

The Enterprise perspective focuses on M&S capabilities that apply "across the diverse activities of the Services, combatant commands, and agencies" and thus presents goals that are necessarily broad and encompassing. They include standards, policies, management, tools, and people that support collaboration, interaction, and the sharing of assets in a defense-wide manner that includes other "governmental agencies, international partners, industry, and academia." Metric categories for each were derived previously (Oswalt & Tyler, 2008). A sample set of Leadership metrics is provided in Table 4.

Figure 2. Results Perspectives

The figure depicts three nested perspectives. At the top, the Enterprise ("Brain") carries metrics such as Leadership, Implementation, Business, Infrastructure, and System of Systems; Enterprise metrics reflect organization- and management-type activities. In the middle, Communities carry application-specific metrics (e.g., design, manufacturing, sustainment, time to market, complexity, sensitivity, decision time, test augmentation, scenario variation, retention, doctrine, technology, and cycle time); Community metrics reflect more specific uses and yet can include both enterprise-type and program-type metrics (when the program crosses boundaries within a community or between communities). At the base, Programs ("Blood") carry metrics such as Applicability, Availability, Affordability, Rigorousness, Engaging, Usability, Credibility, and Technical; Program metrics reflect the key dimensions of individual M&S system developments or of M&S use within platform development programs.

Table 4. Enterprise Metric Sample

Enterprise Perspective Sample Metrics. Each entry lists the term (characteristic), its definition, sample quality metrics, and sample monetary metrics.

Leadership (class/category)

Leadership: Statement of vision and associated advocacy/support of timely actions needed for an effective enterprise (property). Quality: #/currency of vision and resulting/supporting docs (metrics); senior leaders adopt vision within their (other) areas. Monetary: % alignment of funding to vision; savings from reduced unused sunk costs.

Empowerment: Developers, managers, and users that are engaged, asked, and able to make significant contributions. Quality: # innovative ideas forwarded without solicitation; % M&S decision makers attending key meetings. Monetary: reduction in costs to get new M&S concepts; savings from innovative M&S use.

Situational Awareness: Decision makers’ and users’ understanding and awareness of M&S standards, tools, etc. Quality: # innovative ideas forwarded without solicitation; % M&S decision makers attending key meetings. Monetary: reduction in costs to get new M&S concepts; savings from innovative M&S use.

Management (class/category)

Human Capital: Management for recruiting, assigning, and career development of the M&S workforce. Quality: % M&S billets staffed with M&S-qualified people; % M&S-qualified personnel promoted/retained. Monetary: unnecessary training/retraining costs; cost-effective M&S decisions.

Processes: Adoption of rigorous, timely, and relevant standardization and certification of M&S policy, tools, workforce, etc. Quality: # promulgated processes consistently adopted; decreased product (policy, tool, etc.) generation time. Monetary: reduced labor, travel, and software reworks; savings from error-rate reduction.

ROI Methods

In financial analysis, the concept of return is critical and is principally used to measure the change in "value" over time. As such, return is used by the Financial Community to determine two important things: (a) whether the benefit of an investment (or similar action) was positive or negative, the "direction" of the change; and (b) how positive or negative the change was, the "magnitude." Financial analysts typically calculate a single value from which both direction and magnitude can be ascertained. The use of a single value is possible because analysts usually compare changes in the same quantity: U.S. dollars. The two most common ways to measure return are: (a) the income and capital gains relative to an investment, usually quoted as a percentage (INVESTOPEDIA®, 2010); and (b) the amount of cash (or revenue) generated from a set, fixed asset base, expressed as a percentage of investment; examples include Return on Equity, Return on Assets, Return on Common Equity, and Return on Invested Capital. Both methods typically use dollars as the unit of measure.
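A minimal sketch of these two return measures follows; the function names and all dollar figures are notional assumptions used only to illustrate the percentage form of each measure.

```python
# Sketch of the two common return measures described above, with notional figures.

def holding_period_return(income, capital_gain, initial_investment):
    """(a) Income plus capital gains relative to the investment, as a percent."""
    return 100 * (income + capital_gain) / initial_investment

def return_on_assets(cash_generated, asset_base):
    """(b) Cash (or revenue) from a fixed asset base, as a percent of investment."""
    return 100 * cash_generated / asset_base

print(holding_period_return(income=5, capital_gain=10, initial_investment=100))  # 15.0
print(return_on_assets(cash_generated=12, asset_base=150))                       # 8.0
```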

So how do we apply the concepts of financial analysis to DoD M&S projects? The concepts of magnitude and directionality mentioned previously are essential to this endeavor. To make a decision among a finite set of options, a relative sense of order is needed; that is, one must be able to distinguish which project is better than the others. Therefore, while we might not assign a specific dollar value to the benefit of one choice over another, by using directionality and magnitude we can arrive at a "relative ranking" that lets us compare the options among which we are deciding. Additionally, the notion of "internal consistency" in evaluating different options is vital. If we are not able to obtain an absolute value (such as, say, 83 percent), but must rely on relative values (A is better than B, which is better than C), we must make sure that we consistently apply the same evaluation criteria to all the potential choices. The methodology for evaluating DoD M&S investment described in the following discussion meets these criteria and is completely consistent with the manner in which financial analysis seeks to evaluate return.

Investment Decision Process

Having now determined metrics for the costs and results associated with an investment, we are in a position to decide whether or not to make the investment using these metrics and others. Our goal is to employ a decision process that takes into account the data gathered, does not rely upon chance, is fundamentally simple to explain and defend, and is consistent (would give the same answer each time with the same data).

Rational actors, when faced with a decision, will choose the option that maximizes their gain by some measure. In previous sections, we presented methods to evaluate the costs and results of an M&S investment; noted that monetization of these metrics may be difficult, if not impossible, for the DoD; and discussed ROI methods, including key financial analysis elements. Given this environment with its constraints, we developed a decision process that produces an ROI-like quantitative result for use in M&S investment evaluation. We used assessed metrics as input to a Multi-Attribute Decision Making (MADM) network, which is robust, relatively explainable, objective, and consistent, and, once established, can be executed fairly simply. MADM (Figure 3) is not new and has been shown to work well in structuring complex decisions involving a multidimensional decision space. In its simplest form, MADM is a weighted sum: the total utility score is calculated by multiplying each attribute's normalized input score by its relative weighting (assigned beforehand) and summing all the products. This process is repeated at every layer. While other formulae can be employed to calculate a utility score, the weighted linear method is most often used due to its simplicity and transparency (Tompkins, 2003). In this case, multiple layers are desirable for a few reasons.

Figure 3. Diagram of MADM Process for DoD M&S Investment Organized by DoD Communities

First, multiple layers allow higher level DoD decision makers to place different emphasis on certain communities by assigning different weights to each community. Additionally, multiple layers aid transparency, since grouping the metrics by community makes it easier to see how certain measures impact the overall utility score.

It should be noted that attributes measured should be mutually exclusive (no overlap) to prevent one attribute from influencing the final score by a higher amount than intended. Additionally, the weights are typically set by a team of subject matter experts, which should consist of experts from every area affected by the decision under consideration, and these weights should be reviewed regularly. Finally, risk for an investment can be incorporated in this process either as its own category or as a cost metric input to the framework.
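To illustrate the roll-up, here is a minimal two-layer weighted-sum sketch of the MADM process described above. The community names, metric names, weights, and normalized scores are illustrative assumptions, not values from the study; in practice the weights would come from the subject matter expert team and the scores from the assessed metrics.

```python
# Minimal two-layer weighted-sum MADM sketch. Community names, metric names,
# weights, and normalized scores (0-1) are illustrative assumptions only.

# Layer 1: each community's utility is the weighted sum of its metric scores.
# Layer 2: community utilities roll up to an Enterprise score via community weights.
communities = {
    "Training":    {"weight": 0.5,
                    "metrics": {"readiness_gain": (0.7, 0.8),   # (weight, score)
                                "cost_avoidance": (0.3, 0.6)}},
    "Acquisition": {"weight": 0.3,
                    "metrics": {"cycle_time_reduction": (0.6, 0.5),
                                "test_augmentation":    (0.4, 0.9)}},
    "Analysis":    {"weight": 0.2,
                    "metrics": {"decision_quality": (1.0, 0.7)}},
}

total_utility = 0.0
for name, community in communities.items():
    # Weighted sum within the community (layer 1).
    community_utility = sum(w * s for w, s in community["metrics"].values())
    # Roll up to the Enterprise level using the community weight (layer 2).
    total_utility += community["weight"] * community_utility
    print(f"{name}: {community_utility:.2f}")

print(f"Enterprise utility score: {total_utility:.2f}")
# Scoring each candidate investment this way yields the relative ranking
# (direction and magnitude) used to compare alternatives consistently.
```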

Conclusions

By viewing investments from a DoD Enterprise perspective, evaluating investment over a multiyear time line, measuring metrics developed from this POV, and using these metrics in a systematic way to produce an ROI-like result, the DoD can evaluate and prioritize M&S investment. The process outlined in this article meets these criteria and is robust, consistent, and adaptable. If followed, the prescribed methods and guidelines should allow the DoD, and similar types of organizations, to make M&S investment decisions that result in an increased ROI when compared to the current state. An important next step in the development and use of this methodology is its application. Whether as an assessment technique for a historical examination or an approach for future M&S investment analysis, the techniques described herein would provide rigorous and useful insights.



Author Biographies

Dr. Ivar Oswalt defines modeling and simulation requirements, assesses their value, proposes design and development concepts, and evaluates their application. He led an effort, sponsored by the Navy Modeling and Simulation Office, to assess the value of modeling and simulation to the Navy and has defined measures of merit that reflect command, control, communications, computers, intelligence, surveillance, and reconnaissance, and warfighting effectiveness. Dr. Oswalt holds a PhD in Political Science from Claremont Graduate School.

(E-mail address: oswalt@visitech.com)

Dr. Tim Cooley is currently the president of DynamX Consulting, an independent consulting firm specializing in modeling and simulation and operations research techniques, as well as advising businesses in decision analysis. He served over 20 years in the Air Force, and was previously the Modeling and Simulation Chair at the U.S. Air Force Academy. He holds a PhD in Computer Science and Biomedical Engineering from Rutgers University.

(E-mail address: dynamXConsulting@gmail.com)

Mr. William Waite, as chairman and co-founder of The AEgis Technologies Group, Inc., directs his staff in the delivery of a wide variety of modeling and simulation products and services. He is currently the Chairman of the Board of the Alabama Modeling and Simulation Council. He holds master’s degrees in Physics from Pennsylvania State University and in Administrative Science from the University of Alabama in Huntsville.

(E-mail address: bwaite@aegistg.com)

Mr. Elliot Waite was formerly a research analyst at SMC Capital, Inc., an asset advisory firm specializing in discretionary money management for qualified institutional buyers. Mr. Waite’s undergraduate degree and MBA are both from Vanderbilt University. He has previously worked for two large, not-for-profit health care companies and for the Private Client Group at Merrill Lynch. He is a PhD student in economics at the University of Connecticut.

(E-mail address: ewaite@aegistg.com)

Dr. Steve “Flash” Gordon has worked for the Georgia Tech Research Institute (GTRI) in his current position for 5 years. He retired from the U.S. Air Force after 26 years of service. As a government civilian, he was the technical director for the Air Force Agency for Modeling and Simulation for 5 years. He holds a PhD in Aeronautical and Astronautical Engineering from Purdue University.

(E-mail address: steve.gordon@gtri.gatech.edu)

Mr. Richard Severinghaus is the medical simulation/healthcare manager for The AEgis Technologies Group, Inc., and the human systems and technology performance integration lead for the Naval Submarine Medical Research Laboratory. He has nearly two decades of modeling and simulation experience in the training, systems engineering, acquisition, and test and evaluation domains. He holds a master’s degree in Systems Management from the University of Southern California, Marshall School of Business.

(E-mail address: rseveringhaus@aegistg.com)

Dr. Jerry Feinberg is the chief scientist at Alion Science and Technology. He has supported a variety of modeling and simulation analyses for DoD. His most recent project was a survey of methods used to determine the return on investment and value of modeling and simulation from an Enterprise view. Dr. Feinberg holds a master’s degree in Physics and Mathematics, and a PhD in Mathematics from Stanford University.

(E-mail address: jfeinberg@alionscience.com)

Mr. Gary Lightner currently manages the High Level Architecture (HLA) Cadre outreach efforts that AEgis provides for the Defense Modeling and Simulation Office (DMSO) and provides HLA tutorials at selected conferences and workshops for DMSO. In addition, he is a Master HLA Instructor for AEgis Technologies. Mr. Lightner holds a master’s degree in Computer Science from the U.S. Air Force Institute of Technology.

(E-mail address: mlightner@aegistg.com)


References

AEgis Technologies Group. (2008). Metrics for modeling and simulation (M&S) investments. Naval Air Systems Command Prime Contract No. N61339-05-C-0088. Retrieved from http://www.sim-summit.org/BusinessPractice/pdfs/ROI%20Final%20Report.pdf

Brown, D., Grant, G., Kotchman, D., Reyenga, R., & Szanto, T. (2000). Building a business case for modeling and simulation. Acquisition Review Quarterly, 24(4), 312, 315.

Gordon, S. (2006). M&S reuse success stories: Improving economics & effectiveness of M&S support to warfighting. Proceedings of 2006 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), Orlando, FL.

H. Res. 487, 110th Cong., 1st Session (2007) (enacted).

Iansiti, M., & Levien, R. (2004). Strategy as ecology. Harvard Business Review, 82(3).

INVESTOPEDIA®. (2010). Return [online word search]. Retrieved from www.investopedia.com/terms/r/return.asp

Office of the Director of Defense Research and Engineering. (2007). Strategic vision for DoD modeling and simulation. Retrieved from http://www.msco.mil/StrategicVision.html

Oswalt, I. (2005). Navy M&S value analysis, structure, results & ongoing efforts. Washington Navy Yard, DC: Navy Modeling and Simulation Office.

Oswalt, I., & Kasputis, S. (2006). Characterizing models, simulations, and games. Proceedings of 2006 Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC), Orlando, FL.

Oswalt, I., & Tyler, R. (2008). Simulation vectors: Deducing key insights. Proceedings of 2008 Spring Simulation Interoperability Workshop, Providence, RI.

Tompkins, E. L. (2003). Using stakeholders’ preferences in Multi-Attribute Decision Making: Elicitation and aggregation issues. Centre for Social and Economic Research on the Global Environment (CSERGE) and The Tyndall Centre for Climate Change Research, University of East Anglia, Norwich UK. CSERGE Working Paper No. ECM 03-13. Retrieved from http://www.uea.ac.uk/env/cserge/pub/wp/ecm/ecm_2003_13.pdf

Worley, D. R., Simpson, H. K., Moses, F. L., Aylward, M., Bailey, M., & Fish, D. (1996). Utility of modeling and simulation in the Department of Defense: Initial data collection. Institute for Defense Analyses (IDA) Document D-1825. Alexandria, VA: IDA.
