Maximizing Federal IT Dollars: A Connection Between IT Investments and Organizational Performance



Authors: BG Ennis C. “Jim” Whitehead, USAR (Ret.), Shahram Sarkani, and Thomas A. Mazzuchi

Evaluating how best to invest government information technology (IT) dollars means making choices. Should agencies strengthen infrastructure with energy-efficient servers and increased network bandwidth, purchase software to cut costs, increase collaboration, or invest more to meet stakeholders’ future needs? Is there a connection between the way agencies invest IT dollars and successful mission accomplishment? In this article, the authors show a connection between IT investment allocations and organizational performance in federal government agencies, and demonstrate how higher performing agencies invest differently in IT than lower performing agencies. Federal managers can compare their organizations’ IT investment portfolios with those of high-performing agencies, and with federal organizations having similar missions, to determine optimum IT investment allocations for their agencies.


“The federal government has largely missed out on the transformation in the use of IT due to poor management of its technology investments. Government IT projects all too often cost millions of dollars more than they should, take years longer than necessary to deploy, and deliver technologies that are obsolete by the time they are completed.”
—Vivek Kundra
U.S. Chief Information Officers Council, July 1, 2010

So testified U.S. Chief Information Officer Vivek Kundra before the House Committee on Oversight and Government Reform, July 1, 2010. The federal information technology (IT) budget stands at $87.9 billion for 2010, with the Department of Defense (DoD) budget consuming $35.7 billion of this total (Office of Management and Budget [OMB], 2010). The government faces significant pressure to improve agency performance and reduce IT costs, but challenges and questions emerge. Is there a connection between IT investments and agency performance? If so, where should IT funds be spent to be most effective—on improved IT infrastructure with more energy-efficient servers and increased network bandwidth, or on new software to cut costs and increase productivity? Would IT dollars be most effectively used to increase collaboration and management control, or should agencies forecast their stakeholders’ future needs and invest more to meet those requirements?

With each new administration, laws and programs are created to improve agency performance through improved IT productivity. In 1996, the Clinger-Cohen Act adopted a private sector IT investment approach of performance and results-based management; in 2002, e-Government initiatives were begun to bring government services online; and today, agencies are looking at Information and Communications Technologies (ICTs) to improve services to constituents. If we could find evidence that certain categories of IT investments were associated with higher organizational performance, perhaps we could spend IT budget resources more effectively.

Sinan Aral and Peter Weill at the Massachusetts Institute of Technology Center for Information Systems Research (MIT CISR) found that corporate performance was affected by IT asset allocation (Aral & Weill, 2007). Although federal government requirements and measures of performance differ significantly from those of the private sector, with agencies focusing on mission accomplishment and responding to political conditions more than market conditions (Holmes, 2001; Ostroff, 2006), this research examines a similar connection between federal IT investments and agency performance.

Studies of IT investments and organizational performance have been conducted in the private sector for two decades, but research has not been applied to the federal sector. The authors of this study examined federal government agency IT portfolio investments for 30 agencies, as provided annually to the OMB on their Exhibit 53 data submissions, and divided federal agency IT investments into four categories: Innovation, Management Support, Process Automation, and Infrastructure. They statistically compared agency program performance, as determined from the OMB’s Program Assessment Rating Tool (PART), with IT investment allocations and sought to answer the following question:

Do higher performing federal government agencies invest IT assets differently than lower performing agencies?

Evidence pointed to a significant difference in how higher performing and lower performing agencies invest their IT budgets. A causal effect between IT investment and agency performance could not be statistically proven, but significant differences were found. This research provides a new perspective on IT investment allocation and suggests a technique by which IT investments in the federal government can be evaluated and adjusted to achieve agency goals.

Definitions

IT Assets and Investments

The U.S. Government Accountability Office, formerly the General Accounting Office (GAO), adopted a definition of IT assets in its 2000 report on Information Technology Investment Management: “computers, ancillary equipment, software, firmware and similar procedures, services (including support services), and related resources used by an organization to accomplish a function” (GAO, 2000). IT investments, the report states, are “the actual expenditure of resources on selected information technology or IT-related initiatives with the expectation that the benefits from the expenditure exceed the value of the resources” (emphasis added) (GAO, 2000). IT investments are expected to create value for any organization, private or public sector, and at least over the long term, to return more than their costs.

Organizational Performance

Organizational achievement for federal government agencies includes continuous improvement on mission goals (Popovich, 1998); cost-effective program delivery, accountability to taxpayers, improved productivity, and human resources strength, including quality of the workforce and employee satisfaction (Kaplan & Norton, 2005; Keyes, 2005); and the approval of political stakeholders and the public perception that they are “doing the right thing” (Holmes, 2001). Since the Clinger-Cohen Act attempted to replicate IT investment portfolio success in the private sector (Van Over, 2009), and earlier IT investment portfolio research is based on the private sector, it is instructive to examine some of the differences between public and private sector performance measures. Table 1 is an adaptation of a chart by Paul Arveson (1999), customized for federal government agencies.

Table 1. Private Sector vs. Federal Agency Organizational Performance

Category | Private Sector | Federal Agencies
Strategic Goal | Competitiveness | Mission effectiveness
Financial Goal | Profit, growth, market share | Cost reduction, efficiency
Values | Innovation, creativity, good will, recognition | Accountability to public, integrity, fairness
Desired Outcomes | Customer satisfaction (customer pays for product/service) | Stakeholder satisfaction (stakeholder may not pay proportionally for service)
Stakeholders | Stockholders, owners, market | Taxpayers; legislative, executive, and judicial branches
Budget Priorities Defined by | Customer demand | Leadership, legislators, planners
Justification for Secrecy | Protection of intellectual capital, proprietary knowledge | National security
Key Success Factors | Growth rate, earnings, market share; uniqueness; advanced technology | Best management practices, legislative compliance; sameness, economies of scale; standardized technology

Note. Adapted from “Differing Measures for Private Sector and Federal Agency Organizational Performance,” by P. Arveson, 1999.

Background

IT portfolio management is rooted in Markowitz’s Modern Portfolio Theory for investments, where diversification of financial assets (stocks, bonds, and cash) is balanced by expected returns and risk (Markowitz, 1952). Warren McFarlan (1981) used the work of Markowitz to apply the principles of investment and risk to information systems investment, noting that IT project and portfolio risk alone was neither positive nor negative, but must be seen in the context of the degree of risk and potential reward. IT portfolio risk requires examining current and future projects, constantly evaluating projects to determine effectiveness, and investing and divesting as necessary (Van Over, 2009).
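For reference, Markowitz’s mean-variance formulation (a standard textbook statement, not reproduced from the article’s sources) balances a portfolio’s expected return against its variance:

$$E[R_p] = \sum_i w_i\, E[R_i], \qquad \sigma_p^2 = \sum_i \sum_j w_i w_j \sigma_{ij}$$

where $w_i$ is the share allocated to asset $i$ and $\sigma_{ij}$ is the covariance between assets $i$ and $j$. IT portfolio management substitutes categories of IT projects for financial assets, trading expected mission value against delivery risk.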

Several IT investment portfolio allocation methods have been proposed over the years. Glen Peters (1988) and John Ward (1990) both divided IT investments into functional categories. Bryan Maizlish (2005) furthered this concept with an IT asset portfolio system, which allocated investments into Infrastructure, Process, Information/Data, and Human Capital investments. He noted that the federal government faces extra challenges in successfully designing an IT asset portfolio: Few common standards or historical bases are available for evaluating IT investments; IT investments are difficult to retire without affecting other agency systems and databases; investments and their component interdependencies must be monitored, adjusted, and disposed of over their full life cycle; and IT projects must be evaluated not only for performance goals, risk management, and life-cycle cost formulation, but also for security, privacy, and support of the President’s management agenda.

G. David Garson (2003) noted that, traditionally, IT investments were made on a project basis—a conservative policy under which riskier projects often went unfunded. Portfolio management allows higher risk/higher payoff projects to be balanced with less risky/lower payoff projects, justifying some large nontraditional systems, including e-Government initiatives. An awareness of the IT investment process must be built into the agency so that formal processes of IT evaluation are adopted, and the agency moves from a project-centric base to a portfolio approach, evaluating investments according to their support of the agency’s overall mission. High-risk, low-value, or obsolete IT investments are evaluated and may be de-selected from the portfolio, and by benchmarking IT investments against successful organizational investments, better technologies can be chosen.

Researchers from the MIT CISR have done extensive research on private company and industry IT investment portfolios. Using survey data from chief information officers or their representatives from 1,508 companies in 60 countries over a 10-year period, and statistically controlling for industry, firm size, advertising expense, and research and development, they divided IT expenditures into four asset classes: Infrastructure, Transactional, Informational, and Strategic. From this data, they established patterns within industries, evaluated business strategies, and defined measures companies could take to evaluate their current IT expenditures and improve organizational performance.

Weill and Aral (2003), Weill and Ross (2004), Aral and Weill (2007), and Weill and Aral (2008) found that:

  1. Strategic investments, which are designed to gain competitive advantage in the marketplace and to develop new business products and services, on average consumed 13 percent of private sector IT investments in 2007. These investments are higher risk, have a longer lead-time, and often involve very new technologies. The failure rate for strategic IT investments can be as high as 50 percent, but their successes could put a company several years ahead of competitors. Weill noted that strategic investments in the small sample of public sector organizations (charities, private schools, local government) he studied led to greater innovation and increased interaction with customers, and made major changes easier to facilitate within the organization.
  2. Informational investments, which provide internal information (e.g., for accounting and communication), are designed to reduce costs and add potential profitability improvements in the future. In 2007, they consumed on average 13 percent of annual firm IT investment.
  3. Transactional investments, which often automate existing operations, may result in immediate cost reductions. In 2007, the average private sector organization studied by Weill apportioned 27 percent of annual IT investment to transactional investments. Private sector companies that invest more heavily in transactional systems than their competitors had higher productivity (sales per dollar of assets) and lower costs. They estimate transactional investments return 25–40 percent per IT dollar invested.
  4. For Infrastructure costs, the objective is to either reduce IT costs via consolidation or to establish a flexible, reusable base for future business needs. Infrastructure investments typically have high initial costs and lower short-term profitability, but higher operational performance and profitability over the long run. In 2007, the average firm in the CISR study allocated 47 percent of IT investment to infrastructure.

One important consideration for government organizations is their high legacy system costs. When Weill, Woerner, and Rubin (2008) looked at sustaining IT investments, which maintain and update current systems, and new investments, which encompass major initiatives and changes to systems, they found that in 2007 the average firm spent 66 percent of IT investments on sustaining investments, and one-third of the firms studied by the MIT CISR spent over 75 percent of their IT dollars to run existing systems. With so many IT dollars spent supporting existing, complex, and often redundant systems, portfolio assets are not freed to support new IT systems. Weill found that organizations allocating more to new investments rather than sustaining investments had greater revenue growth, and by that measure were higher performing in their industry. Top performers in each industry spent 4 percent more on IT but had similar portfolios (Weill & Aral, 2004).

Research Data and Methods

Our study adapted private sector research methods relating organizational performance and IT investment portfolios to U.S. Government agencies. Much of our comparative resource information came from studies by the MIT CISR, but because of differences between the public and private sector, we adapted our categories to public sector requirements.

Agency Selection

Federal government agencies in this study were selected for comparative purposes if both publicly available performance data and IT investment data were available. Thirty agencies were evaluated, as shown in Table 2. The 10 highest performing agencies were compared to the 10 lowest performing agencies in terms of their IT investment allocations. Agencies with similar mission focus were also disaggregated and compared within each focus area to develop trends.

Table 2. Agencies Evaluated for this Study

Department of Agriculture Department of the Treasury
Department of the Air Force Department of Transportation
Department of the Army Department of Veterans Affairs
Department of Commerce Environmental Protection Agency
Department of Defense – Other General Services Administration
Department of Education National Aeronautics and Space Administration
Department of Energy National Archives and Records Administration
Department of Health and Human Services National Science Foundation
Department of Homeland Security Nuclear Regulatory Commission
Department of Housing and Urban Development Office of Personnel Management
Department of the Interior Small Business Administration
Department of Justice Smithsonian Institution
Department of Labor  Social Security Administration
Department of the Navy US Agency for International Development
Department of State  US Army Corps of Engineers – Civil Works

Data for Agency Performance

To determine agency performance, we looked for publicly available, objective measures by which federal government agency performance could be ranked. Comparative data are limited, and agencies are reluctant to be compared with one another: They perceive themselves as unique in mission and resources—both of which are dictated by Congress—with changing political environments and politically determined budgets. Comparisons could only be found for employee satisfaction (Brewer & Selden, 2000), management excellence (Office of Personnel Management, 2008), and website technology effectiveness (West & Lu, 2009).

The Government Performance and Results Act of 1993 established a framework for department and agency performance reporting, designed to assist federal organizations in improving program performance. Through 2008, the OMB had assessed the performance of 1,017 federal government programs, representing 98 percent of the federal budget, using the Program Assessment Rating Tool (PART) (OMB, 2009a).

The PART is an annual agency report that evaluates each agency’s key programs across a common framework in four areas: program purpose and design, strategic planning, program management, and program results/accountability. The PART questionnaire asks 25 questions specific to the category of each program, and answers must be documented with data. Agencies have some flexibility in choosing methods of evaluation, which must be approved by the OMB. A numeric score, ranging from 0 to 100 (with 100 being the best), is derived from the four areas of assessment. These numbers determine one of four qualitative ratings: Effective (85–100), Moderately Effective (70–84), Adequate (50–69), or Ineffective (0–49). If a program is not measured by acceptable performance measures or does not yet have performance data, it receives a rating of Results Not Demonstrated. Programs may be reassessed after corrective actions are completed to improve their ratings (OMB, 2009a; OMB, 2009b).
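To make the scoring bands concrete, here is a minimal sketch (our illustration, not OMB code) of the score-to-rating mapping just described:

```python
# A minimal sketch (not from the study) of the PART score-to-rating mapping.
# Band boundaries follow the published thresholds described above.
def part_rating(score: float, results_demonstrated: bool = True) -> str:
    """Map a 0-100 PART score to its qualitative rating."""
    if not results_demonstrated:
        return "Results Not Demonstrated"
    if score >= 85:
        return "Effective"
    if score >= 70:
        return "Moderately Effective"
    if score >= 50:
        return "Adequate"
    return "Ineffective"

print(part_rating(88))  # Effective
print(part_rating(62))  # Adequate
```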

By design, the PART has historic agency data on listed programs that reflect the latest evaluations, including any reassessments. It was initiated in 2004, with approximately one-fifth of agency programs rated each year until the PART reflected all 1,017 programs in 2008. The 2008 reporting year was selected to evaluate the maximum number of programs, and a 2-year mean was used to establish agency scores weighted by program size. With these numbers, agencies could be ranked on the effectiveness of the programs they self-described as most important to their missions, using cluster analysis to place agencies in performance categories of “highest performing” (top third of grouped agencies) and “lowest performing” (bottom third of grouped agencies).
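As a concrete illustration of this ranking step, the sketch below computes a budget-weighted PART score per agency and splits agencies into thirds. It is our simplification: the study used cluster analysis, and the weighting scheme, data, and function names here are assumptions rather than the authors’ published computation.

```python
# Illustrative only: budget-weighted PART scores and a simple top/bottom-third
# split. The study used cluster analysis; this tertile split is a stand-in.

def agency_score(programs: list[tuple[float, float]]) -> float:
    """Weighted mean PART score; each program is (score, program_budget)."""
    total_budget = sum(budget for _, budget in programs)
    return sum(score * budget for score, budget in programs) / total_budget

def rank_agencies(scores: dict[str, float]) -> dict[str, str]:
    """Label each agency by performance third, highest scores first."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    third = len(ordered) // 3
    return {
        agency: ("highest performing" if i < third
                 else "lowest performing" if i >= len(ordered) - third
                 else "middle")
        for i, agency in enumerate(ordered)
    }

# Hypothetical agencies, each with (PART score, program budget in $M).
scores = {
    "Agency A": agency_score([(90, 400), (75, 100)]),  # 87.0
    "Agency B": agency_score([(60, 250), (55, 250)]),  # 57.5
    "Agency C": agency_score([(80, 300), (70, 300)]),  # 75.0
}
print(rank_agencies(scores))
```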

Only agency information that was publicly available at the time of this study was used. Intelligence agencies participate in these evaluations, but their results are not publicly available and not included in this report. The methods and results appear to be consistent throughout different organizations, however, and should be internally applicable to these agencies as well.

Data for IT Investment Allocations

The Clinger-Cohen Act of 1996 put the OMB in charge of improving the productivity, efficiency, and effectiveness of federal agencies by linking planning and investment strategies and IT portfolio management to the federal budget process. Each agency is required to create capital IT asset portfolios and review them to determine if a project is still attainable and has a high benefit/cost ratio compared with other investments in the portfolio.

The agency annual IT investment portfolio, known as OMB Exhibit 53, is a reporting mechanism for agencies to evaluate all IT projects and ensure that they are well-planned and meet the cost, schedule, and performance goals planned for the investment. It has six major categories of IT investments: Mission Area Support; Infrastructure, Office Automation, and Telecommunications; Enterprise Architecture and Planning; Grants Management Systems; Grants to State and Local IT Investments; and National Security Systems. Exhibit 53 is used by each agency to report the information in its annual IT investment portfolio for both major and nonmajor programs to the OMB, and is published as part of the federal budget.

Exhibit 53 is designed to assist agencies in selecting investments to improve the management of IT programs, understand the amount spent on IT modernization and support of legacy systems, and encourage interagency cooperation to eliminate redundant and nonproductive IT investments. Its purpose is to encourage agencies to focus IT spending on high-priority modernization initiatives; to manage major IT investments within 10 percent of cost, schedule, and performance objectives; and to protect the security of information systems (OMB, 2008).

Data for this research were taken from the 2008 Exhibit 53, which listed over 7,200 IT investments for the 2008–2009 budget years for the 30 departments and agencies used in the study (OMB, 2009c). Again, the study used the mean of 2 reporting years to better accommodate any spikes in one type of investment in a given year. Each of the 17-digit coded investments was allocated, based on project descriptions and codes, to one of four investment categories: Innovation, Management Support, Process Automation, and Infrastructure. These categories are similar to the four categories described in Weill and Aral (2003): Strategic, Informational, Transactional, and Infrastructure, but were adapted to better describe Exhibit 53 categories. Total IT investments were disaggregated into these four IT investment categories, as shown in Figure 1.

Figure 1. Agency IT Investment Disaggregation Process

  1. Innovation investments include those investments that provide a new service or major innovation that impacts external stakeholders. Examples include the e-Grants portal, giving citizens one central location from which to access federal government grant information; the Veterans Administration e-Gov Benefits program, providing a single point of access for citizens to locate and determine potential eligibility for government benefits; and the Army’s Force XXI Battle Command, Brigade and Below information system, a new graphical information system significantly improving battlefield awareness for commanders.
  2. Management Support investments are designed to provide information to employees to improve accounting, management, reporting, communication, collaboration, or analysis. This would include the DoD’s Defense Enterprise Accounting and Management System, a financial management system designed to modernize internal accounting systems, and the Department of State’s Treaty Information Management System, which makes treaty data more accessible to department employees.
  3. Process Automation investments are used to cut costs or increase throughput for the same cost in organizational operations, often through automating existing operations. Examples include the National Institutes of Health Electronic Research Administration System, which automates formerly paper-based functions in grant administration; the Department of Labor PeoplePower system, which integrates all human resources processes into one system; and the Defense Travel System, which automates travel authorization and vouchering—previously a manual, labor-intensive process.
  4. Infrastructure investments are shared resources used by multiple applications (e.g., servers, networks, desktop computers, and customer databases), and comprise a substantial proportion of IT investments.

In reality, individual investments often span two or more of these categories, and can change over time as Infrastructure investments are retired and Innovation investments become accepted as mainstream.
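To make the disaggregation step concrete, the sketch below assigns an Exhibit 53 line item to a category with a keyword heuristic. This is a hypothetical simplification: the study classified investments from their 17-digit codes and full project descriptions, and the keyword lists here are our assumptions, not the authors’ rules.

```python
# Hypothetical keyword heuristic for binning an Exhibit 53 line item into one
# of the four categories used in the study. Keywords are illustrative only;
# the actual classification used investment codes and project descriptions.
CATEGORY_KEYWORDS = {
    "Infrastructure": ["server", "network", "telecom", "desktop", "data center"],
    "Process Automation": ["automat", "workflow", "transaction", "processing"],
    "Management Support": ["accounting", "reporting", "collaboration", "analysis"],
    "Innovation": ["portal", "e-gov", "new service", "citizen"],
}

def categorize(description: str) -> str:
    """Return the first category whose keywords match the description."""
    text = description.lower()
    for category, keywords in CATEGORY_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return category
    return "Unclassified"  # flag for manual review

print(categorize("Defense Travel System - automated travel vouchering"))
# Process Automation
```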

Figure 2 illustrates these investment categories as interdependent Building Blocks of IT. Infrastructure is at the base and provides support for all other IT investments. At the next level, Process Automation investments, which often automate existing procedures for cost-cutting purposes, rely on a solid base of Infrastructure, but also symbiotically relate to both Management Support and Innovation investments. Management Support investments improve communications and operations within an agency, and rely both on a solid Infrastructure base and Process Automation investments. Innovation investments—the high-risk/high-potential investments providing new, strategic services for stakeholders—rely both on Infrastructure as a shared base and Process Automation investments to improve cost effectiveness.

Figure 2. Building Blocks of IT Investments

Reliability and Limits of the Data

Data from both the PART and Exhibit 53 were obtained from official public U.S. Government reports. Each agency’s report is required by law annually and must be verified by agency leadership. Our statistical results were limited to 30 federal agencies, and expanded research in the future could be pursued with the addition of agency subgroups. We did, however, find trends that were consistent throughout each of the federal government agency categorical comparisons we studied, and were similar to trends Weill and Aral (2004) found in the private sector.

It can be argued that 2 years of Exhibit 53 data are not enough, since the effects of IT investments may lag behind implementation and therefore require a longer time frame. We agree that further research should examine the Exhibit 53 and PART data over different time frames. Rai, Patnayakuni, and Patnayakuni (1997) note, however, that IT investments may have less of a lag effect than capital investments due to the accelerated rate of IT obsolescence. They further note that more than 80 percent of an organization’s IT expenditures are for current operations, and that depreciation of new hardware further dilutes any investment, limiting any lag between spending and its effect on current performance.

Recurring complaints are that the PART does not accurately measure the focus of the programs, does not allow flexibility as to the programs’ mandates, and gives no credit for programs whose results cannot currently be quantified, which are therefore awarded a “Results Not Demonstrated” classification (Gueorguieva et al., 2009). The PART is not a perfect interagency program evaluation tool, but it has advantages for this study in its public availability, verified data, consistent criteria among agencies, multiyear time frame, allowance for rescoring and updating of programs, and coverage of 98 percent of the federal budget (OMB, 2009d).

Results

Do higher performing agencies, based on program performance, invest in IT differently than lower performing agencies?

We first examined the data to determine normality of distributions. Using Minitab 15, we could show normality within a 95 percent confidence interval only for Management Support investments in both the 10 highest performing agencies and the 10 lowest performing agencies. As a result, we chose nonparametric tools to test the null hypothesis that higher performing federal agencies have the same IT investment strategies as lower performing agencies. Using the Mann-Whitney test, which assesses whether two independent samples of observations have equally large values, we showed within a 95 percent confidence interval that Innovation investments differ between the highest and lowest performing agencies. Because the Innovation investments differ, the overall portfolios of investments differ. The null hypothesis is rejected: the 10 highest performing agencies invest their IT assets differently than the 10 lowest performing agencies.
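The test itself is straightforward to reproduce. The sketch below runs the same kind of Mann-Whitney comparison in Python (the study used Minitab 15); the Innovation shares are made-up placeholders, not the study’s data.

```python
# Illustrative re-creation of the test reported above. The investment shares
# below are fabricated placeholders for two groups of 10 agencies each.
from scipy import stats

# Innovation share of the IT portfolio (as fractions) for each group.
highest_performing = [0.22, 0.18, 0.15, 0.19, 0.12, 0.17, 0.14, 0.20, 0.11, 0.13]
lowest_performing  = [0.02, 0.04, 0.01, 0.05, 0.03, 0.02, 0.06, 0.03, 0.01, 0.04]

u_stat, p_value = stats.mannwhitneyu(highest_performing, lowest_performing,
                                     alternative="two-sided")
if p_value < 0.05:
    print(f"p = {p_value:.4f}: reject the null; Innovation allocations differ")
else:
    print(f"p = {p_value:.4f}: cannot reject the null")
```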

We could not establish a causal relationship statistically, but we can observe relationships between aggregate performance and IT investment categories. We first looked at the IT investment breakdown for the average of the 10 highest performing agencies, the 10 lowest performing agencies, and all agencies (Table 3). We then classified Cost-focused and Agility-focused agencies according to agency mission statements and evaluated their IT investment allocations (Table 4).

Table 3. IT Investment Averages for Agencies by Ranking^a

Four Categories of IT Investments^b | Lower Performing Agencies | Average of All Agencies | Higher Performing Agencies
Innovation | 3% | 8% | 16%
Management Support | 16% | 13% | 20%
Process Automation | 24% | 23% | 18%
Infrastructure | 57% | 56% | 56%
IT Spending as a Percentage of Overall Agency Budget^c | 1.68% | 2.04% | 2.87%

^a Performance determined by ranking of PART scores over a 2-year period.

^b Agency category distribution determined by Exhibit 53 data.

^c Total Agency Budget information for 2008–2009 retrieved from Government Printing Office (GPO) Access database: http://www.gpoaccess.gov/usbudget/

Table 4. IT Investment Strategy Benchmarks for Agencies by Cost-Focused and Agility-Focused Mission^a

Four Categories of IT Investments^b | Average Agency | Cost-Focused | Agility-Focused
Innovation | 8% | 5% | 17%
Management Support | 13% | 16% | 5%
Process Automation | 23% | 30% | 21%
Infrastructure | 56% | 49% | 57%
IT Spending as a Percentage of Overall Agency Budget^c | 2.04% | 0.62% | 5.89%

^a Mission determined from agency websites.

^b Agency category distribution determined by Exhibit 53 data.

^c Total Agency Budget information for 2008–2009 retrieved from Government Printing Office (GPO) Access database: http://www.gpoaccess.gov/usbudget/

For Innovation investments, higher performing agencies spent twice as much as average agencies and more than five times as much as lower performing agencies; they spent less than average agencies on Management Support and Process Automation systems. Infrastructure investments were consistent across all performance categories. Higher performing agencies also invested 41 percent more on IT than the agency average and 71 percent more than lower performing agencies (Table 3).

DoD agencies—which include the Department of the Air Force, Department of the Army, Department of the Navy, and Department of Defense–Other—each independently ranked high in the PART evaluation. Their IT investment portfolios showed the following allocations: Innovation (30 percent), Management Support (3 percent), Process Automation (12 percent), and Infrastructure (55 percent), with overall IT spending at 5.29 percent of their budgets. These results are consistent with those of other top-performing agencies.

We next divided the agencies into Cost-focused and Agility-focused, according to the missions stated on their agency websites. Cost-focused agencies (e.g., the General Services Administration and Social Security Administration) are committed to providing optimum value for taxpayers, and Agility-focused agencies (e.g., the Department of Defense and the Department of Homeland Security) have a primary objective to protect the nation and react promptly to mission changes.

Cost-focused agencies spent less than average agencies on Innovation and Infrastructure, and more than average on Management Support and Process Automation. Agility-focused agencies spent twice as much as average agencies on Innovation, less on Management Support and Process Automation, and the same on Infrastructure (Table 4).

These results are consistent with those found in private sector studies. Weill and Aral (2003) noted that cost-focused firms spent less than average on Strategic (similar to Innovation) investments, Informational (similar to Management Support) and Infrastructure, and significantly more on Transactional (Process Automation) investments. Agility-focused firms spent more on Strategic, less on Informational and Transactional, and slightly more on Infrastructure investments.

Discussion

This research shows that federal government agencies that are most successful in program performance hold a different IT portfolio of investments than less successful agencies. Higher performing agencies invest more in Innovation and less in Management Support and Process Automation as a percentage of their total portfolios. This result may be because Innovation investments can set the stage for new services or major agency improvements, or may signal that improved IT governance has allowed more of an agency focus on modernization. Higher performing agencies may not place top priority on reducing costs, and therefore may not channel their resources into Management Support and Process Automation. Higher performing agencies also invest more in IT as a percentage of their budgets than lower performing or average agencies, possibly indicating a greater management focus on IT as a way to improve agency performance.

Our study also found that Cost-focused agencies invest more than average in Process Automation, less than average in Innovation and Infrastructure, and less in IT as a percentage of their overall budget. This is consistent with the private sector results found in the research of Weill and Aral (2003): Process Automation investments bring immediate cost savings, while Innovation and Infrastructure investments increase costs in the short term and may never lower costs in the long term.

Finally, this study also showed that Agility-focused agencies, like DoD and all the Services, have higher than average investments in Innovation, lower than average investments in Management Support, and higher than average IT investments as a percentage of their overall budget. Again, these results are very consistent with the private sector results found in the MIT CISR research (Weill & Aral, 2003). Agility-focused federal agencies and private firms invest more in Innovation IT that brings new services to stakeholders, and less in Management Support IT that brings future cost savings and profitability. For DoD agencies, it may simply be that the high spending (30 percent) on Innovation in the IT portfolio crowds out dollars available for “lower priority” IT, such as Management Support.

To apply the results of our study, DoD managers can disaggregate an organization’s IT spending allocations into the four investment categories of Innovation, Process Automation, Management Support, and Infrastructure, and compare them with the results in our study. DoD managers can also compare current IT portfolio investments with the high-performing agencies or the Agility-focused agencies. Those who are in Cost-focused DoD organizations, like the Defense Finance and Accounting Service, can compare their IT investments with other Cost-focused agencies.
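A minimal sketch of that comparison, assuming spending has already been binned into the four categories, might look like the following; the benchmark figures are the higher-performing-agency averages from Table 3, and the example portfolio is hypothetical.

```python
# A minimal sketch of comparing a portfolio to the higher-performing-agency
# benchmarks from Table 3. The example portfolio is hypothetical.
BENCHMARK = {"Innovation": 0.16, "Management Support": 0.20,
             "Process Automation": 0.18, "Infrastructure": 0.56}

def compare(portfolio: dict[str, float]) -> None:
    """Print each category's spending share against the Table 3 benchmark."""
    total = sum(portfolio.values())
    for category, benchmark in BENCHMARK.items():
        share = portfolio.get(category, 0.0) / total
        gap = share - benchmark
        print(f"{category:20s} {share:6.1%} vs benchmark {benchmark:6.1%} "
              f"({gap:+.1%})")

# Example: a hypothetical organization's IT spending in millions of dollars.
compare({"Innovation": 12, "Management Support": 30,
         "Process Automation": 45, "Infrastructure": 113})
```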



Author Biographies

BG Ennis C. “Jim” Whitehead III, USAR (Ret.), is chief of External IT Operations at the National Geospatial-Intelligence Agency and has held executive positions at major telecommunications corporations both domestically and internationally. BG Whitehead received his BS in Engineering from West Point, an MBA from Harvard University, a master’s in Strategic Studies from the Army War College, and a master’s in Systems Engineering from The George Washington University.

(E-mail address: Jim.C.Whitehead@nga.mil)

Dr. Shahram Sarkani is a professor of Engineering Management and Systems Engineering at The George Washington University. Since 2001 he has served as Faculty Adviser for Off-Campus Programs in the Department of Engineering Management and Systems Engineering. His current research interests include stochastic methods of structural dynamics and fatigue, fatigue and fracture reliability, structural safety and reliability, and smart infrastructure systems for natural hazard mitigation. Dr. Sarkani is a Professional Engineer and earned his PhD from Rice University.

(E-mail address: emseocp@gwu.edu)

Dr. Thomas A. Mazzuchi is a professor of Engineering Management and Systems Engineering at The George Washington University. His current research interests include reliability and risk analysis, Bayesian inference, quality control, stochastic models of operations research, and time series analysis. Dr. Mazzuchi earned his PhD from The George Washington University.

(E-mail address: emseocp@gwu.edu)


References

Aral, S., & Weill, P. (2007). IT assets, organizational capabilities, and firm performance: How resource allocations and organizational differences explain performance variation. Organization Science, 18(5), 763–780.

Arveson, P. (1999). Translating performance metrics from the private to the public sector. Retrieved from Balanced Scorecard Institute website: http://www.balancedscorecard.org/BSCResources/PerformanceMeasurement/TranslatingMetrics/tabid/139/Default.aspx

Brewer, G. A., & Selden, S. C. (2000). Why elephants gallop: Assessing and predicting organizational performance in federal agencies. Journal of Public Administration Research and Theory, 10(4), 685–711.

Chief Information Officers Council. (2010, July 1). Statement of Vivek Kundra, Federal Chief Information Officer, before the House Committee on Oversight and Government Reform. Cloud computing: Benefits and risks of moving federal IT into the cloud. Retrieved from http://www.cio.gov/Documents/Vivek-Kundra-Testimony-Cloud-Computing_07-01-2010.pdf

Clinger-Cohen Act of 1996, 40 U.S.C. 1401 et seq., Pub. L. 104-106 (1996).

Garson, G. D. (2003). Information technology: Policy and management issues. Hershey, PA: Idea Group Publishing.

General Accounting Office. (2000). Information technology investment management: A framework for assessing and improving process maturity (Version 1). GAO Report No. GAO/AIMD-10.1.23. Retrieved from http://www.gao.gov/special.pubs/ai10123.pdf

Government Performance and Results Act of 1993, Pub. L. 103-62 (1993).

Gueorguieva, V., Accius, J., Apaza, C., Bennett, L., Brownley, C., Cronin, S., & Preechyanud, P. (2009). The Program Assessment Rating Tool and the Government Performance and Results Act: Evaluating conflicts and disconnections. American Review of Public Administration, 39(3), 225–245.

Holmes, D. (2001). eGov: eBusiness strategies for government. Naperville, IL: Nicholas Brealey.

Kaplan, R. S., & Norton, D. P. (2005). The balanced scorecard: Measures that drive performance. Harvard Business Review, 83(7/8), 172–180.

Keyes, J. (2005). Implementing the IT balanced scorecard: Aligning IT with corporate strategy. Boca Raton, FL: Auerbach Publications.

Maizlish, B., & Handler, R. (2005). IT portfolio management step-by-step: Unlocking the business value of technology. Hoboken, NJ: John Wiley & Sons.

Markowitz, H. (1952). Portfolio selection. Journal of Finance, 7(1), 77–91.

McFarlan, F. W. (1981). Portfolio approach to information systems. Harvard Business Review, 59(5), 142–150.

Office of Management and Budget. (2008). Preparation, submission, and execution of the budget. OMB Circular No. A-11, Section 53. Retrieved from http://www.whitehouse.gov/omb/circulars/all/current_year/s53.pdf

Office of Management and Budget. (2009a). The program assessment rating tool. Retrieved from http://www.whitehouse.gov/omb/expectmore/part.html

Office of Management and Budget. (2009b). ExpectMore.gov: Frequently asked questions. Retrieved from http://www.whitehouse.gov/omb/expectmore/faq.html#005

Office of Management and Budget. (2009c). Budget of the United States Government. Retrieved from Government Printing Office (GPO) Access database: http://www.gpoaccess.gov/usbudget/

Office of Management and Budget. (2009d). The FY 2008 performance report of the federal government. Retrieved from http://www.whitehouse.gov/omb/expectmore/2008Performance.pdf

Office of Management and Budget. (2010). IT dashboard. Retrieved from http://it.usaspending.gov/

Office of Personnel Management. (2008). President’s quality award program. Retrieved from http://www.opm.gov/pqa/past_pqa_winners/2008/index.asp

Ostroff, F. (2006). Change management in government. Harvard Business Review, 84(5), 141–147.

Peters, G. (1988). Evaluating your computer investment strategy. Journal of Information Technology, 3(3), 178–188.

Popovich, M. G. (Ed.). (1998). Creating high-performance government organizations. San Francisco: Jossey-Bass.

Rai, A., Patnayakuni, R., & Patnayakuni, N. (1997). Technology investment and business performance. Communications of the ACM, 40(7), 89–97.

Van Over, D. (2009). Use of information technology investment management to manage state government information technology investments. In A. W. K. Tan & P. Theodorou (Eds.), Strategic information technology and portfolio management (chap. 1). Hershey, PA: Information Science Reference.

Ward, J. M. (1990). A portfolio approach to evaluating information systems investments and setting priorities. Journal of Information Technology, 5(4), 222–231.

Weill, P., & Aral, S. (2003). Managing the IT portfolio (update circa 2003). Retrieved from MIT Sloan Center for Information Systems Research website: http://cisr.mit.edu/

Weill, P., & Aral, S. (2004). IT savvy pays off. Retrieved from MIT Sloan Center for Information Systems Research website: http://cisr.mit.edu/

Weill, P., & Aral, S. (2008). Managing the IT portfolio (update circa 2008): It’s all about what’s new. Retrieved from MIT Sloan Center for Information Systems Research website: http://cisr.mit.edu/

Weill, P., & Ross, J. W. (2004). IT governance: How top performers manage IT decision rights for superior results. Boston, MA: Harvard Business School Press.

Weill, P., Woerner, S. L., & Rubin, H. A. (2008). Managing the IT portfolio (update circa 2008): It’s all about what’s new (Vol. VIII, 2B). Retrieved from MIT Sloan Center for Information Systems Research website: http://cisr.mit.edu/

West, D. M., & Lu, J. (2009). Comparing technology innovation in the private and public sectors. Retrieved from Brookings Institution website: http://www.brookings.edu/papers/2009/06_technology_west.aspx

 
