
Human Systems Engineering and Program Success—A Retrospective Content Analysis



Author: Liana Algarin

This investigative study demonstrates that addressing human considerations early in the system development life cycle brings long-term benefits to program managers and systems engineers. The approach used a retrospective content analysis of documents from weapon systems acquisition programs, namely Major Defense Acquisition Programs. Binary logistic regression analyses were conducted to predict program success from the presence of words relating to Human Systems Integration. The findings corroborate the idea that some benefit may be derived from implementing Human Systems Integration during the weapon systems acquisition life cycle.

In 1981, U.S. congressional watchdogs recommended improving weapon systems design by addressing human consideration problems early during system acquisition (U.S. General Accounting Office, 1981). Today, the process by which human considerations are included in the planning and implementation of a system is known as Human Systems Integration (HSI) (International Council on Systems Engineering [INCOSE], 2012). The benefit of including HSI in weapon systems design and acquisition is best realized by giving HSI early attention and priority during the planning stage. This article demonstrates that a decreased percentage of HSI words in documents originating during acquisition coincides with unforeseen costs, delays, and performance problems.

In fact, HSI-related content in acquisition documents may influence program success. Typically, a program is considered successful if it avoids cost overruns, performance breaches, or schedule breaches. Systems Engineering (SE) is the interdisciplinary approach for developing systems (INCOSE, 2012). HSI is an important part of SE, and thus of the acquisition life cycle (Karwowski, 2012). Understandably, the decisions that program managers and systems engineers make early in the acquisition life cycle affect program success and life-cycle costs.

For example, to help organizations with incorporating HSI into their design process, Handley and Knapp (2014) have created the Human Viewpoint tool for early implementation of HSI into the acquisition life cycle. Ahram and Karwowski (2012) warn that failing to address costs related to human performance (e.g., Human Total Ownership Cost) early in the life cycle will lead to schedule overruns, diminished system performance, inadequate training, and misaligned plans for manpower and personnel allocation. Cramer, Sudhoff, and Zivi (2011) posit that integrating survivability as a design objective early in the life cycle can benefit the design process. Assessing human capabilities during technology readiness level evaluation, according to Wallace, Bost, Thurber, and Hamburger (2007), can help the program avoid cost overruns.

The interaction between humans and the systems they use affects program success, as well as life-cycle costs. The documents from weapon systems acquisition programs, namely major defense acquisition programs (MDAP), contain a history of each program’s system development life cycle, and this history indicates what considerations were involved in the system development life cycle. It follows then that HSI-related content in acquisition documents is interrelated with program success.

This article is essentially an investigative study or retrospective content analysis of MDAP documents. The author’s goal is to present a sound argument that omitting HSI during weapon systems acquisition will coincide with acquisition life-cycle cost overruns, as well as schedule slippages and performance breaches. More specifically, this investigative study addresses the gap in knowledge among weapon systems acquisition, HSI, and acquisition life-cycle cost, performance, and schedule (Figure 1).

Figure 1. Venn Diagram of Interrelationships Among Weapon Systems, HSI, and Life-Cycle Cost, Performance, and Schedule


Problem Statement

Stakeholders, such as program managers and systems engineers, strive to mitigate unforeseen costs during the system development life cycle. One way to achieve this objective is to prioritize human considerations early in the system development life cycle. A program manager or systems engineer could predict program success using regression analysis of historical data. Program documents, such as Selected Acquisition Reports (SARs), provide a valuable source of historical data about weapon systems programs (Assidmi, Sarkani, & Mazzuchi, 2012; Bielecki & White, 2005; Birchler, Christle, & Groo, 2011). For purposes of this investigative study, logistic regression and HSI-related terminology in documents will be used to make predictions about program success.

Presence of HSI-related terminology is defined by the percentage of HSI-related words per document. Program success is defined by avoidance of a cost overrun, a performance breach, or a schedule breach. Overall, the objective is to conduct a retrospective content analysis of MDAP documents as a way to identify the presence of HSI-related terminology in weapon systems acquisition. This approach is designed to demonstrate that an earlier presence of HSI-related terminology predicts better outcomes for weapon systems acquisition programs in terms of money saved, time saved, and performance achieved. A definitive research question that addresses the problem identified by this investigative study follows.

Does the percentage of HSI words within the document coincide with unforeseen cost overruns, performance breaches, and schedule slippages?

Although the findings of this investigative study did not yield strongly predictive regression models, significant findings emerged that suggest schedule slippages and cost overruns may be associated with a reduction of HSI-related terminology. From the findings, the presence of terminology about human factors engineering, habitability, and survivability as well as manpower, personnel, and training suggests that a program will likely succeed. This finding corroborates the idea that a solution to the problem—specifically saving time, saving money, and improving performance—will be the inclusion of HSI-related content early in the weapon systems acquisition life cycle.

Data Collection

The data collection effort identified HSI-related terminology in each document, including HSI word percentages. Program success data were collected with regard to cost, schedule, and performance. Weapon systems acquisition programs, specifically MDAPs, were also identified. Additional data were collected to identify when each document was published with regard to its corresponding program’s Milestone B.

HSI-Related Terminology

As shown in Table 1, the HSI-related terminology in this investigative study consisted of words that refer to the nine HSI domains defined by the Department of the Air Force (2014). Also included were the terms HSI and MANPRINT, which are synonymous (Drillings, 2014). It is helpful to note that HSI is defined differently among organizations. For example, Headquarters Department of the Army (2014) defined seven HSI domains:

  1. manpower
  2. personnel capabilities
  3. training
  4. human factors engineering
  5. system safety
  6. health hazards
  7. soldier survivability

Department of the Navy (2009) defined seven slightly different HSI domains:

  1. manpower
  2. personnel
  3. training
  4. human factors engineering
  5. environmental safety and occupational health
  6. habitability
  7. personnel survivability

Department of the Air Force (2014), however, defined nine HSI domains:

  1. manpower
  2. personnel
  3. training
  4. environment
  5. safety
  6. occupational health
  7. human factors engineering
  8. survivability
  9. habitability

This investigative study refers to the most complete list of HSI domains, as identified by the Department of the Air Force (2014) and shown in Table 1.

Table 1. HSI-Related Words and Corresponding HSI Domains

To indicate HSI-related terminology within each program document, the data consisted of word percentages for each HSI-related word of interest in each of the 546 program documents. These word percentages were calculated from the count of each HSI-related word of interest and the total word count for each document. Table 2 shows the number of documents per program and the range of word counts per program.

Table 2. Major Defense Acquisition Programs


Data were collected from the 546 documents and entered into SPSS Statistics Version 22.0 for Windows. The number of HSI words within the sampled documents ranged from zero to 2,262, with an average of 42.60 (SD = 160.73). Total words for the sampled documents ranged from 68 to 64,661, with an average of 2,010.53 (SD = 5,109.40).

HSI-related words were separated into three categories (Table 3). Some overlap occurred among the HSI domains, and some words fit more than one domain description. Typically, environmental, safety, and occupational health issues are grouped together and identified with the acronym ESOH, as are manpower, personnel, and training issues, which are identified as MPT. This investigative study used the same groupings, so the data included ESOH and MPT word percentages. Because the terms habitability and survivability yielded only a small number of words, they were grouped with Human Factors Engineering (HFE), along with the terms HSI and MANPRINT, so the data also included HFE/Hab/Surv word percentages. A sketch of this data-preparation step follows.
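To make the data-preparation step concrete, the sketch below illustrates how per-document word percentages for the three categories might be computed. The study's counts were tabulated and analyzed in SPSS; this Python version is only an illustrative sketch, and the abbreviated term lists and file name are hypothetical placeholders for the full lists in Tables 1 and 3.

    import re

    # Hypothetical, abbreviated term lists; the full lists appear in Tables 1 and 3.
    CATEGORIES = {
        "ESOH": ["environment", "environmental", "safety", "occupational health"],
        "HFE/Hab/Surv": ["human factors", "habitability", "survivability", "HSI", "MANPRINT"],
        "MPT": ["manpower", "personnel", "training"],
    }

    def category_word_percentages(text):
        """Return each category's HSI-related word count as a percentage of total words."""
        total_words = len(re.findall(r"\b\w+\b", text))
        lowered = text.lower()
        percentages = {}
        for category, terms in CATEGORIES.items():
            count = sum(len(re.findall(re.escape(term.lower()), lowered)) for term in terms)
            percentages[category] = 100.0 * count / total_words if total_words else 0.0
        return percentages

    # Usage with a hypothetical document file:
    # with open("example_sar.txt") as f:
    #     print(category_word_percentages(f.read()))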

Table 3. HSI-Related Words and Corresponding HSI Categories


Success Metrics

Program success data were collected to investigate each MDAP’s cost overruns, schedule slippages, and performance breaches. The success metrics were (a) SAR-identified cost breaches, (b) SAR-identified schedule breaches, (c) SAR-identified performance breaches, (d) GAO assessment indicating total program over budget, (e) GAO assessment indicating program unit cost increase, and (f) Weapon Book amount spent over budget. None of the six metrics had complete data because data were not available for every program in every fiscal year. To collect ample data, all six metrics were considered for this investigative study.

MDAPs

Several factors were considered in the selection of the sample of MDAPs analyzed in this study. Each program needed sufficient documentation for the HSI word analysis as well as cost, schedule, and performance data. Table 4 lists the 16 MDAPs alongside their common names. MDAP documents were collected between June 2013 and October 2014, primarily from the Defense Acquisition Management Information Retrieval (DAMIR) database (DAMIR, n.d.). Additional MDAP documents were collected from the Acquisition Decision Memoranda (ADM) Web site (ACQWeb) (Acquisition Decision Memoranda, 2014). The objective was to acquire program documents that are consistent from program to program; the DAMIR database and ADM Web site made this possible.

Table 4. Major Defense Acquisition Programs and Their Common Names

Note. If a program had any increments, then the increment was noted, and appropriate data for that increment were collected.

The MDAP documents used to identify HSI-related terminology were not used to obtain cost, schedule, or performance data. In addition to collecting MDAP documents, other sources of data were collected to identify program cost overruns, schedule slippages, and performance breaches (Table 5). Likewise, none of the documents used to obtain information about program cost overruns, schedule slippages, and performance breaches were used to obtain HSI data.

Table 5. Breach Data Sources


Experimental Design

Because the analysis involved six dependent variables, a separate regression analysis was conducted for each: (a) SAR cost breach, (b) SAR performance breach, (c) SAR schedule breach, (d) GAO total program over budget, (e) GAO program unit cost increase, and (f) Weapon Book amount spent over budget. The independent variables were the subcategories of HSI word percentage per MDAP document: (a) ESOH word percentage, (b) HFE/Hab/Surv word percentage, and (c) MPT word percentage.

Data Analysis

As shown in Table 5, qualitative breach data were categorized with 0s and 1s. When the dependent variable is qualitative, a logistic regression equation can be used to model the probability that the dependent variable’s value will be either 0 or 1 (Chatterjee & Hadi, 2012). After the regression model has been created, the data are compared to the model to determine what proportion of cases was classified as 0 or 1 correctly (Chatterjee & Hadi, 2012). Binary logistic regression, the data analysis method selected for interpreting this data set, models the probability of a binary outcome; the predicted probability falls between 0 and 1.

Because each MDAP’s budget differs from those of other MDAPs, the raw budget figures cannot be compared directly. To assess the data, one may ask, “Did an MDAP go over budget, or didn’t it?” Answering this question makes it possible to render the data dichotomous, with 0 for no and 1 for yes. These dichotomous data are qualitative because they yield a qualitative value, such as good or bad (Chatterjee & Hadi, 2012). In this manner, data regarding whether or not there was a breach can be categorized with 0s and 1s, and the appropriate statistical method is binary logistic regression.
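As a concrete illustration of this dichotomization and of fitting the binary logistic regression, the sketch below codes a hypothetical over-budget indicator as 0 or 1 and regresses it on the three word-percentage subcategories. The study's analyses were run in SPSS; this Python sketch (using numpy, pandas, and statsmodels) relies on synthetic, invented data and is not the study's data set or procedure.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200  # hypothetical number of program documents

    # Hypothetical word-percentage predictors (percent of total words per document).
    esoh = rng.uniform(0.0, 1.0, n)
    hfe_hab_surv = rng.uniform(0.0, 1.0, n)
    mpt = rng.uniform(0.0, 2.0, n)

    # Hypothetical budget figures; more MPT content is built in to produce smaller overruns.
    budget = rng.uniform(50.0, 500.0, n)
    amount_spent = budget * (1.0 + rng.normal(0.05 - 0.06 * mpt, 0.10))

    # Dichotomize: did the program go over budget? 1 = yes, 0 = no.
    over_budget = (amount_spent > budget).astype(int)

    # Binary logistic regression of the 0/1 breach indicator on the three subcategories.
    df = pd.DataFrame({"esoh_pct": esoh, "hfe_hab_surv_pct": hfe_hab_surv, "mpt_pct": mpt})
    X = sm.add_constant(df)
    fit = sm.Logit(over_budget, X).fit(disp=False)

    print(fit.summary())        # coefficients (B) and p-values
    print(np.exp(fit.params))   # odds ratios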

Shown here is the general model for the predictive formulae produced by this study’s analysis. Because the probability predicted by logistic regression is a nonlinear function of the predictors, it is transformed. The odds of an event, such as a cost overrun, are the probability that the event occurs divided by the probability that it does not. The logit is the logarithm of the odds, and modeling the logit makes the predictive formulae linear in the predictors; each coefficient may be positive or negative (Chatterjee & Hadi, 2012).

logit[Pr(Y = unforeseen cost overrun, performance breach, or schedule slippage)] = B0 + B1(ESOH word percentage) + B2(HFE/Hab/Surv word percentage) + B3(MPT word percentage)
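Once the coefficients B0 through B3 have been estimated, the fitted logit can be converted back into a probability with the inverse of the logit (the logistic function), Pr = 1/(1 + exp(-logit)). A minimal sketch of that conversion, with the coefficient and word-percentage values left as inputs:

    import math

    def predicted_probability(b0, b1, b2, b3, esoh_pct, hfe_hab_surv_pct, mpt_pct):
        """Invert the logit: Pr(breach) = 1 / (1 + exp(-logit))."""
        logit = b0 + b1 * esoh_pct + b2 * hfe_hab_surv_pct + b3 * mpt_pct
        return 1.0 / (1.0 + math.exp(-logit))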

Results

To examine the hypothesis, a series of binary logistic regressions was conducted. Due to the exploratory nature of this investigative study, an alpha level of 0.10 was employed for assessment of statistical significance. The hypothesis is shown here.

The percentage of HSI words within the document will coincide with unforeseen cost overruns, performance breaches, and schedule slippages.

Results of the six regression analyses are presented in Table 6, and regression coefficients are presented in Table 7. Four of the six regressions yielded significant models: SAR schedule breach, GAO total program over budget, GAO program unit cost increase, and Weapon Book amount spent over budget were each associated with at least one of the ESOH, HFE/Hab/Surv, or MPT word percentages. These four regressions were analyzed further to assess the hypothesis.

Table 6. Model Fit for Six Binary Logistic Regression Analyses

Note. An asterisk (*) indicates significance at the p < .10 level.

Table 7. Model Coefficient Details for Binary Logistic Regressions

Note. A dagger (†) indicates model significance at the p < .10 level.

To examine the hypothesis, the three predictors (ESOH word percentage, HFE/Hab/Surv word percentage, and MPT word percentage) of the four significant models were examined (Table 6). Model coefficients are presented in Table 7. For SAR schedule breach, MPT word percentage was a significant predictor (B = -0.74, p = .029, OR = 0.48), suggesting that as the percentage of MPT words increased, the likelihood of a SAR schedule breach decreased. For GAO total program over budget, MPT word percentage was a significant predictor (B = -0.98, p = .095, OR = 0.38), suggesting that as the percentage of MPT words increased, the likelihood of a GAO total program over budget decreased. For GAO program unit cost increase, MPT word percentage was a significant predictor (B = -1.05, p = .040, OR = 0.35), suggesting that as the percentage of MPT words increased, the likelihood decreased for a GAO program unit cost increase. For Weapon Book amount spent over budget, ESOH word percentage was a significant predictor (B = 5.23, p = .067, OR = 187.23), suggesting that as the percentage of ESOH words increased, the likelihood of a Weapon Book amount spent over budget increased. Also for Weapon Book amount spent over budget, HFE/Hab/Surv word percentage was a significant predictor (B = -4.90, p = .006, OR = 0.01), suggesting that as the percentage of HFE/Hab/Surv words increased, the likelihood of a Weapon Book amount spent over budget decreased. Last, MPT word percentage was a significant predictor of Weapon Book amount spent over budget (B = 1.06, p = .098, OR = 2.87), suggesting that as the percentage of MPT words increased, the likelihood of a Weapon Book amount spent over budget also increased.

Predictive Equations

For each of the four significant models, the regression was solved to provide a predictive formula relating the subcategories of HSI word percentage (ESOH, HFE/Hab/Surv, and MPT word percentages) to the outcome of interest to the hypothesis. The first significantly predictive model suggested that MPT word percentage was the only factor that made a unique contribution to the prediction of SAR schedule breaches. An increased percentage of MPT words contributed to a lower likelihood of SAR schedule breaches. This model resulted in the final equation shown here.

logit[Pr(Y=SAR schedule breach)] = 0.06 – 0.74(MPT word percentage)

The second significantly predictive model suggested that MPT word percentage was again the only factor that made a unique contribution to the prediction of GAO total program over budget cost overruns. Increased percentage of MPT words contributed to a lower likelihood of GAO total program over budget cost overruns. This model resulted in the final equation shown here.

logit[Pr(Y=GAO total program over budget cost overrun)] = 3.20 – 0.98(MPT word percentage)

The third significantly predictive model suggested that the percentage of MPT words was again the only factor that made a unique contribution to the prediction of GAO program unit cost increases. Increased percentage of MPT words contributed to a lower likelihood of GAO program unit cost increases. This model resulted in the final equation shown here.

logit[Pr(Y=GAO program unit cost increase)] = 2.61 – 1.05(MPT word percentage)

The fourth significantly predictive model suggested that ESOH word percentage, HFE/Hab/Surv word percentage, and MPT word percentage all made a unique contribution to the prediction of Weapon Book amount spent over budget cost overruns. An increased percentage of ESOH words or MPT words contributed to a greater likelihood of Weapon Book amount spent over budget cost overruns, while an increased percentage of HFE/Hab/Surv words contributed to a lower likelihood of cost overruns. This model resulted in the final equation shown here.

logit[Pr(Y=Weapon Book amount spent over budget cost overrun)] = -0.61 + 5.23(ESOH word percentage) – 4.90(HFE/Hab/Surv word percentage) + 1.06(MPT word percentage)
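The four fitted equations can be applied directly by passing each logit through the inverse-logit conversion shown earlier. The sketch below encodes the reported intercepts and coefficients; the example word percentages at the end are hypothetical and are included only to show the calculation.

    import math

    def inv_logit(x):
        return 1.0 / (1.0 + math.exp(-x))

    def p_sar_schedule_breach(mpt_pct):
        return inv_logit(0.06 - 0.74 * mpt_pct)

    def p_gao_total_program_over_budget(mpt_pct):
        return inv_logit(3.20 - 0.98 * mpt_pct)

    def p_gao_unit_cost_increase(mpt_pct):
        return inv_logit(2.61 - 1.05 * mpt_pct)

    def p_weapon_book_over_budget(esoh_pct, hfe_hab_surv_pct, mpt_pct):
        return inv_logit(-0.61 + 5.23 * esoh_pct - 4.90 * hfe_hab_surv_pct + 1.06 * mpt_pct)

    # Hypothetical document with 0.2% ESOH, 0.3% HFE/Hab/Surv, and 0.5% MPT words:
    print(round(p_sar_schedule_breach(0.5), 2))                # ~0.42
    print(round(p_weapon_book_over_budget(0.2, 0.3, 0.5), 2))  # ~0.38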

Analysis of the hypothesis with all predictor variables yielded the following Nagelkerke R2 values for the four significant models: SAR schedule breach, R2 = 0.06; GAO total program over budget, R2 = 0.13; GAO program unit cost increase, R2 = 0.12; and Weapon Book amount spent over budget, R2 = 0.14. Cost overruns identified by Weapon Books are therefore more strongly associated with the presence of HSI-related terminology than are total program cost overruns identified by GAO assessments, program unit cost overruns identified by GAO assessments, and schedule breaches identified by SARs. However, the Nagelkerke R2 value is low for each of these four outcomes, indicating limited predictive power.
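For reference, the Nagelkerke R2 rescales the Cox and Snell R2 so that its maximum possible value is 1.0. A small sketch of that computation from the null and fitted log-likelihoods is shown below; the formula is standard, but the commented usage line refers to the hypothetical statsmodels fit from the earlier sketch.

    import math

    def nagelkerke_r2(ll_null, ll_model, n):
        """Nagelkerke R^2 from null and fitted log-likelihoods and sample size n."""
        cox_snell = 1.0 - math.exp(2.0 * (ll_null - ll_model) / n)
        max_cox_snell = 1.0 - math.exp(2.0 * ll_null / n)
        return cox_snell / max_cox_snell

    # Hypothetical usage with the statsmodels result from the earlier sketch:
    # print(nagelkerke_r2(fit.llnull, fit.llf, fit.nobs))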

Model Sensitivity and Specificity

Sensitivity and specificity were examined for each model using classification plots. Each plot describes the percentage of correct classifications for a predictive equation. The four models that indicated significant predictive ability were therefore examined for their ability to classify cases correctly. Results of the classification tables are presented in Table 8.
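Sensitivity (the proportion of actual breaches that the model classifies as breaches) and specificity (the proportion of non-breaches classified as non-breaches) can be read from a 2 x 2 classification table. A minimal sketch, assuming the conventional 0.5 probability cutoff, is shown below; the commented usage line refers to the hypothetical objects from the earlier fitting sketch.

    def classification_rates(actual, predicted_prob, cutoff=0.5):
        """Return (sensitivity, specificity) for 0/1 outcomes at the given probability cutoff."""
        predicted = [1 if p >= cutoff else 0 for p in predicted_prob]
        tp = sum(1 for a, pred in zip(actual, predicted) if a == 1 and pred == 1)
        fn = sum(1 for a, pred in zip(actual, predicted) if a == 1 and pred == 0)
        tn = sum(1 for a, pred in zip(actual, predicted) if a == 0 and pred == 0)
        fp = sum(1 for a, pred in zip(actual, predicted) if a == 0 and pred == 1)
        sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
        specificity = tn / (tn + fp) if (tn + fp) else float("nan")
        return sensitivity, specificity

    # Hypothetical usage with the fit from the earlier sketch:
    # print(classification_rates(over_budget, fit.predict(X)))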

Table 8. Classification Tables for Each Binary Logistic Regression

Note. An asterisk (*) indicates significance at the p < .10 level.

As shown in Table 9, sensitivity and specificity were also examined using classification plots for each of the 16 MDAPs. Data were separated by MDAP, and 16 models were examined. Nine of the 16 models indicated a significant predictive ability and were examined for their ability to correctly classify cases.

Table 9. Classification Tables for Each Major Defense Acquisition Program

Note. An asterisk (*) indicates significance at the p < .10 level. An additional dependent variable was defined from the six dependent variables. If any of the six dependent variables indicated a breach, this new variable was assigned a value of 1; if none indicated a breach, it was assigned a value of 0. Because the Advanced Deployable System (ADS) had no breaches, as indicated by the six dependent variables, the new variable had a value of 0 for every case (every program document). Because there was no variance, binary logistic regression analysis could not be applied to the data for ADS. The seven program documents for ADS have therefore been omitted from this table.

Discussion

In reference to the hypothesis, the presence of HSI-related words in an MDAP document may be associated with whether or not a program will experience a schedule breach or a cost overrun. First, assessing individual predictors from three significant regression models suggests that a schedule breach identified by a SAR, or a cost overrun identified by GAO assessment of total program cost or program unit cost, is less likely to occur when more MPT words are present. Second, assessing individual predictors from the fourth significant regression model suggests that a cost overrun identified from Weapon Book budget data is more likely to occur when more ESOH or MPT words are present, and less likely to occur when more HFE/Hab/Surv words are present.

Therefore, considering SAR data and GAO assessment data, the presence of terminology about MPT in a weapon system’s acquisition program documents suggests that the program might not experience a schedule slippage or cost overrun. Considering Weapon Book budget data, the presence of terminology about MPT suggests that the program may experience a cost overrun. Also regarding Weapon Book budget data, the presence of terminology about ESOH suggests that the program may experience a cost overrun, whereas the presence of terminology about HFE/Hab/Surv suggests that the program might not experience a cost overrun.

During the acquisition life cycle, program success can be affected by various efforts conducted by program managers and systems engineers. As shown in Figure 2, three milestones follow the Analysis of Alternatives (AoA), which occurs prior to Milestone A and in which a group of concepts is identified and compared against one another (Department of Defense [DoD], 2015). Milestone A is when the Risk Reduction Decision is made, whereby a specific concept is selected for further development and resources are committed to maturing the relevant technology (p. 7). Milestone B is when the Development Decision is made and contracts are awarded for producing and testing the concept (p. 7). Milestone C is when Low-Rate Initial Production of the concept begins (p. 7).

Figure 2. Acquisition Life Cycle


Meanwhile, a program’s cost estimate is largely shaped during the period leading up to Milestone B. By Milestone B, 70 percent of a system’s life-cycle cost will have been determined by design decisions regarding the program’s features and efforts (Deitz, Eveleigh, Holzer, & Sarkani, 2013; U.S. General Accounting Office, 1981; Zimmerman, Butler, Gray, & Rosenberg, 1984). After errors have been made at the Milestone B decision point, repairing or compensating for them costs 3 to 10 times more than the original, erroneous efforts (Deitz et al., 2013).

Program managers and systems engineers can apply the observations from this investigative study to their understanding of human considerations and what impact human considerations have on the development of a given program. Systems engineering includes HSI, and HSI can be incorporated into the content of program documents, such as the requirements documents.

Requirements definition is one facet of SE, which is the interdisciplinary approach for developing systems (INCOSE, 2012). MDAPs employ SE to conduct weapon systems acquisition for the DoD. To minimize weapon systems acquisition costs, the DoD created the Better Buying Power mandate, which identifies focus areas with corresponding initiatives, such as (a) eliminating requirements that lead to nonvalue-added processes, (b) improving how requirements are defined, and (c) preventing requirements from changing over time (i.e., requirements creep) (DoD, n.d.).

Conclusions

The data for this investigative study were representative of different customers within the U.S. Government (Air Force, Army, DoD, Marine Corps, and Navy) and of different types of weapon systems (aircraft, communications network, ground vehicle, ship, etc.), which supports the generalizability of the findings across customers and weapon systems. This investigative study looked back at existing MDAP documents in a retrospective content analysis as a means of looking forward to program success. Considering how many MDAPs exist, the sample size was relatively small. However, the value of this study is that it has revealed a trend that HSI practitioners already suspected and that can be examined further by investigating more programs with more documents. Exposing trends in historical data, such as how HSI affects weapon systems acquisition, is informative for planning and developing future systems.

Acknowledgments

This research was primarily supported by the Army’s HSI office (formerly MANPRINT) and Dr. Beverly G. Knapp. Dr. Holly A. H. Handley and Jeffrey Thomas provided techniques, data, and advice. Former colleagues Susan Archer, Chris Plott, and Shelly Scott-Nash at Alion Science and Technology provided an introduction to the HSI office, as well as additional support for this research.

References

Acquisition Decision Memoranda. (2014). In ACQWeb [Secure database]. Retrieved from https://extranet.acq.osd.mil/dab/adm/index.html

Ahram, T. Z., & Karwowski, W. (2012, October). A framework for human total ownership cost based on universal human performance cost components. Proceedings of the Human Factors and Ergonomics Society 56th Annual Meeting, HFES 2012, Boston, MA.

Assidmi, L., Sarkani, S., & Mazzuchi, T. (2012). A systems thinking approach to cost growth in DoD weapon systems. IEEE Systems Journal, 6(3), 436–443. doi: 10.1109/JSYST.2011.2167816

Bielecki, J., & White, E. D. (2005). Refinement of estimates: Using logistic and multiple regression to predict cost growth. Military Operations Research, 10(3), 45–56.

Birchler, D., Christle, G., & Groo, E. (2011). Cost implications of design/build concurrency. Defense Acquisition Research Journal, 18(3), 237–256.

Chatterjee, S., & Hadi, A. S. (2012). Regression analysis by example (5th ed.). Hoboken, NJ: Wiley.

Cramer, A. M., Sudhoff, S. D., & Zivi, E. L. (2011). Metric optimization-based design of systems subject to hostile disruptions. IEEE Transactions on Systems, Man and Cybernetics, Part A – Systems and Humans, 41(5), 989–1000. doi: 10.1109/TSMCA.2010.2093887

Defense Acquisition Management Information Retrieval. (n.d.). In OUSD (Acquisition, Technology & Logistics) Authentication Services [Secure database]. Retrieved from https://ebiz.acq.osd.mil/DAMIR

Deitz, D., Eveleigh, T. J., Holzer, T. H., & Sarkani, S. (2013). Improving program success through systems engineering tools in pre-milestone B acquisition phase. Defense Acquisition Research Journal, 20(3), 283–308.

Department of Defense. (n.d.). Better buying power. Retrieved from http://bbp.dau.mil/

Department of Defense. (2015). Operation of the defense acquisition system (DoDI 5000.02). Retrieved from http://www.acq.osd.mil/fo/docs/500002p.pdf

Department of the Air Force. (2014). Integrated life cycle management (AFPAM 63-128). Washington, DC: Office of the Secretary of the Air Force.

Department of the Navy. (2009). Navy personnel Human Systems Integration (NAVPRINT) (OPNAVINST 5310.23). Washington, DC: Office of the Chief of Naval Operations.

Drillings, M. (2014). Army MANPRINT. Retrieved from http://www.dtic.mil/ndia/2008maneuver/Drillings.pdf

Handley, H. A. H., & Knapp, B. G. (2014). Where are the people? The human viewpoint approach for architecting and acquisition. Defense Acquisition Research Journal, 21(4), 852–874.

Headquarters Department of the Army. (2014). Soldier–materiel systems: Manpower and personnel integration in the system acquisition process (AR 602-2). Washington, DC: Office of the Army Chief of Staff.

International Council on Systems Engineering. (2012). Systems engineering handbook v. 3.2.2: A guide for system life cycle processes and activities (Report No. INCOSE-TP-2003-002-03.2.2). San Diego, CA: Author.

Karwowski, W. (2012). A review of human factors challenges of complex adaptive systems: Discovering and understanding chaos in human performance. Human Factors: The Journal of the Human Factors and Ergonomics Society, 54(6), 983–995. doi: 10.1177/0018720812467459

Under Secretary of Defense Comptroller. (2014). DoD budget request: Budget materials. Retrieved from http://comptroller.defense.gov/BudgetMaterials.aspx

U.S. General Accounting Office. (1981). Effectiveness of U.S. forces can be increased through improved weapon system design (Report No. PSAD-81-17). Retrieved from http://www.gao.gov/products/PSAD-81-17

U.S. Government Accountability Office. (2014). Reports and testimonies. Retrieved from http://www.gao.gov/index.html

Wallace, D. F., Bost, J. R., Thurber, J. B., & Hamburger, P. S. (2007). Importance of addressing human systems integration issues early in the science and technology process. Naval Engineers Journal, 119(1), 59–64. doi: 10.1111/j.0028-1425.2007.00004.x

Zimmerman, W., Butler, R., Gray, V., & Rosenberg, L. (1984). Evaluation of the HARDMAN comparability methodology for manpower, personnel, and training (Jet Propulsion Laboratory Publication 84-10). Pasadena, CA: California Institute of Technology.


Biography

Ms. Liana Algarín is the founder of LianaWorks. Previously, she was a human factors engineer for Alion Science and Technology where she designed user interfaces and user guides to support the U.S. Army and National Aeronautics and Space Administration. During her tenure at Alion, she also conducted critical task analyses and usability analyses of Coast Guard cutters. Ms. Algarín holds a BA in Psychology from University of South Florida and an MS in Industrial and Systems Engineering from Virginia Polytechnic Institute and State University.

(E-mail address: liana@lianaworks.com)

