Building Resilient Systems Via Strong Human Systems Integration



Author: Mica R. Endsley, Ph.D.

Imagine a land called Nonods in which the people built a great many bridges. These bridges, however, collapsed frequently, killing or injuring a number of Nonods in the process. The bridges were also fairly rickety, requiring lengthy training as well as many procedures to avoid falling off of them, which significantly slowed traffic across the land. Now within Nonods there were many civil engineers who had amassed significant knowledge about how to build strong bridges that would not fall and that would support much more rapid traffic.

However, the Nonod bridge builders generally ignored these engineering principles. “Why, we cross bridges all the time,” they said, “so we know perfectly well how to build bridges.” As a result, the Nonods continued spending a great deal of their treasure on building bridges that worked poorly, and periodically a number of Nonods were killed trying to use them. “Oh, well,” they would say. “Bridges fall down. Not much one can do about that.” Or they would say, “The people walking on them must have done something wrong to make them fall.” And thus the Nonods were quite unprepared to move their people across the land quickly when they needed to repel an invasion from the north, and they were summarily defeated in battle. The Nonods were no more.

The story of our imagined Nonods illustrates a reality in our acquisition system. But the problem is not that of building bridges; it is that of building systems that allow for effective human performance. Like the Nonods, many program managers believe that “people just make errors, and that is not something that can be remedied.” However, there is a strong foundation of scientific research and engineering in the field of human factors, developed over the last 60 years, that provides a rich basis for developing robust systems that can significantly reduce human error. Human factors engineering is based on the scientific understanding of how people perceive and process information, their physical characteristics, and how people make decisions and carry out tasks with the use of technology.

One can substantially improve human performance and reduce the likelihood of errors simply by designing a system that is compatible with the characteristics of the people who must operate and maintain it. For example, research shows that simply setting text in a combination of capital and small letters (rather than all capitals) can improve reading time for lines of text by between 10 percent and 15 percent and reduce errors by about 12 percent, according to Sanders and McCormick in “Human Factors in Engineering and Design” (1993). If displays use colors consistent with human expectations (e.g., red for stop and green for start), performance will be significantly faster and people will make far fewer errors than when the colors are the opposite of expectations. These are two very simple examples, but they demonstrate the significant improvements in human performance that can be made with design features that cost almost nothing to implement. And I have found systems in the military that violate both principles, leading to unnecessary problems and poor performance.

By applying human factors principles during the design and development of our military systems, we can significantly reduce instances of catastrophic failures that lead to crashed aircraft or fratricide. And we can significantly reduce the ongoing operations and maintenance costs that eat into our limited budgets.

For example, today’s manned aircraft have benefited significantly from the application of good human factors principles during system design. Early flight experience during World War II led aviation experts to realize that perfectly good aircraft were crashing because pilots had difficulty integrating and understanding displays that worked in nonintuitive and inconsistent ways, leaving the pilots prone to spatial disorientation and other hazards.

The field of human factors developed to address these problems, and the incidence of “human error” decreased rapidly. Military standards such as MIL-STD-1472 and MIL-STD-1295 were developed to codify this work. However, acquisition changes in the 1990s led many programs to stop requiring attention to these human factors design standards, and we saw a resurgence of problems. For example, the grounding of the F-22 fleet of tactical fighter aircraft amid concerns about pilots’ hypoxia-like symptoms was found to be due to the lack of a critical backup for the Onboard Oxygen Generation System (OBOGS). That backup system had been eliminated to reduce weight, even though there had been insufficient modeling and testing of the life-support system to support the decision or to detect problems with the pressure vests used by the pilots. The Air Force’s failure to incorporate Human Systems Integration (HSI), including human factors, in its requirements and acquisition process was a major contributing factor to this problem, according to the Air Force Scientific Advisory Board that investigated the incidents.

Figure 1. Poor vs. Proper Interface Design


Today, we see similar problems with many remotely piloted aircraft. Basic human factors design principles were not applied during the initial development of the Predator ground stations. Recent analysis by the Air Force Safety Center shows that our unmanned aircraft have six times more Class A mishaps than our manned aircraft, and 73 percent of these were associated with human factors problems. While the loss of an unmanned aircraft generally does not involve loss of life, it does involve the loss of an expensive asset and of mission capability.

The costs of ignoring human factors during system design are too great. How people perform with technology is a critical component of total system performance. While our systems development processes often focus only on the mechanical performance of the technology, it is important to remember that our job is not only about the technology; it’s also about how well the technology will support the people who need to use it to accomplish their missions.

Human Systems Integration

The military has worked to improve the incorporation of human factors design principles into the development of its programs through HSI, which is a disciplined, unified and interactive systems engineering approach for integrating human considerations into system development, design and life-cycle management. This works both to improve total system performance and to reduce costs of ownership across the system’s life cycle. It incorporates nine key areas: manpower, personnel, training, human factors engineering, environment, safety, occupational health, survivability and habitability. HSI takes into consideration human factors engineering principles, along with plans for the numbers and qualifications of the people assigned to use the system and the amount and type of training needed to operate it. This helps achieve effective system designs by simplifying the actions required for use, providing compatibility with human capabilities, and in many cases significantly easing training and manpower requirements. The environment in which the system must operate, along with various important safety factors, also is addressed in developing systems to support robust human performance.

HSI provides a detailed process for determining and incorporating requirements for effective human performance and safe operations, for applying sound engineering principles, and for applying the metrics and analyses needed to enhance overall system performance in a wide variety of demanding situations. The Department of Defense (DoD) has mandated the inclusion of HSI in the development of our military systems. DoD Instruction (DoDI) 5000.02, Enclosure 7, addresses HSI, stating that the program manager should plan for and effect HSI beginning early in the acquisition process and throughout the product life cycle, and charging the program manager with responsibility for ensuring that HSI is considered at each program milestone.

The U.S. Army addresses HSI with its longstanding HSI (formerly MANPRINT) program through Army Regulation 602-2. The Navy has developed an HSI Management Plan for carrying out DoDI 5000.02. And the Air Force has incorporated HSI into its Air Force Instruction (AFI) on Life Cycle Management and has developed an HSI Guidebook, an HSI Requirements Guide, and Air Force Pamphlet 63-128 with mandatory requirements for conducting HSI as a part of systems development.

Nevertheless, in my travels across the Air Force, I have found that many programs still lack adequate consideration of HSI. Experience within the Army and Navy has been similar. While some programs manage to include HSI, in many cases HSI requirements take a back seat to other engineering considerations or are missing completely. It turns out that, like the Nonods, some program managers do not fully appreciate the ways in which HSI can improve system performance, or they remain confused about how to effectively incorporate HSI into their programs. This is due to a number of fundamental gaps in understanding about HSI.

Table 1. Human Systems Integration (HSI) Domains


Myth No. 1: HSI Means Asking What Users Want

Often when I have asked program managers what sort of HSI considerations they have included in their programs, they proudly tell me, “We showed it to some users.” While a good step, this unfortunately is quite insufficient. Human preference does not equal human performance. User input is very important to the development of good systems. Users know a lot about what their jobs entail and where the difficulties are, and they can provide useful feedback when looking at new system designs or when trying them out during Developmental Test and Evaluation (DT&E) or Operational Test and Evaluation (OT&E). However, they generally are not experts at understanding the detailed physical, physiological, perceptual and cognitive processes, capabilities and limitations of humans, and they often will miss the many subtle features of technology that can negatively impact human performance.

Good HSI means applying known human engineering design principles and performing objective evaluations of the functioning of the system when in use by a representative sample of its intended users. Time to perform tasks, error rates, workload and situation awareness can all be objectively measured to find problems and make design trade-offs with the goal of creating effective total system performance. Just as we would not test an engine simply by having pilots look at it, we will not get a good assessment of the human interface just by having the user look at it.
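
As a simple, hypothetical sketch of what such objective measurement might look like, the snippet below tabulates task-completion times and error rates for two candidate interface designs. All names and numbers here are illustrative assumptions for the sake of the example, not data from any program or a prescribed HSI method:

```python
# Hypothetical sketch: comparing two candidate interface designs on
# objective usability metrics (task-completion time and error rate).
# All data below are illustrative assumptions, not real test results.
from statistics import mean, stdev

# Task-completion times (seconds) from a representative user sample.
times = {
    "design_a": [41.2, 38.7, 45.1, 39.9, 43.0, 40.4],
    "design_b": [31.5, 29.8, 33.2, 30.1, 32.6, 28.9],
}
# Errors each user committed over a fixed set of 20 trials.
errors = {
    "design_a": [3, 2, 4, 3, 2, 3],
    "design_b": [1, 0, 2, 1, 1, 0],
}
TRIALS_PER_USER = 20

for design in times:
    t = times[design]
    e = errors[design]
    # Errors per 100 trials, aggregated across the user sample.
    error_rate = 100 * sum(e) / (TRIALS_PER_USER * len(e))
    print(f"{design}: mean time {mean(t):.1f} s (sd {stdev(t):.1f}), "
          f"error rate {error_rate:.1f} errors per 100 trials")
```

Metrics like these, gathered from representative users performing realistic tasks, give design trade-offs an objective basis rather than leaving them to preference or opinion.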

Myth No. 2: HSI Means Including the Newest Display Techniques and Hardware

At the opposite end of the spectrum from neglecting HSI, some programs go looking for HSI in all the wrong places. That is, they want to make really cool user interfaces by incorporating the latest ideas from science fiction movies or computer scientists. I have seen displays built into three-dimensional rotating cubes, displays that project information into holograms or virtual reality headsets, and interfaces that require large arm movements for extended periods. While well intended, many of these so-called advancements can be fatiguing, can reduce situation awareness in critical situations, and actually can lead to much slower performance and higher error rates on critical tasks. Cool does not equal effective. Good user interfaces may not always require the latest hardware and software concepts. Instead, designers must pay attention to the requirements associated with users’ tasks and match the most effective hardware and software approaches to those tasks.

Myth No. 3: HSI Should Be Done at the End of a Program

Among program managers, one of the most pervasive misunderstandings is the belief that the user interface should be considered at the end of the program, after the technology issues are sorted out. This is the worst time to do HSI. At that point, generally only small fixes can be applied to a system that has placed controls in the wrong places or that has software logic and layouts that fundamentally confuse users and do not provide the needed information in ways that will help users achieve good situation awareness or rapid performance. Just as one cannot really fix a poorly designed Nonod bridge with a few Band-Aids, one cannot fix a poor user interface with a few tweaks at the end of the program. And making the extensive changes needed at that point generally is very costly and causes programs to exceed their timelines.

HSI should be started at the very beginning of a program. By conducting an early analysis of user requirements, tasks and information needs, an HSI team can create early prototype interface designs that can be tested with users early in the program. These prototypes then can create the foundation for software and hardware development. They provide a clear indication of what is needed before a penny is spent on bending metal or on expensive software coding of interfaces that will need to be changed repeatedly as users try them out.

This creates significant time and money savings for the program. The Air Force recently was forced to cancel its Expeditionary Combat Support System (ECSS) program after spending more than $1.1 billion and 8 years of effort. A major reason was the program’s inability to understand the system requirements, leading to extensive churn in requirements and solutions and to failed reprogramming efforts. Had this HSI process been employed early, there would have been a prototype system available for testing with the many users of the system. This would have established a means to ensure that the needed functionality and information flow were well understood before software development even started.

Myth No. 4: Anyone Can Do HSI

Just as the Nonods believed that they could design bridges because they were bridge users, many people believe anyone can do HSI because they are people and so they know what people need. However, even well-meaning people will not do an adequate job of HSI if they have not received the appropriate training—combining knowledge of human capabilities (physical, cognitive and perceptual) with knowledge of how to design systems, develop training or conduct the needed HSI domain analyses. As in other areas of engineering, there is a significant body of knowledge that needs to be acquired. Most HSI practitioners have advanced degrees in industrial engineering, psychology or physiology. However, HSI is a multidisciplinary profession, so practitioners may have a wide variety of degree titles that can leave some people confused as to how to find the right expertise. Just as you can hire a Certified Public Accountant (CPA) to do your taxes, you also can find an HSI expert for your team who is a Certified Professional Ergonomist (CPE)—someone who has passed the required exams and demonstrated proficiency in the field.

Figure 2. Use HSI Tools and Processes to Define Requirements and Interfaces Early


Myth No. 5: We Can Just Train Around HSI Problems

There is a long history of trying to use training to compensate for poorly designed user interfaces. Unfortunately, training alone cannot overcome interfaces that are inconsistent with human expectations (for example, requiring the user to push down on a lever to go up), that create known physiological problems (for example, a lever that requires the pilot to move her head down and to the side during landing, resulting in the pilot’s disorientation), or that require extensive time-consuming procedures for simple tasks. Even with extensive training, people will continue to make errors when the technology is incompatible with how they think and operate, particularly when under stress. And trainers will tell you that good HSI can significantly reduce the training time required for any system. Good training is important, but it is no substitute for good system design.

Myth No. 6: With Automation, We Don’t Need to Worry About HSI

Many people believe that as systems become more automated, worrying about HSI or the human operators of the systems will become less important. However, exactly the opposite is true, because almost all of this automation still requires human interaction. Extensive experience with automated systems over the last 30 years has shown that automation actually can make the user’s job more complicated. For example, pilots and system operators find that their cognitive workload can increase substantially as they work to understand how to properly program the automation during operations. And they can suffer from lower situation awareness when working with automation, because it often leaves them out of the loop, struggling to understand what the automation is doing so that they can supervise it and intervene in time-critical situations. The move toward more automation or autonomy in many systems requires that we pay even more attention than ever to the user interface, making the behavior of the system more transparent and understandable and creating effective human-automation teams.

Myth No. 7: HSI Costs Too Much

Actually, good HSI saves programs money, both during system development and later in operations. Attention to HSI early in a program can provide clear direction for system development, saving extensive rework later, when it is much more expensive to redo software or hardware. Attention to HSI also can save a great deal of money in the military’s limited operations and maintenance budgets. Life-cycle costs account for between 35 percent and 70 percent of a system’s overall costs, and these costs can be significantly reduced if HSI is emphasized during system development. For example, attending to the design of the interface for a satellite control ground station or a command-and-control system can significantly reduce the number of operators required. Attending to the design of an aircraft to support maintainer tasks can significantly reduce the hours required for routine maintenance and increase the aircraft’s availability for flight. The truth is that our development programs cannot afford a failure to apply good HSI.


The Acquisition Community Is the Linchpin for HSI

Acquisition professionals have a critical role in developing technology for their users. All of our airmen, soldiers and sailors have demanding and critical jobs to do that depend on well-designed systems that will work the way that they do—supporting the accomplishment of their tasks rapidly and effectively. It is critical that we avoid system designs that are obstacle courses of hidden hazards and latent failures.

Acquisition programs can accomplish these goals by first paying attention to HSI requirements when establishing program requirements. If these requirements are not spelled out in clear, measurable ways, experience has shown that contractors will not, and often feel they cannot, spend any effort on ensuring that systems are easy to use or consistent with human capabilities and limitations. And if HSI requirements are not included in program documents, there is little that can be done to make contractors fix even egregious interface problems without making expensive program modifications.

Second, make sure not only that system developers create an HSI plan but that they implement it early in the program, and include HSI as a critical part of design reviews. In some cases, we have found programs that required an HSI plan but failed to require the contractor to actually implement it, which did no good at all. Design reviews should include not only a review of the contractor’s progress on HSI tasks but also a review of objective test metrics showing whether that work has been successful and identifying areas for further improvement.

Third, make sure you have the needed HSI professionals as a part of your program team. You won’t be able to tell whether contractors have done a good or a poor job if you don’t have people with the required knowledge and experience to evaluate the system design, the methods used or the test results. In the Air Force, the 711th Human Performance Wing has a body of HSI professionals who can provide the expertise needed. The Army has the Army Research Laboratory Human Research and Engineering Directorate (ARL HRED), and the Navy has HSI professionals embedded at the Naval Sea Systems Command (NAVSEA) and the Space and Naval Warfare Systems Command (SPAWAR).

To learn more about HSI, a number of resources are available. The Defense Acquisition University offers a 2-hour introductory course in Human Systems Integration (CLE 062). The Air Force Institute of Technology offers courses in Basic Human Systems Integration (SYS 169) and Intermediate Human Systems Integration (SYS 269), along with a certificate in Human Systems Engineering, as well as advanced degrees. The Naval Postgraduate School offers an online Human Systems Integration Certificate, in addition to master’s and doctoral degrees with emphasis in HSI.

The good news is that there is an extensive body of knowledge and expertise that can help all of our acquisition programs develop safe and resilient systems that promote effective human performance as a part of total system performance. Unlike the Nonods, we just need to apply that knowledge to our programs to be successful.



The author can be contacted at mica@satechnologies.com.

Endsley is president of SA Technologies Inc. in Mesa, Arizona. She is a former chief scientist of the U.S. Air Force and has 30 years of experience in Human Systems Integration for the military.
