The Value of Training: Analysis of DAU’s Requirements Management Training Results



Authors: Charles M. Court, Gregory B. Prothero, and Roy L. Wood

In response to Congress, the Defense Acquisition University (DAU) designed and fielded a course of study for Requirements Management, including a 1-week advanced classroom course. While teaching this course, the DAU faculty routinely conducts pre-testing and post-testing to assist the faculty and students in assessing learning and retention. The faculty uses data from these tests, along with student demographics, to assess the learning value the course provides and to explore some initial assumptions about the readiness of the workforce to learn. Results show a greater than 30 percent increase in learning from pre- to post-test and debunk nearly all the preconceived notions the university held about the incoming students.

Every successful system acquisition begins with a well-thought-out set of operational capability requirements. The military services have always had some sort of requirements generation process that told the armories and shipyards what to build for the warfighter. As acquisition became more complex, expensive, and risky, the Department of Defense (DoD) recognized the need for a more formal system of articulating requirements and the importance of training both the acquisition and the requirements workforces.

The Joint Capabilities Integration and Development System

In 2003, then-Secretary of Defense Donald Rumsfeld initiated a formal DoD-level requirements generation process—the Joint Capabilities Integration and Development System (JCIDS). According to Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01H, “The JCIDS process exists to support JROC [Joint Requirements Oversight Council] and CJCS [Chairman of the Joint Chiefs of Staff] responsibilities in identifying, assessing, validating, and prioritizing joint military capability requirements” (CJCS, 2012). Within the context of the National Military Strategy, JCIDS provides a process to identify and assess the capabilities joint operational forces need to meet future military challenges. A capabilities-based assessment process identifies potential gaps in warfighting capability and drives changes to doctrine, organization, training, materiel, leadership and education, personnel, facilities, and/or policy (DOTmLPF-P). Many requirements lead to nonmateriel solutions, while other requirements call for materiel solutions. The JCIDS process generates the requirements and the associated performance criteria for those materiel solutions. The Defense Acquisition Management System then fulfills those requirements and delivers the required capabilities.

Articulating a new warfighting capability requirement and defending this need through rigorous discussion and analysis is a nontrivial undertaking for a requirements manager. A new military requirement can initiate a decades-long acquisition that requires the investment of billions of taxpayer dollars to develop, manufacture, and field. Requirements managers must be able to correctly identify, document, and support the compelling need for any new system, then be able to work alongside their acquisition counterparts to field the new capability. This is a complex undertaking. In 2007, Congress formally directed the DoD to train the men and women who develop new requirements under JCIDS.

Requirements Management Training

The National Defense Authorization Act (NDAA) for Fiscal Year 2007 directed the Under Secretary of Defense for Acquisition, Technology, and Logistics (AT&L), in consultation with the Defense Acquisition University (DAU), to develop a training program to certify DoD personnel with responsibility for generating capability requirements for major defense acquisition programs (NDAA, 2006). The congressional mandate called for training both military and DoD civilian managers charged with assessing, developing, validating, and prioritizing requirements through the JCIDS process. This broad definition covered relatively junior members of the workforce up to and including the 4-star generals and admirals on the JROC who ultimately validate the requirements, creating a need for a broad and diverse training program at several levels of sophistication. Further, as Court (2010) pointed out, “no one person does all four tasks of assessing, developing, validating, and prioritizing” requirements, so the training program would also need to address a wide variety of tasks and competencies.

DAU responded quickly to meet the congressionally imposed deadline to create and deploy a requirements management certification-training curriculum by September 30, 2008. Working with AT&L and the Joint Staff Directorate for Force Structure, Resources and Assessment (J8), DAU developed two online courses for requirements managers and a 1-day classroom workshop for general and flag officers. These courses were very successful, and by the end of fiscal year 2008, the community had logged more than 4,200 course completions. In 2010, DAU added a 1-week Advanced Concepts and Skills for Requirements Management (RQM 310) classroom capstone course to the curriculum. Table 1 shows the requirements management curriculum for designated individuals as recently as 2014.

Table 1. Requirements Management Training Curriculum


Requirements Management Training Curriculum

Developing new courses for requirements management took DAU training into an entirely new area outside the customary acquisition disciplines. The effort demanded intense work from DAU, supported and sponsored by both the AT&L staff and the Joint Staff. DAU established integrated product teams that included warfighter representatives to define the basic competencies requirements managers need to operate successfully at different levels of responsibility. The DAU faculty and outside subject matter experts meticulously developed instruction to meet these competencies across the spectrum of requirements tasks. The faculty also adopted several innovative assessment tools to help DAU determine whether the training, once deployed, would be effective.

Requirements Certification Capstone Course: New Beginnings and Opportunities

Developing RQM 310, the Advanced Concepts and Skills for Requirements Management course, demanded an intense, months-long effort by requirements and acquisition experts to ensure the course conformed to the requirements management competency model and would challenge students to reach higher levels of understanding and performance. DAU designed and piloted the new 1-week course and rolled it out to students in 2010.

Creating an entirely new classroom course allowed DAU to test and apply many new concepts and technologies. RQM 310 includes faculty discussions, guest speakers, computer simulations, and a challenging student capstone exercise. One of the technology innovations in RQM 310 was the routine use of a classroom-participation system. With this system, each student uses a response device that looks like a small remote control to respond to questions and assessments. During the first morning of the class, students use their response devices to take a course pre-test and review material from the course’s online prerequisites. Throughout the week, students continue to use the response device to interact with faculty questions in the lessons. The RQM 310 students also use the response devices in an in-class simulation to evaluate and discuss differences between programs depending on their timeline, financial state, Service and Defense Agency priorities, and issues such as a budget breach or a failed operational test.

RQM 310 student demographics. Both military and civilian requirements managers attend RQM 310. Students come from the Pentagon as well as from far-flung Combatant Commands and field activities. Military members bring current and relevant experience to the requirements generation process. Typically, military requirements managers come from operational and warfighting specialties, and complete a requirements management tour between field assignments. However, there is a relatively high turnover of military personnel through requirements management positions, bringing in new personnel with limited to no JCIDS or acquisition experience, thus creating a steady demand for training. Civilian requirements managers have greater tenure in their positions, and provide continuity in requirements offices and a “corporate memory” for their organizations.

Assumptions about the workforce. Given the vastly different demographics of the workforce who attend RQM 310, initial expectations were that incoming knowledge and experience of the students might also be vastly different. For example, the DAU faculty assumed that civilian requirements managers, because of their longer tenure, would be better versed in JCIDS and acquisition procedures than their military counterparts. Another commonly held belief was that students working in the nation’s capital or on a combatant commander’s staff would be more knowledgeable coming into the course because of more direct involvement in generating and vetting requirements. In addition to assessing the overall value of training, this study tested these major assumptions about the workforce, and the results are presented later in this article.

Study Method

Participants

This study used data the DAU faculty normally collects in the process of executing each RQM 310 class; the data analyzed here came from the 2013 course offering. The faculty did not originally anticipate using the course pre-test data in a study, but rather collected it as a review, specifically to assist the students in identifying their own individual knowledge gaps and to alert the faculty to particular areas of knowledge weakness in the class as a whole. Educational research has consistently shown that pre-testing can help increase student attentiveness during the course (Sadhasivam, 2013), and aid in focusing both students and faculty on improvement of particular knowledge gaps (Blin & Wilson, 1994; Wetstein, 1998).

While DAU developed the assessments and data collection primarily to improve learning outcomes, the data have also provided valuable insights into other aspects of the training. The DAU faculty compares pre-test data to post-test data to determine overall student improvement and to assess the value of learning. The end-of-course post-test contains questions similar, but not identical, to those on the pre-test. In this study, the faculty also analyzed pre-test data against student demographics to determine whether one group might be better prepared for the advanced concepts course.

Research Design

As noted earlier, this research used data collected from a total of 263 students during the normal execution of the RQM 310 course in 2013. The data collected include pre-test and end-of-course assessment scores collected with the student response system. Questions on the two tests are similar, but not identical, and both instruments focus on key learning and competencies needed by requirements managers to be effective in their jobs. All of the students attending the RQM 310 advanced course had previously completed the two online prerequisite courses: Introduction to the Joint Capabilities Integration and Development System (CLR 101), and Core Concepts for Requirements Management (RQM 110). These online courses are self-paced, computer-based training that include their own online assessments of student progress and understanding. RQM 110 classes have assigned faculty who are available to answer questions, mentor students who might be experiencing difficulty in the course, and otherwise provide academic or technical assistance the students might need.

DAU also collects student demographics in the RQM 310 class to help the faculty better appreciate students’ levels of experience and exposure to identifying, assessing, and formulating capability requirements. Based on the a priori assumptions mentioned earlier, the faculty collects data on each student’s assignment at the time of attending the course, tenure in the current billet, aggregate experience working in the requirements management field, and how much of the student’s day-to-day work relates to managing requirements. Table 2 shows a breakdown of the demographic questions and the granularity of the answers collected.

Analysis of Pre-Test and Post-Test Scores

As a first step in this analysis, tabulating the pre-test scores produced a mean of 51.6 with a standard deviation (s.d.) of 12.81. The tally of end-of-course scores showed substantial improvement, with a mean of 80.97 and s.d. of 10.68. A paired-samples t-test showed the improvement in scores to be statistically significant, t(262) = 37.173, p < 0.0005. As noted earlier, many researchers and faculty practitioners recognize that pre-testing students can help focus their attention on desired outcomes and influence post-test results. According to Kim and Wilson (2010), “there can be substantial effects of pretest on posttest, especially when the duration between them is short, that is, less than a month” (p. 755). Researchers must consider and compensate for this effect in a strict research context. However, since the underlying purpose of the classes was to improve student knowledge and retention, the substantial improvement in scores was desirable regardless of the cause.
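For readers who want to reproduce this kind of comparison, the sketch below runs a paired-samples t-test in Python with SciPy. The score arrays are simulated stand-ins, matched only to the reported means and standard deviations; they are not the actual class records.

```python
# Minimal sketch of a paired-samples t-test on pre/post scores.
# The data here are simulated, not DAU's actual class records.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 263  # class size reported in the study

# Stand-in scores matched only to the reported means (51.6, 80.97)
# and standard deviations (12.81, 10.68).
pre = rng.normal(51.6, 12.81, n)
post = rng.normal(80.97, 10.68, n)

# Each student's post-test score is paired with his or her own
# pre-test score, so the test has df = n - 1 = 262.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t({n - 1}) = {t_stat:.3f}, p = {p_value:.3g}")
```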

Table 2. Student Demographic Categories

Analysis of the Student Demographics

As noted earlier, a number of assumptions about the student demographics produced expectations among the faculty about which students might perform better in the class and which might require more assistance or remediation. During this research, the DAU faculty tested these assumptions statistically to determine their accuracy, using SPSS t-tests or analysis of variance (ANOVA) to examine the mean pre-test scores of each subgroup. The discussion below outlines the assumptions and test results. In short, almost none of the entering assumptions proved to be true, and the classes were far more homogeneous in pre-test performance, without regard to prior experience or assignment.

Assumption 1. Students from inside the (Washington, DC) Beltway would be better prepared than those in field activities outside the Beltway. An independent-samples t-test compared the mean pre-test scores of the two groups. The inside-the-Beltway group averaged 52.28 ± 12.5 on the pre-test, and the outside-the-Beltway group averaged 59.79 ± 13.2. The analysis found no statistically significant difference between the groups at a 95% confidence level, t(263) = 0.93, p = 0.473.
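A minimal sketch of this test appears below, assuming the class records sit in a pandas DataFrame with hypothetical columns pretest and location; the column names and labels are illustrative, not DAU's actual data schema (the study itself used SPSS).

```python
# Illustrative independent-samples t-test for Assumption 1.
# Column names 'location' and 'pretest' are hypothetical.
import pandas as pd
from scipy import stats

def beltway_t_test(df: pd.DataFrame) -> tuple[float, float]:
    """Compare mean pre-test scores of the inside- and
    outside-the-Beltway groups."""
    inside = df.loc[df["location"] == "inside", "pretest"]
    outside = df.loc[df["location"] == "outside", "pretest"]
    result = stats.ttest_ind(inside, outside)
    return result.statistic, result.pvalue

# Example with made-up records:
df = pd.DataFrame({"location": ["inside"] * 3 + ["outside"] * 3,
                   "pretest": [52, 48, 57, 60, 55, 64]})
print(beltway_t_test(df))
```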

Assumption 2. Students with more time in their current billet will be better prepared than those with shorter tenures. The assessment divided the students into those with less than 6 months in their current positions, those with 6–12 months, those with 12–24 months’ tenure, and those with greater than 24 months in the job. Since many military requirements managers historically have shorter tours in requirements billets between operational tours, observers could assume that longer tenures might better prepare students for the advanced course. The analysis did not support this assumption, however. The means of the group scores on the pre-test varied only between 49.5 and 53.7. An ANOVA test on the groups revealed no statistically significant differences in their respective performances on the pre-test, F(3, 258) = 1.11, p = 0.344.

Assumption 3. Students with greater experience in requirements management would be better prepared. To test this assumption, the analysis subdivided the students into groups with less than 6 months’ experience, 6–12 months, 1–3 years, 3–5 years, and greater than 5 years. A one-way ANOVA on these data did find a single statistically significant difference between groups, F(4, 258) = 3.096, p = 0.016. A Tukey post-hoc test revealed that students with 3–5 years of experience scored significantly higher on average (56.7 versus 48.2) on the pre-test than less experienced students with 6–12 months’ experience.
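The sketch below reproduces the analysis pattern for this assumption (a one-way ANOVA followed by a Tukey HSD post-hoc test) with SciPy and statsmodels. The experience bands come from the text, but the group sizes and simulated scores are assumptions made for illustration; the study itself used SPSS.

```python
# One-way ANOVA plus Tukey HSD post-hoc test, as in Assumption 3.
# Scores are simulated; only the analysis pattern mirrors the study.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
bands = ["<6 mo", "6-12 mo", "1-3 yr", "3-5 yr", ">5 yr"]
# Group means loosely echo the reported 48.2 and 56.7; the group
# size of 50 per band is an assumption.
means = [51.0, 48.2, 51.5, 56.7, 52.0]
groups = [rng.normal(mu, 12.8, 50) for mu in means]

# One-way ANOVA across the five experience bands.
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")

# Tukey HSD shows which specific pairs of bands differ.
scores = np.concatenate(groups)
labels = np.repeat(bands, [len(g) for g in groups])
print(pairwise_tukeyhsd(scores, labels))
```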

Assumption 4. Students who spend a greater amount of day-to-day time working on requirements will show better preparation for the class. For this test, the analysis divided the students into five groups: (1) students who reported working on requirements-related tasks less than 25% of the time; (2) those with requirements work between 25% and 50%; (3) students with requirements work from 50% to 75%; (4) those whose requirements content in their workday was between 75% and 100%; and (5) students whose work was exclusively (100%) related to requirements. The ANOVA for these groups again pointed to no statistical differences between the pre-test means, F(5, 257) = 1.48, p = 0.195. The pre-test average scores for these groups varied only between 50 and 53.6.

Assumption 5. Designated requirements managers, and perhaps acquisition professionals, will be better prepared for the class. Here, the demographic questions asked the students to self-identify their primary career field: requirements, acquisition, operational/warfighter, or other. The ANOVA analysis of the mean pre-test scores for these groups found no statistically significant differences, with mean scores between 48.9 and 53.6, F(3, 259) = 0.880, p = 0.452.

Assumption 6. Organizational assignment will have some impact on student readiness. The initial assumption was that there might be some relationship between the student’s assigned organization and his or her score on the pre-test. For example, the faculty might expect a student assigned to the Joint Staff or Combatant Command to do more work directly or indirectly in creating, assessing, or approving requirements than students from other organizations. For this analysis, the study broke the student sample into those who worked on the Joint Staff, Service Headquarters Staff, major military command, Defense Agency, Office of the Secretary of Defense Staff, a Combatant Commander Staff, or other. Once again, the ANOVA showed no statistical differences in mean pre-test scores of the students, regardless of their assignment, F(6, 256) = 0.312, p = 0.930.

Significance of the Analysis

This analysis debunked nearly every assumption about factors that might affect student preparedness for the advanced course. Each of these assumptions made sense on an intuitive level, and the results were surprising. DAU will need to do more work to determine exactly why these assumptions were untrue, but preliminary analysis offers two potential explanations.

First, the knowledge of students coming into the course is much more homogeneous than originally believed. This may be the result of all students being required to take the same online preparatory courses: Introduction to the Joint Capabilities Integration and Development System (CLR 101) and Core Concepts for Requirements Management (RQM 110). Students who take these courses may come into the advanced RQM 310 with a common baseline of knowledge learned primarily from those classes. Second, individuals in the requirements community typically work on only a single task, or perhaps a handful of tasks, related to the broader process of identifying, assessing, validating, and prioritizing joint requirements. It is unlikely that any individual student would have deep, experience-based knowledge across the entire process, regardless of tenure or organizational assignment. Thus, expertise in any narrow area may not contribute to statistically higher scores on course material that covers all areas.

Summary and Conclusions

DAU responded to the congressional mandate and met the short deadline to train and certify requirements managers through a combination of online and classroom courses. The success of the initial DAU approach led to student demand and leadership support to expand the initial requirements curriculum. The most significant curriculum expansion was the development of the Advanced Concepts and Skills for Requirements Management course, RQM 310.

Developing a new classroom course in a different, nontraditional area of acquisition allowed the DAU faculty to apply new technologies. Classroom simulations enhanced traditional teaching approaches, encouraged the exchange of ideas, and helped requirements managers from different Services and Defense Agencies recognize their common problems, while classroom-participation devices encouraged more student involvement.

The success of using classroom-participation devices led the requirements faculty to additional innovation. Students take a pre-test on the first day of class, and a final exam post-test at the end of the 1-week course. Both exams use classroom-participation “clickers” with the exam questions projected on a classroom screen. By comparing the results of the pre-test to the results of the post-test, this analysis has established that statistically significant improvements in scores occur, leading us to conclude with confidence that student learning was taking place.

This analysis has also been a “myth buster” for a number of sincerely held assumptions about the workforce and how demographic factors influence RQM 310 student preparation. Almost universally, the assumptions have been wrong, and students coming into the course are much more homogeneous than the faculty anticipated. Part of the homogeneity could result from all students taking the same prerequisite courses (CLR 101 and RQM 110) and coming into the advanced RQM 310 with a common baseline of knowledge learned from those classes. Another possibility is that individuals in the community work on only a single task, or perhaps a handful of tasks, related to identifying, assessing, validating, and prioritizing joint requirements; thus, no individual student has deep knowledge across the entire process, regardless of tenure or organizational assignment. Expertise in a narrow area may not contribute to statistically higher scores on course material that covers all areas.

Nevertheless, the success of pre- and post-testing in RQM 310 has encouraged the faculty to expand this approach to other requirements courses. Specifically, the faculty is investigating how to apply it to the online Core Concepts for Requirements Management course, RQM 110. Further, based on the success of RQM 310, additional classroom courses at the Defense Systems Management College have adopted the classroom simulations and the student-participation system, and are collecting student demographics and learning data to continuously improve course content and learner performance.



Research Limitations and Future Research

As noted earlier, the data collected from the RQM students were primarily for the purpose of gauging the knowledge of the incoming students and ensuring that the course delivered important content in a way that was understandable and memorable. This analysis did not use random samples or experimental methods that would contribute to a rigorous scientific study. Future researchers may choose to close these obvious gaps in a more intentional way. In addition, post-testing performed at the end of the class does not guarantee the students will remember the information over the long term. Future research may wish to test students several weeks or months after graduation and assess the results of knowledge retention over time.

References

Blin, F., & Wilson, D. (1994). The use of pre-test and post-test in CALL [computer-aided language learning]: A case study. Computers and Education, 23(1), 143–150.

Chairman of the Joint Chiefs of Staff. (2012). Joint capabilities integration and development system (CJCSI 3170.01H). Washington, DC: Joint Chiefs of Staff.

Court, C. M. (2010). The manager in the muddy boots. Defense AT&L Magazine, 39(1), 12–16.

Kim, E. S., & Wilson, V. L. (2010). Evaluating pre-test effects in pre–post studies. Educational and Psychological Measurement, 70(5), 744–759. Retrieved from http://epm.sagepub.com/content/70/5/744

National Defense Authorization Act (NDAA) for Fiscal Year 2007, Pub. L. 109–364 § 801, 109th Cong. (2006).

Rumsfeld, D. (2003, October 31). Initiation of a joint capabilities development process [Memorandum]. Washington, DC: Office of the Secretary of Defense.

Sadhasivam, M. (2013). Introduction of pre-test and post-test enhances attentiveness to physiology lectures: Students’ perceptions in an Indian medical college. International Journal of Biomedical and Advance Research, 4, 341–344.

Wetstein, M. E. (1998, September 3–5). Assessment of learning in the American government course: Results from a pre-test/post-test methodology. Paper presented at the Annual Meeting of the American Political Science Association, Boston, MA.

Appendix

The RQM 310 Class Schedule

Table A1 illustrates when the DAU faculty administers the pre-course assessment and the end-of-course examination. The table also lists the course topics and uses a color code to illustrate the different class activities. Table A2 explains the color code.


Author Biographies

Dr. Charles M. Court is the Requirements Center director at the Defense Acquisition University. His career includes assignments as a Wild Weasel electronic warfare officer, a test realism manager, a program manager, and a laboratory supervisor. His teaching experience includes computer science, statistics, management, and physics. Dr. Court holds an MS in Physics from the Air Force Institute of Technology and a PhD in Management from Walden University. He holds Level III certifications in Program Management and in Systems Planning, Research, Development and Engineering.

(E-mail address: charles.court@dau.mil)

Mr. Gregory B. Prothero is the Requirements Center deputy director at the Defense Acquisition University and is the course manager for RQM 310, Advanced Concepts and Skills for Requirements Management. His military assignments include navigating operational C-130 missions, serving as Advance Agent for Air Force One, sponsoring Congressional travel as part of Air Force Legislative Liaison, and teaching as an assistant professor of Management at the United States Air Force Academy. He holds a Level C certification in Requirements Management and an MS in Operations Management from the University of Arkansas.

(E-mail address: gregory.prothero@dau.mil)

Dr. Roy L. Wood is the acting vice president of the Defense Acquisition University, and previously the dean of the Defense Systems Management College. He has served as the Principal Assistant Deputy Undersecretary of Defense for International Technology Security and as the director of the Militarily Critical Technologies Program. Dr. Wood holds an MS in Electrical Engineering from the Naval Postgraduate School, an MS in National Resource Strategy from the Industrial College of the Armed Forces, and a PhD in Organization and Management from Capella University.

(E-mail address: roy.wood@dau.mil)
