Naval Analytical Capabilities: Improving Capabilities-Based Planning (2005)

Suggested Citation:"3 Review of the Navy’s Analytical Processes and Methods." National Research Council. 2005. Naval Analytical Capabilities: Improving Capabilities-Based Planning. Washington, DC: The National Academies Press. doi: 10.17226/11455.

3
Review of the Navy’s Analytical Processes and Methods

THE COMMITTEE’S APPROACH

The committee framed its review of the Navy’s efforts to implement capabilities-based planning (CBP) by asking about (1) the Navy’s conceptual framework for CBP, (2) its analytic framework, (3) its explicit attention to future building blocks, and (4) its implementation at the level of personnel and organizations.

Conceptual Framework for Capabilities-Based Planning

Basic Ideas

As discussed in Chapter 2, the CBP approach is fundamentally about planning under uncertainty by emphasizing flexibility, robustness, and adaptiveness, while doing so within an economic framework. That is, choices must be made about how much risk of various types is tolerable, how to exploit opportunities for efficiency and effectiveness, and how to live with the budget that is finally decided upon by national authorities. The capabilities in question are to be outputs—measures of the ability actually to execute tasks, missions, and operations. These capabilities should also be conceived as joint capabilities, even though in some instances a particular joint capability may effectively be a Service capability (e.g., undersea surveillance).

“Planning under uncertainty” is not a mere slogan; it is the essence of CBP, which recognizes that the United States cannot reliably predict how its military forces will be used—against whom, for what purpose, and in what circumstances. Nor can it reliably predict precisely how conflict situations will evolve or how well each and every system and operation will work. Hedging is essential. But hedging is costly. How much is enough? And what framework is used to judge?

Simple Tests

Some simple tests can indicate whether a Service is applying CBP. One is whether the emphasis is on achieving capabilities rather than, as in prior periods, on platforms and weapons systems. A second test is whether options for achieving capabilities are joint and whether trade-offs cross Service boundaries where appropriate. A third test is whether risk, in its various dimensions, is considered. And a fourth, last but not least, is whether assessments are accomplished within an economic framework, which includes identifying funding sources for additions that would otherwise exceed the fiscal guidance.

Analytic Framework

Given a conceptual framework, an organization also needs a suitable analytic framework to conduct CBP, preferably one that is widely understood and that provides for the following:

  • An understanding of capability needs at the mission or operation level;

  • An understanding of aggregate capability needs (for theater and multitheater challenges);

  • The development and assessment of options for providing needed capabilities, including options that maintain the overall funding level specified by fiscal guidance; and

  • The assessment of options and trade-offs in an integrative portfolio-management structure suitable to Chief of Naval Operations (CNO)-level review.

Because issues arise at different levels (e.g., strategic, campaign, and mission), it follows that the analytic framework must be hierarchical, with a clear logic trail from the high-level constructs down to those in which one can see the critical components and subcomponents of capability that make operations successful. The relationships among levels of analysis cannot merely be asserted on the basis of implicit assumptions; instead, they must be derived from thoughtful, explicit analysis with conscious trade-offs.
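The requirement for a clear logic trail from high-level constructs down to the critical components of capability can be illustrated with a toy data structure. The sketch below is purely notional: the capability names, scores, and scoring scheme are invented for illustration and are not drawn from any actual Navy decomposition. The point is that every top-level assessment is computed from, and therefore traceable to, its lowest-level components rather than merely asserted.

```python
# Notional sketch: a hierarchical capability decomposition in which every
# high-level score is derived from, and traceable to, lower-level components.
# Capability names and scores are invented for illustration only.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Capability:
    name: str
    score: Optional[float] = None          # leaf-level assessment in [0, 1]
    children: list["Capability"] = field(default_factory=list)

    def assess(self) -> float:
        """Roll up: a parent is only as strong as its weakest child,
        reflecting a mission-system view in which all critical
        components must be healthy."""
        if not self.children:
            return self.score if self.score is not None else 0.0
        return min(child.assess() for child in self.children)

    def trail(self, indent: int = 0) -> str:
        """Render the logic trail from this node down to its components."""
        lines = [f"{'  ' * indent}{self.name}: {self.assess():.2f}"]
        for child in self.children:
            lines.append(child.trail(indent + 1))
        return "\n".join(lines)

# A hypothetical three-level decomposition.
sea_shield = Capability("Sea Shield", children=[
    Capability("Neutralize Submarine Threats", children=[
        Capability("Detect", score=0.4),
        Capability("Classify", score=0.7),
        Capability("Engage", score=0.8),
    ]),
    Capability("Counter Minefields", children=[
        Capability("Find Mines", score=0.6),
        Capability("Clear Mines", score=0.5),
    ]),
])

print(sea_shield.trail())
```

The min() roll-up here is a deliberately conservative choice: a weighted sum would let strong components mask a broken critical one, which is exactly the kind of implicit assumption the text warns against.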

Consideration of Future Building Blocks

In domains in which one seeks flexible, adaptive, and robust capabilities, effective solutions typically depend on developing appropriate capabilities as building blocks. These exist in the realms of systems and platforms, organization, and operations. Planning for the future in an uncertain era of dynamic change implies rethinking—and probably transforming—the building blocks. For the Navy, this will likely mean new strike groups, different concepts of manning, and new joint operations or new ways of conducting old ones.

Implementation—Moving Toward First-Class Analysis

Finally, conducting CBP well will require first-class analysis. Achieving this objective involves institutional issues and has major implications for staffing, organization, and reward systems.

With the background presented in this and the preceding subsections as a framework, the remainder of the chapter addresses the issues in the same order discussed above and provides the committee’s assessments and recommendations.

THE CONCEPTUAL FRAMEWORK

Does the Department of the Navy have a sound, top-level conceptual framework to guide capabilities-based planning? To address this question, the committee drew primarily on the following documents and briefings: (1) “Sea Power 21 Series” articles from U.S. Naval Institute Proceedings,1 (2) Naval Transformation Roadmap 2003,2 and (3) a set of briefings presented to the committee by the Office of the Deputy Chief of Naval Operations (DCNO) for Warfare Requirements and Programs (N70)3 and the Assessments Division of the Office of the DCNO for Resources, Requirements, and Assessments (N81);4 the Naval Air Systems Command (NAVAIR);5 and the Office of the DCNO for Manpower and Personnel (N1).6 The relationships of these Navy CBP efforts to approaches of the Office of the Secretary of Defense (OSD) and the Office of the Joint Chiefs of Staff (OJCS) are discussed in Chapter 4.

1  

ADM Vern Clark, USN, Chief of Naval Operations. 2002. Sea Power 21 Series—Part I, “Projecting Decisive Joint Capabilities,” U.S. Naval Institute Proceedings, October; VADM Mike Bucchi, USN, and VADM Mike Mullen, USN. 2002. Sea Power 21 Series—Part II, “Sea Shield: Projecting Global Defensive Assurance,” U.S. Naval Institute Proceedings, November; VADM Cutler Dawson, USN, and VADM John Nathman, USN. 2002. Sea Power 21 Series—Part III, “Sea Strike: Projecting Persistent, Responsive, and Precise Power,” U.S. Naval Institute Proceedings, December; VADM Charles W. Moore, Jr., USN, and LtGen Edward Hanlon, Jr., USMC. 2003. Sea Power 21 Series—Part IV, “Sea Basing: Operational Independence for a New Century,” U.S. Naval Institute Proceedings, January; VADM Richard W. Mayo, USN, and VADM John Nathman, USN. 2003. Sea Power 21 Series—Part V, “ForceNet: Turning Information into Power,” U.S. Naval Institute Proceedings, February; VADM Mike Mullen, USN. 2003. Sea Power 21 Series—Part VI, “Global Concept of Operations,” U.S. Naval Institute Proceedings, April; VADM Alfred G. Harms, Jr., USN, VADM Gerald L. Hoewig, USN, and VADM John B. Totushek, USN. 2003. Sea Power 21 Series—Part VII, “Sea Warrior: Maximizing Human Capital,” U.S. Naval Institute Proceedings, June.

2  

ADM Vern Clark, USN, Chief of Naval Operations; and Gen Michael Hagee, USMC, Commandant of the Marine Corps. 2004. Naval Transformation Roadmap 2003: Assured Access and Power Projection from the Sea, Department of the Navy, Washington, D.C.

This assessment addresses separately the Navy’s broad strategic approach, its system level of analysis, and its analysis at the mission and operational level, as in development of Program Objective Memorandums (POMs) responsive to strategic guidance.

Broad Strategic Approach

The committee concludes that the Department of the Navy has done a creditable job in laying out a broad strategic approach and has gone on to delineate sensibly the special responsibilities that the maritime Services have in national strategy and joint operations. The Navy’s approach is organized at the top level in terms of Sea Shield, Sea Strike, Sea Basing, and the enabling “glue” of FORCEnet, as indicated in Figure 3.1.7 These are supported by what are termed Sea Trial, Sea Warrior, and Sea Enterprise.

Were the planning to stop with this top level, it would produce little more than good viewgraphs, but the Navy has put considerable effort into assuring that all of the important functions of the department are mapped into this structure and that useful decompositions exist down to meaningful levels of detail (see the next subsection). Such breakdowns are always imperfect because of crosscutting factors, but the committee was satisfied that the structure largely makes sense. The structure will probably change over time as the Navy gains experience with the decomposition and makes adjustments, but the approach is sensible. This said, the committee notes that the approach is quite different from that being used in OSD and the Joint Staff (in Chapter 4, see the discussion of the Joint Capabilities Integration and Development System (JCIDS)), which involves functional capability areas identified as focused logistics, battlespace awareness, force application, force protection, command and control, network-centric operations, training, and force management. It is important that the Navy have clear mappings from its decomposition to the Department of Defense’s (DOD’s) functional capability areas if it is to participate and compete effectively in overall DOD planning. The mapping issue is nontrivial because the Navy capabilities, natural in an operations-oriented decomposition, depend on a number of the functional capabilities in JCIDS.

FIGURE 3.1 Top-level components of Sea Power 21. SOURCE: CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slide 10.

3  

CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass.

4  

LCDR Kenneth Masson, USN, N815, “Capabilities Based Planning,” presentation to the committee, July 27, 2004, Woods Hole, Mass.

5  

Patrick McLaughlin, NAVAIR, “Naval Analytical Capabilities and Improving Capabilities-Based Planning,” presentation to the committee, July 28, 2004, Woods Hole, Mass.

6  

Richard Robbins, N1Z, “N1 and Capabilities-Based Planning,” presentation to committee members, July 21, 2004, Navy Annex, Washington, D.C.

7  

CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slide 10.

The committee is also convinced that at the highest levels of the Navy and the Marine Corps there is a genuine commitment to jointness, not mere lip service, grounded in the recognition that jointness is a fundamental aspect of overall transformation for the new era in which the United States finds itself. At that highest level, as reflected in the core documents, it is appreciated that warfighting will almost always need to be joint in the future. Even so, enormous responsibilities will continue to devolve upon the maritime commanders.

Finally, the higher-level documents all reflect a commitment to flexibility, adaptiveness, and robustness. This is perhaps not surprising, since the Navy and Marine Corps have traditionally emphasized these qualities to a greater extent than have the Army and Air Force, which became more captive to planning for particular war scenarios.

In contrast, the committee was not persuaded that the translation of higher-level intentions into lower-level processes and practices is going well, as discussed below in the subsection “Operational Analysis for the Department of Defense and Office of the Chief of Naval Operations” and in the next major section, “Analytic Framework.” First, however, work at the system-command level is discussed.

System-Level Analysis

The committee was generally impressed by the presentations made by representatives of naval organizations two levels down from the highest level. These presentations had been generated by the Navy’s Space and Naval Warfare Systems Command (SPAWAR), the Naval Sea Systems Command (NAVSEA), and the Naval Air Systems Command. Here the committee saw evidence of managerial rethinking about organization, process, and products to support CBP. For example, in the briefings cited, the committee saw reference not only to analytical work on capabilities but also to life-cycle costs and business-case assumptions. The Navy has even reorganized to operate what it calls a Virtual Systems Command to increase agility and integration.8 Illustrative discussion of the Virtual Systems Command’s analytical process reflected a systems-engineering perspective, with the inclusion of systems of systems and connections to mission-capability packages and the discrete “things” that end up being line items in budgets. It also suggested a determination not only to identify overlapping capabilities but also to distinguish between desirable and undesirable redundancy and to identify both capability gaps and trade-offs. At least in a quick-look review, this class of work appeared to be professional and responsive to the new paradigms of CBP. Whether the Virtual Systems Command will work out is, of course, something that only experience will show.

The committee also heard a briefing from the Navy Warfare Development Command (NWDC),9 which addressed rather extensive fleet-based experimentation to support near-term assessments closely related to filling recognized capability gaps (e.g., those against small-boat attacks). This effort reflected a laudable Navy decision to reemphasize fleet-level experimentation and the accumulation of substantial empirical and analytical data. The experiments described, however, were all focused on the near term. Although all of them were clearly desirable and important, the committee was concerned that the effort might remain too exclusively concerned with near-term, incremental issues. The Navy leadership will wish to review issues of balance over time.

8  

Patrick McLaughlin, NAVAIR, “Naval Analytical Capabilities and Improving Capabilities-Based Planning,” presentation to the committee, July 28, 2004, Woods Hole, Mass.

9  

Wayne Perras, Technical Director, Navy Warfare Development Command, “What We Do/Who We’re Doing It For,” presentation to the committee, July 27, 2004, Woods Hole, Mass.


Operational Analysis for the Department of Defense and Office of the Chief of Naval Operations

In contrast to the experience described above, the committee found many reasons for concern at the level between top-level guidance and systems command (SYSCOM)-level work. Here the committee observed severe disconnects between top-level intentions and reality in the ranks. When the committee asked about excursions and exploratory analysis around baseline assumptions, briefers reported that very little had been done. Thus, while some viewgraphs had been changed to be consistent with CBP, much of the ongoing work still had the problems of the previous era, particularly those surrounding point-scenario analysis. Although the problems seen by the committee may have been temporary, they appeared more likely to be chronic. If so, the Navy should recognize them as systemic, indicating deep-seated issues, and act accordingly. Corrective measures will take much more than top-level documents, because staff take their lead from the myriad actions and priorities expressed over time. These problems are discussed more fully in the next section.

THE ANALYTIC FRAMEWORK

Understanding Needs at Mission and Operation Levels

Description of Analytic Approach at Mission and Operation Levels

The Department of the Navy’s core documents include useful decompositions from high-level components (e.g., Sea Strike) down to meaningful levels of detail. It is always a matter of judgment how far to carry such breakdowns. As one goes into more detail, issues and tasks become increasingly well defined and challenges become more explicit. However, excessive decomposition also generates a morass of detail that is not useful for higher-level planning.10 And, to make things worse, it can introduce biases by “hard-wiring” the way in which missions and higher-level tasks are to be performed. In capabilities-based planning, it is desirable to stop decomposing before that happens or, at least, to carry along alternative decompositions reflecting alternative concepts of operation.

As can be seen from Figure 3.2, the Department of the Navy’s objectives-to-challenge decomposition structure goes down three to four levels. For example, in the FORCEnet component (close to bottom right), it goes down to the level of “Detect and Identify Targets/Moving Land Targets.” This level of specificity is useful for highlighting an important mission that is very different from other detect-and-identify missions, and one that the Navy has not traditionally emphasized. If the mission to engage moving land targets had been left implicit, it might not receive adequate attention.

10  

Detailed decomposition is, however, valuable for defining the myriad detailed tasks that must be mastered, supported, and coordinated. All Services are required to prepare detailed decompositions, and the results are published as the Universal Joint Task List by the Joint Staff.

By and large, the committee concludes that the decomposition shown in Figure 3.2 is suitable as a top-down depiction. In particular, it has enough detail that responsibilities for follow-up work can be assigned meaningfully. And although there are scores of capability areas indicated (counting at the lowest level), the number is small enough to be managed. What matters, of course, is that for each of these capability areas, the Department of the Navy do in-depth analysis to assess needs, capabilities, and improvement options. The committee could not assess that effort in depth in a cursory review. However, Figure 3.3 illustrates how, for a large subset of items in Figure 3.2, the Navy has sought to assess capabilities versus time. For example, in the highest bar (“Neutralize Submarine Threats”), the Navy’s assessment is that the ability to neutralize submarine threats will improve from poor (black) to marginal (gray) within the time period shown (roughly through 2020). In contrast, much better progress is projected for countering minefields (by what mechanism was not made clear to the committee). The assessments were the result of subjective warfighter estimates, informed also by the results of POM-06 campaign analyses and mission-level analyses. The process used to obtain the estimates was neither rigorous nor satisfactory to participants, but it was a systematic first effort that can be refined with time.

The analytical approach being employed, then, appears to be that of using the decompositions, examining needs and capabilities in each area, and projecting changes over time in high-level depictions. That approach is reasonable and consistent with the need in CBP to go to the mission level rather than merely reporting results of theater-level campaigns in particular scenarios. The committee was surprised by the results shown in Figure 3.3 (almost none of the assessments improve beyond marginal (gray) or poor (black), suggesting that a hard look at the criteria used would be appropriate), but the assessment process was at least a good beginning for something that can be much enriched over time.

So far, so good. Unfortunately, the committee’s assessment was that many problems exist at the next level of analytical detail, as discussed below.

Assessment of Analytic Framework for Mission and Operation Levels

In assessing the Navy’s mostly implicit analytic framework, the committee drew on its experience and looked for generic problems that often beset analysis that is intended to, but does not actually, support capabilities-based planning. The Office of the Chief of Naval Operations (OPNAV) will wish to review the situation when this report emerges, but the generic problems are as follows:


FIGURE 3.2 Decomposition of capability needs. SOURCE: CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slides 11-14. NOTE: SOF, Special Operations Force; CBRNE, chemical, biological, radiological, nuclear, explosives; C2, command and control; AFSB, afloat forward staging base; PNT, precision, navigation, and timing.


FIGURE 3.3 Projected capability versus time (roughly through 2020) for each of the capability areas. SOURCE: Adapted from CAPT Terry McKnight, USN, N70, “Naval Capabilities Development Process,” presentation to the committee, July 27, 2004, Woods Hole, Mass., slide 25. NOTE: PNT, precision, navigation, and timing; SOF, Special Operations Force. Key: Poor – black; Marginal – gray; Good – white.

Suggested Citation:"3 Review of the Navy’s Analytical Processes and Methods." National Research Council. 2005. Naval Analytical Capabilities: Improving Capabilities-Based Planning. Washington, DC: The National Academies Press. doi: 10.17226/11455.
×
  • Trivializing uncertainty issues by examining a variety of name-level (or specific) scenarios (e.g., in a DOD context, the “WestPac” scenario), but approaching each such scenario in the traditional, myopic way, holding constant all of the many assumptions that distinguish one case from another within the scenario. For the purposes of CBP, there are often more important variations across cases within a given name-level scenario than across name-level scenarios.

  • Trivializing exploratory analysis, which is a core element of CBP, by conducting only a few excursions with one or a few assumptions changed (typically in organizationally comfortable ways), while other assumptions are held constant as though certain.11

  • Focusing on scenarios preferred by an organization through emphasis on detailed scenario assumptions that stress the particular organization’s issues and dramatize its role, without presenting a more holistic representation of how the organization fits into a larger enterprise and its activities.

  • Relying upon large, complex, and inflexible models and databases, which often bury issues and preclude exploration.
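To make the second problem above concrete, genuinely exploratory analysis evaluates outcomes across the full space of case assumptions within a scenario rather than one or two excursions from a baseline. The following sketch is purely illustrative: the assumption dimensions, their values, and the toy outcome model are invented, standing in for the campaign- and mission-level models a real study would use.

```python
# Illustrative sketch of exploratory analysis: instead of a few excursions
# from a baseline, evaluate the outcome over the full cross-product of the
# uncertain assumptions. All dimensions, values, and the outcome model are
# invented for illustration.

from itertools import product
from collections import Counter

assumptions = {
    "warning_days":      [2, 7, 14],
    "bases_available":   [0, 1, 3],       # usable regional air bases
    "enemy_advance_kmd": [10, 30, 70],    # enemy advance rate, km/day
    "defense_suppression_days": [1, 5, 10],
}

def outcome(case: dict) -> str:
    """Toy scoring: territory lost before interdiction becomes effective."""
    delay = case["defense_suppression_days"] + max(0, 5 - case["warning_days"])
    delay -= case["bases_available"]       # bases speed up sortie generation
    lost_km = case["enemy_advance_kmd"] * max(delay, 0)
    if lost_km < 50:
        return "good"
    return "marginal" if lost_km < 300 else "poor"

results = {}
for values in product(*assumptions.values()):
    case = dict(zip(assumptions.keys(), values))
    results[tuple(values)] = outcome(case)

# Summarize the landscape rather than reporting a single point case.
print(Counter(results.values()))
```

Even this toy version makes the contrast visible: a point-scenario analysis reports one cell of the 81-case space, while the summary shows how outcomes shift as warning time, basing, and suppression assumptions vary jointly.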

The Department of the Navy analysis of a traditional major-war scenario, briefed to the committee at its July 2004 workshop, reflected serious problems in each of the categories listed above. There was very little discussion or appreciation of uncertainty, except sometimes in boilerplate slides. Worse, this vacuum seemed to be regarded by the analytic staff as the norm. Indeed, it was claimed that there was insufficient time to do many excursions. The committee finds this trivializing of uncertainty issues quite troubling and inconsistent with fundamental tenets of capabilities-based planning. In the committee’s collective experience, this is a general problem, not one confined to the Navy staff. Analytical organizations in the DOD frequently lack sufficient breadth in the tools, staff, and experience needed to carry out such broad analysis adequately, particularly in the current threat environment. This lack needs to be remedied quickly.

To make matters worse, as discussed in Chapter 4, the scenarios used in the OPNAV analysis did not include the DOD standard scenarios established by the OSD and the OJCS. Instead, they were scenarios, and case assumptions within those scenarios, chosen to stress naval capabilities in particular. It is entirely appropriate that such cases be examined, but not by themselves. It was difficult to avoid the conclusion that a reason for choosing the scenarios and cases was the classic motivation of “making the Navy case” in the battle for funds rather than objectively describing the analytic landscape.

11  

An older example of this problem was the tendency of DOD studies to assume that adversaries were highly capable and motivated, thereby biasing analysis against concepts of operations calling for rapid action by small forces. A more recent example was the analysis for the Iraq war that uncritically accepted assumptions about the nature of the postwar environment rather than laying out starkly the consequences of different assumptions. In the Navy context, key assumptions often relate to the availability of other-Service assets, concepts of operation, and enemy strategy.

Again, it is entirely appropriate that the Navy identify the cases that cause it stress and make their implications known to policy makers. However, there is an obligation to work and report on DOD standard cases, and to put Navy-stressful cases into analytical perspective. Moreover, first-rate capabilities-based planning in the modern era must be conceived in a joint context, with serious steps taken to identify how to exploit jointness effectively in mitigating problems that would classically have been seen as Service-level problems. If, for example, some Air Force bases that could potentially be used are assumed unavailable, it does not follow that a war must now be fought from carrier strike forces alone. Other options need to be addressed: What could be accomplished with long-range bombers (B-1s, B-2s)? What regional bases might reasonably be available and feasible to protect during operations? How would that change as a function of political-military considerations and the availability of ballistic-missile defenses?

In short, there was a stark disconnect between the CNO’s stated intentions with respect to jointness and practice in carrying out these intentions. On the one hand, top-level documents and briefings by the Assessments Division of the Office of the DCNO for Resources, Requirements, and Assessments (N81), among others, emphasized jointness and the need to consider a broad range of both scenarios and cases within them—a major departure from analysis in past years. On the other hand, OPNAV analytic work looked more like business as usual: narrow and isolated from joint analytical efforts. The causes of this disconnect are not clear to the committee. Although the committee is convinced that the Navy leadership is fully committed to jointness, it nonetheless believes that Navy leadership needs to do more to carry out its intentions.

Understanding Aggregate Capability Needs

Description of and Rules of Thumb for Aggregation

Even if Department of the Navy assessments of mission- and operation-level capability needs were perfect, the problem would remain of how to aggregate these needs. How much antisubmarine warfare (ASW) capability is needed? In how many places would ASW be conducted simultaneously against troublesome threats (e.g., diesel submarines)? How much ground-strike capability is needed in an era of precision weapons when such weapons can be called to bear from multiple sources (land-, sea-, and air-based), the sources can communicate rapidly, and all are locked together in the same time-and-space coordinate system?

In addition to questions about how much capability is needed, another aspect of understanding aggregate needs is characterizing how quickly capabilities are needed and how quickly missions need to be accomplished. By way of analogy, the need to be able to interdict ground forces has been recognized for many decades. It was only when the need arose to halt enemy ground forces quickly (within days rather than weeks, and with only kilometers rather than hundreds of kilometers of lost territory) that the need developed intensity and posed serious challenges. A number of studies were done between 1993 and 2001 to better understand what could be accomplished in this regard and how it translated into requirements for forward-deployed forces, deployment rates, per-sortie effectiveness, minimization of the time required to suppress air defenses, and so on. There was no single answer, because how much is enough depended on assumptions about usable warning, regional access to bases, the size of the enemy force, its rate of advance during a continuing interdiction campaign, and so on. Nonetheless, some conclusions were evident, such as the necessity of having forces available on D-Day and of being able to use them immediately, without the lengthy defense-suppression campaign of the type that had been assumed in most studies of the early-to-mid-1990s.

Some of the significant issues today, with obvious implications for assured access and Navy requirements, include whether to assume the availability and continued viability of regional bases for the Air Force to use, the warning time available to the Air Force, and possible political constraints on the use of those bases in times of crisis (particularly in Japan).

These are classic problems. They are also the problems for which strategic planning guidance is intended to be both helpful and, yes, directive. The Navy needs to provide objective analysis that describes sharply the potential consequences of overly optimistic assumptions about the early use of massive Air Force assets, but that analysis also needs to describe clearly (1) how maritime operations could be enhanced by the appropriate use of other-Service assets (long-range bombers, surveillance systems, conventional ground forces, and special forces) in those stressful cases; (2) how maritime forces would be supportive rather than dominant in other important cases (e.g., when they might be critical in the first days for theater-opening purposes, but less so thereafter); (3) the decreasing plausibility of the maritime Services having to deal with particularly stressful circumstances simultaneously in multiple theaters; and (4) how the maritime Services would cope with plausible simultaneous operations in other theaters while focused on operations in the principal theater.

Some rules of thumb for aggregation (roll-up) are as follows:

  • When working at the mission level (see the preceding subsection), mission-system analysis should be required—by which is meant that all of the critical components of capability must be healthy or the overall assessment of that capability should be ranked poorly.12

12  

This is akin to using a multiplication scheme rather than a linear weighted sum in aggregating upward.

  • When addressing force structure, bad cases such as simultaneous conflicts should be taken seriously, but “requirements” should be developed in recognition that crises must often be addressed by the nation sequentially for many reasons, and that in the event of simultaneous crises, delay tactics may be appropriate.13

  • Similarly, while it is important and necessary to emphasize rapidity of response, this can also be overdone. The value of timeliness is highest for the first increments of capability; it diminishes thereafter—especially when it is recognized that real conflicts almost invariably unfold much more slowly than in studies, for numerous reasons having to do with decision making, ambiguity, and the need to avoid the high operational risks associated with committing to a course of action before it can be sustained (as in the march to the Chinese border in 1950). The point here is not to denigrate rapid response, the importance of which can be immense, but rather to avoid excess. Although it might appear that the Navy would get more return for its investment if it stressed rapid deployment of eight carrier strike groups, questions need to be addressed such as the trade-off between slipping the arrival time of the last two carrier strike groups, say, and the ability to retain personnel, to invest in experiments with new kinds of strike groups, and so on.14
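The "mission-system" roll-up contrasted with a linear weighted sum in footnote 12 can be sketched as follows. This is an illustrative sketch, not from the report; the component scores and weights are hypothetical.

```python
# Contrast a linear weighted sum with a multiplicative "mission-system"
# roll-up, in which any weak critical component drags down the overall
# capability score (all scores on a 0-1 scale; numbers are hypothetical).

def weighted_sum(scores, weights):
    """Linear roll-up: a strong component can mask a broken one."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

def mission_system(scores):
    """Multiplicative roll-up: the overall assessment is no better than
    the product of its critical components."""
    product = 1.0
    for s in scores:
        product *= s
    return product

# Hypothetical component scores for one mission: sensors, C2, shooters.
scores = [0.9, 0.2, 0.9]           # command and control is nearly broken
weights = [1.0, 1.0, 1.0]

print(round(weighted_sum(scores, weights), 2))  # 0.67 -- looks tolerable
print(round(mission_system(scores), 2))         # 0.16 -- flags the weak link
```

The weighted sum averages away the failing component, while the multiplicative scheme ranks the overall capability poorly, as the bullet above requires.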

Assessment of Analysis of Aggregate Capability Needs

As is being widely discussed across the entire Department of Defense, there are serious problems in assessing overall force-structure needs.15 Some of these are inherent consequences of uncertainty, while others reflect various types of parochialism. The committee is encouraged that parts of the Department of the Navy are now vigorously pursuing concepts for improving effectiveness that were previously off the table. These include the Fleet Response Plan (with higher readiness levels that enable more flexible response), rotational crews, and the creation of expeditionary strike groups (ESGs).

These inventive solutions seem very much a response to the demands of the DOD’s strategic objectives for defending the homeland and conducting military operations. As so often happens in Service-based analysis, however, the innovations did not always have visibility in the capabilities analysis that was probably seen as relating to Navy “requirements” and in the battle of the budget. The committee would like to have heard about a capabilities-based exploration of the possible trade-offs between carrier strike group (CSG) force structure and a heightened readiness posture, or even a more radical approach such as trading off CSGs against ESGs and aerospace expeditionary forces.

13  

In DOD-level planning, this has sometimes been discussed under the rubric of “strategies” of “win-win” versus “win-hold.” The current strategy goal described as 1-4-2-1 is the result of considerable debate on such matters.

14  

Some of the analysis essential in answering these questions is being done. The committee heard Ariane L. Whittemore, N4, “Fleet Response Plan and the Integrated Readiness Capability Assessment (IRCA),” presentation to the committee, July 28, 2004, Woods Hole, Mass.

15  

Such discussions are occurring, for example, in meetings devoted to the implementation of capabilities-based planning within the DOD. See also, “Initiation of a Joint Capabilities Development Process,” memorandum from the Secretary of Defense, October 31, 2003, to the Service Secretaries, Chairman of the Joint Chiefs of Staff, and others, setting a goal and issuing guidance to “achieve a streamlined and collaborative, yet competitive, process that produces fully integrated joint warfighting capabilities.”

Metrics

Establishing metrics is another major challenge throughout the DOD, as emphasized by the Secretary of Defense.16 Managers want metrics, as do oversight groups such as the U.S. Congress, the Government Accountability Office, and the Office of Management and Budget. Unfortunately, using the wrong metrics can be seriously counterproductive, and metrics that can be used internally in sensible ways can be used mischievously by organizations.

Following are some general principles for metrics in aggregate-level planning, such as force-structure studies. The first principle is to develop metrics as spin-offs of operational analysis, so that the metrics fit naturally into analysis of capabilities to accomplish missions and operations. This is in contrast with emphasizing simpler, bean-counting types of metrics such as numbers of platforms.

Goals for metrics should be based on analysis and realistic assumptions about technical feasibility. They should not be established ad hoc by decision makers who would like to be able to legislate magic. If metric goals are unrealistic, they corrupt not only subsequent analysis (who wants to go forward with an analysis showing that the boss’s goal cannot be reached!), but also broader aspects of the organization’s management.17

The metrics, however, should build in the emphasis of capabilities-based planning on flexibility, adaptiveness, and robustness. This can be done, for example, by reporting mission outcomes in an exploratory analysis across cases rather than reporting the mission outcome for some allegedly representative point case.

An example for a naval application might be that of assessing the distance required to stop an enemy’s maneuver force with a combination of aircraft and missiles. Instead of looking at a particular point scenario, CBP would treat at least the following as key variables: (1) the number of forward-deployed aircraft and missiles (shooters), (2) the speed of the enemy’s maneuver, (3) the effectiveness per missile shot or aircraft sortie, and (4) the enemy’s cohesion.18

16  

See Charles Kelley, Paul Davis, Bruce Bennett, Elwyn Harris, Richard Hundley, Eric Larson, Richard Mesic, and Michael Miller, 2003, Metrics for the Quadrennial Defense Review’s Operational Goals, RAND, Santa Monica, Calif.

17  

The committee’s concern is motivated in part by other Services having postulated unachievable capabilities, which then led to analysis “supporting” (but really just saluting) the postulates, wheel spinning, and loss of time from pursuing poor courses of action. Goals should be ambitious, but should also be rooted in the real world.

FIGURE 3.4 A notional example of seeing a metric as outcome across a case space (exploratory analysis). See the discussion of outcomes in text.

Figure 3.4 illustrates schematically how analysts could look at the simultaneous variation of these four variables. In the top left quadrant of the figure, the enemy moves a long distance before being stopped, because the maneuver speed is high, kills per sortie or shot are low, and the enemy’s cohesion is high. Even having a large number of shooters available would not result in success. Success (stopping the attacker after only a short distance, as indicated by the horizontal dashed line) could be achieved in several ways, as shown by results in other quadrants of the figure. If kills per sortie or shot were increased (top right), then success would be achieved, but only if maneuver speed were low. Similarly, if the enemy’s cohesion were low, so that maneuver stopped with relatively light attrition, success could be achieved (bottom left). But if effectiveness were high and enemy cohesion were low, success could be achieved even if the number of forward-deployed shooters were smaller (bottom right).

18  

Actual curves would not be linear. For exploratory analysis of the counter-maneuver problem (albeit, one that preceded the war in Iraq), see Paul K. Davis, Jimmie McEver, and Barry Wilson, 2002, Measuring Interdiction Capabilities in the Presence of Anti-Access Measures: Exploratory Analysis to Support Adaptive Planning for the Persian Gulf, RAND, Santa Monica, Calif.
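An exploratory sweep of the four variables above can be sketched as follows. The halt-distance model and all numbers here are deliberately simple, hypothetical stand-ins, not anything from the report (whose actual curves would not be linear):

```python
# Hypothetical exploratory-analysis sketch: sweep the four key variables
# and record which cases halt the attacker within an acceptable distance.
from itertools import product

def halt_distance(shooters, speed_km_day, kills_per_shot, cohesion,
                  force_size=1000, shots_per_day=2):
    """Distance advanced before cumulative attrition reaches the enemy's
    breakpoint (its 'cohesion', as a fraction of force size)."""
    kills_per_day = shooters * shots_per_day * kills_per_shot
    days_to_break = (cohesion * force_size) / kills_per_day
    return speed_km_day * days_to_break

cases = product([50, 200],        # forward-deployed shooters
                [20, 60],         # enemy maneuver speed (km/day)
                [0.5, 2.0],       # kills per shot or sortie
                [0.2, 0.5])       # cohesion (breakpoint fraction)

GOAL_KM = 100  # halting within 100 km counts as success
for shooters, speed, kps, coh in cases:
    d = halt_distance(shooters, speed, kps, coh)
    print(f"shooters={shooters:3} speed={speed:2} kps={kps} coh={coh}: "
          f"{d:7.1f} km {'OK' if d <= GOAL_KM else 'fail'}")
```

Reporting the whole case space this way, rather than one "representative" point, is the essence of the exploratory-analysis approach the text describes.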

As of July 2004, it appeared to the committee that Navy studies under the rubric of capabilities-based planning did not typically have sound and sophisticated metrics, particularly metrics reflecting uncertainty. This shortcoming appeared less pronounced in more detailed analysis, such as that briefed to the committee by the systems commands, but no strong conclusions can be drawn.

Characterizing Capabilities

One aspect of capabilities-based planning is keeping track of the multiple dimensions of capability and how programs project improvements in it. Figure 3.5 illustrates this type of effort with a purely notional, spider-chart depiction. It characterizes the Navy’s capabilities along several axes: “Control seas,” “Assure early access,” “Maintain presence,” “Project force inland,” “Defend homeland from missile attack,” and “Defend allies and deployed forces from missile attack.”

FIGURE 3.5 Notional capabilities-based planning depiction of present Navy capabilities (heavy dashed line) and future goals (heavy solid line). NOTE: Numbers represent percentages.

The inner, dashed contour suggests that capabilities are strong for two axes—“Control seas” and “Maintain presence”; fairly strong for another—“Assure early access”; and not very strong for the others. One possible goal for future Navy capabilities would be to achieve the outer, solid contour. That would require additional emphasis on ballistic-missile defense, early access, and the ability to project force inland even in difficult circumstances.
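The notional spider-chart comparison can also be captured as data. The percentages below are hypothetical stand-ins for the figure’s values, chosen only to echo its qualitative pattern:

```python
# Hypothetical current-vs-goal percentages along the Figure 3.5 axes,
# and the gap each axis would need to close (all numbers invented).
current = {"Control seas": 90, "Assure early access": 60,
           "Maintain presence": 90, "Project force inland": 40,
           "Defend homeland from missile attack": 20,
           "Defend allies and deployed forces from missile attack": 30}
goal = {"Control seas": 90, "Assure early access": 85,
        "Maintain presence": 90, "Project force inland": 75,
        "Defend homeland from missile attack": 60,
        "Defend allies and deployed forces from missile attack": 70}

gaps = {axis: goal[axis] - current[axis] for axis in current}
# List the largest capability gaps first, as a planner might.
for axis, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{axis}: gap of {gap} points" if gap else f"{axis}: on track")
```

Such a tabulation makes explicit which axes (here, missile defense and inland power projection) drive the distance between the dashed and solid contours.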

General Attributes of Rigorous Analysis

Another major concern of the committee relates to the need for first-rate analysis to be rigorous, documented, transparent, and as objective as possible. Rigor is a matter of degree. High-level decisions on programs and budgets depend on analysis being approximately right and appropriately insightful, not on high precision. Nonetheless, viewgraphs do not constitute analysis, nor do viewgraphs plus assertions about how hard people have worked. Good analysts universally acknowledge that the discipline involved in writing down assumptions and working carefully through the logic—that is, generating documentation—is exceedingly important.

The committee’s impression is that OPNAV-level analysis, by contrast with rigorous analysis, is more ad hoc, undocumented, and rather opaque on key assumptions, and that it tends to have an advocacy bias and is constructed to focus only on particular issues (e.g., what might be needed to deal with certain bad-case naval scenarios). The committee did not see broad areas of choice with a hard-edged assessment of strengths and weaknesses being presented to the leadership. The OPNAV-level analysis is worthwhile in some respects, but it is not yet of the quality appropriate for senior decision making.

Choosing Among Options in a Portfolio Framework Suitable to Top-Level Needs

Developing the appropriate portfolio views is a complex undertaking that is highly dependent on the particular organization and decision context.

Strategic Planning Versus Operations Research

As suggested above, the current approach of OPNAV to analysis appears to be one of presenting charts that indicate adequacies and shortfalls (see, e.g., Figure 3.3), by capability area, and presenting occasional operations-analysis charts illustrating particular points. For example, the committee was briefed on some interesting work by the Assessments Division of the Office of the DCNO for Resources, Requirements, and Assessments (N81) that examined “Scud hunting problems” in more technical depth and with more operationally realistic assumptions than usual. As a result, different conclusions were suggested about the potential mix of satellites and unmanned aerial vehicles and about the speed needed for air-to-surface missiles.

Such work is useful and has some of the features of portfolio-management-style work in that one can look critically at the various capability areas and see where the greatest shortcomings exist (subject to the appropriateness of the underlying assumptions). That process, in turn, can lead to suggestions about resource allocation. It is, nonetheless, an operations-analysis perspective rather than one ideally suited to resource allocation.

Assessment of the Navy’s Portfolio-Management-Style Analysis

Strategic-Level Portfolio Analysis. In the committee’s view, portfolio-management-style presentations for the CNO and his top leaders should often have a more strategic, top-down character, and should more explicitly address economic issues. The committee did not see much economics-sensitive analysis, although some had been done in the preparation of the briefings that the committee was given. It seems, however, that the emphasis in the Navy’s CBP is exclusively on identifying shortfalls and finding ways to fund them. Notably absent, except at the systems command level, is the search for opportunities to accomplish missions effectively but at less cost, thereby freeing up funds for other purposes.


Zooming. Portfolio work should allow for zooming in on an area as desired, so that the basis for high-level charts can be examined in depth. It appears, however, that there is minimal rigor in the Navy’s current assessments and no systematic way to trace the assumptions and logic from a top-level portfolio view to deeper capabilities analysis in which assumptions and their consequences could be seen parametrically. Arguably, this type of ability requires a family-of-models approach.

The Navy is working to establish such a family of models, for which the CNO has provided funds. The architecture for what is needed, however, was not clear to the committee. It must include high-quality policy and system analysis, not just more investment in big models and simulations. It should also be tied to real-world data, not just to simulation.19


Highlighting Risks. Highlighting types of risk is a key part of portfolio-style analysis. Examples of different types of risk include the following: technical risk (Will a system in development work?), program risk (Will the program slip in time or have cost overruns?), future technology risk (Is the base being laid, in research and development (R&D), for necessary future systems?), and strategic risk (Will the capability developments prove seriously inadequate because of changes in the strategic environment or national policy?).20

19  

National Research Council. 2004. The Role of Experimentation in Building Future Naval Forces, The National Academies Press, Washington, D.C.
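One minimal way to organize such risk highlighting is a per-option scorecard. The options and scores below are entirely hypothetical; only the four risk labels come from the text:

```python
# Hypothetical portfolio-style scorecard: each option is scored
# (1 = poor ... 5 = strong) on capability and on the four risk types
# named above, and the weakest risk area is surfaced for leadership.
RISKS = ["technical", "program", "future technology", "strategic"]

options = {
    "Raise CSG readiness":  {"capability": 4, "technical": 4, "program": 3,
                             "future technology": 2, "strategic": 2},
    "New ESG experiments":  {"capability": 3, "technical": 3, "program": 3,
                             "future technology": 4, "strategic": 4},
}

for name, scores in options.items():
    worst_risk = min(RISKS, key=lambda r: scores[r])
    print(f"{name}: capability={scores['capability']}, "
          f"weakest risk area={worst_risk} ({scores[worst_risk]})")
```

Even this crude structure keeps the different risk types visible side by side, rather than collapsing them into a single shortfall number.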


Adaptive Options. Strategic options should be explicitly adaptive and should hedge against their key assumptions’ proving to be wrong. One way for the Navy to do that is to consider seriously the broad range of scenario classes identified in the Strategic Planning Guidance (SPG).21 Another is to recognize that the SPG itself is a baseline, not a definitive roadmap into the future. Indeed, key assumptions of the SPG will likely change with administrations and with strategic developments in the world. Thus, the Navy’s planning should also consider how robust its program would be in the event of such changes. The Navy should have contingency plans for such possibilities as (1) a greatly increased emphasis on defending the homeland from missile attacks (e.g., from containerized missiles) and (2) much-greater-than-expected threats to aircraft carriers, even at rather long ranges. These possibilities are offered purely as examples.

It appears that current Department of the Navy work does not include true strategic options (e.g., adaptive options that hedge against events unfolding in unexpected ways). It is too formulaic and too slavishly responsive to CNO and DOD guidance, without providing feedback that might help reaffirm or adjust that guidance.


Implications for Personnel. Presenting broad, discerning, strategic-level analysis for the CNO requires a higher level of analysis than that characteristic of systems analysis or operations research. This broad analysis is in the realm of strategic planning and policy analysis. Current personnel requirements for OPNAV analysis are predominantly limited to capabilities and experience possessed by operations-research-oriented personnel (and even those requirements are often not met). Revised personnel requirements and personnel policy changes would make it more plausible that those chosen would be rewarded with promotion.

20  

Portfolio methods are discussed in the following work and references therein: Paul K. Davis, 2002, Analytic Architecture for Capabilities-Based Planning, Mission-System Analysis, and Transformation, National Defense Research Institute, RAND, Santa Monica, Calif. Current applications to the Missile Defense Agency highlight these particular risks. Unpublished discussion of the subject arose in a summer 2004 workshop at the Naval War College sponsored by OSD’s Director of Net Assessment. Many portfolio-related discussions, of course, can be found in the business literature, some of which are relevant even though the DOD and military Services do not have simple “bottom lines” against which to measure everything. See, for example, Robert S. Kaplan and David P. Norton, 1996, The Balanced Scorecard: Translating Strategy into Action, Harvard Business School, Cambridge, Mass.

21  

Department of Defense. 2004. Strategic Planning Guidance, Secretary of Defense Donald Rumsfeld, Washington, D.C. (draft). (Classified)


The Need for Assimilating Capabilities-Based Planning Principles in Navy Analytic Processes

The committee based the following recommendation on the assessments in this section. The recommendation will remain applicable until Navy leaders, after reviewing the situation, conclude that they have eliminated the disconnects referred to above between leadership intentions and day-to-day analysis.


Recommendation 1: The Chief of Naval Operations should reiterate principles of capabilities-based planning and ensure that they are truly assimilated in Navy analytic processes.

The criteria for implementing Recommendation 1 include the following: The work accomplished should be joint and output-oriented, with the ability to actually execute operations as output. Successful CBP will require analysis over a broader scenario space, extensive exploratory analysis within specified scenarios, development of options both to solve capability problems and to achieve efficiencies, and portfolio-style assessments of those options at different levels of detail. The portfolio-style assessments should assist in making difficult trade-off decisions and should also address various types of risk that Navy leadership must take into account. Strategic options should be adaptive, because world developments and technological developments will undoubtedly force changes—the potential need for which is not much discussed in the DOD’s Strategic Planning Guidance.

Although the committee did not discuss tools to support the types of analysis referred to here, it is quite aware that analytic organizations have trouble responding to the demands of good capabilities-based planning. The difficulties are rooted in excessive dependence on large, complex models and related databases; in management demands for detail; and in the ways in which analyses have been framed and conducted. Breaking these molds will not be easy. It will require a family-of-models approach that includes links to war-gaming and experimentation, but that must also include an often-ignored component: “smart,” low-resolution modeling and analysis that can support exploratory analysis (grounded in higher-resolution work or empirical data when appropriate) and put a premium on higher-level insights rather than focusing on minutiae.


Recommendation 2: The Chief of Naval Operations and the Secretary of the Navy should ensure that the Navy invests in defining and developing the new generation of analytic tools that will be needed for capabilities-based planning.

Some of the attributes needed in tools include the following: agility in low-resolution modeling coupled with the ability to go into greater depth where needed (achievable with a sophisticated family of models and games); the ability to represent network-centric operations well (including publish-subscribe architectures, rather than node-to-node representations); and the ability to deal with challenges such as those that the OSD refers to as disruptive, catastrophic, and nontraditional scenarios.

The committee is aware that the CNO has funded new work on a family of models. It is quite possible, however, that the funds will quickly be exhausted in improvements to “big models” and databases, with little benefit for higher-level capabilities-based planning, as described above. The committee encourages a balanced use of funds, including the potential purchase or use of available off-the-shelf tools.

It is not possible for the committee to make more detailed suggestions here without a more extensive study. The committee notes, however, that examples of the kinds of tools mentioned above have been developed and applied.

FUTURE BUILDING BLOCKS

Flexible, robust, and adaptive capabilities invariably stem from having “building blocks” that can be quickly assembled, tailored, and used in diverse ways. Such capabilities require suitable building blocks, an appropriate command-and-control system, doctrine, and “practice, practice, practice.”

The Naval Services have always been relatively good at such things. Carrier battle groups were building blocks tailored to their theater; today’s strike groups have evolved a great deal since the Cold War, and the Navy is actively considering an even wider range of employment options. Marines have always done building-block planning, which explains the absence of a standard Marine Expeditionary Force.

Building blocks come in different forms: equipment (e.g., platforms), organization (e.g., carrier strike groups), and operations (e.g., for conducting long-range air strikes or mounting a surprise assault by Marines). Building blocks are also hierarchical. And, in today’s world, networking allows more and quicker tailoring and adaptation, as well as leveraging of the capabilities of individual platforms, units, and suboperations.

Overall, the Department of the Navy appears to be addressing the building-block issues vigorously. Problems are likely to occur, however, such as that of allowing important future building-block innovations to slip away when funding becomes tight. For example, funding the full contingent of carrier strike groups and raising their readiness for rapid deployment (up to eight strike groups within a specified number of days) might come at the expense of more actively pursuing non-carrier strike groups or next-generation carriers that would be more difficult for a future adversary to attack. To help reduce the likelihood of such problems, the Department of the Navy should conduct a review of future building-block options and focus on those designed to increase the range of decision options available to the top leadership. This could be accomplished as part of the actions suggested by Recommendation 1.


IMPLEMENTATION—MOVING TOWARD FIRST-CLASS ANALYSIS

The Office of the Chief of Naval Operations has only recently been reorganized, and much more extensive research would be needed to make any useful recommendations about further changes. Thus, the committee did not discuss organizational issues in great depth, instead commenting on problems that could be resolved with incremental changes. The assessment below touches on organizational problems, developing a first-rate analytical staff, the culture needed for such a staff, and links to the DOD and the other Services. Finally, it touches on the temporal issue of what can be done in the short term rather than in the long term.

Organization

The committee’s conclusions about organizational problems are as follows. The logic for the responsibilities assigned to the Deputy Chief of Naval Operations (DCNO) for Warfare Requirements and Programs (N6/N7) on the one hand, and to the DCNO for Resources, Requirements, and Assessments (N8) on the other, is not entirely clear or persuasive, either to the committee or to some of the officers who briefed the committee and talked with its members.

To elaborate, the idea and use of competitive analysis and creative tension are fine, and the intention of generating alternative perspectives is excellent. However, the current competition between parts of N6/N7 and N8 does not appear to be helpful and involves high opportunity costs. Rather than having two alternative, ad hoc versions of any given issue and sets of undocumented analysis, it would be better to have a first-rate job done of objective analysis informed by alternative points of view. The analysis would be more nearly comprehensive, systematic, parametric, questioning of assumptions (even sacred cows), and transparent than today’s. Alternative perspectives could be compared by juxtaposing their implications in analysis charts. Consistent with this suggestion, the committee recommends more emphasis on solid, first-rate analyses by a single organization within OPNAV. These could spin off quick-response, ad hoc analyses as needed.

If the Chief of Naval Operations and the Secretary of the Navy (SECNAV) are to have a highly competent analytical organization, it is essential that the organization (1) report directly, or relatively directly, to the CNO/SECNAV, rather than being relegated to low levels in the Department of the Navy with layering to dilute its influence; (2) be institutionalized so that it cannot easily be disbanded at the whim of a future CNO or SECNAV (the dissolution of the Systems Analysis Division of the Office of the Chief of Naval Operations (OP-96) in the 1980s has long been viewed as a disaster); and (3) be closely linked to program builders. Whether these criteria can be met within the current OPNAV organization was not something that the committee could easily assess in the time available.


The Need for a Small, First-Rate Staff

Particularly for strategic-level analysis, the CNO needs products that can best be obtained by a small, first-rate staff that would include a number of exceptionally talented individuals and future leaders and would be well connected to first-rate outside research-and-analysis organizations. The ideal staff should be seen not just as a collection of operations research personnel, but as a multidisciplinary group with a mix of warriors, policy analysts, systems analysts, engineers, economists, and managers (perhaps with master’s degrees in business administration). Further, this staff should include members with outstanding potential and promise (e.g., military officers who will reach flag rank).

Culture

Much is known about the cultural characteristics of first-rate analytical defense organizations, and the Navy can draw upon its own experiences over the years for examples of both good and bad practice. Generically, however, the good characteristics include the following:

  • The ethic of getting the problem straight, even if it revisits guidance or assumptions;

  • Loyalty to the boss—the CNO—but also to the Secretary of Defense, the President, and the nation, rather than to Navy warfare areas, platforms, and so on;

  • Integrity;

  • The mind-set to think joint, but also having the ability to do superb competitive innovation and analysis for the Department of the Navy;

  • The mind-set to seek broad, complete analyses rather than analyses to support a superior’s talking points;

  • Respect and energetic search for empirical and expert information, whether it is obtained from people in the field, through experiments, by augmentation of staff, or from other mechanisms;

  • Rigor in everything (but not always in numbers or precision);

  • A good process that includes (not always linearly) problem definition, identification of assumptions, a plan for analysis, appropriate tools, and so on;

  • A very high ratio of thinking and smart, simple analysis to model running and data analysis (but with subcontracts to specialists);22

22  

For many years the OSD’s Office of Systems Analysis, later the Office of Programming Analysis and Evaluation, did not use any large and complex models, believing that it was instead essential to remain focused on higher-level issues and relatively reductionist (but not naive) analysis. Today, most analytical shops appear to be tilted far to the big-model extreme, to their detriment.

  • Dynamism, such as can be achieved by reasonable turnover rates and constant contact with “outsiders” and their ideas, whether elsewhere in the organization or in Federally Funded Research and Development Centers, professional societies, universities, or industry; and

  • The opportunity for self-motivated work that attempts to look beyond the current in-basket and conventional wisdom.

These characteristics can only be developed and sustained within a top-notch analytic organization if the leaders (i.e., the CNO and other key leaders in the Navy) instill and support them.

Links to the Department of Defense and the Other Services

For the Navy, it is important that its analytical shop(s) have good links within the DOD and particularly to the other Services, not just in required coordination meetings but also as a standard part of doing good, professional work on behalf of the nation.

Creating an Appropriate Analytic Organization

With the material provided in earlier subsections as background, the next question is how to go about creating the appropriate analytic organization. The committee’s basic recommendation on this question is presented below, followed by a discussion of possible models for the Navy to use in addressing the issue.

Recommendation 3: The Chief of Naval Operations and the Secretary of the Navy should develop a clearly delineated concept of the Navy’s future senior-level analytic support organization and define goals for its composition, including multidisciplinary orientation and officers appropriate for high positions.

Potential Models

At least two good models exist for creating and maintaining a highly competent organization to perform analysis and package options for choice. One is similar to the Navy’s OP-96 model of the late 1960s and early 1970s and the Army’s Office of the Assistant Vice Chief of Staff (AVCS) in the same period. In these cases the organization in charge of the analysis also prepared the resource-allocation decision packages.

The other model was developed by the Air Force. In this case, the position of Assistant Chief of Staff for Air Force Studies and Analyses (AF/SA) was created, with responsibility limited to performing in-house, independent analyses on issues affecting Air Force operations and programs, current and future. The key distinction was that OP-96 was part of the OPNAV organization responsible for providing resource-allocation packages to the CNO, whereas the AF/SA organization’s only function was to perform studies and analyses for the Chief of Staff, the Service Secretary, and other Air Force organizational elements requesting them (i.e., it did not have direct resource-allocation responsibility for preparing Air Force programs and/or budgets).

An advantage of having the strong analytic arm working for (or with) those preparing resource-allocation decision packages is that the analysts are grounded in the reality of the issues and types of decisions that must be made each year. However, a potential disadvantage of a direct tie to the resource-allocation staff is that the analytic staff members can become so caught up in immediate issues that they have no opportunity to build analytic capital and focus on the larger, longer-term issues that may be much more important to the future of the Service (a matter of producing analysis with available tools versus long-term analytic development in anticipation of essential issues). In addition, some view a very strong analytic arm in the resource-allocation organization as an overconcentration of power that may be misused or abused, to the detriment of the Service. Such contention developed in the Army, and the AVCS office was dissolved in part because of criticisms regarding its power and effectiveness.

When at its peak effectiveness, the AF/SA office was viewed more as an honest broker of studies and analyses, with independence from functional program responsibilities, that would provide quality products to the Air Force staff and major commands. Those elements would then use the products in conjunction with their own work to prepare decision packages and advocacy positions for programs. The decision maker (e.g., Chief of Staff or Service Secretary) could get a “second opinion” on the decision packages if needed, by asking the Assistant Chief for AF/SA if AF/SA products were being misused. If the AF/SA team had not provided analyses on the issue, it could be asked to provide an independent assessment. A current analogy would be a request by the DOD or a military department acquisition authority for an independent cost estimate of an important acquisition program, or a request for an independent review of the program manager’s cost estimate by the Cost Analysis Improvement Group or its Service equivalent.

A key criterion for success is that the CNO and the SECNAV get as close as possible to unbiased analysis and presentation of decision packages. Additionally, they need to have the means to get an independent assessment on important issues when needed. It is best to find out if there are any weak links in a package and to deal with those weaknesses before making a decision on, or recommendation for, a major commitment.

What model would be best for the Navy in the current environment is not clear to this committee at present (and, as with much of capabilities-based planning, there may be more than one viable solution). The Navy could address the issue in a subsequent tasking to an outside entity, conduct an in-house assessment, or make a decision based on available information. This committee believes that the CNO and the SECNAV should expend the time and effort necessary to be assured that they are making the best choice for the Department of the Navy. The way in which they choose to obtain and organize their high-level decision preparation and analytic support could be the most important decision they make regarding the future success of the Department of the Navy’s efforts in capabilities-based planning and overall resource allocation for a decade or more.

In addition to creating the analytic organization, the Navy will also need to develop an increased supply of top-flight officers suited to working in that organization and then moving on to operational assignments and flag rank. Developing this supply of officers will require examining potential major changes of guidance and process in the personnel system, as well as related changes to the incentive structure.23 The following is worth noting:

  • In the 1960s and 1970s, the Navy made a concerted effort to re-tour officers in a subspecialty ashore, such as the planning, programming, and budgeting system (PPBS), to improve their experience and qualifications in that area. A number of admirals came out of that program and served the Navy and several CNOs well with their experience and insights.

  • In the 1980s, the Navy changed its policy and programmed officers with no prior experience into critical PPBS positions—a policy that continues today. The Navy would never consider assigning a captain to command at sea without extensive experience and proven performance in prior sea tours. However, it quite often assigns captains to critical PPBS billets—on the OPNAV staff, the SECNAV staff, and the Joint Staff—with little or no prior analytical or resource-allocation (e.g., PPBS) experience or training. Until sufficiently experienced officers are developed, a small, qualified analytic staff for the Navy will be very difficult to achieve, at least in the near term.

The committee believes that the Navy needs to change some current manpower and personnel policies in order to enhance its ability to build a longer-term, high-quality OPNAV staff with enhanced potential for performing excellent capabilities-based planning and analysis. A key element of those changes should be creating assignment patterns for future leaders that introduce such individuals early to the discipline of analytical thinking in a real-world context (e.g., the analysis for, preparation of, and review of the Navy Program Objective Memorandum and/or equivalent parts of the overall DOD program). Such assignment patterns should continue to expose these individuals throughout their careers to the world of analysis and trade-offs in which outcomes influence budgets and/or major programs.

23  In the past, some distinguished Navy four-star admirals have had Ph.D.s in “hard” disciplines as well as tours in Navy or DOD analysis.

The Temporal Issue: Near Term and Longer Term

A temporal issue clearly exists with regard to obtaining the type of analytical support that the committee believes top-level Navy decision makers should have. Even if a decision to obtain such support were made immediately, actually obtaining it would take time. Thus, the committee suggests the following:


Recommendation 4: In the short term, the Chief of Naval Operations and the Secretary of the Navy should go outside their organizations to sharpen concepts and requirements, drawing on the external community of expert practitioners in analysis. Also, they should augment their in-house analytical capabilities in the short term by drawing on Intergovernmental Personnel Act assignments (and other individuals who could take leave from their home organizations), Federally Funded Research and Development Centers, and national and other nonprofit laboratories.
