4 Acceptance and Testing
Pages 79-96

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 79...
... These and other areas lack widely agreed-upon test methods and standards, and they lack operationally realistic T&E methodologies. Not surprisingly, commercial developers of large-scale applications experience similar challenges. The tenets of DOD Instruction (DODI)
From page 80...
... As shown in Table 4.1, the acquisition process, including the T&E process, is governed by a large set of rules, test agents, and conditions, each trying to satisfy a different customer. Traditional test and acceptance encompass three basic phases: developmental test and evaluation (DT&E; see Box 4.1)
From page 81...
... [Excerpt of Table 4.1, flattened in extraction: rows pair governing rules (the DODI 5000 series, DOD Directive 4630.5, DODI 4630.08) with their test agents and typical users, decision or certification authorities (the Milestone Decision Authority; Joint Interoperability Test Certification by the Joint Interoperability Test Center; the Chairman of the Joint Chiefs of Staff's Command, Control and Communications Systems Directorate (J6)), and applicable capability environments.]
From page 82...
... Next, more operationally realistic testing is conducted and overseen by the DOD.

BOX 4.2 Operational Assessments

Typically, operational testing, which incorporates user feedback, is more operationally realistic and stressing than is earlier developmental testing.
From page 83...
... Operational testers, independent evaluators, and the line organizations that represent end users are the key participants in operational tests. These tests include operationally realistic conditions that are designed to permit the collection of measurable data for evaluating whether a system is operationally suitable and operationally effective as measured against key performance parameters (KPPs)
From page 84...
... Cuts to budgets and personnel have significantly reduced the number of soldiers, sailors, airmen, and Marines available to serve as users during the test process, especially in the military T&E departments, even as systems have become more complex.2 This reduced pool of DOD testers impedes the early and close collaboration with systems acquirers and developers that is necessary to support an IID process adequately. In summary, IT testing in the DOD remains a highly rigid, serial process without the inherent flexibility and collaboration required to support an agile-oriented iterative, incremental development process, particularly as it might be applied to IT systems.
From page 85...
... The "small-r" requirements referred to in this report are the more detailed requirements, such as those associated with specific user interfaces and utilities, that are expected to evolve within the broader specified architecture as articulated in the initial big-R requirements document. In a sense, small-r requirements could also be thought of as lower-level specifications.
From page 86...
... In comparison, IID approaches to IT systems rely on user feedback early and at intermediary points to guide development. Without significant user involvement in developmental testing, the earliest point at which users will be involved may not be until operational testing, far too late in the development process for an IT system.
From page 87...
... Thus, requirements in IT systems are best guided through user input based on direct experience with current prototypes. The acquisition approach described in this report restructures the DOD testing and acceptance process to enable this user input and incorporates operationally realistic testing with user feedback into a routine of continuous operational assessment.
From page 88...
... In agile processes, iterations are based on time-boxing work schedules, whereby some content may slip from one iteration to the next, but iterations are closed according to the schedule, thus allowing for prompt identification of erroneous estimates of the time required to complete work items and ensuring continuous user input regarding priorities. In a similar manner, checkpoints with acquisition executives could be based either on time duration or on a funding milestone (as the two are frequently closely correlated in an IT project)
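To make the time-boxing mechanism concrete, the sketch below (illustrative only, not drawn from the report; the item names, estimates, and 10-day capacity are hypothetical) shows an iteration that closes on schedule, letting unfinished content slip to the next iteration while exposing how far the effort estimates were off.

```python
# Illustrative sketch (not from the report): a time-boxed iteration in which
# unfinished work slips to the next iteration instead of extending the schedule.
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    estimate_days: float   # estimated effort
    actual_days: float     # effort actually observed (hypothetical data)

def run_iteration(backlog, capacity_days):
    """Close the iteration when capacity is exhausted; return (done, slipped)."""
    done, slipped, used = [], [], 0.0
    for item in backlog:
        if used + item.actual_days <= capacity_days:
            done.append(item)
            used += item.actual_days
        else:
            slipped.append(item)  # content slips; the schedule does not
    return done, slipped

backlog = [
    WorkItem("login screen", 3, 4),       # estimate was optimistic
    WorkItem("report export", 5, 5),
    WorkItem("role-based access", 4, 6),
]
done, slipped = run_iteration(backlog, capacity_days=10)
for item in done:
    drift = item.actual_days - item.estimate_days
    print(f"completed {item.name}: estimate off by {drift:+.1f} days")
print("slipped to next iteration:", [i.name for i in slipped])
```

Closing the iteration on the calendar rather than on content is what surfaces the estimation error promptly; in this hypothetical run, the slipped item and the per-item drift would be the inputs to the next checkpoint with users or acquisition executives.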
From page 89...
... Commercial IT companies drive their entire investment portfolios on the basis of anticipated and actual end-user consumption patterns and/or end users' engagement with their offered products and services. Those products and services with large and committed user bases drive a preponderance of the businesses' valuation and receive commensurate corporate leadership attention and investment.
From page 90...
... Establishing an MCRC along with leveraging current DOD tools and available commercial tools and practices would overtly move the operational evaluation assessment from a speculative proposition based on surrogate run times, users, test data, and marginally current requirements specifications, to a managed and measured investment assessment based on current, actual end-user missions and needs.

INCORPORATING COMMON SERVICES DEFINITIONS

For years, IT systems developers have employed functionalities that are externally supplied and operationally validated as a basis for their success, without expecting to revalidate those functionalities as part of their formal test regimen.
From page 91...
... Unfortunately, no mechanisms exist to identify and track these supplied services or to apply a consistent approach for their use throughout the current DOD acquisition process. This is another negative repercussion of the weapons systems-based acquisition approach, where far fewer opportunities for shared services exist.
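As a rough illustration of what identifying and tracking externally supplied, already-validated services might look like, the sketch below implements a minimal service registry. The ServiceRegistry class, its fields, and the example entries are hypothetical and do not correspond to any existing DOD mechanism.

```python
# Hypothetical sketch: a minimal registry for tracking externally supplied,
# previously validated services so they need not be revalidated in each
# program's formal test regimen. Names and fields are illustrative only.
from dataclasses import dataclass
from datetime import date

@dataclass
class SharedService:
    name: str
    version: str
    validated_by: str   # external authority that certified the service
    validated_on: date

class ServiceRegistry:
    def __init__(self):
        self._services = {}

    def register(self, svc: SharedService):
        self._services[(svc.name, svc.version)] = svc

    def is_validated(self, name: str, version: str) -> bool:
        """Consumers check here instead of re-running validation tests."""
        return (name, version) in self._services

registry = ServiceRegistry()
registry.register(SharedService("geolocation", "2.1", "JITC", date(2010, 6, 1)))
print(registry.is_validated("geolocation", "2.1"))   # True: reuse without retest
print(registry.is_validated("messaging", "1.0"))     # False: must still be validated
```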
From page 92...
... VIRTUAL INFORMATION TECHNOLOGY TEST ENVIRONMENTS

The use of integrated virtual information technology test environments may be one way to facilitate testing that would allow early prototypes of systems to be subjected to much more realistic test conditions, thereby helping to identify potential problems in development as soon as possible. Such test environments would rely on a distributed test network that could be accessed by both government and industry, when appropriate, for use in performing early acceptance testing.
From page 93...
... is intended to provide a persistent, composable, flexible infrastructure along with a series of tools, standards, processes, and policies for using the environment to conduct the continuous analysis required to support a capabilities-based planning process.7

• The Joint Mission Environment Test Capability (JMETC) was established in October 2006 to "link distributed facilities on a persistent network, thus enabling customers to develop and test warfighting capabilities in a realistic joint context."8 JMETC has already established a persistent test network, through the Secret Defense Research and Engineering Network, which provides connectivity to both Service and industry assets.
From page 94...
... By inserting "ground truth" system simulation and stimulation data and then observing how the combat systems exchange and display tactical data, engineers can precisely identify and solve interoperability problems ashore well before those systems enter the operating forces. This approach emphasizes shore-based testing and warfare systems integration and interoperability testing and acceptance certification of operational IT systems in a test environment similar to their ultimate shipboard operational environment; it also emphasizes interoperability assessments, which are a prerequisite for the operational certification of the ships in strike force configurations prior to deployment.
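The sketch below illustrates the general idea of injecting known "ground truth" data and comparing each system's displayed tactical picture against it. It is an assumed, simplified example, not the Navy's actual shore-based test tooling, and all track identifiers, coordinates, and system names are hypothetical.

```python
# Illustrative sketch (assumed, not actual test tooling): inject known "ground
# truth" track data into each system under test, then compare what each system
# reports against the injected truth to flag interoperability gaps.
GROUND_TRUTH = {
    "track-001": {"lat": 36.85, "lon": -75.98, "id": "friendly"},
    "track-002": {"lat": 37.10, "lon": -76.30, "id": "unknown"},
}

def compare_to_truth(system_name, reported_tracks, tolerance=0.01):
    """Report missing tracks and attribute mismatches for one combat system."""
    problems = []
    for track_id, truth in GROUND_TRUTH.items():
        seen = reported_tracks.get(track_id)
        if seen is None:
            problems.append(f"{system_name}: {track_id} not displayed")
            continue
        if abs(seen["lat"] - truth["lat"]) > tolerance or \
           abs(seen["lon"] - truth["lon"]) > tolerance:
            problems.append(f"{system_name}: {track_id} position mismatch")
        if seen["id"] != truth["id"]:
            problems.append(f"{system_name}: {track_id} identity mismatch")
    return problems

# Hypothetical outputs captured from two shipboard systems in the shore facility.
system_a = {"track-001": {"lat": 36.85, "lon": -75.98, "id": "friendly"},
            "track-002": {"lat": 37.10, "lon": -76.30, "id": "hostile"}}
system_b = {"track-001": {"lat": 36.85, "lon": -75.98, "id": "friendly"}}

for name, data in [("System A", system_a), ("System B", system_b)]:
    for issue in compare_to_truth(name, data):
        print(issue)
```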
From page 95...
... Some of these issues are raised in the DOD's Acquisition Modeling and Simulation Master Plan, issued in April 2006, which focuses on improving the role of modeling and simulation for testing.10 Another challenge in making such environments usable is to ensure that the complexity required to perform the integration of systems and configuration for tests is minimized; otherwise, the costs of using such test environments would far outweigh the benefits. Paramount in managing the complexity involved is the establishment of a formal systems engineering process involving test design, integration, documentation, configuration management, execution, data collection, and analysis.

