
Source: National Research Council (1997).

Table 12.2 summarizes the general applications associated with each of these four levels. Each level is discussed below in greater detail.

12.2.1 Engineering

Engineering-level simulation comprises the categories of environmental, propagation, noise, reverberation and sonar performance models. This level of simulation generates measures of system performance that are used to design and evaluate systems and subsystems and also to support system testing. Representative measures of performance include probability of detection and median detection ranges. Sonar technologists and acoustical oceanographers routinely use this level of simulation for prognostic or diagnostic applications. These performance metrics are also useful in system design, cost, manufacturing and supportability trade studies.
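
The use of such performance metrics can be illustrated with a brief sketch. Assuming (hypothetically) that a sonar performance model supplies signal excess as a function of range, probability of detection can be estimated with a Gaussian transition curve, and the median detection range taken as the range at which signal excess falls below 0 dB. All function names and numerical values below are illustrative assumptions, not outputs of any particular model:

```python
import math

def detection_probability(signal_excess_db, sigma_db=8.0):
    """Probability of detection for a given signal excess (dB), assuming
    a Gaussian transition curve with standard deviation sigma_db.
    Pd = 0.5 when signal excess = 0 dB."""
    return 0.5 * (1.0 + math.erf(signal_excess_db / (sigma_db * math.sqrt(2.0))))

def median_detection_range(ranges_km, signal_excess_db):
    """Median detection range: the range at which predicted signal
    excess first falls below 0 dB (Pd drops below 0.5)."""
    for r, se in zip(ranges_km, signal_excess_db):
        if se < 0.0:
            return r
    return ranges_km[-1]  # detectable over the whole modeled track

# Hypothetical model output: signal excess declining with range
ranges = [1, 2, 4, 8, 16, 32]     # km
excess = [18, 12, 6, 1, -4, -10]  # dB
pd_curve = [detection_probability(se) for se in excess]
```

A real engineering-level model would, of course, derive signal excess from the full sonar equation rather than from a tabulated placeholder.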

12.2.2 Engagement

Engagement-level simulation executes engineering-level models to generate measures of system effectiveness in a particular spatial and temporal realization of an ocean environment when operating against (engaging) a particular target. This level of simulation is used to evaluate system alternatives, train system operators and evaluate tactics. Engagement outputs can be used to estimate exchange ratios, which are useful in evaluating tactical effectiveness against known and postulated targets.
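
As an illustrative sketch of how engagement outputs yield exchange ratios, the following Monte Carlo fragment estimates an exchange ratio (enemy losses per own loss) from repeated simulated engagements. The single-shot outcome probabilities are hypothetical placeholders that, in practice, would come from engineering-level performance models:

```python
import random

def simulate_engagement(p_blue_kills_red=0.6, p_red_kills_blue=0.3, rng=random):
    """One engagement: return (red_losses, blue_losses) as 0/1 counts.
    The outcome probabilities are hypothetical inputs."""
    red_loss = 1 if rng.random() < p_blue_kills_red else 0
    blue_loss = 1 if rng.random() < p_red_kills_blue else 0
    return red_loss, blue_loss

def exchange_ratio(n_trials=10_000, seed=1):
    """Exchange ratio = enemy (red) losses per own (blue) loss,
    estimated over many simulated engagements."""
    rng = random.Random(seed)
    red_total = blue_total = 0
    for _ in range(n_trials):
        r, b = simulate_engagement(rng=rng)
        red_total += r
        blue_total += b
    return red_total / max(blue_total, 1)
```

With the placeholder probabilities above, the estimated ratio converges toward the ratio of the two kill probabilities (about 2:1).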

Tactical decision aids represent a form of engagement-level simulation products that blend environmental information with tactical rules garnered from higher-level, aggregate simulations. These decision aids guide system operators and scene commanders alike in planning missions and allocating resources by exploiting knowledge of the operating environment. While TDAs are usually associated with naval applications, the conceptual approach is valid in research and commercial applications as well.

12.2.3 Mission

Mission-level simulation aggregates multiple engagements to generate statistics useful in evaluating mission effectiveness. At this level, system concepts are evaluated within the context of well-defined mission scenarios. The outputs of this level of simulation are used to evaluate force employment concepts. The effectiveness of multiple platforms performing specific missions can be assessed using this level of simulation.
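
The aggregation of engagements into mission-level statistics can be sketched as follows; the success criterion and per-engagement kill probability are hypothetical illustrations, not values drawn from the text:

```python
import random
import statistics

def mission_success(engagements, kills_required=3, p_kill=0.6, rng=random):
    """One mission succeeds if at least kills_required of its
    engagements result in a kill (hypothetical criterion)."""
    kills = sum(1 for _ in range(engagements) if rng.random() < p_kill)
    return kills >= kills_required

def mission_effectiveness(n_missions=5_000, engagements=5, seed=2):
    """Aggregate many simulated missions into an effectiveness
    statistic: the fraction of missions meeting the success criterion."""
    rng = random.Random(seed)
    outcomes = [mission_success(engagements, rng=rng) for _ in range(n_missions)]
    return statistics.mean(outcomes)
```

In a full mission-level simulation, each engagement would itself be an execution of engagement-level models rather than a single random draw.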

12.2.4 Theater

Theater-level simulation aggregates mission-level components to generate measures of force dynamics and analyze alternative system-employment strategies. This type of simulation is used in planning, budgeting and operational analysis. Planning includes decisions regarding force structure, modernization, readiness and sustainability. Budgeting includes decisions regarding specific line items in the defense budget. Operational analysis considers issues such as developing contingency plans, estimating logistics demands and analyzing specific combat plans (Bracken et al., 1995). This level of simulation is useful in wargaming with joint or combined forces.

12.3 Simulation infrastructure

The National Research Council (1997) portrayed modeling and simulation as a foundation technology for many developments that will be central to the US Navy over the next several decades. Representative applications of simulation in the defense industry were summarized by Bracken et al. (1995), who edited a useful collection of papers coordinated by the Military Operations Research Society (MORS).

The DMSO was established in 1991 to provide a focal point for information concerning US DOD M&S activities. The DMSO is leading an effort to establish a common technical framework (CTF) to facilitate the interoperability and reuse of all types of models and simulations. The foundation for this effort is the HLA (see Section 12.4). Two other elements of the common technical framework include conceptual models of the mission space (CMMS) and data standards. When completed, CMMS will provide simulation-independent descriptions of real-world processes, entities, environments and relationships. The data standards program will provide the M&S community with certified data to promote interoperability of models and simulations, thus improving the credibility of simulation results. Planned representations of the natural environment will include terrain, oceans, atmosphere and space.

The DMSO also provides services to complement the common technical framework including VV&A procedures and environmental databases. The US Department of Defense (1994) officially adopted definitions for VV&A that originated from the efforts of the MORS. These definitions are useful for applications in naval operations, offshore industries and oceanographic research:

• Verification - The process of determining that a model implementation accurately represents the developer's conceptual description and specifications.

• Validation - The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model.

• Accreditation - The official certification that a model or simulation is acceptable for a specific purpose.

The US Department of Defense (1996) assembled a very useful compendium of VV&A techniques from sources in government, industry and academia. This evolving document provides practical guidelines for formulating VV&A procedures in a wide range of modeling and simulation environments.

12.4 High-level architecture

The HLA (Kuhl et al., 1999) is the highest-priority effort within the DOD modeling and simulation community. The HLA has been adopted as IEEE Standard 1516 and has also been proposed for acceptance by the North Atlantic Treaty Organization (NATO) as the standard for simulations used within the alliance. The HLA is composed of three parts: the HLA rules, the HLA interface specification and the object model template (OMT). The HLA rules describe the general principles defining the HLA and also delineate 10 basic rules that apply to HLA federations and their participating applications (called federates). The HLA interface specification defines the functional interface between federates and the runtime infrastructure (RTI). The OMT provides specifications for documenting key information about simulations and federations. Use of the OMT to describe simulation and federation object models (called SOMs and FOMs, respectively) is a key part of the HLA (North Atlantic Treaty Organization, 1998; Kuhl et al., 1999).

The NATO Alliance is generally dependent upon the modeling and simulation contributions of the member nations. These contributions include the cooperative development of technical capabilities such as the defence modeling and simulation technologies research program within the European Cooperation for the Long Term in Defence (EUCLID). In this environment, required simulations must either be specified and then developed anew, or else legacy simulations must be adapted to meet the specified requirements.

The development of individual models and simulations has been occurring for decades and is therefore relatively well understood by NATO. The alliance, however, has only limited experience with the cooperative development of federations of diverse simulations. To be prudent, therefore, NATO has opted to demonstrate the viability of this innovative development approach by conducting a pathfinder development of an HLA-based federation of national simulations. This federation would be planned and centrally integrated and tested, but the individual national simulation developments would be executed by the involved nations. Ideally, such a pathfinder effort would be built on the experience base established during NATO's Distributed Multi-National Defence Simulation (DiMuNDS) project. Selected legacy simulations will have to fulfill the specified criteria and will, therefore, require some modifications (North Atlantic Treaty Organization, 1998, 2000).

12.5 Testbeds

Testbeds allow the simultaneous use of high- and low-detail system representations (i.e. variable resolution) in a single simulation. This flexibility enables an analyst to simulate a key system in high detail while simulating the less-critical contextual environment in lower detail.

In integrated hierarchical variable resolution (IHVR) simulations, high-level variables are expressed as functions of lower-level (but higher-resolution) variables. Here, hierarchies of variables can relate models at different resolutions (Davis, 1995). Complications may arise from so-called configural effects, which consider the influences of temporal and spatial correlations on simulated outcomes (National Research Council, 1997: 90 and 233). Specifically, the configuration of the simulated assets is inseparable from the outcome of the particular simulation. Consequently, if the assets were configured differently at the outset, the simulated outcome would likely be different.
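
The IHVR idea of expressing a high-level variable as a function of lower-level, higher-resolution variables can be sketched as follows. The transmission-loss formulas and absorption values here are simplified, hypothetical illustrations rather than any model described in the text:

```python
import math

def low_res_transmission_loss(range_km, alpha_db_per_km=0.08):
    """Low-resolution (aggregate) variable: spherical spreading plus
    a single absorption coefficient (hypothetical value)."""
    return 20.0 * math.log10(range_km * 1000.0) + alpha_db_per_km * range_km

def aggregate_alpha(layer_alphas_db_per_km):
    """The high-level variable (a single absorption coefficient)
    expressed as a function of lower-level, higher-resolution
    per-layer values."""
    return sum(layer_alphas_db_per_km) / len(layer_alphas_db_per_km)

def high_res_transmission_loss(range_km, layer_alphas_db_per_km):
    """Higher-resolution model: derive the aggregate coefficient from
    per-layer absorption, then apply the same spreading law."""
    return (20.0 * math.log10(range_km * 1000.0)
            + aggregate_alpha(layer_alphas_db_per_km) * range_km)
```

When the layer values are uniform, the two resolutions agree exactly; when they differ, the hierarchy lets the analyst trade detail for speed while preserving consistency between levels.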

An integrated testbed includes the processor on which the simulation software will run together with all other units that will interface with the processor. This arrangement affords the opportunity to perform early interface testing using the actual hardware. For example, the tactical oceanography simulation laboratory (TOSL) provides a testbed for the development, testing and validation of high-fidelity underwater acoustic models and supporting databases (Ellis et al., 1996).

From a broader perspective, simulation testing can be accomplished either in laboratory-based testbeds or in at-sea tests. At-sea tests provide engineers the opportunity to validate sonar-system performance in real (versus synthetic) ocean environments. For example, the littoral warfare advanced development (LWAD) project provides at-sea tests (including platforms and coordination) to identify and resolve technical issues that arise from operating undersea systems in littoral environments (Spikes et al., 1997). Sea tests can range from simple focused technology experiments (FTE) to more complex system-concept validations (SCV). The penalty paid for testing in a real (versus synthetic) environment is the loss of experimental control and repeatability.

Situating testbeds at a fixed site in the field can create an interesting hybrid testbed configuration. Such arrangements, sometimes referred to as "natural laboratories," offer attractive features over laboratory-based testbeds. For example, field sites permit sustained modeling and observing systems to be deployed so that models can be continually tested and refined in real (versus synthetic) environments. Moreover, operational training can be collocated with fielded testbeds so that realistic experience can be obtained. The loss of experimental control and repeatability is limited by selecting a fixed location in an ocean area that is well understood environmentally.

An example of a fielded testbed is the northern Gulf of Mexico littoral initiative, or NGLI (Carroll and Szczechowski, 2001). The NGLI is a multi-agency program established through a partnership between the Commander, Naval Meteorology and Oceanography Command and the Environmental Protection Agency's Gulf of Mexico Program Office. The goal of the NGLI is to establish a sustained, comprehensive nowcasting/forecasting system for the coastal areas of Mississippi, Louisiana and Alabama that will use model forecasts and observational data for training and coastal resource management. The program integrates a reliable and timely meteorological and oceanographic modeling scheme, combining 3D circulation, sediment transport, and atmosphere and wave models with in situ and remotely sensed observations via an extensive data distribution network that is available to a wide range of users in near-real time through an interactive website. The Naval Oceanographic Office, which manages the program, chose the Mississippi Bight as an ideal testbed to examine new modeling and observational technologies before they are applied to other littoral areas of interest. The NGLI directly addresses the US Navy's requirement to project oceanographic information from deep-water environments shoreward into littoral areas.
Model nowcasts and forecasts are being applied to the ocean littoral environment by cascading information from large ocean basin models to shallow-water models. Lessons learned within this "natural laboratory" provide civil authorities with metrics by which to evaluate environmental stresses (e.g. sediment transport modifications and increased pollution) caused by growth in population and industry.

12.6 Applications

Simulations are used in diverse scientific and engineering disciplines. Although the following discussions will highlight many naval applications, the basic approaches and practices surveyed here are applicable to offshore industries and oceanographic research as well. Specific attention will be given to those activities relating to engineering- and engagement-level simulations. It may be useful at this point to again refer to the discussions in Section 12.2 (especially Table 12.2).

Discussions will start with engineering-level simulations that generate system-performance outputs. As indicated in Table 12.2, general applications include design and evaluation of systems (or subsystems) and system-testing support. As specific examples, Section 12.6.1 will discuss applications in systems engineering and Section 12.6.2 will discuss applications in SBA.

The next level above engineering is engagement-level simulations, which generate system-effectiveness outputs. As indicated in Table 12.2, general applications include the evaluation of system alternatives, training of system operators and evaluation of tactics. As specific examples, Section 12.6.3 will discuss applications in operations analysis and Section 12.6.4 will discuss applications in training.

These discussions emphasize processes as opposed to specific implementations of simulation packages. The intent is to familiarize the reader with generally accepted approaches to simulation as practiced in government (both civil and military) and in industry (both offshore and defense related). Research applications may utilize variations of these approaches.

12.6.1 Systems engineering

An objective of systems engineering is to gain visibility in the early stages of a system's development. The intent is to explore all feasible approaches to system design, identify and eliminate potential problems, select a preferred design configuration and thus reduce risks and costs (Blanchard, 1998). The use of simulation allows designers to investigate alternative design solutions prior to committing to any particular design. In a systems-engineering context, simulation is used to understand the behavior of a system or to evaluate alternative system-design considerations in tradeoff studies. The use of simulation is most effective in the early stages of system development before any physical elements comprising the system are available for evaluation.

The systems-engineering process is often represented by the so-called "V" diagram, as illustrated in Figure 12.1. The decomposition-and-definition process flows downward along the left leg of the "V" while the integration-and-verification process flows upward along the right leg of the "V." In the decomposition-and-definition process (on the left side of Figure 12.1), the utilization of simulation during the preliminary design phase would have the greatest impact on the final system configuration.

An important asset in the systems engineering process is the software engineering environment (SEE), which supports the software development process. The SEE comprises the facilities, integration methods, tools, procedures and management information system (MIS) necessary to maintain productivity, simulation product quality and software reuse.

Figure 12.1 Systems engineering decomposition and definition process flow. (Blanchard, 1998; System Engineering Management, 2nd edn; this material is used by permission of John Wiley & Sons, Inc.)

12.6.2 Simulation-based acquisition

Simulation-based acquisition is an acquisition process in which both the US Department of Defense (DOD) and industry collaborate in the use of simulation technologies that are integrated across acquisition phases and programs (US Department of the Navy, 2000a). Specifically, SBA entails the optimization of system performance versus total ownership cost (TOC) through exploration of the largest possible trade space. Total ownership cost comprises: the costs to research, develop, acquire, own, operate and dispose of primary and support systems, other equipment and real property; the costs to recruit, train, retain, separate and otherwise support military and civilian personnel; and other costs of business operations in the DOD. Johnson et al. (1998) provided a very useful introduction to the concepts underpinning SBA.
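
The exploration of a performance-versus-TOC trade space can be sketched as a simple Pareto screening of candidate designs. The design names, performance scores and cost figures below are entirely hypothetical illustrations:

```python
def pareto_frontier(designs):
    """Return the names of designs not dominated by any alternative,
    i.e. no other design has both performance at least as high and
    total ownership cost at least as low, with one strictly better."""
    frontier = []
    for name, perf, toc in designs:
        dominated = any(
            p >= perf and t <= toc and (p > perf or t < toc)
            for _, p, t in designs
        )
        if not dominated:
            frontier.append(name)
    return frontier

# Hypothetical design alternatives: (name, performance score, TOC in $M)
candidates = [
    ("A", 0.70, 120.0),
    ("B", 0.80, 150.0),
    ("C", 0.75, 180.0),  # dominated by B (lower performance, higher cost)
    ("D", 0.90, 200.0),
]
```

Design C would be screened out, leaving A, B and D as the trade space within which performance is weighed against TOC.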

It is helpful to revisit the modeling and simulation hierarchy pyramid introduced previously in Chapter 1 (see Figure 1.3) in the context of system-design applications. In Figure 12.2 (US Department of the Navy, 2000a), the four categories of simulation (engineering, engagement, mission and theater) are related to their principal outputs: engineering-level simulations output system or component performance; engagement-level simulations output system effectiveness; mission-level simulations output mission effectiveness; and theater-level simulations output the overall outcomes. Requirements are generally developed in a top-down approach as indicated by the downward-directed arrow (labeled "requirements development") on the left side of the pyramid in Figure 12.2. Specifically, mission requirements (derived from theater-level simulations) flow down to system requirements which, in turn, flow down to performance requirements. These performance requirements are then allocated among the various subsystems and components using engineering-level simulations. Alternatively, a bottom-up approach may be used as indicated by the upward-directed arrow (labeled "concept assessment") on the right side of the pyramid. In this approach, performance capabilities (generated by engineering-level simulations) are translated into system capabilities which, in turn, are translated into mission capabilities, whereafter the outcomes are evaluated in theater-level simulations. The arrow at the base of the pyramid in Figure 12.2 (labeled "concept definition") suggests a connection with the "V" diagram presented in Figure 12.1. Specifically, adhering to the discipline of the decomposition-and-definition process ensures that the perspectives of users, designers and developers are reflected in the concept definition.

Figure 12.2 Modeling and simulation in system design (US Department of the Navy, 2000a).

The US DOD divides the system-acquisition process into three principal phases: concept and technology development; system development and demonstration; and production and deployment. A fourth phase (operations and support) covers life-cycle sustainment (refer to Table 12.3). In each phase, it is possible to identify the generic functions performed by each of the four levels of simulation (engineering, engagement, mission and theater). In Table 12.3, each phase of acquisition or sustainment is identified in the first column. In the second column, the relationship to notable requirements documents is indicated. The third through sixth columns identify the generic functions performed at each level of simulation.

Table 12.3 Functions of the four levels of simulation corresponding to each phase of the acquisition process



