Source: Adapted from Lauer and Sussman (1979).
Elements that should be included in model documentation are (Gass, 1979):
1 a precise statement of what the model is supposed to do;
2 the mathematical and logical definitions, assumptions and formulation of the problem being modeled;
3 a complete set of current input and output, and test cases that have been run;
4 a complete set of flow charts of the computer program;
5 a set of operating instructions for the computer operator;
6 an explanation of the various options available in using the model;
7 the computer program itself (listing), with comments about various operations in the program.
Further guidelines for documentation are provided later in this chapter.

11.6.2 Verification
Verification entails an examination of the model to ensure that the computer program accurately describes the model and that the program runs as expected. In order to do this, the following factors must be examined: (1) consistency of mathematical and logical relationships; (2) accuracy of intermediate numerical results; (3) inclusion of important variables and relationships; and (4) proper mechanization and debugging of the program.
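The second factor, accuracy of intermediate numerical results, can be checked by comparing a computed quantity against an independently known analytic value. The following is a hypothetical sketch of such a verification check (the integrand and tolerance are illustrative, not drawn from any particular model):

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule on [a, b] with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

# Intermediate numerical result: integral of sin(x) over [0, pi].
numerical = trapezoid(math.sin, 0.0, math.pi, 1000)

# The exact analytic value is 2.0; the program is verified against it.
assert abs(numerical - 2.0) < 1e-5
```

Embedding such assertions at key points in a program provides a regression check each time the code is modified.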
Computer hardware selections will sometimes impact model accuracy due to word length and double-precision considerations, among others. Thus, the same model implemented on two different computer systems may produce significantly different results. Related problems concern artifacts, which are false features that arise from some quirk of the computer, and which disappear when the software is written differently.
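The effect of word length can be demonstrated in a few lines of Python (an illustrative sketch, not tied to any particular model): a quantity that survives a computation in double precision vanishes entirely in single precision.

```python
import numpy as np

# Single precision (32-bit word): adding 1.0 to 1.0e8 is lost to round-off,
# because the spacing between adjacent float32 values near 1.0e8 is 8.0.
a32 = np.float32(1.0e8)
b32 = np.float32(1.0)
print((a32 + b32) - a32)   # 0.0 -- the increment disappears

# Double precision (64-bit word): the same computation retains the increment.
a64 = np.float64(1.0e8)
b64 = np.float64(1.0)
print((a64 + b64) - a64)   # 1.0
```

The identical source code thus yields different answers on machines (or compilers) that default to different word lengths, which is one mechanism by which artifacts arise.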
Validation and critical assessment are required of all theoretical conjectures, hypotheses and models. One of the most difficult but important tasks in model construction is the specification of its limitations: what are its limits, and in what way is it an approximation? The consensus principle in science implies that the evaluation of models must be open, and cannot be accepted on the authority of the model developer alone. The model should not contain adjustable parameters or any hidden variables that have to be invoked to explain discrepancies between theory and experiment. The theoretical properties of the model should be sharply defined, and derived with sufficient mathematical rigor to be compared objectively with the observed phenomena. Elementary errors and misunderstandings will be detected by the independent repetition of experiments and by comparisons of calculations with experimental data, or by theoretical criticism (Ziman, 1978). Even when models are wrong, they can assist in structuring discussions.
The category of validity comprises three factors: theoretical validity, data validity and operational validity. Theoretical validity entails review of the physics underlying the model and the major stated and implied assumptions. The applicability and restrictiveness of these assumptions must also be examined. In addition, the internal logic of the model should be reviewed.
For empirically based models, data validity is concerned with the accuracy and completeness of the original data and the manner in which the empirical model deals with the transformation of the original data.
Operational validity is concerned with assessing the impact of model errors (i.e. divergences between model predictions and reality) on decision processes. This aspect of model evaluation addresses accuracy. When evaluating model accuracy, it is important to examine the error budget. Errors in model predictions (ep) are assumed to be the sum of two independent, random variables:

ep = ed + em

where ed represents errors related to model input data and em represents model errors.
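Because the two error sources are assumed independent, their variances add, and an error budget can be tallied as in this minimal sketch (the standard deviations below are hypothetical values, not taken from any particular model):

```python
import math

# Hypothetical standard deviations of the two independent error sources.
sigma_d = 2.0   # errors related to model input data (ed)
sigma_m = 1.5   # inherent model errors (em)

# Since ep = ed + em with ed and em independent, variances add:
# var(ep) = var(ed) + var(em), so the combined standard deviation is
sigma_p = math.sqrt(sigma_d**2 + sigma_m**2)
print(sigma_p)   # 2.5
```

Partitioning the budget this way shows whether prediction accuracy is limited by the input data or by the model itself, and hence where improvement effort is best spent.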
Maintainability considers the ease of incorporating new data and formulae as well as provisions for reviewing the accuracy of the model as more experimental data become available. A training program must be formalized to ensure that the operators understand how the model should be used and also to make revisions known to the computer-systems personnel.
Usability addresses the appropriateness of the model for the intended applications. In essence, the model should satisfy the user's specific requirements. This, of course, requires that users articulate their needs very precisely.
It is instructive to distinguish between research and operational models when discussing usability. Specifically, research models are intended to address a wide variety of often ill-posed scientific questions. In order to be responsive to such ambiguous issues, the research models are structured to allow (and often require) a high degree of operator intervention during execution. This allows the researcher to adjust parameters as the problem solution evolves. Alternatively, operational models are structured to minimize (if not eliminate) the need for operator intervention. Indeed, such intervention is viewed as a nuisance to the operator. The parallel between executive-versus-bundled system architectures and research-versus-operational models is valid (refer to Chapter 10, Section 10.4.1).
Model usability is heavily influenced by the model's inherent domains of applicability. Factors such as frequency coverage and problem geometry are implicit in these domains. For example, the use of normal-mode models in the calculation of high-frequency bistatic reverberation is not practical at present due to excessive computation times. Model output options are also important. For example, wave-theoretical propagation models more easily generate transmission loss (TL) values in the range-depth plane, while arrival-structure information is more easily generated by ray-theoretical propagation models.
Documentation standards for computer models were reviewed by Gass (1979). Both the US Department of Defense and the National Institute of Standards and Technology (formerly the National Bureau of Standards, or NBS) have issued formal guidelines:
• DOD-STD-7935A - Department of Defense standard: Automated data systems (ADS) documentation.
• DOD-STD-2167A - Military standard: Defense system software development.
• FIPS PUB 38 - Guidelines for documentation of computer programs and automated data systems.
• DOD-STD-2168 - Military standard: Defense system software quality program.
According to the federal information processing standards (FIPS), model documentation is keyed to the various phases and stages of the software life
Table 11.4 Software life cycle and documentation types according to the federal information processing standards (Gass, 1979)
Initiation phase (definition stage): functional requirements document; data requirements document.
Development phase (design stage): system/subsystem specification; program specification; database specification; test plan.
Development phase (test stage): test analysis report.
Operation phase: user's manual; operations manual; program maintenance manual.
Functional requirements document - provides a basis for the mutual understanding between users and designers of the initial definition of the software, including the requirements, operating environment and development plan.
Data requirements document - provides data descriptions and technical information about the data collection requirements.
System/subsystem specification - specifies for analysts and programmers the requirements, operating environment, design characteristics and program specifications for a system or subsystem.
Program specification - specifies for programmers the requirements, operating environment and design characteristics of a computer program.
Database specification - specifies the identification, logical characteristics and physical characteristics of a particular database.
User's manual - sufficiently describes the functions performed by the software in non-ADP terminology such that the user organization can determine its applicability, and when and how to use it; moreover, it should serve as a reference document for preparation of input data and parameters, and for interpretation of results.
Operations manual - provides computer operation personnel with a description of the software and of the operational environment so that the software can be run.
Program maintenance manual - provides the maintenance programmer with the information necessary to understand the programs, their operating environment and their maintenance procedures.
Test plan - provides a plan for testing the software; provides detailed specifications, descriptions and procedures for all tests; and provides test data reduction and evaluation criteria.
Test analysis report - documents the test analysis results and findings; presents the demonstrated capabilities and deficiencies for review; and provides a basis for preparing a statement of software readiness for implementation.
cycle (Table 11.4). The major phases are:
Initiation phase - During this phase, the objectives and general definition of the requirements for the software are established. Feasibility studies, cost-benefit analyses and the documentation prepared within this phase are determined by agency procedures and practices.
Development phase - During this phase, the requirements for the software are determined, and the software is then defined, specified, programmed and tested. Ten major documents are prepared in this phase to provide an adequate record of the technical information developed (Table 11.4).
Operation phase - During this phase, the software is maintained, evaluated and changed as additional requirements are identified. The documentation is maintained and updated accordingly.
The development phase of the software life cycle is further subdivided into four main stages as follows:
Definition stage - when the requirements for the software and documentation are determined.
Design stage - when the design alternatives, specific requirements and functions to be performed are analyzed and a design is specified.
Programming stage - when the software is coded and debugged.
Test stage - when the software is tested and related documentation is reviewed. The software and documentation are then evaluated in terms of readiness for implementation.
Formal documentation guidelines are frequently amended or even superseded by newer guidelines. Care should be taken to consult the latest governing instructions. For example, DOD-STD-7935A and DOD-STD-2167A (which were noted earlier) were later consolidated into MIL-STD-498 in an effort to implement the governing ISO/IEC standard (DIS 12207, Software life-cycle processes). MIL-STD-498 (Software development and documentation) was issued on 5 December 1994. However, this standard was subsequently canceled on 27 May 1998. In its place, IEEE/EIA 12207 (issued in three parts) is now to be used:
• IEEE/EIA 12207.0: "Industry implementation of international standard ISO/IEC 12207, Standard for information technology - Software life-cycle processes" (March 1998), contains concepts and guidelines to foster better understanding and application of the standard. This standard thus provides industry with a basis for software practices usable for both national and international business.
• IEEE/EIA 12207.1: "Guide for ISO/IEC 12207, Standard for information technology - Software life-cycle processes - Life-cycle data," was adopted on 27 May 1998 for use by the US Department of Defense. This document provides guidance on life-cycle data resulting from the processes of IEEE/EIA 12207.0, including relationships among content of life-cycle data information items, references to documentation of life-cycle data, and sources of detailed software product information.
• IEEE/EIA 12207.2: "Guide for ISO/IEC 12207, Standard for information technology - Software life-cycle processes - Implementation considerations," was adopted on 27 May 1998 for use by the US Department of Defense. This document provides implementation guidance based on software industry experience with the life-cycle processes.
The transfer of modeling and simulation (M&S) technologies among members of the international community continues to stimulate new initiatives for improved international standards in simulation architecture. Such efforts seek to promote the large-scale interoperability of simulation software and hardware.