GLOSSARY OF QA TERMINOLOGY

The following definitions are taken from accepted and identified sources to aid in the understanding of terms used on the certification exams. A thorough understanding of their meaning will facilitate the quality assurance process.

"Terminology is a key factor in ensuring a common understanding of the software development effort to be accomplished. Terms used throughout the quality assurance documentation shall have the same meaning as the terms used in this glossary."





Acceptance

An action by an authorised representative of the acquirer by which the acquirer assumes ownership of products as partial or complete performance of a contract.



Algorithm

(1) A finite set of well-defined rules for the solution of a problem in a finite number of steps. (IEEE) (2) Any sequence of operations for performing a specific task.

Algorithm analysis


A software V&V task to ensure that the algorithms selected are correct, appropriate, and stable, and meet all accuracy, timing, and sizing requirements. (IEEE)


Analysis

(1) To separate into elemental parts or basic principles so as to determine the nature of the whole. (2) A course of reasoning showing that a certain result is a consequence of assumed premises. (3) The methodical investigation of a problem, and the separation of the problem into smaller related units for further detailed study. (ANSI)

Anomaly (IEEE)

Anything observed in the documentation or operation of software that deviates from expectations based on previously verified software products or reference documents. See: bug, defect, error, exception, fault. (IEEE)

Application software


Software designed to fill specific needs of a user; for example, software for navigation, payroll, or process control. Contrast with support software; system software. (IEEE)

Architecture (IEEE)

The organizational structure of a system or component. (IEEE)


Audit

(1) An independent examination of a work product or set of work products to assess compliance with specifications, standards, contractual agreements, or other criteria. (IEEE) (2) To conduct an independent review and examination of system records and activities in order to test the adequacy and effectiveness of data security and data integrity procedures, to ensure compliance with established policy and operational procedures, and to recommend any necessary changes. (ANSI)

Audit trail

(1) Data in the form of a logical path linking a sequence of events, used to trace the transactions that have affected the contents of a record. (ISO) (2) A chronological record of system activities that is sufficient to enable the reconstruction, review, and examination of the sequence of environments and activities surrounding or leading to each event in the path of a transaction from its inception to the output of final results.




Baseline

A specification or product that has been formally reviewed and agreed upon, that serves as the basis for further development, and that can be changed only through formal change control procedures. (NIST)


A baseline is a configuration identification formally designated and applicable at a specific point in an item's life cycle. Baselines, plus approved changes from those baselines, constitute the current configuration identification. A baseline is recorded in a configuration identification document, or a set of such documents, formally designated by the acquirer (customer) at a specific time during a CI's life cycle.
Project baselines:

  1. Design Requirements Baseline (DRB). The DRB defines the essential program design requirements for the technical system and is contained in the System Specification, etc.
  2. Development Component Baseline (DCB). The DCB defines the functional, physical, and interface characteristics of the component items of an assembly or system. It shall be applied throughout the design and development phase and consists of project definition drawings and the development specifications relating to hardware and software.
  3. Production Baseline (PBBS). The PBBS defines the physical and functional characteristics of the production technical system. The identification documentation includes definition of:
  • the essential physical and functional characteristics of the specific system;
  • approved tests for product acceptance (PAS);
  • the parts list and MRI, identified by codification requirements (usually referred to as the Build Standard).

Baselines (System, CSCI and HWCI).

  • Functional baseline. The initial approved functional configuration identification (documentation) for each CSCI and HWCI, which describes a system's or item's functional characteristics and the verification required to demonstrate the achievement of those specified functional characteristics; for example, system specifications and prime or critical item specifications.
  • Allocated baseline. The initial approved allocated configuration identification (documentation) for each CSCI and HWCI, describing the functional and interface characteristics allocated from those of a higher-level CI, interface requirements with interfacing configuration items, additional design constraints, and the verification required to demonstrate the achievement of those specified functional and interface characteristics.

Product baseline. The initial approved product configuration identification (documentation) for each CSCI and HWCI. The documentation shall describe all of the necessary functional and physical characteristics of the CI, any required joint and combined operations, interoperability characteristics of a CI and the selected functional and physical characteristics designated for production acceptance testing and tests necessary for support of the CI.


Benchmark

A standard against which measurements or comparisons can be made.


Programs that provide performance comparison for software, hardware, and systems.


Boolean

Pertaining to the principles of mathematical logic developed by George Boole, a nineteenth-century mathematician. Boolean algebra is the study of operations carried out on variables that can have only one of two possible values, i.e., 1 (true) and 0 (false). As ADD, MULTIPLY, and DIVIDE are the primary operations of arithmetic, AND, OR, and NOT are the primary operations of Boolean logic.
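The two-valued operations described above can be tabulated directly; a minimal Python sketch (illustrative, not taken from the source):

```python
# Build the truth table for the three primary Boolean operations.
def truth_table():
    rows = []
    for a in (False, True):
        for b in (False, True):
            # Columns: a, b, a AND b, a OR b, NOT a
            rows.append((a, b, a and b, a or b, not a))
    return rows

# Print the table using 1 for true and 0 for false.
for a, b, a_and_b, a_or_b, not_a in truth_table():
    print(int(a), int(b), int(a_and_b), int(a_or_b), int(not_a))
```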


Bug

A fault in a program which causes the program to perform in an unintended or unanticipated manner. See: anomaly, defect, error, exception, fault.


Build

(1) A version of software that meets a specified subset of the requirements that the completed software will meet. (2) The period of time during which such a version is developed.
Note: The relationship to the terms 'build' and 'version' is basically up to the developer; for example, it may take several versions to reach a build, a build may be released in several parallel versions (such as different sites), or the terms may be used as synonyms.



Capability Maturity Model (CMM)

A description of the stages through which software organizations evolve as they define, implement, measure, control and improve their software processes. The model is a guide for selecting the process improvement strategies by facilitating the determination of current process capabilities and identification of the issues most critical to software quality and process improvement. [SEI/CMU-93-TR-25]

Cause-effect graphing

(1) Test data selection technique. The input and output domains are partitioned into classes and analysis is performed to determine which input classes cause which effect. A minimal set of inputs is chosen which will cover the entire effect set. (NBS) (2) A systematic method of generating test cases representing combinations of conditions. (G. Myers) See: testing, functional.

Change control

The processes, authorities for, and procedures to be used for all changes that are made to the computerized system and/or the system's data. Change control is a vital subset of the Quality Assurance [QA] program within an establishment and should be clearly described in the establishment's SOPs. See: configuration control.

Change proposal

The formal documentation that is prepared for a proposed change in accordance with the CMP Change Procedure.

Change request

The formal documentation that is prepared for a request to change a specification in accordance with the CMP Change Procedure.

Change tracker

A software tool which documents all changes made to a program.

Client-server

A simple definition of client / server computing is that server software accepts requests for data from client software and returns the results to the client. The client manipulates the data and presents the results to the user.


Code

See: program, source code.

Code audit


An independent review of source code by a person, team, or tool to verify compliance with software design documentation and programming standards. Correctness and efficiency may also be evaluated. (IEEE) Contrast with code inspection, code review, code walkthrough. See: static analysis.

Code inspection

(G. Myers/NBS)

A manual [formal] testing [error detection] technique where the programmer reads source code, statement by statement, to a group who ask questions, analyzing the program logic, analyzing the code with respect to a checklist of historically common programming errors, and analyzing its compliance with coding standards. Contrast with code audit, code review, code walkthrough. This technique can also be applied to other software and configuration items. Syn: Fagan inspection. See: static analysis.

Code review


A meeting at which software code is presented to project personnel, managers, users, customers, or other interested parties for comment or approval. (IEEE) Contrast with code audit, code inspection, code walkthrough. See: static analysis.

Code walkthrough

(G. Myers/NBS)

A manual testing [error detection] technique where program [source code] logic [structure] is traced manually [mentally] by a group with a small set of test cases, while the state of program variables is manually monitored, to analyze the programmer's logic and assumptions. Contrast with code audit, code inspection, code review. See: static analysis.

Coding standards

Written procedures describing coding [programming] style conventions specifying rules governing the use of individual constructs provided by the programming language, and naming, formatting, and documentation requirements which prevent programming errors, control complexity and promote understandability of the source code. Syn: development standards, programming standards.

Commercial off-the-shelf (COTS) software

Commercially available applications sold by vendors through public catalogue listings. COTS software is not intended to be customised or enhanced. Contract-negotiated software developed for a specific application is not COTS software.


1. The time it takes for a packet to cross a network connection, from sender to receiver.
2. The period of time that a frame is held by a network device before it is forwarded.
Two of the most important parameters of a communications channel are its latency, which should be low, and its bandwidth, which should be high. Latency is particularly important for a synchronous protocol, where each packet must be acknowledged before the next can be transmitted.



Comparator

A software tool that compares two computer programs, files, or sets of data to identify commonalities or differences. Typical objects of comparison are similar versions of source code, object code, database files, or test results. (IEEE)

Compatibility testing

Testing how well software performs in a particular hardware/software/operating system/network/etc. environment.



Completeness

The property that all necessary parts of the entity are included. Completeness of a product is often used to express the fact that all requirements have been met by the product. (NIST) See: traceability analysis.



Complexity

(1) The degree to which a system or component has a design or implementation that is difficult to understand and verify. (IEEE)

(2) Pertaining to any of a set of structure-based metrics that measure the attribute in (1).

Computer aided software engineering (CASE)

An automated system for the support of software development including an integrated tool set, i.e., programs, which facilitate the accomplishment of software engineering methods and tasks such as project planning and estimation, system and software requirements analysis, design of data structure, program architecture and algorithm procedure, coding, testing and maintenance.

Computer system audit (ISO)

An examination of the procedures used in a computer system to evaluate their effectiveness and correctness and to recommend improvements. (ISO) See: software audit.

Configurable, off-the-shelf software (COTS)

Application software, sometimes general purpose, written for a variety of industries or users in a manner that permits users to modify the program to meet their individual needs.



Configuration

The functional and/or physical characteristics of hardware/software as set forth in technical documentation and achieved in a product. (MIL-STD-973)

Configuration control


An element of configuration management, consisting of the evaluation, coordination, approval or disapproval, and implementation of changes to configuration items after formal establishment of their configuration identification. (IEEE) See: change control.

Configuration control board

A board composed of technical and administrative representatives who approve or disapprove proposed engineering changes to an approved baseline.

Configuration elements

Specifications, drawings, source code, etc., that define the configuration of a CSCI.

Configuration management


A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify compliance with specified requirements. (IEEE) See: configuration control, change control.

Configuration management (CM)

A discipline applying technical and administrative direction and surveillance to:

  • identify and document the functional and physical characteristics of CIs;
  • audit the CIs to verify conformance to specifications, interface control documents and other contract requirements;
  • control changes to CIs and their related documentation; and
  • record and report information needed to manage CIs effectively, including the status of proposed changes and the implementation status of approved changes.

Configuration status accounting

The recording and reporting of information needed to manage configuration effectively, including:

  • a listing of the approved configuration identification;
  • the status of proposed changes, deviations, and waivers to the configuration;
  • the implementation status of approved changes, and;
  • the configuration of all units of the CI in the operational inventory.


Computer software configuration item

A software item which is identified for configuration management (MIL-STD-973); or, an aggregation of software that satisfies an end-use function and is designated for separate configuration management (MIL-STD-498).



Consistency

The degree of uniformity, standardization, and freedom from contradiction among the documents or parts of a system or component. (IEEE)

Consistency checker

A software tool used to test requirements in design specifications for both consistency and completeness.

Control flow analysis


A software V&V task to ensure that the proposed control flow is free of problems, such as design or code elements that are unreachable or incorrect. (IEEE)

Control flow diagram


A diagram that depicts the set of all possible sequences in which operations may be performed during the execution of a system or program. Types include box diagram, flowchart, input-process-output chart, state diagram. (IEEE) Contrast with data flow diagram. See: call graph, structure chart.

Corrective maintenance


Maintenance performed to correct faults in hardware or software. (IEEE) Contrast with adaptive maintenance, perfective maintenance.



Correctness

The degree to which software is free from faults in its specification, design, and coding; the degree to which software, documentation, and other items meet specified requirements; and the degree to which software, documentation, and other items meet user needs and expectations, whether specified or not. (IEEE)

Coverage analysis


Determining and assessing measures associated with the invocation of program structural elements to determine the adequacy of a test run. Coverage analysis is useful when attempting to execute each statement, branch, path, or iterative structure in a program. Tools that capture this data and provide reports summarizing relevant information have this feature. (NIST) See: testing, branch; testing, path; testing, statement.



Crash

The sudden and complete failure of a computer system or component. (IEEE)

Critical control point

A function or an area in a manufacturing process or procedure, the failure of which, or loss of control over which, may have an adverse effect on the quality of the finished product and may result in an unacceptable health risk.

Critical design review


A review conducted to verify that the detailed design of one or more configuration items satisfies specified requirements; to establish the compatibility among the configuration items and other items of equipment, facilities, software, and personnel; to assess risk areas for each configuration item; and, as applicable, to assess the results of producibility analyses, review preliminary hardware product specifications, evaluate preliminary test planning, and evaluate the adequacy of preliminary operation and support documents. (IEEE) See: preliminary design review, system design review.



Criticality

The degree of impact that a requirement, module, error, fault, failure, or other item has on the development or operation of a system. (IEEE) Syn: severity.

Criticality analysis


Analysis which identifies all software requirements that have safety implications, and assigns a criticality level to each safety-critical requirement based upon the estimated risk. (IEEE)

Cyclic redundancy [check] code (CRC)

A technique for error detection in data communications used to assure a program or data file has been accurately transferred. The CRC is the result of a calculation on the set of transmitted bits by the transmitter which is appended to the data. At the receiver the calculation is repeated and the results compared to the encoded value. The calculations are chosen to optimize error detection. Contrast with check summation, parity check.
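The transmit/verify cycle described above can be sketched with Python's standard-library CRC-32 routine; the frame layout (CRC appended big-endian) is an illustrative assumption:

```python
import zlib

# Transmitter: compute the CRC over the payload and append it to the frame.
payload = b"hello, world"
crc = zlib.crc32(payload)
frame = payload + crc.to_bytes(4, "big")

# Receiver: repeat the calculation and compare with the encoded value.
received_payload = frame[:-4]
received_crc = int.from_bytes(frame[-4:], "big")
assert zlib.crc32(received_payload) == received_crc  # transfer is intact

# A single flipped bit changes the recomputed CRC, exposing the corruption.
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]
assert zlib.crc32(corrupted[:-4]) != int.from_bytes(corrupted[-4:], "big")
```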

Cyclomatic complexity

(1) The number of independent paths through a program. (McCabe) (2) The cyclomatic complexity of a program is equivalent to the number of decision statements plus 1. (NBS)
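The "decision statements plus 1" rule in (2) can be sketched with Python's `ast` module; which node types count as decisions is an assumption of this sketch (a full McCabe count also handles `and`/`or` operands, exception handlers, and so on):

```python
import ast

def cyclomatic_complexity(source):
    """Approximate McCabe complexity: decision statements + 1 (per the NBS definition)."""
    decisions = (ast.If, ast.For, ast.While, ast.BoolOp)  # assumed decision node types
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, decisions) for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "negative"
    if x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # two `if` decisions + 1 = 3
```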



Data analysis


(1) Evaluation of the description and intended use of each data item in the software design to ensure the structure and intended use will not result in a hazard. Data structures are assessed for data dependencies that circumvent isolation, partitioning, data aliasing, and fault containment issues affecting safety, and the control or mitigation of hazards. (IEEE) (2) Evaluation of the data structure and usage in the code to ensure each is defined and used properly by the program. Usually performed in conjunction with logic analysis.

Data corruption (ISO)

A violation of data integrity. (ISO) Syn: data contamination.

Data dictionary


(1) A collection of the names of all data items used in a software system, together with relevant properties of those items; e.g., length of data item, representation, etc. (IEEE) (2) A set of definitions of data flows, data elements, files, data bases, and processes referred to in a leveled data flow diagram set.

Data flow analysis


A software V&V task to ensure that the input and output data and their formats are properly defined, and that the data flows are correct. (IEEE)

Data flow diagram


A diagram that depicts data sources, data sinks, data storage, and processes performed on data as nodes, and logical flow of data as links between the nodes. Syn: data flowchart, data flow graph. (IEEE)

Data integrity


The degree to which a collection of data is complete, consistent, and accurate. (IEEE) Syn: data quality.

Data validation

(1) A process used to determine if data are inaccurate, incomplete, or unreasonable. The process may include format checks, completeness checks, check key tests, reasonableness checks and limit checks. (ISO) (2) The checking of data for correctness or compliance with applicable standards, rules, and conventions.
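The format, completeness, and limit checks listed in (1) might look like this minimal sketch (the field name and range limits are illustrative assumptions):

```python
def validate_age_record(record):
    """Run completeness, format, and limit checks on one input record."""
    errors = []
    if "age" not in record:                    # completeness check
        errors.append("missing field: age")
    elif not isinstance(record["age"], int):   # format check
        errors.append("age must be an integer")
    elif not 0 <= record["age"] <= 130:        # reasonableness / limit check
        errors.append("age out of range")
    return errors

print(validate_age_record({"age": 42}))   # []
print(validate_age_record({"age": 999}))  # ['age out of range']
```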


Debugging

(G. Myers)

Determining the exact nature and location of a program error, and fixing the error. (G. Myers)

Decision coverage

(G. Myers)

A test coverage criterion requiring enough test cases such that each decision has a true and false result at least once, and that each statement is executed at least once. (G. Myers) Syn: branch coverage. Contrast with condition coverage, multiple condition coverage, path coverage, statement coverage.
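A minimal illustration of the criterion: the single decision below must evaluate both true and false across the test suite (the function and values are invented for illustration):

```python
def grant_discount(total, is_member):
    # One decision: for decision (branch) coverage, the test suite must
    # drive the condition below to both true and false at least once.
    if total > 100 and is_member:
        return total * 0.9
    return total

# Two cases suffice for decision coverage here: one takes the true branch,
# one the false branch, and together they execute every statement.
assert grant_discount(200, True) == 180.0   # decision is true
assert grant_discount(50, False) == 50      # decision is false
```

Note that multiple condition coverage would demand more cases, since `total > 100` and `is_member` would each need to be exercised in every combination.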

Decision table

A table used to show sets of conditions and the actions resulting from them.
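One common realization is to store the table as data and look up the action (the order-processing rules here are invented for illustration):

```python
# A decision table as data: each row maps a combination of conditions
# (in_stock, paid) to the resulting action.
decision_table = [
    ((True,  True ), "ship"),
    ((True,  False), "await payment"),
    ((False, True ), "backorder"),
    ((False, False), "reject"),
]

def decide(in_stock, paid):
    # Find the row whose conditions match and return its action.
    for conditions, action in decision_table:
        if conditions == (in_stock, paid):
            return action

print(decide(True, False))  # 'await payment'
```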


Defect

Nonconformance to requirements. See: anomaly, bug, error, exception, fault. For defect analysis, see: failure analysis.


An issue that requires tracking and/or corrective action. Depending on the attributes of defects, they may be test incidents, software bugs, design problems, enhancement requests, or anything indicating incorrect or undesired operation.

Defect analysis

Analyzing defects, and trying to find their causes, in order to make improvements.

Design-based testing


Designing tests based on objectives derived from the architectural or detail design of the software (e.g., tests that execute specific invocation paths or probe the worst case behaviour of algorithms).

Design of experiments

A methodology for planning experiments so that data appropriate for [statistical] analysis will be collected.

Design phase


The period of time in the software life cycle during which the designs for architecture, software components, interfaces, and data are created, documented, and verified to satisfy requirements. (IEEE)

Design specification


A specification that documents how a system is to be built. It typically includes system or component structure, algorithms, control logic, data structures, data set [file] use information, input/output formats, interface descriptions, etc. (NIST) Contrast with design standards, requirement. See: software design description.

Development methodology


A systematic approach to software creation that defines development phases and specifies the activities, products, verification procedures, and completion criteria for each phase. (ANSI) See: incremental development, rapid prototyping, spiral model, waterfall model.



Diagnostic

Pertaining to the detection and isolation of faults or failures. (IEEE) For example, a diagnostic message, a diagnostic manual.

Different software system analysis


Analysis of the allocation of software requirements to separate computer systems to reduce integration and interface errors related to safety. Performed when more than one software system is being integrated. (IEEE) See: testing, compatibility.

Dynamic analysis


Analysis that is performed by executing the program code. (NBS) Contrast with static analysis. See: testing.



Embedded software


Software that is part of a larger system and performs some of the requirements of that system; e.g., software used in an aircraft or rapid transit system. Such software does not provide an interface with the user. (IEEE) See: firmware.

End user


(1) A person, device, program, or computer system that uses an information system for the purpose of data processing in information exchange. (ANSI) (2) A person whose occupation requires the use of an information system but does not require any knowledge of computers or computer programming. See: user.

End-to-end testing

Similar to system testing; the 'macro' end of the test scale; involves testing of a complete application environment in a situation that mimics real-world use, such as interacting with a database, using network communications, or interacting with other hardware, applications, or systems if appropriate.

Entity relationship diagram (IEEE)

A diagram that depicts a set of real-world entities and the logical relationships among them. (IEEE) See: data structure diagram.


Environment

(1) Everything that supports a system or the performance of a function. (ANSI) (2) The conditions that affect the performance of a system or function.

Error (ISO)

A discrepancy between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition. (ISO) See: anomaly, bug, defect, exception, fault.

Error analysis

See: debugging, failure analysis.

Error detection

Techniques used to identify errors in data transfers. See: check summation, cyclic redundancy check [CRC], parity check, longitudinal redundancy.
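Parity checking, the simplest of the listed techniques, can be sketched as follows (even parity over a list of bits; the representation is illustrative):

```python
def add_parity_bit(bits):
    """Even parity: append a bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits_with_parity):
    """The check passes when the 1s count, including the parity bit, is even."""
    return sum(bits_with_parity) % 2 == 0

word = add_parity_bit([1, 0, 1, 1])
print(parity_ok(word))   # True: word arrived intact

word[2] ^= 1             # single-bit error in transit
print(parity_ok(word))   # False: the parity check detects it
```

A single parity bit detects any odd number of flipped bits but misses an even number, which is why stronger codes such as CRCs are used for longer transfers.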

Error guessing


A test case design technique where the experience of the tester is used to postulate what faults exist, and to design tests specifically to expose them. [BS7925-1]

Error guessing (NBS)

Test data selection technique. The selection is to pick values that are likely to cause errors. (NBS)

Error seeding


The process of intentionally adding known faults to those already in a computer program for the purpose of monitoring the rate of detection and removal, and estimating the number of faults remaining in the program. (IEEE) Contrast with mutation analysis.
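Under the usual seeding assumption that planted and indigenous faults are equally detectable (Mills' hypothesis), the estimate of remaining faults works out as in this sketch:

```python
def estimate_remaining_faults(seeded, seeded_found, indigenous_found):
    """If testing finds seeded_found of `seeded` planted faults plus
    indigenous_found real faults, the total indigenous fault count is
    estimated as indigenous_found * seeded / seeded_found; subtracting
    the faults already found gives those estimated to remain."""
    total_indigenous = indigenous_found * seeded / seeded_found
    return total_indigenous - indigenous_found

# 20 faults seeded; testing uncovers 10 of them plus 30 indigenous faults,
# suggesting roughly 60 indigenous faults in all, 30 still undetected.
print(estimate_remaining_faults(seeded=20, seeded_found=10, indigenous_found=30))  # 30.0
```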


Error

The amount by which a result is incorrect. Mistakes are usually the result of a human action. Human mistakes (errors) often result in faults in the source code, specification, documentation, or other product deliverables. When such a fault is encountered during execution, the result may be a program failure, whose severity may be high, medium, or low.

Exception (IEEE)

An event that causes suspension of normal program operation. Types include addressing exception, data exception, operation exception, overflow exception, protection exception, underflow exception. (IEEE) See: anomaly, bug, defect, error, fault.



Fagan inspection

See: code inspection.



Fail-safe

A system or component that automatically places itself in a safe operational mode in the event of a failure. (IEEE)



Failure

The inability of a system or component to perform its required functions within specified performance requirements. (IEEE) See: bug, crash, exception, fault.

Failure analysis

Determining the exact nature and location of a program error in order to fix the error, to identify and fix other similar errors, and to initiate corrective action to prevent future occurrences of this type of error. Contrast with debugging.

Failure Modes and Effects Analysis (FMEA)


A method of reliability analysis intended to identify failures, at the basic component level, which have significant consequences affecting the system performance in the application considered. (IEC)

Failure Modes and Effects Criticality Analysis (IEC)

A logical extension of FMEA which analyzes the severity of the consequences of failure. (IEC)


Fault

An incorrect step, process, or data definition in a computer program which causes the program to perform in an unintended or unanticipated manner. See: anomaly, bug, defect, error, exception.

Fault seeding

See: error seeding.

Fault Tree Analysis (FTA)


The identification and analysis of conditions and factors which cause or contribute to the occurrence of a defined undesirable event, usually one which significantly affects system performance, economy, safety or other required characteristics. (IEC)

Feasibility study

Analysis of the known or anticipated need for a product, system, or component to assess the degree to which the requirements, designs, or plans can be implemented.

Flowchart or flow diagram (IEEE)

A control flow diagram in which suitably annotated geometrical figures are used to represent operations, data, or equipment, and arrows are used to indicate the sequential flow from one to another. Syn: flow diagram. See: block diagram, box diagram, bubble chart, graph, input-process-output chart, structure chart.

Flowchart or flow diagram (ISO)

A graphical representation in which symbols are used to represent such things as operations, data, flow direction, and equipment, for the definition, analysis, or solution of a problem. (ISO)

Formal qualification review (IEEE)

The test, inspection, or analytical process by which a group of configuration items comprising a system is verified to have met specific contractual performance requirements. (IEEE) Contrast with code review, design review, requirements review, test readiness review.

Function (ISO)

A mathematical entity whose value, namely, the value of the dependent variable, depends in a specified manner on the values of one or more independent variables, with not more than one value of the dependent variable corresponding to each permissible combination of values from the respective ranges of the independent variables.

Functional analysis


Verifies that each safety-critical software requirement is covered and that an appropriate criticality level is assigned to each software element. (IEEE)

Functional configuration audit


An audit conducted to verify that the development of a configuration item has been completed satisfactorily, that the item has achieved the performance and functional characteristics specified in the functional or allocated configuration identification, and that its operational and support documents are complete and satisfactory. (IEEE) See: physical configuration audit.

Functional design


(1) The process of defining the working relationships among the components of a system. (IEEE) See: architectural design. (2) The result of the process in (1).

Functional requirement (IEEE)

A requirement that specifies a function that a system or system component must be able to perform.





Graph

A diagram or other representation consisting of a finite set of nodes and internode connections called edges or arcs. (IEEE) Contrast with blueprint. See: block diagram, box diagram, bubble chart, call graph, cause-effect graph, control flow diagram, data flow diagram, directed graph, flowchart, input-process-output chart, structure chart, transaction flowgraph.

Graphic software specifications

Documents such as charts, diagrams, and graphs which depict program structure, states of data, control, transaction flow, HIPO, and cause-effect relationships; and tables, including truth, decision, event, state-transition, module interface, and exception conditions/responses, necessary to establish design integrity.




Hazard

A condition that is prerequisite to a mishap. (DOD)

Hazard analysis

A technique used to identify conceivable failures affecting system performance, human safety or other required characteristics. See: FMEA, FMECA, FTA, software hazard analysis, software safety requirements analysis, software safety design analysis, software safety code analysis, software safety test analysis, software safety change analysis.

Hazard probability

(DOD) The aggregate probability of occurrence of the individual events that create a specific hazard.

Hazard severity

(DOD) An assessment of the consequence of the worst credible mishap that could be caused by a specific hazard.




Inspection

A formal evaluation technique in which software requirements, design, or code are examined in detail by a person or group other than the author to detect faults, violations of development standards, and other problems. [IEEE94] A quality improvement process for written material that consists of two dominant components: product (document) improvement and process improvement (document production and inspection).


Inspection (NBS)

A manual testing technique in which program documents [specifications (requirements, design), source code, or user's manuals] are examined in a very formal and disciplined manner to discover errors, violations of standards, and other problems. Checklists are a typical vehicle used in accomplishing this technique. See: static analysis, code audit, code inspection, code review, code walkthrough.

Installation and checkout phase

(IEEE) The period of time in the software life cycle during which a software product is integrated into its operational environment and tested in this environment to ensure that it performs as required.

Institute of Electrical and Electronics Engineers (IEEE)

345 East 47th Street, New York, NY 10017. An organization involved in the generation and promulgation of standards. IEEE standards represent the formalization of current norms of professional practice through the process of obtaining the consensus of concerned, practicing professionals in the given field.


Instruction

(1) (ANSI/IEEE) A program statement that causes a computer to perform a particular operation or set of operations. (2) (ISO) In a programming language, a meaningful expression that specifies one operation and identifies its operands, if any.

Instruction set

(1) (IEEE) The complete set of instructions recognized by a given computer or provided by a given programming language. (2) (ISO) The set of the instructions of a computer, of a programming language, or of the programming languages in a programming system. See: computer instruction set.


Instrumentation (NBS)

The insertion of additional code into a program in order to collect information about program behavior during program execution. Useful for dynamic analysis techniques such as assertion checking, coverage analysis, and tuning.
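A minimal sketch of the idea, assuming a hypothetical decorator named `instrument`: extra bookkeeping code is inserted around a function so that execution counts can be collected for later coverage-style analysis.

```python
from functools import wraps

call_counts = {}  # collected behavioral data: function name -> call count

def instrument(fn):
    """Wrap fn so each execution is recorded before the original body runs."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        call_counts[fn.__name__] = call_counts.get(fn.__name__, 0) + 1
        return fn(*args, **kwargs)
    return wrapper

@instrument
def absolute(x):
    # the instrumented program under analysis
    return -x if x < 0 else x

absolute(-3)
absolute(7)
# call_counts now records that absolute() executed twice
```

Real instrumentation tools rewrite source or bytecode rather than using decorators, but the principle of inserted bookkeeping code is the same.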


Integration

The process of combining software components or hardware components, or both, into an overall system.


Interface

A point of communication between two or more processes, persons, or other physical entities.

Interface (ISO)

A shared boundary between two functional units, defined by functional characteristics, common physical interconnection characteristics, signal characteristics, and other characteristics, as appropriate. The concept involves the specification of the connection of two devices having different functions.

Interface analysis

(IEEE) Evaluation of: (1) software requirements specifications with hardware, user, operator, and software interface requirements documentation, (2) software design description records with hardware, operator, and software interface requirements specifications, (3) source code with hardware, operator, and software interface design documentation, for correctness, consistency, completeness, accuracy, and readability. Entities to evaluate include data items and control items.

Interface requirement

(IEEE) A requirement that specifies an external item with which a system component must interact, or sets forth constraints on formats, timing, or other factors caused by such an interaction.

Invalid inputs

These are not only inputs outside the valid range for data to be input (e.g., outside a specified input range of 50 to 100), but also unexpected inputs, especially when these unexpected inputs may easily occur; e.g., the entry of alpha characters or special keyboard characters when only numeric data is valid, or the input of abnormal command sequences to a program.
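A sketch of both kinds of invalid input from the definition, using the 50-to-100 range it mentions; the validator name is made up for illustration.

```python
def validate_input(raw):
    """Return True only for numeric input within the specified range 50-100."""
    try:
        value = int(raw)          # alpha or special characters raise ValueError
    except ValueError:
        return False              # unexpected input: not numeric at all
    return 50 <= value <= 100     # invalid input: outside the valid range

# "101" is out of range; "abc" and "$#" are unexpected non-numeric input
results = [validate_input(s) for s in ["75", "101", "abc", "$#", "50"]]
```

A test set should include both categories, since programs often handle out-of-range numbers correctly while crashing on unexpected character input.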

Invalid inputs (NBS)

Test data that lie outside the domain of the function the program represents.


[QA Dictionary Back to Top]


Job

(IEEE) A user-defined unit of work that is to be accomplished by a computer. For example, the compilation, loading, and execution of a computer program. See: job control language.

Job control language

(IEEE) A language used to identify a sequence of jobs, describe their requirements to an operating system, and control their execution.


[QA Dictionary Back to Top]

Kiviat Chart

A graph that provides a method of viewing the impact of multiple metrics on a source code module or set of files. This presentation allows easy determination of values (test metrics) that fall either under or over the expected minimum or maximum limits.


[QA Dictionary Back to Top]

Latency (computer science)

The time it takes for a specific block of data on a data track to rotate around to the read/write head [syn: rotational latency]

Life cycle methodology

The use of any one of several structured methods to plan, design, implement, test, and operate a system from its conception to the termination of its use. See: waterfall model.

Logic analysis (IEEE)

Evaluates the safety-critical equations, algorithms, and control logic of the software design.

Low-level language

See: assembly language. The advantage of assembly language is that it provides bit-level control of the processor, allowing tuning of the program for optimal speed and performance. For time-critical operations, assembly language may be necessary in order to generate code which executes fast enough for the required operations. The disadvantage of assembly language is the high level of complexity and detail required in the programming. This makes the source code harder to understand, thus increasing the chance of introducing errors during program development and maintenance.


[QA Dictionary Back to Top]

Machine code

(IEEE) Computer instructions and definitions expressed in a form [binary code] that can be recognized by the CPU of a computer. All source code, regardless of the language in which it was programmed, is eventually converted to machine code. Syn: object code.

Machine language

See: machine code.


Macro

(IEEE) In software engineering, a predefined sequence of computer instructions that is inserted into a program, usually during assembly or compilation, at each place that its corresponding macroinstruction appears in the program.


Mainframe

Term used to describe a large computer.


Maintainability

(IEEE) The ease with which a software system or component can be modified to correct faults, improve performance or other attributes, or adapt to a changed environment. Syn: modifiability.


Maintenance

(QA) Activities such as adjusting, cleaning, modifying, and overhauling equipment to assure performance in accordance with requirements. Maintenance to a software system includes correcting software errors, adapting software to a new environment, or making enhancements to software. See: adaptive maintenance, corrective maintenance, perfective maintenance.

Mean time between failures

A measure of the reliability of a computer system, equal to the average operating time of equipment between failures, as calculated on a statistical basis from the known failure rates of various components of the system.

Mean time to failure

A measure of reliability, giving the average time before the first failure.

Mean time to repair

A measure of reliability of a piece of repairable equipment, giving the average time between repairs.
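The three measures above reduce to simple averages. A sketch with made-up failure data: MTBF averages the operating intervals between failures, MTTR averages the repair times, and the two combine into steady-state availability.

```python
# Illustrative (invented) failure log for one piece of repairable equipment
operating_hours = [120.0, 80.0, 200.0]   # uptime between successive failures
repair_hours = [2.0, 1.0, 3.0]           # downtime spent on each repair

mtbf = sum(operating_hours) / len(operating_hours)  # mean time between failures
mttr = sum(repair_hours) / len(repair_hours)        # mean time to repair
availability = mtbf / (mtbf + mttr)                 # fraction of time in service
```

For non-repairable items the analogous first-failure average is MTTF.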


Measurable

Capable of being measured.


Measure

(IEEE) A quantitative assessment of the degree to which a software product or process possesses a given attribute.


Measurement

The process of determining the value of some quantity in terms of a standard unit.


Metric

Measures used to indicate progress or achievement.

Metric based test data generation

(NBS) The process of generating test sets for structural testing based upon use of complexity metrics or coverage metrics.

Metric, software quality

(IEEE) A quantitative measure of the degree to which software possesses a given attribute which affects its quality.


Milestone

A scheduled event for which some individual is accountable and that is used to measure progress. [SEI/CMU-93-TR-25]

Modular software

(IEEE) Software composed of discrete parts. See: structured design.


Modularity

(IEEE) The degree to which a system or computer program is composed of discrete components such that a change to one component has minimal impact on other components.


Module

In programming languages, a self-contained subdivision of a program that may be separately compiled.


Module

A packaged functional hardware unit suitable for use with other components.


MTBF

Mean time between failures.


MTTF

Mean time to failure.


MTTR

Mean time to repair.

Multiple condition coverage

(Myers) A test coverage criterion which requires enough test cases such that all possible combinations of condition outcomes in each decision, and all points of entry, are invoked at least once. Contrast with branch coverage, condition coverage, decision coverage, path coverage, statement coverage.

Mutation analysis


(NBS) A method to determine test set thoroughness by measuring the extent to which a test set can discriminate the program from slight variants [mutants] of the program. Contrast with error seeding.
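A sketch of the idea: a mutant is produced by a single small change (here, `*` mutated to `+`), and a test set is thorough only if some test input makes the mutant's output differ from the original's, "killing" it. The function names and test data are invented for illustration.

```python
def original_area(w, h):
    return w * h

def mutant_area(w, h):
    # slight variant of the program: `*` mutated to `+`
    return w + h

weak_tests = [(2, 2)]             # cannot discriminate: 2*2 == 2+2
strong_tests = [(2, 2), (3, 5)]   # 3*5 != 3+5, so this set kills the mutant

def kills(tests):
    """True if at least one test distinguishes the program from the mutant."""
    return any(original_area(w, h) != mutant_area(w, h) for w, h in tests)
```

The fraction of mutants killed is the mutation score; a surviving mutant points at inputs the test set never exercises meaningfully.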


[QA Dictionary Back to Top]


NIST

National Institute of Standards and Technology.

NIST - National Institute of Standards and Technology

Gaithersburg, MD 20899. A federal agency under the Department of Commerce, originally established by an act of Congress on March 3, 1901 as the National Bureau of Standards. The Institute's overall goal is to strengthen and advance the Nation's science and technology and facilitate their effective application for public benefit. The National Computer Systems Laboratory conducts research and provides, among other things, the technical foundation for computer-related policies of the Federal Government.

Noncritical code analysis (IEEE)

(1) Examines software elements that are not designated safety-critical and ensures that these elements do not cause a hazard. (IEEE) (2) Examines portions of the code that are not considered safety-critical code to ensure they do not cause hazards. Generally, safety-critical code should be isolated from non-safety-critical code. This analysis is to show this isolation is complete and that interfaces between safety-critical code and non-safety-critical code do not create hazards.


[QA Dictionary Back to Top]


Object

In object oriented programming, a self-contained module [encapsulation] of data and the programs [services] that manipulate [process] that data.

Object code


(NIST) A code expressed in machine language ["1"s and "0"s] which is normally an output of a given translation process that is ready to be executed by a computer. Syn: machine code. Contrast with source code. See: object program.

Object oriented design (IEEE)

A software development technique in which a system or component is expressed in terms of objects and connections between those objects.

Object oriented language


(IEEE) A programming language that allows the user to express a program in terms of objects and messages between those objects. Examples include C++, Smalltalk, and LOGO.

Object oriented programming

A technology for writing programs that are made up of self-sufficient modules that contain all of the information needed to manipulate a given data structure. The modules are created in class hierarchies so that the code or methods of a class can be passed to other modules. New object modules can be easily created by inheriting the characteristics of existing classes. See: object, object oriented design.
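A minimal sketch of the class-hierarchy idea in the definition: a new module (the hypothetical `CheckedQueue`) is created by inheriting the data and methods of an existing class (`Queue`) and extending them.

```python
class Queue:
    """Self-sufficient module: the data structure plus the code that manipulates it."""
    def __init__(self):
        self._items = []
    def put(self, item):
        self._items.append(item)
    def get(self):
        return self._items.pop(0)
    def __len__(self):
        return len(self._items)

class CheckedQueue(Queue):
    """New module created by inheriting Queue; adds a capacity check."""
    def __init__(self, capacity):
        super().__init__()
        self.capacity = capacity
    def put(self, item):
        if len(self) >= self.capacity:
            raise OverflowError("queue full")
        super().put(item)   # reuse the inherited method
```

The subclass reuses the parent's storage and `get` unchanged and overrides only what differs, which is the inheritance mechanism the definition describes.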

Object program (IEEE)

A computer program that is the output of an assembler or compiler.


Octal

The base 8 number system. Digits are 0, 1, 2, 3, 4, 5, 6, and 7.


On-line

(IEEE) Pertaining to a system or mode of operation in which input data enter the computer directly from the point of origin or output data are transmitted directly to the point where they are used. For example, an airline reservation system. Contrast with batch. See: conversational, interactive, real time.


OOP

Object oriented programming.

Operating system

(ISO) Software that controls the execution of programs, and that provides services such as resource allocation, scheduling, input/output control, and data management. Usually, operating systems are predominantly software, but partial or complete hardware implementations are possible.

Operation and maintenance phase

(IEEE) The period of time in the software life cycle during which a software product is employed in its operational environment, monitored for satisfactory performance, and modified as necessary to correct problems or to respond to changing requirements.

Operation exception (IEEE)

An exception that occurs when a program encounters an invalid operation code.


[QA Dictionary Back to Top]

Pareto Analysis

Analyze defect patterns to identify causes and sources. [William E. Lewis, 2000]

Peer Reviews

Peer reviews involve a methodical examination of software work-products by the producer's peers to identify defects and areas where changes are needed.

Performance requirement

(IEEE) A requirement that imposes conditions on a functional requirement; e.g., a requirement that specifies the speed, accuracy, or memory usage with which a given function must be performed.

Peripheral device

Equipment that is directly connected to a computer. A peripheral device can be used to input data, e.g., keypad, bar code reader, transducer, laboratory test equipment; or to output data, e.g., printer, disk drive, video system, tape drive, valve controller, motor controller. Syn: peripheral equipment.

Physical requirement

(IEEE) A requirement that specifies a physical characteristic that a system or system component must possess; e.g., material, shape, size, weight.


Platform

The hardware and software which must be present and functioning for an application program to run [perform] as intended. A platform includes, but is not limited to, the operating system or executive software, communication software, microprocessor, network, input/output hardware, any generic software libraries, database management, user interface software, and the like.


Polling

A technique a CPU can use to learn if a peripheral device is ready to receive data or to send data. In this method each device is checked or polled in turn to determine if that device needs service. The device must wait until it is polled in order to send or receive data. This method is useful if the device's data can wait for a period of time before being processed, since each device must await its turn in the polling scheme before it will be serviced by the processor. Contrast with interrupt.
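A sketch of one polling cycle: the processor visits each device in turn and services only those that report ready. The `Device` class here is an invented stand-in for real hardware status registers.

```python
class Device:
    def __init__(self, name, ready):
        self.name = name
        self.ready = ready    # would be a hardware status flag in practice

def poll(devices):
    """One polling cycle: visit every device in turn, service the ready ones."""
    serviced = []
    for dev in devices:       # each device must wait for its turn
        if dev.ready:
            serviced.append(dev.name)
            dev.ready = False # data transferred; device no longer pending
    return serviced

devices = [Device("keypad", False), Device("printer", True), Device("sensor", True)]
first_cycle = poll(devices)
second_cycle = poll(devices)  # nothing pending on the second pass
```

The contrast with interrupts is visible in the loop: a device that becomes ready just after its turn must wait a full cycle before being serviced.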

Positive and Negative Testing

Test the positive and negative values for all inputs. [William E. Lewis, 2000]

Production database

The computer file that contains the establishment's current production data.


Program

(ISO) A sequence of instructions suitable for processing. Processing may include the use of an assembler, a compiler, an interpreter, or another translator to prepare the program for execution. The instructions may include statements and necessary declarations.

Program design language

(IEEE) A specification language with special constructs and, sometimes, verification protocols, used to develop, analyze, and document a program design.

Program mutation

(IEEE) A computer program that has been purposely altered from the intended version to evaluate the ability of program test cases to detect the alteration. See: testing, mutation.

Programming language

(IEEE) A language used to express computer programs. See: computer language, high-level language, low-level language.

Programming standards

See: coding standards.

Project plan

(NIST) A management document describing the approach taken for a project. The plan typically describes work to be done, resources required, methods to be used, the configuration management and quality assurance procedures to be followed, the schedules to be met, the project organization, etc. Project in this context is a generic term. Some projects may also need integration plans, security plans, test plans, quality assurance plans, etc. See: documentation plan, software development plan, test plan, software engineering.

Proof of correctness

(NBS) The use of techniques of mathematical logic to infer that a relation between program variables assumed true at program entry implies that another relation between program variables holds at program exit.
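A minimal illustration of the entry/exit relations in the definition, using runtime assertions rather than mathematical logic; a true proof of correctness would establish the implication for all inputs, not just the ones tested. The function is an invented example.

```python
def integer_division(a, b):
    # relation between program variables assumed true at program entry
    assert a >= 0 and b > 0
    q, r = 0, a
    while r >= b:
        q, r = q + 1, r - b
    # relation between program variables that holds at program exit
    assert a == q * b + r and 0 <= r < b
    return q, r
```

The exit assertion (`a == q*b + r` with `0 <= r < b`) is exactly the kind of postcondition a Hoare-style proof would derive from the precondition and the loop invariant.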


Protocol

(ISO) A set of semantic and syntactic rules that determines the behavior of functional units in achieving communication.


Prototyping

Using software tools to accelerate the software development process by facilitating the identification of required functionality during analysis and design phases. A limitation of this technique is the identification of system or software problems and hazards. See: rapid prototyping.

Prior Defect History Testing

Test cases are created or rerun for every defect found in prior tests of the system. [William E. Lewis, 2000]


Pseudocode

A combination of programming language and natural language used to express a software design. If used, it is usually the last document produced prior to writing the source code.


[QA Dictionary Back to Top]


QA

Quality assurance.

QA - Quality assurance

(ISO) The planned systematic activities necessary to ensure that a component, module, or system conforms to established technical requirements.

QA - Quality assurance

All actions that are taken to ensure that a development organization delivers products that meet performance requirements and adhere to standards and procedures.

QA - Quality assurance

The policy, procedures, and systematic actions established in an enterprise for the purpose of providing and maintaining some degree of confidence in data integrity and accuracy throughout the life cycle of the data, which includes input, update, manipulation, and output.

QA - Quality assurance

The actions, planned and performed, to provide confidence that all systems and components that influence the quality of the product are working as expected individually and collectively.

QA - Quality assurance, software

A set of activities designed to evaluate the process by which products are developed or manufactured.

QA - Quality assurance, software (IEEE)

A planned and systematic pattern of all actions necessary to provide adequate confidence that an item or product conforms to established technical requirements.


QC

Quality control.

QC - Quality control

The operational techniques and procedures used to achieve quality requirements.

Qualification, installation (FDA)

Establishing confidence that process equipment and ancillary systems are compliant with appropriate codes and approved design intentions, and that manufacturer's recommendations are suitably considered.

Qualification, operational

(FDA) Establishing confidence that process equipment and subsystems are capable of consistently operating within established limits and tolerances.

Qualification, process performance

(FDA) Establishing confidence that the process is effective and reproducible.

Qualification, product performance

(FDA) Establishing confidence through appropriate testing that the finished product produced by a specified process meets all release requirements for functionality and safety.


Quality

The degree to which a program possesses a desired combination of attributes that enable it to perform its specified end use.

Quality Assurance (QA)

Consists of planning, coordinating and other strategic activities associated with measuring product quality against external requirements and specifications (process-related activities).

Quality control

Quality control is a five-step process:
1. Define the attribute(s).
2. Define the attribute check procedure.
3. Carry out the check procedure.
4. Record the result.
5. Take and record any corrective action.

Quality Control (QC)

Consists of monitoring, controlling and other tactical activities associated with the measurement of product quality goals.

Quality management

The central principle of quality management is: say what you'll do, do it, and record that you have done it.


[QA Dictionary Back to Top]

Rapid prototyping

A structured software requirements discovery technique which emphasizes generating prototypes early in the development process to permit early feedback and analysis in support of the development process. Contrast with incremental development, spiral model, waterfall model. See: prototyping.

Real time (IEEE)

Pertaining to a system or mode of operation in which computation is performed during the actual time that an external process occurs, in order that the computation results can be used to control, monitor, or respond in a timely manner to the external process. Contrast with batch. See: conversational, interactive.

Real time processing

A fast-response [immediate response] on-line system which obtains data from an activity or a physical process, performs computations, and returns a response rapidly enough to affect [control] the outcome of the activity or process; e.g., a process control application. Contrast with batch processing.

Record (ISO)

A group of related data elements treated as a unit [A data element (field) is a component of a record, a record is a component of a file (database)].

Record of change

Documentation of changes made to the system. A record of change can be a written document or a database. Normally there are two records of change associated with a computer system: one for hardware and one for software. Changes made to the data are recorded in an audit trail.


Reengineering

The process of examining and altering an existing system to reconstitute it in a new form. May include reverse engineering (analyzing a system and producing a representation at a higher level of abstraction, such as design from code), restructuring (transforming a system from one representation to another at the same level of abstraction), redocumentation (analyzing a system and producing user and support documentation), forward engineering (using software products derived from an existing system, together with new requirements, to produce a new system), and translation (transforming source code from one language to another or from one version of a language to another).

Relational database

Database organization method that links files together as required. Relationships between files are created by comparing data such as account numbers and names. A relational system can take any two or more files and generate a new file from the records that meet the matching criteria. Routine queries often involve more than one data file; e.g., a customer file and an order file can be linked in order to ask a question that relates to information in both files, such as the names of the customers that purchased a particular product. Contrast with network database, flat file.
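A sketch of the customer-file/order-file query described above, using Python's built-in sqlite3 module; the table names, column names, and data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (account INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (account INTEGER, product TEXT);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (1, 'widget'), (2, 'gadget'), (1, 'gadget');
""")

# Link the two files on the shared account number to answer a question that
# spans both: which customers purchased a particular product?
buyers = [row[0] for row in conn.execute(
    "SELECT DISTINCT name FROM customers c "
    "JOIN orders o ON c.account = o.account "
    "WHERE o.product = 'gadget' ORDER BY name")]
```

The join on the comparison `c.account = o.account` is exactly the "comparing data such as account numbers" relationship the definition describes.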

Release (IEEE)

The formal notification and distribution of an approved version. See: version.


Reliability

The probability of failure-free operation for a specified period.

Reliability assessment (ANSI/IEEE)

The process of determining the achieved level of reliability for an existing system or system component.

Reliability (IEEE)

The ability of a system or component to perform its required functions under stated conditions for a specified period of time. See: software reliability.

Requirement (IEEE)

(1) A condition or capability needed by a user to solve a problem or achieve an objective. (2) A condition or capability that must be met or possessed by a system or system component to satisfy a contract, standard, specification, or other formally imposed documents. (3) A documented representation of a condition or capability as in (1) or (2). See: design requirement, functional requirement, implementation requirement, interface requirement, performance requirement, physical requirement.

Requirements analysis (IEEE)

The process of studying user needs to arrive at a definition of a system, hardware, or software requirements.

Requirements phase (IEEE)

The period of time in the software life cycle during which the requirements, such as functional and performance capabilities for a software product, are defined and documented.



Revalidation

Relative to software changes, revalidation means validating the change itself, assessing the nature of the change to determine potential ripple effects, and performing the necessary regression testing.

Review (IEEE)

A process or meeting during which a work product or set of work products, is presented to project personnel, managers, users, customers, or other interested parties for comment or approval. Types include code review, design review, formal qualification review, requirements review, test readiness review. Contrast with audit, inspection. See: static analysis.

Risk (IEEE)

A measure of the probability and severity of undesired effects. Often taken as the simple product of probability and consequence.

Risk assessment (DOD)

A comprehensive evaluation of the risk and its associated impact.

Risk management

An organized process to identify what can go wrong, to quantify and assess associated risks, and to implement/control the appropriate approach for preventing or handling each risk identified.


[QA Dictionary Back to Top]

Safety (DOD)

Freedom from those conditions that can cause death, injury, occupational illness, or damage to or loss of equipment or property, or damage to the environment.

Safety critical (DOD)

A term applied to a condition, event, operation, process, or item whose proper recognition, control, performance, or tolerance is essential to safe system operation or use; e.g., safety critical function, safety critical path, safety critical component.

Safety critical computer software components (DOD)

Those computer software components and units whose errors can result in a potential hazard, or loss of predictability or control of a system.


Security

See: computer system security.

Security testing

Testing how well the system protects against unauthorized internal or external access, wilful damage, etc.; may require sophisticated testing techniques.

Sequence Diagram (UML)

A diagram that describes a pattern of interaction among objects, arranged in a chronological order; it shows the objects participating in the interaction by their "lifelines" and the messages that they send to each other.

Service level agreement (SLA)

A Service Level Agreement (SLA) is a contract between a service provider and a customer that specifies, usually in measurable terms, what services the service provider will furnish.

Side effect

An unintended alteration of a program's behavior caused by a change in one part of the program, without taking into account the effect the change has on another part of the program. See: regression analysis and testing.
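A sketch of the hazard in the definition: a change made in one part of the program (invented discount logic that mutates a shared price table) unintentionally alters the result computed in another part (the invoice total).

```python
PRICES = {"widget": 10.0, "gadget": 20.0}   # state shared by both parts

def apply_discount(item, pct):
    # Intended as a local calculation, but it mutates the shared table:
    PRICES[item] = PRICES[item] * (1 - pct)
    return PRICES[item]

def invoice_total(items):
    # a distant part of the program that also reads PRICES
    return sum(PRICES[i] for i in items)

before = invoice_total(["widget", "gadget"])   # total before the change
apply_discount("widget", 0.5)                  # side effect on PRICES
after = invoice_total(["widget", "gadget"])    # total changed unintentionally
```

Regression testing after a change exists precisely to catch this kind of ripple effect in parts of the program the author did not consider.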

Simulation (IEEE)

A model that behaves or operates like a given system when provided a set of controlled inputs. Contrast with emulation.

Simulation (NBS)

Use of an executable model to represent the behaviour of an object. During testing the computational hardware, the external environment, and even code segments may be simulated.

Simulation analysis (IEEE)

A software V&V task to simulate critical tasks of the software or system environment to analyze logical or performance characteristics that would not be practical to analyze manually.

Simulator (IEEE)

A device, computer program, or system that behaves or operates like a given system when provided a set of controlled inputs. Contrast with emulator. A simulator provides inputs or responses that resemble anticipated process parameters. Its function is to present data to the system at known speeds and in a proper format.

Sizing (IEEE)

The process of estimating the amount of computer storage or the number of source lines required for a software system or component. Contrast with timing.


SLA

See: service level agreement.

Software (ANSI)

Programs, procedures, rules, and any associated documentation pertaining to the operation of a system. Contrast with hardware. See: application software, operating system, system software, utility software.

Software Architecture

Software architecture encompasses:
  • the significant decisions about the organization of a software system,
  • the selection of the structural elements and their interfaces by which the system is composed together with their behavior as specified in the collaboration among those elements,
  • the composition of the structural and behavioral elements into progressively larger subsystems,
  • the architectural style that guides this organization, these elements and their interfaces, their collaborations, and their composition.
Software architecture is not only concerned with structure and behavior, but also with usage, functionality, performance, resilience, reuse, comprehensibility, economic and technology constraints and tradeoffs, and aesthetic concerns.

Software characteristic

An inherent, possibly accidental, trait, quality, or property of software; e.g., functionality, performance, attributes, design constraints, number of states, lines or branches.

Software configuration item

See: configuration item.

Software design description (IEEE)

A representation of software created to facilitate analysis, planning, implementation, and decision making. The software design description is used as a medium for communicating software design information, and may be thought of as a blueprint or model of the system. See: structured design, design description, specification.

Software development notebook (NIST)

A collection of material pertinent to the development of a software module. Contents typically include the requirements, design, technical reports, code listings, test plans, test results, problem reports, schedules, notes, etc. for the module. Syn: software development file.

Software development plan (NIST)

The project plan for the development of a software product. Contrast with software development process, software life cycle.

Software development process (IEEE)

The process by which user needs are translated into a software product; it involves translating user needs into software requirements, transforming the requirements into design, implementing the design in code, and testing the code.

Software diversity (IEEE)

A software development technique in which two or more functionally identical variants of a program are developed from the same specification by different programmers or programming teams with the intent of providing error detection, increased reliability, additional documentation, or reduced probability that programming or compiler errors will influence the end results.

Software documentation (NIST)

Technical data or information, including computer listings and printouts, in human-readable form, that describe or specify the design or details, explain the capabilities, or provide operating instructions for using the software to obtain desired results from a software system. See: specification; specification, requirements; specification, design; software design description; test plan; test report; user's guide.

Software element (IEEE)

A deliverable or in-process document produced or acquired during software development or maintenance. Specific examples include but are not limited to:
(1) Project planning documents; i.e., software development plans, and software verification and validation plans.
(2) Software requirements and design specifications.
(3) Test documentation.
(4) Customer-deliverable documentation.
(5) Program source code.
(6) Representation of software solutions implemented in firmware.
(7) Reports; i.e., review, audit, project status.
(8) Data; i.e., defect detection, test. Contrast with software item. See: configuration item.

Software engineering (IEEE)

The application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software; i.e., the application of engineering to software. See: project plan, requirements analysis, architectural design, structured design, system safety, testing, configuration management.

Software engineering environment (IEEE)

The hardware, software, and firmware used to perform a software engineering effort. Typical elements include computer equipment, compilers, assemblers, operating systems, debuggers, simulators, emulators, test tools, documentation tools, and database management systems.

Software hazard analysis (CDE, CDRH)

The identification of safety-critical software, the classification and estimation of potential hazards, and identification of program path analysis to identify hazardous combinations of internal and environmental program conditions. See: risk assessment, software safety change analysis, software safety code analysis, software safety design analysis, software safety requirements analysis, software safety test analysis, system safety.

Software item (IEEE)

Source code, object code, job control code, control data, or a collection of these items. Contrast with software element.

Software life cycle

(NIST) Period of time beginning when a software product is conceived and ending when the product is no longer available for use. The software life cycle is typically broken into phases denoting activities such as requirements, design, programming, testing, installation, and operation and maintenance. Contrast with software development process. See: waterfall model.

Software management group

A group of specialists who establish, maintain, and improve the software management process used during software development.

Software reliability (IEEE)

The probability that software will not cause the failure of a system for a specified time under specified conditions. The probability is a function of the inputs to and use of the system, as well as a function of the existence of faults in the software. The inputs to the system determine whether existing faults, if any, are encountered.

Software reliability (IEEE)

The ability of a program to perform its required functions accurately and reproducibly under stated conditions for a specified period of time.

Software requirements specification

See: specification; requirements.

Software review (IEEE)

An evaluation of software elements to ascertain discrepancies from planned results and to recommend improvement. This evaluation follows a formal process. Syn: software audit. See: code audit, code inspection, code review, code walkthrough, design review, specification analysis, static analysis.

Software safety change analysis (IEEE)

Analysis of the safety-critical design elements affected directly or indirectly by the change to show the change does not create a new hazard, does not impact on a previously resolved hazard, does not make a currently existing hazard more severe, and does not adversely affect any safety-critical software design element. See: software hazard analysis, system safety.

Software safety code analysis (IEEE)

Verification that the safety-critical portions of the design are correctly implemented in the code. See: logic analysis, data analysis, interface analysis, constraint analysis, programming style analysis, noncritical code analysis, timing and sizing analysis, software hazard analysis, system safety.

Software safety design analysis (IEEE)

Verification that the safety-critical portion of the software design correctly implements the safety-critical requirements and introduces no new hazards. See: logic analysis, data analysis, interface analysis, constraint analysis, functional analysis, software element analysis, timing and sizing analysis, reliability analysis, software hazard analysis, system safety.

Software safety requirements analysis (IEEE)

Analysis evaluating software and interface requirements to identify errors and deficiencies that could contribute to a hazard. See: criticality analysis, specification analysis, timing and sizing analysis, different software systems analyses, software hazard analysis, system safety.

Software safety test analysis (IEEE)

Analysis demonstrating that safety requirements have been correctly implemented and that the software functions safely within its specified environment. Tests may include: unit level tests, interface tests, software configuration item testing, system level testing, stress testing, and regression testing. See: software hazard analysis, system safety.

Source code

The human readable version of the list of instructions [program] that cause a computer to perform a task. Contrast with object code. See: source program, programming language.

Source code (IEEE)

Computer instructions and data definitions expressed in a form suitable for input to an assembler, compiler or other translator.

Source program (IEEE)

A computer program that must be compiled, assembled, or otherwise translated in order to be executed by a computer. Contrast with object program. See: source code.

Spaghetti code

Program source code written without a coherent structure. Implies the excessive use of GOTO instructions. Contrast with structured programming.

Special test data (NBS)

Test data based on input values that are likely to require special handling by the program. See: error guessing; testing, special case.

Specification (IEEE)

A document that specifies, in a complete, precise, verifiable manner, the requirements, design, behavior, or other characteristics of a system or component, and often, the procedures for determining whether these provisions have been satisfied. Contrast with requirement. See: specification, formal; specification, requirements; specification, functional; specification, performance; specification, interface; specification, design; coding standards; design standards.

Specification analysis (IEEE)

Evaluation of each safety-critical software requirement with respect to a list of qualities such as completeness, correctness, consistency, testability, robustness, integrity, reliability, usability, flexibility, maintainability, portability, interoperability, accuracy, auditability, performance, internal instrumentation, security, and training.

Specification-based test.

A test whose inputs are derived from a specification.

Specification tree (IEEE)

A diagram that depicts all of the specifications for a given system and shows their relationship to one another.

Specification, design (NIST)

A specification that documents how a system is to be built. It typically includes system or component structure, algorithms, control logic, data structures, data set [file] use information, input/output formats, interface descriptions, etc. Contrast with design standards, requirement. See: software design description.

Specification, formal

A specification expressed in a requirements specification language. Contrast with requirement.

Specification, formal (NIST)

A specification written and approved in accordance with established standards.

Specification, functional. (NIST)

A specification that documents the functional requirements for a system or system component. It describes what the system or component is to do rather than how it is to be built. Often part of a requirements specification. Contrast with requirement.

Specification, interface. (NIST)

A specification that documents the interface requirements for a system or system component. Often part of a requirements specification. Contrast with requirement.

Specification, performance. (IEEE)

A document that sets forth the performance characteristics that a system or component must possess. These characteristics typically include speed, accuracy, and memory usage. Often part of a requirements specification. Contrast with requirement.

Specification, product (IEEE)

A document which describes the as built version of the software.

Specification, programming (NIST)

See: specification, design.

Specification, requirements (NIST)

A specification that documents the requirements of a system or system component. It typically includes functional requirements, performance requirements, interface requirements, design requirements [attributes and constraints], development [coding] standards, etc. Contrast with requirement.

Specification, system

See: requirements specification.

Specification, test case

See: test case.

Spiral model (IEEE)

A model of the software development process in which the constituent activities, typically requirements analysis, preliminary and detailed design, coding, integration, and testing, are performed iteratively until the software is complete. Syn: evolutionary model. Contrast with incremental development; rapid prototyping; waterfall model.

Standard operating procedures (SOP)

Written procedures [prescribing and describing the steps to be taken in normal and defined conditions] which are necessary to assure control of production and processes.

State (IEEE)

A condition or mode of existence that a system, component, or simulation may be in; e.g., the pre-flight state of an aircraft navigation program or the input state of a given channel.

State diagram (UML)

A diagram that depicts the states that a system or component can assume, and shows the events or circumstances that cause or result from a change from one state to another. Syn: state graph. See: state-transition table. (IEEE)
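
The content of a state diagram can equally be captured in code as a state-transition table; a minimal sketch in Python, with hypothetical state and event names:

```python
# State-transition table for a hypothetical order-processing component:
# (current state, event) -> next state.
TRANSITIONS = {
    ("new", "submit"): "pending",
    ("pending", "approve"): "shipped",
    ("pending", "reject"): "cancelled",
}

def next_state(state, event):
    """Return the state reached when `event` occurs in `state`."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"no transition for {event!r} in state {state!r}")
```

Such a table is also the basis of a state-transition test: each (state, event) pair is a candidate test case.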

Statement coverage

See: testing, statement.

Static Analysis

Analysis of the complexity of source code. It has been demonstrated that highly complex modules or functions are more error-prone and should be isolated for more rigorous inspection, verification, and validation.

Static analysis (IEEE)

The process of evaluating a system or component based on its form, structure, content, or documentation. Contrast with dynamic analysis. See: code audit, code inspection, code review, code walk-through, design review, symbolic execution.

Static analysis (NBS)

Analysis of a program that is performed without executing the program.

Static analyzer (ANSI/IEEE)

A software tool that aids in the evaluation of a computer program without executing the program. Examples include checkers, compilers, cross-reference generators, standards enforcers, and flow charters.
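
A toy static analyzer can be sketched with Python's standard `ast` module; this hypothetical checker flags overly long functions by parsing the source, never executing it:

```python
import ast

def long_functions(source, max_statements=5):
    """Return names of functions whose bodies exceed max_statements.

    The source is parsed, not executed -- the hallmark of static analysis.
    """
    tree = ast.parse(source)
    flagged = []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and len(node.body) > max_statements:
            flagged.append(node.name)
    return flagged
```

Real static analyzers apply many such checks (style, data flow, complexity) over the same parsed representation.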

Structure chart (IEEE)

A diagram that identifies modules, activities, or other entities in a system or computer program and shows how larger or more general entities break down into smaller, more specific entities. Note: The result is not necessarily the same as that shown in a call graph. Syn: hierarchy chart, program structure chart. Contrast with call graph.

Structured design (IEEE)

Any disciplined approach to software design that adheres to specified rules based on principles such as modularity, top-down design, and stepwise refinement of data, system structure, and processing steps. See: data structure centered design, input-processing-output, modular decomposition, object oriented design, rapid prototyping, stepwise refinement, structured programming, transaction analysis, transform analysis, graphical software specification/design documents, modular software, software engineering.

Structured programming (IEEE)

Any software development technique that includes structured design and results in the development of structured programs. See: structured design.

Structured query language

A language used to interrogate and process data in a relational database. Originally developed for IBM mainframes, there have been many implementations created for mini and micro computer database applications. SQL commands can be used to interactively work with a database or can be embedded within a programming language to interface with a database.
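
The embedded use can be sketched with Python's built-in `sqlite3` module; the table and column names here are illustrative:

```python
import sqlite3

# SQL embedded in a host language: create, populate, and query a table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE defects (id INTEGER PRIMARY KEY, severity TEXT)")
conn.executemany(
    "INSERT INTO defects (severity) VALUES (?)",
    [("high",), ("low",), ("high",)],
)
rows = conn.execute(
    "SELECT severity, COUNT(*) FROM defects GROUP BY severity ORDER BY severity"
).fetchall()
conn.close()
```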

Stub (NBS)

Special code segments that when invoked by a code segment under test will simulate the behavior of designed and specified modules not yet constructed.
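
For example, a stub might stand in for a tax-lookup module that is specified but not yet constructed; all names in this sketch are hypothetical:

```python
def tax_rate_stub(region):
    """Stub for a tax-lookup module not yet constructed.

    Returns a canned value per the module's specification, so that
    callers under test can be exercised before the real module exists.
    """
    return 0.07  # fixed, specified response

def total_price(net, region, tax_lookup=tax_rate_stub):
    """Code segment under test; depends on the (stubbed) tax module."""
    return round(net * (1 + tax_lookup(region)), 2)
```

When the real module is constructed, the stub is replaced and the same tests rerun.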

Subprogram (IEEE)

A separately compilable, executable component of a computer program. Note: This term is defined differently in various programming languages. See: coroutine, main program, routine, subroutine.

Subroutine (IEEE)

A routine that returns control to the program or subprogram that called it. Note: This term is defined differently in various programming languages. See: module.

Subroutine trace (IEEE)

A record of all or selected subroutines or function calls performed during the execution of a computer program and, optionally, the values of parameters passed to and returned by each subroutine or function. Syn: call trace. See: execution trace, retrospective trace, symbolic trace, variable trace.
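
A minimal subroutine trace can be sketched in Python with a decorator that records each call, its parameters, and its return value (names are illustrative):

```python
import functools

trace_log = []  # chronological record of calls, arguments, and results

def traced(func):
    """Record each call to func along with its arguments and return value."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        trace_log.append((func.__name__, args, result))
        return result
    return wrapper

@traced
def square(n):
    return n * n

square(3)
square(4)
```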

Supplementary specification.

This document captures any requirements that cannot be tied directly to any specific use case, and especially many of the nonfunctional requirements and design constraints. (Rational Inc.)

Support software (IEEE)

Software that aids in the development and maintenance of other software; e.g., compilers, loaders, and other utilities.

Symbolic execution (IEEE)

A static analysis technique in which program execution is simulated using symbols, such as variable names, rather than actual values for input data, and program outputs are expressed as logical or mathematical expressions involving these symbols.
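
A toy illustration in Python: running a program with a symbolic object instead of an actual value yields an expression over the symbol. The `Sym` class is purely illustrative; real symbolic executors also track path conditions at each branch.

```python
class Sym:
    """A symbolic value: operations build expressions instead of computing."""
    def __init__(self, expr):
        self.expr = expr
    def __add__(self, other):
        return Sym(f"({self.expr} + {other})")
    def __mul__(self, other):
        return Sym(f"({self.expr} * {other})")

def program(x):
    """The program under analysis: y = x * 3 + 1."""
    return x * 3 + 1

# Executing with a symbol rather than an actual value yields an expression.
result = program(Sym("x"))
```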

Symbolic trace (IEEE)

A record of the source statements and branch outcomes that are encountered when a computer program is executed using symbolic, rather than actual values for input data. See: execution trace, retrospective trace, subroutine trace, variable trace.


Syntax

The structural or grammatical rules that define how symbols in a language are to be combined to form words, phrases, expressions, and other allowable constructs.

System (ANSI)

People, machines, and methods organized to accomplish a set of specific functions.

System (DOD)

A composite, at any level of complexity, of personnel, procedures, materials, tools, equipment, facilities, and software. The elements of this composite entity are used together in the intended operational or support environment to perform a given task or achieve a specific purpose, support, or mission requirement.

System administrator

The person who is charged with the overall administration and operation of a computer system. The system administrator is normally an employee or a member of the establishment. Syn: system manager.

System analysis (ISO)

A systematic investigation of a real or planned system to determine the functions of the system and how they relate to each other and to any other system. See: requirements phase.

System design (ISO)

A process of defining the hardware and software architecture, components, modules, interfaces, and data for a system to satisfy specified requirements. See: design phase, architectural design, functional design.

System design review (IEEE)

A review conducted to evaluate the manner in which the requirements for a system have been allocated to configuration items, the system engineering process that produced the allocation, the engineering planning for the next phase of the effort, manufacturing considerations, and the planning for production engineering. See: design review.

System documentation (ISO)

The collection of documents that describe the requirements, capabilities, limitations, design, operation, and maintenance of an information processing system. See: specification, test documentation, user's guide.

System integration (ISO)

The progressive linking and testing of system components into a complete system. See: incremental integration.

System life cycle

The course of developmental changes through which a system passes from its conception to the termination of its use; e.g., the phases and activities associated with the analysis, acquisition, design, development, test, integration, operation, maintenance, and modification of a system. See: software life cycle.

System safety (DOD)

The application of engineering and management principles, criteria, and techniques to optimize all aspects of safety within the constraints of operational effectiveness, time, and cost throughout all phases of the system life cycle. See: risk assessment, software safety change analysis, software safety code analysis, software safety design analysis, software safety requirements analysis, software safety test analysis, software engineering.

System software (IEEE)

Software designed to facilitate the operation and maintenance of a computer system and its associated programs; e.g., operating systems, assemblers, utilities. Contrast with application software. See: support software.

System software (ISO)

Application-independent software that supports the running of application software.



Test (IEEE)

An activity in which a system or component is executed under specified conditions, the results are observed or recorded and an evaluation is made of some aspect of the system or component.

Test case (IEEE)

Documentation specifying inputs, predicted results, and a set of execution conditions for a test item. Syn: test case specification. See: test procedure.
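
A test case can be recorded as structured data; in this Python sketch the inputs, predicted result, and execution conditions are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """A test case: inputs, predicted result, and execution conditions."""
    inputs: tuple
    expected: object
    conditions: str = "default configuration"

# Hypothetical test case for a rounding routine: Python's built-in round()
# uses banker's rounding, so round(2.5) is predicted to be 2.
case = TestCase(inputs=(2.5,), expected=2, conditions="banker's rounding")
```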

Test case generator (IEEE)

A software tool that accepts as input source code, test criteria, specifications, or data structure definitions; uses these inputs to generate test input data; and, sometimes, determines expected results. Syn: test data generator, test generator.

Test configuration

A defined and versioned set of drivers, stubs, and a build of the system under test.

Test Coverage

The degree to which a given test or set of tests addresses all specified test cases for a given system or component.

Test criteria

The criteria that a system or component must meet in order to pass a given test [IEEE 610]. Decision rules used to determine whether a software item or software feature passes or fails a test.

Test data

Data, such as input values, used in executing a test of a system or component. See: test case; special test data.

Test design (IEEE)

Documentation specifying the details of the test approach for a software feature or combination of software features and identifying the associated tests. See: testing, functional; cause effect graphing; boundary value analysis; equivalence class partitioning; error guessing; testing, structural; branch analysis; path analysis; statement coverage; condition coverage; decision coverage; multiple-condition coverage.

Test design specification

The process of producing a suite of test cases using a test strategy. Test design is concerned with three problems: identification of interesting test points, placing these test points into a test sequence, and defining the expected result for each test point in the sequence. [R. V. Binder, 1999]

Test documentation (IEEE)

Documentation describing plans for, or results of, the testing of a system or component. Types include test case specification, test incident report, test log, test plan, test procedure, test report.

Test driver (IEEE)

A software module used to invoke a module under test and, often, provide test inputs, control and monitor execution, and report test results. Syn: test harness.
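
A minimal test driver might look like this Python sketch, where the module under test and the test-case records are illustrative:

```python
def absolute(x):
    """Module under test (illustrative)."""
    return x if x >= 0 else -x

# Each test case record specifies an input and a predicted result.
TEST_CASES = [
    {"input": 5, "expected": 5},
    {"input": -3, "expected": 3},
    {"input": 0, "expected": 0},
]

def run_driver(func, cases):
    """Invoke the module under test for each case and report pass/fail."""
    results = []
    for case in cases:
        actual = func(case["input"])
        results.append(actual == case["expected"])
    return results
```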

Test effectiveness

The relative ability of a testing strategy to find bugs.

Test efficiency

The relative cost of finding a bug.

Test harness

A system of test drivers and other tools to support test execution (e.g., stubs, executable test cases, and test drivers).

See: test driver.

Test incident report (IEEE)

A document reporting on any event that occurs during testing that requires further investigation. See: failure analysis.

Test item. (IEEE)

A software item which is the object of testing.

Test log

Contains the results of an execution of a test suite.

Test log (IEEE)

A chronological record of all relevant details about the execution of a test.

Test oracle

A mechanism to produce the predicted outcomes to compare with the actual outcomes of the software under test. [BS7925-1]
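
One common oracle is a trusted reference implementation whose predicted outcomes are compared with the actual outcomes; a sketch, in which both routines are illustrative:

```python
def sum_fast(values):
    """Implementation under test (illustrative)."""
    return sum(values)

def sum_oracle(values):
    """Oracle: a slower but obviously-correct reference implementation."""
    total = 0
    for v in values:
        total += v
    return total

def check(values):
    """Compare the actual outcome with the oracle's predicted outcome."""
    return sum_fast(values) == sum_oracle(values)
```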

Test phase (IEEE)

The period of time in the software life cycle in which the components of a software product are evaluated and integrated, and the software product is evaluated to determine whether or not requirements have been satisfied.

Test plan (IEEE)

Documentation specifying the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, responsibilities, required resources, and any risks requiring contingency planning. See: test design, validation protocol.

Test procedure

A document providing detailed instructions for the [manual] execution of one or more test cases. [BS7925-1] Often called a manual test script.

Test procedure (NIST)

A formal document developed from a test plan that presents detailed instructions for the setup, operation, and evaluation of the results for each defined test. See: test case.

Test process

The set of tools, methods, and practices used to test a software product.

Test readiness review (IEEE)

(1) A review conducted to evaluate preliminary test results for one or more configuration items; to verify that the test procedures for each configuration item are complete, comply with test plans and descriptions, and satisfy test requirements; and to verify that a project is prepared to proceed to formal testing of the configuration items. (2) A review as in (1) for any hardware or software component. Contrast with code review, design review, formal qualification review, requirements review.

Test report (IEEE)

A document describing the conduct and results of the testing carried out for a system or system component.

Test result analyzer

A software tool used to test output data reduction, formatting, and printing.

Test script.

A program, written in a procedural script language (usually interpreted), that executes one or more test suites.

Test script

Documentation specifying a sequence of actions and expected results to accomplish a system task.

Test sequence

Two or more test messages which follow one another.

Test specification technique.

A standardized way to extract test cases from output information. [Tim Koomen, 1999]

Test stub

A dummy software component or object used (during development and testing) to simulate the behaviour of a real component. The stub typically provides test output.

Test strategy

An algorithm or heuristic to create test cases from a representation, an implementation, or a test model.

Test status.

The assessment of the result of running tests on software.

Test Suites

A test suite consists of multiple test cases (procedures and data) that are combined and often managed by a test harness.
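
With Python's standard `unittest` module, for example, a suite combines individual cases and a runner executes them; the cases shown are trivial placeholders:

```python
import unittest

class ArithmeticTests(unittest.TestCase):
    def test_addition(self):
        self.assertEqual(2 + 2, 4)

    def test_negation(self):
        self.assertEqual(-(-5), 5)

# Combine individual test cases into a suite and run it with a harness.
suite = unittest.TestSuite()
suite.addTest(ArithmeticTests("test_addition"))
suite.addTest(ArithmeticTests("test_negation"))
result = unittest.TextTestRunner(verbosity=0).run(suite)
```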


Testability (ISO 8402)

Attributes of software that bear on the effort needed for validating the modified software.

Testability

The ease and speed with which the functionality and the performance level of the system (after adjustment) can be tested (TPI definition). [Tim Koomen, 1999]

Test technique.

A test technique is a collection of actions to produce a test product in a universal manner. [Tim Koomen, 1999]

Testability (IEEE)

The degree to which a system or component facilitates the establishment of test criteria and the performance of tests to determine whether those criteria have been met.


Testing (IEEE)

The process of analyzing a software item to detect the differences between existing and required conditions, i.e., bugs, and to evaluate the features of the software items. See: dynamic analysis, static analysis, software engineering.


Testing

The execution of tests with the intent of proving that the system and application under test does or does not perform according to the requirements specification.

Testing (IEEE)

The process of operating a system or component under specified conditions, observing or recording the results, and making an evaluation of some aspect of the system or component.

Testing, acceptance (IEEE)

Testing conducted to determine whether or not a system satisfies its acceptance criteria and to enable the customer to determine whether or not to accept the system. Contrast with testing, development; testing, operational. See: testing, qualification, and user acceptance testing.

Testing, beta [B]

Acceptance testing performed by the customer in a live application of the software, at one or more end user sites, in an environment not controlled by the developer. For medical device software such use may require an Investigational Device Exemption [IDE] or Institutional Review Board [IRB] approval.

Testing, boundary value

A testing technique using input values at, just below, and just above, the defined limits of an input domain; and with input values causing outputs to be at, just below, and just above, the defined limits of an output domain. See: boundary value analysis; testing, stress.
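
For a hypothetical validator that accepts the input domain [1, 100], the technique selects values at and adjacent to each limit; a sketch:

```python
def in_range(n, low=1, high=100):
    """Hypothetical function under test: accept only values in [low, high]."""
    return low <= n <= high

# Values at, just below, and just above each defined limit of the input domain.
boundary_inputs = [0, 1, 2, 99, 100, 101]
expected = [False, True, True, True, True, False]

results = [in_range(n) for n in boundary_inputs]
```

Off-by-one defects (e.g., writing `<` where `<=` was specified) are exactly what these boundary values catch.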

Testing, branch (NBS)

Testing technique to satisfy coverage criteria which require that for each decision point, each possible branch [outcome] be executed at least once. Contrast with testing, path; testing, statement. See: branch coverage.
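
A sketch for a function with a single decision point: one test input per possible branch outcome satisfies the criterion (the function is illustrative):

```python
def classify(n):
    """Function under test (illustrative): one decision point, two branches."""
    if n < 0:
        return "negative"      # branch [outcome] 1
    return "non-negative"      # branch [outcome] 2

# One test input per possible branch outcome gives full branch coverage.
branch_tests = {-1: "negative", 0: "non-negative"}
covered = all(classify(n) == want for n, want in branch_tests.items())
```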

Testing, compatibility

The process of determining the ability of two or more systems to exchange information. In a situation where the developed software replaces an already working program, an investigation should be conducted to assess possible compatibility problems between the new software and other programs or systems. See: different software system analysis; testing, integration; testing, interface.

Testing, component

See: testing, unit.

Testing, design based functional (NBS)

The application of test data derived through functional analysis extended to include design functions as well as requirement functions. See: testing, functional.

Time sharing (IEEE)

A mode of operation that permits two or more users to execute computer programs concurrently on the same computer system by interleaving the execution of their programs. May be implemented by time slicing, priority-based interrupts, or other scheduling methods.

Timing (IEEE)

The process of estimating or measuring the amount of execution time required for a software system or component. Contrast with sizing.

Timing analyzer (IEEE)

A software tool that estimates or measures the execution time of a computer program or portion of a computer program, either by summing the execution times of the instructions along specified paths or by inserting probes at specified points in the program and measuring the execution time between probes.

Timing and sizing analysis (IEEE)

Analysis of the safety implications of safety-critical requirements that relate to execution time, clock time, and memory allocation.

Total failure

In a distributed system, the inability of all sites to perform any normal function.


Trace

To establish a relationship between two or more products of the development process; e.g., to establish the relationship between a given requirement and the design element that implements that requirement.

Trace (IEEE)

(1) A record of the execution of a computer program, showing the sequence of instructions executed, the names and values of variables, or both. Types include execution trace, retrospective trace, subroutine trace, symbolic trace, variable trace. (2) To produce a record as in (1).


Traceability

The degree to which each element in a software development product establishes its reason for existing; e.g., the degree to which each element in a bubble chart references the requirement that it satisfies. See: traceability analysis, traceability matrix.

Traceability (IEEE)

The degree to which a relationship can be established between two or more products of the development process, especially products having a predecessor-successor or master-subordinate relationship to one another; e.g., the degree to which the requirements and design of a given software component match. See: consistency.

Traceability analysis

The tracing of software design descriptions to software requirements specifications and software requirements specifications to software design descriptions.

Traceability analysis

The tracing of source code to corresponding design specifications and design specifications to source code. Analyze identified relationships for correctness, consistency, completeness, and accuracy. See: traceability, traceability matrix.

Traceability analysis (IEEE)

The tracing of Software Requirements Specifications requirements to system requirements in concept documentation.

Traceability matrix (IEEE)

A matrix that records the relationship between two or more products; e.g., a matrix that records the relationship between the requirements and the design of a given software component. See: traceability, traceability analysis.
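
A traceability matrix can be kept as a simple mapping; this sketch relates hypothetical requirement IDs to the design elements that implement them and flags untraced requirements:

```python
# Requirement ID -> design elements that implement it (all IDs illustrative).
matrix = {
    "REQ-1": ["DES-A"],
    "REQ-2": ["DES-A", "DES-B"],
    "REQ-3": [],
}

def untraced(matrix):
    """Requirements with no corresponding design element -- gaps to resolve."""
    return [req for req, elems in matrix.items() if not elems]
```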


Transaction

An exchange between an end user and an interactive system.


Transaction

In a database management system, a unit of processing activity that accomplishes a specific purpose such as a retrieval, an update, a modification, or a deletion of one or more data elements of a storage structure.

Transaction (ANSI)

A command, message, or input record that explicitly or implicitly calls for a processing action, such as updating a file.

Transaction analysis

A structured software design technique, deriving the structure of a system from analyzing the transactions that the system is required to process.

Transaction flowgraph (Beizer)

A model of the structure of the system's [program's] behavior, i.e., functionality.

Transaction matrix (IEEE)

A matrix that identifies possible requests for database access and relates each request to information categories or elements in the database.

Transform analysis

A structured software design technique in which system structure is derived from analyzing the flow of data through the system and the transformations that must be performed on the data.

Trojan horse

A method of attacking a computer system, typically by providing a useful program which contains code intended to compromise a computer system by secretly providing for unauthorized access, the unauthorized collection of privileged system or user data, the unauthorized reading or altering of files, the performance of unintended and unexpected functions, or the malicious destruction of software and hardware. See: bomb, virus, worm.

Truth table (ISO)

An operation table for a logic operation.
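
A truth table can be generated mechanically by enumerating all operand combinations; a Python sketch for logical implication:

```python
from itertools import product

def implies(p, q):
    """Logical implication: p -> q is false only when p is true and q false."""
    return (not p) or q

# Enumerate every combination of operand values: the truth table.
table = [(p, q, implies(p, q)) for p, q in product([False, True], repeat=2)]
```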

Tuning (NIST)

Determining what parts of a program are being executed the most. A tool that instruments a program to obtain execution frequencies of statements is a tool with this feature.




UAT

See: user acceptance testing.


Unambiguous

(1) Not having two or more possible meanings. (2) Not susceptible to different interpretations. (3) Not obscure; not vague. (4) Clear, definite, certain.


Unit

A logically separable part of a computer program. Syn: component, module.

Unit (IEEE)

A separately testable element specified in the design of a computer software element.

Unit testing

The most 'micro' scale of testing; tests particular functions or code modules. Typically done by the programmer and not by testers, as it requires detailed knowledge of the internal program design and code. Not always easily done unless the application has a well-designed architecture with tight code; may require developing test driver modules or test harnesses.

Unit Testing

Testing performed to isolate and expose faults and failures as soon as the source code is available, regardless of the external interfaces that may be required. Oftentimes, the detailed design and requirements documents are used as a basis to compare how and what the unit is able to perform. White and black-box testing methods are combined during unit testing.

Usability (IEEE)

The ease with which a user can operate, prepare inputs for, and interpret outputs of a system or component.

Usability testing

Testing for 'user-friendliness'. Clearly this is subjective, and will depend on the targeted end-user or customer. User interviews, surveys, video recording of user sessions, and other techniques can be used. Programmers and testers are usually not appropriate as usability testers.

User (ANSI)

Any person, organization, or functional unit that uses the services of an information processing system. See: end user.

Use-case diagram [UML]

A diagram that shows the relationships among actors and use cases within a system.

Use-case specification.

Use cases serve as a format to express functional requirements in sequence. A use case is a sequence of actions a system performs that yields an observable result (a work output) of value to a particular actor. Use cases are especially good at documenting functional software requirements. (Rational Inc.)

User acceptance testing

Determining if software is satisfactory to an end-user or customer.

User acceptance testing

A phase of testing designed, performed and evaluated by end-users of an application to determine its fitness for use in the operational environment. User acceptance testing (UAT) validates that the original business or operational objectives of the application have been realized, regardless of the specifications or requirements. Syn: UAT, user testing. See also testing, acceptance.

User's guide (ISO)

Documentation that describes how to use a functional unit, and that may include description of the rights and responsibilities of the user, the owner, and the supplier of the unit. Syn: user manual, operator manual.


[QA Dictionary Back to Top]


V&V

Verification and validation.

V-model of software development

The V-model of software development shows when testing activities should take place: each development activity has a corresponding test activity.


Valid (Webster)

Well grounded on principles of evidence; able to withstand criticism or objection.

Valid input (NBS)

Test data that lie within the domain of the function represented by the program.
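As an illustration (the square-root example is ours, not part of the NBS definition): if a program computes a real square root, the domain of the function it represents is the non-negative numbers, so valid input test data must be drawn from that domain.

```python
import math

# Illustrative sketch: for a program computing a real square root,
# the function's domain is x >= 0, so valid test inputs lie there.

def is_valid_input(x):
    """True if x lies within the domain of the square-root function."""
    return x >= 0

valid_cases = [0, 1, 2.25]     # inside the domain  -> valid input
invalid_cases = [-1, -0.5]     # outside the domain -> invalid input

assert all(is_valid_input(x) for x in valid_cases)
assert not any(is_valid_input(x) for x in invalid_cases)
assert math.sqrt(2.25) == 1.5  # the function is defined on valid input
```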


Validate

To prove to be valid.


Validation

(1) Checking that you have built the right system. (2) The comparison between the actual characteristics of something (e.g., a product of a software project) and the expected characteristics. (3) The process of evaluating the software under test against the functional requirements (preferably defined and reviewed by the customer).

Validation (FDA)

Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality attributes. Contrast with data validation.

Validation protocol (FDA)

A written plan stating how validation will be conducted, including test parameters, product characteristics, production equipment, and decision points on what constitutes acceptable test results. See: test plan.

Validation, process (FDA)

Establishing documented evidence which provides a high degree of assurance that a specific process will consistently produce a product meeting its predetermined specifications and quality characteristics.

Validation, prospective (FDA)

Validation conducted prior to the distribution of either a new product, or product made under a revised manufacturing process, where the revisions may affect the product's characteristics.

Validation, retrospective

Retrospective validation can also be useful to augment initial premarket prospective validation for new products or changed processes. Test data is useful only if the methods and results are adequately specific. Whenever test data are used to demonstrate conformance to specifications, it is important that the test methodology be qualified to assure that the test results are objective and accurate.

Validation, retrospective (FDA)

Validation of a process for a product already in distribution, based upon accumulated production, testing, and control data.

Validation, software (NBS)

Determination of the correctness of the final program or software produced from a development project with respect to the user needs and requirements. Validation is usually accomplished by verifying each stage of the software development life cycle. See: verification, software.

Validation, verification and testing (NIST)

Used as an entity to define a procedure of review, analysis, and testing throughout the software life cycle to discover errors, determine functionality, and ensure the production of quality software.


Variable

A name, label, quantity, or data item whose value may be changed many times during processing. Contrast with constant.

Variable trace (IEEE)

A record of the name and values of variables accessed or changed during the execution of a computer program. Syn: data-flow trace, data trace, value trace. See: execution trace, retrospective trace, subroutine trace, symbolic trace.
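A variable trace can be sketched with Python's standard `sys.settrace` hook (the traced function and the variable name `total` are illustrative choices, not part of the IEEE definition): the value of a chosen variable is recorded at each line executed.

```python
import sys

# Sketch of a variable trace: record (line number, value) for the
# variable 'total' every time a line of the traced code executes.

trace_log = []

def tracer(frame, event, arg):
    if event == "line" and "total" in frame.f_locals:
        trace_log.append((frame.f_lineno, frame.f_locals["total"]))
    return tracer  # keep tracing subsequent lines in this frame

def accumulate(values):
    total = 0
    for v in values:
        total += v
    return total

sys.settrace(tracer)
result = accumulate([1, 2, 3])
sys.settrace(None)

# trace_log now records how 'total' changed during execution.
```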


Vendor

A person or an organization that provides software and/or hardware and/or firmware and/or documentation to the user for a fee or in exchange for services. Such a firm could be a medical device manufacturer.


Verifiable

Can be proved or confirmed by examination or investigation. See: measurable.


Verification

(1) Checking that we have built the system right. (2) The comparison between the actual characteristics of something (e.g., a product of a software project) and the specified characteristics. (3) The process of determining whether the products (outputs) of a previous development phase are acceptable as inputs to the next phase. For example, product requirements must be complete (to an extent) prior to detailed design and implementation (code development).

Verification, software (NBS)

In general, the demonstration of the consistency, completeness, and correctness of the software at each stage and between each stage of the development life cycle. See: validation, software.


Verify

To check the results of data entry; e.g., keypunching.

Verify (ANSI)

To determine whether a transcription of data or other operation has been accomplished accurately.
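As a small illustration (the record values are invented), verifying a transcription can be mechanized by keying the same data twice and comparing the two passes:

```python
# Hypothetical sketch of key verification: the same records are
# entered twice and the passes are compared; any mismatch flags a
# possible transcription error for re-checking.

first_pass = ["10023", "10024", "10031"]
second_pass = ["10023", "10042", "10031"]

mismatches = [i for i, (a, b) in enumerate(zip(first_pass, second_pass))
              if a != b]

assert mismatches == [1]  # record 1 was transcribed inconsistently
```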

Verify (Webster)

To prove to be true by demonstration.


Version (IEEE)

An initial release or a complete re-release of a software item or software element. See: release.

Version number

A unique identifier used to identify software items and the related software documentation which are subject to configuration control.

Virus

The execution of a virus program compromises a computer system by performing unwanted or unintended functions, which may be destructive. See: bomb, trojan horse, worm.

Volume (ANSI)

A portion of data, together with its data carrier, that can be handled conveniently as a unit; e.g., a reel of magnetic tape, a disk pack, a floppy disk.


VV&T

Validation, verification, and testing.


[QA Dictionary Back to Top]


Walkthrough

See: code walkthrough.

In its most usual form, a walkthrough is a step-by-step simulation of the execution of a procedure, as when walking through code line by line with an imagined set of inputs. The term has been extended to the review of material that is not procedural, such as data descriptions, reference manuals, specifications, etc.

Waterfall model (IEEE)

A model of the software development process in which the constituent activities, typically a concept phase, requirements phase, design phase, implementation phase, test phase, installation and checkout phase, and operation and maintenance phase, are performed in that order, possibly with overlap but with little or no iteration. Contrast with incremental development; rapid prototyping; spiral model.

Work breakdown structure

(WBS) A product-oriented listing, in family tree order, of the hardware, software, services and other work tasks which completely defines a product or program. The listing results from project engineering during the development and production of a materiel item. A WBS relates the elements of work to be accomplished to each other and to the end product.


Workaround

A sequence of actions the user should take to avoid a problem or system limitation until the computer program is changed. Workarounds may include manual procedures used in conjunction with the computer system.


Worm

An independent program which can travel from computer to computer across network connections, replicating itself in each computer. Worms do not change other programs, but compromise a computer system through their impact on system performance. See: bomb, trojan horse, virus.

This page contains a simple QA dictionary, used mainly for interviewing software testers. Find more about Quality Assurance - QA definitions.
END QA dictionary.

Extreme Software Testing Main Page
© January 2006 Alex Samurin