1. BACKGROUND
  2. INTRODUCTION
  3. ASSUMPTIONS
  4. TEST ITEMS
    List each of the items (programs) to be tested.
  5. FEATURES TO BE TESTED
    List each of the features (functions or requirements) which will be tested or demonstrated by the test.
  6. FEATURES NOT TO BE TESTED
    List explicitly each feature, function, or requirement which will not be tested, and why not.
  7. APPROACH
    Describe the data flows and test philosophy.
    Simulation or Live execution, Etc.
  8. ITEM PASS/FAIL CRITERIA
    Blanket statement
    Itemized list of expected output and tolerances
  9. SUSPENSION/RESUMPTION CRITERIA
    Must the test run from start to completion?
    Under what circumstances may it be resumed in the middle?
    Establish check-points in long tests.
  10. TEST DELIVERABLES
    What, besides software, will be delivered?
    Test report
    Test software
  11. TESTING TASKS
    Functional tasks (e.g., equipment setup)
    Administrative tasks
  12. ENVIRONMENTAL NEEDS
    Security clearance
    Office space & equipment
    Hardware/software requirements
  13. RESPONSIBILITIES
    Who performs the tasks in Section 11 (Testing Tasks)?
    What does the user do?
  14. STAFFING & TRAINING
  15. SCHEDULE
  16. RESOURCES
  17. RISKS & CONTINGENCIES
  18. APPROVALS
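The suspension/resumption criteria above call for check-points in long tests. A minimal sketch of that idea, assuming a JSON checkpoint file (the file name and step-list structure are illustrative, not part of any standard):

```python
import json
import os

def run_long_test(steps, checkpoint_file="checkpoint.json"):
    """Run a sequence of test steps, recording a checkpoint after each one
    so a suspended test can resume in the middle rather than restart."""
    start = 0
    if os.path.exists(checkpoint_file):
        # A checkpoint exists: resume from where the suspended run stopped.
        with open(checkpoint_file) as f:
            start = json.load(f)["next_step"]
    results = []
    for i in range(start, len(steps)):
        results.append(steps[i]())  # execute the step
        with open(checkpoint_file, "w") as f:
            json.dump({"next_step": i + 1}, f)  # record the resume point
    os.remove(checkpoint_file)  # test ran to completion; clear the checkpoint
    return results
```

A run that is killed mid-way leaves the checkpoint file behind; invoking the same test again skips the already-completed steps.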

Test Specification Items

Each test specification should contain the following items:

Case No.: The test case number is a three-part identifier of the form c.s.t, where c is the chapter number, s is the section number, and t is the test case number.
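The c.s.t convention can be checked mechanically. A hypothetical helper (the function name and regex are illustrative, not part of any standard):

```python
import re

# Matches the c.s.t case-number form described above:
# chapter, section, and test case number, separated by dots.
CASE_NO = re.compile(r"^\d+\.\d+\.\d+$")

def is_valid_case_no(case_no: str) -> bool:
    """Return True if case_no follows the c.s.t convention."""
    return bool(CASE_NO.match(case_no))
```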

Title: The title of the test.

ProgName: The name of the program containing the test.

Author: The person who wrote the test specification.

Date: The date of the last revision to the test case.

Background (Objectives, Assumptions, References, Success Criteria): Describes in words how to conduct the test.

Expected Error(s): Describes any errors expected during the test.

Reference(s): Lists the reference documentation used to design the specification.

Data (Tx Data, Predicted Rx Data): Describes the data flows between the Implementation Under Test (IUT) and the test engine.

Script (Pseudo Code for Coding Tests): Pseudo code (or real code) used to conduct the test.
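The items above can be collected into a single record, with the script reduced to sending Tx data and comparing the response to the predicted Rx data. A minimal sketch, assuming the IUT is callable and the field names are illustrative choices rather than a standard layout:

```python
from dataclasses import dataclass, field

@dataclass
class TestSpec:
    """One test specification, holding the items listed above."""
    case_no: str        # c.s.t identifier
    title: str
    prog_name: str
    author: str
    date: str
    background: str
    expected_errors: list = field(default_factory=list)
    references: list = field(default_factory=list)
    tx_data: bytes = b""            # data sent to the IUT
    predicted_rx_data: bytes = b""  # data the test engine expects back

def run_spec(spec: TestSpec, iut) -> bool:
    """Script: send the Tx data to the IUT and report whether the
    actual response matches the predicted Rx data."""
    actual = iut(spec.tx_data)
    return actual == spec.predicted_rx_data
```

Here `iut` stands in for whatever delivers data to the Implementation Under Test; in practice it might be a socket write followed by a read rather than a plain function call.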