Tuesday, April 14, 2009

An approach for Security Testing of Web Applications

Introduction

As more and more vital data is stored in web applications and the number of transactions on the web increases, proper security testing of web applications is becoming very important. Security testing is the process of determining that confidential data stays confidential (i.e. it is not exposed to individuals or entities for whom it is not meant) and that users can perform only the tasks they are authorized to perform (e.g. a user should not be able to deny the functionality of the web site to other users, or change the functionality of the web application in an unintended way).

Some key terms used in security testing

Before we go further, it will be useful to be aware of a few terms that are frequently used in web application security testing:

What is "Vulnerability"?
This is a weakness in the web application. The cause of such a weakness can be a bug in the application, code that allows injection (of SQL or script), or the presence of viruses.

What is "URL manipulation"?
Some web applications communicate additional information between the client (browser) and the server in the URL. Changing some information in the URL may sometimes lead to unintended behavior by the server.

What is "SQL injection"?
This is the process of inserting SQL statements through the web application user interface into some query that is then executed by the server.

What is "XSS (Cross Site Scripting)"?
When a user inserts HTML/ client-side script in the user interface of a web application and this insertion is visible to other users, it is called XSS.

What is "Spoofing"?
The creation of hoax look-alike websites or emails is called spoofing.
Security testing approach:

In order to perform a useful security test of a web application, the security tester should have good knowledge of the HTTP protocol. It is important to have an understanding of how the client (browser) and the server communicate using HTTP. Additionally, the tester should at least know the basics of SQL injection and XSS. Hopefully, the number of security defects present in the web application will not be high. However, being able to accurately describe the security defects with all the required details to all concerned will definitely help.

1. Password cracking:

The security testing of a web application can be kicked off with "password cracking". In order to log in to the private areas of the application, one can either guess a username/password or use a password cracker tool. Lists of common usernames and passwords are available along with open source password crackers. If the web application does not enforce a complex password (e.g. one with letters, numbers and special characters, and at least a required number of characters), it may not take very long to crack the username and password.
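
For illustration, here is a minimal sketch of such a check in Python, to be run only against a test environment. The login URL, form field names, and the redirect-on-success assumption are all hypothetical and would need to be adapted to the application under test; the word list would normally come from a published common-password list.

import requests   # third-party HTTP library

LOGIN_URL = "http://testserver.example/login"   # hypothetical test URL
COMMON_PASSWORDS = ["password", "123456", "admin", "letmein", "qwerty"]

def try_login(username, password):
    # Assumption: a successful login redirects (HTTP 302) instead of
    # re-displaying the form; adapt this check to the real application.
    response = requests.post(LOGIN_URL,
                             data={"username": username, "password": password},
                             allow_redirects=False, timeout=10)
    return response.status_code == 302

for candidate in COMMON_PASSWORDS:
    if try_login("admin", candidate):
        print("Weak credential accepted:", candidate)
        break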

If the username or password is stored in cookies without encryption, an attacker can use various methods to steal the cookies and then read the information stored in them, such as the username and password.

For more details see article on "Website cookie testing".
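
As a quick illustration of the cookie checks described above, the sketch below (Python; the test URL and credentials are hypothetical) logs in to a test application and reports whether credentials appear in the cookies and whether the cookies carry the Secure and HttpOnly flags.

import requests   # third-party HTTP library

session = requests.Session()
session.post("http://testserver.example/login",   # hypothetical test URL
             data={"username": "testuser", "password": "testpass"}, timeout=10)
for cookie in session.cookies:
    value = (cookie.value or "").lower()
    if "testuser" in value or "testpass" in value:
        print("Credentials stored in cookie:", cookie.name)
    if not cookie.secure:
        print("Cookie not marked Secure (also sent over plain HTTP):", cookie.name)
    if not cookie.has_nonstandard_attr("HttpOnly"):
        print("Cookie readable by client-side script:", cookie.name)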

2. URL manipulation through HTTP GET methods:

The tester should check if the application passes important information in the querystring. This happens when the application uses the HTTP GET method to pass information between the client and the server. The information is passed in parameters in the querystring. The tester can modify a parameter value in the querystring to check if the server accepts it.

In an HTTP GET request, user information is passed to the server for authentication or for fetching data. An attacker can manipulate every input variable passed in this GET request in order to obtain the required information or to corrupt the data. Under such conditions, any unusual behavior by the application or web server is a doorway for the attacker into the application.
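
A minimal sketch of this kind of probing is shown below (Python; the URL and the orderid parameter are hypothetical). The tester substitutes unexpected values for a querystring parameter and watches for unusual responses such as another user's data, database errors, or server errors.

import requests   # third-party HTTP library

BASE_URL = "http://testserver.example/order"   # hypothetical test URL
for order_id in ["1001", "1002", "-1", "9999999", "abc"]:
    response = requests.get(BASE_URL, params={"orderid": order_id}, timeout=10)
    # A 500 status, a database error message, or another user's data in the
    # body would all point to a URL manipulation problem.
    print(order_id, response.status_code, len(response.text))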

3. SQL Injection:

The next thing that should be checked is SQL injection. Entering a single quote (') in any textbox should be rejected by the application. Instead, if the tester encounters a database error, it means that the user input is inserted in some query which is then executed by the application. In such a case, the application is vulnerable to SQL injection.

SQL injection attacks are very critical because an attacker can obtain vital information from the server database. To check the SQL injection entry points into your web application, find the places in your code base where direct MySQL queries are executed against the database using user input.

If user input is placed into the SQL queries that are run against the database, an attacker can inject SQL statements, or fragments of them, as input to extract vital information from the database. Even if the attacker only succeeds in crashing the application, the SQL query error shown in the browser may reveal the information they are looking for. Special characters in user input should be handled/escaped properly in such cases.
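
The difference between concatenating input into a query and passing it as a bound parameter can be seen in this small sketch, which uses Python's built-in sqlite3 module and an illustrative table; the same idea applies to MySQL or any other database.

import sqlite3

connection = sqlite3.connect(":memory:")
connection.execute("CREATE TABLE users (name TEXT, secret TEXT)")
connection.execute("INSERT INTO users VALUES ('John', 's3cret')")
user_input = "x' OR 'a'='a"   # what an attacker might type into a textbox
# Vulnerable: the input becomes part of the SQL text itself.
vulnerable_sql = "SELECT secret FROM users WHERE name = '" + user_input + "'"
print(connection.execute(vulnerable_sql).fetchall())   # leaks John's secret
# Safe: the input is bound as a parameter and never parsed as SQL.
safe_sql = "SELECT secret FROM users WHERE name = ?"
print(connection.execute(safe_sql, (user_input,)).fetchall())   # returns []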

4. Cross Site Scripting (XSS):

The tester should additionally check the web application for XSS (Cross site scripting). The application should not accept HTML (e.g. <HTML>) or script (e.g. <SCRIPT>) tags; if it does, the application can be prone to an attack by Cross Site Scripting.

An attacker can use this method to execute a malicious script or URL in the victim's browser. Using cross-site scripting, an attacker can run scripts such as JavaScript to steal user cookies and the information stored in them.

Many web applications collect some user information and pass it between pages in variables.

E.g.: http://www.examplesite.com/index.php?userid=123&query=xyz

An attacker can easily pass malicious input or a <script> tag as the '&query' parameter, which can expose important user/server data in the browser.
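
A simple reflection check can be sketched as follows (Python; the page behavior is assumed, and the probe string is only a harmless marker). If the probe comes back in the response unescaped, the page is likely vulnerable to reflected XSS.

import requests   # third-party HTTP library

PROBE = "<script>alert('xss-test')</script>"
response = requests.get("http://www.examplesite.com/index.php",
                        params={"userid": "123", "query": PROBE}, timeout=10)
if PROBE in response.text:
    print("Probe reflected unescaped: possible XSS")
elif "&lt;script&gt;" in response.text:
    print("Probe reflected but HTML-encoded: output appears escaped")
else:
    print("Probe not reflected in the response")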

Important: During security testing, the tester should be very careful not to modify any of the following:

  •  Configuration of the application or the server
  •  Services running on the server
  •  Existing user or customer data hosted by the application

Additionally, a security test should be avoided on a production system.

The purpose of the security test is to discover the vulnerabilities of the web application so that the developers can then remove these vulnerabilities from the application and make the web application and data safe from unauthorized actions.

SQL Injection – How to Test Web Applications against SQL Injection Attacks

Security testing of web applications against SQL Injection, explained with simple examples - By Inder P Singh.

Many applications use some type of a database. An application under test might have a user interface that accepts user input that is used to perform the following tasks:

1.    Show the relevant stored data to the user e.g. the application checks the credentials of the user using the log in information entered by the user and exposes only the relevant functionality and data to the user

2.    Save the data entered by the user to the database e.g. once the user fills up a form and submits it, the application proceeds to save the data to the database; this data is then made available to the user in the same session as well as in subsequent sessions

Some of the user inputs might be used in framing SQL statements that are then executed by the application on the database. It is possible for an application NOT to handle the inputs given by the user properly. If this is the case, a malicious user could provide unexpected inputs to the application that are then used to frame and execute SQL statements on the database. This is called SQL injection. The consequences of such an action could be alarming.

The following things might result from SQL injection:

1. The user could log in to the application as another user, even as an administrator.

2. The user could view private information belonging to other users e.g. details of other users' profiles, their transaction details etc.

3. The user could change application configuration information and the data of the other users.

4. The user could modify the structure of the database; even delete tables in the application database.

5. The user could take control of the database server and execute commands on it at will.

Since the consequences of allowing the SQL injection technique could be severe, it follows that SQL injection should be tested during the security testing of an application. Now with an overview of the SQL injection technique, let us understand a few practical examples of SQL injection.

Important: The SQL injection problem should be tested only in the test environment.

If the application has a log in page, it is possible that the application uses dynamic SQL such as the statement below. This statement is expected to return at least one row with the user's details from the Users table as the result set when a row exists with the user name and password entered in the SQL statement.

"SELECT * FROM Users WHERE User_Name = '" & strUserName & "' AND Password = '" & strPassword & "';"

If the tester would enter John as the strUserName (in the textbox for user name) and Smith as strPassword (in the textbox for password), the above SQL statement would become:

SELECT * FROM Users WHERE User_Name = 'John' AND Password = 'Smith';

If the tester would enter John'-- as strUserName and anything as strPassword, the SQL statement would become:

SELECT * FROM Users WHERE User_Name = 'John'-- AND Password = 'Smith';

Note that the part of the SQL statement after John is turned into a comment. If there were any user with the user name of John in the Users table, the application could allow the tester to log in as the user John. The tester could now view the private information of the user John.

What if the tester does not know the name of any existing user of the application? In such a case, the tester could try common user names like admin, administrator and sysadmin. If none of these users exist in the database, the tester could enter John' or 'x'='x as strUserName and Smith' or 'x'='x  as strPassword. This would cause the SQL statement to become like the one below.

SELECT * FROM Users WHERE User_Name = 'John' or 'x'='x' AND Password = 'Smith' or 'x'='x';

Since the condition 'x'='x' is always true, the result set would consist of all the rows in the Users table. The application could allow the tester to log in as the first user in the Users table.
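
The effect can be reproduced with a few lines of Python and an in-memory SQLite database (the table and rows below are illustrative only): the concatenated query matches every row, exactly as described above.

import sqlite3

connection = sqlite3.connect(":memory:")
connection.execute("CREATE TABLE Users (User_Name TEXT, Password TEXT)")
connection.executemany("INSERT INTO Users VALUES (?, ?)",
                       [("alice", "pass1"), ("bob", "pass2")])
strUserName = "John' or 'x'='x"
strPassword = "Smith' or 'x'='x"
query = ("SELECT * FROM Users WHERE User_Name = '" + strUserName +
         "' AND Password = '" + strPassword + "'")
print(connection.execute(query).fetchall())   # both rows are returned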

Important: The tester should request the database administrator or the developer to copy the table in question before attempting the following SQL injection.

If the tester would enter John'; DROP TABLE users_details; -- as strUserName and anything as strPassword, the SQL statement would become like the one below.

SELECT * FROM Users WHERE User_Name = 'John'; DROP TABLE users_details; --' AND Password = 'Smith';

This statement could cause the table "users_details" to be permanently deleted from the database.

Though the above examples deal with using the SQL injection technique only on the log in page, the tester should test this technique on all pages of the application that accept user input in textual format, e.g. search pages, feedback pages etc.

SQL injection might be possible in applications that use SSL. Even a firewall might not be able to protect the application against the SQL injection technique.

I have tried to explain the SQL injection technique in a simple form. I would like to re-iterate that SQL injection should be tested only in a test environment and not in the development environment, production environment or any other environment. Instead of manually testing whether the application is vulnerable to SQL injection or not, one could use a web vulnerability scanner that checks for SQL injection.

Top 25 common programming bugs every tester should know

Just a quick note to share a useful resource with you. I came across a good article, "25 common programming errors", for software programmers and software testers. It is mainly aimed at programmers, but I think software testers can also get insight into how developers can unknowingly leave bugs in software programs.

Each bug listed in this resource can lead to a serious software vulnerability if not fixed. The top 25 security bugs list will help programmers avoid some common but serious coding mistakes. For software testers, the list is useful as a security testing checklist for web as well as desktop applications.

Here are a few of the top security vulnerabilities discussed in detail in this article:

  •  Improper input validation
  •  Improper escaping of output or encoding
  •  SQL injection
  •  Cross-site scripting
  •  Race conditions
  •  Information leak in error messages
  •  Error while transmitting sensitive information
  •  Memory leak
  •  External control of critical data and file paths
  •  Improper initialization
  •  Improper authorization
  •  Client side security checks

I think the most common security vulnerability mistake developers make is "Client-side enforcement of server-side security".
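
To illustrate that last point, here is a minimal sketch (the form field and limits are hypothetical): even if JavaScript in the browser restricts the quantity field, an attacker can submit any value directly, so the server has to enforce the same rule again.

def handle_order(form):
    # Server-side handler; 'form' is a dict of submitted fields.
    try:
        quantity = int(form.get("quantity", ""))
    except ValueError:
        return "error: quantity must be a number"
    if not 1 <= quantity <= 100:   # re-check the rule the UI also enforces
        return "error: quantity out of range"
    return "order accepted for %d item(s)" % quantity

print(handle_order({"quantity": "5"}))      # accepted
print(handle_order({"quantity": "-999"}))   # rejected on the server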

Monday, April 13, 2009

A Roadmap for Testing


 

Purpose & Scope:

 

This document provides a comprehensive, high-level Roadmap summary of testing technical guidelines, checklist items, and brief documentation templates. The testing technical data included herein is meant to be a menu of items to select from based on the unique specifics of a given… testing trip (project). It is not intended to be applicable in totality to any given project. However, it is recommended that all included items should at least be considered/analyzed for applicability based on the specifics of a given project. There should be some sound, technically defensible, logic or reason for not including an item. It should be noted that project risk16&17 may increase in some proportion to the items not selected…you may get lost somewhere along the way of your testing trip.

 

This Roadmap is meant to be useful for all sorts of testing trips…from a simple, eXtreme9,26&27-ly short walk-around-the-block-testing-trip to a Biblical-40-year-testing-trek-through-the-desert constrained by stone tablet regulations….

 

Examples:

 

1.       For eXtreme9,26&27 testing walk-around-the-block-trips, extreme developments, and small simple projects / systems…the recommended, cost effective, and risk-appropriate16&17 testing activities may be narrow in scope and described by a single paragraph or page. As an eXtreme minimum example, Dr. James Whittaker4 recommends little or no formal test documentation at all, but instead recommends developing a Fault model4 of your application under test.  Then … Brain on…eyes open…TEST!4…using various Attacks4 (listed below). 

 

2.       For Biblical-40-year-testing-treks-through-the-desert constrained by stone-tablet-regulations…large complex projects…essentially the opposite end of the spectrum of eXtreme testing trips…one can take a Biblical and Regulatory approach to test activities and documentation: 

 

        Biblical - "Test everything." (1 Thessalonians 5:21 NIV) - A worthy spiritual goal, but in general not possible in this lifetime…but one can focus on progress, not perfection.

        Regulatory - "If it isn't documented, it didn't happen." (Food & Drug Administration) - All test activities are completely documented. Each of the technical testing activities may be separate or multiple documents/volumes (e.g., a Test Master Plan with subordinate multiple testing plans, designs, cases, procedures, and reports).

 

Experience Note: As effort intensive as it may seem, this Biblical and Regulatory approach may be somewhat appropriate for high risk systems such as weapons/combat systems, biomedical systems, nuclear reactor control systems, and spacecraft control systems.

 

Generally however, most testing trips are of some middle length/size, complexity, and risk16&17 requiring a somewhat less than  Biblical and Regulatory approach. That is, the scope of activities and extent of formal documentation should be proportional to the project / product  size, complexity, and  risks16&17.

 

An attempt has been made to include a broad scope of summarized, current, testing technology.  More detail is provided in the included references.  That is, this document is applicable to projects / products of all sizes including initial/new development testing as well as the testing of subsequent computer system software revisions / releases. For a completely new release, this document may be analyzed for applicability to all life cycle phases of development; or, in part for minor releases.

 

Experience Notes:

1.       You never know what is enough [testing] unless you know what is more than enough.

2.       To err is human. To really foul things up requires a computer.

3.       Great engineering [testing] is simple engineering. ‑ James Martin

4.       Myth #1: The computer only does what you tell it.

5.       There's always one more bug.

6.       The Definition of an Upgrade: Take old bugs out, put new ones in.

7.       Computers make very fast, very accurate mistakes.

8.       “It did what? Well, it's not supposed to do that.”

9.       “My software never has bugs.  It just develops random features.”

10.    If debugging is the process of removing bugs, then programming must be the process of putting them in.

11.    The sooner a bug is found and fixed, the cheaper.12

 

 

The "V" Testing Concept9&15

 

 

 

 

 

 

 

 

 

 

In a lifecycle approach to testing, testing should occur throughout the project lifecycle; from testing requirements through conducting user acceptance testing. E.g., looking at the figure above, User Acceptance Test activities should begin concurrently with Requirements Definition activities.  The arrows in general show the time relationship of testing and development activities. The earlier an error is detected the less expensive it is to correct. -Barry Boehm, 1976

 

 

Testing (Validation) Principles:

 

1.       The primary role of software testing is two-fold: determine if the system meets specifications (developer view); determine whether the system meets operational business and user needs (sponsor/customer/user view).9 In addition, there are several secondary roles for testing in an organization that include: raising issues; instilling confidence in the system; providing insight into the software delivery process; continuously improving the test process.9

2.       Testing is a detection activity.  It can only demonstrate the presence of errors.  Testing cannot prove the absence of errors.  Thus, testing is the process of analyzing a software item to detect the differences between required outputs (or conditions) and actual outputs (or conditions). You can't test quality into software - Testing will not improve bad coding techniques or poor coding structure.9

3.       Present test results as observations…not judgements.  This approach goes a long way in developing a congenial constructive communication with the developers. 

4.       As a general testing principle, all requirements should be validated by one of the following methods: testing, inspection, demonstration, and analysis.  This document focuses primarily on the testing method of validation.   Some requirements may not be able to be cost-effectively validated by testing.  In such cases, requirements should be validated by some other method than testing.

5.       Project interim deliverables/documentation may be verified to provide in-progress quality assurance of successful system/software validation.  See examples below.

6.       Test planning begins as soon as requirements are established, and must be modified whenever the requirements are modified.

7.       The testing effort is planned under the assumption that errors will be found.

8.       The testing process is documented (plans, design, cases, procedures, and reports) and approved.

9.       Every test case/procedure/result should be repeatable.

10.    All hardware or software changes should be tested prior to releasing the system to the production environment.  This includes regression testing of all software components that could be impacted by the change(s).

11.    Testing standards should be established to address: Documentation,  Deliverables, Tools & techniques, Quality requirements

12.    Software developers are not the only ones to test the software.  Independent testers/users (independent of the engineer responsible for development) may be used to conduct system, acceptance, and other types of testing.

13.    Testers never assume that the system/software is correct.  The software testing effort provides that evidence.

14.    Start early. Even though you might not have all of the details at hand, you can complete a great deal of the planning effort by starting on the general and working toward the specific. By starting early, you can also identify resource needs and plan for them before other areas of the project subsume them.9

15.    Keep the test plan flexible. Make it easy to add test cases, test data, and so on. The test plan itself should be changeable, but subject to change control.9 

 

Testing (Validation) Principles. (continued)

 

16.    Frequently review the test plan. Other people's observations and input greatly facilitate achieving a comprehensive test plan. The test plan should be subject to quality control just like any other project deliverable.9

17.    Keep the test plan concise and readable. The test plan does not need to be large and complicated. In fact, the more concise and readable it is, the more useful it will be. Remember that the test plan is intended to be a communication document. The details should be kept in a separate reference document.9  

18.    Calculate the planning effort. You can count on roughly one-third of the testing effort being spent on each of the following test activities: planning, execution, and evaluation.9 

19.    Spend the time to do a complete test plan. The better the test plan, the easier it will be to execute the tests.9

20.    Always write down what you do and what happens when you do exploratory testing.12

"The primary objectives9 of testing during the requirements phase are to:

  • Determine that the requirements fairly represent what is needed
  • Determine that the needs have been defined and documented
  • Verify that a reasonable and valid cost/benefit study has been performed
  • Determine that the business problem has been solved
  • Verify that the control requirements have been specified
  • Verify that a reasonable process was followed in developing the business solution
  • Verify that a reasonable alternative was selected among the most probable alternative solutions"

21.    eXtreme Testing9 - The process of testing software in a dynamic environment whereby the program is executing while testing is underway, in order to identify defects and improve quality and productivity. The methodology combines the best practices of traditional software testing methods and processes with the ability to apply these time-tested methods in a rapid development paradigm. It focuses on collaboration between developer, customer, and tester; testing is accomplished in shorter timeframes with an emphasis on developing the software correctly the first time, thereby eliminating the need for lengthy test cycles and redundant tests. This is a highly customizable approach to testing software and is adaptable to any software development environment. The best success is achieved in organizations that already have mature development and testing processes and have attained a level of capability beyond simplistic means.

 

 

Levels / Categories of Testing:

 

For a specific project, an analysis should be accomplished to determine all appropriate/applicable/cost-effective levels of tests.  Those selected should be documented along with rationale/justification.  The following is a partial list of important levels of validation/testing that, as a minimum, should be considered for applicability to a specific project or product:

 

1.       Functional testing (Generally required for all products. The purpose of functional testing is to reveal defects related to the product’s/component’s functionality and conformance to documented functional requirement specifications.)

2.       Unit testing - usually accomplished by developers; Computer Software Unit1&10 testing

3.       Structural testing9

4.       Exploratory testing (Always write down what you do and what happens when you run exploratory tests.12)

5.       Component/Sub-component testing Computer Software Component1&10 testing.

6.       Walkthroughs, inspections, desk-checking9

7.         Verification (e.g., reviews, examinations, walkthroughs, desk-checking, or inspection of interim work products such as requirements, design, specifications, documentation, prototypes, code; early detection of errors are highly cost effective.) 1, 2, 9, &13

8.       Developmental integration testing 9&13

9.       Developmental system testing 13

10.    User acceptance testing (Generally required for all products. The purpose of acceptance testing is convincing the user that the product fulfills expected user needs.)13

11.    Performance/load/stress testing 9&12

12.    Security/access testing 9&19

13.    Usability testing16, 22,23&24

14.    Operational procedure documentation verification11

15.    Regression testing (Reliability)9&12

 

Levels / Categories of Testing: (continued)

 

16.    Alpha & Beta Testing12&13

17.    Smoke Test - …establish that the system is stable and all major functionality is present and works under ‘normal’ conditions.13

18.    Pilot testing10

19.    Recovery testing - …can involve the manual functions of an application, loss of input capability, loss of communication lines, hardware or operating system failure, loss of database integrity, operator error, or application system failure.9

20.    Operations testing / Usability Testing (Ease of Operations)9

21.    Compliance testing - …verifies that the application was developed in accordance with information technology standards, procedures, and guidelines.9

22.    Manual-Support Testing - Systems commence when transactions originate and conclude with the use of the results of processing. The manual part of the system requires the same attention to testing as does the automated segment.9

23.    Intersystem [interface] Testing9

24.    Parallel Testing (e.g., matching test results between current live system and new system.)

25.    Compliance Testing (Authorization) - Testing should verify that the authorization rules have been properly implemented and evaluate compliance with them. Test conditions should include unauthorized transactions or processes to ensure that they are rejected, as well as ensuring authorized transactions are accepted.9

 

 

Web Site Specific Testing 9&14

 

Testing specific to Web sites should address:

 

Visual appeal, usability, readability, accessibility

Site navigation and hyperlinks

Accurate, reliable and consistent information.

Correct data capture

Transactions completion

Gateway software (“Common Gateway Interface”)

HTML validation / W3C compliance

Alternative Graphics Text

Shopping cart, price, tax, and shipping calculations

Server Load testing

Database / file / table validation, integrity, authenticity of facts provided

Query Response time

Multiple browser compatibility

User/page response time

Recovery after failure/error; fail-over; reliability; availability

Network security

Payment transaction security

Printer-friendly pages

Style guides/sheets

Client versus server validation & functional partitioning

Search engine visibility / meta-tags

Site map

Pop-ups

Audio/video streaming content

Database middleman software

Cookies

Screen size and pixel resolution

Scalability/capacity growth

Lists

Searchable documents

Forms

Frames

Image weight (load time)

Information architecture - chunking

Usability22,23,24&25

Data & access security

Site user impact

Backup / recovery

 

 

 

Test Data and Data Verification:

 

It is important to consider the verification, validity, accuracy, and completeness of any test data.  Queries or utilities may be constructed to independently verify the contents of a test data set.  Data sets and data verification queries, utilities, etc., should be controlled as configuration items and form part of the system baseline.
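
As an illustration of such a verification utility, the sketch below (Python with sqlite3; the database file, table names, and baseline counts are hypothetical) checks row counts against an expected baseline and looks for orphaned foreign-key references in a test data set.

import sqlite3

connection = sqlite3.connect("test_data.db")   # hypothetical test database
expected_counts = {"customers": 250, "orders": 1200}   # illustrative baseline
for table, expected in expected_counts.items():
    actual = connection.execute("SELECT COUNT(*) FROM " + table).fetchone()[0]
    print(table, "OK" if actual == expected else "MISMATCH", actual)
orphans = connection.execute(
    "SELECT COUNT(*) FROM orders o WHERE NOT EXISTS "
    "(SELECT 1 FROM customers c WHERE c.id = o.customer_id)").fetchone()[0]
print("orphan orders:", orphans)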

Data sets intended for testing use in general may include:

1.       “snapshots” of production data

2.       data constructed of specific data entries designed to match particular test procedures; specific expected results

3.       messages

4.       transactions

5.       records

6.       files

7.       scripts

8.       automated tool generated data

9.       ad hoc key strokes

10.    DBMS views

11.    Stored procedures

12.    Queries

 

 

Test (Validation21) Design, Methods, & Techniques:

 

Evaluating a system…to determine if it satisfies specified requirements.21  For a specific project, an analysis should be accomplished to determine all appropriate/applicable test design, methods, and techniques.  Those selected should be documented along with rationale/justification.  The following is a partial list of various test design, methods, and techniques that, as a minimum, should be considered for applicability to a specific project or product:

 

requirements based functional testing5,9,11,12,&13

domain analysis testing5

data flow testing5

control flow testing5

orthogonal array testing5

waterfall testing5

boundary value testing5&12

risk based testing16&17

error guessing12

test automation15

string testing10

eXtreme testing9,26&27 

test attacks4 (see summaries below)

greybox testing4&5

white box testing5&21 / structural testing 5&21/  glass box testing12

black box testing5&21 /  behavioral testing5&21

transaction testing5

state transition testing5

all pairs / pairwise testing5

decision table testing5

equivalence class testing5&12

translucent-box testing12

operation testing9

recovery testing9

compliance (to process) testing9

execution  testing (desired level of proficiency)9

path testing12

error handling / messages testing9

business rules decision tables5

Inter-systems/interface testing (data passing)1&9

manual support (people-computer interaction) testing9

fault model4

 

 

 

Verification21 Types & Techniques

 

Verification methods are used to ensure that interim life cycle items (e.g., software, hardware, documentation):

1.       comply with standards, procedures, guidelines, and processes

2.       satisfy the conditions imposed at the start of that [life cycle] phase

3.       are consistent with items from a previous life cycle phase. 

 

Experience Note: Review early and review often.

 

Management Reviews20

Technical reviews20

Inspections20&28

Walkthroughs20

Audits20

Concept of Operations Review/Walk-through

Requirements Review/Walk-through

Design Review/Walk-through

Code Inspection/ Walk-through

Test Readiness Review/ Walk-through

Requirements tracing9 (e.g., throughout the life cycle; requirements versus design; requirements versus tests.)

Phase-end reviews9

Physical Audit (correct versions of hardware & software components)

(Experience Note: Before you release the program, you MUST check that the release disk [media] contain the most recent version of every file.12)

In-Progress Review9

Post Implementation Reviews9

Peer reviews9

 

 

 

Test Attack4 Summary  - User Interface Attacks4

 

1. Apply inputs that force all error messages.

2. Apply inputs that force software to establish default values.

3. Explore allowable character sets and data types.

4. Overflow input buffers.

5. Find inputs that may interact and test combinations of their values.

6. Repeat the same inputs numerous times.

7. Force different outputs to be generated for each input.

8. Force invalid outputs to be generated.

9. Force properties of an output to change.

10. Force the screen to refresh.

11. Apply inputs using a variety of initial conditions.

12. Force a data structure to store too many or too few values.

13. Investigate alternate ways to modify internal data constraints.

14. Experiment with invalid operand and operator combinations.

15. Force a function to call itself recursively.

16. Force computation results to be too large or too small.

17. Find features that share data or interact poorly.

 

 

 

 

Test Attack4 Summary  - System Interface Attacks4

 

1. Fill the system to capacity.

2. Force the media to be busy or unavailable.

3. Damage the media.

4. Assign an invalid file name.

5. Vary access permissions.

6. Vary or corrupt file contents.

 

 

 

Testing (Validation) Documentation Content 

Test Plan, Test Design & Cases, Test Report/Results:

 

The recommended basic components described in this section should be included in all documents, unless there is sufficient and acceptable justification for any of them, in whole or in part, to be considered not applicable.  These basic components may have different names, groupings, or sequences in different projects, and it may be necessary to determine if there is an equivalent or suitable substitute.  Additional information may be added as required.

 

A practical note from some experts: A test plan [documentation] is a valuable tool to the extent that it helps you manage your testing project and find bugs.  Beyond that it is a diversion of resources.12

 

Experience Note: Lots of documents do not make good documentation.

 

 

Test Plan Content:

 

1.       Experience Notes:

         To fail to plan is to plan to fail. 

        A good plan today is better than a perfect plan tomorrow.-Wag the Dog, movie dialog

         No plan survives the departure point; keep the plan current.

2.       Introduction: testing policies; testing standards & conventions; purpose of project tests; scope; project testing objectives; system description; references; definitions/terms; assumptions; constraints; testing documentation approach.   

3.       Project Testing Communication Approach: Communication mechanisms e.g., formal and informal meetings; working sessions; processes (such as defect tracking); tools (issue and defect tracking, electronic bulletin boards, notes databases, Intranet sites); bulletin boards; email; status reports; in-progress review meetings.

4.       Testability: Is necessary and sufficient documentation available to plan testing; is the documentation available/provided written with sufficient clarity, accuracy, and detail to allow the system/product/software to be unambiguously tested.

5.       Identification of Configuration Item(s) Under Test: e.g., Hardware (e.g., Specialty/Test Equipment, Communication Server, Terminal, Applications Server, Computer, etc. - nomenclature, version, date); Software (e.g., modules, units, systems, sub-systems, CSCI, CSCs, CSUs, data structures  - name, version/revision, and date); User Operational Procedures (document name, version/revision, and date)

 

6.       Risks & Contingency Planning*: Identify both project-oriented testing activity risks as well an application-oriented risks.  Both types of risks may potentially affect the success of the testing. (*if not already included in project management planning documentation.)

7.       Functions/Features/Design to be Tested

8.       Functions/Features/Design Not to be Tested

9.       Traceability: all requirements/features should be traceable to at least one test procedure; conversely, all test procedures should be traceable to at least one requirement/feature; this traceability is best documented in a traceability matrix.

10.    Test Technical Approach: project test strategy (e.g., risk reduction); testing levels; test design methods/techniques; test tools; test metrics; test automation strategy.

11.    Test Data: creation, verification, and use.

12.    Test Results Recording, Data Collection, Log, & Reports: e.g., instructions for performing tests, documenting results, reporting and resolving deviations, documentation/log of errors discovered, error corrections, re-testing, sign-off/approval provisions.

13.    Test Execution: e.g., important factors that affect test execution; test cycles; test resources

14.    General Pass/Fail Criteria

15.    Acceptance Criteria

16.    Suspension / Resumption Requirements/Criteria: e.g., corrective action taken when problems or errors are encountered may include correct documentation, re-run test, re-code, re-design, regression testing

 

Test Plan Content: (continued)

 

17.    Pre-Test Needs/Test Environment: e.g., hardware/test equipment & documentation; test support software & documentation; test data files; test tools & documentation (Test driver software may need to be developed or procured to test some applications; test design documentation should address this situation.); description of the test data design/validation; installation instructions; multiple environment migrations (e.g.,  development->  test-> live).

18.    Organizational Roles and responsibilities: e.g., groups and individuals executing test tasks, groups and individuals providing test support functions. (*If not already included in project management planning documentation.)

19.    Miscellaneous: Test Deliverables*(e.g., plans, design, cases, procedures, reports, logs); Tasks*; Schedule*; Milestones*; Staffing Needs*; Training Needs*; Pre-requisites Tasks* (e.g., completion of interfacing systems) (*if not already included in project management planning documentation.)

 

 

Test Design & Cases Content:

 

1.       Introduction: e.g., same as test plan or note exceptions or differences.

2.       Test Approach Refinements: e.g., Coverage (operational scenarios, use cases, requirements, design, data domain, source code, executable code, logic paths)

3.       Test Automation: e.g., automated test regime/environment, techniques, selected test tools, applicability of test automation to specific project testing activities, division between manual and automated tests.

4.       Test Case Selection, Rationale, Justification, Traceability: e.g., with respect to features or specifications

5.       Test Case Identification (for each test case)

6.        Specific Pass/Fail Criteria for Each Feature/Function/Scenario

 

 

Test Procedures Content:

 

1.       Introduction: e.g., same as test plan or note exceptions or differences.

2.       Test Preparations: e.g., test case, procedure, scripts, and test data names/identification; pre-test/set-up procedures; hardware preparation; software preparation; other Pre-Test Preparations; system/software initialization; test inputs/outputs; criteria for evaluating results; assumptions, constraints, dependencies

3.       Test Procedure(s) Description: e.g., detailed steps, expected results, actual results/outcome/log; deviations from test procedures; variances, and problems encountered ; test assessment

4.       Errors/Enhancements: e.g., bug / incident  tracking / documentation / prioritization (errors are inevitable); enhancement / change  tracking / documentation/prioritization (changes are inevitable).

 

 

Test Report: 

 

1.       Experience note: Record observations of system behavior.  Avoid making judgements about products or development staff producing those products.

2.       Introduction: e.g., same as test plan or note exceptions or differences.

3.       Expected results versus actual results: Define any problems; identify the work and input sequences leading to a suspected problem; identify the test case/procedure producing the suspected problem; identify the system elements involved; identify the flow of work and/or data input leading to the condition; identify the tester(s); assure the problem is repeatable.

4.       Detailed, specific verification artifacts/evidence/results should be captured that can be compared against equally detailed expected results, rather than a simple pass/fail format.  As an example, these artifacts may be generated reports, screen prints, and test database contents. All artifacts should be unambiguously labeled with the test case/procedure number, step number, date, and tester identification. Actual results, deviations from test procedures, variances, and problems encountered should likewise be recorded.

5.       Errors/Enhancements: e.g., “bug”/incident reporting; enhancement/change reporting.               

6.       Summary of Test Activities and Actual Results.

7.       Comprehensive Test Evaluation, Notes, and Recommendations

 

 

Signatures and Approvals.

 

As required and appropriate for the specific organization and/or project.

 

 

References:

 

1.       IEEE  Standard for Software Test Documentation (ANSI), IEEE Std 829-1998

2.       IEEE  Standard for Software Verification and Validation Plans (ANSI), IEEE Std 1012-1986. New York, NY: IEEE, September 17, 1992 (update reference when changed)

3.       A Test Plan Guide and Readiness Checklist To Facilitate the Execution of Capacity Tests, Jose Fajardo, StickyMinds.com Template, Jun 10, 2003

4.       How to Break Software, James A. Whittaker, Addison Wesley, 2003

5.       Mastering Test Design, Lee Copeland, (draft manuscript May, 2003)

6.       Test Plan Review Checklist, Ross Collard, Copyright Collard & Company 2002

7.       SQE Test Plan Template, Software Quality Engineering, Version 7.0, Copyright 2001

8.       Test Planning Case Study –2, Ross Collard, Copyright Collard & Company 2002

9.       Boot Camp for Software Testers, On-Line Training  Course, St Petersburg College, Quality Assurance Institute (QAI), Copyright 2002

10.     Managing the Testing Process, Rex Black, Second Edition , Wiley Publishing, Inc. 2002

11.     Software Testing in the Real World: Improving the Process, Edward Kit, Addison-Wesley, 1995

12.    Testing Computer Software, Kaner, Falk, Nguyen, Second Edition, Wiley, 1999

13.    Systematic Software Testing, Craig & Jaskiel, Artech House, 2002

14.    The Web Testing Handbook, Splaine & Jaskiel, STQE Publishing, 2001

15.    Software Test Automation: Effective Use of test Execution tools, Fewster & Graham Addison-Wesley, 1999

16.    How to Conduct Heuristic Risk Analysis, James Bach,  STQE Magazine, Nov/Dec 1999 (Vol. 1, Issue 6) 

17.    Approach to Implementing Risk Based Testing, Sreeram Kishore Chavali , StickyMinds.com Article, Mar 29, 2001

18.    A Test Plan Guide and Readiness Checklist To Facilitate the Execution of Capacity Tests, Jose Fajardo, StickyMinds.com Template, Jun 10, 2003

19.    How to Break Software, Security, James A. Whittaker & Herbert H. Thompson, Addison Wesley, 2004

20.    IEEE  Standard for Software Reviews, IEEE Std 1028-1997

21.    IEEE Standard Glossary of Software Engineering Terminology, IEEE Std 610.12-1990

22.    Homepage Usability, 50 Websites Deconstructed, Jakob Nielsen/Marie Tahir, New Riders Publishing, 2001

23.    Introduction to Usability Testing, Cheryl L. Nesta, Vanteon,  SQE STARWEST 2001 Conference

24.    User Interface Guidelines, UI Guide, Bruce Durbin, StickyMinds.com User Submitted Article, Dec 24, 2001

25.    Usability Testing, How an observer can correctly interpret and categorize user behavior, Bob Stahl, STQE Magazine, Sep/Oct 2001 (Vol. 3, Issue 5)

26.    Extreme Testing, Ronald E. Jeffries,  STQE Magazine, Mar/Apr 1999 (Vol. 1 Issue 2)

27.    Testing in the Extreme Programming World, Robert Martin, Object Mentor, Inc.,  SQE STARWEST 2001 Conference

28.    Leveraging Inspections, What to do with the data you collect, Ed Weller, STQE Magazine, July/August 2003