Thursday, July 9, 2009

ISTQB question pattern and tips to solve:
ISTQB questions are formatted in such a way that the answer options look very similar. Candidates often choose the one they are most familiar with. Read each question carefully, twice or three times or more if needed, until you are clear about what is being asked.

Now look at the options carefully; they are chosen to confuse candidates. To find the correct answer, eliminate options one by one: go through each option and check whether it is appropriate. If you end up selecting more than one option, repeat the same elimination on those that remain. This approach usually narrows the field to a single answer.

Before you start on the question papers, read the study material thoroughly and practice as many papers as possible. This helps a lot because, when we actually solve the papers, we apply the logic we already know.

ISTQB 'Foundation level' sample questions with answers:

1. Designing the test environment set-up and identifying any required infrastructure and tools are part of which phase?

a) Test Implementation and execution
b) Test Analysis and Design
c) Evaluating the Exit Criteria and reporting
d) Test Closure Activities

Evaluating the options:
a) Option a: as the name suggests, these activities belong to the actual implementation and execution cycle, so designing the set-up does not fall here.
b) Option b: analysis and design activities come before implementation. Designing the test environment set-up and identifying the required infrastructure and tools are part of this phase.
c) Option c: these are post-execution activities.
d) Option d: these are closure activities, the last in the process.

So, the answer is 'B'

2. Test Implementation and execution has which of the following major tasks?

i. Developing and prioritizing test cases, creating test data, writing test procedures and optionally preparing the test harnesses and writing automated test scripts.
ii. Creating the test suite from the test cases for efficient test execution.
iii. Verifying that the test environment has been set up correctly.
iv. Determining the exit criteria.

a) i,ii,iii are true and iv is false
b) i,iii,iv are true and ii is false
c) i,ii are true and iii,iv are false
d) ii,iii,iv are true and i is false

Evaluating the options:
Let's follow a different approach in this case. As can be seen from the above options, determining the exit criteria is definitely not a part of test implementation and execution. So choose the options where (iv) is false. This filters out 'b' and 'd'.

We need to select only from 'a' and 'c'. We only need to analyze option (iii) as (i) and (ii) are marked as true in both the cases. Verification of the test environment is part of the implementation activity. Hence option (iii) is true. This leaves the only option as 'a'.

So, the answer is 'A'

3. A Test Plan Outline contains which of the following:-

i. Test Items
ii. Test Scripts
iii. Test Deliverables
iv. Responsibilities

a) i,ii,iii are true and iv is false
b) i,iii,iv are true and ii is false
c) ii,iii are true and i and iv are false
d) i,ii are false and iii , iv are true

Evaluating the options:
Let's use the approach given in question no. 2. Test scripts are not part of the test plan (this must be clear), so choose the options where (ii) is false. That leaves 'b' and 'd'. Now evaluate option (i), as (iii) and (iv) are already given as true in both cases. Test items are the modules or features to be tested, and they are part of the test plan, so (i) is true.

So, the answer is 'B'

4. One of the fields on a form contains a text box which accepts numeric values in the range of 18 to 25. Identify the value that falls in an invalid equivalence class.
a) 17
b) 19
c) 24
d) 21

Evaluating the options:
In this case, first we should identify valid and invalid equivalence classes.

Invalid Class | Valid Class | Invalid Class
Below 18 | 18 to 25 | 26 and above

Option 'a' falls under invalid class. Options 'b', 'c' and 'd' fall under valid class.

So, the answer is 'A'
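The classification above can be checked mechanically. Below is a small illustrative sketch (our own, not from the original post) that classifies a value against the valid range 18 to 25:

```python
def equivalence_class(value, low=18, high=25):
    """Classify a numeric input against the inclusive valid range [low, high]."""
    if value < low:
        return "invalid (below range)"
    if value > high:
        return "invalid (above range)"
    return "valid"

# Option a (17) is the only value outside the valid class.
for option, value in [("a", 17), ("b", 19), ("c", 24), ("d", 21)]:
    print(option, value, equivalence_class(value))
```

The same helper answers question 5 by changing the bounds to 24 and 40.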

5. In an examination, a candidate has to score a minimum of 24 marks in order to clear the exam. The maximum he can score is 40 marks. Identify the valid equivalence values if the student clears the exam.

a) 22,23,26
b) 21,39,40
c) 29,30,31
d) 0,15,22

Evaluating the options:
Let's use the approach given in question 4. Identify valid and invalid equivalence classes.

Invalid Class | Valid Class | Invalid Class
Below 24 | 24 to 40 | 41 and above

The question is to identify valid equivalence values. So all the values must be from 'Valid class' only.

a) Option a: not all the values are from the valid class (22 and 23 are below 24)
b) Option b: not all the values are from the valid class (21 is below 24)
c) Option c: all the values are from the valid class
d) Option d: none of the values is from the valid class

So, the answer is 'C'

6. Which of the following statements regarding static testing is false:

a) static testing requires the running of tests through the code
b) static testing includes desk checking
c) static testing includes techniques such as reviews and inspections
d) static testing can give measurements such as cyclomatic complexity

Evaluating the options:
a) Option a: this statement is false. Static testing does not involve executing (running) the code
b) Option b: correct, static testing does include desk checking
c) Option c: correct, it includes reviews and inspections
d) Option d: correct, it can give measurements such as cyclomatic complexity

So, the answer is 'A'

7. Verification involves which of the following:-
i. Helps to check the Quality of the built product
ii. Helps to check that we have built the right product.
iii. Helps in developing the product
iv. Monitoring tool wastage and obsoleteness.

a) Options i,ii,iii,iv are true.
b) i is true and ii,iii,iv are false
c) i,ii,iii are true and iv is false
d) ii is true and i,iii,iv are false.

Evaluating the options:
a) Option a: The quality of the product can be checked only after building it.
Verification is a cycle before completing the product.
b) Option b: Verification checks that we have built the right product.
c) Option c: it does not help in developing the product
d) Option d: it does not involve monitory activities.

So, the answer is 'B'

8. Component Testing is also known as:
i. Unit Testing
ii. Program Testing
iii. Module Testing
iv. System Component Testing .

a) i,ii,iii are true and iv is false
b) i,ii,iii,iv are false
c) i,ii,iv are true and iii is false
d) all of the above are true

Evaluating the options:
a) Statement i: correct, component testing is also called unit testing
b) Statement ii: options 'a' and 'c' both mark this as true, so we can conclude that component testing is also called program testing
c) Statement iii: correct, component testing is also called module testing
d) Statement iv: wrong. System component testing comes under system testing.

So, the answer is 'A'

9. Link Testing is also known as:
a) Component Integration testing
b) Component System Testing
c) Component Sub System Testing
d) Maintenance testing

Evaluating the options:
As the name suggests, this testing is performed by linking (integrating) modules. Looking at the options, only option 'a' is performed by linking or integrating modules/components.

So, the answer is 'A'

10. (The original post shows a decision-table figure here, with conditions for Indian residency, age between 18 and 55, and marital status, and actions for issuing membership and offering a discount.)

What is the expected result for each of the following test cases?
A. TC1: Anand is a 32-year-old married person, residing in Kolkata.
B. TC3: Attapattu is a 65-year-old married person, residing in Colombo.

a) A – Issue membership, 10% discount; B – Issue membership, offer no discount.
b) A – Don't issue membership; B – Don't offer discount.
c) A – Issue membership, no discount; B – Don't issue membership.
d) A – Issue membership, no discount; B – Issue membership with 10% discount.

Evaluating the options:

Explanation (referring to the decision-table figure in the original post):

For TC1, follow the path marked in green:
The person is an Indian resident, so take the 'True' branch.
The person is aged between 18 and 55, so take 'True' again.
The person is married, so take 'True' once more.
For this person, the actions under Rule 4 apply: issue membership, no discount.

For TC3, follow the path marked in blue:
The person is not an Indian resident, so take 'False' (under Rule 1).
The person is not aged between 18 and 55, but no branch is needed, as this condition is marked "Don't care".
The person is married; again no branch is needed ("Don't care").
For this person, the actions under Rule 1 apply: don't issue membership, no discount.

So, the answer is 'C'
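The decision table itself is only partially recoverable from the explanation above, so the sketch below implements just the two rules it describes. The rule that grants the 10% discount is not visible in the surviving text, and the function name is our own:

```python
def membership_decision(indian_resident, age, married):
    """Partial reconstruction of the decision table: only the two rules
    described in the explanation (Rule 1 and Rule 4) are implemented."""
    if not indian_resident:
        # Rule 1: non-residents get no membership and no discount.
        return ("Don't issue membership", "No discount")
    if 18 <= age <= 55 and married:
        # Rule 4: resident, in the age range, and married.
        return ("Issue membership", "No discount")
    raise NotImplementedError("Rule not recoverable from the original figure")

# TC1: Anand, 32, married, Kolkata (Indian resident)
print(membership_decision(True, 32, True))   # → ('Issue membership', 'No discount')
# TC3: Attapattu, 65, married, Colombo (not an Indian resident)
print(membership_decision(False, 65, True))  # → ("Don't issue membership", 'No discount')
```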

Note: The answers are based on the writer's own experience and judgment and may not be 100% correct. If you feel any correction is required, please discuss in the comments below.

Please feel free to ask any question related to ISTQB exam and testing certifications in comment section below.

Monday, May 4, 2009

Test Case Documents

Designing good test cases is a complex art. The complexity comes from three sources:

§  Test cases help us discover information. Different types of tests are more effective for different classes of information.

§  Test cases can be "good" in a variety of ways. No test case will be good in all of them.

§  People tend to create test cases according to certain testing styles, such as domain testing or risk-based testing. Good domain tests are different from good risk-based tests.

 

What's a test case?

"A test case specifies the pretest state of the IUT and its environment, the test inputs or conditions, and the expected result. The expected result specifies what the IUT should produce from the test inputs. This specification includes messages generated by the IUT, exceptions, returned values, and resultant state of the IUT and its environment. Test cases may also specify initial and resulting conditions for other objects that constitute the IUT and its environment."

 

What's a scenario?

A scenario is a hypothetical story, used to help a person think through a complex problem or system.

 

Characteristics of Good Scenarios

 

A scenario test has five key characteristics. It is (a) a story that is (b) motivating, (c) credible, (d) complex, and (e) easy to evaluate.

The primary objective of test case design is to derive a set of tests that have the highest likelihood of discovering defects in the software. Test cases are designed based on the analysis of requirements, use cases, and technical specifications, and they should be developed in parallel with the software development effort.

A test case describes a set of actions to be performed and the results that are expected. A test case should target specific functionality or aim to exercise a valid path through a use case, including invalid user actions and illegal inputs that are not necessarily listed in the use case. How a test case is described depends on several factors, e.g. the number of test cases, the frequency with which they change, the level of automation employed, the skill of the testers, the selected testing methodology, staff turnover, and risk.

The test cases will have a generic format as below.

 

Test case ID - The test case ID must be unique across the application.

Test case description - The test case description must be very brief.

Test prerequisite - The test prerequisite clearly describes what must be present in the system before the test can be executed.

Test inputs - The test input is nothing but the test data prepared to be fed to the system.

Test steps - The test steps are step-by-step instructions on how to carry out the test.

Expected results - The expected results state what the system must give as output, or how the system must react, based on the test steps.

Actual results - The actual results record the actual output for the given inputs, or how the system actually reacted.

Pass/Fail - If the expected and actual results are the same, the test is Pass; otherwise Fail.

Test cases are classified into positive and negative test cases. Positive test cases are designed to prove that the system accepts valid inputs and processes them correctly; suitable techniques for designing them are specification-derived tests, equivalence partitioning, and state-transition testing. Negative test cases are designed to prove that the system rejects invalid inputs and does not process them; suitable techniques are error guessing, boundary value analysis, internal boundary value testing, and state-transition testing. The test case details must be specified clearly enough that a new person can go through a test case step by step and execute it. Test cases are explained with specific examples in the following section.
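As an illustration of boundary value analysis (our own sketch, not from the original text), the classic boundary candidates for an inclusive numeric range can be generated mechanically:

```python
def boundary_values(low, high):
    """Return the standard boundary-value candidates for an inclusive range:
    just below, on, and just above each boundary."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

# For a field accepting 18..25 (as in the earlier equivalence example):
print(boundary_values(18, 25))  # → [17, 18, 19, 24, 25, 26]
```

Negative test cases would use the values outside the range (17 and 26); positive test cases would use the rest.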

For example, consider an online shopping application. At the user-interface level, the client requests the web server to display product details by giving an email ID and username; the web server processes the request and returns a response. For this application we will design unit, integration, and system test cases.

Figure 6: Web-based application

Unit Test Cases (UTC)

These are very specific to a particular unit. The basic functionality of the unit must be understood from the requirements and design documents. Generally, the design document provides a lot of information about the functionality of a unit; it has to be consulted before the UTC is written, because it states how the unit must behave for given inputs.

For example, in the online shopping application, if the user enters valid Email and Username values, let us assume the design document says that the system must display the product details and insert the Email and Username into a database table. If the user enters invalid values, the system displays an appropriate error message and stores nothing in the database.

Figure 7: Snapshot of Login Screen

Test Conditions for the fields in the Login screen

 

Email - It should be in a valid email format (e.g. clickme@yahoo.com).

Username - It should accept only alphabetic characters, no more than 6 in length. Numerics and special characters are not allowed.

 

Test Prerequisite: The user should have access to the Customer Login screen.
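The two conditions above can be expressed as a small validation routine. This is an illustrative sketch of the rules as stated (a production email check would be stricter, and the function names are our own):

```python
import re

def valid_email(email):
    """Loose check matching the stated rule: name@domain.tld format."""
    return re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email) is not None

def valid_username(username):
    """Only alphabetic characters, at most 6 of them."""
    return username.isalpha() and len(username) <= 6

# Mirrors the negative/positive test cases that follow:
print(valid_email("keerthi@rediffmail"))     # False: no top-level domain
print(valid_email("john26#rediffmail.com"))  # False: '#' instead of '@'
print(valid_username("Mark24"))              # False: contains digits
print(valid_email("shan@yahoo.com") and valid_username("dave"))  # True
```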

Negative Test Case

Project Name-Online shopping

Version-1.1

Module-Catalog

 

Test # | Description | Test Inputs | Expected Results | Actual Results | Pass/Fail
1 | Check for inputting values in Email field | Email=keerthi@rediffmail, Username=Xavier | Inputs should not be accepted; the message "Enter valid Email" should be displayed. |  |
2 | Check for inputting values in Email field | Email=john26#rediffmail.com, Username=John | Inputs should not be accepted; the message "Enter valid Email" should be displayed. |  |
3 | Check for inputting values in Username field | Email=shilpa@yahoo.com, Username=Mark24 | Inputs should not be accepted; the message "Enter correct Username" should be displayed. |  |

Positive Test Case

Test # | Description | Test Inputs | Expected Results | Actual Results | Pass/Fail
1 | Check for inputting values in Email field | Email=shan@yahoo.com, Username=dave | Inputs should be accepted. |  |
2 | Check for inputting values in Email field | Email=knki@rediffmail.com, Username=john | Inputs should be accepted. |  |
3 | Check for inputting values in Username field | Email=xav@yahoo.com, Username=mark | Inputs should be accepted. |  |

 

Integration Test Cases

Before designing the integration test cases, the tester should go through the integration test plan; it gives a complete idea of how to write integration test cases. The main aim of integration test cases is to test multiple modules together. By executing these test cases, the user can find errors in the interfaces between the modules.

For example, in online shopping there are Catalog and Administration modules. In the catalog section the customer can browse the list of products and buy them online. In the administration module the admin can enter the product name and information related to it.

 

 

 

Table 3: Integration Test Cases

Test # | Description | Test Inputs | Expected Results | Actual Results | Pass/Fail
1 | Check for Login screen | Enter values in Email and Username, e.g. Email=shilpa@yahoo.com, Username=shilpa | Inputs should be accepted. |  |
  | Backend verification | select email, username from Cus; | The entered Email and Username should be displayed at the SQL prompt. |  |
2 | Check for Product Information | Click the product information link | It should display complete details of the product. |  |
3 | Check for admin screen | Enter values in Product Id and Product Name, e.g. Product Id=245, Product Name=Norton Antivirus | Inputs should be accepted. |  |
  | Backend verification | select pid, pname from Product; | The entered Product Id and Product Name should be displayed at the SQL prompt. |  |

NOTE: The tester has to execute the above unit and integration test cases after coding, filling in the Actual Results and Pass/Fail columns. If a test case fails, a defect report should be prepared.
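The backend-verification steps in the table can themselves be automated. This sketch uses an in-memory SQLite database as a stand-in for the application's database; the table and column names simply follow the example queries above:

```python
import sqlite3

# In-memory stand-in for the application's database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Cus (email TEXT, username TEXT)")

# Simulate the application storing a successful login's details.
conn.execute("INSERT INTO Cus VALUES (?, ?)", ("shilpa@yahoo.com", "shilpa"))

# Backend verification: the same query as in the integration test case.
row = conn.execute("SELECT email, username FROM Cus").fetchone()
assert row == ("shilpa@yahoo.com", "shilpa")
print("Backend verification passed:", row)
```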

 

System Test Cases: -

The system test cases are meant to test the system as per the requirements, end to end. This is basically to make sure that the application works as per the SRS. In system test cases (and generally in system testing itself), the testers are supposed to act as end users. So system test cases normally concentrate on the functionality of the system: inputs are fed through the system, and each and every check is performed using the system itself. Verifications done by checking database tables directly or by running programs manually are normally not encouraged in the system test.

The system test must focus on functional groups rather than on individual program units. When it comes to system testing, it is assumed that the interfaces between the modules are working fine (integration has passed).

Ideally, the system test cases are a union of the functionalities tested in unit testing and integration testing, except that instead of testing inputs and outputs through the database or external programs, everything is tested through the system itself. For example, in an online shopping application, the catalog and administration screens (program units) would have been independently unit tested and the results verified through the database. In system testing, the tester acts as an end user and checks the application through its output.

There are occasions where some or many of the integration and unit test cases are repeated in system testing, especially when units were earlier tested with test stubs rather than with the other real modules. During system testing, those cases are performed again with real modules and data.

Test Plan

The test strategy identifies the multiple test levels that are going to be performed for the project. Activities at each level must be planned well in advance and formally documented. Each test level is then carried out based on its individual plan.

 

The plans are to be prepared by experienced people only. In all test plans, the ETVX (Entry-Task-Validation-Exit) criteria are to be mentioned. Entry is the entry point to that phase; for example, coding must be complete before unit testing can start. Task is the activity that is performed. Validation is the way in which progress, correctness, and compliance are verified for that phase. Exit states the completion criteria of that phase, after validation is done; for example, the exit criterion for unit testing is that all unit test cases must pass.
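As a purely illustrative aid (not part of the original text), the ETVX criteria for the unit-testing example above could be recorded in a simple structure; the wording is a paraphrase:

```python
# ETVX criteria for the unit-testing phase, as described in the text.
unit_test_etvx = {
    "entry": "Coding of the unit is complete",
    "task": "Execute the unit test cases",
    "validation": "Verify progress, correctness, and compliance of the phase",
    "exit": "All unit test cases pass",
}

for phase, criterion in unit_test_etvx.items():
    print(f"{phase.title()}: {criterion}")
```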

 

ETVX is a modeling technique for developing both high-level and atomic-level process models. It stands for Entry, Task, Verification, and Exit. It is a task-based model in which the details of each task are explicitly defined in a specification table covering Entry, Exit, Task, Feedback In, Feedback Out, and measures.

There are two types of cells: unit cells and implementation cells. An implementation cell is basically a unit cell containing further tasks.

For example, if there is a task of size estimation, there will be a unit cell for size estimation. Since this task has further sub-tasks (define measures, estimate size), the unit cell containing these sub-tasks is referred to as the implementation cell, and a separate table is constructed for it.

A purpose is also stated, and the viewer of the model may be defined, e.g. top management or the customer.

 

18.2.1 Unit Test Plan {UTP}

The unit test plan is the overall plan to carry out the unit test activities. The lead tester prepares it and distributes it to the individual testers. It contains the following sections.

 

18.2.1.1 What is to be tested?

The unit test plan must clearly specify the scope of unit testing. Normally the basic input/output of the units, along with their basic functionality, is tested; the input units are mostly tested for format, alignment, accuracy, and totals. The UTP clearly gives the rules for what data types are present in the system, their formats, and their boundary conditions. The list may not be exhaustive, but it should be as complete as possible.

 

18.2.1.2 Sequence of Testing

The sequence of test activities to be carried out in this phase is listed in this section: whether to execute positive or negative test cases first, whether to execute test cases based on priority or on test groups, and so on. Positive test cases prove that the system does what it is supposed to do; negative test cases prove that the system does not do what it is not supposed to do. Testing of the screens, files, database, etc. is to be given in the proper sequence.

 

18.2.1.4 Basic Functionality of Units

This section covers how the independent functionalities of the units are tested, excluding any communication between the unit and other units; the interface part is out of scope at this test level. Apart from the above sections, the following sections are addressed, very specific to unit testing.

·         Unit Testing Tools

·         Priority of Program units

·         Naming convention for test cases

·         Status reporting mechanism

·         Regression test approach

·         ETVX criteria

 

18.2.2 Integration Test Plan

The integration test plan is the overall plan for carrying out the activities at the integration test level. It contains the following sections.

 

18.2.2.1 What is to be tested?

This section clearly specifies the kinds of interfaces that fall under the scope of testing: internal and external interfaces, with the requests and responses to be exchanged. It need not go deep into technical details, but the general approach to how the interfaces are triggered should be explained.

 

18.2.2.2 Sequence of Integration

When there are multiple modules present in an application, the sequence in which they are to be integrated is specified in this section. Here the dependencies between the modules play a vital role. If unit B has to be executed, it may need data fed by unit A and unit X; in that case, units A and X have to be integrated first, and then, using that data, unit B is tested. This has to be stated for the whole set of units in the program. Done correctly, the testing activities slowly build the product, unit by unit, and then integrate the units.
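Deriving an integration sequence from such dependencies is, in effect, a topological sort. A minimal sketch using Python's standard library (Python 3.9+); the module names follow the A/X/B example above:

```python
from graphlib import TopologicalSorter

# Map each unit to the units whose data it depends on
# (B needs data fed by A and X, as in the text).
dependencies = {"B": {"A", "X"}, "A": set(), "X": set()}

order = list(TopologicalSorter(dependencies).static_order())
print(order)  # A and X in some order, then B last
assert order[-1] == "B"
```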

 

18.2.2.3 List of Modules and Interface Functions

There may be any number of units in the application, but only the units that are going to communicate with each other are tested in this phase. If the units were designed to be mutually independent, interfaces would not come into the picture; this is almost impossible in any real system, as units have to communicate with other units to get different functionalities executed. In this section we list the units and mention for what purpose each talks to the others. This does not go into technical aspects; it is explained at a higher level, in plain English.

 

Apart from the above sections, the following sections are addressed, very specific to integration testing.

·         Integration Testing Tools

·         Priority of Program interfaces

·         Naming convention for test cases

·         Status reporting mechanism

·         Regression test approach

·         ETVX criteria

·         Build/Refresh criteria {When multiple programs or objects are linked to arrive at a single product and one unit is modified, the entire product may need to be rebuilt and loaded into the integration test environment. When and how often the product is rebuilt and refreshed is to be mentioned here}.

 

18.2.3 System Test Plan {STP}

The system test plan is the overall plan for carrying out the system test level activities. In the system test, apart from testing the functional aspects of the system, some special testing activities are carried out, such as stress testing. The following sections are normally present in a system test plan.

 

18.2.3.1 What is to be tested?

This section defines the scope of system testing, very specific to the project. Normally system testing is based on the requirements; all requirements are to be verified within the scope of system testing. This covers the functionality of the product. Apart from this, any special testing to be performed is also stated here.

 

18.2.3.2 Functional Groups and the Sequence

The requirements can be grouped in terms of functionality. Based on this, there may also be priorities among the functional groups. For example, in a banking application, anything related to customer accounts can be grouped into one area, and anything related to inter-branch transactions into another. In the same way, the areas for the product being tested are to be mentioned here, along with the suggested sequence of testing these areas based on their priorities.

 

18.2.3.3 Special Testing Methods

This covers the different special tests such as load/volume testing, stress testing, and interoperability testing. These tests are performed based on the nature of the product; it is not mandatory that every one of them be performed for every product.

Apart from the above sections, the following sections are addressed, very specific to system testing.

 

·         System Testing Tools

·         Priority of functional groups

·         Naming convention for test cases

·         Status reporting mechanism

·         Regression test approach

·         ETVX criteria

·         Build/Refresh criteria

 

18.2.4 Acceptance Test Plan {ATP}

The client performs the acceptance testing at their own site. It will be very similar to the system test performed by the software development unit. Since the client decides the format and testing methods of acceptance testing, there is no definite prescription for how they will carry it out, but it will not differ much from the system testing. Assume that all the rules applicable to the system test also apply to acceptance testing.

 

Since this is the one level of testing done by the client for the overall product, it may include test cases drawn from the unit and integration test levels as well.

 

A sample Test Plan Outline, along with descriptions, is shown below:

 

Test Plan Outline

  1. BACKGROUND – This item summarizes the functions of the application system and the tests to be performed.
  2. INTRODUCTION
  3. ASSUMPTIONS – Indicates any anticipated assumptions which will be made while testing the application.
  4. TEST ITEMS - List each of the items (programs) to be tested.
  5. FEATURES TO BE TESTED - List each of the features (functions or requirements) which will be tested or demonstrated by the test.
  6. FEATURES NOT TO BE TESTED - Explicitly lists each feature, function, or requirement which won't be tested and why not.
  7. APPROACH - Describe the data flows and test philosophy.
    Simulation or Live execution, Etc. This section also mentions all the approaches which will be followed at the various stages of the test execution.
  8. ITEM PASS/FAIL CRITERIA - Blanket statement or itemized list of expected outputs and tolerances.
  9. SUSPENSION/RESUMPTION CRITERIA - Must the test run from start to completion?
    Under what circumstances it may be resumed in the middle?
    Establish check-points in long tests.
  10. TEST DELIVERABLES - What, besides software, will be delivered?
    Test report
    Test software
  11. TESTING TASKS - Functional tasks (e.g., equipment set-up)
    Administrative tasks
  12. ENVIRONMENTAL NEEDS
    Security clearance
    Office space & equipment
    Hardware/software requirements
  13. RESPONSIBILITIES
    Who does the tasks in Section 11?
    What does the user do?
  14. STAFFING & TRAINING
  15. SCHEDULE
  16. RESOURCES
  17. RISKS & CONTINGENCIES
  18. APPROVALS
The schedule details of the various test passes, such as unit tests, integration tests, and system tests, should be clearly mentioned along with the estimated efforts.