
Test Data are the values used during test case development. Qualitia comes with a built-in Test Data explorer for adding and updating test data.

While adding or editing a test case, you can provide the data that a test case uses in the following two ways:

  • Static Data

  • Parameterized Data

Static Data

As the name suggests, static data is a single fixed value used when executing test cases. Static data is entered in the parameter section against each action in Qualitia and must be enclosed in double quotes ("").

For example, if the object is username and the action is Set, then under the Parameter column, type "John".

Whenever you change the static data for a step of a task, the change is reflected in all instances of that task across all test cases.

Parameterized Data

Qualitia lets you use multiple sets of data when executing test cases.

It allows multiple iterations of a single test case, or multiple iterations of a single task within a test case iteration.

The Test Data feature ensures effective time management and maximum test coverage with minimum updates to the test case. In a Test Data sheet, a TC iteration flows from left to right, whereas a Task iteration within a TC iteration flows from top to bottom.
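For illustration, a Test Data sheet for a test case that uses a single hypothetical Login task with Username and Password parameters (the task, parameter, and data names below are examples only, not part of Qualitia) could look like this:

                                          Login       Login
TC Iteration   Selection   DataSetTag     Username    Password
1              True        ValidUser      John        Pass123
                                          Mary        Pass456
2              False       LockedUser     Mark        Expired1

In this sheet, rows 3 and 5 each start a TC iteration (read left to right), while row 4, which has a blank TC Iteration, is an additional Task iteration within TC iteration 1 (read top to bottom). TC iteration 2 has Selection set to False, so it would be skipped during execution.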

You can use the Qualitia Test Data Explorer to add or update test data. When you click the Test Data button on the Test Cases Editor screen, Qualitia opens a spreadsheet in which you can add or update the Test Data to use while executing the test case.

To add or edit Test Data:

  1. Click the Test Data button present in the right pane. The Test Data Explorer opens.

  2. Click the Edit Data button to start adding or editing values in the Test Data Explorer. The whole test case is locked for that user.

  3. The following columns are displayed in the second row:

    1. TC Iteration: Enter the number for every TC Iteration. There must be at least one TC Iteration for executing the test case.

    2. Selection: Enter True or False for enabling or disabling iterations. Disabled iterations (False) are not considered when the test case is executed.

    3. DataSetTag: Enter a label for identifying the dataset in the test case execution report. Entering data in this column is optional.

  4. From the third row onward, enter the values in their respective formats. The first row displays the different Tasks highlighted in grey, and each Task's parameters are listed below it in the second row.

  5. If you want a Task iteration, insert a row without specifying the TC Iteration number, and then enter the values for the parameters. Add more Task iterations by inserting new rows (see the parsing sketch after the notes below).
  6. Once all the data is added or edited, click the Save button located in the Test Editor screen.
  • Enter $NULL$ in the respective cell when you do not want to add a value for a parameter in a TC or Task iteration.
  • The Test Data screen is movable and can be resized up to full screen.
  • You can export and import Test Data as an Excel file.
  • After saving test data, if you want to edit it again, click the Edit Data button to unlock the test data.
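The following is a minimal sketch, assuming an exported Test Data workbook that follows the layout described above (first row for task names, second row for TC Iteration, Selection, DataSetTag and parameter names, values from the third row). It shows how rows group into TC iterations and Task iterations; the file name, column positions, and function are illustrative assumptions, not a Qualitia API.

from openpyxl import load_workbook

def load_iterations(path="TestData.xlsx"):
    # Row 1: task names; Row 2: TC Iteration / Selection / DataSetTag plus
    # parameter names; Row 3 onward: data values (layout assumed, see above).
    rows = list(load_workbook(path).active.iter_rows(values_only=True))
    params = rows[1][3:]
    iterations = []
    for row in rows[2:]:
        tc_iteration, selection, dataset_tag = row[0], row[1], row[2]
        values = dict(zip(params, row[3:]))
        if tc_iteration is not None:
            # A row with a TC Iteration number starts a new TC iteration.
            iterations.append({
                "tc_iteration": int(tc_iteration),
                "enabled": str(selection).strip().lower() == "true",
                "dataset_tag": dataset_tag,
                "task_iterations": [values],
            })
        elif iterations:
            # A row with a blank TC Iteration is an additional Task iteration
            # within the most recent TC iteration.
            iterations[-1]["task_iterations"].append(values)
    return iterations

With the example sheet shown earlier, this sketch would return two TC iterations, the first containing two Task iterations and the second disabled because its Selection value is False.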


Validations in Parameterized Test Data 

The following validations are applied when you save test data or import test data through an Excel file:

  • Blank Data

    The error message displayed is “Some cells are left blank. Enter the data in the highlighted fields.” Enter the values in the blank cells. If you do not want to add values, enter $Null$.


  • Task Name Mismatch

    The error message displayed is “Mismatch in the task name. Enter the correct task name.” Correct any task name that is missing or has changed.


  • Parameters Mismatch

    The error message displayed is “Some Parameters names or their order have changed. Correct the parameter names and arrange them in the original order.” Correct the parameter names or their sequence.

In the Test Data Explorer, the missing or incorrect values are highlighted in red. When you upload through an Excel file, you can view these messages in TestDataErrorlog.txt.
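As a rough sketch of the blank-data check only (not Qualitia's implementation), the same rows read in the earlier parsing sketch could be scanned for empty parameter cells before importing, so that each is filled with a value or with $Null$:

def find_blank_parameter_cells(rows):
    # Checks only the parameter columns (column 4 onward); the TC Iteration,
    # Selection, and DataSetTag columns follow their own rules described above.
    blanks = []
    for row_number, row in enumerate(rows[2:], start=3):
        for column_number, value in enumerate(row[3:], start=4):
            if value is None or str(value).strip() == "":
                blanks.append((row_number, column_number))
    return blanks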



DataSetTag

Qualitia lets you tag test case iterations with keywords based on the test data set you provide for test case execution. These tags enable you to differentiate test case iterations based on the provided test data. Using this feature, you can easily spot a test case iteration that has failed or has defects just by looking at the suite execution report. Identifying the area where a tester needs to work becomes easier when you add relevant tags to the test case iterations.

Where to add these tags? 

Qualitia users can add these tags at the test case iteration level from the test data screen.

Providing these tags is optional. 

Suite Execution Report 

The tags that you add to the test case iterations are displayed in the test case execution report next to the iteration number.

When executing such test cases, if any iteration fails or has defects, the tags make it easy to identify which set of data, out of the whole set of iterations, is causing the failures or defects.

These tags are displayed against the test case name in the suite execution report. 









