Test data is the information used during test case development. Qualitia comes with a built-in Test Data explorer for adding and updating test data. A test case contains the following two types of test data:

  • Static Data

  • Parameterized Data

Static Data


As the name suggests, static data denotes a single static value to be used when executing test cases. Static data is given in the Parameter section for a parameter of an action. The static data must be enclosed in double quotes (""). For example, if a step contains an object named username and the action Set, then under the Parameter column, enter "John".

Whenever you change the static data for a step of a task, the change is reflected in all the associated test cases.
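For illustration only, the quoting convention can be summarized as in the following sketch. The helper below is not Qualitia code; its name and return values are invented for the example.

```python
def classify_parameter_cell(cell_value: str) -> str:
    """Illustrative helper (not part of Qualitia): show how a value in the
    Parameter column is treated under the double-quote convention above."""
    stripped = cell_value.strip()
    if len(stripped) >= 2 and stripped.startswith('"') and stripped.endswith('"'):
        # Enclosed in double quotes: a static value, used as-is in every run.
        return "static"
    # An unquoted, unique name: resolved per iteration from the Test Data explorer.
    return "parameterized"

print(classify_parameter_cell('"John"'))    # static
print(classify_parameter_cell('UserName'))  # parameterized
```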

Parameterized Data


You can define multiple sets of data to be used when executing test cases. This allows you to execute multiple iterations of a single test case in one go, each with different data. Parameterized test data ensures effective time management and maximum test coverage with minimum updates to the test case.

You can use Qualitia's Test Data explorer to add or update test data. When you click the Test Data button on the Test Cases or Requirements screen, Qualitia opens a spreadsheet where you can add or update the test data to use when executing the test case.

To add or update test data:

  1. Select a test case name under the Scenarios tree structure.
  2. Click the Test Data button in the right pane.
    The Test Data explorer opens with the iteration columns, Selection, DataSetTag, task name, and the parameterized data (which appears as a column name, not the actual data); a sketch of this layout follows these steps.
  3. Parameterize the test data for a test case iteration as well as task iteration.
    Note: A test case iteration can contain one or more task iterations.
  4. Enable or disable test case iterations by specifying True or False in the Selection column respectively.
  5. Click the save icon once all the data is added for the test case execution.
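As a rough sketch of the layout described in step 2 (the column order, task name, and parameter names are assumptions made for illustration, not an exact reproduction of the explorer):

```python
# Illustrative only: the first row carries the task name above its parameter
# columns, the second row holds the fixed columns plus one column per
# parameterized name taken from the test case. All names are examples.
header_rows = [
    ["",             "",          "",           "LoginTask", "LoginTask"],
    ["TC Iteration", "Selection", "DataSetTag", "UserName",  "Password"],
]
for row in header_rows:
    print("\t".join(row))
```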



DataSetTag

Qualitia gives you a provision to tag test case iterations with keywords based on the test data set you provide for test case execution. These tags enable you to differentiate test case iterations based on the provided test data. Using this feature, you can easily spot a test case iteration that failed or has defects just by looking at the suite execution report. Adding relevant tags to test case iterations makes it easy to identify the areas a tester needs to work on.

Where to add these tags? 

Qualitia users can add these tags at the test case iteration level from the test data screen.

Providing these tags is optional. 

Suite Execution Report 

The tags which you add to the test case iterations are displayed in the test case execution report next to the iteration number.

Suppose you have an application with multilingual support, and you create multiple test case iterations, one per language, with the data provided using the test data feature.

After executing such test cases, if any iteration fails or has defects, the tags make it easy to identify which data set, out of the whole set of iterations, is causing the failures or defects.

These tags are displayed against the test case name in the suite execution report. 
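To make the multilingual example concrete, the sketch below uses invented iteration results; the tags stand in for the DataSetTag values and the statuses for what the suite execution report would show.

```python
# Hypothetical results for one test case executed in three tagged iterations.
iteration_results = [
    {"iteration": 1, "tag": "English", "status": "Passed"},
    {"iteration": 2, "tag": "French",  "status": "Failed"},
    {"iteration": 3, "tag": "German",  "status": "Passed"},
]

# The tag shown next to the iteration number immediately identifies which
# data set needs attention.
for result in iteration_results:
    if result["status"] == "Failed":
        print(f"Iteration {result['iteration']} ({result['tag']}) failed")
```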

Parameterizing Test Data

You can execute a test case in multiple iterations using the Test Data explorer. You can also have multiple task iterations within a test case iteration.


Info

The Test Data explorer parameterizes the parameters of the test case and task steps only. You can parameterize the locator values of objects using the Editing Object screen.


To parameterize the test data for a test case:

  1. Open the test case in which you want to add test data or edit its existing test data. For instructions on how to open a test case, refer to Editing a Test Case.
  2. In the test case, under the Parameter column, replace the values you want to parameterize with a unique name, without enclosing it in double quotes.
  3. Click the Test Data button to open the Test Data explorer.

  4. Click the Edit Data button.
    The test case and test data are locked for other users.

  5. To add a test case iteration, enter the following details under these columns:

    1. TC Iteration: Enter a number for every test case iteration.
      Note: After saving the test data, the values in TC Iteration are automatically renumbered from 1, regardless of the numbers you enter.

    2. DataSetTag: Enter a unique value for the data set of a test case iteration. DataSetTag simplifies finding a test case iteration in the test case execution report.

    3. Selection: Enter True to enable execution of the test case iteration or False to disable it. You must enable at least one test case iteration to execute the test case.

    4. Enter the values, in their relevant formats, for each parameterized data column.

      Info
      • Enter $NULL$ in the cell where you want to have a blank value.
      • You can move and resize the Test Data explorer.

      • You can import test data for a test case from an Excel file and also export it as an Excel file (see the sketch after these steps).


  6. If the test case contains tasks, the first row displays the task names and the second row displays the parameterized data associated with each task.
    To add a task iteration within a test case iteration, do the following in the order listed:
    1. Insert a blank row after the test case iteration.
    2. Skip the TC Iteration number, and enter the associated Selection and DataSetTag.

    3. Enter the values in their associated format for the task parameters only.

      Info
      • The values added for the parameters of a task are not duplicated in the same task where it is used in other test cases.
      • You do not need to enter values for parameters that are not associated with the task iteration, as they are not considered when the task iteration executes.


  7. After adding or editing all the required data, click the Save button on the Test Case Editor screen.

    Info

    After saving the test data, click the Edit button to unlock the test data if you want to edit it again.


  8. Click the Test Data button present in the test case editor screen to close the Test Data explorer.
    You can now dry run the test case to verify whether the test data meets the test requirements. For more information, refer to Dry Run a Test Case.
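Putting steps 5 and 6 together, a filled-in grid for a hypothetical test case with one task might look like the sketch below, which also shows one way an Excel file could be prepared for import using openpyxl. The column order, task name, parameter names, and file name are assumptions; export existing test data from Qualitia first to see the exact layout it expects.

```python
from openpyxl import Workbook

# Illustrative grid for a hypothetical test case containing one task
# ("LoginTask") with two parameters. All names and the column order are
# assumptions made for this sketch.
rows = [
    ["",             "",          "",           "LoginTask", "LoginTask"],
    ["TC Iteration", "Selection", "DataSetTag", "UserName",  "Password"],
    # Test case iteration 1: enabled, tagged, all values filled.
    [1,              "True",      "English",    "John",      "Pass123"],
    # Task iteration within iteration 1: TC Iteration left blank (step 6),
    # Selection and DataSetTag entered, $NULL$ used for an intentional blank.
    ["",             "True",      "English-2",  "Jane",      "$NULL$"],
    # Test case iteration 2: disabled by setting Selection to False.
    [2,              "False",     "French",     "Jean",      "MotDePasse"],
]

wb = Workbook()
ws = wb.active
for row in rows:
    ws.append(row)
wb.save("test_data_sketch.xlsx")
print("Wrote test_data_sketch.xlsx")
```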

Troubleshooting the Issues in Parameterizing Test Data 

The following table provides information to troubleshoot the issues you may face while parameterizing the test data.

Info

If you have entered any data incorrectly in the Test Data explorer or in the imported excel file, a text file called TestDataErrorLog.txt is generated displaying the incorrect data. Save this file on your computer. 


Error Message: Some cells are left blank. Enter the data in the highlighted fields.
Reason: Some of the data cells are blank.
Solution: Enter the values in the blank cells. If you want some values to be blank, enter $NULL$ in the associated cells.

Error Message: Mismatch in the task name. Enter the correct task name.
Reason: Some task names do not match.
Solution: Do the following:

  1. Add the missing task along with its parameters.
  2. Correct the task names.

Error Message: Some parameter names or their order have changed. Correct the parameter names and arrange them in the original order.
Reason: Some parameter names do not match.
Solution: The Test Data explorer highlights the missing or wrong parameters in red. Correct the sequence of parameters or their names. If you have imported test data using an Excel file, you can view the missing and wrong values in TestDataErrorLog.txt.
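Before importing an Excel file, a quick consistency check along the lines of the sketch below can catch the three conditions above early. The expected task and parameter names, the fixed-column count, and the file name are assumptions for illustration; this is not Qualitia's own importer logic.

```python
from openpyxl import load_workbook

# Expected names as defined in the test case; example values only.
EXPECTED_TASKS = {"LoginTask"}
EXPECTED_PARAMS = ["UserName", "Password"]

# Reads the file written by the earlier sketch.
wb = load_workbook("test_data_sketch.xlsx")
ws = wb.active
rows = list(ws.iter_rows(values_only=True))

task_row, param_row, data_rows = rows[0], rows[1], rows[2:]
problems = []

# 1. Task names that do not match the test case.
for name in task_row:
    if name and name not in EXPECTED_TASKS:
        problems.append(f"Unknown task name: {name}")

# 2. Parameter names missing, renamed, or out of order
#    (the first three columns are assumed to be fixed).
if list(param_row[3:]) != EXPECTED_PARAMS:
    problems.append(f"Parameter columns {list(param_row[3:])} do not match {EXPECTED_PARAMS}")

# 3. Blank data cells; use $NULL$ for an intentional blank. The TC Iteration
#    column may legitimately be blank for task iteration rows, so skip it.
for sheet_row, row in enumerate(data_rows, start=3):
    for col, value in enumerate(row, start=1):
        if col != 1 and (value is None or value == ""):
            problems.append(f"Blank cell at row {sheet_row}, column {col}")

print("\n".join(problems) or "No obvious issues found")
```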