Test Data


A test case contains the following two types of test data:

  • Static Data

  • Parameterized Data

Static Data

As the name suggests, static data is a single fixed value for a parameter of an action. Static data must be enclosed in double quotes (""). For example, if a step contains the object username and the action Set, enter "John" under the Parameter column.

Whenever you change the static data for a step of a task, the change is reflected in all the associated test cases.
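
For reference, a minimal illustration of the example above (the Object, Action, and Parameter column names follow this page; the step layout is simplified):

  Object     Action   Parameter
  username   Set      "John"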

Parameterized Data

Using the Test Data explorer, you can define multiple sets of data to execute a test case in multiple iterations. You can also run multiple task iterations within a single test case iteration.

The Test Data explorer parameterizes only the values under the Parameter columns of test case and task steps. You can parameterize the locator values of objects using the Editing Object screen.
If you do not add test data for a step, that step and its parent task are underlined in red. The test case and test scenario are also underlined in red in the left pane of the Test Case screen.
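
For example, to parameterize the username from the earlier step, you would replace "John" under the Parameter column with a unique name such as UserName, without double quotes; the Test Data explorer then supplies a value for UserName in each iteration. (UserName is an illustrative name, not one predefined by the product.)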

To parameterize the test data for a test case:

  1. Open the test case in which you want to add test data or edit its existing test data. For instructions on how to open a test case, refer to Adding or Editing a Test Case Manually.
  2. In the test case, under the Parameter columns, replace each value you want to parameterize with a unique name, without enclosing it in double quotes.
  3. Click the Test Data button to open the Test Data explorer.

  4. Click the Edit Data button.
    The test case and test data are locked for other users.

  5. To add a test case iteration, enter the following details under these columns (see the example layout after this procedure):

    1. TC Iteration: Enter a number for each test case iteration.
      Note: After you save the test data, the values in the TC Iteration column are automatically renumbered from 1, regardless of the numbers you enter.

    2. DataSetTag: Enter a unique value for the data set of the test case iteration. The DataSetTag makes it easier to find a test case iteration in the test case execution report.

    3. Selection: Enter True to enable execution of the test case iteration, or False to disable it. You must enable at least one test case iteration to execute the test case.

    4. Enter a value, in the relevant format, for each parameterized column.

      • Enter $NULL$ in the cell where you want to have a blank value.
      • You can move and resize the Test Data explorer.

      • You can import test data for a test case from an Excel file and also export it to an Excel file.

  6. If the test case contains tasks, the first row displays the task names, and the second row displays the parameterized data associated with each task.
    To add a task iteration within a test case iteration (see the example layout after this procedure), do the following in the order listed:
    1. Insert a blank row after the test case iteration.
    2. Leave the TC Iteration, Selection, and DataSetTag columns blank.

    3. Enter values, in their relevant formats, for the task parameters only.

      • Values added for a task's parameters are not duplicated in the same task when it is used in other test cases.
      • You do not need to enter values for parameters that are not associated with the task iteration; they are not considered when the task iteration is executed.
  7. After adding or editing all the required data, click the Save button on the Test Case Editor screen.

    After saving the test data, click the Edit Data button again to unlock the test data for further editing.


  8. Click the Test Data button on the Test Case Editor screen to close the Test Data explorer.
    You can now dry run the test case to verify that the test data meets the test requirements. For more information, refer to Dry Run a Test Case.
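
A sketch of how test data for this procedure might look in the Test Data explorer. The parameter names (UserName, Password) and all values are illustrative, not predefined by the product; the task-name header row described in step 6 is omitted for brevity. Rows 1 and 2 are test case iterations; the last row, with TC Iteration, DataSetTag, and Selection left blank, is a task iteration added within iteration 2.

  TC Iteration   DataSetTag     Selection   UserName   Password
  1              ValidLogin     True        John       Secret1
  2              InvalidLogin   True        Mary       $NULL$
                                            Alice      Secret2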

Troubleshooting Issues in Parameterizing Test Data

The following table provides information to troubleshoot the issues you may face while parameterizing the test data.

If you have entered any data incorrectly in the Test Data explorer or in the imported Excel file, a text file named TestDataErrorLog.txt is generated, listing the incorrect data. Save this file on your computer.


Error Message: Some cells are left blank. Enter the data in the highlighted fields.
Reason: Some of the data cells are blank.
Solution: Enter values in the blank cells. If you want a value to remain blank, enter $NULL$ in the associated cell.

Error Message: Mismatch in the task name. Enter the correct task name.
Reason: Some task names do not match.
Solution: Do the following:
  1. Add the missing task along with its parameters.
  2. Correct the task names.

Error Message: Some parameter names or their order have changed. Correct the parameter names and arrange them in the original order.
Reason: Some parameter names do not match.
Solution: The Test Data explorer highlights the missing or incorrect parameters in red. Correct the parameter names or their sequence. If you imported the test data from an Excel file, you can also view the missing and incorrect values in TestDataErrorLog.txt.
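
One common source of the blank-cell error is an imported Excel file with empty cells. The following is a minimal sketch, not part of the product: it assumes the test data has been exported to an Excel file (the file name testdata.xlsx is hypothetical) with the columns described on this page, and it uses the pandas library to flag blank cells so you can fill them in, or enter $NULL$, before importing the file back.

  # Minimal sketch; requires pandas and openpyxl. Not part of the product.
  import pandas as pd

  def find_blank_cells(path: str):
      """Return (Excel row number, column name) pairs for every blank cell."""
      df = pd.read_excel(path, dtype=str)  # read every cell as text
      blanks = []
      for column in df.columns:
          for row_index, value in df[column].items():
              if pd.isna(value) or str(value).strip() == "":
                  # +2 accounts for the header row and Excel's 1-based rows
                  blanks.append((row_index + 2, column))
      return blanks

  if __name__ == "__main__":
      for row, column in find_blank_cells("testdata.xlsx"):  # hypothetical file name
          print(f"Blank cell in column '{column}', row {row}: "
                f"enter a value, or $NULL$ if it should stay blank.")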
