A test case consists of steps and may contain one or more tasks. Each step contains an action along with its associated objects and parameters. Define the sequence of steps in a test case so that it resembles the flow of a test scenario. You can add test data as required.

A test case determines whether the application under test meets the defined specifications under a given set of conditions or variables. Test cases are also created to validate whether an application meets the business and functional requirements.

Before you create a test case, create a test scenario or select from available test scenarios.

The Test Case Editor screen contains buttons that you can use to add steps or execute a test case.

The following list describes each button.

Create New Task: To create a new task while adding or editing a test case. You need to select the step after which you want to create the new task.

Create Tasks Using the Selected Steps: To create a new task using selected steps of the test case. You need to select the steps.

Import Task: To add a task that is already created in the project. Select the step after which you want to add the task.

Detach Task: To remove a task from the test case. You need to select the task that you want to remove.

Add Test Case Step: To add an independent test case step, that is, a step that is not part of any task.

Add Task Step: To add a step in a task. You need to select the task in which you want to add the step.
Note: If you add a new step, ensure that the execution of all the associated test cases is not affected.

Remove Step: To remove a step from the test case. You need to select the consecutive steps that you want to remove.

Add Conditional Block: To insert a conditional statement. For more information, refer to Conditional Structure (IF-ELSE).

Move Up: To move the selected step up.

Move Down: To move the selected step down.
Start Web Recording: To start web recording on a web application under test. While a recording session is in progress, this button is replaced with the Pause Web Recording button.
Note: If you have already dry run a test case, you cannot record in the same session. You need to close the browser, open the test case, and then start recording.

Stop Web Recording: To stop the web recording session.

Start Mobile Recording: To start mobile recording on a mobile application under test.
Note: If you have already dry run a test case, you cannot record in the same session. You need to close the browser, open the test case, and then start recording.

Stop Mobile Recording: To stop the mobile recording session.

Add Test Data: To insert test data. For more information, refer to Adding Test Data for a Test Case.

Dry Run: To dry run the test case. For more information, refer to Dry Run a Test Case.

Debug: To debug a test case. For more information, refer to Debugging a Test Case.

Debug Logs: To open the debug execution logs.

Go To Line: To go to a specific line of the test case.

Previous Error: To go to the previous step that has been identified as one of the following:
  • Deleted: The objects are deleted.
  • Empty: The objects are not added.
  • New Test Data: The step needs a new set of test data.

Next Error: To go to the next step that has been identified as one of the following:
  • Deleted: The objects are deleted.
  • Empty: The objects are not added.
  • New Test Data: The step needs a new set of test data.

Save: To save the test case every time you update it. You can also click the arrow button to save it as another test case with a different name.

Adding or Editing a Test Case Manually

Define the sequence of steps in a test case so that it resembles the flow of a test scenario.

You need to add or edit some or all the steps of a test case manually in the following situations:

  • To add certain actions that are not supported by Qualitia Automation Recorder.

  • To create a test case to automate testing on desktop applications.

  • To add certain General Web Actions and Mobile Actions.

  • To define conditional statements at the appropriate places.

  • To add custom actions at the appropriate places.

Before you start adding or editing steps manually, ensure that all the objects of the application under test are added in the Object Repository. For more information, refer to Working with Objects.

To add or edit a test case manually:

  1. From the Expand Menu, click Develop.

  2. On the Test Case Editor screen, from the left pane, open an existing scenario or create a scenario. For more details, refer to Working with Scenarios.

  3. You can edit or add a test case in the selected scenario:

    1. To edit a test case

      • Select a test case.
        The Test Case Editor screen opens.

        If a test case is underlined in red, open the test case and resolve the steps that are also underlined in red. For more information, refer to Editing a Highlighted Step.


    2. To add a test case

      1. On the right pane, click the New Test Case button ().

      2. Enter a Test Case Name, Description, Manual TC ID, and File Name.

      3. Click OK.
        The Test Case Editor screen opens.

  4. (Optional) To add a task from the first step, click Create Task. Enter a name and description for the task, and then click OK.
    To add steps for the added task, refer to step 5.

  5. To add an independent step or a task step

    1. Click the Add Test Case Step button () to add a blank step in the Test Case Editor screen.

    2. To add an object, double-click the Object cell to open the Object Repository hierarchy, and then select an object. For more information on objects, refer to Working with Objects.
      Note: You can right-click the object and select Edit Object to edit its properties. For more information, refer to Editing an Object.
      Some Generic Actions, such as OpenURL and Maximize Browser, do not require an object.

    3. Double-click the Action cell and click the arrow button to select an Action.

      Note: As per the selected object, only the relevant actions are displayed. For more information, refer to Qualitia Actions.
      Based on the selected action, the associated parameters are automatically displayed in one or more parameter cells.

    4. Enter the relevant values for the parameters. For more information, refer to Qualitia Actions.
      Similarly, add any other steps that you want in the test case.

  6. To add a task using the steps that you have defined

    1. Select a step, press Ctrl and select the other steps that you want to add to the task, right-click, and then select Add New Task.

    2. Enter a name and description for the task, and then click OK.
      To add steps to the task, refer to step 5.

  7. To import an existing task, select the step after which you want to insert the task, and then click the Import Task button () on the Test Case Editor screen.
    To edit the steps of the imported task, refer to step 5.

    If you edit the steps of the imported task, ensure that there is no impact on other test cases that use the same task.
    The task steps have a white background, which distinguishes them from the independent test case steps.


  8. To move a task or step to its appropriate place, select the task or step, and then click the Move Up button () or the Move Down button ().

  9. After adding or editing all the tasks and steps, click the Save button () to save the test case.

If you add test data to a task step, you need to update the same task in the other associated test cases. The associated test cases and test scenarios are automatically underlined in red in the left pane. If you execute the associated test cases (underlined in red) without updating the test data, the test case execution fails.

For more information, refer to Editing a Highlighted Step.

You can now click the Dry Run button () to dry run the test case and verify whether the expected results are achieved. For more information, refer to Dry Run a Test Case.

Qualitia recommends saving the test case after every change.

You can use the following Qualitia Automation Studio features to enrich a test case:

Editing a Highlighted Step

The Test Case Editor highlights the steps that have issues with a red underline and a vertical red line next to the step number. Similarly, the associated problematic tasks, test cases, and test scenarios that need resolution are also highlighted.

You must resolve the highlighted steps to avoid test case execution failures.

The following figure shows how steps, tasks, test cases, and test scenarios are highlighted:

The highlighted steps may have one of the following problems:

  • The task step does not have test data. Add the relevant test data for that task step. For instructions on how to add test data, refer to Adding Test Data for a Test Case.

  • The step contains an empty object, which means that one or more of the object property values are blank, such as locator type and locator value. 

  • The step contains a deleted object, which applies to desktop projects only. The object was deleted from the object repository due to updates in the synchronized TSR file.

To navigate through the highlighted steps, select any step in the Test Case Editor screen, and then click the Next Error button () or the Previous Error button ().

Recording a Test Case using Qualitia Automation Studio Recorder


Dry Run a Test Case

A dry run is a trial execution to verify whether a test case is created as per the test requirement.

The dry run test case execution results are not displayed in the Reports and Dashboard portal. A dry run covers all the task iterations and test case iterations as per the set of values defined in the associated Test Data explorer.

Before you dry run a test case, ensure that you have configured the settings in the System Execution Profile as per the requirements.

You can dry run a test case on a local environment only.

In the System Execution Profile, you can select the Keep Dry Run Reports checkbox to save the test case dry run report without overwriting the previous dry run reports.

To dry run a test case:

  1. Open a test case. For instructions on how to open a test case, refer to Adding or Editing a Test Case Manually.

  2. Click the Dry Run button ( ) to dry run the test case.

    • To dry run a test case on mobile platform, ensure that you have configured the system requirements and settings in the associated execution profile. 
      For more information, refer to Requirements and Configuration Settings for Mobile Environments.

    • You can start a dry run of a test case from any step for a desktop project only. Select the step from which you want to execute, right-click it, and then select Execute from the Step.
  3. If you want to stop the test execution at any step, click Stop.
  4. After the dry run completes, the test case execution report is displayed.
    For more details on the Test Case Execution report, refer to Test Case Iteration Detailed Report.

    • If some steps have failed in the dry run, you can refer to Failure Summary to know the reasons for their failure.

    • If the Keep Dry Run Reports checkbox is not selected in the System Execution Profile, the associated test case report is automatically stored in the following location:
      <LocalAppData>\QualitiaWDClient\app-8.3.x\WebDriver\ReportResources\1\ITR_xxx
      In this case, when you dry run the same or another test case, this report is overwritten (see the sketch after this procedure for a way to locate these folders).

    • If the Keep Dry Run Reports checkbox is selected in the System Execution Profile, the associated test case report is automatically stored in the following location:
      AppData\Local\Qualitia\WD\Client\ResultPath\DryRun\<machinename>\<username>\<projectname>\<Buildname>\<Releasenumber>\ITR_xxx

    • You cannot log a defect from the dry run test case execution report even if the Qualitia project is integrated with a defect management system, such as Jira or TFS/Azure DevOps.
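
If you need to collect or inspect dry run reports outside Qualitia, the following Python sketch locates the report folders based on the default locations described above. It is only an illustration: the app-8.3.x and ITR_xxx folder names vary per installation and per run (so the sketch globs for them), and nothing here is a documented Qualitia interface.

    import glob
    import os

    # Default location when the "Keep Dry Run Reports" checkbox is not selected.
    # The app-* and ITR_* folder names vary per installation and per dry run.
    base = os.path.join(os.environ["LOCALAPPDATA"], "QualitiaWDClient")
    pattern = os.path.join(base, "app-*", "WebDriver", "ReportResources", "1", "ITR_*")

    # The most recently modified folder holds the report of the latest dry run;
    # earlier reports in this location are overwritten.
    reports = sorted(glob.glob(pattern), key=os.path.getmtime, reverse=True)
    if reports:
        print("Latest dry run report folder:", reports[0])
    else:
        print("No dry run report found under", base)

Adapt the pattern to the ResultPath\DryRun location shown above if the Keep Dry Run Reports checkbox is selected.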


Adding Test Data for a Test Case

A test case contains the following two types of test data:

  • Static Data

  • Parameterized Data

Static Data

As the name suggests, static data denotes a single static value for a parameter of an action. Static data must be enclosed in double-quotes (""). For example, if a step uses the object username with the action Set, then under the Parameter column, enter "John".

Whenever you change the static data for a step of a task, the change is reflected in all the associated test cases.

Parameterized Data

You can define multiple sets of data to execute a test case in multiple iterations using the Test Data explorer. Also, you can have multiple task iterations within a test case iteration.

The Test Data explorer parameterizes only the values that are present under the parameter columns of test case and task steps. You can parameterize the locator values of objects using the Edit Object screen.
If you do not add test data for a step, that step and its parent task are underlined in red. The test case and test scenario are also underlined in red in the left pane of the Test Case Editor screen.
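
For example, a step that uses the object username with the action Set can carry either of the following values in its Parameter column (the parameter name LoginUser is purely illustrative):

    "John"       Static data: every iteration uses this same value.
    LoginUser    Parameterized data: the value for each iteration is taken from the Test Data explorer.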

To parameterize the test data for a test case:

  1. Open the test case in which you want to add test data or edit its existing test data. For instructions on how to open a test case, refer to Adding or Editing a Test Case Manually.
  2. In the test case, under the Parameter columns, replace each value that you want to parameterize with a unique parameter name, without enclosing it in double-quotes.
  3. Click the Test Data button () to open the Test Data explorer.

  4. Click the Edit Data button.
    The test case and test data are locked for other users.

  5. To add a test case iteration, under the following columns, enter the following details:

    1. TC Iteration: Enter a number for every test case iteration.
      Note: After saving the test data, the values in TC Iteration are automatically renumbered from 1, regardless of the numbers you provide.

    2. DataSetTag: Enter a unique tag for the data set of the test case iteration. The DataSetTag makes it easier to find a test case iteration in the test case execution report.

    3. Selection: Enter True to enable the execution of the test case iteration or False to disable it. You must enable at least one test case iteration to execute the test case.

    4. Enter the values in their relevant formats for each parameterized data.

      • Enter $NULL$ in the cell where you want to have a blank value.
      • You can move and resize the Test Data explorer.

      • You can import test data for a test case from an Excel file and also export it as an Excel file (see the sketch after this procedure).

  6. If the test case contains tasks, the first row displays the task names, and the second row displays the parameterized data associated with each task.
    To add a task iteration within a test case iteration, do the following in the order listed:
    1. Insert a blank row after the test case iteration.
    2. Leave the TC Iteration, Selection, and DataSetTag columns blank.

    3. Enter the values in their associated format for the task parameters only.

      • The values that you add for the parameters of a task are not copied to the same task in other test cases.
      • You do not need to enter values for parameters that are not associated with task iterations; they are not considered during task iteration execution.
  7. After adding or editing all the required data, click the Save button on the Test Case Editor screen.

    After saving the test data, click the Edit Data button again to unlock the test data for further editing.


  8. Click the Test Data button on the Test Case Editor screen to close the Test Data explorer.
    You can now dry run the test case to verify whether the test data meets the test requirements. For more information, refer to Dry Run a Test Case.
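
For reference, the following is a minimal sketch of preparing a test data sheet for import, using Python and openpyxl. The parameter names (Username, Password), the sample values, and the file name are illustrative assumptions, not part of Qualitia; the safest way to confirm the exact layout expected by the import is to export the existing test data first and mirror that file.

    from openpyxl import Workbook

    wb = Workbook()
    ws = wb.active

    # Header row: the iteration-control columns followed by the parameterized columns.
    ws.append(["TC Iteration", "DataSetTag", "Selection", "Username", "Password"])

    # Two test case iterations; Selection set to True enables the iteration.
    ws.append([1, "ValidLogin", "True", "John", "Secret1"])
    ws.append([2, "BlankPassword", "True", "John", "$NULL$"])  # $NULL$ marks an intentionally blank value

    # A task iteration row: TC Iteration, DataSetTag, and Selection stay blank,
    # and only the task parameters carry values.
    ws.append(["", "", "", "Smith", "Secret2"])

    wb.save("testdata_import.xlsx")  # illustrative file name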

Troubleshooting the Issues in Parameterizing Test Data 

The following information helps you troubleshoot the issues you may face while parameterizing the test data.

If you have entered any data incorrectly in the Test Data explorer or in the imported Excel file, a text file called TestDataErrorLog.txt is generated that lists the incorrect data. Save this file on your computer.


Error message: Some cells are left blank. Enter the data in the highlighted fields.
Reason: Some of the data cells are blank.
Solution: Enter the values in the blank cells. If you want some values to remain blank, enter $NULL$ in the associated cells.

Error message: Mismatch in the task name. Enter the correct task name.
Reason: Some task names do not match.
Solution: Do the following:
  1. Add the missing task along with its parameters.
  2. Correct the task names.

Error message: Some parameter names or their order have changed. Correct the parameter names and arrange them in the original order.
Reason: Some parameter names do not match.
Solution: The Test Data explorer highlights the missing or wrong parameters in red. Correct the sequence of the parameters or their names. If you have imported test data using an Excel file, you can view the missing and wrong values in TestDataErrorLog.txt.
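
If you maintain test data in Excel files, a small pre-import check can catch the most common of these errors (blank cells) before Qualitia reports them. The following is a minimal sketch assuming openpyxl is available and an exported file named testdata_export.xlsx (both are assumptions, not part of Qualitia):

    from openpyxl import load_workbook

    # Flag blank cells so they can be filled in, or set to $NULL$ for values
    # that are meant to stay blank, before importing through the Test Data explorer.
    wb = load_workbook("testdata_export.xlsx")  # illustrative file name
    ws = wb.active

    blanks = [
        cell.coordinate
        for row in ws.iter_rows(min_row=2)  # skip the header row
        for cell in row
        if cell.value in (None, "")
    ]

    if blanks:
        print("Blank cells found (fill them or enter $NULL$):", ", ".join(blanks))
    else:
        print("No blank cells found.")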

Removing a Test Case


Empty Data Handling


Mandatory / Non-Mandatory Action Parameters


Critical Data Protection


