Dashboards let you analyze the results of the test suites executed using Qualitia Automation Studio. They give product owners, managers, and testers a holistic view of execution results across projects and test environments.
You can access the Web-based dashboards from the Automation Insights section.
Installation Process for Reports and Dashboards Portal
The Web-based dashboard is a great way to understand the overall progress of a project. It lets you analyze the results of test cases executed in Qualitia Automation Studio for Web, Mobile, or Desktop applications.
You can either enable this dashboard functionality while installing/upgrading Qualitia Server or you can do it after you complete the installation or upgrade process.
Qualitia uses an additional database (in addition to the master and project databases) for managing data related to the dashboard functionality.
To enable Reports and Dashboards Portal:
- Log on to the machine where you have installed Qualitia Server.
- Double-click the Qualitia Automation Studio Server icon.
The Welcome to Qualitia Setup Wizard screen appears.
- Click Next. The Database Server Details screen appears.
- Click Next.
- Select the Enable Dashboard option and enter the desired name for the dashboard database.
- Click Next, and then click Install.
Now you can access dashboards from the Dashboard section of Qualitia Automation Studio clients, or by entering the dashboard URL in any browser on any machine in the network.
The Reports and Dashboards Portal URL consists of the Hostname/IP:Port of the machine where Qualitia Server is installed, followed by the login page path, that is, qualitia/sign-in. By default, the system uses port 8887, which you can change while installing Qualitia Server.
The sample URL may look like:
- http://192.168.25.xx:8887/qualitia/sign-in
- http://localhost:8887/qualitia/sign-in
After you log in, the landing screen provides options to navigate to the Web Dashboards, Real-Time Reporting, or Server Administration portals, depending on your user role in Qualitia.
Getting Started with Reports and Dashboards Portal
Prerequisites
Web-based Dashboards allow you to visualize and analyze your test automation results in various ways. The dashboard consolidates test execution data across projects, including executions run using CI tools. This lets you view trends and monitor progress along your test automation journey.
This feature requires:
- Higher RAM and storage:
  - At least 100 GB of storage on the database instance
  - A Qualitia Server machine with a minimum of 8 GB of RAM
- Browsers:
We have tested this Web dashboard on the following browser versions:
| Browser Name | Version |
| --- | --- |
| Google Chrome | 73 to 75 |
| Mozilla Firefox | 66 and 67 |
| Microsoft Edge | 44 |
| Chromium | 77 |
| Safari | 12 |
| Internet Explorer | 10 and 11 |
- Access to Qualitia Server machine and port 8887:
You must have network access to port 8887 on the machine where Qualitia Server is installed.
Accessing Web-Based Dashboards
You can access these dashboards from any machine on the same network as the Qualitia Server. To access them, enter the dashboard URL in any supported browser (Google Chrome, Mozilla Firefox, Internet Explorer, Microsoft Edge) and enter your Qualitia credentials.
Dashboard data is displayed only for the projects assigned to your user.
This Web dashboard URL is available in the Automation Insights section of Dashboard in Qualitia Automation Studio.
The dashboard URL consists of the Hostname/IP:Port of the machine where Qualitia Server is installed, followed by the dashboard path, that is, qualitia-reports/dashboard. By default, the system uses port 8887, which you can change while installing Qualitia Server.
The sample URL may look like:
- http://192.168.25.xx:8887/qualitia-reports/dashboard
- http://localhost:8887/qualitia-reports/dashboard
Last Updated
After you log in, the dashboard displays the date and time when it was last updated in the top-right corner. If there is no data available to display (for example, immediately after a Qualitia Automation Studio installation or upgrade, because the dashboard is refreshed only after midnight), it displays the text 'Not yet updated'.
Dashboard Refresh Interval
The data in the dashboard is refreshed every day at midnight. Details of the test cases executed on a particular day are available in the dashboard the next day.
Important Notes:
You can view the data of the last 30 days for a project in the dashboard. You can filter the test execution dates from the filter at the top of the screen. You can also filter the data based on other criteria such as project, suite, and so forth.
To view the detailed information about any bar/line from the chart, hover your mouse pointer over that bar/line.
If a test case is executed multiple times in a selected time frame, multiple occurrences are counted while loading the metrics and charts.
If there is no data present for the selected filter criteria, you will see a message in the widget that there is no data to display.
Dashboards
- For test suite executions on desktop projects, the following charts are not displayed:
- Test Execution Results Trend
- Execution Trend on Web
- Execution Trend on Mobile
- Test Execution Trend for Both (Mobile & Web Executions)
- Failures by Browser
- Failures by Operating System
- Web Dashboards display the suite execution results as per the time zone of the QAS server.
Automation Test Progress Report
This section of the dashboard provides the cumulative number of test cases executed and their split per execution status. You can view the following information for the project and dates selected in the filter criteria.
- Tests Executed: Displays the total number of test cases that have been executed.
- Total Pass: Displays the total number of passed test cases.
- Total Fail: Displays the total number of failed test cases.
- Total Defects: Displays the total number of test cases that have defects.
- Tests Not Executed: Displays the total number of test cases that are marked for execution but did not get executed due to one reason or another.
You can click any number shown in the box to view the complete list of test cases with respective status.
30-Day Trend
The line in each box indicates the trend of these test cases for the 30-day period prior to the selected end date. Hovering on the markers on the line will provide information for the value at that point.
Test Execution Results Trend
This chart displays the trend of test case execution results over time. In this stacked column chart, each bar represents a single date (shown on the x-axis) and test execution results are stacked on each other. You can filter this chart for a specific Date, Project, or Test Suite(s) by selecting the corresponding options from the filters. The default view of the chart is at a day level. Higher levels such as Week, Month, and so forth can be selected from the Group by filter. The colors of these stacks represent Qualitia test case execution results as follows:
- green refers to the passed test cases,
- red refers to the test cases with defects,
- amber refers to the failed test cases,
- grey refers to the test cases that are Not Executed.
If test cases are marked for the execution and are not executed due to on-error flags or any such reasons, they are reported as Not Executed.
Execution Trend on Web
This chart shows the total count of executions run on the Web, with a split by status for each platform.
You can filter the results based on the execution environment, which includes the browsers and operating systems used while executing these test cases. The X-axis shows the execution environment details, such as the browser or operating system, whereas the Y-axis shows the number of test case executions.
Execution Trend on Mobile
This chart shows the total count of executions run on Mobile devices, with a split by status for each platform.
You can filter the test cases based on the execution environment, which includes the mobile browsers and mobile operating systems used while executing these test cases. The X-axis shows the execution environment details, such as the browser or operating system, whereas the Y-axis shows the number of test cases.
Test Execution Trend for Both (Mobile & Web Executions)
This stack column chart displays execution results for tests that have both Mobile & Web components. The x-axis shows the desktop / mobile browsers used while execution or the desktop / mobile operating systems where test cases were executed. The test case status is represented through stacks.
You can select whether you would like to view the chart for Web/Mobile/Both and whether you would like to see the chart for browsers or operating systems. The selected dimension forms the x-axis for the chart. You can also further filter and see only the specific Browser(s) or Operating System(s) that you would like to see in the chart. The Y-axis displays the number of test cases.
You can filter this chart based on the projects, suites and dates.
Hovering over the column displays the total number of test cases and the number of test cases for each status. You can click the column to view the complete list of test cases and other related information.
Failures by Browser
The vertical stacked columns in this chart display the number of failed tests against the browsers on which the test cases were executed. For the purpose of this chart, test cases with occurrences of both Defects and Failures are included to determine the number of failures. You can click any stack in the columns on the chart to view the detailed list of test cases and other related information. This chart enables you to quickly identify whether there are any browsers on which test cases have a higher incidence of defects or failures.
This chart considers test case executions for the web environment only. Also, if the execution browser is not identified, such results fall under the unidentified section.
Failures by Operating System
The vertical stacked columns in this chart display the number of failed tests against the operating systems on which the test cases were executed. This chart considers all the test cases that failed or have defects. You can click any stack from the columns in the chart to view the detailed list of test cases and other related information. This chart enables you to quickly identify whether there are any operating systems on which test cases have a higher incidence of defects or failures.
This chart considers test case executions for the web environment only. Also, if the operating system is not identified, such reports will fall under the unidentified section.
Test case classification by Failure Rate
This chart categorizes test cases based on their failure rate across the selected period. The purpose is to observe and list out the test cases that have a higher frequency of failures. The y-axis of the chart shows the number of test cases and the x-axis shows the failure frequency that categorizes the test cases.
The failure rate of the test cases is calculated as follows:
Failure rate = (Total number of occurrences of Fail status / Total number of occurrences of Pass, Fail, and Defect statuses) * 100
Note: While calculating this failure rate, test cases with the status not executed (status = -1) are not considered. Also, if any test case is executed multiple times, all the multiple instances are considered while calculating the failure rate.
Hover over any bar in the chart to view the number of test cases with that failure rate.
Upon clicking any bar, you can view the complete list of test cases falling under the selected failure-rate bucket.
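The failure-rate formula above can be sketched in code (a minimal illustration; the exclusion of Not Executed runs follows the note above, but the status string values themselves are assumptions):

```python
# Sketch of the failure-rate metric described above. Status labels are
# illustrative assumptions; "not_executed" runs are excluded, per the note
# above, and only the Fail status counts toward the numerator.

def failure_rate(statuses):
    """Failure rate = Fail occurrences / (Pass + Fail + Defect occurrences) * 100."""
    counted = [s for s in statuses if s in ("pass", "fail", "defect")]
    if not counted:
        return None  # nothing countable (e.g. all runs were Not Executed)
    fails = sum(1 for s in counted if s == "fail")
    return fails / len(counted) * 100

# A test case executed 5 times in the selected period; every occurrence counts.
history = ["pass", "fail", "defect", "pass", "not_executed"]
print(failure_rate(history))  # -> 25.0 (1 Fail out of 4 counted occurrences)
```

Note that each execution occurrence is counted separately, matching the rule that multiple executions of the same test case are all included.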
This page shows the latest status of a test case and the execution trend per iteration across all suites. You can also filter the test case list based on Suite Name. The Iteration trend shows the status of all executions of a test case along with the iteration number.
The statuses of different test case executions are shown through the following colors:
- green refers to the passed test cases,
- red refers to the test cases with defects,
- amber refers to the failed test cases,
- grey refers to the test cases that are Not Executed.
Upon clicking an iteration, the drill-down displays the complete information of the corresponding test case iteration in the grid view.
Test Cases Created/Modified Report
This column chart displays the trend in the number of test cases created and modified in the selected project and date range. You can group the chart at a date, week, month, or year level. By default, data is grouped by date.
For the purpose of this chart, if a test case was created and modified on the same day, it appears only in the No. of test cases created column.
The X-axis shows the dates on which the test cases were created and modified whereas the Y-axis shows the number of test cases.
You can click any column to view the complete list of test cases, along with their metadata, which were created or modified at that time.
Suite Stability Comparison
This column chart helps identify test suites that have a high failure rate across executions. You can filter this chart based on Project, Dates, and Test Suite(s). The x-axis shows the names of the test suites, whereas the Y-axis shows the failure rate in percentage.
The failure percentage is calculated as: [(Number of occurrences of TCs with Fail Status) / (Number of occurrences of TCs with Pass, Fail or Defect status)] * 100
You can hover over a column to view the name of the test suite and the failure percentage in a tooltip, and you can click the column to view detailed information about the suite in the drill view.
To view more charts specifically about the suite executions, click the icon and you will be navigated to another page.
Suite Stability Trend
This line graph plots the failure percentage across suite executions and shows the trend in this metric over time. The trend line enables you to observe whether the number of failures has increased or decreased over time, and thereby identify whether the suite has stabilized. The y-axis plots the failure percentage and the x-axis displays suite execution dates.
Failures are counted using the formula = [(Total number of occurrences of TCs with Fail Status) / (total number of occurrences of TCs with Pass, Fail or Defect status)] * 100
You can filter this chart based on the date range and test suite. Charts are sorted in ascending order of dates. You can hover over a point to view the date and the exact failure percentage.
Clicking the line navigates you to the test case details of the execution for that suite.
Suite Execution Switches Between Two Iterations
This horizontal bar chart helps to analyze the behavior of test cases that were part of a test suite across two successive iterations. For comparison, you can select two iterations of the Test Suite using the drop-down options. By default, the latest 2 suite iterations are selected for the comparison. This chart classifies the behavior of the test cases into the following four categories:
- Pass to Fail: This category represents test cases that passed in the 1st selected iteration and Failed (or had a Defect) in the 2nd selected iteration.
- Fail to Fail: This category represents test cases that failed in both the iterations.
- Fail to Pass: This category represents test cases that failed in the 1st iteration and passed in the 2nd iteration.
- Pass to Pass: This category represents test cases that passed consistently in both iterations.
By analyzing this chart, you can get a list of test cases that need to be looked at on priority.
In the context of this chart, a test case that has either failed or had a defect will be considered as Failed.
Hovering over a bar shows the category and the number of test cases in that category.
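The four switch categories above can be sketched as follows (an illustration only; the status values and test-case names are hypothetical, and, per the note above, a Defect is treated as a failure):

```python
# Sketch of the Pass/Fail switch classification across two suite iterations.
# A test case with a Defect is treated as Failed, per the note above.

def outcome(status):
    # "fail" and "defect" both count as Fail for this chart
    return "Pass" if status == "pass" else "Fail"

def classify(iter1, iter2):
    """Group test cases into the four switch categories."""
    categories = {"Pass to Fail": [], "Fail to Fail": [],
                  "Fail to Pass": [], "Pass to Pass": []}
    for tc in iter1.keys() & iter2.keys():  # test cases present in both iterations
        categories[f"{outcome(iter1[tc])} to {outcome(iter2[tc])}"].append(tc)
    return categories

# Hypothetical results for two successive iterations of a suite
it1 = {"TC_Login": "pass", "TC_Search": "fail", "TC_Checkout": "pass"}
it2 = {"TC_Login": "defect", "TC_Search": "pass", "TC_Checkout": "pass"}
result = classify(it1, it2)
# TC_Login moves Pass to Fail (its Defect counts as Fail); TC_Search moves
# Fail to Pass; TC_Checkout stays Pass to Pass.
```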
Suite Execution Time Statistics Across Last 5 Iterations
This is a dual Y-axis combo column chart. The left Y-axis shows the execution time in seconds and the right Y-axis shows the number of test cases that were part of the suite execution. This chart displays data for a test suite across the last 5 iterations. You can select the desired suite from the top-right corner of the page. If a suite has fewer than 5 iterations in the selected date range, the chart displays as many iterations as are available for that suite during that period. The columns show the execution time metric and the line represents the number of test cases executed as a part of the suite execution. The x-axis displays iteration numbers.
Hovering over the column shows the name of the metric and its value, and the iteration number of the Suite in a tool-tip.
Test Cases with Defects and Failures
You can filter this chart based on the defects and failures.
Defects
This chart enables focusing on the test cases in which defects are occurring during the executions. You can either focus on the test cases presenting with high occurrences of defects or test cases with lower occurrences of defects.
You can select the project and date range for which you want to view the list of test cases. You can choose between top 10 / 20 or bottom 10 / 20 test cases by selecting the appropriate option from the top of the chart. For example, if you select the Top 20 option, the chart will be filtered to show the top 20 test cases with highest defects.
Each bar in the chart corresponds to a single test case. The X-axis displays the names of the test cases falling under the selected filter criteria and the Y-axis displays the number of defects found in the test cases.
You can click a bar to view detailed information about that test case.
Failures
This chart enables focusing on the test cases which are failing during the executions. You can either focus on the test cases presenting with high occurrences of failures or test cases with lower occurrences of failures.
You can select a project and date range from which you want to view the test cases. You can choose between top 10 / 20 or bottom 10 / 20 test cases by selecting the appropriate option from the top of the chart. For example, if you select the Top 20 option, the chart shows top 20 test cases with the highest failure rate.
Each bar in the chart corresponds to a single test case. The X-axis displays the names of the test cases falling under the selected filter criteria, whereas the Y-axis displays the number of occurrences of the test case with Failed status.
You can click the bar to view the detailed information about the test case.
Defects Logged Report
This report shows a grid view of the defects logged by Qualitia with the defect management tool after test case execution. This includes defects logged automatically after test case executions and defects logged from the Reports section of Qualitia Automation Studio.
This report displays details of the actual defects logged with the defect management system, and hence the number of defects shown in this report will not be the same as the number of test case occurrences with Defect status shown in the KPIs and other charts of the dashboard. The report is displayed based on the Date, Projects, and Test Suites selected in the main dashboard page.
Trends in Defects Logged by Automation Tool
This simple line chart plots the total number of defects that have been logged by Qualitia Automation Studio at the end of executions with the client’s defect management tool over a selected period.
The X-axis shows the dates in which test case executions have occurred whereas the Y-axis shows the number of defects that have been logged during test case executions.
You can filter this chart based on the Date, Project, and Suites. The default view of the chart is at a day level. Higher levels such as Week, Month, and so forth can be selected from the provided Group by filter.
Hover over a data point to view the exact number of defects, and click the data point to view the complete list of defects and other relevant details.
Drill View - Grid showing test case level details
The Test Case Execution Drill View provides important details associated with the execution of test cases in QAS. This view displays metadata about these test cases: execution date, start and end time, the host where the test case was executed, the platform it was run against, and so forth. All this information is presented in the form of a grid.
This view displays the attributes in the following order: Date, Project Name, Test Suite Name, Test Suite Type, Test Scenario Name, Test Script Name, User (who executed the test script), Status, Execution Platform, Browser Name, Browser Version, OS Name, OS Version, Host Name, Host Type, Mobile Browser Name, Mobile Browser Version, Mobile OS Name, Mobile OS Version, Mobile Host Name, Test Case Name, Test Case Author, Test Case Priority, Execution Start Time, Execution End Time, Run Duration.
These attributes are classified into meaningful groups as mentioned below:
- All environment-related columns are grouped together with headers in the green color.
- All other attribute headers are displayed in the blue color.
You can invoke drills from other views of the dashboard to view contextual information. At such times, only the filtered data is displayed matching the drilled down conditions.
For example, if you click the number from the ‘Tests Executed’ box of the automation test progress report, the Drill view displays only the list of test cases that are executed for the selected project and date range.
Datamart Refresh Status
The data shown in the dashboards is fetched from a datamart, which consolidates test execution data from the associated Qualitia Project and Results databases. This datamart database is currently updated once every day using an Extract, Transform, and Load (ETL) process.
This chart displays the history of datamart loading events. It helps to identify if any failures have occurred when loading the datamart, and confirm that the data on the dashboard is correct and up to date.
The chart displays this information in a simple grid with the start and end time of the datamart process and its status. In case of failures, this chart shows the failure point and the possible reason for the failure, which helps identify the area that needs to be fixed for the datamart to load successfully.
This chart shows 5 records per page, and you can use the pagination options to view further records.
Platform Utilization Heatmap
This chart illustrates the overall utilization of the automation platform during the current year in a calendar view with a heatmap. You can easily view the number of test cases that have been run across the year on a daily basis. This number combines test cases executed using Automation Studio and test cases executed using CI tools (through offline packages). Only offline executions run in connected mode with the QAS Server are included in this chart, as the results for these are synced with QAS. Completely offline executions are included only if their results have been manually imported into QAS using the offline report utility.
This chart illustrates the usage of the automation platform with different colors. The heatmap uses a proportional color scheme where red indicates the fewest test case executions and green indicates the most. If no test cases were executed on a given day, that field is left blank (white) for easy identification.
You can filter this chart to view either the total number of test cases executed or the total number of hours for which the automation platform was utilized for test case execution. When the view by test efforts is selected, you can view the details by hovering your mouse pointer over the desired field.
You can also view the complete list of test cases in the grid view by clicking the field.
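The proportional red-to-green scale described above can be pictured with a simple linear blend (a sketch only; the endpoint colors and the interpolation are assumptions, not the portal's actual palette):

```python
# Sketch of a proportional heatmap scale: red for the fewest executions,
# blending linearly toward green for the most, and white for days with none.
# The endpoint RGB values are assumptions, not the portal's actual colors.

RED, GREEN = (220, 53, 69), (40, 167, 69)

def cell_color(count, max_count):
    if count == 0:
        return (255, 255, 255)  # blank (white) cell for zero executions
    t = count / max_count       # position on the scale, 0..1
    return tuple(round(r + (g - r) * t) for r, g in zip(RED, GREEN))

print(cell_color(0, 100))    # -> (255, 255, 255)
print(cell_color(100, 100))  # -> (40, 167, 69), full green at the maximum
```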
Real Time Reporting Portal
Introduction
The Real Time Reporting Portal allows you to monitor the test suite executions that are in progress or completed. It updates the results of a test suite execution in real-time after each step in the test case iteration is executed.
Also, QAS updates the local copy of the report after each task iteration is completed. If a test case does not have tasks, then it updates after each test case iteration execution is completed.
The Real Time Reporting Portal opens automatically when executing a test suite in the following ways:
- Through a Qualitia Client: It directly displays the Reports and Dashboards Portal.
- Through an Offline Suite: It displays the login screen if the offline suite used is present on your machine and configured with the Qualitia Server. Ensure that you have access to the associated project to track the test suite execution progress.
Real Time Reporting shows the reports of test cases executed through a TFS/Azure DevOps pipeline only after the pipeline execution is completed.
Real Time Reporting Portal does not support test cases executed through HP ALM.
Accessing Real Time Reporting Portal
To access Real Time Reporting Portal, you need to contact Qualitia Admin who installed Qualitia Server to get the Reports and Dashboards URL.
Alternatively, you can access the real time reports from the Reports tab of Qualitia Client.
Pre-Requisites
Before you start test suite execution either through Qualitia Client or Offline Package, you must ensure the following:
- Enable the Real Time Reporting checkbox in the mapped execution profile or Systems Execution Profile Settings.
- The Real Time Reporting Server has enough drive space to display test suite results. For more information, refer to Server Space Notification.
- For an offline suite execution, your computer is connected to Qualitia Server. Also, you need to specify the Qualitia Server URL in the Settings tab of Qualitia Offline Configuration Manager.
Additional Configurations for Log Files
You can customize the logs generated with suite execution results in the mapped execution profile or Systems Execution Profile Settings as shown in the following table:
| Settings | Description |
| --- | --- |
| Enable Real-Time Reporting | Displays the test execution results in the Real Time Reporting Portal. Also, if the Web Dashboards feature is enabled, you can view these records in the Dashboard portal. |
| Maximum Size Per Log File | Lets you specify the maximum size for every log file of a test suite execution. The default value is 1 MB. |
| Application Step Log Size | Lets you specify the maximum size for the logs of a step. The default value is 1 MB and the maximum value you can define is 2 MB. |
| Application Log Level | Generates Info Logs, Error Logs, or All Logs for the test suite executions. Select None if you do not want any logs. |
Suite of Suites Screen
The Suite of Suites screen shows the list of test suite executions associated with the projects to which you have access. You can choose to view test suite executions that are in progress, completed, aborted, or all of them. It also shows the completion percentage of in-progress test suite executions.
For every test suite, it displays the following details:
Test Suite Environment Details
Shows the test suite name, machine name, IP address, and project name.
Qualitia denotes a test suite executed through a Qualitia Client, and Offline denotes an offline suite execution, either standalone or using a Continuous Integration tool.
Hover over the Environment icon to view the operating system and browsers used for test case execution.
- The total execution duration
Test Suite Execution Details
The total number of test cases, with details of the number of test cases that passed, failed, have defects, and were not executed. It also shows a progress chart with the completion percentage for executions in progress.
- Realtime Reporting Portal displays the execution results as per the local time of your computer.
- If the execution progress for a suite execution shows a red dot, the suite execution results have not been updated for the last 5 minutes. Check whether the test suite execution is still in progress or the connection to the Qualitia Server has been lost.
Real Time Reporting portal marks the suite execution results as Aborted when they are not updated for more than 1 hour. For more information, refer to Aborted Test Suite Executions.
Server Space Notifications
As the number of suite executions increases, the available space on the drive where Real Time Reporting assets are stored decreases. The Real Time Reporting portal shows a notification when the available drive space is:
- 10% to 30%: A warning message in amber appears at the top of the Portal page, indicating the amount and percentage of drive space available. You must monitor the drive space and ensure that it does not fall below 10%.
- Less than 10%: A warning message in red appears at the top, indicating that Real Time Reporting cannot display information on new suite executions. Also, you cannot upload or save a report.
In both cases, you need to contact the Qualitia Admin to purge reports. For detailed instructions on how to purge reports, refer to Purging Reports.
Filtering a Test Suite List
You can filter the list of test suites based on associated projects, test suite name, users, and date range. You can also search for a project, test suite, and user.
To filter the test suite list:
Select a date range or click the Filter button in the top-right corner.
Select the appropriate options from the list of filters.
Uploading a Test Suite Execution Summary Report
Whenever a test suite is executed either through Qualitia Client or Offline Package, the test suite execution summary report is locally saved on the computer used for execution.
You can upload a locally saved test suite execution report to the Real Time Reporting Portal when:
- During the execution, the results were not pushed to the Real Time Reporting portal due to a network failure or a restart of the Qualitia Server services
- An offline suite execution was not connected to the Qualitia Server
- A test suite execution was not configured to display its results on the Real Time Reporting portal
- A test suite was executed through parallel execution using the PowerShell Utility
To upload multiple suite execution summary reports, refer to Uploading Multiple Suite Execution Reports to Real Time Reporting Portal.
Note: If you are uploading a locally saved test suite execution report that was executed using a Qualitia version earlier than v8.0.0, you need to upload the report from the Old Reports screen of a Qualitia Client.
To upload a test suite execution summary report:
- Create a ZIP file of the report folder that you want to upload.
- Open the Suite of Suites screen.
- Click the Upload button placed at the top-right corner.
- Drag the ZIP file of the report, or browse to select the ZIP file of the locally saved test suite execution report.
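The first step above, creating a ZIP file of the report folder, can also be scripted. A minimal sketch using Python's standard library; the report folder name (`ITR_example`) and its contents here are placeholders created only so the example is self-contained:

```python
import shutil
import tempfile
from pathlib import Path

# For illustration, build a throwaway report folder; in practice this is
# the locally saved suite execution report folder (for example, ITR_xxx).
work = Path(tempfile.mkdtemp())
report_dir = work / "ITR_example"
report_dir.mkdir()
(report_dir / "summary.html").write_text("<html>report</html>")

# Create ITR_example.zip alongside the folder; the resulting ZIP can then
# be dragged or browsed into the Upload dialog on the Suite of Suites screen.
zip_path = shutil.make_archive(str(report_dir), "zip", root_dir=str(report_dir))
```

`shutil.make_archive` returns the full path of the created archive, which is convenient when you want to pass the ZIP on to another tool.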
After uploading the test report, its execution details are available in the Web Dashboards from the next day.
Test Suite Execution Summary Report
You can access a Test Suite Execution Summary Report by clicking a test suite execution on the Suite of Suites screen.
The Test Suite Execution Summary Report contains the execution status for each test case and its iterations. You can view the execution duration for each test case along with start and end time.
You can view the total number of test cases, with details of the number of test cases that passed, failed, have defects, and are not executed.
The following details of the test suite are displayed.
Report Element | Description |
---|---|
Status Icon | Indicates the number of test case iterations that have failed and have defects along with the total number of test case iterations. Also, a refresh icon denotes the relevant test case iteration execution that is in progress. If the current project is integrated with a defect management system, then for defects logged automatically or manually, it displays a bug icon. Upon hovering over the bug icon, all the associated defect IDs along with their links are displayed. Also, it shows whether the defect is logged automatically. |
Test Case Element | Indicates the test case name or a test case iteration number. |
Module | Indicates the associated module name. |
Epic | Indicates the associated epic name. |
Story | Indicates the associated story name. |
Scenario | Indicates the associated Test Scenario name. |
Manual TC Id | Indicates the associated Manual TC Id. |
The execution time for the suites executed on desktop applications does not count the time spent on launching UFT.
The Module, Epic, and Story columns appear only if the associated project of the test suite is integrated with a Qualitia Design Studio project.
If the test suite execution is in progress, a blue sticky status bar appears at the bottom of the screen, which shows the following details of the test case:
- The test case name
- The iteration number in progress and the total number of iterations
- A timer that shows the time elapsed since the execution started
Filtering the Test Suite Execution Summary Report
You can filter the list of test cases based on the associated status, test case, modules, epics, stories, scenario, and Manual TC Ids.
To filter the Test Suite Execution Summary Report:
- Click the Filter button.
- Select the appropriate filters.
Alternatively, you can filter the report by clicking a status in the top-right corner.
Exporting the Test Suite Execution Summary Report
You can export the Test Suite Execution Summary Report as an Excel file.
To export the Test Suite Execution Summary Report:
- Click the three-dot menu, and select Export to Excel.
Sharing the Test Suite Execution Summary Report
You can share the link of this report with the applied filters to another Qualitia user who has access to the associated project.
To share the Test Suite Execution Summary Report:
- Click the three-dot menu, and select Share Link.
- Click Copy to copy the link.
Wrapping the Test Suite Execution Summary Report
A cell in the report does not display more than 100 characters. To view all the data of a cell that contains more than 100 characters, you need to wrap the report text.
To wrap the report, click the Wrap Text button.
Customizing the Test Suite Execution Summary Report
To customize the Test Suite Execution Summary Report:
- Click the three-dot menu, and select Table Config to show or hide the columns.
Saving the Test Suite Execution Summary Report
You can save the report as an offline HTML file on your computer.
To save the Test Suite Execution Summary Report:
- Click the three-dot menu, and select Save As Report.
The report is saved in the download folder specified in the browser's settings.
Logging a Defect Manually in the Integrated Defect Management System
You can log defects manually for a test case iteration whose execution status is Pass, Fail, or Defect. You need to configure the project with a defect management system: Jira or TFS/Azure DevOps. For more information, refer to Integrating a Qualitia Project with Defects Management.
Test Case Iteration Detailed Report
You can access a Test Case Iteration Detailed Report by expanding a test case node under the Hierarchy column and clicking the appropriate test case iteration name.
The Test Case Iteration Detailed Report shows in-depth step-level details for a test case iteration.
The topmost bar shows the following details of the test case iteration:
- The execution duration along with the start and end time
- The number of steps that have failed or have defects along with the total number of steps
- The number of tasks that have failed or have defects along with the total number of tasks
You can expand this section to view the following details of the test suite execution:
- The execution duration along with the start and end time.
- The total number of test cases, with details of the number of test cases that passed, failed, have defects, and are not executed. Also, a pie-chart shows the test case count per status for the test suite.
The following details of the test case iteration are displayed.
Report Element | Description |
---|---|
No | Indicates the step number. |
Status Icon | Indicates the execution status of the step. If the current project is integrated with a defect management system, then for defects logged automatically or manually, it displays a bug icon. You can click the defect ID links to view the defects in the associated defect management system. It also indicates which defects were auto-created. |
Test Case Element | Indicates the test case element. |
Data | Indicates the data associated with the action. |
Object | Indicates the object name. |
Execution Time | Indicates the execution time as per the associated hierarchy type. |
Also, if the test suite execution is in progress, a blue sticky status bar appears at the bottom of the screen, which shows the details of the test case execution.
Filtering the Test Case Iteration Detailed Report
You can filter the report based on execution statuses.
To filter the Test Case Iteration Detailed Report:
- Click the Filter button.
- Select the appropriate options from the list of filters.
Wrapping the Test Case Iteration Detailed Report
A cell in the report does not display more than 100 characters. To view all the data of a cell that contains more than 100 characters, you need to wrap the report text.
To wrap the report, click the Wrap Text button.
Navigating through the Steps and Tasks
To navigate to a specific line, enter the line number in the Go to Line text box and press Enter.
You can navigate through the failed steps and tasks in the report using the navigation controls shown in the following figure:
Note: Click the Failure Summary button to know failure categories for the test case iteration. For more information, refer to Viewing Failure Summary Report.
Viewing Screenshot and Logs for a Step
You can view the screenshots and logs of the application under test for each step execution.
To view screenshot and logs for a step:
- Select the appropriate step, and click the Screenshot and Log icon. The screenshot and the log appear.
You can enlarge a screenshot and navigate across the screenshots of the next or previous steps.
- The Info Log section shows the following details:
  - Step Itinerary: Indicates the task, test case, scenario, and suite name.
  - Message: Describes how the step was executed. If the step has failed, it shows the reason for the failure.
If the step contains the results for an API response or generic actions, it contains a link that opens the results. You can view the results in the following formats: json, xls, or js (used for opening XML files).
  - Additional Messages: Indicates the name of the action, its execution start and end time, and the associated message received from the engine. If additional messages exist, click the + symbol to view them.
- Execution Status: Indicates the execution status of the step as an integer.
- 0 denotes the step has passed.
- 1 denotes the step has failed.
- 2 denotes the step has a defect.
- The Error Log section shows any exception that occurred during the execution.
- The Configuration section shows the configured settings used for the execution.
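The integer execution status codes listed above map directly to readable labels. A tiny lookup sketch (the constant and function names are assumptions for illustration, not Qualitia APIs):

```python
# Execution Status codes shown in the Info Log section, per the list above.
STATUS_CODES = {
    0: "passed",
    1: "failed",
    2: "defect",
}

def describe_status(code: int) -> str:
    """Translate an Execution Status integer into a readable label."""
    return STATUS_CODES.get(code, "unknown")
```

Such a mapping is handy when post-processing exported report data, for example when tallying step outcomes from a saved report.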
Aborted Test Suite Executions
You can access the Aborted test suite executions from the Aborted tab.
A test suite execution is aborted:
- If the user terminates the test suite execution forcefully
- If the Qualitia Client or offline execution crashes
- If the Qualitia Server services restart
- If the Qualitia Server loses its connection with the Qualitia Client or Offline Package due to network failure
However, in the last two scenarios, the test suite execution continues, but its results are not pushed to the Qualitia Server. In these scenarios, after the test suite execution completes, you can import the locally saved results from the Suite of Suites screen.
Viewing an Aborted Test Suite Execution Summary Report
If you drill down into an aborted test suite execution, both the Suite Execution Summary Report and Test Case Iteration Detailed Report screens display the results as per the last update that was received.
Also, both screens show the Aborted icon in the top-right corner. Upon hovering over the icon, it shows why the test suite execution was aborted when the reason is known (such as the user forcefully terminating the execution); otherwise, it shows a generic message.
Viewing a Failure Summary
Qualitia categorizes the failures that occur during executions to help you troubleshoot issues in test suite executions. Qualitia sorts the failure categories by the number of corresponding failures.
You can select an execution failure category and drill down to the failed steps.
Failure categorization is not available for desktop projects.
Failure Category | Description |
---|---|
Action Failure | Indicates that the defined action for the step did not execute correctly. |
Action Is Not Supported | Indicates that the specified action in the step is not defined in Qualitia or the specified action cannot be used in the selected browser or environment. |
Cannot Select From Dropdown | Indicates that the step did not execute as the specified drop-down list does not exist or it does not contain the option that is defined as a parameter. |
Class File Is Missing In The Project | Indicates that the custom action did not execute because its source file is missing. |
Failed To Store Data | Indicates that the action related to storing data failed. |
File Does Not Exist | Indicates that either the specified file is not found in the specified location or the file extension is incorrect, or the step failed to create a file in the specified location. |
Incorrect Data Format | Indicates that the parameters contain data in an invalid format, such as invalid date format, invalid file format, or invalid data encryption format. |
Data Is Not Supported | Indicates that the parameter contains data that the object or the action does not support. |
Invalid Expression | Indicates that the conditional expression syntax, or the expression provided as an input parameter for an action, is invalid. |
Invalid Input Parameters | Indicates that the test data provided for the action is invalid. |
Object Does Not Exist | Indicates that the object is not found or disabled on the selected browser or environment. |
Testdata Provided Empty Or NULL | Indicates that the test data is not provided for the action. |
Timeout Encountered | Indicates that the associated object is not found within the specified Wait Time. |
Uniqueness Violation | Indicates that a unique value was not provided where one is required. For example, if a value is already used in Qualitia and you try to use the same value in a custom action, it is reported as a uniqueness violation. |
TestCase_Cannot_Execute | Indicates that the whole test case was aborted due to reasons such as an incorrect application URL, an incorrect environment selection, or an invalid JavaScript expression. |
Expected Result Not Met | Indicates that the step result is a defect because the actual and expected results differ. |
Lack Of Permissions | Indicates that you do not have access to the environment or to the feature of the application under test. |
To add custom failure categories to custom actions as per your requirements, refer to Adding Custom Categories in Custom Actions for Web and Mobile Projects.
To view a failure summary for a suite execution:
- From the Suite of Suites screen, click a test suite execution for which you want to view the failure summary.
- In the Test Suite Execution Summary Report, click the Failure Summary button.
A list of associated failure categories is displayed along with the number of failed steps.
- Click a failure category associated with the test suite.
The failure category expands to show the list of associated test case iterations where the failure has occurred.
- Click a test case iteration.
A list of applicable failure categories in that test case iteration is displayed.
- Click a failure category associated with the selected test case iteration.
The failure category expands to show the list of associated failed steps of the selected test case iteration. If a step is part of a task, the task name and task iteration are also shown.
- Click a failed step.
The associated Test Case Iteration Detailed Report opens and highlights the selected step. You can view the screenshot and info log for the failed step.
Purging Data from the Server Administration Portal
The Server Administration portal allows you to remove the data associated with the suite execution reports from the Qualitia Server.
You can purge whole suite execution reports, screenshots, and logs.
Only administrators can purge the data.
Once you purge reports data, you cannot recover them.
Purge Data Tab
The Purge Data tab provides the following information:
- Number of test execution reports
- Total storage size of the suite execution report data
- Total storage size of the image data
- Total storage size of the log data
The Purge Data tab, by default, displays the execution reports that were generated in the last 7 days.
The Purge Data tab displays the following information for each test suite execution.
If a red alert icon is displayed beside the checkbox of a test suite execution, it indicates that the suite execution was aborted.
Column | Description |
---|---|
Generated On | Indicates the date on which the report was generated. |
Suite Name | Indicates the name of the test suite. |
Project Name | Indicates the name of the associated project. |
Execution Type | Indicates the execution type. |
Iteration No | Indicates the iteration number of the test suite execution. |
Total Size | Indicates the size of the reports data, including screenshots and logs. |
Report Size | Indicates the size of the reports, excluding screenshots and logs. |
Images Size | Indicates the size of the associated screenshots. |
Logs Size | Indicates the size of the associated log files. |
Test Cases | Indicates the number of test cases in the test suite. |
Test Case Status | Indicates the number of test cases that passed, failed, have defects, and are not executed. |
Purging the Selected Reports
Select the checkboxes of the reports that you want to remove, or apply filters and then select the reports from the filtered list.
For instructions on how to filter the reports list, refer to Filtering the Reports List.
To purge reports data:
- Click Purge Data.
- Select any or all of the following checkboxes:
  - All Report Files: Removes all the reports data of the selected reports, including their entire folder structure. However, the execution summary of each suite execution report remains stored in the database.
  - Images Only: Removes only the screenshots captured for the selected reports.
  - Logs Only: Removes only the log files for the selected reports.
- Click Purge.
An alert message appears that shows the number of reports and the size of files that are ready for removal.
- Click Continue.
Filtering the Suite Execution Reports List
You can filter the list of suite execution reports based on project, build or release number of projects, test suite, execution type, user, and also include aborted suite executions.
To filter the suite execution report list:
- Select a date range or click the Filter button.
- Select the appropriate options from the list of filters.
Logs Tab
The Logs tab provides the details of the previous purges.
Column | Description |
---|---|
Date | Indicates the date and time when the reports were purged. |
Activity | Indicates the purging activity. |
User | Indicates the user who purged the reports. |
Purge Details | Indicates the type of reports data that was removed. |
Data Size | Indicates the amount of data purged. |
Details | Click to view the details of test suite execution reports that were purged. |
Uploading Multiple Suite Execution Reports to Real Time Reporting Portal
You can upload multiple suite execution reports to the Real Time Reporting portal in one go using the Qualitia offline jar file. You can upload them in either of the following ways:
- From any computer: You can directly upload the reports from your computer if its connection with the Qualitia Server is stable. The computer must also contain all the test suite execution reports that you want to upload.
- From the Qualitia Server: You can use the Qualitia Server to expedite the upload of reports, as there is no chance of network connection failure.
Pre-Requisites
Before you upload the suite execution reports, ensure the following:
- None of the reports you want to upload are:
  - Already uploaded to the Real Time Reporting portal, except for the ones that were aborted due to a loss of network connectivity.
  - Incomplete because the suite execution is still in progress.
  - Generated using a Qualitia version earlier than 8.0.
- Your system has an offline package. If your system has a Qualitia Client, you can create an offline package; for detailed instructions, refer to Creating an Offline Package. Alternatively, you can transfer an offline package to your system from any other computer that has one.
- The Qualitia Server has enough space available. If the Real Time Reporting portal shows an error message that the available drive space is less than 10%, contact the Qualitia Admin to purge old reports to free up space.
To upload multiple suite execution reports:
- Create a folder on the system from which you want to upload the reports.
- Copy each suite execution report folder (ITR_xxx) that you want to upload into this folder.
- Open the Command Prompt, and navigate to the folder where the Qualitia offline jar is present.
- Enter the following command:
java -jar qualitiaoffline-8.3.0.jar UploadReport=true QualitiaServerURL="<Qualitia_Server_URL>" ReportPath="<Report_Path>" allowScreenshots="Yes"
The parameters are as follows:
- QualitiaServerURL is the URL of the Qualitia Server.
- ReportPath is the location of the folder that contains the suite execution reports.
- allowScreenshots specifies whether to upload the screenshots of the reports. Enter "Yes" to upload screenshots or "No" if you do not want to upload the associated screenshots.
Note: If you are uploading reports from the Qualitia Server, you do not need to specify allowScreenshots="Yes".
If you do not enter the allowScreenshots parameter, the screenshots of the reports are uploaded by default.
Press Enter to start uploading the reports.
For every report, the following details are displayed:
- The upload progress in percentage.
- A confirmation message for reports that were uploaded successfully; these reports are removed from the reports folder.
- A failure message for reports that failed to upload.
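If you run the upload command regularly, it can help to assemble it in a small script. A sketch using Python's standard `subprocess` module; the server URL and report path below are placeholders, while the jar name and parameters are taken from the command shown above:

```python
import subprocess  # used by the commented-out run() example below

def build_upload_command(server_url: str, report_path: str,
                         with_screenshots: bool = True) -> list:
    """Assemble the qualitiaoffline upload command described above."""
    return [
        "java", "-jar", "qualitiaoffline-8.3.0.jar",
        "UploadReport=true",
        f"QualitiaServerURL={server_url}",
        f"ReportPath={report_path}",
        f"allowScreenshots={'Yes' if with_screenshots else 'No'}",
    ]

# Example invocation (placeholder host and path; not executed here):
# subprocess.run(build_upload_command("http://qualitia-host:8887",
#                                     r"C:\UploadReports"), check=True)
cmd = build_upload_command("http://qualitia-host:8887", r"C:\UploadReports")
```

Passing the arguments as a list (rather than one shell string) lets `subprocess` handle quoting of paths that contain spaces, and `check=True` raises an error if the jar exits with a non-zero status.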