Dashboard
When reviewing the AutomationReport.html, the dashboard offers a snapshot of the test run's overall success and the areas that require attention. It is the first thing you'll see when opening the report and includes several key metrics that provide insight into the test execution.
Start and End Timestamps: These indicate when the test suite started and completed, giving a clear idea of the duration of the test run.
Tests Passed/Failed/Skipped: Numerical and graphical indicators show the distribution of test outcomes, helping to quickly gauge the success of the test run.
Event Summary: This pie chart visualizes the breakdown of different events during the test run, including passed events, failures, and informational logs.
Operating System: Displays the operating system on which the tests were run, such as "Mac OS X".
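The top-level metrics above can be thought of as simple aggregations over the raw test results. As a rough sketch (the record fields and timestamps here are hypothetical, not the report's actual schema), the duration and outcome distribution could be computed like this:

```python
from collections import Counter
from datetime import datetime

# Hypothetical raw results; in practice these come from your test runner's output.
results = [
    {"name": "test_login",    "status": "passed"},
    {"name": "test_checkout", "status": "failed"},
    {"name": "test_refund",   "status": "skipped"},
    {"name": "test_search",   "status": "passed"},
]
start = datetime(2024, 1, 1, 9, 0, 0)   # suite start timestamp
end   = datetime(2024, 1, 1, 9, 12, 30)  # suite end timestamp

duration = end - start                              # drives the Start/End metric
counts = Counter(r["status"] for r in results)      # drives Passed/Failed/Skipped

print(f"Duration: {duration}")  # Duration: 0:12:30
print(dict(counts))             # {'passed': 2, 'failed': 1, 'skipped': 1}
```

The same `counts` mapping is what a pie chart like the Event Summary would be drawn from.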
Tags
Definition: In the context of this report, "Tags" represent directories within the test_case_flows folder. Each tag corresponds to a specific directory name and groups the related test cases inside it.
Status Metrics: Under each tag, the dashboard displays the number of tests that have passed, failed, or been skipped. This provides a granular view of the test outcomes related to specific components or features of the application that correspond to the directory structure.
Pass Rate: The percentage of tests passed within a tag gives a quick measure of the reliability of the tests in that specific directory.
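Because each tag is just a directory name under test_case_flows, the per-tag status metrics and pass rate amount to grouping results by that directory. A minimal sketch, assuming hypothetical result records keyed by file path (not the report's actual data format):

```python
from collections import defaultdict
from pathlib import PurePosixPath

# Hypothetical result records; the tag is the first directory under test_case_flows/.
results = [
    {"path": "test_case_flows/login/test_valid.py",  "status": "passed"},
    {"path": "test_case_flows/login/test_locked.py", "status": "failed"},
    {"path": "test_case_flows/search/test_basic.py", "status": "passed"},
    {"path": "test_case_flows/search/test_fuzzy.py", "status": "passed"},
]

by_tag = defaultdict(lambda: {"passed": 0, "failed": 0, "skipped": 0})
for r in results:
    tag = PurePosixPath(r["path"]).parts[1]  # directory name under test_case_flows
    by_tag[tag][r["status"]] += 1

for tag, c in by_tag.items():
    total = sum(c.values())
    rate = 100 * c["passed"] / total
    print(f"{tag}: {c['passed']}/{total} passed ({rate:.0f}%)")
# login: 1/2 passed (50%)
# search: 2/2 passed (100%)
```

This mirrors what the dashboard shows under each tag: raw pass/fail/skip counts plus the pass-rate percentage.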
When interpreting the dashboard, consider the following:
Consistency Across Runs: Look for tags with consistently high pass rates across different test runs for indications of stable components.
Problem Areas: Tags with failed tests point to areas in the application that may need further investigation and possible remediation.
Skipped Tests: Understanding why tests were skipped (e.g., due to dependencies on failed tests or intentional exclusions) can help refine the testing strategy.
Environmental Factors: Knowing the operating system and environment details can explain certain test behaviors and outcomes, especially if they are environment-specific.
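The "Consistency Across Runs" and "Problem Areas" checks above can also be automated. As an illustrative sketch (the per-tag pass rates and the threshold are made-up values, not part of the report), comparing two runs and flagging tags whose pass rate dropped sharply might look like:

```python
# Hypothetical pass rates (%) per tag from two consecutive runs.
previous_run = {"login": 100.0, "search": 95.0, "checkout": 90.0}
latest_run   = {"login": 100.0, "search": 70.0, "checkout": 88.0}

THRESHOLD = 10.0  # flag tags whose pass rate dropped by more than this

flagged = [
    tag for tag in previous_run
    if previous_run[tag] - latest_run.get(tag, 0.0) > THRESHOLD
]
print(flagged)  # ['search']
```

Tags that survive this kind of check across many runs are good candidates for "stable components"; flagged tags are where investigation should start.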
The dashboard of your AutomationReport.html is a powerful tool for a quick and comprehensive overview of the test execution. By grouping test cases into tags that mirror your project's directory structure, you can streamline the troubleshooting process, identify stable areas, and focus on components that require improvement.