Dashboard

Last updated 1 year ago

Overview

When reviewing AutomationReport.html, the dashboard offers a snapshot of the test run's successes and the areas that need attention. It is the first view you see when opening the report and includes several key metrics that provide insight into the test execution.

Key Sections of the Dashboard

Test Execution Timeframe

  • Start and End Timestamps: These indicate when the test suite started and completed, giving a clear idea of the duration of the test run.
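The run duration is simply the difference between the two timestamps. As a minimal sketch (the timestamp values and their format below are hypothetical, not taken from an actual report):

```python
from datetime import datetime

# Hypothetical start/end timestamps as a report might display them.
fmt = "%Y-%m-%d %H:%M:%S"
start = datetime.strptime("2024-03-01 10:15:00", fmt)
end = datetime.strptime("2024-03-01 10:47:30", fmt)

# The suite's total duration is the difference between the two.
duration = end - start
print(duration)  # 0:32:30
```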

Test Outcome Summary

  • Tests Passed/Failed/Skipped: Numerical and graphical indicators show the distribution of test outcomes, helping to quickly gauge the success of the test run.

Log Events

  • Event Summary: This pie chart visualizes the breakdown of different events during the test run, including passed events, failures, and informational logs.

System and Environment

  • Operating System: Displays the operating system on which the tests were run, such as "Mac OS X".

Detailed Test Analysis

Tags

  • Definition: In the context of this report, "Tags" represent directories within the test_case_flows folder. Each tag correlates with a specific directory name, indicating a group of related test cases.

  • Status Metrics: Under each tag, the dashboard displays the number of tests that passed, failed, or were skipped, giving a granular view of outcomes for the specific component or feature each directory covers.

  • Pass Rate: The percentage of tests passed within a tag gives a quick measure of the reliability of the tests in that specific directory.
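The grouping described above can be sketched as follows. This is an illustration only, not the report's actual implementation: the file paths and outcome labels are hypothetical, and the only assumption taken from the source is that each tag is the directory name directly under test_case_flows.

```python
from collections import defaultdict
from pathlib import PurePosixPath

# Hypothetical test results: (path under test_case_flows, outcome).
results = [
    ("test_case_flows/login/valid_login.yaml", "passed"),
    ("test_case_flows/login/invalid_login.yaml", "failed"),
    ("test_case_flows/checkout/guest_checkout.yaml", "passed"),
    ("test_case_flows/checkout/card_payment.yaml", "skipped"),
]

def summarize_by_tag(results):
    """Group outcomes by tag (the directory under test_case_flows)
    and compute a per-tag pass rate, as the dashboard displays."""
    counts = defaultdict(lambda: {"passed": 0, "failed": 0, "skipped": 0})
    for path, outcome in results:
        tag = PurePosixPath(path).parts[1]  # directory name = tag
        counts[tag][outcome] += 1
    summary = {}
    for tag, c in counts.items():
        total = sum(c.values())
        summary[tag] = {**c, "pass_rate": round(100 * c["passed"] / total, 1)}
    return summary

print(summarize_by_tag(results))
```

With the sample data above, both the "login" and "checkout" tags report a 50.0% pass rate, but for different reasons (a failure vs. a skip), which is exactly the distinction the per-tag status metrics surface.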

Interpreting the Dashboard

When interpreting the dashboard, consider the following:

  • Consistency Across Runs: Tags with consistently high pass rates across different test runs indicate stable components.

  • Problem Areas: Tags with failed tests point to areas in the application that may need further investigation and possible remediation.

  • Skipped Tests: Understanding why tests were skipped (e.g., due to dependencies on failed tests or intentional exclusions) can help refine the testing strategy.

  • Environmental Factors: Knowing the operating system and environment details can explain certain test behaviors and outcomes, especially if they are environment-specific.

Conclusion

The dashboard of your AutomationReport.html is a powerful tool for a quick and comprehensive overview of the test execution. By grouping test cases into tags that mirror your project's directory structure, you can streamline the troubleshooting process, identify stable areas, and focus on components that require improvement.
