Analytics features
This document explains Analytics features and how you can best use them to assess your project.
Overview
TestOps Analytics module is the central hub for tracking, analyzing, and communicating testing performance and software quality across projects. It unifies dashboards, reports, and insights that turn raw execution data into actionable intelligence for QA managers, testers, and stakeholders.
Instead of manually compiling metrics from multiple tools, TestOps consolidates data from connected ALM and CI/CD systems into standardized models. This enables consistent visibility across time-based and iteration-based perspectives, helping teams monitor testing health, release readiness, and long-term quality trends.
Core Features
Dashboards and Reports
TestOps provides dashboards for summaries and reports for deeper analysis.
- Dashboards (e.g., Analytics & Trends, Release Readiness, Live Monitor) give a quick overview of test health and execution progress.
- Reports (e.g., Test Runs Analysis, Defect Trends, Requirement Coverage) let you drill into details to identify issues or improvement opportunities.
Together, dashboards and reports offer both high-level awareness and detailed investigation capability.
Scoping
All analytics in TestOps are project-based. Each project aggregates entities like test cases, executions, requirements, and defects drawn from integrated tools such as Jira and Azure DevOps.
Data from these entities can be presented in one of the following perspectives, which you can choose in each dashboard or report:
- Time-Based: to observe historical trends or productivity changes.
- Iteration-Based (Release or Sprint): to evaluate iteration progress and quality.
- Current: to assess all available data in the project.
Not every scope is available in every report or dashboard; availability depends on each report's analysis goal.
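Conceptually, the three scopes are different slices of the same execution data. The sketch below illustrates this with a hypothetical, simplified data model; the record fields and function names are illustrative only and are not the TestOps API.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical execution record; field names are illustrative only.
@dataclass
class TestRun:
    name: str
    executed_on: date
    sprint: str
    passed: bool

runs = [
    TestRun("login", date(2024, 1, 10), "Sprint 1", True),
    TestRun("checkout", date(2024, 1, 25), "Sprint 1", False),
    TestRun("login", date(2024, 2, 5), "Sprint 2", True),
]

def time_based(runs, start, end):
    """Time-Based: runs executed inside a historical window."""
    return [r for r in runs if start <= r.executed_on <= end]

def iteration_based(runs, sprint):
    """Iteration-Based: runs belonging to one release or sprint."""
    return [r for r in runs if r.sprint == sprint]

def current(runs):
    """Current: all available data in the project."""
    return list(runs)

print(len(time_based(runs, date(2024, 1, 1), date(2024, 1, 31))))  # 2
print(len(iteration_based(runs, "Sprint 2")))                      # 1
print(len(current(runs)))                                          # 3
```

The same records answer different questions depending on how they are scoped, which is why switching perspective never changes the underlying project data.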
Filtering
Reports and dashboards include filters that let you narrow the data and compare filtered datasets (for example, automated vs. manual tests, or performance across test authors) to discover trends and anomalies. Each report or dashboard has its own set of filters, which you can select from the filter dropdown.
There are two types of filters:
- Default filters: Each report/dashboard has default filters by entities (for example, Test Case Status, Defect Priority, or Run Type), depending on the report's purpose.
- Customizable field filters: You can configure customizable fields to add more attributes to testing entities, and later filter by those attributes inside reports. They let you slice data further to fit your project needs (for example, by regression testing) and uncover unexpected insights.
This gives you precise control over what data is shown, while keeping all widgets and reports synchronized to the same filter context.
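One way to picture this mechanism: each filter is a predicate, and a report's filter context is the conjunction of all active predicates applied to the same dataset. This is a conceptual sketch with made-up records and helper names, not the TestOps implementation.

```python
# Hypothetical test-case records; attribute names are illustrative only.
cases = [
    {"status": "Failed", "run_type": "Automated", "fields": {"stage": "regression"}},
    {"status": "Passed", "run_type": "Manual",    "fields": {"stage": "smoke"}},
    {"status": "Failed", "run_type": "Automated", "fields": {"stage": "smoke"}},
]

def default_filter(attr, value):
    """Built-in entity filter, e.g. Run Type or Test Case Status."""
    return lambda c: c[attr] == value

def custom_field_filter(field, value):
    """Filter on a user-defined customizable field."""
    return lambda c: c["fields"].get(field) == value

def apply_filters(cases, filters):
    """A record is shown only if it passes every active filter."""
    return [c for c in cases if all(f(c) for f in filters)]

# Combine a default filter with a custom-field filter in one context.
ctx = [default_filter("run_type", "Automated"),
       custom_field_filter("stage", "regression")]
print(len(apply_filters(cases, ctx)))  # 1
```

Because every widget reads from the same filter context, narrowing one filter narrows all synchronized views consistently.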
Custom Views
In any report, you can create custom views that store filter configurations for different perspectives, so you can switch between them quickly without reconfiguring the filters each time. This is especially useful when you regularly configure one report for different purposes, such as focusing on failed automated tests, tests from specific suites, or activity within a particular sprint.
With custom views, every report becomes a personalized workspace that adapts to the way you analyze quality.
What you can do:
- Save filter configurations: After applying filters in any report, save them as a new view with a descriptive name.
- Switch between views instantly: Use the view selector at the top of the report to load saved filter combinations with a single click.
- Set a default view: Mark your most-used view as the default so it automatically loads whenever you open that report.
- Manage your views: Edit, clone, or delete saved views as your analysis needs change.
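The save/switch/default behavior above can be modeled as a small named-configuration store. This is a minimal conceptual sketch under assumed names (`ViewStore`, `save`, `load`), not the TestOps API.

```python
# Hypothetical custom-view store for one report: a view is just a
# named, saved filter configuration.
class ViewStore:
    def __init__(self):
        self.views = {}
        self.default = None

    def save(self, name, filters):
        """Save the current filter configuration under a descriptive name."""
        self.views[name] = dict(filters)

    def set_default(self, name):
        """Mark a view to load automatically when the report opens."""
        self.default = name

    def load(self, name=None):
        """Switch to a saved view, or to the default when none is given."""
        return self.views[name or self.default]

store = ViewStore()
store.save("Failed automated", {"status": "Failed", "run_type": "Automated"})
store.save("Sprint 12", {"sprint": "Sprint 12"})
store.set_default("Failed automated")

print(store.load()["status"])             # Failed (default view)
print(store.load("Sprint 12")["sprint"])  # Sprint 12
```

The key property is that switching views replaces the whole filter context at once, so no per-filter reconfiguration is needed.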
Widgets and Visual Insights
Dashboards are composed of widgets: modular visual elements that summarize metrics from underlying reports. You can expand a widget to view more details and see the report it is linked to.
While widgets are static in dashboards, they become interactive in the expanded report view: clicking a segment filters and refreshes the related data view below it. This allows a seamless transition from visual overview to analytical evidence.
For the Analytics & Trends dashboard, you can customize the layout by adding more widgets pulled from reports, tailoring the dashboard to your team's needs.
Export and Sharing
Each dashboard or report can be exported to PDF, CSV, or Excel for record-keeping, or shared directly via a link. This simplifies collaboration and lets teams review the same data during stand-ups, sprint reviews, or release planning.
Click Write with AI to prompt the agent to write a description for you before sharing.
AI features
The Analytics AI features turn raw test data into triage insights that speed up failure investigation and internal communication.
AI Briefing
The AI Briefing feature condenses report data into concise, insight-ready text for stakeholders. It captures key achievements, risks, and trends to support executive briefings and retrospectives with minimal effort.
AI Analysis
The AI Analysis feature analyzes the full test-execution context (logs, traces, screenshots, and more) and generates a failure analysis that includes a failure summary and a suggested remedy.
AI Failure Grouping
The Analyze feature analyzes the execution context (logs, stack traces, and more) to identify root-cause signals, then assigns each failure to common categories, making it easier to analyze automation error patterns.
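To illustrate the idea of grouping failures by root-cause signal, the sketch below maps known exception signatures found in a log to common categories. The category names and example log lines are hypothetical and not drawn from TestOps; the real feature uses AI rather than a fixed lookup table.

```python
# Hypothetical signature-to-category table for illustration only.
CATEGORIES = {
    "TimeoutException": "Timing / wait issue",
    "NoSuchElementException": "Locator issue",
    "AssertionError": "Assertion failure",
}

def categorize(log):
    """Assign a failure log to a common category by its signature."""
    for signal, category in CATEGORIES.items():
        if signal in log:
            return category
    return "Uncategorized"

failures = [
    "org.openqa.selenium.TimeoutException: waiting for #submit",
    "org.openqa.selenium.NoSuchElementException: #old-id",
    "java.lang.AssertionError: expected 200 but was 500",
]
print([categorize(f) for f in failures])
# ['Timing / wait issue', 'Locator issue', 'Assertion failure']
```

Grouping failures this way turns a long list of individual errors into a handful of recurring patterns that can be triaged together.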