
Working with Test Gap Treemaps

As a tester or release manager, you often want to know which changes to your software system were not sufficiently tested. Teamscale's Test Gap analysis (TGA) automatically identifies such changes for you. The results of this analysis, i.e., the Test Gaps, appear as Test Gap treemaps in the Test Gaps perspective and on TGA dashboards.

For each Test Gap, you should decide whether or not the corresponding source code should be tested, e.g., by focusing on those changes with the highest risk. To make such decisions, you typically need to know which functionality a specific Test Gap, i.e., the respective method, belongs to and which changes were made to it. This guide explains how to achieve this using Test Gap treemaps.

You may also watch our webinar on Test Gap analysis (in German) to see parts of this guide in action.

Reading Test Gap Treemaps

The picture below shows an example of a Test Gap treemap. The structure of such a treemap visualizes the source code of your software system. Each rectangle corresponds to one method in the source code. The bigger the rectangle, the longer the method (measured in source lines of code).

Test Gaps on a Treemap

The colors on the treemap indicate which of the following TGA states the respective methods are in:

  • Gray methods remain unchanged since the TGA baseline.
  • Green methods were changed since the baseline and later executed in a test.
  • Yellow methods were changed since the baseline and not executed in any test since.
  • Red methods were newly introduced since the baseline and not executed in any test.

The red and yellow boxes of the treemap are your Test Gaps, i.e., they contain the changes to your software system that have not been executed in any test. When you move the mouse over the treemap, a tooltip shows details on the respective Test Gap. This allows you to do a first high-level analysis.
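The four color rules above can be transcribed almost literally into code. The following is a minimal sketch; the attribute names (`changed`, `added`, `tested_after_last_change`) are illustrative and do not reflect Teamscale's actual data model.

```python
# Maps the method's status relative to the TGA baseline to a treemap color.
# "changed": an existing method was modified since the baseline.
# "added": the method was newly introduced since the baseline.
# "tested_after_last_change": some test executed the method after its latest change.
def treemap_color(changed, added, tested_after_last_change):
    if not (changed or added):
        return "gray"    # unchanged since the baseline
    if tested_after_last_change:
        return "green"   # changed/added, but executed in a test afterwards
    return "red" if added else "yellow"

# The yellow and red methods are the Test Gaps:
def is_test_gap(color):
    return color in ("yellow", "red")
```

For example, a method that was modified after the baseline but never executed since yields `treemap_color(True, False, False) == "yellow"`, i.e., a Test Gap.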

Reduce Risk vs. All Green

Test Gap analysis enables conscious decisions about where to direct your limited testing resources, in order to minimize the risk that comes with changes to your software system. The goal is to prevent changes from slipping through testing without you knowing, not to try to close all Test Gaps. An all-green Test Gap treemap does not give you any guarantees, since it cannot prove the absence of bugs.

Export Test Gap Data

You may download a CSV file listing all methods and their corresponding TGA state by clicking the icon in the upper-left corner of the treemap.
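The exported file can then be processed with any CSV tooling, e.g., to count the Test Gaps per state. The snippet below is a sketch only; the actual column names and separator in the export may differ depending on your Teamscale version, so the sample data here is hypothetical.

```python
import csv
import io
from collections import Counter

# Hypothetical excerpt of an exported treemap CSV (columns are assumptions):
EXPORT = """Path,Method,TGA State
src/geo/Distance.java,setDistance,Yellow
src/geo/Distance.java,getDistance,Gray
src/geo/Route.java,recalculate,Red
src/geo/Route.java,length,Green
"""

# Count methods per TGA state; yellow and red methods are the Test Gaps.
states = Counter(row["TGA State"] for row in csv.DictReader(io.StringIO(EXPORT)))
gaps = states["Yellow"] + states["Red"]
print(f"{gaps} of {sum(states.values())} methods are Test Gaps")
```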

Excluding Test Code

In some cases, Test Gaps for changed test code are not relevant (e.g., if no coverage for test code is available). To ignore changed test code in Test Gap treemaps, use the "Test-code path pattern" option in the Advanced Settings of the respective repository connector.
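Conceptually, such a pattern marks all files under test directories so they are excluded from the analysis. The sketch below only illustrates this idea with Python's `fnmatch` globs; Teamscale's option uses its own pattern syntax, and both the pattern and the paths here are made-up examples.

```python
import fnmatch

# Hypothetical pattern marking test code: any path containing a "test" segment.
TEST_CODE_PATTERN = "*/test/*"

paths = [
    "src/main/java/geo/Distance.java",
    "src/test/java/geo/DistanceTest.java",
]

# Only production code remains subject to Test Gap analysis.
production = [p for p in paths if not fnmatch.fnmatch(p, TEST_CODE_PATTERN)]
```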

Working with Large Treemaps

In case of very large systems, rectangles in the Test Gap treemap may become very small and hard to distinguish. Teamscale provides two features to help you work with such large treemaps.

For one, it often helps to enable the Hide unchanged methods option in the treemap's properties, which hides all the gray rectangles on the treemap, as shown in the following figure. This helps you focus on the changes, especially if they are few compared to the overall size of your system. On the downside, all changes now appear next to each other, which means you cannot immediately see which of them are close together in the codebase and which are far apart.

Test Gap Treemap with Focus on Changes

Alternatively, you may zoom into a part of the treemap to focus on individual components of your software system. When you click on a Test Gap in a treemap, the tooltip changes into a persistent state that allows you to select an element of the path. If you do this, the treemap switches to show only the code inside the respective subpath of the system. To return to the treemap of the entire system, click anywhere on the treemap and select All Methods from the tooltip.

Persistent Tooltip on a Test Gap Treemap

Analyzing Individual Test Gaps

To analyze a particular Test Gap in detail, click on it in a Test Gap treemap and select Open Method from the tooltip. This opens a history of the changes and test activity on the respective method, with the newest activity at the top.

Change and Test History on the setDistance Method

In particular, this history shows you when the latest test executed the method (if any) and which changes happened after that test. Often, this information is sufficient for a rough estimation of the Test Gap's severity.

To view the concrete untested change in the code, switch to the Compare method tab and ensure that Compare to latest tested version (if existing) is enabled. The compare view below then shows you exactly the untested change, or the entire method content if the method was never tested.

Best Practice: Collaborate with Developers and Testers

You should bring developers and testers together for analyzing Test Gaps. With inside knowledge about both the development activities and the testing requirements and processes at the table, assessing Test Gaps and deciding how to act on them becomes much easier.

Adjusting Baseline and End Date

Teamscale always analyzes Test Gaps between a baseline, i.e., the starting point of the analysis, and an end date. While the baseline is typically a specific time in the past, such as the last release, the end date is often simply now. The following picture shows the impact of different baselines and end dates on the TGA state of a particular method.

Test Gap States of an individual method

The first bar shows the method's entire history. The method is first added, then tested, then changed, later changed again, then tested again, and so on. If we choose a baseline before the method was added, TGA considers the method as added since the baseline. Furthermore, if we choose an end date after the last test, TGA considers the method as tested since the latest modification. As a result, the method appears green in Test Gap treemaps.

The second bar shows the effect of moving the baseline forward. TGA now considers the method as modified since the baseline, because it existed at the baseline and changes were made to it afterwards. Since the latest version of the method was still tested, the method still appears green in Test Gap treemaps.

The third bar now shows the effect of additionally moving the end date backwards to somewhere between a change and the subsequent test. TGA still considers the method as modified since the baseline, but since no test happened after the latest change, the method appears yellow in Test Gap treemaps. If we now moved the baseline back to before the method was added, the method would instead appear red.

Finally, the fourth bar shows the effect of moving the baseline forward until after the latest change. TGA now considers the method as unchanged since the baseline and it appears gray in Test Gap treemaps, regardless of later tests and the end date.
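The four bars can be reproduced with a small sketch that evaluates a fixed event history against different baselines and end dates. The event model below (timestamps as plain day numbers, events `added`, `change`, `test`) is a simplification for illustration, not Teamscale's actual data model.

```python
# Fixed history of one method: added on day 1, tested on day 2,
# changed on days 3 and 5, tested again on day 6.
EVENTS = [(1, "added"), (2, "test"), (3, "change"), (5, "change"), (6, "test")]

def tga_state(baseline, end):
    """TGA state of the method for a given baseline and end date."""
    visible = [(t, kind) for t, kind in EVENTS if t <= end]
    changes = [t for t, k in visible if k in ("added", "change") and t > baseline]
    if not changes:
        return "gray"                      # unchanged since the baseline
    tests = [t for t, k in visible if k == "test"]
    if tests and max(tests) >= max(changes):
        return "green"                     # the latest change was tested
    added_after = any(t > baseline for t, k in visible if k == "added")
    return "red" if added_after else "yellow"

# Bar 1: baseline before the method was added, end after the last test -> green
# Bar 2: baseline moved forward past "added", end unchanged            -> green
# Bar 3: end moved back between the last change and the last test      -> yellow
# Bar 4: baseline moved forward past the latest change                 -> gray
```

Calling, e.g., `tga_state(2, 5)` models the third bar: the change on day 5 is visible, but no later test is, so the method is a yellow Test Gap.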

Separating Test Stages

When analyzing Test Gaps, you may want to distinguish which changes were executed at which test stage, e.g., by unit tests vs. manual end-to-end tests. Teamscale allows you to do this if you upload the code coverage from the individual test stages to separate Partitions. During Test Gap analysis, you may then select any combination of these partitions as the Coverage Sources to consider in the treemap's properties:

Coverage Source for Test Gap Analysis
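Selecting several partitions effectively means the analysis counts a method as executed if any of the selected coverage sources covers it, i.e., a union of the per-partition coverage. A minimal sketch of that idea, with entirely hypothetical partition names and method names:

```python
# Hypothetical coverage per partition: each maps to the set of methods
# that the tests in this stage executed.
PARTITIONS = {
    "Unit Tests": {"Distance.setDistance", "Route.length"},
    "Manual E2E": {"Route.recalculate", "Route.length"},
}

def covered_methods(selected):
    """Union of executed methods across the selected coverage sources."""
    return set().union(*(PARTITIONS[p] for p in selected))
```

With only "Unit Tests" selected, `Route.recalculate` would remain a Test Gap; with both partitions selected, it counts as executed.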


Webinar on Test Gap Analysis (German)


  • 0:00: Motivation and fundamentals
  • 9:52: Live demo: Test Gap analysis for a release
  • 21:31: Integration into the testing process and examples from practice
  • 32:45: Live demo: ticket coverage with Teamscale
  • 43:35: How to try out Test Gap analysis with Teamscale