# Using Test Gap Analysis (TGA) to Identify Untested Changes
# Why Test Gap Analysis?
Untested changes are typically error-prone and often lead to production defects. Teamscale's Test Gap analysis is an automated way of identifying changed code that was never tested since its last modification. Such areas are called test gaps. Knowing a system's test gaps allows you to make a conscious decision about which code has to be tested before shipping and which gaps are low-risk and can be left untested. This section shows how to identify untested changes with the Test Gap analysis feature in Teamscale.
To use Test Gap analysis,
- your project should be connected to your source code repository, and
- you should upload test coverage regularly.
# Baseline and End Date
As Test Gap analysis works on changes, you need to define a suitable time span, i.e.
- a baseline, which refers to the date from which on to investigate code changes, and
- an end date, which marks the other end of the time frame.
Only changes that were made to the system between baseline and end date are considered in the Test Gap analysis.
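The time window defined by baseline and end date can be pictured as a simple filter over change timestamps. The following minimal sketch illustrates this; the method names and dates are made up for illustration:

```python
from datetime import date

def changes_in_scope(changes, baseline, end_date=None):
    """Keep only changes that fall between baseline and end date.

    `changes` maps a method name to its last-change date; methods
    changed outside the window are not considered by the analysis.
    """
    return {
        method: changed
        for method, changed in changes.items()
        if changed >= baseline and (end_date is None or changed <= end_date)
    }

changes = {
    "Render": date(2023, 3, 14),
    "Dispose": date(2022, 11, 2),  # before the baseline: out of scope
    "Resize": date(2023, 6, 1),    # after the end date: out of scope
}
scoped = changes_in_scope(changes, baseline=date(2023, 1, 1),
                          end_date=date(2023, 4, 1))
```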
# Creating a Dashboard for Test Gap Analysis
In Teamscale, untested changes are visualized using dashboards. Therefore, we first briefly describe how to configure a Test Gap analysis dashboard, which is needed to identify test gaps.
For more detailed information on how to create dashboards in Teamscale, please see this article.
Switch to the Dashboard perspective and create a new dashboard. On the dashboard canvas, add a new Test Gap Treemap widget. You will see a treemap, a graphical representation of the source code of your project:
Every rectangle on this treemap corresponds to a method in the source code. The bigger the rectangle, the longer the method. Resize the widget by dragging its lower right corner until its size fits your needs. Then use the edit button at the upper left corner to configure it.
In this configuration dialog, there are several settings that determine which information is shown on the treemap. Depending on the situation at hand, i.e. which specific information you are interested in, you will need to choose different settings. The options have the following meaning:
| Option | Description |
| --- | --- |
| **Path** | Refers to the part of your project you want to see on the Test Gap treemap. You might want to select your whole project, but you can also choose a smaller subset, say the component you are responsible for. The treemap will then zoom into that area. |
| **Baseline** | Refers to the date from which on to consider changes, as described above. Choose a date slightly before the first change that is relevant to you, to ensure that no changes are lost. |
| **End date** | Marks the other end of the time frame. The **No end date** option ensures that future changes and future test information are also shown, i.e. it keeps the end date at the current time. |
| **Coverage Sources** | Logical containers for different kinds of coverage, e.g. »Manual Test«, »Unit Test« or »Test Server 3«. Whenever coverage is uploaded to Teamscale, a coverage source has to be specified. Here you can select the coverage you want to consider in the Test Gap analysis by ticking the corresponding check boxes, or by selecting all coverage sources. |
The resulting treemap should look like the one above. On this treemap, every rectangle has one of the following colors:

- **Gray** for methods that were not changed since the selected baseline,
- **Orange** for methods that were changed since the baseline but not tested since their last modification,
- **Red** for methods that were added since the baseline but never tested, and
- **Green** for methods that were changed or added since the baseline and executed since their last modification.
# Influence of the chosen baseline on how a single method is shown in the Test Gap Treemap
This picture helps to build a better understanding of how the selected baseline and end date influence what is shown on the Test Gap treemap and how methods are color-coded. It shows the history of a single method, represented using the previously described colors: The method is first added, then tested, later changed again, and so forth. Because the last version of the method is tested, it will appear green on the treemap.
The second row illustrates what happens if a baseline is set at a specific point: the method would still appear green on the treemap.
However, if an end date is set as well, the color may change, as illustrated in row three: because the end date lies before the last version that was tested, the method would appear orange.
Finally, the last row shows that a method appears gray if the baseline is set after its last change.
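The decision logic described above can be sketched as follows. This is a simplified model for illustration only; Teamscale's actual analysis works on the full edit history of each method, and all names and dates here are made up:

```python
from datetime import date

def treemap_color(added, modifications, test_runs, baseline, end_date=None):
    """Return the Test Gap treemap color for a single method.

    `added` is the date the method was created, `modifications` are the
    dates of later changes, and `test_runs` are the dates on which the
    method was executed by a test. Events after `end_date` are ignored,
    mirroring the treemap's time window.
    """
    def in_window(d):
        return end_date is None or d <= end_date

    last_change = max([added] + [m for m in modifications if in_window(m)])
    if last_change < baseline:
        return "gray"    # not changed since the baseline
    if any(in_window(t) and t >= last_change for t in test_runs):
        return "green"   # executed after its last modification
    return "red" if added >= baseline else "orange"  # new vs. modified gap
```

For example, a method changed after the baseline counts as green only if some test execution lies between its last modification and the end date; moving the end date before that test run turns it orange again.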
In the screenshot of the Test Gap treemap below, the method Render in the class RadialBlurEffect.cs is a test gap, because it was changed after the baseline but not tested since its last modification. If this method is tested in its current version and the coverage is uploaded to Teamscale, the corresponding rectangle will become green, and there will be one test gap less.
Getter and setter methods are not considered by the Test Gap analysis. This affects all methods whose name starts with either "set", "get" or "is" and whose body contains only one statement. Additionally, auto-generated properties in C#, denoted by `{ get; set; }`, are excluded from the Test Gap analysis. Such simple methods are too trivial to test and would only cause noise in the Test Gap treemap. Excluding them reduces unnecessary information in the Test Gap treemap and sets the focus on more relevant methods.
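This exclusion rule can be approximated with a small heuristic. The sketch below is a rough illustration of the rule as described here, not Teamscale's actual implementation:

```python
def is_trivial_accessor(name, statement_count):
    """Approximate the filter for trivial accessors: methods whose name
    starts with "get", "set" or "is" and whose body contains at most
    one statement are excluded from the Test Gap analysis."""
    return statement_count <= 1 and name.startswith(("get", "set", "is"))
```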
# Working with Test Gap Treemaps in Practice
A Test Gap treemap answers the question: »Which methods have been changed or added since a specific point in time, and which of these methods have not been tested yet?« These methods, i.e. the untested changes, are called test gaps.
For each test gap, a conscious decision should be made as to whether the corresponding source code should be tested or not – e.g. because there are much more critical parts in the code and only limited test resources left.
To be able to make this decision, typically two pieces of information are needed: first, which functionality the test gap, i.e. the method, belongs to, and second, which changes were made to this method. Developers usually get a good understanding of the functionality of a method when they see its source code. As shown in the treemap above, hovering over a method in the treemap reveals information about the method under the cursor, such as the class and method name. To see the code behind a method, simply click on the rectangle.
From the source code view, the complete history of the corresponding file can be accessed by pressing the »H« key, as described here. The history of the code shows the corresponding commit messages and, if Teamscale is connected to an issue tracker, also points to the change requests from where the changes originate. This information is typically enough to decide how critical a change is.
In our experience, it is most efficient to have a person responsible for development and a person responsible for testing perform this workflow together.
To get a quick overview of the test status of a system, there is a percentage in the upper right corner of the Test Gap treemap. It denotes the test gap ratio for the code shown on the treemap, i.e. the number of modified or added methods that have not been tested since their last modification, divided by the total number of modified or added methods. In our example, the test gap ratio is over 77%, meaning that over 77% of the modified and added methods shown on the treemap have not been tested in their current version.
# CSV Download for In-depth Inspection
It is also possible to download a CSV file containing all the methods and their corresponding test state, i.e. whether each method is unchanged, was tested, or is a test gap. To this end, click on the button left above the treemap.
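Such a download can then be inspected offline, for instance to compute the test gap ratio yourself. The column names and test-state labels below are assumptions for illustration; check the header of your actual CSV file:

```python
import csv
import io

# Hypothetical excerpt of a Test Gap CSV download; real column names
# and state labels may differ -- inspect the header of your file.
CSV_DATA = """Method,Test State
RadialBlurEffect.Render,Untested change
RadialBlurEffect.Dispose,Unchanged
ImageCache.Store,Tested change
ImageCache.Evict,Untested addition
"""

rows = list(csv.DictReader(io.StringIO(CSV_DATA)))
changed = [r for r in rows if r["Test State"] != "Unchanged"]
gaps = [r for r in changed if r["Test State"].startswith("Untested")]
ratio = 100.0 * len(gaps) / len(changed)  # test gap ratio in percent
```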
In case of very large systems, rectangles in the Test Gap treemap may become very small and hard to distinguish. In these situations it can be easier to identify test gaps when the treemap shows only changed methods, hiding all the gray areas. You can achieve this by enabling the Hide Unchanged Methods option in the treemap's properties, which hides all unchanged (i.e., gray) methods:
# Hiding unchanged methods (before and after)
# Further Reading:
- Did We Test Our Changes? Assessing Alignment between Tests and Development in Practice
S. Eder, B. Hauptmann, M. Junker, E. Juergens, R. Vaas, and K-H. Prommer.
- Webinar on Test Gap Analysis (German)
  - 0:00: Motivation and basics
  - 9:52: Live demo: Test Gap analysis for a release
  - 21:31: Integration into the test process and examples from practice
  - 32:45: Live demo: ticket coverage with Teamscale
  - 43:35: How to try out Test Gap analysis with Teamscale