# Analyzing Test Gaps on a .NET Desktop Application (Pinta)
As a developer or tester of a .NET desktop application, you may use Teamscale's Test-Gap Analysis (TGA) to improve your testing effectiveness. This tutorial leads you through the setup and execution of TGA on Pinta, an open-source graphics editor.
- Step 1: Installing Pinta
- Step 2: Installing the Teamscale Ephemeral Profiler
- Step 3: Uploading Coverage to Teamscale
- Step 4: Analyzing Coverage and Test Gaps
# Step 1: Installing Pinta
As a prerequisite for running Pinta, install .NET Framework 4.5 (or newer) and GTK# for Windows.
Next, place the Pinta binaries somewhere on your local machine and the corresponding PDB (symbol) files in
To obtain these files, either compile the Pinta source code yourself or download our demo bundle:
Finally, start Pinta to ensure that the installation was successful.
Troubleshooting: Pinta Fails to Start
Ensure that only one version of GTK# is installed on your system; otherwise, starting Pinta might fail due to DLL loading problems.
# Step 2: Installing the Teamscale Ephemeral Profiler
Download the latest release of the Teamscale Ephemeral Profiler and unpack it to
To register the profiler, set the following environment variables via Control Panel > System > Advanced system settings > Environment Variables:
- The COR_ENABLE_PROFILING variable enables profiling on .NET processes.
- The COR_PROFILER variable provides the GUID of the Teamscale Ephemeral Profiler, which the .NET Runtime uses to verify the profiler DLL on load.
- The COR_PROFILER_PATH variable tells the .NET Runtime where to find the profiler binaries (e.g., Profiler64.dll, depending on the bitness of the Pinta process).
Profiling a .NET process requires a different version of the profiler, depending on whether the process is a 32-bit or 64-bit process.
To determine the bitness of the Pinta process, follow our guide on determining application bitness on Windows.
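As an illustration of the kind of check such a guide performs, the sketch below (our own illustration, not part of the Teamscale tooling) reads the Machine field of an executable's PE header. Note that for .NET assemblies compiled as AnyCPU, the header alone is not conclusive, which is why inspecting the running process is more reliable:

```python
import struct

def pe_bitness(path):
    """Return 32 or 64 for a Windows PE executable.

    Illustrative sketch: reads the Machine field of the COFF header,
    which follows the "PE\\0\\0" signature pointed to by offset 0x3C.
    """
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":                      # DOS header magic
            raise ValueError("not a PE file")
        f.seek(0x3C)
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        if f.read(4) != b"PE\0\0":                  # PE signature
            raise ValueError("missing PE signature")
        (machine,) = struct.unpack("<H", f.read(2))
    if machine == 0x014C:                           # IMAGE_FILE_MACHINE_I386
        return 32
    if machine in (0x8664, 0xAA64):                 # AMD64, ARM64
        return 64
    raise ValueError("unknown machine type 0x%04x" % machine)
```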
Setting Environment Variables via Launch Script
You may set these variables only for the target application, e.g., via a launch script. For an example, check PintaWithProfiling.bat in the demo bundle.
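A launch script essentially sets the three variables for the target process only and then starts the executable. The Python sketch below illustrates the same idea (a hypothetical helper of ours; the DLL path and GUID are placeholders you would take from your profiler installation):

```python
import os
import subprocess

def launch_with_profiler(command, profiler_dll, profiler_guid):
    """Launch a process with the CLR profiling variables set only for it.

    Sketch of what a launch script such as PintaWithProfiling.bat does.
    profiler_guid is the GUID under which the profiler registers itself
    (see its documentation); we pass it in rather than hard-coding it.
    """
    env = dict(os.environ)
    env["COR_ENABLE_PROFILING"] = "1"        # turn CLR profiling on
    env["COR_PROFILER"] = profiler_guid      # which profiler to load
    env["COR_PROFILER_PATH"] = profiler_dll  # where to find its DLL
    return subprocess.Popen(command, env=env)
```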
Next, we configure the profiler itself by editing the configuration file
C:\teamscale_dotnet_profiler\Profiler.yml.
Adjust the global configuration section, i.e., the first entry below
match, as follows:

```yaml
match:
  # global configuration section
  - profiler:
      enabled: false
      upload_daemon: false
      targetdir: C:\pinta_coverage
```
This globally disables both the profiler, to avoid profiling unnecessary processes, and the automatic upload of coverage, which we will configure later.
Moreover, it instructs the profiler to write trace files to
C:\pinta_coverage. Ensure that this directory exists and that the user running
Pinta.exe can write to it.
Otherwise, profiling the process will fail.
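A quick pre-flight check for this requirement can be sketched as follows (a hypothetical helper, not part of the profiler; run it as the same user that will run Pinta.exe):

```python
import os

def check_target_dir(path):
    """Return True if the profiler's targetdir exists and is writable.

    Mirrors the requirement stated above: the directory must exist and
    the current user must have write access to it.
    """
    return os.path.isdir(path) and os.access(path, os.W_OK)
```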
Next, add a process-specific configuration section below the global one (and delete any such sections that exist in the template), to enable the profiler specifically for the Pinta executable:
```yaml
match:
  # global configuration section
  - ...
  # process-specific configuration section
  - executableName: Pinta.exe
    profiler:
      enabled: true
```
To test your setup, start Pinta, do something (e.g., create a new image and draw a Picasso), and close it again.
The target directory
C:\pinta_coverage should now contain a coverage report named like
coverage_20190307_1352510131.txt (the date and time parts of the file name will differ) with some content.
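If you want to script this verification, a sketch like the following (our illustration; it relies only on the coverage_*.txt naming shown above) could be used:

```python
from pathlib import Path

def find_coverage_reports(target_dir):
    """Return non-empty coverage trace files written by the profiler.

    The profiler names its traces coverage_<date>_<time>.txt, so any
    non-empty match indicates that profiling produced output.
    """
    return sorted(
        p for p in Path(target_dir).glob("coverage_*.txt")
        if p.stat().st_size > 0
    )
```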
Reduce Performance Impact
Profiling always incurs a performance impact on the profiled processes. The Teamscale Ephemeral Profiler minimizes this impact but cannot avoid it completely. To minimize the overall impact, ensure that you always configure the minimal set of processes (executables) to be profiled.
# Step 3: Uploading Coverage to Teamscale
The Teamscale Ephemeral Profiler comes with the ability to upload coverage directly to a Teamscale instance. During the upload, the profiler maps the recorded coverage to the source code of the application under test, using the respective PDB files.
Assumption: Reachability of Teamscale
This guide assumes that the Teamscale server is directly reachable from the test environment via HTTP(S). Alternatively, the profiler can upload reports to a file share, where they may be picked up and forwarded from another machine.
Assumption: PDB files
This guide assumes that the PDBs corresponding to the version of the application under test are available in the test environment. Alternatively, you may upload PDB files to Teamscale, such that the mapping from coverage to source code happens there. We do not recommend this, however, because it leads to higher network traffic and disk-space consumption in Teamscale.
# Step 3.1: Preparing Teamscale
- In Teamscale, go to Projects and click New project.
- Name the new project pinta and choose the analysis profile C# Default.
- Click Source Code Repository and select Git.
- Create a new account named Pinta GitHub with the URI https://github.com/PintaProject/Pinta.git and without credentials.
- Set the Default branch name to release-1.6 and the Start revision to 2014-03-25 (the release date of Pinta 1.5).
- Click Create project.
# Step 3.2: Linking The Version Under Test to a Source Code Revision
Create the file
C:\revision.txt and set its content to the commit ID of the version under test.
This commit ID identifies the source code corresponding to the Pinta 1.6 release, which we want to link the coverage information to.
Keep the Version Under Test and revision.txt in Sync
If you change the version under test, you also need to adjust the content of revision.txt accordingly.
We recommend automating this in real test environments.
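One way to automate this, assuming the version under test is built from a Git checkout, is a small helper along these lines (hypothetical; adapt the paths to your environment):

```python
import subprocess

def write_revision_file(repo_dir, out_path):
    """Write the commit ID of the checked-out version under test to out_path.

    Illustrative helper: keeps the profiler's revisionFile in sync with
    the version that was actually built and deployed.
    """
    commit = subprocess.check_output(
        ["git", "rev-parse", "HEAD"], cwd=repo_dir, text=True
    ).strip()
    with open(out_path, "w") as f:
        f.write(commit)
    return commit
```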
# Step 3.3: Uploading Coverage
Open C:\teamscale_dotnet_profiler\Profiler.yml again and change the global configuration section to match the following:
```yaml
match:
  # global configuration section
  - profiler:
      enabled: false
      upload_daemon: true               # enable upload
      targetdir: C:\pinta_coverage
    uploader:
      teamscale:
        url: http://localhost:8080      # use your URL
        username: admin                 # use your username
        accessKey: u7a9abc32r45r2uiig3vvv # use your access key
        project: pinta
        partition: Manual Tests
      pdbDirectory: C:\pinta_pdbs
      revisionFile: C:\revision.txt
      assemblyPatterns:
        include:
          - Pinta.*
  - ...
```
This globally enables and configures automatic uploading of coverage data.
Use the URL and credentials (username and access key) of your Teamscale instance.
The project option contains the ID of your Teamscale project, so that the coverage data is sent to this project.
The partition is an arbitrary label that you may use to distinguish coverage sources, such as different test stages.
Teamscale merges all coverage in the same partition and allows you to inspect coverage in different partitions individually (or in any combination).
During the upload, the Teamscale Ephemeral Profiler maps the coverage data back to a particular version of the source code, using the PDB files and the revision ID from the revision file.
It will do so for all the assemblies of Pinta (whose names match the pattern Pinta.*).
To test your setup, start Pinta, do something (e.g., create a new image and draw a Picasso), and close it again. The profiler automatically schedules an upload of coverage data from terminated processes every five minutes. To verify that this worked, go to Activity in Teamscale. Once the data has arrived and been processed successfully, the entry on top of the timeline represents the coverage upload.
Ensure that the user running
Pinta.exe has read access to the configured PDB directory.
Otherwise, processing the coverage will fail.
You may configure the upload interval in the
Profiler.yml file.
Reduce Performance Impact
Mapping coverage data to source code takes some processing time that adds to your overall test execution time. To avoid wasting time on unnecessary mapping, ensure you always configure the minimal set of assemblies, i.e., exactly the assemblies containing the code that you want to identify test gaps in.
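As an illustration of why the include patterns matter, the sketch below (ours, not the profiler's actual code) shows how a pattern like Pinta.* restricts the mapping step to only the relevant assemblies:

```python
import re

def assemblies_to_map(assembly_names, include_patterns):
    """Return only the assemblies whose names match an include pattern.

    Illustrative model of how assemblyPatterns narrows down which
    assemblies the PDB-based coverage mapping has to process.
    """
    regexes = [re.compile(p + r"$") for p in include_patterns]
    return [n for n in assembly_names if any(r.match(n) for r in regexes)]
```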
# Step 4: Analyzing Coverage and Test Gaps
In Teamscale, go to Findings and click Manage baselines in the right sidebar.
Click Add baseline to create a baseline named
Pinta 1.5 at revision
5ca665f0 for the Pinta 1.5 release.
# Step 4.1: Test Gaps at File Level
In Teamscale, go to Tests and choose the Pinta 1.5 baseline at the top.
The table on the page shows the file-system structure of the Pinta project, with Test Gaps, Execution, and Churn data aggregated for each file or directory. You may drill down in this perspective to check what exactly has been changed or executed and where the test gaps are.
The Execution and Test Gap data may look different for you, depending on the actions you performed in Pinta.
# Step 4.2: Test Gaps at Method Level
In Teamscale, go to Dashboards.
In the right sidebar click Add dashboard and then choose Test Gap Treemap from the list of widgets.
Finally, click Save Dashboard and name the Dashboard
Pinta Test Gaps 1.5 -> 1.6.
Move the mouse over the Test-Gap Treemap and click on the icon that appears in the upper left, to edit the widget. As the Baseline, choose the Pinta 1.5 baseline. Make sure that Manual Tests is selected under Coverage Sources, for the widget to consider the coverage you uploaded earlier. Confirm the dialog.
The widget now shows the code changes between version 1.5 and 1.6 in color. The red and yellow blocks represent methods that have been changed but were not covered by tests, i.e., test gaps. Hover the mouse over the blocks in the treemap to see details in a tooltip. Click on the blocks to drill into the data.
The treemap may look different for you, depending on the actions you performed in Pinta.