# Tests View
The Tests view displays all tests known to Teamscale. Tests can be added to Teamscale by uploading test execution reports in one of the supported formats or by extracting tests directly from the source code.
# Hierarchical View
The tests are displayed in a hierarchical structure based on the path of the test.
This view contains two actions:
- Link: Copies a link to this view to the clipboard.
- Filter: Switches to the test filter view.
The available metrics are:
- Count: The count of tests in the container
- Failed: The count of tests that have failed in the latest test run
- Skipped/Ignored: The count of tests that have been skipped (e.g., because they are not compatible with the execution platform) or ignored (e.g., because the test is disabled in the code or test configuration)
- Successful: The count of tests that have passed in the latest test run
- Result: The aggregated assessment of the previously listed test execution results
- Duration: The aggregated test execution duration
You can click on any of the metrics to see a trend of the metric over time and a treemap of the included elements that shows which elements are affected by the metric.
# Test Filter View
The filter view allows you to filter tests based on certain criteria.
This view contains two actions:
- Hierarchy: Brings you back to the hierarchical view of the tests.
- Switch to query language: Switches the view from basic mode to query language mode. The query language gives you much more control over which tests are selected.
# Basic Mode
- Name: Allows you to specify a regular expression that is matched against the test names.
- Duration: Allows you to specify an upper or lower bound for the test execution duration. Note that this always uses the maximum duration across all partitions. If you need more fine-grained control over how partitions are treated, switch to the query language mode.
- Result: Allows you to specify which results should be included.
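In query language mode, for instance, you can restrict the duration check to a single partition instead of the maximum across all partitions, using the partition-specific form of `duration(...)`:

```
max(duration('Unit Tests')) > 10s
```

Here `'Unit Tests'` is just an example partition name; replace it with one of your own partitions.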
# Query Language Mode
For example, to display all tests that fail on one platform but pass on another, enter `result() in [failed, error] && result() in [passed]` into the query box.
After pressing Enter, the list of matching tests is displayed below the box.
The input box also provides autocompletion to simplify writing queries.
Each test in the returned list provides a link to additional details, as well as the partition, duration, and result of its execution.
The partition selector in the top right can be used to hide the durations and results of partitions you are currently not interested in.
If you want to remove tests from certain partitions from the results entirely, you need to filter them out via the query.
For example, `partitions !in [Windows]` shows only the tests that have no execution information for the Windows partition.
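Such a partition filter can be combined with other conditions. The following is a sketch using only the operators shown in the examples on this page, here selecting non-Windows tests that are currently failing:

```
partitions !in [Windows] && result() in [failed, error]
```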
# Test Query Language
| Element | Description |
| --- | --- |
| Attributes | For each test, attributes such as `result()`, `duration()`, `partitions`, and `inState()` are available (see the examples below). Attributes can be compared to values using the usual comparison operators. |
| Set Query | You can use the keyword `in` (or its negation `!in`) to check whether an attribute value is contained in a given set. Comparisons can be connected using logical operators such as `&&`. |
| Like Operator | The … |
| query | The special function … The operator … |
Some interesting use cases could be:
- Tests that fail in at least one partition (track failing tests): `result() in [failed, error]`
- Tests that fail in one partition and pass in another (track inconsistent test results): `result() in [failed, error] && result() in [passed]`
- Tests that have been failing for longer than 14 days (failing tests are fine, that is what tests are made for, but they should be fixed soon): `inState(result() in [failed, error]) > 14d`
- Tests whose duration exceeds a threshold, e.g. 30 seconds (track long-running tests): `max(duration()) > 30s`, or for a specific partition: `max(duration('Unit Tests')) > 10s`
- Tests that have never failed (track candidates for "useless" tests): `inState(result() !in [failed, error]) > 1970-01-01`
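These conditions can also be combined. As a sketch that joins two of the queries above, the following would track long-running tests that are also currently failing:

```
max(duration()) > 30s && result() in [failed, error]
```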
# Test Query Metrics
You can store a query as a metric with the Add as metric button. It will then show up in the right sidebar. Stored test query metrics can then be referenced from other places, such as the metric-related dashboard widgets.