While test report documents are traditionally associated with Waterfall, they can contribute to the Agile development process, too.
A test report summary contains all the details of the environments where the code was tested, who tested it, and when and how it was tested. The document serves as a log that records exactly what code was tested, what system configuration it was tested on and any bugs that came up during testing.
While some IT organizations create test reports that are upwards of 20 pages, the document length is subjective. The overall goal of the test report summary is to record the actions and results of a test. This enables the team to make informed decisions about what procedural improvements can be made for future tests. A development team can use test reports for internal audits following a customer complaint, but teams should also produce these reports regularly during testing to create better software in each development cycle.
In Agile development, the test report summary represents a record of test execution. Compared to its Waterfall companion, this version is less formal and focuses more on the results.
Let's examine the basic components of how to write a test report and why such a summary can be useful in Agile software development.
What's in a test report?
Before the creation of a test report, the writer needs to determine the target audience and why the report is needed. For example, if an application requires an audit trail for regulatory reasons, the test report's writer likely will need to include more specific data. If the target audience is upper management and they want to understand what the team tested for each release, then the writer can include a summary that outlines the main functions tested. Or, if the report is more of an audit that won't be read by anyone unless a critical bug is detected, the writer can structure the report to include only technical information.
While such summaries should all include the same basic information necessary for the target audience, there's no formula for test reports. Testers can add or subtract data as needed to fulfill the test report's objective. Here are four major components that every tester should include when writing a test report.
Test objective. The test objective states what type of testing the team and its testers executed, and why. For example, if the test report covers functional, regression and performance testing, the test report writer will need to describe the objective for each testing type.
In most cases, regression testing is the main purpose of the test execution. The objective of regression testing can vary, but a team usually performs the practice to search for defects once developers add new feature code to an existing code base. Regression testing is done prior to any new release and varies in length of time and testing depth. If a team's regression testing includes integration, performance or other testing types, the writer should indicate the purpose of each test specifically in the report's objective section.
Test cases, test coverage and execution details. The next element of a test report is an explanation of the test suite. Specifically, include what type of test was executed, where it is stored and when it was executed. The test report writer can also add the name of the QA professional who ran the test, but that's not a requirement. It's more important to include the how, what and why, along with the actual test results.
The writer should lay out how many tests were executed, passed, failed or skipped in this section. Skipped tests represent tests that a team planned but missed, either due to time constraints or because the tests were blocked by reported defects. In instances like these, teams should also include how much code they tested. A team can use test management applications and tools to specify how much code its QA professionals tested. If such tools aren't available, turn to the development team for a test coverage estimate.
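To illustrate, the execution totals above can be tallied from per-test results. This is a minimal sketch; the test names and statuses are hypothetical, not drawn from any specific tool:

```python
from collections import Counter

# Hypothetical per-test results; names and statuses are illustrative.
results = {
    "test_login": "passed",
    "test_checkout": "failed",
    "test_search": "passed",
    "test_export": "skipped",  # planned but blocked by a reported defect
}

counts = Counter(results.values())
executed = counts["passed"] + counts["failed"]  # skipped tests never ran

print(f"executed: {executed}")
print(f"passed:   {counts['passed']}")
print(f"failed:   {counts['failed']}")
print(f"skipped:  {counts['skipped']}")
```

Most test management tools export similar counts automatically, but even a hand-rolled summary like this gives the report's audience the numbers they need at a glance.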
Execution details, which a tester manually records or a test management program tracks, include who tested the code along with when and where it was tested. A test report writer does have some flexibility with how to display the data and details; it often depends on the number of tests a team runs. The test report can include a general data grid to view the information or another data report. Again, there's no set rule for how this information is included on a test report. Writers use their own judgment.
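One simple way to present execution details is a plain text grid. The sketch below assumes made-up testers, dates and environment names purely for illustration:

```python
# A hypothetical execution-details grid; testers, dates and
# environments are placeholders, not real data.
rows = [
    ("test_login", "A. Rivera", "2024-03-04", "qa-test-01"),
    ("test_checkout", "A. Rivera", "2024-03-04", "qa-test-01"),
    ("test_export", "J. Chen", "2024-03-05", "qa-test-02"),
]
header = ("Test", "Tester", "Date", "Environment")

# Size each column to its widest cell, then print aligned rows.
widths = [max(len(str(r[i])) for r in (header, *rows)) for i in range(4)]
for row in (header, *rows):
    print("  ".join(str(cell).ljust(w) for cell, w in zip(row, widths)))
```

For larger suites, the same data usually lives in a spreadsheet or a test management tool's export instead.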
Defect counts. Another important aspect is documenting what defects testers found. This section is vital for post-testing analysis, which means test report writers shouldn't merely list bug identification numbers. They should include a brief description of each bug to help save time afterwards. However, the writer won't necessarily need to list every bug found in testing. The report writer could consider waiting until the product management team affirms the existence of a bug or defect before including it in the test report.
Testers will need to carefully review the defect list to verify they aren't re-reporting known bugs or bugs already in the repair backlog. The data in this section should focus on defects found in the stated release organized by their priority. Defects and bugs are always found, but it's their priority and severity that determine if the release goes out to customers or if it's held back for repairs.
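The two steps above, filtering out known bugs and ordering by priority, can be sketched as follows. All IDs, priorities and descriptions here are invented for the example:

```python
# Illustrative defect records; IDs, priorities and the known-bug
# list are made up for this sketch.
defects = [
    {"id": "BUG-101", "priority": "high", "summary": "Checkout total miscalculated"},
    {"id": "BUG-102", "priority": "low", "summary": "Tooltip typo on settings page"},
    {"id": "BUG-099", "priority": "medium", "summary": "Slow search on large data sets"},
]
known_bugs = {"BUG-099"}  # already in the repair backlog; don't re-report

# Keep only newly found defects, then sort by priority.
new_defects = [d for d in defects if d["id"] not in known_bugs]
order = {"high": 0, "medium": 1, "low": 2}
new_defects.sort(key=lambda d: order[d["priority"]])

for d in new_defects:
    print(f"[{d['priority'].upper()}] {d['id']}: {d['summary']}")
```

Listing each new defect with a short description, as here, saves the reader a trip to the bug tracker during post-testing analysis.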
Platform and test environment configuration details. The configuration and test environment section is tricky. Details are important, but test report writers also need to consider security and compliance when they share information regarding an application's code. The writer should assume the target audience generally understands the test system; this isn't the place to reveal details about an app's servers and code storage.
Once the writer shares the test report, there's no guarantee it will be stored securely. List the server name and the dates, but keep the data basic.
If the configuration was nonstandard, the writer should include that information as well. Standard configuration needs to be documented within the test team for reference.
For example, if the configuration changes during testing and causes defects, the writer must include this information in the test report so someone can analyze the effect on the scope and depth of defects and on overall test coverage.
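Following the guidance above, an environment section can stay deliberately sparse: server name, dates and a note on any nonstandard configuration. The field names and values below are placeholders, not a prescribed schema:

```python
import json

# A minimal environment section: server name and dates only, with a
# note for any nonstandard configuration. All values are placeholders.
environment = {
    "server": "qa-test-01",
    "test_start": "2024-03-04",
    "test_end": "2024-03-08",
    "nonstandard_config": "feature flag enabled mid-run",
}

print(json.dumps(environment, indent=2))
```

Keeping the section to this level of detail documents the test run without exposing information about an application's servers or code storage if the report is shared insecurely.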
Understand who a test report is for
The test report's importance really depends on the needs of a particular business. It's a handy document to track testing results by release so members of an IT organization know what was tested and when. Whether it's a formal document or a simple summary of what was done, a test report contributes to better software development.