Automating a test case
Provides information for the test automation engineer on how to automate a test case once the user has uploaded a test plan document.
Once a test plan has been uploaded by the user, the automation engineer has the following responsibilities:
Provide a model of the Subject Under Test (Software) and the test environment
Provide an automated version of the test plan
Create the first accepted revision of the test report
This is only required if no model is registered yet for the SUT or the test environment. Please refer to the section Enabling Software & Environments for more information on how to provide a SUT or environment model.
The automation of the test case itself is centered around a specific revision of a test plan. Therefore, the user interface presented to an automation engineer for a test case differs slightly from the default one.
The automation area of the automation engineer user interface allows you to upload a TestResults.io testcontainer file created by the TestResults.io designer as well as additional supporting files that are required by the automation.
Keep in mind that your automated test case has access to the supporting files uploaded by the user. Use the Automation Supporting Files functionality only if you need files in addition to the ones uploaded by the user as supporting files, which should rarely be the case.
In the automation area you also find the ability to trigger those workflow steps of the Test Automation Workflow that are only accessible to the automation engineer.
A click on the "REGISTER TESTSTEPS" button allows you to register the number of automated steps in the automated test plan. This information is used if you use TestResults.io to automatically generate invoices for your external or internal customers.
Keep in mind that you usually do not upload testcontainer files manually. This is automatically done for you by the TestResults.io designer.
As part of your automation you might need data that is only known at execution time (as opposed to classical test data). This late-bound data can be provided with the help of execution variables. You can define variables for every revision of a test plan.
If you want to reuse variables defined for a previous revision of a plan, you can click on the "Import From Previous Plan" button.
Every variable is defined by a set of properties:
| Property | Description |
| --- | --- |
| Name | The technical name of the variable. This is the name that you want to use in the automated test case. The user will not see this name. |
| Display Name | The readable name of the variable as shown to the user. |
| Pool | If enabled, this variable controls the execution scheduling algorithm. The value needs to be a list of values separated by semicolons (;). Every value of the list acts like a semaphore slot, i.e. if all values are in use by at least one test case execution, the next execution is blocked until a value becomes available again. This feature can be used if resources, e.g. login credentials, are limited (see the sketch after this table). |
| Secure | Marks a variable as secure. The value of a secure variable will not be shown in the portal user interface (keep in mind that secure variables might still show up in SUT screenshots etc.). |
| Regex | Allows you to define a regular expression that needs to be fulfilled by the value of the variable. This makes sure that the user enters valid data only. |
| Message | A message that is shown to the user if a value does not fulfill the requirements defined in Regex. |
| Value | The initial value of this variable. This is the only property that can be changed by the user. |
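To make the Pool semantics concrete, the following self-contained C# sketch mimics the described scheduling behavior with a counting semaphore. It is an illustration only, not TestResults.io platform code, and the pool value user1;user2;user3 is a made-up example:

```csharp
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class PoolDemo
{
    static async Task Main()
    {
        // Hypothetical pool value as it could be entered in the portal.
        const string pool = "user1;user2;user3";
        string[] slots = pool.Split(';');

        // Each pool value acts like one semaphore slot: at most slots.Length
        // test case executions run at the same time.
        var semaphore = new SemaphoreSlim(slots.Length);

        var executions = Enumerable.Range(1, 5).Select(async i =>
        {
            await semaphore.WaitAsync();   // blocks once all pool values are in use
            try
            {
                Console.WriteLine($"Execution {i} acquired a pool value");
                await Task.Delay(1000);    // stand-in for the actual test run
            }
            finally
            {
                semaphore.Release();       // the pool value becomes available again
            }
        });

        await Task.WhenAll(executions);
    }
}
```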
All defined variables can be accessed in your automated test case with the GetVariables() functionality.
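As a minimal sketch of how this could look, assuming a C#-based automated test case and that GetVariables() returns a name-to-value mapping (the concrete return type of the real call is not reproduced here, and LoginUser is a made-up variable name), the local stub below stands in for the real GetVariables() so the snippet compiles on its own:

```csharp
using System;
using System.Collections.Generic;

static class VariableSketch
{
    // Stand-in for the real GetVariables() functionality; assumed shape only.
    static IDictionary<string, string> GetVariables() =>
        new Dictionary<string, string> { ["LoginUser"] = "alice" };  // hypothetical variable

    static void Main()
    {
        var variables = GetVariables();

        // Access a variable by the technical Name defined for the test plan revision.
        string user = variables["LoginUser"];
        Console.WriteLine(user);
    }
}
```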
A test report is the result of the execution of a test set that includes at least one executable test case. A test case is only executable if the latest revision of the test plan is in the Ready for Execution state. Therefore it is important that test plans are transferred to the Ready for Execution state as fast as possible.
Details on the workflow can be found in Test Automation Workflow. This section covers the actions required from an automation engineer's perspective.
If the Test Automation Workflow is disabled, you only need to upload an automated test plan and the TestResults.io platform will automatically trigger all state transitions for you.
This simplified workflow is targeted at individuals and small teams only, as it doesn't enforce interaction between user and automation engineer but relies on a "first time right" approach. This approach is only successful in specific scenarios, such as single individuals or really small teams that have a lot of interaction anyway.
In summary, without the test automation workflow the automation engineer uploads the testcontainer for the automated test plan and the test case is available for execution by the user.
With the full Test Automation Workflow enabled there are two quality gates:
Quality of manual test plan provided by user
Quality of automated test plan provided by automation engineer
The first gate makes sure that the automation engineer understands the test plan and is able to automate it. This includes the automation engineer making sure that all required files, data, environments, and software artifacts are available.
The second gate makes sure that the automation engineer automated the manual test plan in the intended way. By uploading an automated testcontainer, the automation engineer claims that the automation is done. The user then has the option to either accept the results or to request a redesign.
In summary, with the test automation workflow the automation engineer first needs to confirm that the manual test plan can be automated. After the automation, the automation engineer needs to confirm to the user that the automation is done. Once the user accepts the automation, the test case is ready for execution.