Once published, a process with A/B Testing enabled will adhere to the specific testing configuration without notifying process participants. However, requests are tagged to indicate which alternative they belong to, and the data is available for process owners to view.
Follow these steps to view the testing summary for a Process:
Alternatively, when publishing a process, you can click the See Process A/B Testing Configuration link to view the process configuration.
Click the A/B Testing Configuration tab to view a summary of A/B Testing configuration and test results.
A summary of the configuration for A/B Testing is shown in the Summary A/B Configuration section. Follow these steps to edit these settings:
Click Save to save the configuration.
A/B Testing configuration can also be done when publishing a process in Process Modeler.
A summary of the test results of A/B Testing is shown in the Summary A/B Testing Results section.
The summary displays the following information about each alternative:
Average completion time
Best time
Worst time
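The metrics above can be sketched in a few lines of Python. This is an illustrative computation only, not the ProcessMaker API: it assumes you have the completion time of each completed request tagged with its alternative, and the function and field names are invented for the example.

```python
from statistics import mean

def summarize_results(completed_requests):
    """Compute the metrics shown in the results summary for each
    alternative: average, best (shortest), and worst (longest)
    completion time. Input: (alternative, duration) pairs."""
    by_alternative = {}
    for alternative, duration in completed_requests:
        by_alternative.setdefault(alternative, []).append(duration)
    return {
        alternative: {
            "average": mean(durations),
            "best": min(durations),
            "worst": max(durations),
        }
        for alternative, durations in by_alternative.items()
    }

# Completion times (in hours) of completed requests, tagged A or B.
results = summarize_results([("A", 4.0), ("A", 6.0), ("B", 3.0), ("B", 5.0)])
print(results["A"])  # {'average': 5.0, 'best': 4.0, 'worst': 6.0}
```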
The configuration tab also provides a list of requests created for each alternative. Select the Alternative A or Alternative B tab to see the requests for each alternative.
In addition to the standard request list view, users now have the option to include a dedicated column for viewing alternatives directly within the request list interface. This feature provides users with quick access to alternative options associated with each request.
The A/B Testing feature allows organizations to compare different process alternatives, enabling process owners to make informed decisions based on data and leading to improved efficiency and effectiveness of business processes.
By using A/B Testing, organizations can achieve the following benefits:
Improved Process Performance: Identify process alternatives that deliver better results, such as reduced cycle time or increased productivity.
Optimal Resource Allocation: Determine the most efficient allocation of resources to maximize process outcomes.
Continuous Process Improvement: Utilize insights gained from A/B testing to continually optimize and refine business processes.
Enhanced Decision-making: Make data-driven decisions based on objective evidence rather than subjective opinions.
Alternative Creation: Users can create and maintain two independent variants of a process model, enabling testing and comparison against different scenarios.
A/B Testing: Designers can activate both alternatives for live execution and define rules for distribution, ensuring reliable and unbiased results.
Data Collection & Analysis: Process metrics and user feedback are collected and compared against live alternatives, offering quantitative insights for informed decision-making.
A/B Testing addresses the challenges organizations face in identifying and improving their business processes efficiently. This feature provides concrete evidence to support decision-making by collecting and analyzing data from different process alternatives. It tracks and measures key performance indicators (KPIs) for each alternative, providing objective proof of which performs best.
Watch the following product tour to get a quick overview of how A/B testing works.
The first step to enable A/B Testing is to add an alternate path that a process can take.
Follow the steps below to create an alternative for a process model:
Edit an existing process or create a new one in the Modeler.
When creating an alternative, ProcessMaker will clone the existing model (Alternative A) into a separate tab (Alternative B). Subsequent edits to either alternative will be saved independently.
Edit the two alternatives as needed and click Publish to publish them.
The Start Event for both alternatives must be identical to enable A/B Testing. If the events differ, A/B Testing cannot be enabled, and the following message will be displayed.
When publishing the process, Alternative A is the default version; however, Designers can choose to publish Alternative A, Alternative B, or both.
Simultaneously publishing both alternatives enables the A/B Testing configuration, while publishing just one will disable it.
Follow these steps for configuring basic A/B Testing settings:
From the Publish New Version screen, select the A+B Alternatives tab.
Click the A/B Settings button to configure test settings.
For simple testing configuration, select the Simple option and adjust the slider to set the percentage distribution of incoming requests between the two alternatives. By default, requests are evenly distributed between the two alternatives.
Click Save and Publish to publish the two alternatives with basic A/B Testing configuration.
Click the Version Info button to go back to the alternative selection screen.
Click See Process A/B Testing Configuration to view test results and a summary.
For advanced configuration, see Advanced Testing Configuration.
If no changes are made to the A/B Testing configuration, requests are distributed equally by default, with 50% allocated to Alternative A and 50% to Alternative B.
Both alternatives must have the same start event for A/B testing to work correctly.
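The percentage slider behaves like weighted random routing. The following is a minimal sketch of that idea, not ProcessMaker's actual implementation: the function name and the injectable random source are assumptions made for the example.

```python
import random

def assign_alternative(split_a=50, rng=None):
    """Route one incoming request: split_a is the percentage of
    requests sent to Alternative A; the rest follow Alternative B."""
    draw = (rng or random).uniform(0, 100)
    return "A" if draw < split_a else "B"

# With the default 50/50 split, roughly half of a large batch of
# requests lands on each alternative.
counts = {"A": 0, "B": 0}
for _ in range(10_000):
    counts[assign_alternative()] += 1
print(counts)
```

Injecting the random source (`rng`) keeps the routing decision testable with a fixed draw.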
For advanced testing configuration, choose the Advanced option in A/B settings when publishing the alternatives and follow these instructions:
From A/B Settings, choose the Advanced option and input an expression to set the rules for activating either of the two alternatives.
For more information about expressions, see FEEL Expression Syntax.
Here are a few examples of how to enter expressions in the advanced settings:
When routing based on the value of a Request Variable, an expression like this can be used:
loanApproved == "Yes"
If the loanApproved variable stores a Boolean value, the expression can be written simply as:
loanApproved
It is also possible to combine expressions such as:
(loanApproved) or (loanRating > 500)
When the entered expression evaluates to TRUE, requests will follow the process model in Alternative B.
When the entered expression evaluates to FALSE, requests will be allocated between Alternatives A and B based on the percentages selected with the slider. The following table provides some examples of how advanced A/B Testing configuration works:
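The routing rule just described can be sketched as follows. This is an illustration of the logic, not the ProcessMaker engine: the FEEL expression is evaluated by ProcessMaker at runtime, so here a plain Python predicate stands in for it, and the function names are invented for the example.

```python
import random

def route_request(request_data, rule, split_a=50, rng=None):
    """Sketch of the advanced routing rule: `rule` stands in for the
    configured FEEL expression as a predicate over the Request data.

    TRUE  -> the request follows Alternative B.
    FALSE -> fall back to the percentage split from the slider."""
    if rule(request_data):
        return "B"
    draw = (rng or random).uniform(0, 100)
    return "A" if draw < split_a else "B"

# Python stand-in for the expression (loanApproved) or (loanRating > 500)
rule = lambda d: bool(d.get("loanApproved")) or d.get("loanRating", 0) > 500

print(route_request({"loanApproved": True}, rule))  # expression TRUE -> B
```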
Click Save and Publish to publish the process.
Click the Version Info button to go back to the alternative selection screen.
Click See Process A/B Testing Configuration to view test results and a summary.
Watch the following product tour to learn how to set up advanced configuration for A/B Testing.
After the initial configuration, you can edit and republish one or both alternatives.
Follow these steps to edit and republish the alternatives:
Edit an existing process that has A/B Testing enabled. The two alternatives will display in their respective tabs.
Edit one or both alternatives as needed and click the Publish button to publish them.
To individually publish the alternatives, select the tab associated with that alternative.
Optionally, enter a Version Name and Description and click Save and Publish.
Publishing only one alternative disables the other, which also disables the A/B Testing configuration. In this scenario, all requests will be routed through the published alternative only.
To publish both alternatives, select the A+B Alternatives tab. This action enables the A/B Testing configuration.
Optionally, click the A/B Settings button to adjust the configuration for A/B Testing.
Click Save and Publish when done.
After creating alternative process models, it is possible to switch between them.
Follow the steps below to switch between alternatives A and B:
Edit an existing process or create a new one in Process Modeler.
From the top left of the Modeler, use the tab associated with each alternative to view and edit it.
After a tab is selected, the process model for that alternative will be displayed.
Make changes as needed and click Publish to make the process available to participants.
Process designers can replace the contents of one alternative with the contents of the other.
Edit a process with alternatives.
Click Confirm. The contents of the selected alternative will replace the contents of the other alternative.
Manage which alternative is being tested and carry out tests to see how each alternative performs.
From the Run Test screen, select the Advanced option to enter expression and percentage settings for A/B testing.
If the expression is TRUE, requests will follow Alternative B.
If the expression is FALSE, requests will be distributed between Alternatives A and B according to the selection in the percentage slider.
Leave the Expression setting blank to use only the percentage slider for request distribution.
Adjust the slider to select distribution percentage between the two alternatives.
From the Starting Point settings, select the step from which the test should begin.
From the Type of Run setting, choose Automatic to complete the test without user intervention, or select Manually to complete tasks in the process model yourself.
Optionally, in the Additional Data setting, add a JSON data object that supplements or contains the entirety of the mock Request data for the test. A JSON data object can be used in place of a Scenario.
Select Bypass Scripts tasks and Data Connectors to skip running any Script Tasks or Data Connectors in the process.
Click Run to run one test of the process. The test will run as configured and tasks will be highlighted as follows:
Active tasks will be highlighted in blue.
Completed tasks will be highlighted in green.
Any errors will be highlighted in red.
To run additional tests, click the Run button multiple times.
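The mock Request data supplied in the Additional Data setting is a plain JSON object. The following is a hypothetical example, reusing the Request Variable names from the expression examples earlier; every key and value here is illustrative only.

```python
import json

# Hypothetical mock Request data for the Additional Data setting.
# Keys mirror the Request Variables used in the routing-expression
# examples; the names and values are assumptions for illustration.
additional_data = {
    "loanApproved": "Yes",
    "loanRating": 650,
}

print(json.dumps(additional_data, indent=2))
```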
After running a Process Test, Process Modeler simulates workflow routing based on the provided Scenario and/or JSON data. A colored highlight appears on each Process model object triggered during the simulated workflow.
Node number in the Process model
Status of the Process model
Hyperlink to the sequentially numbered test
The following highlights represent event states during the simulation:
Green-colored highlight: The Process model object completed.
Yellow-colored highlight: This is the breakpoint in the automated simulation from which manual testing begins.
Follow these steps to proceed with manual testing:
Mouse-hover over the Task with the yellow highlight.
Click on the hyperlink of the sequentially numbered test. The Task opens. Manually submit the Task.
Red-colored highlight: An error occurred on the Process model object, and simulation has stopped.
Consider the following when troubleshooting why the error occurred:
Script Task or Data Connector error:
Exclusive Gateway element error:
Evaluate if the Scenario and/or the provided JSON data contains mock Request data from which to evaluate conditions.
Review the conditions configured for the outgoing Sequence Flow element(s) from the Exclusive Gateway element, and then review the mock Request data.
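The gateway check described above amounts to asking whether the mock Request data satisfies any outgoing flow condition. A small sketch of that check, with flow names and conditions invented for the example:

```python
# Sketch of the troubleshooting check: does the mock Request data
# contain what the outgoing Sequence Flow conditions need, and which
# flow would be taken? (Flow names and conditions are assumptions.)
conditions = {
    "Approved flow": lambda d: d.get("loanApproved") == "Yes",
    "Rejected flow": lambda d: d.get("loanApproved") == "No",
}

mock_data = {"loanRating": 650}  # loanApproved is missing from the mock data

matching = [name for name, cond in conditions.items() if cond(mock_data)]
print(matching or "no outgoing flow matches - the gateway stops the test")
```

When `matching` is empty, as here, no Sequence Flow condition can be satisfied, which is the situation that halts the simulation at the gateway.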
Click the Play button (left) at the bottom of Process Modeler to replay the current Process Test.
Click the Reset button (center) to reset the test and remove the highlights that represent event states.
Click the Stop button (right) to stop Process Testing and return to editing the Process model.
Follow these steps to cancel a Test run:
Mouse-hover over the Task with the yellow highlight.
Click on the hyperlink of the sequentially numbered test. The Task opens.
Click the Cancel button in the Cancel Test section of the Task summary.
Click Confirm.
Click the menu, and then select the Configure option for your Process to view process configuration.
For simple testing configuration, select the Simple option and adjust the slider to set the percentage distribution of incoming requests between the two alternatives. By default, requests are evenly distributed between the two alternatives.
For advanced testing configuration, choose the Advanced option and input an expression to set the rules for activating either of the two alternatives.
From the top left of the Modeler, click the + Create the Alternative B icon to add an alternate flow for the process.
The active alternative will be highlighted in green, while the inactive alternative will be displayed in gray, indicating its disabled status.
Expression | Alternative % | Request Distribution |
---|---|---|
TRUE | N/A | Alternative A = 0%, Alternative B = 100% |
FALSE | A = 50%, B = 50% | Alternative A = 50%, Alternative B = 50% |
FALSE | A = 100%, B = 0% | Alternative A = 100%, Alternative B = 0% |
FALSE | A = 75%, B = 25% | Alternative A = 75%, Alternative B = 25% |
See the permissions or ask your Administrator for assistance.
From Alternative A, select the option Replace Alt. A with B, or vice versa.
A warning message will appear.
Open a process model for a process that has alternatives.
Click the menu at the top right of the Modeler, and select Run Test.
In the Expression setting, optionally enter a FEEL expression to use advanced configurations for determining how requests are distributed between the two alternatives.
Optionally, in the Scenario setting, select a Scenario that contains the mock Request data from which to test. A test can be run without a Scenario.
Hover over a highlighted task. The following information displays regarding the status of that Task:
Green-colored highlight denotes that the Process model object completed.
To bypass Script Tasks and Data Connectors, run the test again. When setting up the test, select the Check to bypass Script tasks and Data Connectors option.
Red-colored highlight denotes an error or that the Process model object stopped the test at that point.
The Caution screen displays to confirm the cancellation of the test run.