Overview

Your customers are likely to abandon a form if the experience it delivers is not engaging. Besides frustrating customers, a poor form experience can also drive up support volume and cost for your organization. Identifying and delivering the right customer experience that increases the conversion rate is as critical as it is challenging. Adobe Experience Manager Forms holds the key to this problem.

AEM Forms integrates with Adobe Target, an Adobe Marketing Cloud solution, to deliver personalized and engaging customer experiences across multiple digital channels. One of the key capabilities of Target is A/B testing, which allows you to quickly set up concurrent A/B tests, present relevant content to targeted users, and identify the experience that drives a better conversion rate.

With AEM Forms, you can set up and run A/B tests on adaptive forms in real time. AEM Forms also provides out-of-the-box and customizable reporting capabilities to visualize the real-time performance of your form experiences and identify the one that maximizes user engagement and conversion.

Set up and integrate Target in AEM Forms

Before you begin to create and analyze A/B tests for adaptive forms, you need to set up your Target server and integrate it in AEM Forms.

Set up Target

To integrate AEM with Target, ensure that you have a valid Adobe Target account. When you register with Adobe Target, you receive a client code. You need the client code, the email address associated with the Target account, and the password to connect AEM with Target.

The Client Code identifies the Adobe Target customer account and is used as a sub-domain in the URL when calling the Adobe Target server. Before proceeding, ensure your credentials allow you to log in at https://testandtarget.omniture.com/.
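For illustration, the role of the client code as a sub-domain can be sketched as follows. The client code "mycompany" is hypothetical, and the endpoint pattern shown is the typical Target delivery domain; substitute your own client code.

```python
# Sketch: the Target client code acts as a sub-domain of the URL used
# when calling the Adobe Target server. "mycompany" is a hypothetical
# client code used purely for illustration.
client_code = "mycompany"

# Typical Target delivery endpoint pattern (client code as sub-domain):
target_url = f"https://{client_code}.tt.omtrdc.net"

print(target_url)  # https://mycompany.tt.omtrdc.net
```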

Integrate Target in AEM Forms

Perform the following steps to integrate a running Target server with AEM Forms:

On the AEM server, go to http://<hostname>:<port>/libs/cq/core/content/tools/cloudservices.html.

In the Adobe Target section, click Show Configurations and then the + icon to add a new configuration.
If you are configuring Target for the first time, click Configure Now.

In the Create configuration dialog, specify a Title and optionally a Name for the configuration.

Click Connect to Adobe Target to initialize the connection with Target. If the connection is successful, the message Connection successful is displayed. Click OK on the message and then OK on the dialog. The Target account is configured.

In the Target URLs field, specify all the URLs where A/B tests will run. For example, http://<hostname>:<port>/ for AEM Forms server on OSGi or http://<hostname>:<port>/lc/ for AEM Forms server on JEE.
If you configure a Target URL for a publish instance that your customers can access using either the hostname or the IP address, configure both as Target URLs. If you configure only one of them, your A/B test will not run for customers coming from the other URL. Click + to specify multiple URLs.

Click Save.

Your Target server is now integrated with AEM Forms. If you have a full license to utilize Adobe Target, start the server with the parameters required to enable A/B testing, and then perform the following steps to create an audience:

In the Adobe Target Configuration dialog, select a Target configuration and click OK.

In the Create New Audience page, create rules. Rules let you categorize the audience. For example, you can categorize audiences based on operating system: audience A comes from Windows, and audience B comes from Linux.

To categorize the audience based on Windows, in Rule #1, select the OS attribute type. From the When drop-down, select Windows.

To categorize the audience based on Linux, in Rule #2, select the OS attribute type. From the When drop-down, select Linux, and click Next.

Specify a name for the created audience, and click Save.
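The OS-based rules above can be sketched in code. This is a minimal, hypothetical illustration of the rule logic (classifying visitors by the operating system reported in the User-Agent header), not Target's actual rule engine:

```python
# Sketch of the OS-based audience rules above: classify a visitor as
# audience A (Windows) or audience B (Linux) from the User-Agent header.
# Purely illustrative; Target evaluates the rules you define server-side.

def classify_audience(user_agent: str) -> str:
    ua = user_agent.lower()
    if "windows" in ua:
        return "A"          # Rule #1: OS attribute is Windows
    if "linux" in ua:
        return "B"          # Rule #2: OS attribute is Linux
    return "unmatched"      # visitor falls outside both rules

print(classify_audience("Mozilla/5.0 (Windows NT 10.0; Win64; x64)"))  # A
print(classify_audience("Mozilla/5.0 (X11; Linux x86_64)"))            # B
```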

You can select the audience when you configure A/B testing for a form, as shown below.

Create A/B test

Perform the following steps to create an A/B test for an adaptive form.

From the Audience drop-down list, select an audience to whom you want to serve different experiences of the form. For example, Visitors Using Chrome. The audience list is populated from the configured Target server.

In the Experience Distribution fields for experiences A and B, specify the distribution, in terms of percentage, to determine how experiences are distributed among the total audience. For example, if you specify 40 and 60 for experiences A and B, respectively, experience A is served to 40% of the audience and the remaining 60% see experience B.
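Conceptually, a 40/60 distribution means each visitor is randomly assigned so that, on average, 40% see experience A and 60% see experience B. The simulation below is purely illustrative; Target performs the actual allocation:

```python
import random

# Illustrative simulation of a 40/60 experience distribution: each
# visitor draws a random number, and visitors below the A-percentage
# threshold see experience A, the rest see experience B.

def assign_experience(split_a: float = 40.0) -> str:
    return "A" if random.uniform(0, 100) < split_a else "B"

random.seed(0)  # fixed seed so the simulation is repeatable
counts = {"A": 0, "B": 0}
for _ in range(10_000):
    counts[assign_experience(40.0)] += 1

print(counts)  # roughly {'A': 4000, 'B': 6000}
```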

Click Configure. A dialog appears to confirm the creation of the A/B test.

Click Edit Experience B to open the adaptive form in edit mode. Modify the form to create an experience different from the default experience A. The possible variations allowed in experience B are changes in:

CSS or styling

Order of fields in different panels or the same panel

Panel layout

Panel titles

Description, label, and help text for a field

Scripts that do not impact or break the submit flow

Validations (both client and server sides)

Theme (you can select an alternate theme for experience B)

View and analyze A/B test report

Once the A/B test has run for the desired period, you can generate a report and check which experience has resulted in a better conversion rate. You can declare the better-performing experience the winner or choose to run another A/B test. To do this, perform the following steps:

Analyze the report and see if you have enough data points to declare the better-performing experience the winner. You can continue the same A/B test to gather more data, or declare a winner and end the A/B test.

To declare a winner and end the A/B test, click the End A/B Test button on the reporting dashboard. A dialog prompts you to declare one of the two experiences the winner. Choose a winner and confirm to end the A/B test.
Alternatively, you can first declare a winner by clicking the Declare Winner button for the respective experience. When prompted to confirm the winner, click Yes to end the A/B test.

If you pick experience A as the winner, the A/B test ends and, going forward, only experience A is served to all audiences.
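Conceptually, declaring a winner boils down to comparing conversion rates across experiences. The sketch below uses made-up renders/submissions numbers for illustration; the actual figures come from the reporting dashboard:

```python
# Sketch: comparing conversion rates from an A/B test report to pick a
# winner. The renders/submissions numbers below are fabricated examples.

report = {
    "A": {"renders": 4000, "submissions": 520},
    "B": {"renders": 6000, "submissions": 690},
}

# Conversion rate = submissions / renders for each experience.
rates = {
    exp: data["submissions"] / data["renders"]
    for exp, data in report.items()
}
winner = max(rates, key=rates.get)

for exp, rate in rates.items():
    print(f"Experience {exp}: {rate:.1%} conversion")
print(f"Winner: experience {winner}")
```

With these example numbers, experience A converts at 13.0% versus 11.5% for B, so A would be declared the winner.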