A/B testing on a website is an essential strategy for improving user engagement and conversion rates. It’s a method to compare two webpage versions to see which one performs better. In this guide, we’ll walk you through the steps to effectively run A/B testing, ensuring you make data-driven decisions to enhance your website’s performance.
Step 1: Formulate Your Hypothesis
- Identify the Change: Determine what you want to change on your webpage and why. For instance, users might not be clicking a button on the main page.
- Purpose of the Change: Aim to test how modifications like color or size changes could impact user behavior.
Step 2: Select the Test URL
- Choose the Right Page: The webpage where you plan to run the test should receive enough traffic to produce statistically meaningful results within the test window, for example two weeks.
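How much traffic is "enough" depends on your baseline conversion rate and the smallest lift you care about detecting. A rough sketch using the standard two-proportion sample-size formula (the 3% baseline and 20% target lift below are illustrative assumptions, not values from this guide):

```python
import math

def required_sample_size(baseline_rate, min_detectable_lift):
    """Approximate visitors needed per variant for a two-proportion test
    at a two-sided 5% significance level with 80% power."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_detectable_lift)
    z_alpha, z_beta = 1.96, 0.84  # approximate z-scores for alpha=0.05, power=0.80
    pooled = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * pooled * (1 - pooled))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# Example: 3% baseline conversion rate, aiming to detect a 20% relative lift
per_variant = required_sample_size(0.03, 0.20)  # roughly 14,000 visitors per variant
```

If the page can't reach that volume in two weeks, either pick a higher-traffic page or test a change large enough to show up with fewer sessions.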
Step 3: Define Your Goal
- Target Page: Decide on the page that will be affected by the change, like a “Thank You” page.
- Site Events: Alternatively, the goal could be a specific site event, such as a button click.
Setting Up A/B Testing in Plerdy
- Access A/B Testing: If you have a Plerdy account, go to Conversions > A/B Testing, and click the blue “Create a test” button.
- Test Name: Enter a name for your test.
- Test URL: Add the URL where the test will run.
- Tracking Code: If the “A/B testing script” hasn’t been added yet, go to “Settings,” copy the main and additional tracking codes, and add them to the required pages, including the goal page. Then clear the site cache and verify the tracking code installation.
Test Variants and Settings
- End Date: Set when you plan to end the test. Alternatively, manually stop the test after 2-3 weeks if enough data is collected.
- Audience: Add rules if necessary, including countries and devices.
- Goals: Add the exact URL with https:// or part of the URL for tracking. Usually, this is the final page affected by Variant B.
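The two ways of configuring a goal URL behave differently: a full URL must match exactly, while a partial URL matches any page whose address contains it. A hypothetical helper illustrating the distinction (Plerdy's internal matching logic may differ):

```python
def goal_matches(visited_url: str, goal: str, exact: bool = False) -> bool:
    """Check whether a visited URL counts toward the goal.

    exact=True mimics configuring the full URL (with https://);
    otherwise a substring match is used, as when only part of
    the URL is entered.
    """
    if exact:
        return visited_url == goal
    return goal in visited_url

# A partial goal catches the page even with query parameters attached:
goal_matches("https://example.com/thank-you?ref=ab", "/thank-you")  # True
```

Partial matching is usually safer when the final page's URL carries variable query parameters, since an exact match would miss those visits.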
- Events: Add a goal as an event, following the instructions for identifying the element by class or ID. Note that event data counts against the page view limit for heatmaps. Additional limits can be purchased if needed.
- Sending Events to GA4: Select the checkbox to send event IDs to GA4.
- Description: It’s advisable to add a note describing what you changed and the test’s goal, so the context is clear when you review the results in 2-3 weeks.
Editing Variant B
- Editing Page: Upon opening the Variant B editing panel, select the checkboxes for “Select element” and “Interactive Mode.”
- Make Changes: Apply 1-3 changes to an element such as color, size, or hiding it. You can also edit HTML content like links or images.
- Save Changes: Save individual element changes, and use the “Save all changes” button to save all test modifications.
Launching the Test
- Review Settings: Open the test settings page in a new tab and refresh.
- Start the Test: Click the “Start test” button. You can also view all changes made to the website page by selecting “Show all changes.”
Analyzing A/B Testing Results
- Session Distribution: Each variant receives roughly half of user sessions (an approximately 50/50 split).
- Key Metrics: Look for the “Improvement” column in the Total Sessions tab, which indicates the winning variant.
- Device Analysis: Examine which device impacted the winning variant most.
- Traffic Analysis: Determine which traffic channel was more effective.
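The "Improvement" figure is, at its core, the relative change in conversion rate between the two variants. A minimal sketch of that calculation with hypothetical numbers (the exact formula Plerdy applies may differ):

```python
def conversion_rate(goal_completions: int, sessions: int) -> float:
    """Share of sessions that completed the goal."""
    return goal_completions / sessions

def improvement(rate_a: float, rate_b: float) -> float:
    """Relative lift of Variant B over Variant A, as a percentage."""
    return (rate_b - rate_a) / rate_a * 100

# Hypothetical results after a roughly 50/50 session split:
rate_a = conversion_rate(90, 3000)   # 3.0% conversion on Variant A
rate_b = conversion_rate(108, 3010)  # ~3.59% conversion on Variant B
lift = improvement(rate_a, rate_b)   # ~19.6% improvement for Variant B
```

A positive value means Variant B converted better; a negative value means it converted worse than the original.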
Interpreting Negative Values for Variant B
If Variant B shows a negative value in the improvement metric, it may not perform as well as Variant A. In this case, you have two options:
- Wait a Few More Days: Sometimes, initial results might not indicate the true performance due to factors like low traffic or initial user behavior adjustment. Waiting a few more days can provide more data for a reliable conclusion.
- Consider Variant B Unsuccessful: If the negative trend continues consistently over a significant period, it may be safe to conclude that Variant B is underperforming compared to Variant A. In such cases, it’s advisable to analyze the elements of Variant B that could be causing the decrease in performance and consider revising or abandoning the changes made in this variant.
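One way to judge whether a negative trend is real or still noise is a two-proportion z-test on the conversion counts. A sketch using only the standard library, with hypothetical numbers (a statistics library is preferable for production analysis):

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for a difference in conversion rates.

    Returns (z, p_value): a negative z means Variant B converts
    worse; a small p-value means the gap is unlikely to be chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B looks worse, but is the difference significant yet?
z, p = two_proportion_z(90, 3000, 75, 2990)
# If p > 0.05, the negative trend may still be noise - keep collecting data.
```

In this example the p-value comes out well above 0.05, so "wait a few more days" is the statistically justified choice; only a persistently small p-value alongside a negative z would support abandoning Variant B.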
Tips for Checking Variant B
- Multiple Browsers: Use different browsers to check Variant B, since a single browser session is held to one variant and can’t display both within 30 minutes.
A/B testing is a powerful tool for website optimization. By following these steps, you can make informed decisions based on user behavior and preferences. Remember, the key to successful A/B testing is continuous learning and adaptation. Happy testing!