What to Consider When A/B Testing a Landing Page
In my last post, I talked about the anatomy of a landing page: what it should contain, what has no place there, and what the page is supposed to do. Today, I’ll delve into why A/B testing, or split testing, is one of the best investments you can make in a highly successful landing page.
There’s more to this strategy than figuring out whether red or yellow works better for your call-to-action (CTA) button. Smart A/B testing is an ongoing process that looks at a number of behaviors and elements, and results in a high-performing page with the capacity for continual improvement.
In A/B testing, a marketer splits incoming traffic between two (or more) versions of a digital property and then analyzes the results. This type of testing can be done for campaigns that route traffic to your landing page, as well as the landing page itself.
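In practice, most testing tools handle the traffic split for you, but the underlying idea is simple enough to sketch. The snippet below is an illustrative example (the function and variant names are mine, not from any particular tool): it assigns each visitor to a version by hashing a visitor ID, so a returning visitor always sees the same page rather than bouncing between versions.

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to one variant.

    Hashing the visitor ID (instead of picking randomly on every
    visit) keeps the experience consistent: the same visitor always
    lands in the same bucket for the life of the test.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The same visitor always gets the same version of the page.
version = assign_variant("visitor-123")
```

With roughly uniform hashing, traffic splits about evenly between the two versions, which is what makes the later comparison of results meaningful.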
Funneled A/B Testing—From Broad to Fine Tuned
Applying the principles of a marketing funnel to your split testing is the best way to get accurate results. Keep in mind that you’re looking for ways to optimize your landing page so that it converts more visitors, and A/B testing will highlight each successive change that proves more effective.
Begin with broad strokes. Create two completely different landing pages, and monitor your results to learn which version captures more conversions. Once you have an effective foundational design, you can start testing smaller changes—some of which can deliver big results.
The important aspect of fine-tuned A/B testing is knowing exactly what you’re testing and why. Have a hypothesis, and then test to prove or disprove it. For example, you might test a color change to see if visitors stay longer on the page, or alter the wording on your CTA to find out whether one version gets more clicks.
A few examples of small changes you can measure with A/B testing include:
- Images
- Button colors
- Background shades
- Headlines
- Layout
- CTAs
As you test various elements of your landing page, remember that you’re running an experiment, and you need a “control” to measure results effectively. Test each change against an unchanged version of your landing page, so that any difference in performance can be attributed to the change itself.
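Once you have conversion counts for the control and the variant, a standard way to judge whether the difference is real (rather than noise) is a two-proportion z-test. The sketch below is illustrative; the numbers are made up, and real testing tools compute this for you.

```python
from math import sqrt, erf

def conversion_z_test(control_conv, control_n, variant_conv, variant_n):
    """Compare a variant's conversion rate against the control's.

    Returns the z statistic and a two-sided p-value; a small p-value
    (conventionally below 0.05) suggests the difference is unlikely
    to be random noise.
    """
    p1 = control_conv / control_n
    p2 = variant_conv / variant_n
    pooled = (control_conv + variant_conv) / (control_n + variant_n)
    se = sqrt(pooled * (1 - pooled) * (1 / control_n + 1 / variant_n))
    z = (p2 - p1) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: 120/2400 control conversions vs. 160/2400 variant.
z, p = conversion_z_test(120, 2400, 160, 2400)
```

The key practical point this formalizes: with small samples or marginal differences, the p-value stays high, which is exactly why a single short test run can look “unsuccessful” even when the variant is genuinely better.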
This type of testing is useless without analytics that provide detailed results. Optimal A/B testing involves analyzing your results as far down your sales funnel as possible—from visits to actual sales.
Set up your analytics to demonstrate how different versions of your landing page, or aspects of your page, affect metrics such as click-through rates, signups, traffic-to-lead conversions, demo requests, contacts for more information, and, ultimately, sales. When you use Adobe Analytics as the reporting source for your Adobe Target tests, you’ll be able to filter results based on any specific metric or target audience that’s already contained in the Analytics tool.
Rinse and Repeat: Keep Testing for Best Results
In A/B testing, the results are not always clear-cut. Many marketers give up on the strategy after an unsuccessful run, or when several tests deliver marginal results. But the most effective way to utilize A/B testing is to view it as a journey, not a destination.
Online marketing is constantly evolving, and your split tests should reflect the changing landscape as well as your own efforts to fine-tune your landing page. Continue to experiment with A/B testing best practices, and you’ll end up with a well-optimized landing page that earns its keep.