A/B Testing Insights From UserTesting’s Brian Smith
Brian Smith has been at the forefront of eCommerce and digital marketing for more than 20 years (his first online purchase was in 1994)! Through much of that time, he’s been running A/B tests for clients around the world. A/B tests (sometimes called split tests) involve showing two variants of a website or product (let’s call them A and B) to similar groups of visitors to see which one performs better. The one that performs better triumphs!
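To make that mechanism concrete, here’s a minimal sketch (in Python, with hypothetical numbers and function names; this isn’t UserTesting’s tooling) of how a split test buckets visitors and compares the two conversion rates with a two-proportion z-test:

```python
import hashlib
import math

def assign_variant(user_id: str) -> str:
    """Deterministically bucket a visitor into variant A or B (50/50 split)."""
    digest = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return "A" if digest % 2 == 0 else "B"

def two_proportion_z(conversions_a, visitors_a, conversions_b, visitors_b):
    """z-statistic for the difference between two conversion rates."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    return (p_b - p_a) / se

# Hypothetical results: 5,000 visitors per variant
z = two_proportion_z(400, 5000, 455, 5000)
print(f"z = {z:.2f}")  # |z| > 1.96 suggests significance at the 5% level
```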
Now the vice-president of marketing for UserTesting, he and his firm focus on “high-value conversion opportunities”: A/B testing form completion rates, gated content, demo requests, and messaging. We asked Brian why A/B tests are a key part of the UX design process, and for some tips on running them successfully.
Why is A/B testing a key part of UX design?
For years, I lived in a world where I thought I was a really smart marketer. I could look at all the data, through platforms like Omniture, Qualtrics, or Adobe Target, and make decisions to drive the business. However, after I was exposed to UserTesting while working on an enterprise site redesign at Lynda.com (now LinkedIn Learning), I realized that while the platforms where I ‘lived’ showed me what happened, UserTesting gave me key insights into why it happened.
So I think of A/B testing as a key part of the UX design process, and UX research as a key part of A/B testing. They should go hand in hand. When you’re thinking of running an A/B test, you often don’t know what to test. A UX professional can step in and uncover insights that focus A/B tests on the most impactful areas. Conversely, when you’re in the design process and using tools like Adobe XD, it’s a no-brainer to include A/B testing to help make directional decisions.
A/B testing and the UX design process together really take the guesswork out of developing great products.
When is the right time to A/B test?
I’d argue that you should do A/B testing throughout the entire design process.
There are many studies showing that rework (correcting a problem during development) is 10x more expensive than fixing the same problem during the design process. So start your qualitative and quantitative research as early as possible.
What’s the biggest mistake UX designers can make when A/B testing their products?
I think the biggest mistake anyone makes doing A/B tests is assuming that A/B testing is simple. I’d strongly encourage any UX designer to team up with a seasoned A/B testing pro before moving forward with an A/B test. I love the supposed simplicity of running an A/B test, but this type of experimentation can lead you to make dramatic changes in a checkout flow, for example. When a small drop in conversion rate can cost you millions of dollars in sales, it’s important to put together an A/B testing plan and properly scope out the test goals before you begin.
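One concrete piece of that scoping, sketched below as an illustration rather than anything Smith prescribes, is estimating how much traffic a test needs before a lift of a given size becomes detectable. Assuming a standard two-sided test at 95% confidence and 80% power:

```python
import math

def sample_size_per_variant(p_base, relative_lift, z_alpha=1.96, z_beta=0.84):
    """Approximate visitors needed per variant to detect a relative lift over a
    baseline conversion rate (two-sided test, 95% confidence, 80% power)."""
    p_variant = p_base * (1 + relative_lift)
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p_variant - p_base) ** 2)

# A 3% baseline checkout conversion rate and a 5% relative lift to detect:
print(sample_size_per_variant(0.03, 0.05))  # ~208,000 visitors per variant
```

The arithmetic reinforces Smith’s warning: small effects on high-stakes pages demand a lot of traffic, so a test that is stopped early or scoped casually can easily mislead.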
Again, while this might sound easy, it takes a while to properly articulate the metrics you’re trying to drive towards. For example, you might think about changing button colors to drive engagement, but most likely you’ll want to drive higher engagement for a specific audience segment. This might mean looking not just at top-level metrics such as clicks, click-through rate (CTR), or conversion rate, but also at metrics further down the funnel, such as performance by buyer persona, lifetime value, and repeat purchase rate.
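As an illustration of what that segment-level slicing might look like in practice (the event schema here is hypothetical), the same raw events can be aggregated per variant and per audience segment rather than only at the top level:

```python
from collections import defaultdict

# Hypothetical per-visitor events: (variant, segment, clicked, converted, repeat_buyer)
events = [
    ("A", "new",       True,  False, False),
    ("A", "returning", True,  True,  True),
    ("B", "new",       True,  True,  False),
    ("B", "returning", False, False, False),
    # ... one row per visitor in the real data set
]

def rates(rows):
    """Top-level and down-funnel rates for one slice of visitors."""
    n = len(rows)
    return {
        "ctr":        sum(r[2] for r in rows) / n,  # clicks / visitors
        "conversion": sum(r[3] for r in rows) / n,
        "repeat":     sum(r[4] for r in rows) / n,  # repeat purchase rate
    }

slices = defaultdict(list)
for row in events:
    slices[(row[0], row[1])].append(row)  # slice by variant AND segment

for (variant, segment), rows in sorted(slices.items()):
    print(variant, segment, rates(rows))
```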
No matter how many A/B tests I’ve run as a marketer, the hardest part is properly articulating the metrics necessary for success and the associated scoping to connect the dots of the user journey.
Learn more about A/B testing and UX research from UserTesting over on their blog, and be sure to check out our roundup of A/B testing tips from UX designers.