Personalisation Technology: Testing & Optimisation

Last time I began this series on what to look for in personalisation and optimisation technology with my Personalisation Technology Manifesto. Now I'm going to look at each of the items in the manifesto – the technology promises – and expand on what brands should be demanding from their personalisation technology.

At the core of personalisation are questions of passive preference: the fluid perceptions that subconsciously drive consumers towards one product or another. In this article, we're going to start with the entry point for most organisations into the world of optimisation – the basics of teasing out those preferences with accuracy and ease. The simplest approach to personalisation is a healthy routine of optimisation through content testing, and the simplest tests are the basic A/B test and the slightly more complex multivariate test (MVT).

A/B testing is just what you'd expect: providing visitors with two variations on a piece of content and testing which delivers better metrics – more clickthroughs, longer time spent on the page, the completion of a full video. There's nothing fancy about it, but if it's done properly, an A/B test can give you surprisingly accurate data on what works and what doesn't: just remember to keep the variations simple enough to pinpoint which variable is affecting the results.
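"Done properly" includes checking that an observed difference between the two variations is not just noise. Here is a minimal sketch of how an A/B result might be evaluated with a two-proportion z-test; the visitor counts and the 1.96 threshold are illustrative assumptions, not figures from this article:

```python
import math

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                           # z-score

# Hypothetical traffic split: 2,000 visitors per variant,
# 100 conversions for A versus 130 for B
z = ab_test_z(100, 2000, 130, 2000)
print(f"z = {z:.2f}")   # |z| > 1.96 is significant at the 95% level
```

With these illustrative numbers the z-score works out to roughly 2.04, just past the 95% threshold – a reminder that seemingly clear winners can sit close to the edge of significance.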

Taking a similar approach, MVT takes the black-and-white comparisons of the A/B test and complicates them by mixing and matching variables, increasing the possible variations of any given page while also offering more accurate results. While an A/B test might contrast two header images, or button placements, or whole page layouts, a multivariate test will try out each combination – now instead of looking at the value of individual aspects of design, you're getting information on how they all work together. More concretely, A/B testing lets you know that one thing works better than another: MVT is the process of learning *why* those things work better. Don't be afraid of MVT – it can often help identify the elements that are predictive of conversion, which you can then refine through further A/B/n testing between many variations of that element.
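The "each combination" point is easy to see in code. Here is a sketch of how a full-factorial multivariate test multiplies variants; the element names are hypothetical examples, not a real test plan:

```python
from itertools import product

# Hypothetical page elements under test
headers = ["hero_photo", "product_grid"]
buttons = ["top_right", "below_fold"]
copy_lengths = ["short", "long"]

# A full-factorial MVT serves every combination of the elements
variants = list(product(headers, buttons, copy_lengths))
print(len(variants))  # 2 * 2 * 2 = 8 combinations
for header, button, copy_len in variants[:2]:
    print(header, button, copy_len)
```

This is also why MVT needs more traffic than A/B testing: each added element multiplies the number of combinations that must each receive enough visitors to reach significance.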

So what should you look for in your testing and optimisation technology?

Ease of implementation: IT simply can't be bogged down by such a critical but straightforward project, nor can you allow the pages affected to be cluttered with obsolete or bloated code. Consider a suite like Adobe's: a single line of code included on every page of the site or referenced in each screen of your app is all you need to implement – this is made even simpler with the dynamic tag management core service, which allows you to implement and activate tags and campaigns effortlessly, and makes removing them equally simple.

Ease of configuration: Consider the many hands involved: who's looking to test what, and how these various teammates will stay organised. Tests should be easy to deploy and easy to quality-assure, and it should be simple to understand what tests are running where on your properties. The Adobe Target Visual Experience Composer enables your team to create, deploy and maintain optimisation activities with ease. Issues with digital governance and prioritisation are often overlooked, but your solution should show you, at a glance, a summary of all of your activities in a dashboard and provide alerts about potential activity collisions. Adobe Marketing Cloud's Assets service also provides organisations with a central hub from which creative and technical solutions can be shared, stored, edited and published, free from silos.

Ease of maintenance: Be prepared to create and change activities on the fly. Solutions that provide uncomplicated editing and publication put the control in the hands of the marketer and enable you to test and optimise your sites and apps in real time. If a successful marketing activity starts driving large volumes of traffic to your site, you should be able to quickly and easily optimise the experience for that traffic, allowing campaign managers to adapt to information received while the marketing campaigns are still running.

Ease of reporting: Ultimately, your data means nothing if you're not able to show statistical confidence in your results. Platform integration with an analytics solution is key to making the most out of your campaigns and tests, regardless of the content. Adobe Target enables you to report in the platform itself or see your results directly in Adobe Analytics, so that you can seamlessly move between analysis and optimisation.
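"Statistical confidence" in practice usually means reporting an interval around each rate rather than a bare percentage. A minimal sketch using a Wald 95% confidence interval, with illustrative counts:

```python
import math

def conversion_ci(conversions, visitors, z=1.96):
    """Wald 95% confidence interval for a conversion rate."""
    p = conversions / visitors
    margin = z * math.sqrt(p * (1 - p) / visitors)
    return p - margin, p + margin

# Hypothetical result: 130 conversions from 2,000 visitors
lo, hi = conversion_ci(130, 2000)
print(f"conversion rate: 6.5% (95% CI {lo:.1%} - {hi:.1%})")
```

If the intervals of two variants overlap heavily, the test hasn't yet shown a confident winner, however different the headline percentages look.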

Ease of improvement: This is the sweet science of testing and optimisation, and it can be a real challenge. Improving your testing procedure is about constant iteration: learning when to make tests more specific for the sake of accuracy, or broader for the sake of discovery. Your optimisation technology must make it easy for you to understand what is and isn't working and to deploy the next stage of your testing strategy.

Taking these guidelines from page to practice is best illustrated with a case study. Conrad Electronic manages a digital marketplace containing hundreds of thousands of individual products, serving a wide range of European countries. With so much content and so many consumers, Conrad's challenge was streamlining their digital experience and uncovering key improvements in information display and ease of navigation.

With Adobe's real-time analytics and optimisation tools, Conrad pinpointed the optimal discount display, driving up participation in sales and specials by a significant margin, especially for new users. That distinction between audiences proved useful as seemingly contradictory behaviour emerged: Conrad went on to discover which kinds of additional product information positively impacted conversion for returning customers, allowing them to comfortably target both groups. As a result, Conrad created a 4% uplift in cart additions and navigation use, all based on tests and templates that could be edited and implemented in mere seconds. You can learn more about Conrad's approach by watching this Personalisation & Optimisation breakout session from the recent Adobe Summit (Adobe Target in 'real-life': driving optimisation with Target).

Conrad provides us with a solid picture of what a successful testing campaign looks like: focused discovery of extremely specific channels to uplift. With the right tools, exploiting these discoveries becomes effortless. With A/B and MVT explored, we'll be moving on to the next chapter of the personalisation manifesto: business rules and geo-targeting.