Gary,
First, let me say you’re right… mostly. Second, a caveat: many optimization experts will claim that everything they say is the Truth and the only way forward. I do not claim that is the case with any of what I’m about to say. Rather, I claim it is *mostly* true, and that following this philosophy of testing and learning is nearly certain to deliver not just good results but long-term, continuous improvement for your digital properties. It is not the ONLY way. I do believe it is the best way.
Now, with that out of the way, let me get to my response.
My claim: analytics should always drive testing
Your response: You point out that this is not 100% true because, at the very least, the vast majority of testing programs you’ve seen are not driven by the analytics team and are in fact often completely independent of that team. The only analytics you see in testing is post-test results analysis and the evaluation of performance by segment, an approach you agree is backwards. In short: if so many programs are able to run, and run effectively, without this analytics-driven approach, is it really necessary? You point out that there are two kinds of testing that may not require analytics to drive them: “best practice” testing (i.e., bolder colors, larger fonts, bigger buttons always work) and “pure creative” testing (copy, images, color palette).
My response: Just because so few are doing it right doesn’t make the recommendation wrong. In fact, as more businesses get involved in testing and the tests they attempt become more sophisticated, they are going to learn just how important that up-front analysis is to selecting and designing the right tests for the greatest potential improvement. Resources are simply too scarce to support a continued “flying blind” approach. Eventually, the testing-program plane will hit the side of the cliff, wins will taper off, and the program will be left with declining support and the business with fewer and fewer optimization opportunities.
On your point that “best-practice testing” does not really require analytics, I partially agree. However, if you want to focus your “best-practice” test designs in areas where they will be most welcome and have the greatest potential for positive results, you need analytics to tell you which pages have high traffic but low performance, or which pages seem to work better for some segments than others (suggesting targeting possibilities). Can you run a test with just larger, bolder buttons and expect a lift? Sure. But if you use analysis to help you decide WHERE to run that test and WHICH buttons to change, you’ll have an even greater chance of achieving your goals, and faster.
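To make that concrete, here is a minimal sketch of that first “where should we test?” pass. It assumes a hypothetical page-level export with page, visits, and conversions columns; the file name, column names, and quantile cutoffs are illustrative, not from any particular analytics tool.

```python
# Minimal sketch: find high-traffic, low-performing pages as candidates
# for a "best-practice" test. Assumes a hypothetical CSV export with
# columns: page, visits, conversions (names are illustrative).
import pandas as pd

df = pd.read_csv("page_metrics.csv")
df["conversion_rate"] = df["conversions"] / df["visits"]

# Candidate pages: top quartile of traffic, bottom quartile of conversion.
high_traffic = df["visits"] >= df["visits"].quantile(0.75)
low_performance = df["conversion_rate"] <= df["conversion_rate"].quantile(0.25)
candidates = df[high_traffic & low_performance].sort_values("visits", ascending=False)

print(candidates[["page", "visits", "conversion_rate"]].head(10))
```

The cutoffs themselves matter less than the habit: let the data nominate the pages, then apply the best-practice design to them.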
Then there is “pure creative” testing. I think this type of testing has huge potential for companies, and I agree businesses are not doing as much of it as they could, largely, in my experience, because they don’t know what they should be testing. Once again, this is where advanced analytics, and especially segmentation and survey analysis, can come in very handy. Taking the time up front to do a deep-dive analysis of what different segments of customers seem to be motivated by today (perhaps even incorporating results from prior tests, or from a series of tests run as part of the analysis just to gain more data) can produce a robust “creative brief” the design and copy teams can use to offer up challenger creative and copy for the testing team. That challenger content can be targeted to specific segments, or even varied by time of day or day of week based on the analysis. Can you do “pure creative” testing without the analysis? Sure. Will you understand the results? Unlikely. And if you don’t understand the win, it’s pretty difficult to replicate it and build on it with iterative testing. Using the analysis to inform the design, and then setting up the challengers so that each recipe provides a specific insight, is the foundation of a continuous optimization effort.
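As a rough illustration of the kind of read-out I mean, here is a minimal sketch that breaks test results down by segment and recipe, so a win can be tied back to the specific hypothesis each challenger was built to answer. It assumes a hypothetical visitor-level log with segment, recipe, and converted (0/1) columns; the names are illustrative only.

```python
# Minimal sketch: conversion rate for each challenger recipe within each
# segment. Assumes a hypothetical visitor-level log with columns:
# segment, recipe, converted (0/1). Names are illustrative.
import pandas as pd

results = pd.read_csv("test_results.csv")

by_segment = (
    results.groupby(["segment", "recipe"])["converted"]
           .agg(visitors="count", conversion_rate="mean")
           .reset_index()
)

print(by_segment.sort_values(["segment", "conversion_rate"],
                             ascending=[True, False]))
```

A recipe that loses overall but wins for one segment is exactly the kind of insight that feeds the next round of targeted creative.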
Finally, I want to respond to your last two questions. I think I answered the first above (Do the best-practice and pure-creative testing strategies make sense? Yes… and no. They certainly can, and often do, work without analytics to drive them, but I argue they would work much better if analytics drove the plan and design). The second, “What is the right balance between them?”, is harder to answer. The balance of test types depends on many factors, including the age of your program, the quality of your site, the goals of your program, and the availability of creative resources. For example, if you have a new testing program, start with the low-hanging fruit and get some big wins under your belt early. Still use analysis to determine which of the low-hanging fruit to hit first, but once you identify where you want to start, use best-practice testing to dive in. If your program has already plucked all that low-hanging fruit, though, best-practice testing is probably not for you. Instead, I would focus on using analysis to determine areas of opportunity you might attack with a pure-creative or personalization strategy, once again using analysis to help plan and support the design. So I guess I would say it’s less a “mix” and more a progression from one into the other. And along the way, the analysis might suggest a completely different type of test that comes from neither the best-practice nor the pure-creative pool: reducing pages in the checkout funnel, adjusting site pathing, testing different navigation tools, pricing elasticity, offer testing, load-time testing, etc.
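The checkout funnel is a good example of how analysis surfaces those “other” tests. Here is a minimal sketch of spotting the step with the biggest drop-off; the step names and counts below are made up for illustration, and in practice they would come from your own funnel report.

```python
# Minimal sketch: step-to-step drop-off in a checkout funnel.
# The step names and visitor counts are made-up illustrative data.
import pandas as pd

funnel = pd.DataFrame({
    "step": ["cart", "shipping", "payment", "review", "confirm"],
    "visitors": [10000, 7200, 5100, 4800, 4300],
})

# Share of visitors who continue from the previous step; the lowest value
# marks where a structural test (fewer steps, new layout, load-time work)
# is most likely to pay off.
funnel["continuation_rate"] = funnel["visitors"] / funnel["visitors"].shift(1)
print(funnel)
print("Largest drop-off entering step:",
      funnel.loc[funnel["continuation_rate"].idxmin(), "step"])
```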
In short (too late!): analytics-driven testing is simply better, regardless of the specific “type” of testing used under that umbrella. Analytics will help you focus your efforts, design the tests, understand the results, and move forward with continuous optimization. Without it, you will keep throwing darts blindfolded. Sometimes you’ll get lucky, but over time running a program with the blindfold on only gets harder, almost as if someone is spinning you around after each test. Analysis can pull off the blindfold and reduce the dizziness, helping you improve your aim and hit the bull’s-eye!