Thursday morning I spent an hour talking “live” with Jim Sterne as part of a new DAA program called “Thought Leader Conversations”. It was cool. The conversations are a more relaxed, unfocused, genuinely enjoyable (at least for the participants) kind of public experience than I’m used to. It’s really nothing more than an extended conversation / Q&A – but Jim is a master at this kind of thing, and non-panel Q&A is something I’ve always liked better than speaking.
We talked about everything from my early history in credit card analytics (and why I think that data was much easier to work with than digital data) to how our practice in digital analytics has evolved in the past five years to fairly detailed discussions of stuff like Functionalism and Opinion Research. If you were listening, I’d love to hear feedback on what you thought – and if you missed it, well, I think there will be a recording on the DAA site soon. I’m pretty sure it’s listenable and at least somewhat informative.
Of course, one of the charms and one of the challenges of something like this is that it was completely (and I mean completely) unscripted. It’s no exaggeration to say that Jim and I spent about five minutes together preparing for this, and 4½ of those minutes were probably idle chit-chat. Now even though this is stuff I talk about all the time, if you’re answering questions off-the-cuff for 60 minutes, some of those answers are going to come out a bit less than fully baked. So it’s probably no surprise that immediately after the session I logged a new Task note into Outlook – “Blog on Governance.”
Governance and analytics organization aren't my core focus, but they are things I’ve grown more aware of and, I think, more sophisticated about in the last few years. When Jim asked about governance for digital analytics (I’m pretty sure it was a question from a listener), I started with a confession that I’m probably more of a law-breaker than a law-maker when it comes to enterprise organization. My bias has always been to do analytics, not to preach about how it can be done. I also noted how huge a topic this is – with giant fields-of-play around infrastructure, data security and MDM, teams, processes for collaboration, and place in the organization. Then, for the meat of my answer, I chose to focus on two mistakes I see many of our enterprise clients making when it comes to the way they run analytics.
Mistake number one is to focus on analytics as a capability. Like body-builders on Muscle Beach, many organizations are totally focused on creating a powerful analytics capability and put little or no thought into what that capability is for. Creating muscles for muscles’ sake is every bit as vapid in the enterprise as it is on the beach. It’s fun to gawk at a 200-node Hadoop system (well, fun for me), but just like those muscle-bound beach guys wouldn’t last five minutes on a football field, many of the most powerful enterprise analytics systems are devoid of purpose and drive zero business value. If you’re going to do analytics well, you need to start with business problems and you need to create a real plan for how you’re going to use analytics. Not a plan about resources or processes or collaboration. What you need is a plan that says we are going to study problem X, answer question Y, and develop model Z because that’s what the business needs to be successful.
Mistake number two is to divorce analytics from operations. This isn’t just about working to operationalize analytics. It’s about making operations a key part of every phase of the analytics process. When you evaluate analytics projects for that plan I talked about above, one of your key criteria should be the degree to which you’ll be able to operationalize the results. When you construct models, it’s important to focus on data that will be available and affordable when you operationalize the results. And when it comes to putting together the digital organization, you need to marshal operational resources and make sure they’re tied into the output from your analysis. Enterprises routinely ignore all three of these steps, with the result that a significant percentage of analytic findings are wasted.
There’s nothing wrong with either of these answers (though they are probably better here with a few days to digest the question and many more minutes to massage the message) and they are both, to my mind, important aspects of analytics organization and governance that get missed because most of the people talking about how to create analytics never actually do it. The failure to plan analytics, in particular, is missed by most digital analytics strategists because they simply don’t understand how analysis actually gets done.
So why did I make an immediate note to re-visit governance and organization in my blog?
Mainly because there are two other topics I wish I’d discussed in addition to analytics planning and operationalization: the re-organization of testing and analytics, and the integration of analytics teams. Both of these are high-profile, high-controversy recommendations that are important and politically difficult.
In the recent blog-conversation with Kelly Wortham, we both focused a LOT on the importance of a tight relationship between analytics and testing. That relationship simply doesn’t exist in most enterprises – either in fact or in organization. And, of course, the two are related. When you create an optimization and testing group that is wholly separate from analytics, you’ve made it far, far less likely that either will be successful. You’ve missed the obvious fact that the point of analytics is to drive testing, and that testing should be data-driven, not the same old subjective throw-stuff-against-the-wall approach we’ve been using for years.
Organization reflects attitudes and drives behavior. When you separate Optimization and Analytics groups, you’ve both revealed and encouraged a mindset that makes both disciplines worse. Analytics is cheapened because it loses focus on operationalizing change. Testing is impaired because it’s reduced to objectively measuring random, wholly subjective ideas. Everybody loses.
Integration of testing/CRO and analytics is the single most important organizational change I usually end up recommending when we do strategic plans for our clients. Save yourself some money and do it without paying for a big strategic exercise.
Which brings me to the integration of analytics teams. I’m a firm believer in de-siloing expertise, not just data. Every organization I know of is working hard to create central data repositories that de-silo data and make it available for cross-functional analysis. Sad to say, these efforts are largely doomed to failure. They’ll fail not because the technology isn’t good or the analysts aren’t competent but because working with cross-domain data is hard.
In a recent post on data democratization, I pointed out how challenging it is to use digital data if you’ve never tried to analyze it before. Digital data is filled with stuff that appears to be one thing but is actually something slightly different. Unless you understand the technical definitions of things like visitor, session, avg. page time, referring site or geo-location, you’ll almost certainly misinterpret and misuse the data. And if you’re relying on machine-learning techniques, you can remove the “almost” from that last sentence.
Here’s the thing: digital data isn’t unique. Every data source in the enterprise is the same. So when you de-silo data, you give access to all sorts of incredibly complicated data stores to teams that have no expertise in them and no experience in using them properly. If you want a prescription for failed analytics projects, you have it right there.
This isn’t just a big data problem. Does it really make any sense at all for enterprises to segregate people studying nearly identical problems because of the research methods they use? In more than half the organizations we work with, I’d say that UX and Usability, Voice of Customer, Social Media analytics, and Digital analytics are separate groups that have virtually no communication and NEVER share projects. That’s just wrong. As my daughter would say, “Listen peeps, you’re all studying the same thing.”
I’ve also seen huge reluctance to combine traditional customer analytics teams with digital folks – and huge benefits when it’s done. These groups work in different tool sets, study somewhat different problems, and often have very different cultures, so I get the reluctance. But in today’s world, where digital data is a key component of customer analysis, they need each other.
There are probably fifty other things about governance and organization I should talk about. Some I might even know something about. But I wanted to add these two critical points around organization, straight from the front lines of digital analytics, to my original answer. If you have a robust and clear analytics plan, pay close attention to operationalizing analytics, integrate your optimization/CRO with your digital analytics folks, and build cross-discipline analytics teams, you’ve got a much better chance than average to succeed with digital analytics.
If you'd like to check out the series, you can register for the DAA's next conversation here - Jim will be talking with Pelin Thorogood from Anametrix (now part of Ensighten). Pelin is great and hearing more about that acquisition should be fascinating stuff indeed!