I started off this year with a Luther-like epistle on “Measurement that Matters.” It was a small (much smaller than Luther’s) set of theses about how enterprise measurement can be made to matter. The projects I described were chosen because they force, or are strongly tied to, actual use of data in the business. Why? Because far too often measurement is something that the enterprise DOES but doesn’t USE.
In upcoming posts, I’m going to tackle those projects in more detail and show why I think they are powerful, how they can be done, and why, when put together, they create a transformative measurement program. But before all that, I wanted to take a quick look at the opposite end of the equation – why enterprise measurement is so often divorced from change or action.
Setting Yourself Up for Failure
In a large enterprise, no single factor may be more important to your long-term success or failure than how you’re organized. Madison’s famous dictum, “If men were angels, no government would be necessary,” is apropos in the private sector as well. Good people will often transcend organization. If you happen to have a great measurement lead, you may have success no matter how poorly organized you are. But individual successes are rarely sustained in the large enterprise, if only because success breeds promotion (or, more likely these days, poaching). So over time, it’s hard to sustain a level of human excellence much above average in a large organization. Running a large enterprise is fundamentally an exercise in generating good outcomes with average people – a task utterly different from what’s demanded in small, entrepreneurial settings.
Unfortunately, digital measurement has not been well structured in most large enterprises today. Here are the most common and most serious mistakes in organizing enterprise measurement teams:
Failure to Integrate Measurement into Teams: Good measurement can’t be distant from teams if it’s actually going to drive change. Transformation has to come from within since only hands-on workers have the expertise to take advantage of data. This is particularly true in the digital world where information changes rapidly.
Failure to Isolate Measurement from Teams: This isn’t a misprint and it isn’t a contradiction. Measurement has both an internal and an external function. The internal function has to support day-to-day operational change and customer conversations. The external function has to provide senior managers the ability to channel resources appropriately and make broader organizational investment decisions. These are two completely different functions and they MUST be treated separately if your organization is to work. Centralize both functions and no one will use information. Distribute both functions, and you might get measurement use, but you’ll NEVER know what’s effective or where to invest.
Think about it this way: decentralize and embed analytics to drive customer conversations or process; centralize and make analytics independent to drive external reporting, audit, and investment functions. Nearly every enterprise I know blends these two functions.
It’s a big mistake.
Here’s a related organizational issue that can just about cripple the entire digital measurement effort: separating testing from analytics. About half the enterprises I see have completely isolated site testing from site measurement. Most (but not all) of the time, these enterprises have a vendor managing their testing. This makes my head spin. What exactly is your analytics team for if not to create test plans for the Website? Talk about measurement without use! If your site tests aren’t being driven by your measurement team, something has to change. If your internal measurement team can’t think up good, data-driven tests, find a new measurement team. If they can, and you won’t let them, re-organize your testing.
I can’t pass on to the next topic without mentioning the single worst mistake you can make as an enterprise – letting your creative agencies self-measure.
Not only are most digital agencies pitifully bad at measurement and analytics, they have a strong self-interested bias toward producing “win”-oriented measurement. If you think you can optimize your business spend by asking people to self-assess and then paying them based on their own assessment, by all means have your agency measure the channel they own. Paying your digital agency for measurement is like funding their marketing department. Stop the madness.
Hiring All the Wrong People
If your digital measurement department is filled with…well…digital measurement people, then you’ve probably missed the boat. There’s tremendous demand for analysts skilled in tools like SiteCatalyst. There’s nothing wrong with that. It’s a good and useful skill. On the other hand, these folks are too expensive right now AND too limited to be a complete solution. Useful digital measurement requires more. In fact, it requires a blend of at least four different functions: analysts skilled at data manipulation, traditional customer analytics, and targeting (usually in SAS); analysts focused on information formatting and delivery (Excel, Tableau, etc.); analysts focused on Voice of Customer and traditional customer research; and analysts focused on digital/site analysis. If you’re missing any of these skills or you've segregated these teams, then your team or team structure is probably a limiting factor on success. The very best departments I see are thriving because of the creative blend of traditional customer analytics folks with digital marketers. If enterprises were just a little more aggressive in blending in opinion research folks, they might really have something special.
Skipping the Foundation
Measurement is all about use. So the drive to create product is natural and I’m sympathetic to it. But there’s democracy and then there’s anarchy. The difference between the American Revolution and the Arab Spring is all about having the right foundation for change. Before you start building reports and democratizing data, there’s a foundational exercise you simply MUST do.
The single biggest challenge in digital reporting is that your marketing stakeholders almost certainly don’t have a strong understanding of how to measure success in the digital channel. In some cases (like purchasing products) it’s fairly straightforward. But comprehensive digital measurement for the large enterprise will demand success measurement across a wide variety of digital touch-points and functions: from ecommerce to branding to customer support. You can’t leave decisions about how to do this measurement up to individual managers or business units. If you do, you’ll get metrics that simply don’t capture real business success, or capture only some piece of it.
From page views to time-on-site to site-wide satisfaction or NPS, the history of digital measurement is littered with failed paradigms. Don’t be a measurement Marxist! Failure here has deep ramifications for the whole of your measurement and marketing effort. Bad success metrics mean bad optimization. It’s that simple. Getting this right is a demanding exercise that requires business acumen, deep understanding of the digital channel, good methodologies, and quite a bit of tool knowledge. It also takes a significant up-front effort before you start churning out reports. Do it.
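To make that foundational exercise a little more concrete, here is a minimal sketch of what one output might look like: a single, centrally owned registry that maps each digital touch-point to its agreed success metrics. This is purely illustrative - every touch-point name and metric below is a hypothetical placeholder, not a prescription from this post - but it captures the design point: definitions live in one place, and a touch-point with no agreed metric fails loudly instead of being improvised by a business unit.

```python
# Hypothetical sketch of a centrally owned success-metric registry.
# All touch-point names and metric names are illustrative placeholders.

SUCCESS_METRICS = {
    "ecommerce": {
        "primary": "order_conversion_rate",
        "secondary": ["average_order_value", "cart_abandonment_rate"],
    },
    "customer_support": {
        "primary": "successful_self_service_rate",
        "secondary": ["call_deflection_estimate", "support_satisfaction"],
    },
    "branding": {
        "primary": "engaged_content_consumption",
        "secondary": ["return_visit_rate", "brand_recall_lift"],
    },
}


def success_metrics_for(touchpoint: str) -> dict:
    """Return the agreed success metrics for a touch-point, or fail loudly.

    Failing loudly is deliberate: a touch-point with no agreed success
    metric is exactly the gap the foundational exercise is meant to close.
    """
    try:
        return SUCCESS_METRICS[touchpoint]
    except KeyError:
        raise KeyError(
            f"No agreed success metrics for '{touchpoint}'. "
            "Define them centrally before reporting on it."
        )


if __name__ == "__main__":
    # Any report or test plan pulls its definition of success from here,
    # rather than each business unit inventing its own.
    print(success_metrics_for("ecommerce")["primary"])
```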
Accepting the Status Quo
If you’re a senior manager in digital or a C-level wondering why your company doesn’t seem to be winning the data revolution, here’s one very good place to start: a mirror. I’m not sympathetic to the tendency among analysts to disparage real decision-makers. I loathe the term HIPPO (derogatory slang in our industry for Highest Paid Person’s Opinion). Making decisions with data is far harder than people think.
But here’s my advice to senior folks frustrated with the (f)utility of their enterprise measurement. Don’t accept the status quo. Don’t accept measurement that siloes everything. Don’t accept reports that tell you what’s happened but provide no insight into why or what to do. Don’t accept analytics that are divorced from every possible channel of action. Don’t be intimidated by the buzzwords, the data-blizzard, the fancy reports, or the digital slang. If it’s not helping to make decisions, it isn’t any good.
Far too often, decision makers fail analytics by failing to demand enough.
So there it is - a whirlwind tour of the biggest failure points in digital measurement, with a few side notes from Luther, Madison and Marx to keep you on your toes. I’m sure that’s enough philosophy to last a good long while – next up, a far more practical and prosaic deep-dive into the six projects that make up my measurement manifesto.
Great post, Gary. I feel that organizational challenges and best practices within web analytics are topics not discussed enough.
I recognize your points on centralised/de-centralised settings, but two organizational questions I'm currently giving much thought to are:
1. Web analytics in relation to Business Intelligence (BI). I guess most companies have started with web analytics separate from the BI team. But when your digital distribution has grown from 10% to 70%, web analytics suddenly is essential business intelligence and you find yourself wanting to analyze data sets from both sources. So how do you organize the two teams and technologies (merge teams or not) to maximise insights and operational efficiency?
2. Centralised web analytics in relation to the more technical work of providing the web analytics platform (i.e. implementation, configuration, data quality, support, etc.). I think of these as two different roles: analyst vs. system developer. Initially, one or two centralised web analysts usually take on both roles. But when more (decentralised) analysts join and the requirements on the platform increase, the original analysts see the developer role grow at the expense of analysis. Adding more resources to the centralised team raises the question: should every team member have both roles, or should we split the team into analysts and developers? Should the developers join the DW/BI team mentioned earlier and let the centralised analysts constitute their own team focused on analysis? What are the pros/cons and best practices?
I like how you divide web analytics skills into four different functions; web analytics is truly multi-faceted. I have made a similar categorisation by dividing web analytics into four perspectives, each with its own challenges and tools: Customer, Product, Marketing and Distribution (i.e. site or app). For example, conversion rate means totally different things if you look at it from the different perspectives. Is it the conversion rate of a specific customer segment, a product or product category, a campaign, or a buying journey on your website?
Posted by: Robert Sahlin | February 19, 2013 at 12:27 AM
As always on your blog, a post which I truly enjoyed. It coincides with the restructuring of our e-commerce team of 10 people. We used to have a central e-commerce analyst who supplied reports to all business seniors and supported them in the interpretation and subsequent decision making processes. This e-commerce analyst was me.
Last week, I told my boss (the e-commerce director) that I did not want to go on like that. I felt that much of my work was not being used in a really valuable way: some people seemed not to thoroughly read the stuff I provided them, while others obviously felt that rather than supporting them I was interfering with their business. So last week we agreed on moving the monitoring of high-level metrics / business targets to the director level (I wonder if he will have the time for that, though) and moving the measurement of lower-level metrics to the people who actually make use of the data. That way, I hope the overall measurement effort will be more efficient, since everybody will (hopefully) measure the most relevant metrics - the ones that actually help them achieve their personal targets. And if somebody thinks time is better invested in, e.g., the acquisition of new partners than in partner performance measurement - fine. Why tell people what and how much they need to measure in order to be successful? And moreover, how could a central analyst like me ever give useful recommendations to an SEO expert on which keywords to optimise for? This guy has a range of analytics tools which I don't even understand. So let's bring some measurement knowledge to business people rather than bringing business knowledge from really different disciplines (category management, performance marketing, CRM, SEO, CRO, etc.) to a geeky analyst.
Anyway, to be honest, I still have some mixed feelings: Will everybody have (or achieve) the analytical expertise to optimise their own business? Will they be able to invest the time it really takes? Will they finally adopt a data-driven mindset? Let's see.
Posted by: Michael Ellensohn | February 19, 2013 at 01:43 PM
Michael,
Thanks for the great comment. I think you've hit on what really is one of the central challenges in analytics from an organizational standpoint. We've often been in the same position when it comes to annotation and reporting - it's really tough. I don't think there is one right answer and, as I tried to get at, I don't think every Web analytics problem should get the same centralization/decentralization treatment.
Gary
Posted by: Gary Angel | February 20, 2013 at 12:57 PM
Robert,
Great comment - Thanks!
Regarding your points, I'm very much in favor of blending BI resources and digital resources. But that doesn't always imply blending the organizations. I'm not sure I have a one-size-fits-all recommendation, except that some level of resource-type blending is absolutely critical to effective digital analytics.
In most cases, I think I'm in favor of merging those teams. But perhaps even more important to me would be ensuring that the digital team has some BI expertise working inside it.
Your point/question on technical team roles - particularly in a de-centralized environment - probably deserves a full discussion. Though I'm a generalist at heart, I find that most large-enterprise roles need to be fairly specialized (going back, I suppose, to the point about how different big-company and entrepreneurial organizations and resources need to be). That means I'm generally in favor of not trying to have folks split roles - especially across technical and non-technical boundaries. People who can do both are much harder to resource, though when you can get those folks they can certainly make life a lot easier. Usually, that means that de-centralized analysts need to interface with centralized analysts who have direct relationships with the technical folks, or directly with the technical teams.
Anyway, your questions certainly deserve deeper thought and I'll see if I can think more about them!
Gary
Posted by: Gary Angel | February 20, 2013 at 01:09 PM
Thanks Gary,
I really enjoyed this post and completely agree with the idea of having both centralized and decentralized analytics. In my experience, you often hear people talking about making a choice between the two options, but in reality that approach doesn't work in large organizations. You need a mix of the two.
The centralized team is needed to configure the reports, define metrics and set standards for the organization. The centralized team should be the authoritative voice for analytics and insight within your company.
When you get down to the granular detail of specific areas of the business, you really need the people who are living and breathing that project or area to be using the data. This is really important, as the centralized analytics team will never know everything about every area of the business, and if you don't empower people to self-serve with the data, your efforts will likely be seen as interference.
Thanks again for a great post Gary.
Best regards,
Billy Dixon
Posted by: AppliedWA | March 09, 2013 at 01:07 AM