The first post in this series discussed two fundamental principles that underlie the process of Web analytics: the assumption of intentionality and an understanding of the “natural structure” of the Web site. The assumption of intentionality is simply the claim that visitors’ navigational behavior is indicative of their interests. The claim about the “natural structure” of the Web site is that the site’s design and creative alter, channel, and constrain visitor behavior, limiting the conclusions we can draw about intentionality. The inherent tension between these two principles creates much of the complexity in Web analytics, and it is this tension that makes the brute-force application of statistical techniques like correlation unsuccessful.
In my last post, I took a similar look at the traditional practice of Targeted Marketing: a practice with a long, proven record of success, but one that has unfortunately been tied to consumer channels that are in the process of dying, making it ever less relevant to today’s marketing.
The core Targeted Marketing process involves a fair number of steps, but two of them embody the key “magic”:
2. Enrich your knowledge of that population (if necessary) with data that can be tied to targetable properties such as attitudes, needs or desires (Name Lookup, Household data, Census Data)
…
5. Create a link between the targetable properties, the behaviors you want to incent, and the messages most relevant to each combination of target group and desired behavior (Survey data in our case – most often with additional behavioral data in commerce targeting)
In Step #2 of the process, we accumulate potentially significant data about an individual. In Step #5, we create a connection between the available data (targetable properties), the outcomes to incent, and the key attitudes that drive toward the outcomes.
It should be fairly obvious that the Web analytics “assumption of intentionality” is directly related to Step #2. If Web behaviors are indicative of interests, then these Web behaviors are targetable data. They become “facts” about individuals that could be integrated into a marketing process.
What’s more, Web analytics techniques are well suited to achieving at least part of Step #5. The proper application of Web analytics techniques is designed to link the targetable properties (navigational behavior) with key outcomes (purchase, registration, viewership, etc.).
However, Step #5 of traditional Targeted Marketing also links targetable properties to the attitudes that drive significant outcomes. In the political world, we used opinion research to create both the link to outcomes and the link to attitudes. In the Web analytics world, we can often use behavioral data to link targetable properties to outcomes. Not so when it comes to attitudes. Is there a way to link online behavior to attitudes that might help target campaigns and creative toward the desired outcome?
There are actually three different methods for doing this – all potentially valuable:
- Online Survey Research: Identical to our technique in Database Marketing – ask a sample of the population and correlate attitudes to targetable properties
- Inference from Meta-Data about Content: Use information about what the content is to infer visitor attitudes. If a visitor looks at pricing first, pricing is a significant concern (see the sketch after this list)
- Testing: Generate multiple content alternatives and test to identify the messaging (and, by implication, the attitudes) best linked to conversion.
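To make the second method concrete, here is a minimal sketch of what meta-data inference might look like. The page-to-topic map, the page paths, and the scoring rule are all hypothetical; in a real implementation the topics would come from your CMS or content tagging and the pageview stream from your analytics tool.

```python
# A minimal sketch of inferring a visitor concern from content meta-data.
# The page-to-topic map and the pageview records are hypothetical.

from collections import Counter

# Hypothetical meta-data: which concern each page speaks to
PAGE_TOPICS = {
    "/pricing": "price_sensitivity",
    "/features/security": "security_concern",
    "/support/contact": "service_concern",
    "/case-studies/acme": "proof_seeking",
}

def infer_primary_concern(pageviews):
    """Guess a visitor's dominant concern from an ordered list of pages viewed.

    Weights the first page most heavily, on the theory that what a visitor
    looks at first signals what matters most to them.
    """
    scores = Counter()
    for position, page in enumerate(pageviews):
        topic = PAGE_TOPICS.get(page)
        if topic is None:
            continue  # untagged page: no attitudinal signal
        scores[topic] += 2.0 if position == 0 else 1.0
    return scores.most_common(1)[0][0] if scores else None

# Example: a visitor who hits pricing first is flagged as price-sensitive
print(infer_primary_concern(["/pricing", "/features/security", "/pricing"]))
```

The weighting choice (first page counts double) is only one way to encode the intuition; the point is that content meta-data lets you translate navigation into a provisional attitude without ever asking the visitor a question.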
What’s notably missing from the Targeted Marketing process is any equivalent to the limitations that “natural structure” imposes on Web analytics. Targeted Marketing techniques have no similar constraint, and that’s a big part of why Targeted Marketing folks have struggled with Web analytics data. They have no experience with the tools and techniques necessary to control for the impact of the natural structure of the Web site on the use of the targetable data, and they tend to miss or misunderstand the problem.
Compounding this problem, Targeted Marketing folks have struggled to use any of the three methods for linking Web targetable data to key attitudes. This is less surprising for the second two methods than for the first; after all, Survey Research is a key technique in traditional Targeted Marketing for driving exactly this type of link.
Unfortunately, most sites have failed to create the necessary integration to make this linkage happen in the online world. In traditional targeted marketing, the link between survey data and the database is typically driven by demographics. The customer database contains key demographic facts like age, gender, estimated income, geo-location, and more. In the opinion research, those same facts are collected and tie directly to attitudes and outcomes. The link is obvious and allows conclusions from the Survey data to be immediately used for targeting against customer data.
It doesn’t work that way in the online world, because the targetable facts (Web behaviors) aren’t tied to any demographics. To use Survey data, you first need to integrate it into the Web behavioral data (you might want to check out my Whitepaper on this) and then correlate survey attitudes to the targetable facts. Since most sites haven’t done this, there’s no traditional and obvious way to link targetable online behavior to key attitudes – a major stumbling block when it comes to digital marketing.
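For readers who want to picture that integration, here is a rough sketch, assuming each survey respondent can be keyed back to a visitor ID (for example, via an intercept survey that records the analytics cookie). The column names, the attitude scale, and the toy data are purely illustrative, not from the post or the Whitepaper.

```python
# A rough sketch of integrating survey attitudes with Web behavioral data
# and correlating the two. All identifiers and columns are illustrative.

import pandas as pd

# Survey data: one row per respondent, attitude scored 1-5
survey = pd.DataFrame({
    "visitor_id": ["a", "b", "c", "d"],
    "price_concern": [5, 2, 4, 1],
})

# Behavioral (targetable) facts summarized per visitor from web analytics
behavior = pd.DataFrame({
    "visitor_id": ["a", "b", "c", "d"],
    "viewed_pricing": [1, 0, 1, 0],
    "downloaded_whitepaper": [0, 1, 0, 1],
})

# Step 1: integrate the survey data into the Web behavioral data
merged = survey.merge(behavior, on="visitor_id")

# Step 2: correlate survey attitudes with the targetable behaviors
print(merged[["price_concern", "viewed_pricing", "downloaded_whitepaper"]].corr())
```

The resulting correlation matrix plays the same role that the demographic join plays in traditional Targeted Marketing: it gives you a defensible link from targetable facts to attitudes.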
In these past few posts, I’ve tried hard to show that Web analytics data and techniques CAN be used to drive targeted, personalized marketing efforts. I’ve also tried hard to develop what is sometimes called a failure or error theory: an explanation of why very few organizations have been able to do so. It all boils down to this: Web data is intentional; it can be used to create targetable data based on Web behavior, and that data can be linked to both outcomes and attitudes. These are the two essential linkages necessary to create effective targeted campaigns.
So why hasn’t it been widely and successfully done?
First, the people who currently use and are expert in Web analytics data are – almost without exception – entirely ignorant of basic targeted marketing. When you hire experts and consultants whose background is in operational BI (best case) or Omniture tagging and reporting (worst case), why should you be surprised that they know next to nothing about targeted marketing? Asking people whose professional lives have been consumed by the Web analytics process, by setting up tags, and by creating dashboards to suddenly build targeted marketing campaigns is highly unlikely to be effective.
From the other side of the equation, experts in Targeted Marketing face two significant difficulties in bridging their discipline to online data. First, the very fact of a “natural structure” to the Web site limits the interpretation of Web behavioral data and makes the brute-force application of standard statistical techniques unproductive. Second, the linkage of Web behavioral data to key attitudes doesn’t work the same way in digital as it does in traditional marketing. It’s harder, and one of the key integrations, survey to Web analytics data, is often missing.
In my next posts, I’m going to expand on how we at Semphonic solve these two difficulties and have built up a powerful set of techniques for bridging the gap between Web analytics and Targeted Marketing. These techniques – from our Two-Tiered Segmentation and Behavioral Audience Segmentation, to survey and behavioral integration, to site structure mapping, to Functional and Behavioral Use-Case analysis – all tie together to create the foundation of a fundamentally different and better kind of Digital Analytics Marketing practice.
Gary, this is a great post. Resonates with me quite a bit - my team just finalized an integration between a Comscore survey of attitudes and the Webtrends data for the surveyed site. We are using the resulting integrated data to really help us understand the site experiences driving positive attitudes. For each person who responds to the survey, we capture every action they took on the site (every page viewed, every document downloaded, every success event triggered). We then marry that to their survey responses and verbatims. What comes out of that is a model that we are using to help optimize the site experience. Behaviors that are correlated strongly to positive survey responses are getting a higher priority placement and those that are correlated to negative survey responses are being removed or lowered. Granted we are still having to deal with a survey bias, but when it comes down to it, this is some of the most actionable data we've been able to provide to this client.
Posted by: Jason Widup | February 15, 2011 at 01:41 PM
Jason,
It's funny, because I've gone from being a skeptic about the integration of VoC and Web Analytics data to a huge fan. Not only is that combination analytically rich - which your experience definitely speaks to, but it turns out to be an essential bridge between targeting and attitudes. When we do classic Web analytics, we don't necessarily need to build that bridge. But when you try to create personalization or testing strategies, understanding the attitudes that drive or could drive behavior becomes essential.
Thanks for the great comment!
Posted by: Gary Angel | February 15, 2011 at 02:54 PM
Thank you, Gary! You explain a very(!) important point of potential failure.
I believe it is even more essential to understand this problem when running a business that doesn't happen entirely online, and even more so for non-sales websites (like some B2B businesses).
In my experience, these are the typical cases where targeted marketing is well developed and mature, but often also disconnected from any Web analytics methodology (in which case WA may be treated as a reporting job placed in IT).
I am looking forward to the next article, as always.
For today I am more than satisfied ;-)
Posted by: Matthias | February 17, 2011 at 07:22 AM