Before I go there, however, here's my oft-promised recap of last year's writing. There were times (particularly in the mid-fall, during a period I jokingly called webinarmageddon) when I had so much content coming out that I'm not sure I could give any of it proper marketing attention. So here are some of the highlights: the pieces I think came out best.
In my Measurement that Matters post, one of my top six things to focus on (Theme #6) was revamping your online customer survey program. I've been pushing this hard with our clients and I'm finally starting to see some real traction. Right now, most enterprises measure the wrong stuff, use what they have improperly, and disseminate almost none of it well. I hope to have some great case studies on how to do this right as the year passes. It's the single biggest opportunity in digital measurement today. It's also a topic I started out with last year, focusing on some of the worst aspects of existing online survey research practice: benchmarking, site-wide satisfaction, and Net Promoter scores as Key Performance Indicators. These are all disastrously bad KPIs. I may still be a voice in the wilderness on this one, but if, like most people, you're still using these site-wide measures, I strongly recommend this little series:
Benchmarking and Opinion Research: Deeply unsound as generally practiced.
Site-wide Satisfaction: Wrong, wrong, wrong.
Net Promoter: No better than the above - maybe worse.
Social Media and opinion research are, increasingly, part of the same customer attitudes research program. So it isn't surprising that I followed up my posts on opinion research with a series on Social Media. I'm not going to reference any of those posts here, however, because they are summed up more compactly in the white paper we released recently. It's a nice summary of how to think effectively about Social Media and Social Media measurement. The white paper not only covers the stuff you'd expect (segmentation, taxonomy, sampling, technology), it also delves into broader issues, including the organization of social media efforts in the enterprise.
Then there was "big data." Yes, I know the hype is insane and makes many a sound person doubt whether big data is anything real. I don't blame the skeptics. But here at Semphonic we've laid out our own theory of big data: what it means, how it's truly different (and how it isn't), why it's important, and how to tackle it. Probably more than half of what I wrote in 2012 revolved around this theme. Here are some of the highlights.
Creating the Right Digital Measurement Infrastructure: A look at sourcing data into the warehouse and the emerging necessity for a real-time technology stack. This is a theme I expect to revisit significantly in 2013 (witness last week's post on DataCloud) because of my Measurement that Matters focus on personalization (Theme #3). With all the attention on Tag Management right now, this white paper is as timely as ever, though there are new competitors in the space.
Tracking the Customer Journey: One of my favorite white papers of the year, it lays out the role of segmentation in building a customer model and presages later work with IBM on the challenges of creating an integrated customer view. Everyone wants this. Everyone gets it wrong. Here's why.
The IBM Webinar on Choosing a Big Data Technology Stack: There's an accompanying white paper that goes into more detail, but this webinar was one of my all-time favorites. Krishnan is a pro's pro, and doing the webinar with him was a delight. It's a great summary of the white paper without all that pesky reading, and a good explanation of what we (Semphonic) mean when we talk about Big Data and why it really is more than hype.
My three favorite analytics pieces from the year dealt with wildly divergent topics. For a deep-dive into Merchandising Analytics, check out the webinar I did with Cloudmeter. It’s a really nice introduction to the white paper – an examination of how to optimize merchandising on multi-product pages. This is an important and unreasonably neglected part of ecommerce analytics. Traditional merchandising strategies fall apart on multi-product pages where merchandising levers tend to shift the distribution of clicks instead of driving incremental volume. The methods here are designed to help analysts tackle a fascinating and complex set of merchandising problems.
I talked about Site Topology in quite a few different venues, from webinars to conferences to book chapters to white papers. It's a new technique for controlling for the navigational structure of the Website and its impact on statistical analysis. I think it's important work and essential learning for ANY working analyst in the digital realm. Most basic statistical analysis of Web behavior produces blindingly obvious and uninteresting results. The reason? Potentially interesting correlations are overwhelmed by behavior that is largely determined by the navigational structure of the site itself. A vast amount of analytics effort is expended on projects that end up doing nothing but mapping the Website. Site Topology is a method for dealing with this problem in a relatively simple fashion. The webinar with Barry Parshall and Kelly Wortham is a nice if brief introduction (and, fortunately, their parts happen to be excellent as well). I also think you'll find the white paper a surprisingly straightforward read.
Lastly, check out this piece from X Change on analysis methods. In particular, I like the discussion around selecting an analysis project. I meant to write more on this but, in the crush of business, never quite got around to it. There's a lot more to say on this topic, but the piece is a nice summary of a great conversation, ranging from the aforementioned "finding a research topic" to the "role of visualization in data exploration." Bread-and-butter stuff for in-the-trenches folks.
I closed the year with a fairly extended series on creating a digital measurement strategy. This mirrors much of my personal workload at Semphonic these days. I have a number of these projects in hand, and I'm trying to create a comprehensive approach for Semphonic that drives toward measurement transformation, not incremental improvement. It's still a work in progress, but I'm particularly excited about a few key pieces: the assessment framework, the business model, and the drive from measurement framework through to technology stack requirements. What I like about the process is the way it deeply connects the current state to the desired measurement system via a measurement foundation and a data science roadmap that, in turn, drive all the technology stack decisions. I believe it's a different and better approach to strategic consulting. What I love about the process is that it's truly strategic. It isn't a bunch of boilerplate PPT best practices recycled from customer to customer. Since the approach is based on building a model of the business, it's a fundamentally different plan for every client. The methodology is the same, but the output is entirely different and custom. Surely that's right for anything that goes by the name of "strategy."
I'm looking forward to an (at least) equally productive and exciting 2013 - and to creating a whole new perspective on measurement that matters.
And yes, I'm also looking forward to Sunday. Go Niners!