There must be a word for an experience that is simultaneously exhausting and re-energizing, restorative yet draining. A word that captures at least roughly the way I felt this summer, kicking back with a brew after a day of white-water rafting. Empty but elated.
I can't think of the word, so I'll call it "xchanged".
It really was fun.
As usual, I'm going to cover some of the ideas I got and thoughts I had in the next couple of posts. For now, though, I'm just going to take a "short" cruise over the last four days: what happened, what seems important, and what really struck me. If you were there, perhaps it will spark some pleasant reflections of your own, and if you weren't, I hope to throw in enough useful information to make the journey worthwhile.
X Change started for me on Monday with our Think Tank Training day.
Here's a true story...On Sunday night I remembered that last year, after teaching three 2-hour classes in a single day, I'd decided it was too much and swore that no one would do more than two the next year. Wish I'd thought of that a little earlier.
There's no better way to start a day of teaching than by remembering how stupid you are.
I started with a class covering the creation of an online data model; it's pretty much the same stuff I've been blogging about this year, but this was the first time I've done the class. It went pretty well. I'm going to tune the class a little bit, but on the whole I think it would pair beautifully with the class I gave last year on choosing a warehousing technology.
This year, however, I followed that up with a totally new class on creating a Customer Intelligence System - a complete integration of all Voice of Customer and Customer feedback systems. The idea is to create the kind of consistent QA, data classification, and reporting for verbatims (everything from Social Media, to online feedback and reviews, to traditional survey research and even call center) that we enjoy in our traditional structured data warehouses. Systems like this are the single biggest data opportunity in today's enterprise and I covered not only the reasons why but the core components of a complete CIS.
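To give a flavor of what I mean by consistency, here's a minimal sketch (my own illustration, not a prescribed design) of the kind of unified verbatim record a CIS might normalize every feedback source into before classification and reporting. The field names are assumptions, purely for illustration:

```python
# A minimal sketch of a unified "verbatim" record for a Customer Intelligence
# System - every source (social, site feedback, reviews, surveys, call center)
# gets normalized into the same shape before classification and reporting.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Verbatim:
    source: str                  # e.g. "social", "site_feedback", "review", "survey", "call_center"
    captured_at: datetime        # when the comment was collected
    customer_id: Optional[str]   # joins back to the structured warehouse when known
    text: str                    # the raw verbatim itself
    topics: list = field(default_factory=list)   # classification output, e.g. ["billing", "churn_risk"]
    sentiment: Optional[float] = None            # -1.0 to 1.0, filled in by a scoring step

def normalize_survey_row(row: dict) -> Verbatim:
    """Map one hypothetical survey-export row into the common schema."""
    return Verbatim(
        source="survey",
        captured_at=datetime.fromisoformat(row["response_date"]),
        customer_id=row.get("customer_id"),
        text=row["open_end_comment"],
    )
```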
I closed with the class I did last year (only slightly tweaked) on Use Case Analysis. This was last year's highest rated Think Tank class and I'll be disappointed if it doesn't take top honors again. It's a really good class and a joy to teach. It was also my biggest class of the day - so I'm hoping it's a rare case of reality matching expectation!
Voice a bit strained (no more than 2 classes next year - remember!), we all headed down to the hotel's dock and onto our ship. What could be better than a great conversation on Mobile Analytics with Stephen Robinson of AutoTrader as we watched the sky slowly darken and the city lights slowly emerge, felt the air slowly cool, and heard our champagne slowly gurgle away?
We had dinner (and great conversation though hardly a word about Web analytics) as the boat slowly glided back across the Bay and then trooped over to an open-air wine tasting and chocolate reception. I think it was lovely, but my recollections are a bit hazy...
Tuesday we kicked off with Elea Feit of the Wharton Customer Analytics Initiative. The WCAI is one of the coolest programs I know of. They farm out large enterprise problems in customer analytics to teams of academic researchers - who will take 9 months to a year to come up with deeply researched, academic-quality answers. Fascinating stuff. One attendee told me that they had seen more hard facts in Elea's presentation than in five years of reading Web analytics blogs. Then added with a slight tic - except for your blog of course. Hey, no tic necessary. Our clients don't really appreciate us publishing their findings (and publishing is a requirement of working with the Wharton folks), and we would kill to be able to work on the sort of deep, deep dives they get to tackle.
Not only is Elea's presentation worth checking out, you should also visit and bookmark the WCAI website and download all the research whitepapers that have been and will be published there.
It was the perfect keynote for X Change: thoughtful, serious, advanced and fascinating.
After that, of course, comes a day of Huddles. I started with David McBride of Comcast's "Analytics in the Cloud" session. This was a true learning opportunity for me because Semphonic actually has a direct (not just client) interest in the cloud. We regularly take data feeds for one-off analysis projects and load them onto our machines and into SPSS, SQL-Server or SAS for analysis. We would much prefer to be out of the IT business and in the cloud for all of that activity, but we've been worried about issues like software licensing, cost-control on a project basis, and configurability. Pete O'Leary of Quantivo and Jim Hazen of SAS contributed some great technical insight into the problems and issues, and some of the enterprise experiences were illuminating. David's experience in the cloud has been extremely positive, and while Comcast's costs have grown, they work with significantly larger data sets than we typically have to process. I just don't know of any other place than an X Change Huddle where you can get sophisticated, hands-on advice about things like the real issues of cloud-based deployments. Great session.
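For the curious, here's a minimal sketch of the kind of one-off feed work I'm describing - pulling a hypothetical daily extract into a local SQLite table so it can be queried ad hoc. The file layout and column names are my own assumptions, purely for illustration:

```python
# A minimal sketch of one-off data-feed analysis: load a delimited daily
# extract into a local SQLite table so it can be queried ad hoc.
import csv
import sqlite3

def load_feed(csv_path: str, db_path: str = "feed_analysis.db") -> int:
    """Load a data-feed extract into a local table and return the row count."""
    conn = sqlite3.connect(db_path)
    conn.execute("""
        CREATE TABLE IF NOT EXISTS page_views (
            visitor_id TEXT,
            visit_ts   TEXT,
            page_url   TEXT,
            referrer   TEXT
        )
    """)
    with open(csv_path, newline="") as f:
        rows = [(r["visitor_id"], r["visit_ts"], r["page_url"], r["referrer"])
                for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO page_views VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM page_views").fetchone()[0]
    conn.close()
    return count
```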
Next up was Matthew Fryer of Hotels.com's "Getting the Data to Tell Its Secrets" Huddle, which is certainly more in my wheelhouse. This was my first chance to really talk to Matthew in person and I thoroughly enjoyed the session. I will say, however, that my own focus was probably a little different than most other folks' in the Huddle. I was hoping to focus more on repeatable techniques for analysis - of which there is a paucity in Web analytics. My view is that successful analytics (at least for a consultancy) need to have a high level of repeatable success. Market Basket analysis is a good example: every time you run a Market Basket analysis, the findings are unique but the method is identical, the findings are similar in structure, and the results are nearly always interesting. We need some of that in our discipline and right now, only attribution analysis comes reasonably close (and that, as I complained in my recent Facebook note on ClearSaleing, isn't typically done with Web analytics tools). I'm going to tackle some ideas I have (based on projects we've done) for repeatable analytic solutions in an upcoming post.
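To make the Market Basket example concrete, here's a minimal sketch of the repeatable method itself: given a list of "baskets" (items bought or pages viewed together), compute support, confidence, and lift for each item pair. The sample baskets are made up for illustration:

```python
# A minimal Market Basket sketch: for each pair of items that co-occur in a
# basket, compute support, confidence, and lift. The data below is invented.
from itertools import combinations
from collections import Counter

def market_basket(baskets):
    """Return {(a, b): (support, confidence a->b, lift)} for co-occurring pairs."""
    n = len(baskets)
    item_counts = Counter(item for basket in baskets for item in set(basket))
    pair_counts = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(set(basket)), 2):
            pair_counts[(a, b)] += 1
    results = {}
    for (a, b), c in pair_counts.items():
        support = c / n
        confidence = c / item_counts[a]             # P(b | a)
        lift = confidence / (item_counts[b] / n)    # how much more likely than chance
        results[(a, b)] = (support, confidence, lift)
    return results

baskets = [{"hotel", "flight"}, {"hotel", "car"}, {"hotel", "flight", "car"}, {"flight"}]
for pair, stats in market_basket(baskets).items():
    print(pair, "support=%.2f confidence=%.2f lift=%.2f" % stats)
```

The findings change with every data set, but the structure of the output - pairs ranked by lift - stays the same, which is exactly the repeatability point above.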
Last Huddle of the day was Ian Gruber of Walmart's Huddle on Scorecarding. Both Ian and I lamented afterward that we didn't have a projector to talk through examples; most Huddles don't require that but it would have been nice here. Next year I'll make sure that's possible. Lively discussion here, though. As usual, Bob Page (eBay) is way out in front with his approach. They've scrapped most standardized reporting at eBay (except high-level financials) and have built an in-memory cube that the CFO and team use dynamically in meetings. This certainly addresses one of the big issues in the Huddle, which was how rapidly reports tend to age - and it solves a major BI problem which is the time-lag between information requests and answers. I have two concerns about the approach, however. First, like much of what's done at eBay it requires a very sophisticated management team; that's something not everybody enjoys. I'm also concerned that this approach is better for financial and operational reporting than online marketing and that there are some issues with embedding intelligence into such systems (like seasonality). I've got to think about it some more.
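To illustrate the general idea (a toy example of my own, not a description of eBay's actual system), here's what it looks like to keep granular facts in memory and let the conversation drive the aggregation rather than pre-canning a report. All the column names and numbers are invented:

```python
# A toy in-memory "cube": hold granular facts in memory and aggregate on demand,
# so the meeting drives the slicing instead of a fixed, pre-built report.
import pandas as pd

facts = pd.DataFrame({
    "quarter": ["Q1", "Q1", "Q2", "Q2", "Q2"],
    "region":  ["US", "EU", "US", "EU", "US"],
    "channel": ["search", "email", "search", "search", "email"],
    "revenue": [120.0, 80.0, 150.0, 90.0, 60.0],
})

# Slice on demand: revenue by quarter and region...
print(pd.pivot_table(facts, values="revenue", index="quarter",
                     columns="region", aggfunc="sum"))

# ...then drill into channel when someone asks the follow-up question in the room.
print(pd.pivot_table(facts, values="revenue", index=["quarter", "channel"],
                     columns="region", aggfunc="sum"))
```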
After that, we took a short break (my one chance to go swimming) and then convened outside by the Bay for sangria and our Spanish-themed dinner. Tapas may be a bit of a stretch for even the best hotel, but I was happy enough on a warm night with sangria, a great Flamenco group (that's something I didn't expect to write), an amazing burnt-copper moon hovering over Coronado, and an enjoyable table with Matthias Bettag of Bayer in Germany, Paul Phillips of Causata, and Gary Church of the Allant Group (fellow Indiana guy) talking everything from X Change in Europe (a gleam in my eye) to flag football for boys and how my daughters only seem to like very expensive sports. A few last conversations over a slice of Spanish Apple Pie, heave a long, contented sigh, and head off to bed. No late, late nights for this puppy.
Halfway done and very tired!
Interesting note on eBay putting their data in a readily accessible cube to reduce/eliminate standardized reporting. We've played with this concept and are in the process of developing a framework for our sales data (non-web stuff at this point). Beyond the technological challenges this presents, my concern is the usability of having access to so much granular data. There are two main parts to that concern (at a quick thought at least):
1. User understanding: Most of our executives and managers understand the various data points at a high level, but when you get into specifics, and how they inter-relate, that knowledge level starts to quickly break down. It's common for them to ask me to combine or cross-reference data points in ways which really make no sense or lead to misleading/misunderstood results. The report is often not telling them what they think it is.
2. Fishing expeditions: We all like good news and reports which substantiate what we think. And most of our folks have strong enough integrity to be honest with data, even when the news is not what they want. But there is a certain percentage of people who are all too willing to keep slicing and dicing data until they get a 'good' number. I've had people ask for a report to be tweaked time and time again in what quickly turns out to be a fishing expedition for better numbers.
Ideally there is a balance between standardized reports and the ability to drill down. Different levels for those of various abilities.
Posted by: Cleve Young | September 19, 2011 at 01:30 PM
Cleve,
Great comment - I pretty much agree. I do see advantages - particularly since so much standardized reporting is ignored after the first few iterations. Much, obviously, depends on the willingness and ability of your management team to use data interactively, but I think there are significant advantages and disadvantages each way.
On the whole, this is probably something you couldn't convince a management team to use if they didn't already want it. And if they do want it, they'll probably let you know in no uncertain terms!
Posted by: Gary Angel | September 19, 2011 at 06:36 PM