I spent the whole of last week in the office (our new, improved, larger offices where so far even the air conditioning works!), which was blissful after my mad dash around the country the week before. But before it drops out of my mind, I wanted to jot down some notes from the Red Door Speaker Series panel I did on Content Marketing.
Red Door is a San Diego-based Interactive Presence Management firm. They put on a regular series of speakers and panels – mostly for SoCal companies – across a wide range of online marketing topics. They host them at the rooftop club above their offices, and it is a spectacular venue. The building is about fifteen stories high, right above the open center-field wall of SD’s baseball stadium. The club is obviously set up to host evening events over the games – there is an enormous outdoor patio and one whole wall is made of glass doors. The room itself is lovely and relaxed.

So picture a beautiful San Diego day (what other kind is there?), sweeping views of the ocean, the skyline, and the green fields of the ballpark; picture a whole wall open and a light breeze floating in; is it any wonder the conversation was relaxed and enjoyable? Setting does matter – as I always take note of when planning X Change. And if I could have every room at X Change be just like the one in San Diego, I would.
Joining me on the panel were John Faris from Red Door, Marc Figueroa from Vistage International and Alex Pham from the LA Times – all of us talking about the importance of content, how to develop content networks, how to blend professional and user-generated content, and, of course, how to measure what’s working and what isn’t.
Here are a few of my notes:
My Blog: Probably the perfect example of how NOT to make content pop. As I heard John describe some basic presentation no-no’s (10 pt Arial font, too much text), I was thinking – hey, that’s me. Then Alex described how important very, very regular blogging (far more frequent than mine) is to sustaining interest. They’ve measured this pretty carefully and have seen substantial cliffs when production falls below some pretty high thresholds. That’s one difference between professionals and the vast majority of us amateurs: consistent (in every sense of the word) production.
On Measuring Content – the Sales-Cycle: Content measurement is trickier than people think. One of the most common questions I get asked is “What content did visitors view in the visit where they purchased?” To show why that may be the wrong question to ask, I talked about a project we did a while ago with a car company in a country with flat-rate pricing and online car sales. “What did visitors do in the buying session?” was one of their main questions. The answer we came back with: in the session where visitors purchased a car, they mostly came to the home page, clicked through to the vehicle, and clicked on the “buy path.” They didn’t view any content at all. Of course, in the 2-3 months before that purchase, they had visited the site something like 10-13 times and viewed, on average, around 130 pages, 3 different makes, and 7-8 different car configurations. About the only uninteresting content session was the one where they made a purchase!
You can’t measure content effectiveness intelligently unless you understand your sales cycle. And for most (but by no means all) sites, measuring content effectiveness means multi-session tracking.
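To make the multi-session point concrete, here’s a minimal sketch (in Python, rolling sessions up by visitor ID so that content viewed in earlier visits gets credit for a later purchase). The session records, page names, and dates are entirely made up for illustration – this is not data or code from the actual project:

```python
from collections import defaultdict

# Hypothetical session log: (visitor_id, session_start, pages, converted).
# In practice this would come from a visitor-level export out of your
# analytics tool; everything here is illustrative.
sessions = [
    ("v1", "2009-03-01", ["home", "sedan-overview", "sedan-specs"], False),
    ("v1", "2009-03-10", ["home", "configurator", "configurator"], False),
    ("v1", "2009-04-02", ["home", "sedan-page", "buy-path"], True),
    ("v2", "2009-03-05", ["home", "suv-overview"], False),
]

# Roll sessions up by visitor so that content viewed in *earlier* sessions
# gets credit for a later purchase -- the multi-session view.
history = defaultdict(list)
for visitor, start, pages, converted in sessions:
    history[visitor].append((start, pages, converted))

for visitor, visits in history.items():
    visits.sort()  # chronological by session_start (ISO dates sort correctly)
    if not any(converted for _, _, converted in visits):
        continue
    # Assumes the single converting session is the last one.
    pre_purchase = [p for _, pages, conv in visits if not conv for p in pages]
    print(f"{visitor}: {len(pre_purchase)} pages viewed across "
          f"{len(visits) - 1} sessions before the buying session")
```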
On Measuring Content – Success: I also pointed out how deceptive success rates for different types of content can be. Content that drives directly to a sale will nearly always out-perform content that more generally describes a product or service. You can’t expect a page describing what a 529 Plan (a college savings plan) is to out-perform a page selling a specific 529 Plan. That’s why, in our Functional methodology, we carve up content into separate buckets we call Informers, Convincers, Re-Assurers, Converters, etc. The goal is to group content that is equidistant from the success metrics so that it can be easily compared.
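As a toy illustration of the bucketing idea – the page names, roles, and numbers below are invented, and the roles are assigned by hand rather than by any real classification process:

```python
from itertools import groupby

# Invented page-level stats with a functional role assigned up front.
pages = [
    {"page": "what-is-a-529",    "role": "Informer",  "views": 5000, "conversions": 25},
    {"page": "529-basics-faq",   "role": "Informer",  "views": 3000, "conversions": 30},
    {"page": "open-529-account", "role": "Converter", "views": 800,  "conversions": 120},
]

# Compare conversion rates only *within* a role: Converters will always
# dwarf Informers, so cross-bucket comparison is misleading.
pages.sort(key=lambda p: p["role"])
for role, group in groupby(pages, key=lambda p: p["role"]):
    print(role)
    for p in group:
        print(f'  {p["page"]}: {p["conversions"] / p["views"]:.2%} conversion rate')
```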
That same principle of comparing like with like applies to media sites as well. To understand how well a piece of content is performing, you need to understand how its readership compares to other content pushed at similar times, on similar days, in similar placements, and with similar site drives. Simply looking at the number of views an article gets leaves a lot to be desired.
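One simple way to add that context is to index an article’s views against the average for articles run in the same placement and day-part. A minimal sketch, with hypothetical articles and numbers:

```python
from collections import defaultdict
from statistics import mean

# Invented article stats: (article, placement slot, day-part, views).
articles = [
    ("budget-vote",   "home-lead", "weekday-am", 12000),
    ("local-feature", "home-lead", "weekday-am",  9000),
    ("recipe-column", "sidebar",   "weekend",      2000),
    ("sports-recap",  "sidebar",   "weekend",      3500),
]

# Group by placement/day-part cohort, then index each article against
# its cohort's average rather than against raw site-wide totals.
cohorts = defaultdict(list)
for name, slot, daypart, views in articles:
    cohorts[(slot, daypart)].append(views)

for name, slot, daypart, views in articles:
    index = views / mean(cohorts[(slot, daypart)])
    print(f"{name}: {index:.2f}x its {slot}/{daypart} cohort average")
```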
Indeed, even with all of that additional context, looking only at an article’s own consumption can mislead. I gave another example from a large publisher who optimized content on the home page by measuring readership. Seems right, doesn’t it? But it turned out that some articles drove significantly more additional content viewing even though they themselves were less viewed. Measured by total views generated, there were articles that significantly out-performed apparently more popular content.
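Here’s a rough sketch of that “total views generated” idea, with invented sessions. One big simplifying assumption: every article read earlier in a visit gets credit for each page read after it, which is far cruder than a real attribution scheme:

```python
from collections import Counter

# Invented visits as ordered lists of articles read in one session.
sessions = [
    ["analysis-piece", "follow-up-1", "follow-up-2"],
    ["analysis-piece", "follow-up-1"],
    ["celebrity-item"],
    ["celebrity-item"],
    ["celebrity-item"],
]

direct, downstream = Counter(), Counter()
for visit in sessions:
    for i, article in enumerate(visit):
        direct[article] += 1
        # Crude attribution: this article gets credit for every
        # page read after it in the same visit.
        downstream[article] += len(visit) - i - 1

# "celebrity-item" wins on direct views; "analysis-piece" wins on
# total views generated -- the publisher's situation in miniature.
for article in direct:
    print(f"{article}: {direct[article]} direct views, "
          f"{direct[article] + downstream[article]} total views generated")
```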
On Measuring Content – SEO: Getting natural search traffic is one of the most important online marketing functions for many sites. For community and media sites, it can be THE most important task. But the task doesn’t end with getting traffic. SEO traffic is notoriously poor performing for media and community sites. The one-and-done syndrome is ubiquitous. There are two implications to this: first, driving SEO targeting by performance, not clicks, is every bit as important as driving PPC by performance, not clicks. The second implication is that, as with PPC, your job doesn’t end with traffic. Building and testing a design for deeper-site pages and article content that is optimized for SEO traffic is absolutely critical.
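A minimal sketch of judging SEO traffic by performance rather than clicks – here just flagging one-and-done rates by entry page, with invented visit records:

```python
from collections import defaultdict

# Invented visit records: entry page, traffic source, pages viewed.
visits = [
    {"entry": "article-a", "source": "seo",    "pages": 1},
    {"entry": "article-a", "source": "seo",    "pages": 1},
    {"entry": "article-a", "source": "seo",    "pages": 4},
    {"entry": "article-b", "source": "seo",    "pages": 1},
    {"entry": "article-b", "source": "direct", "pages": 3},
]

stats = defaultdict(lambda: {"visits": 0, "one_and_done": 0})
for v in visits:
    if v["source"] != "seo":
        continue  # judging SEO landing performance only
    s = stats[v["entry"]]
    s["visits"] += 1
    if v["pages"] == 1:
        s["one_and_done"] += 1  # left after the landing page

for entry, s in stats.items():
    print(f'{entry}: {s["one_and_done"] / s["visits"]:.0%} one-and-done '
          f'across {s["visits"]} SEO entries')
```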
On Measuring Content – Video: There was some interesting discussion of video measurement (keep an eye out, by the way, for the upcoming Beyond Web Analytics podcast on this topic with Jeff Jordan). My pet peeve about video measurement is that most of the effort seems to be concentrated on measuring inside the video; people forget about measuring the impact of video on the rest of the site. A common theme in our analytic efforts is measuring the integration of video within the broader session and finding ways to blend video and non-video content within the same visit. Measuring this integration takes some extra work in implementation (but not too much) since analytics tools tend to segregate video reporting and analysis from the rest of the reporting.
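As a sketch of what that blended analysis might look like once video and page events are stitched into one session record – the event streams below are invented, and real implementations depend heavily on the specific tool:

```python
from collections import defaultdict

# Invented session event streams with both content types in one record --
# the implementation work is mostly getting video and page events stitched
# together like this, since tools often report them separately.
sessions = [
    {"id": 1, "events": ["page", "video", "page", "page"]},
    {"id": 2, "events": ["video"]},
    {"id": 3, "events": ["page", "page"]},
]

def segment(events):
    """Classify a session by its mix of video and page content."""
    has_video, has_page = "video" in events, "page" in events
    if has_video and has_page:
        return "blended"
    return "video-only" if has_video else "page-only"

# Compare engagement depth across the blended vs. single-type segments.
depth = defaultdict(list)
for s in sessions:
    depth[segment(s["events"])].append(len(s["events"]))

for seg, lengths in depth.items():
    print(f"{seg}: {len(lengths)} session(s), "
          f"avg {sum(lengths) / len(lengths):.1f} events per session")
```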
On Professional vs. User-Generated Content: Professional content will almost always out-perform user-generated content – and it is often an essential ingredient in generating user content. Figuring out how to blend user-generated and professional content is probably one of the most interesting business and measurement challenges. We have a variety of clients tackling this: a company with a small group of centralized professionals whose content is re-purposed and localized by a large network of bloggers; a company that off-shores content development around key themes culled from its SEO research; and a company that aggregates user comments across a variety of different sites and sources.
One common theme on the panel – you can’t measure the producers of UGC the way you do professionals. Professionals should have constant feedback about what worked and what didn’t. UGC producers need encouragement and support.
On the New Breed of PR: We talked a lot about evangelism and how product-oriented, often localized evangelical marketing is replacing traditional PR. Companies need to be more aware of the role of internal evangelists, and I think this will become an increasingly common job in tomorrow’s enterprise. Having product managers / creators as evangelists is great. But having a professional devoted to evangelism is better. I can personally attest that blogging, twittering, etc. take a major toll on time. To do them really well is more than a full-time job and more than you can reasonably ask of people in your organization who already have line responsibilities.
Going into this panel, I was a bit nervous about the topic. Content Marketing is at the heart of almost any web site – but it’s such a broad topic that you could easily imagine a panel neither scratching beneath the surface nor providing a coherent overview. I thought we managed both pretty well – but perhaps that’s just the cool breeze and sunshine talking!