Simulation and Enterprise Reporting
The core of my eMetrics thesis was that enterprise reporting has been almost entirely focused on showing the current state of the digital marketing program. This focus on what’s happened is a crippling deficiency. It isn’t just that such reports aren’t as good as predictive models, it’s that the reports don’t help explain why anything happened. Since understanding is the linchpin of action, with our relentless demand for actionable KPIs, we’ve eliminated virtually any chance of our reports actually generating action. Our dashboards don’t help us understand what will happen. They don’t help us understand what happened. They only tell us what did happen. I’ve likened it to a weather report with nothing but the current temperature. Or, if you prefer, imagine a map like this:
Knowing where you are without knowing how you got there or how to go anywhere else just isn’t useful!
How do you do better? You build reports that embed models of the system. The model represents how the various aspects of the system fit together, captures which are most important, and describes the degree of interdependence between factors. When you build a model of a system, you’ve created real understanding. Represented in a report, you’ve shown a decision-maker how different factors fit together – meaning you’ve helped identify the levers of change. Even better, a model is inherently predictive. By allowing a user to tune and change various aspects of the model, you’ve created a tool that helps run the business. That’s actionable.
While there are many ways to model systems, regression analysis is probably the most common technique. In regression analysis, you’re creating a model of how changes in a set of variables impact the target (dependent) variable. But regression isn’t suitable for every problem (though where it is suitable it’s a perfectly good technique for creating the type of report I’m discussing here). In creating reports of complex systems like digital marketing, you’ll often need to combine the results of multiple models built using quite different techniques. That’s where simulation comes in. Simulation techniques provide a way to combine different types of analysis and different problem sets and study how they interact. It’s ideal for a problem with multiple dependent variables and complex interactions between systems.
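To make the regression idea concrete, here’s a minimal sketch using entirely hypothetical data: two marketing drivers (ad spend and email volume) regressed against site visits via ordinary least squares. The variable names and numbers are illustrative, not from any real program.

```python
import numpy as np

# Hypothetical monthly data: ad spend ($K) and email volume (K) as drivers,
# site visits (K) as the target (dependent) variable.
ad_spend = np.array([10.0, 12.0, 15.0, 9.0, 20.0, 18.0])
email_vol = np.array([5.0, 6.0, 5.5, 4.0, 8.0, 7.5])
visits = np.array([52.0, 60.0, 71.0, 45.0, 96.0, 88.0])

# Design matrix with an intercept column.
X = np.column_stack([np.ones_like(ad_spend), ad_spend, email_vol])

# Ordinary least squares: solve for the coefficients minimizing squared error.
coef, *_ = np.linalg.lstsq(X, visits, rcond=None)
intercept, b_ad, b_email = coef

# The fitted coefficients are the "levers": each one estimates how visits
# move when that driver changes while the others are held fixed.
predicted = X @ coef
```

The fitted coefficients are exactly the kind of interdependence a model-based report can surface; a simulation layer would then combine several models like this one and let them interact.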
Simulation has another benefit. As I’ve talked about models, I’ve probably given the misleading impression that predictive models always generate understanding of how a system works. Unfortunately, that isn’t completely true. It’s true often enough – I don’t want to make it sound like there is a continual disconnect between models and systems. But there are times when the most predictive variables are not the ones that help you understand a system.
Going back to my weather forecasting example, imagine a primitive tribe that had been given a barometer. A clever medicine man could effectively predict the weather by tracking the needle on the barometer. Such a system will generate good predictions but no real understanding. For predicting weather, that really doesn’t matter. Weather is completely exogenous - there's no element of control. We don’t need to understand how it works, we just need to know if we'll need an umbrella for tomorrow's dance around the bonfire.
Business problems are different. We have control over aspects of the system, and it’s important that we understand how the levers we control can impact the results. For example, since your marketing spend and your competitor’s spend often have a strongly positive covariance, a regression model might work just as well or even better (against historical data) using your competitor’s marketing spend instead of yours as an independent variable. But you only have control over your marketing spend, not your competitor’s. A well-built simulation model will probably demand the inclusion of both factors, which is surely correct.
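A quick synthetic demonstration of the point, under made-up assumptions: sales are truly driven by your own spend, but competitor spend tracks yours closely, so against historical data either variable correlates with sales almost equally well.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: sales are truly driven by *your* spend, but your
# competitor's spend tracks yours closely (positive covariance).
own_spend = rng.uniform(50, 150, size=200)
competitor_spend = own_spend + rng.normal(0, 5, size=200)  # mirrors yours
sales = 3.0 * own_spend + rng.normal(0, 20, size=200)

# Against historical data, either variable "explains" sales about equally well.
r_own = np.corrcoef(own_spend, sales)[0, 1]
r_comp = np.corrcoef(competitor_spend, sales)[0, 1]

# Both correlations are high -- but only own_spend is a lever you control.
```

A purely predictive model could happily pick the competitor variable; a simulation forces you to represent the lever you actually hold.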
Simulation is a natural technique for forcing an analyst to address the real world.
The Customer Intelligence System (CIS)
Which brings me to my presentation this past week at VoC Fusion. In that presentation, I walked (well, maybe took a brisk trot) through the critique I’ve been developing of the way most enterprises are doing Voice of Customer research. Drawing on lessons from the evolution of behavioral analysis, I argued that enterprise VoC research is too fragmented, too siloed, too chaotic, too focused on site not customer, too taken up with meaningless or flatly misleading top-line metrics, too static, and too poorly disseminated. That’s a lot of “toos”! And while it might seem contradictory to complain that VoC is poorly done and poorly disseminated (the food is awful and the portions are too small!), the two are tightly bound. Part of getting the resources you need to do the job right is effectively disseminating what you have.
The solution? The integration of VoC efforts into a Customer Intelligence System that has intake from online surveys, offline surveys, social media, site and bricks-and-mortar feedback, and call-center data. The system would enforce standardized design, classification, and segmentation of the information and would be the center of a process in which VoC instruments and research are constantly tweaked and refined to answer emerging business questions. It would also serve as the foundation for an enterprise-wide dashboard capturing the state of the customer.
It’s an ambitious design, but it’s also the type of system that can be implemented for a fraction of the cost of a typical big-data analytics mart while dramatically improving the enterprise’s understanding of the customer and the resulting ability to effectively shape the customer experience.
The Role of the Customer Intelligence System in Modeling
So how does a CIS fit into the reporting and modeling story? It doesn’t always. You might build many a forecast report and never need to use the type of information contained in a CIS. There are a fair number of problems, however, where social media and other Voice of Customer data can fill what would otherwise be a pretty big hole in our models.
You can build a model out of any data. You can’t necessarily build a good one. At the core of most digital modeling problems are unresolved questions about demand, customer choice and attitudes, key segmentations, and competitive positioning. You can almost always build a model without accounting for these factors. I can, for example, build a model of Website traffic with nothing more than historical Website traffic data. It just won’t be a very good model.
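As a sketch of that “not very good model,” here’s the kind of forecast you can wring out of historical traffic alone: a naive moving-average projection over a hypothetical series. It produces a number, but it says nothing about why traffic moves or which levers matter.

```python
import numpy as np

# Hypothetical weekly site-traffic history (visits in thousands).
history = np.array([40.0, 42.0, 41.0, 45.0, 47.0, 46.0, 50.0])

def naive_forecast(series, window=3):
    """Forecast next period as the mean of the last `window` observations.

    This uses nothing but past traffic: it can loosely track a trend,
    but it offers no understanding of the system generating the numbers.
    """
    return float(np.mean(series[-window:]))

next_week = naive_forecast(history)
```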
Unless I understand how different types of visits and visitors generate different levels of repeat usage and how those visit-types interact with each other and drive the overall customer relationship, I haven’t really created a model that can drive understanding and help tune the system. In my eMetrics presentation, for example, I showed how social media data can be used to help model demand for a new product. Here’s another critical point: taking your model down to the level of customer experience makes it much more likely that it will help identify the true levers of change. These types of variables have the dual benefit of being highly predictive AND potentially controllable.
This doesn’t mean you build a CIS just to generate parameters for your model-based reports. That’s just one use of a much broader system. But it shows how, as we drive deeper into customer analytics, the various components of the ecosystem begin to come together.
Model-based reporting (particularly simulation-based reporting) will force you to go deeper into systems than you’ve ever had to before. The instant you put a forecasting tool into a decision-maker’s hands, you should be prepared for them to try the craziest stuff. What happens if I don’t spend anything on advertising? What happens if I kill my TV budget and spend 100% of my dollars on digital? Models based entirely on historical data will often miss the deeper connections that would help return plausible answers.
When you drive your models down to the customer level using data from a CIS, you're far more likely to be able to handle these types of ahistorical what-ifs. It’s more work. Sometimes, let’s be honest, it’s too much work for a single problem. But over time, the more you can integrate real customer data into your modeling, the better your analysis will be.
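To illustrate what a what-if tool in a decision-maker’s hands might look like, here’s a toy scenario function under purely illustrative assumptions: a baseline of organic demand plus diminishing-returns (log-shaped) response curves for two channels. The coefficients are invented for the sketch, not estimated from any data.

```python
import math

# Hypothetical response model: baseline organic demand plus diminishing
# returns on each paid channel. Coefficients are illustrative only.
BASELINE = 20.0     # visits (K) with zero paid media
TV_COEF = 8.0
DIGITAL_COEF = 12.0

def forecast_visits(tv_spend_k, digital_spend_k):
    """Return forecast visits (K) for a spend scenario, including $0 spend."""
    return (BASELINE
            + TV_COEF * math.log1p(tv_spend_k)
            + DIGITAL_COEF * math.log1p(digital_spend_k))

# The "crazy" what-ifs a decision-maker will inevitably try:
no_ads = forecast_visits(0, 0)          # falls back to baseline organic demand
all_digital = forecast_visits(0, 100)   # kill TV, go 100% digital
```

Because the model encodes a baseline and a response shape rather than just extrapolating history, even the zero-spend scenario returns a plausible answer instead of nonsense.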
[Drop me a line if you'd like a copy of my VoC Fusion presentation!]