Greetings from Down Under! I arrived in Sydney on Sunday (Saturday by U.S. time) for a week of meetings and conferences. I've never been here before, so it's the rare business trip I'm actually excited about. Unfortunately, I was greeted with something less than that famous Aussie cheer – at least where the weather is concerned. But wind, a bit of chill, and off-and-on rain didn't keep me from a long, long ramble across the city. I started this post early Monday morning here, and it feels just like blogging at home, since I can flip to my browser and watch the NFL scores roll in. Weird!
With the move to EY, I always expected that the nature of the work we were doing was going to shift in some respects, and it has. We're still doing most every kind of engagement we did before, but there are some new kinds of projects that considerably expand the scope of our practice. Of these new projects, helping enterprises design and stand up an analytics center of excellence is by far the most significant. It's also the area where I think we've most effectively blended the elements of our data analytics practice with EY's broader Enterprise Intelligence practice – not least because it's an area where EY already had a rich, fully developed practice, albeit one without any digital analytics focus.
So I've been putting a lot of thought into various issues around creating an analytics CoE and reflecting on our own experience as a practice. While building a consulting practice isn't quite like building an internal CoE, there are strong similarities, particularly around the actual execution of analytics. I've also been testing those ideas against the broader practice here and its experience standing up large-scale, cross-enterprise analytics capabilities.
In the process, I've been refining a set of guiding principles that are shaping the way we think about the problem. I'm going to lay out those principles (at their current state of evolution) here and dive into them in more detail in subsequent posts.
Let's start with where all analytics questions should begin – with the business. Despite all the current hype about data scientists and machine learning, much of which makes analytics sound like some magical process, virtually all useful analysis is hypothesis-driven and business-generated. It's also driven by methodology, not exploration. The worst mistake an organization can make in standing up an analytics capability is to believe that analytics projects can be left uncharted or should be self-generated by the analysts.
Starting with business requirements may seem pretty obvious, but here's a principle that I think is easily missed. Investing in analytics is like venture capital investing. The wins can be big, but there are likely to be plenty of misses too. We tend to talk as if every analytics project is going to bear wondrous fruit, and that just isn't true. But when a project does hit, it will drive more than enough returns to pay for the whole effort. What that means is that you're much better off (and wiser) taking a portfolio approach to analytics. Building a portfolio of projects spreads the risk around and makes it much more likely you'll get substantial returns.
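To make the portfolio logic concrete, here's a toy simulation in Python. The numbers (a 20% hit rate, hits paying 10x their cost, misses returning 20 cents on the dollar) are purely illustrative assumptions of mine, not figures from any real engagement; the point is only to show how a basket of mostly-missing projects can still pay handsomely in aggregate.

```python
import random

def simulate_portfolio(n_projects=10, cost_per_project=1.0, hit_rate=0.2,
                       hit_multiple=10.0, miss_multiple=0.2, n_trials=10_000):
    """Toy model: every project costs the same; a 'hit' returns 10x its cost,
    a 'miss' returns only 0.2x. All of these numbers are illustrative guesses."""
    total_net = 0.0
    for _ in range(n_trials):
        gross = sum(
            cost_per_project * (hit_multiple if random.random() < hit_rate else miss_multiple)
            for _ in range(n_projects)
        )
        total_net += gross - n_projects * cost_per_project
    return total_net / n_trials

# Expected net: 10 * (0.2 * 10 + 0.8 * 0.2) - 10 = 11.6 units on a 10-unit spend
print(f"Average net return across the portfolio: {simulate_portfolio():.1f} units")
```

In that model any single project misses four times out of five, yet the ten-project portfolio still nets more than its entire cost on average; that's exactly why spreading your bets beats hoping the one anointed project turns out to be the hit.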
If you're going to take a portfolio approach to analytics, you need the ability to execute on multiple projects simultaneously. What's the best way to do that? Going back to my post on data science, it's to take a team-based approach that relies on rapid cycles of iterated analysis. We've based this approach very much on modern approaches to software development. In many respects, I've come to believe that Agile is even better suited to analytics than it is to software development, where it was birthed.
My next principle is one that I’ve had to grow into. As an analyst, nothing drives me crazier than to see good analysis ignored. And like most analysts, I’ve had a tendency to blame the organization. But I’ve come to realize that it’s our job to make sure that analysis can be operationalized and to pave the way for it to be used. Far too often we ignore that side of the problem, but it’s the single biggest failure point in the whole analytics paradigm. We need to pick analytics problems that can be operationalized and we need to find better ways to deliver the fruits of analysis.
Which brings me to my next principle – analytics needs to be tightly coupled with data democratization. Most organizations think of these two things as separate, even opposite. They aren’t. Effective data democratization isn’t about distributing data; it’s about distributing knowledge. Analytics is about building knowledge. There is no effective data democratization without analytics.
Binding analytics deeper into the business is a two-way street. The very best analytics organizations devote a truly astonishing amount of time to sharing, mentoring, preserving, and circulating analytics. Building a sharing structure into the CoE is critical to long-term success. By using methods like Agile, we build significant mentoring and sharing opportunities (as well as knowledge transfer from consultants to employees) into the very DNA of the CoE. But there is much more work to be done on the organizational side of analytics.
I think it's apparent how these principles begin to intertwine and support each other. And it's really what I talked about in my last post that ties them all together – eating your own dog food by creating a virtuous cycle around analytics and the processes that drive it.
I'm not going to pretend that this somehow captures all there is to building a world-class CoE. There's so much involved in creating any robust enterprise capability that distilling a complex effort into a list of principles is never more than a crude approximation of what experience can bring to bear. But put these principles together and you have at least a rough picture of how you might hatch an analytics capability that isn't run-of-the-mill; that isn't the usual dreary corporate exercise; that can help create, drive, and sustain something truly special in the enterprise.