Three reasons why it's getting harder to get data analytics right

Sanjay Srivastava, SVP and Chief Digital Officer

September 29, 2016 - Working closely with Fortune 500 clients over the last few years, I've watched data-driven digital transformation rocket to the top of their boardroom agendas. Many have invested significantly in big data, analytics, and data science centers of excellence (CoEs). Some are already achieving clear competitive differentiation, significant cost compression, and higher asset leverage; others are still working through their transformations.

Across these journeys, I find three long-term trends in analytics that must be adequately planned for and addressed in order to succeed:

  • The explosion in data and the pace of innovation in software
  • Democratization of analytics and the need for management tools
  • Instrumenting for action and the world of digital applications

Explosion in data and the pace of innovation. The exponential growth in the size and spread of data is well documented. Less discussed are the real-world challenges, economics, and complexity of analyzing data at the edge, moving only what's required, and aggregating what will eventually be needed, all in an elastic, scalable, and secure fashion. Equally unprecedented is the pace of innovation in analytics software: the number and diversity of software utilities, and the speed at which they become available, keep growing. Tapping into all this innovation leaves users with an ever-changing workbench of tools to master. The result? Many data analytics hubs are arguably obsolete by the time they're finally up and running.
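
To make the edge economics concrete, here is a deliberately simple sketch in Python, assuming hourly sensor readings with invented field names and thresholds, of analyzing at the edge and moving only the summary that central analytics will eventually need:

```python
from statistics import mean

def summarize_at_edge(readings, device_id):
    """Reduce raw edge data to the small summary worth moving upstream."""
    anomalies = [r for r in readings if r > 90.0]  # illustrative threshold
    return {
        "device": device_id,
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,  # ship raw values only for the outliers
    }

raw = [71.2, 69.8, 93.4, 70.5, 72.1]  # e.g., one hour of temperature samples
payload = summarize_at_edge(raw, "sensor-042")
print(payload)  # five readings collapse into one small record plus one anomaly
```

The pattern, not the code, is the point: compute where the data lives, and pay to move only what downstream analysis will actually use.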

Democratization of analytics. Just as sales analysis moved from the MIS reports produced in the back office of the past to the CRM- and Excel-equipped sales managers of today, we foresee a world where analytics will be generated closer to where insights are needed. It's a world in which data science breaks free of the four walls of the "Analytics Department" and embraces a broader base of insight-users who more fully and directly harness the power of data analytics. We already see early signs of this in the composition of the teams that come together to address many data science problems. As that happens, the work of managing data (ingesting, cleansing, transforming, auditing, and tracking lineage) will need to be taken on by a broader set of users. Analytics CoEs must plan for this adequately or risk missing the full power of data and the combined intelligence of the organization.
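
To illustrate what that broader ownership can look like, here is a minimal, hypothetical sketch (the function names and audit fields are mine, not any particular product's) of a pipeline step that records lineage automatically, so a business user who cleanses or transforms a dataset leaves an audit trail without extra effort:

```python
import hashlib
import json
from datetime import datetime, timezone

def run_step(name, func, dataset, lineage):
    """Apply one pipeline step and append an audit record to the lineage log."""
    result = func(dataset)
    lineage.append({
        "step": name,
        "at": datetime.now(timezone.utc).isoformat(),
        "rows_in": len(dataset),
        "rows_out": len(result),
        # Fingerprint the output so downstream users can verify provenance.
        "fingerprint": hashlib.sha256(
            json.dumps(result, sort_keys=True).encode()).hexdigest()[:12],
    })
    return result

# Illustrative usage: cleanse -> transform, with lineage kept as data.
lineage = []
records = [{"region": "EMEA", "sales": "1200"}, {"region": None, "sales": "950"}]
records = run_step("cleanse", lambda rs: [r for r in rs if r["region"]], records, lineage)
records = run_step("transform",
                   lambda rs: [{**r, "sales": int(r["sales"])} for r in rs],
                   records, lineage)
print(json.dumps(lineage, indent=2))
```

The design choice worth noting is that lineage is captured as data, by default, wherever the work happens, rather than depending on the discipline of any one specialist team.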

Instrumenting for action. Converting data into insight, the traditional goal of analytics, is no longer enough. First, insights must be contextualized for relevant use. Second, analytics must be instrumented to turn insights into action in real time. Finally, systems must be connected and configured so that workflows are spawned dynamically, intercepts are engaged automatically, and changes to business processes are driven in real time. Getting this right can be the difference between optimizing parts inventory for the long run and actually stocking the exact spare part on a specific airplane before it takes off for a distant location. It requires integrating analytics into the rest of the existing technology landscape by plumbing in the right digital applications that drive insights into action in real time.
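
As a rough illustration of that plumbing (the names, threshold, and workflow payloads below are all hypothetical), the sketch shows the shape of it: a model score arrives, is contextualized against the flight schedule, and spawns a stocking workflow automatically instead of landing in a report:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Prediction:
    tail_number: str      # which aircraft
    part_id: str          # which spare part the model flags
    failure_prob: float   # model's estimated probability of failure

def on_prediction(pred, departure_time, now, threshold=0.8):
    """Turn an insight into an action: if risk is high and there is still
    time before takeoff, spawn a stocking workflow rather than a report."""
    if pred.failure_prob < threshold:
        return None  # insight noted, no action required
    if departure_time - now < timedelta(hours=2):
        return {"action": "alert_ops", "tail": pred.tail_number}  # too late to stock
    # Hypothetical hand-off into the maintenance system's workflow engine.
    return {"action": "stock_part", "part": pred.part_id, "tail": pred.tail_number}

order = on_prediction(
    Prediction("N123GP", "PN-4471", 0.92),
    departure_time=datetime(2016, 9, 30, 14, 0),
    now=datetime(2016, 9, 30, 8, 0),
)
print(order)  # {'action': 'stock_part', 'part': 'PN-4471', 'tail': 'N123GP'}
```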

A practical framework for success lies in planning comprehensively for three areas of focus: elastic infrastructure, data orchestration, and digital applications.

The infrastructure layer needs to be scalable to deal with the explosion in data, and elastic to incorporate best-in-class, evolving toolkits. At the same time, new analytics software needs to be tested and vetted, mashed up and prototyped, and quickly adopted and integrated as it becomes available. Large sandboxes to build this in obviously help (we have the benefit of one of the largest sandboxes in the industry), but great analytics teams address this by proactively planning for and investing in both the data infrastructure layer and the innovation capability around it.

Importantly, the orchestration layer needs to allow for the democratization of analytics and the assimilation of the data science discipline into the fabric of the larger organization. More specifically, it's critical to design for a broader set of enterprise users who can inspect and orchestrate data and ready it for the automated application of statistical inference techniques. Thankfully, many of the necessary components are now available from new startups and from the fast-evolving open source space. But the companies that achieve the best results always incorporate an operator's view in the design of this layer, and naturally this comes easier to those that have already been using analytics at scale.
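
As one hedged example of what "readying data for automated inference" can mean in practice, here is a small sketch of a readiness gate that a non-specialist could configure; the check names and limits are invented for illustration:

```python
import statistics

def readiness_checks(values, expected_min, expected_max):
    """Gate a column before automated inference: completeness, range, spread."""
    present = [v for v in values if v is not None]
    checks = {
        "completeness": len(present) / len(values) >= 0.95,
        "in_range": all(expected_min <= v <= expected_max for v in present),
        # Crude drift guard: flag if the spread collapses or explodes.
        "plausible_spread": 0 < statistics.stdev(present) < (expected_max - expected_min),
    }
    return checks, all(checks.values())

daily_sales = [1200, 980, None, 1105, 1340, 990, 1250]
checks, ready = readiness_checks(daily_sales, expected_min=0, expected_max=10000)
print(checks, "-> run inference" if ready else "-> route to a data steward")
```

Here the missing value trips the completeness check, so the data is routed to a human rather than silently fed into a model, exactly the operator's view the best teams design in.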

Finally, the application layer addresses the need for instrumenting for action. The bad news: for the most part, there are no ready-made applications; the space is creating itself as we speak. Success here demands a lean digital approach to building agile, contextualized applications and seamlessly integrating them into the existing IT landscape. Best practice is to assemble tightly integrated teams of design thinkers, data scientists, high-velocity engineers, and process operators who come together to design and implement applications in this layer.

We have been at this for a few years now. When I first started in this space, the need for advanced analytics models and the right statistical inference algorithms was the long pole in the tent. No more. Aggressively exploiting these long-term trends in analytics is now becoming the sustainable driver of success, and best-in-class analytics initiatives must design for all three of them. First, last, and always.
