February 28, 2017 - The search for technology that can drive the next industrial revolution is heating up. Enterprises and institutions across the globe are investing significantly in R&D to apply artificial intelligence (AI) techniques in their businesses. And AI assets are being acquired at a fast clip, with some 140 AI-focused enterprises bought out in the past five years alone. What's more, nearly three in ten of those transactions occurred in the first nine months of 2016.
Of course, interest in AI technology isn't new. I studied expert systems in school, when they were all the rage. But we prematurely hailed the arrival of the AI age back then. That traditional AI didn't so much reason as automate. It was akin to a math student being taught step-by-step to solve a very complex yet narrow problem, without grasping the underlying reasoning that could be applied to other, much broader problems.
This traditional AI approach knew what something was only by comparison to other labeled items of its kind. For example, we used to identify a non-compliant loan application through supervised learning on a labeled dataset. From then on, every submitted loan application could be automatically classified and acted upon with a pre-determined workflow.
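That supervised workflow can be sketched in a few lines. This is a minimal illustration, not a real lending system: the features (debt-to-income ratio, credit score), the example values, and the nearest-neighbour rule are all hypothetical stand-ins for whatever a bank's labeled history and classifier would actually be.

```python
# Tiny labeled history: each application is (debt_to_income_ratio,
# credit_score) with a label assigned by a human reviewer.
labeled_applications = [
    ((0.20, 780), "compliant"),
    ((0.25, 720), "compliant"),
    ((0.60, 580), "non-compliant"),
    ((0.55, 610), "non-compliant"),
]

def classify(application):
    """1-nearest-neighbour: copy the label of the most similar known case."""
    def distance(a, b):
        # Scale credit score down so both features contribute comparably.
        return (a[0] - b[0]) ** 2 + ((a[1] - b[1]) / 1000.0) ** 2
    nearest = min(labeled_applications,
                  key=lambda ex: distance(ex[0], application))
    return nearest[1]

print(classify((0.58, 600)))  # -> non-compliant (resembles rejected cases)
print(classify((0.22, 750)))  # -> compliant (resembles approved cases)
```

The key limitation the article points to is visible here: the classifier can only echo the labels humans already supplied; it has no way to handle a kind of case it was never shown.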
The new generation of AI — deep reasoning — breaks this traditional dependency on known datasets. Instead, deep reasoning performs unsupervised learning on large unlabeled datasets to reason in ways that can be applied much more broadly. In other words, with enough (read: enormous) data, this next-gen AI can "learn to learn" for itself.
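The unsupervised contrast can be sketched with the classic k-means algorithm: given a pile of unlabeled numbers, it discovers the groups on its own, with no human-provided categories. The data here is synthetic (two made-up blobs), chosen purely to make the discovered structure obvious.

```python
import random

random.seed(0)
# Forty unlabeled data points drawn from two hidden groups (around 2.0
# and 8.0). The algorithm is never told the groups exist.
data = ([random.gauss(2.0, 0.3) for _ in range(20)] +
        [random.gauss(8.0, 0.3) for _ in range(20)])

def kmeans(points, k=2, iterations=20):
    """Plain 1-D k-means: alternate assignment and centroid update."""
    centroids = random.sample(points, k)
    for _ in range(iterations):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its assigned points.
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

print(kmeans(data))  # two centroids, near 2.0 and 8.0
```

No labels went in, yet the two underlying groups come out — which is the sense in which unlabeled data, in sufficient volume, lets the system find structure for itself.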
So why is this happening now? It isn't just the new algorithms. It's because the advancements in AI techniques can take advantage of two other converging trends: computing that is becoming ambient and elastic, and large data sets that are becoming easy to extract, store, manage, and use.
And this is becoming real. Today, we are working with clients to automate customer support using these techniques. AI helps us take in a corpus of knowledge, learn from it to start responding to customer service queries, and continuously learn more with every interaction across a wide variety of areas. And the benefits go well beyond economics and speed — quality is actually improving!
But what's becoming clearer is that in practice AI has a critical dependency on data — lots and lots of unlabeled data for unsupervised learning. The challenge is that most of this data normally exists in a transient state, spread across a myriad of legacy systems in large enterprises, so it must be systematically captured and utilized in a strategically planned fashion. This often becomes the missing link. So enterprises (like mine) are systematically investing in and growing large sandboxes of (in our case, B2B) data sets that ultimately help build the AI algorithms that can drive significant digital transformation for their businesses. These practical details around data strategy and planning often get lost in all the talk around AI.
The idea that you can have AI without large sets of data is like trying to make ice without water. For those starting the AI journey in enterprises, it's an insight worth reflecting on.