December 15, 2016 - Blogs, conversations, and conferences abound on the topic of big data. With big data comes the promise of an advanced understanding of customers and the market, a promise that quietly treats big data as functionally synonymous with advanced data. At the center of the big data discussion is a focus on its volume, variety, variability, validity, and value.
A quick search reveals concerns about big data's commercial value, or ROI. Some researchers suggest its value is small relative to its cost, while others question the depth of the incremental insights big data provides over traditional data. Some have even suggested that hyper-innovative analysts working with traditional data hold a competitive advantage over analysts applying standard big data methodologies.
In contrast to big data, there is less buzz about big analytics. The evolution of analytics has not kept pace with the evolution of big data. For example, generalized linear models (GLM), ordinary least squares (OLS) models, and segmentation analyses (SA) are at the center of many major analytic methodologies, regardless of data size. Although these approaches are well established, big data can support more sophisticated model structures, such as generalized non-linear models (GNLM vs GLM), time-varying parameter least squares models (TVLS vs OLS), and time-series cross-sectional segmentation analysis (TCSA vs SA).
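As a toy illustration of why the non-linear structures matter, the sketch below fits both a straight-line OLS model and a saturating GNLM-style model to synthetic response data. All data, parameter values, and names here are hypothetical, chosen only to show the structural difference; this is not any specific commercial methodology.

```python
# Sketch: contrasting a linear (OLS) fit with a non-linear (GNLM-style) fit
# on synthetic data that exhibits diminishing returns. Hypothetical example.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(42)
spend = np.linspace(1, 100, 200)                      # e.g. weekly media spend
sales = 50 * (1 - np.exp(-0.05 * spend)) + rng.normal(0, 1.5, 200)

# OLS: sales = a + b * spend (straight-line response)
X = np.column_stack([np.ones_like(spend), spend])
beta, *_ = np.linalg.lstsq(X, sales, rcond=None)
ols_pred = X @ beta

# GNLM-style: saturating response, sales = s * (1 - exp(-k * spend))
def saturating(x, s, k):
    return s * (1 - np.exp(-k * x))

params, _ = curve_fit(saturating, spend, sales, p0=[40, 0.01])
gnlm_pred = saturating(spend, *params)

def rss(pred):
    """Residual sum of squares: lower means a closer fit."""
    return float(np.sum((sales - pred) ** 2))

print(f"OLS RSS:  {rss(ols_pred):.1f}")
print(f"GNLM RSS: {rss(gnlm_pred):.1f}")
```

Because the underlying response saturates, the non-linear fit captures structure the straight line cannot, which is the kind of incremental precision the richer model families are meant to deliver.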
Most advanced analysts can apply advanced methodologies on an ad hoc basis, but doing so at scale requires a purpose-built analytics infrastructure. Many analysts working with big data have been trained in small data analytics. Others have been trained in some of the analytics used with big data, but many of these methodologies are just slightly modified versions of small data analytics. As a result, many companies struggle to see the incremental commercial benefits of big data because they use analytics designed for small data.
Analytic creativity in the face of small toolsets
One would expect analytic results produced with large analytical toolsets to be considerably more insightful than those produced with limited toolsets, yet this is often not the case. If it were, the primary focus of analytic discussions would be the number of tools available; instead, companies have found that the more tools analysts have, the less consistently and innovatively they behave. For that reason, companies are focusing on fewer, more comprehensive tools that are purpose-built and more advanced.
Analytic creativity in the face of small datasets
Small datasets present the same scenario as small toolsets. With traditionally smaller datasets, far more creativity is required to glean insights, a fact that often sparks analytical innovation.
As a result, analytical advances over the past few decades have been vast. Well-trained, wide-angle analysts can do some amazing things with traditional-sized datasets:
- Consumer packaged goods
- Marketing-mix analytics were almost always created at an aggregated national or geographic level, which produced reasonable, stable models with little noise
- Advances in data quality and analytical techniques, however, have enabled new store-level and product-level modeling, which provides granular metrics and insights
- Promotion-response modeling was typically conducted at national physician-specialty level, which led to insights about large groups of physicians
- Advances in modeling and shared network analytics have allowed analysts to generate physician-level models. Insights are now created for individual physicians rather than just geographic groupings
- Media analytics were almost always based on the standard media measurement metrics: reach, frequency, and viewership volume
- Advances in audience data, functional forms, and modeling techniques have enabled modelers to incorporate message type, audience composition, and cross-communication analytics to deliver tactical and strategic value to decision makers
- Customer loyalty
- Lifetime value and customer up-sell analytics are commonly built using logistic regression and customer segmentation
- Advances in modeling now give analysts tools for dynamic-propensity modeling, social-interaction and social-influence modeling, and early-warning-indicator analytics. This has led to a much deeper understanding of customer loyalty and how to influence its key drivers
- Reliability analytics have historically leveraged survival analysis, logistic modeling, and related methodologies
- Advances have introduced censored and stochastic distributions and high-order, multi-parameter functions, allowing for more complex analytics that are less degraded by higher levels of noise
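The survival analysis that the reliability bullet refers to can be illustrated with a minimal, hand-rolled Kaplan-Meier estimator on synthetic, right-censored failure data. This is a standard textbook technique, not any specific vendor's implementation, and every name and number below is hypothetical.

```python
# Sketch: Kaplan-Meier survival estimation on synthetic, right-censored
# reliability data. Units whose observation window ends before failure
# are censored rather than discarded. Hypothetical data throughout.
import numpy as np

rng = np.random.default_rng(7)
true_life = rng.exponential(scale=24.0, size=500)   # months until failure
censor_at = rng.uniform(6, 48, size=500)            # observation window ends
time = np.minimum(true_life, censor_at)             # what we actually observe
failed = true_life <= censor_at                     # False -> right-censored

def kaplan_meier(time, failed):
    """Return observed event times and the estimated survival curve S(t)."""
    order = np.argsort(time)
    time, failed = time[order], failed[order]
    event_times, survival = [], []
    s = 1.0
    at_risk = len(time)
    for t, d in zip(time, failed):
        if d:  # an observed failure reduces the survival estimate
            s *= 1 - 1 / at_risk
            event_times.append(t)
            survival.append(s)
        at_risk -= 1  # failures and censored units both leave the risk set
    return np.array(event_times), np.array(survival)

t, s = kaplan_meier(time, failed)
# Survival probability at 24 months (the true mean lifetime in this toy data)
print(f"Estimated S(24) = {s[t <= 24][-1]:.2f}")
```

Handling the censored units correctly, rather than dropping them, is what lets this style of analysis squeeze unbiased estimates out of incomplete traditional data.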
All verticals have improved their analytics over the last few decades, even before the arrival of big data, by improving the data already collected and by innovating in the analytics applied to that data. As such, while companies ramp up big-data investments, they can already benefit from applying advanced analytics to their existing data. This requires customized analytics and advanced, unbiased procedures.
Analytics created to report results, rather than to support a belief, are better suited to provide course corrections and commercial value. As a result, companies that invest in the highest caliber of analytics on their existing data are likely to see far greater returns than companies that invest in standardized, basic analytics on big data. Further, companies that invest simultaneously in big data and in advanced analytics purpose-built for big data will be able to generate the incremental insights that have been the core promise of big data, driving competitive advantage into the future.