Data granularity refers to the level of detail, the "fineness", of the information collected and analyzed. All dimensions of the data are potentially concerned:
Just like gains in implementation capacity or process acceleration, gains in granularity are one of the catalysts of the current revolution transforming marketing decision support solutions. Until just a few years ago, inaccessible data, unsuitable methodologies, and insufficient computing power blocked any progress in this area. Standard results were reduced to aggregates too coarse and too imprecise to effectively steer media actions or their digitization. How can we still consider building communication strategies, or even tactics, on a simple comparison of "television" versus "digital" ROIs, all formats and platforms combined, at the scale of a country and on a very broad target?
Whatever the technique used (MMM, incrementality tests, experiments, and more, alone or in combination), the robustness and actionability of the results depend on the system's ability to translate the "real world": to transcribe the role each type of marketing action plays with consumers and its contribution to brand performance.
For more relevant, more precise models. While it may seem paradoxical, adding granularity often simplifies model construction and improves precision. More granular data avoids "drowning" in average values the often heterogeneous effects of marketing actions. For example, evaluating the effectiveness of a highly localized urban digital display campaign against brand performance across the entire country (mixing displayed and non-displayed areas) inevitably leads to underestimating its contribution. The same goes for actions carried out on different online video platforms, too often aggregated into a single "online video" total that erases their differences.
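The dilution effect described above can be made concrete with a back-of-the-envelope sketch (all numbers below are hypothetical, purely for illustration):

```python
# Illustrative sketch (hypothetical numbers): a display campaign runs only in
# urban areas accounting for 20% of national baseline sales, where it lifts
# sales by 10%.
urban_share = 0.20   # share of national baseline sales in displayed areas
local_lift = 0.10    # true incremental lift within displayed areas

# Measured at the level where the campaign actually ran:
measured_local_lift = local_lift

# Measured on national totals (displayed + non-displayed areas combined),
# the same campaign appears to lift sales by only urban_share * local_lift:
measured_national_lift = urban_share * local_lift  # ~0.02, i.e. 2%

underestimation_factor = measured_local_lift / measured_national_lift
print(f"Local lift: {measured_local_lift:.0%}")
print(f"Apparent national lift: {measured_national_lift:.0%}")
print(f"Underestimation factor: {underestimation_factor:.0f}x")
```

Under these assumed numbers, the national-level reading divides the campaign's true contribution by five, which is exactly the kind of bias a geographically granular model removes.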
Various studies have shown very significant gains in model quality and stability brought about by greater granularity. Let's mention in particular those carried out by Nielsen*: an analysis of 19 MMMs on product categories in Japan's consumer goods and e-commerce sectors, in which models were built simultaneously on sales and media data at both the national and regional levels (46 prefectures). The accuracy of the MMMs varies very significantly with the granularity of the data on which they are built: where the MMM on national data identified a contribution to incremental sales for only 16% of levers / media channels, this proportion reached 66% once the regional dimension was introduced.
Similarly, by taking into account differences in the effectiveness of marketing actions across customer segments for retailers and online commerce players, fifty-five has observed significant precision gains in its models. This is the basis of an agent-based modeling approach* that simulates the effects of campaigns on different consumer cohorts.
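As a purely illustrative sketch of the agent-based idea (the cohort names, probabilities, and exposure counts below are invented assumptions, not fifty-five's actual model): each simulated consumer belongs to a cohort with its own baseline purchase probability and media responsiveness, and the campaign's lift is read per cohort rather than on average.

```python
import random

random.seed(42)  # fixed seed so the toy simulation is reproducible

COHORTS = {
    # cohort: (baseline purchase prob., uplift per exposure, avg. exposures)
    "young_urban":     (0.05, 0.020, 4),
    "suburban_family": (0.08, 0.005, 2),
}

def simulate(n_agents=10_000, campaign_on=True):
    """Return the purchase rate per cohort after simulating each agent."""
    rates = {}
    for cohort, (base, uplift, avg_exp) in COHORTS.items():
        purchases = 0
        for _ in range(n_agents):
            # Exposures vary per agent; zero everywhere if the campaign is off.
            exposures = random.randint(0, 2 * avg_exp) if campaign_on else 0
            p = min(1.0, base + uplift * exposures)
            purchases += random.random() < p
        rates[cohort] = purchases / n_agents
    return rates

on, off = simulate(campaign_on=True), simulate(campaign_on=False)
for cohort in COHORTS:
    print(f"{cohort}: lift = {on[cohort] - off[cohort]:+.3f}")
```

The cohort-level readout makes the heterogeneity visible: the highly responsive cohort shows a lift several times larger than the other, a difference an all-consumer average would flatten.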
More generally, an overly aggregated view cannot capture mix effects within a medium. In search, for example, a variation in the ratio of brand spend to total search spend has a strong impact on performance. If the brand and non-brand dimensions are not separated while this ratio varies significantly over time, the modeling results will not reflect consumer reality and will be imprecise, or even plain wrong.
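A toy calculation illustrates this mix effect (the ROI figures are assumed for illustration only): with brand and non-brand returns held perfectly constant, the apparent ROI of aggregated "search" still drifts, purely because the spend mix shifts.

```python
# Toy mix-effect demo: brand and non-brand search have different (assumed,
# constant) returns; only the spend mix changes across the three periods.
BRAND_ROI, NONBRAND_ROI = 8.0, 2.0  # revenue per unit of spend (hypothetical)

periods = [(80, 20), (50, 50), (20, 80)]  # (brand spend, non-brand spend)

apparent_rois = []
for brand, nonbrand in periods:
    revenue = brand * BRAND_ROI + nonbrand * NONBRAND_ROI
    apparent_rois.append(revenue / (brand + nonbrand))

# The aggregated "search" ROI slides from 6.8 down to 3.2 even though neither
# lever's true ROI changed -- a model blind to the split will chase this drift.
for (brand, nonbrand), roi in zip(periods, apparent_rois):
    share = brand / (brand + nonbrand)
    print(f"brand share {share:.0%}: apparent search ROI = {roi:.1f}")
```

Splitting the lever into its brand and non-brand dimensions removes this artifact: each series then has a stable ROI for the model to estimate.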
Results at an operational scale. Instead of staying at the level of large aggregates, marketing analysts can now understand how each component, each element of the marketing mix contributes to brand performance. Who can still be satisfied with a measure where all campaigns and all "Social media" platforms are lumped together, where video formats cannot be isolated...? How can we not include a "Retailer" dimension in an analysis of "Retail media" networks' effectiveness when the actions carried out there differ significantly?
In addition, this depth of analysis offers a better understanding of the interactions between marketing channels, highlighting synergies and leverage effects that can be exploited to improve overall results. For an urban digital display campaign, for instance, measuring the gains from its interaction with a TV campaign or a promotional action in the displayed areas provides directly operational insight.
The insights obtained naturally open the way to an allocation of marketing resources closer to operational needs, across all relevant dimensions (marketing levers, geographical areas, consumer segments...), whereas earlier MMM results lacked the zoom indispensable to actionable recommendations. Adopting granularity in MMM is thus not just a technical or methodological choice, but a business imperative.
A more robust and reactive projection into the future. A better understanding of each marketing channel's role at a finer level, reflecting the reality of markets more faithfully, also makes forecasting and simulating the impact of planned campaigns in a scenario-planning approach more reliable.
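As a minimal illustration of scenario planning on top of a fitted model (the response curves and coefficients below are invented assumptions, not the output of an actual MMM): once channel-level response parameters have been estimated, candidate budget plans can be projected and compared before any spend is committed.

```python
import math

# Assumed diminishing-returns response per channel:
#   contribution = beta * log(1 + spend)
# The betas stand in for coefficients a granular MMM would estimate.
FITTED_BETAS = {"tv": 120.0, "online_video": 80.0, "search_brand": 40.0}
BASELINE_SALES = 1_000.0

def project_sales(plan):
    """Project sales for a spend plan {channel: spend} under the fitted curves."""
    return BASELINE_SALES + sum(
        FITTED_BETAS[ch] * math.log1p(spend) for ch, spend in plan.items()
    )

scenarios = {
    "status_quo":     {"tv": 500, "online_video": 100, "search_brand": 50},
    "shift_to_video": {"tv": 300, "online_video": 300, "search_brand": 50},
}
for name, plan in scenarios.items():
    print(f"{name}: projected sales = {project_sales(plan):.0f}")
```

Under these concave (diminishing-returns) curves, moving budget from a saturated channel to a less saturated one projects higher total sales; the finer the model's granularity, the more trustworthy such comparisons become.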
Operational improvements also extend to agility in marketing decision-making and the ability to adapt quickly to changes in market dynamics. By monitoring granular data, marketing analysts can rapidly identify shifts in consumer behavior, emerging trends, competitive challenges, or the evolution of a target's reaction to a media or creative element, the performance of a media strategy, or a communication tactic.
While granularity’s contribution to the measurement and optimization of marketing effectiveness appears undeniable, its integration into decision support systems remains a sizable project, to be managed as such.
The first challenge is of course to determine the optimal level of granularity within a data platform. A major pitfall is aiming for too much granularity: a higher level of detail generates additional costs. It is also necessary to remain vigilant regarding:
The point is not to forgo the fundamentals of curation and information collection... no data science technique can make up for low-quality or irrelevant source data!
Similarly, the choice of modeling approaches and processing algorithms is crucial. A compromise must often be found between the granularity of the collected data and the complexity and performance of the required models. A reasonable level of detail is thus determined for model calibration, exploitation, and validation. In any case, data at a finer level of detail can still be kept in the data platform and used for ad hoc studies relying on specific data science procedures. For a retailer, for example, data is modeled by product department for recurring analyses, while finer detail (by brand within a department, for instance) serves as the basis for ad hoc studies when a more advanced question needs to be analyzed (for example, the synergy between a retailer's communication and a manufacturer's promotion strategies).
fifty-five’s experience shows that in most cases, success rests on a progressive, step-by-step implementation. "Baby steps", the basis of any "Marketing Efficiency"* decision support system development, are even more essential when aiming for gains in granularity. Each step gives rise to a critical review and a prioritization of needs, both in terms of data platform enrichment and of algorithms capable of extracting and exploiting insights on marketing action effectiveness.
For an advertiser in the "Consumer Goods" sector in the United States, the first step after implementing a nationwide MMM solution was to test the potential gains of modeling at a finer geographical scale, for which weekly sales data were available.
After a few weeks, the analysis of a sample of these areas:
In parallel, the brand was able to compare these gains with its controlled experiments on geographical areas (GeoTests / GeoExperiments) and its quasi-experimental studies based on historical observational data.
The second step of this process, carried out in 6 weeks, consisted of adapting the data platform and models to generalize this progress in both geographical and media dimensions.
A third step allowed for the enrichment of the system by taking into account:
In conclusion, data granularity is crucial for marketing mix modeling and, more generally, for marketing effectiveness measurement and optimization tools. It enables increased precision, greater actionability, more robust forecasts, and faster responsiveness to market changes. Achieving the optimal granularity, however, requires appropriate data management and a judicious choice of modeling algorithms.
Article co-written by Mathieu Lepoutre and Arnaud Parent