Attribution is digital marketing’s new battleground. The science of assigning value to customer touch points has never been more talked about, more invested in or more open to disruption than it is now. Attribution is big business: UK companies spent almost £40m in 2016 trying to understand the relationship between intent and conversion.
The art of attribution modelling has changed rapidly over the last few years. In its early forms, attribution for online marketing was linear, relying almost exclusively on first-click or last-click models. Both were popular and logical, but both assigned value to marketing channels by focusing on very specific parts of the customer journey: the start or the end.
In time, these linear models became more sophisticated. Time-decay and metric-driven attribution became widespread a few years ago, particularly as scrutiny fell on online display ads. It was Google Analytics that, in 2012, codified naming conventions for the various attribution models available within its system, and given GA’s reach and accessibility these became the lingua franca of multi-channel attribution (as long as you could track the channel via Google Analytics, of course!)
However, no matter what attribution model Google or the other exponents of traditional modelling have created over time, one fundamental problem has been common to them all: these models were assumptive. They were implemented based on a predisposed conclusion about which model was best or made the most sense. So, no matter its level of sophistication, the model was still assigning an arbitrary value to each marketing interaction, and the importance of that arbitrary value was preordained by a human assumption.
So, if your business believed a user’s first interactions were the most important to their engagement with your product, you went first-click; if you believed the interactions closest to purchase mattered most, you went last-click or time-decay (which gives the most weighting to the most recent interactions). Discount-orientated businesses often went last-click.
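To make the contrast concrete, the traditional rule-based models above can be sketched in a few lines of Python. The weightings here, including the time-decay half-life, are illustrative choices for the sketch rather than any vendor’s exact formulas:

```python
from typing import Dict, List

def attribute(journey: List[str], model: str, half_life: int = 7) -> Dict[str, float]:
    """Split one unit of conversion credit across a journey's channels.

    `journey` is an ordered list of channel names, oldest touch first.
    """
    n = len(journey)
    if model == "first_click":
        weights = [1.0 if i == 0 else 0.0 for i in range(n)]
    elif model == "last_click":
        weights = [1.0 if i == n - 1 else 0.0 for i in range(n)]
    elif model == "linear":
        weights = [1.0 / n] * n
    elif model == "time_decay":
        # Touches closer to conversion earn exponentially more credit.
        raw = [2 ** (-(n - 1 - i) / half_life) for i in range(n)]
        total = sum(raw)
        weights = [w / total for w in raw]
    else:
        raise ValueError(f"unknown model: {model}")
    credit: Dict[str, float] = {}
    for channel, w in zip(journey, weights):
        credit[channel] = credit.get(channel, 0.0) + w
    return credit

journey = ["display", "email", "search"]
print(attribute(journey, "first_click"))  # all credit to "display"
print(attribute(journey, "last_click"))   # all credit to "search"
```

Whatever the rule, the human has decided in advance where the credit goes; the data merely fills in the channel names.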
Enter the world of machine-learned attribution. Machine learning is the idea of computers constructing algorithms that learn and evolve based on data. While the concept might conjure some 2001: A Space Odyssey vision of computers subsuming the human race, machine learning is in practical use across the internet today: news aggregators comb millions of articles to recommend the ones most suitable to you, and estate agents predict house prices.
Machine-learned attribution models in digital marketing apply algorithms to the millions of customer touch points generated by an e-commerce company, awarding value to each of that company’s marketing channels based on how important the channel is in converting customers to purchases.
We can think of machine-learning applied to marketing attribution in two ways:
Sorting Data Automatically - instead of manually crunching data to classify marketing into different buckets and models, a machine-learned algorithm does this automatically, deciding on the fly how best to use each data point.
Removing Human Assumptions - machine-learned attribution lets the data speak for itself. There are no arbitrary decisions about how important a channel, a place in the chain or a customer interaction is. Instead, the model collects and analyses vast numbers of customer journeys and touch points, and defines a channel’s value by how necessary its presence in a customer interaction is to drive a conversion. So instead of feeding data into a preconceived model that crunches it and returns answers, the model itself is created and evolved from the data.
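One simple way to see “the model is created from the data” in action is a removal-effect calculation: credit each channel by the share of observed conversions that would be lost if every journey touching that channel failed. This is a toy sketch of the idea on hypothetical data, not Google’s algorithm:

```python
from typing import Dict, List, Tuple

# A journey is (ordered list of channels touched, did it convert?).
Journey = Tuple[List[str], bool]

def removal_effect(journeys: List[Journey]) -> Dict[str, float]:
    """Credit channels by the fraction of conversions lost without them."""
    total = sum(1 for _, converted in journeys if converted)
    channels = {c for path, _ in journeys for c in path}
    effects = {}
    for ch in channels:
        # Pessimistic assumption: a converting journey that touched `ch`
        # would not have converted had `ch` been removed.
        surviving = sum(1 for path, conv in journeys if conv and ch not in path)
        effects[ch] = (total - surviving) / total
    # Normalise so the credit sums to 1 across channels.
    scale = sum(effects.values())
    return {ch: e / scale for ch, e in effects.items()}

journeys = [
    (["search"], True),
    (["display", "search"], True),
    (["display"], False),
    (["email", "search"], True),
    (["email"], False),
]
credit = removal_effect(journeys)
# "search" earns the most credit: no observed conversion happened without it.
```

Notice that no weighting was chosen up front; the credit falls out of which journeys actually converted.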
Probably the most well-known exponent of machine-learned attribution was Adometry, which Google acquired in 2014 and rebranded as Google Attribution 360: a cross-channel, data-driven attribution product that could crunch data at mind-bending rates and assign value to an advertiser’s various marketing channels based on the ever-evolving trends within customer purchases.
And now, thanks to Google’s recent announcement that their new Google Attribution tools will be free to access, machine-learned attribution will be available to the masses, sounding the death knell for the rigid parameters and biased assumptions of traditional attribution.
Google Attribution will come with a few key pieces of functionality that will make it a must-have for digital marketers who are searching for the answer to that most elusive of questions - ‘where should my marketing spend really go and why?’
Google claim that their new Attribution product will solve a number of critical issues, chief among them the difficulty of using solutions that are often complex and poorly integrated with ad-serving tools. There is also the cross-device question: multi-device data is perennially underserved by attribution solutions, and because Google has so much logged-in reach across so many devices, it can present a more meaningful view of cross-device purchasing than any other third-party attribution provider.
It’s no secret that tying online and offline conversion data together to create a complete picture of customer purchasing behaviour is considered the elixir of multi-channel retailing. Google’s ‘In-Store Visits’ metric is the company’s first cut at addressing this, pulling data from a variety of sources such as wi-fi touch points, Google Maps, GPS and logged-in data to create in-store traffic estimates from online marketing campaigns. We can expect this kind of data to feature prominently in both Google Attribution and its paid-for big sister, Google Attribution 360, in the months ahead.
But Google Attribution’s most heralded innovation will be a switch to data-driven attribution modelling. Google themselves said: “Data-driven attribution uses machine learning to determine how much credit to assign to each step in the customer journey - from the first time they engage with your brand for early research down to the final click before purchase.”
This is somewhat understated: Google Attribution will, for the first time, give marketers free access to a data-led, machine-learned solution offering the most credible, assumption-free view of cross-channel marketing value available today. And of course, it’s fully integrated with Google’s suite of marketing delivery tools, including AdWords, DoubleClick, AdSense and Google Analytics.
The latter point will, of course, shine a light for some on the main point of contention with Google Attribution: its impartiality. There have already been whisperings about this being yet another step towards Google becoming a “one-stop shop” for marketers, marking their own homework in an even more overt way than with Google Analytics.
And while these concerns are not without merit, it feels like that particular horse has already bolted. With so much advertising revenue now spent with Google, would it really make sense for them to create a biased attribution solution that favoured their own inventory? Brands are already so invested in Google that it’s hard to imagine a free Attribution solution tipping this imaginary scale of impartiality.
Machine-learned, data-driven attribution algorithms are also sometimes criticised for being too ‘black-box’. After all, last-click is a concept we can all understand and explain. A data-led algorithm that analyses conversion patterns and compares the paths of customers who convert with those who don’t to determine value - well, that isn’t going to be quite as easy to explain to the CEO! But this view overlooks the whole point of attribution modelling: helping marketers make the most accurate and well-informed decisions possible about where to put their marketing budgets. And it brings us back to this idea of assumption: machine learning takes the guesswork out of attribution.
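That converting-versus-non-converting comparison need not stay black-box, though. The simplest version of it can be shown in a few lines: for each channel, compare the conversion rate of journeys that include it with the rate of journeys that don’t. This is a toy contrast on hypothetical data, not a production algorithm:

```python
from typing import Dict, List, Tuple

# A journey is (list of channels touched, did it convert?).
Journey = Tuple[List[str], bool]

def channel_lift(journeys: List[Journey]) -> Dict[str, float]:
    """Conversion-rate lift: P(convert | channel present) - P(convert | absent)."""
    channels = {c for path, _ in journeys for c in path}
    lift = {}
    for ch in channels:
        with_ch = [conv for path, conv in journeys if ch in path]
        without = [conv for path, conv in journeys if ch not in path]
        rate_with = sum(with_ch) / len(with_ch) if with_ch else 0.0
        rate_without = sum(without) / len(without) if without else 0.0
        lift[ch] = rate_with - rate_without
    return lift

journeys = [
    (["search"], True),
    (["display", "search"], True),
    (["display"], False),
    (["email", "search"], True),
    (["email"], False),
]
print(channel_lift(journeys))  # "search" shows the largest lift
```

A per-channel lift table like this is the kind of explainable summary that can bridge the gap between the algorithm and the boardroom.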
And while machine-learned attribution models are not exactly cutting-edge science, a freely available tool led by the world’s biggest internet company may finally stem the tide of misinformation that has characterised how marketing budgets are spent for far too long. Finally, data and science are being used to guide, rather than validate, decision-making.