30 second summary:
- As third-party cookies are phased out and marketers seek alternative approaches, they risk getting lost in a sea of data when measuring and evaluating impact.
- Focusing on user quality rather than attributable conversions can soften the blow of losing third-party cookies.
- Moving from cookies to a new engagement model will require constant testing, so keep the data simple where possible
For years, digital marketers have been spoiled by third-party cookies and the ability to accurately track engagement – they made life easier and made reporting on campaign activity a breeze. This approach let us see with minimal effort how many conversions Meta, Criteo or an influencer contributed. But the eventual disappearance of third-party cookies means accurate engagement data is essential if the switch to new identifiers is to be as clear as possible. Yet, whether out of ignorance or convenience, many advertisers continue to treat overly positive, blindly optimistic metrics as the truth.
Counting your chickens before they've converted
If we take Facebook as an example, it has no way of knowing to what extent its services contributed to a conversion. There are many ways to produce heavily inflated numbers, such as a journey with multiple touchpoints where the conversion is credited to several channels, or simple false positives. This is particularly concerning for those doing heavy remarketing to past users who have already visited or interacted with a site. The question must be asked: when working with inaccurate metrics, will remarketing actually contribute further conversions, or will it simply attribute clicks to campaigns that don't increase revenue?
As humans, we love to oversimplify things, especially complex models. Consider how complex a single visit to your web page is: a session is connected to a user, carrying attributes such as age, gender, location, interests and their current activity on your site. This user data is then sent, for example, to Google Ads via a remarketing list.
The remarketing list also introduces a notable variable when trying to make sense of conversions. Facebook and Google users do not map 1:1: a Google user is often logged in on more devices and browsers than the average Facebook user. You may get a conversion from a device that Google has linked to the same user, while Facebook has no such detailed information.
Populate remarketing lists with each user who visits your website. These lists feed Lookalike Audiences on Facebook and Similar Audiences on Google. These audiences can be extremely useful: although traffic from a channel may be attributed few or no conversions, it could in fact help build the most efficient Similar Audiences in Google Ads, which can then generate a large number of conversions at a low cost.
Identify data that helps you avoid over-attribution
All automated optimization efforts, whether campaign budget optimization (CBO) or target CPA, depend on data. The more data you feed the machines, the better the results you get. The larger your remarketing lists, the more efficient your automated/smart campaigns on Google will be. This is what makes user value so multifaceted and incredibly complex, even before you take into account the impact of an ad impression.
With this incredible complexity, we need an attribution model that can genuinely represent engagement data without inflating or underestimating a campaign's conversions. However, although many models may be suitable for producing accurate results, it should be remembered that attribution itself is flawed. As consumers, we know that the actions driving us to conversions in our personal lives are varied, and many of them cannot be traced well enough to be attributed. While attribution may not be perfect, it is essentially the best tool available, and it becomes much more useful when applied alongside other data points.
The last non-direct click attribution model
When trying to avoid bloated data, the simplest attribution model is last non-direct click. With this model, all direct traffic is ignored and full conversion credit goes to the last channel the customer clicked, preventing a single conversion from being attributed to multiple touchpoints. It is a bare-minimum model that nevertheless solves the over-attribution problem by discounting direct visits. This way, marketers can measure effect instead of splitting fractions of a conversion across different campaigns or channels. It really is a straightforward approach: essentially, "if we do x, does y increase?". Of course, like all attribution models, last non-direct click has its downsides. It is not a perfect answer to over- or under-attribution, but it is an easily replicable and strategically sound approach that provides reliable data you can measure in one place.
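To make the model concrete, here is a minimal sketch of last non-direct click attribution. The channel names and the example journey are illustrative, not taken from any particular analytics platform:

```python
def last_non_direct_click(touchpoints):
    """Return the channel credited with the conversion.

    touchpoints: channel names in chronological order, ending at
    the conversion. Direct visits are skipped; credit goes to the
    most recent non-direct channel. If the customer only ever
    arrived directly, "direct" is credited as a fallback.
    """
    for channel in reversed(touchpoints):
        if channel != "direct":
            return channel
    return "direct"


# Example: the customer clicked a Facebook ad, later a Google ad,
# then returned directly twice before converting.
journey = ["facebook", "google_ads", "direct", "direct"]
print(last_non_direct_click(journey))  # google_ads
```

Note how the two direct visits at the end are ignored entirely: the full conversion credit lands on one channel, which is exactly what makes the "if we do x, does y increase?" comparison possible.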
In any case, the delayed death of the third-party cookie is certainly causing many to re-evaluate their digital advertising methodologies. For now, proactive marketers will continue to look for privacy-friendly identifiers that can serve as workarounds. First-party data may have a larger role to play if user consent can be reliably acquired. While you wait for the transition, getting your data in order and finding accurate, reliable approaches to attribution must be a priority.
Ensuring the accuracy of this data is therefore paramount. This can be achieved by making sure there are no discrepancies between clicks and sessions and that all web pages are tracked accurately. In the absence of automatic tracking, UTM parameters should be used to tag all campaigns and, where possible, tracking should be server-side. Finally, marketers should test their tracking with Tag Assistant and make sure they don't create duplicate sessions or lose metrics mid-session. Ultimately, once the third-party cookie becomes completely obsolete, the direction marketers take will be decided by the data, which must be as accurate as possible.
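UTM tagging is easy to get wrong by hand, so it is worth generating campaign URLs programmatically. The sketch below shows one way to do this with Python's standard library; the parameter values and the `add_utm` helper are illustrative, not part of any official tool:

```python
from urllib.parse import urlencode, urlparse


def add_utm(url, source, medium, campaign):
    """Append UTM parameters to a landing-page URL.

    Uses '&' if the URL already has a query string, '?' otherwise,
    so existing parameters are preserved.
    """
    params = urlencode({
        "utm_source": source,      # e.g. the ad platform
        "utm_medium": medium,      # e.g. cpc, email, social
        "utm_campaign": campaign,  # your internal campaign name
    })
    separator = "&" if urlparse(url).query else "?"
    return f"{url}{separator}{params}"


tagged = add_utm("https://example.com/landing", "facebook", "cpc", "spring_sale")
print(tagged)
# → https://example.com/landing?utm_source=facebook&utm_medium=cpc&utm_campaign=spring_sale
```

Generating every campaign URL through one helper like this keeps source, medium and campaign names consistent, which is what makes the resulting sessions comparable across channels in your analytics.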
Torkel Öhman is CTO and co-founder of Amanda AI. Responsible for creating Amanda AI, with his data/analytics expertise, Torkel oversees all technical aspects of the product, ensuring all ad accounts run smoothly.
Sign up for the Search Engine Watch newsletter for insights into SEO, the search landscape, search marketing, digital marketing, leadership, podcasts, and more.