The Battle for Consent in 2023

Mattia Fosci


CEO & Co-founder

February 15, 2023

2023 is shaping up to be the battleground year for consent. You thought the introduction of consent management platforms (CMPs) dealt with that? Wrong. This is a problem that isn’t going away anytime soon. So in this article we’ll walk you through some of the issues the industry is facing right now and how Anonymised is helping to solve them.

Does my business need a CMP?

If you’re processing the personal data of your users or customers, then you need a CMP: technology that websites use to obtain legal consent from users to process their personal data, typically via the cookies and trackers in operation on the domain. However, ever since the General Data Protection Regulation (GDPR) required publishers to gain consumer consent before acquiring their data, many have looked for new ways to keep capturing lucrative personal information. And with this came the rise of so-called ‘dark patterns’.

What are dark patterns, and why are they harmful?

Dark patterns are a type of online architecture designed to deliberately obscure, mislead, coerce and/or deceive website visitors into making unintended and possibly harmful choices. There are many different types of dark patterns, but most, if not all, are designed to trick users into disclosing more data than they planned, often to their own detriment but to the benefit of the business or organisation. In a recent paper, ‘Dark Patterns after GDPR’, researchers found that ‘dark patterns and implied consent are ubiquitous’. They’re particularly insidious because they target psychological vulnerabilities and limited attention spans, appealing to users’ emotions to get them to part with personal information.

There’s no doubt that the introduction of CMPs has drastically reduced consent rates, and will continue to do so, undermining not just advertising but also analytics and other uses of the data. Judging by the high industry adoption of dark pattern tactics, from ‘bundled consent’ to ‘address book leeching’, many businesses clearly think they’re an effective way to increase traffic and thus conversions. However, the reality is often the opposite: dark patterns damage trust and create a confusing, frustrating user experience. More importantly, dark patterns have increasingly started to receive legislative and regulatory scrutiny, both abroad and in the UK. It’s unlikely that those who have been swimming in the murky waters of dark patterns will be able to for much longer.

Build that (cookie) wall

If not dark patterns, then what? Many think the ‘catch-all’ solution is to introduce a cookie paywall: restricting access to content, or much of the site, unless users agree to be tracked or pay a regular subscription for privacy – effectively a “pay or okay” system. With some of the biggest publications in the world, including The Washington Post and The New York Times, using paywalls as a successful revenue model, it’s no surprise that data protection authorities in several countries (such as Austria) think that it promotes genuine user choice. However, there is opposition, including from the ICO in the UK. The reality of paywalls is that for many users, paying for privacy is expensive and inaccessible, offering no real or free choice and effectively ending the ‘free internet’ – not to mention that businesses on average make only a small return per user from passing on data. And it’s not just the ICO: there is also growing concern about the GDPR compliance of cookie paywalls, with the Dutch DPA having already moved to make them illegal. Who knows how long it will be before other DPAs follow suit?

Problems ahead for the TCF

The TCF – the flagship data-sharing framework created by the IAB for gathering internet users’ consent to targeting with behavioural ads – was supposed to be the consent solution that brought businesses into line and helped users take back control of their private data. The rollout has been popular, with the framework widely adopted across the web by companies and businesses – including tech giant Google. However, the Belgian Data Protection Authority has referred a case to the European Court of Justice over the framework’s apparent failure to comply with the GDPR principles of transparency, fairness and accountability, as well as the lawfulness of processing. With a ruling expected at the end of 2023 or start of 2024, it’s clear that what the industry needs right now is clarity – or a TCF 3.0 that fixes these problems and gives companies more certainty.

How Anonymised will win the battle

We believe dark patterns are inherently wrong and advise the companies we work with to avoid them at all costs. Nor is the TCF doing anything to help: legitimate interest should never override a lack of consent, and users shouldn’t have to actively object to it – that works for advertisers, but not always for users. Finally, paywalls are an understandable solution for publishers, and we sympathise with their reasons for adding them; however, we’re concerned about what they mean for a free internet for all. That’s why Anonymised is set to be victorious in this particular battle. By using contextual lookalikes, we can provide the data anonymously – and, even more importantly, without needing consent. Our unique technology predicts a person’s interests based on the common interests of other visitors to the same page. The more high-quality data put into our system, the more accurate these predictions get.
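To make the idea concrete, here is a deliberately simplified sketch of page-level interest prediction – this is an illustration of the general ‘contextual lookalike’ concept, not Anonymised’s actual implementation, and all names in it are hypothetical. Interests are aggregated per page across visitors, so a new visitor’s likely interests can be inferred from the page they’re on rather than from any individual profile:

```python
from collections import Counter, defaultdict

class ContextualLookalike:
    """Toy sketch of contextual lookalike prediction (illustrative only):
    predict a visitor's likely interests from the aggregate interests of
    other visitors to the same page, with no individual-level tracking."""

    def __init__(self):
        # page URL -> aggregate interest counts across all recorded visits
        self._page_interests = defaultdict(Counter)

    def record_visit(self, page, interests):
        """Fold one visit's interest tags into the page-level aggregate."""
        self._page_interests[page].update(interests)

    def predict(self, page, k=3):
        """Return the k most common interests among the page's visitors."""
        return [tag for tag, _ in self._page_interests[page].most_common(k)]

model = ContextualLookalike()
model.record_visit("/tech/review", ["gadgets", "audio"])
model.record_visit("/tech/review", ["gadgets", "photography"])
model.record_visit("/tech/review", ["gadgets", "audio"])

print(model.predict("/tech/review", k=2))  # -> ['gadgets', 'audio']
```

The key design point is that only per-page aggregates are stored; no profile is ever keyed to an individual visitor, which is why no consent prompt is needed for the prediction itself.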
