Targeted ads and Meta’s fine: lessons for consent and data protection literacy

Edoardo Celeste (Dublin City University)

4 January 2023 was definitely not a good day for Meta, Mark Zuckerberg’s multinational company that owns both Facebook and Instagram. Not only did Facebook lose a bid to dismiss a US lawsuit brought by a sculptor who accused the company of allowing counterfeit ads that violated the artist’s copyright, but the company also received a record fine of €390 million from the Irish Data Protection Commission.

Facebook and Instagram were ordered to pay fines of €210 million and €180 million respectively for violating the General Data Protection Regulation (GDPR), and more specifically for relying on their terms of service as a legal basis for processing data for the purpose of targeted advertising, which represents the core business of most social media platforms. The business model of these companies consists of building profiles of their users – for example, distinguishing them by age, gender and interests, and categorising them according to their ‘behaviour’ on the platform – and allowing advertisers to target specific categories of individuals. In this way, for example, a sports car manufacturer trading in Europe will not waste money advertising its products to young individuals in Latin America with an interest in environmental protection; instead, its adverts will be displayed to categories of users that correspond to its typical customer, such as middle-aged Europeans with a passion for motoring. To do so, social media platforms constantly process data about users’ behaviour in a practice called ‘profiling’.

Is profiling illegal?

Profiling is not an illegal activity per se. Even though it may sound like a form of corporate spying, in reality it represents a normal form of data processing. But there are, of course, conditions to respect. First, it goes without saying that profiling for targeted advertising purposes cannot consist of revealing to a potential advertiser that a specific individual – Mrs X – loves plants and cosmetics. Social media companies sell the opportunity to display specific adverts to categories – ie groups – of individuals. Secondly, social media companies cannot collect and process users’ data secretly and surreptitiously. In 2018 the so-called Cambridge Analytica scandal emerged: a political consulting firm had used personal data acquired from millions of Facebook users via a personality profiling app to influence their political choices through targeted advertising. In light of this episode, the second condition might seem obvious to any reasonable individual but, in reality, the boundary between legal and illegal profiling is more blurred than one might think – and that was indeed the contentious matter that led to this year’s unprecedented fines imposed on Meta.

Take it or leave it

Meta did not ask for specific consent to process users’ data for targeted advertising purposes on its platforms. Instead, it relied on the users’ acceptance of its Terms of Service, the ‘contract’ that every user has to accept in order to use a platform. The Terms of Service of Meta’s products did not conceal this circumstance – users were told very explicitly and in plain language. However, until now users have not had the option of using Meta’s products without being subject to data profiling: take it or leave it. According to Meta, targeted advertising represents a core component of social media platforms, and this justifies including data processing for this purpose in its Terms of Service as a ‘contractual necessity’.

A clash of visions

Article 6(1)(b) GDPR allows data processing that is ‘necessary for the performance of a contract’, and it is by relying on this legal basis that Meta has thus far processed the personal data of its users for profiling purposes. However, the European Data Protection Board, the body that brings together representatives of all the European national data protection authorities, held in its decision of December 2022 that Meta had in this way essentially ‘forced’ its users into being subjected to targeted advertising, without providing them with a real possibility to consent to it. The fact that this decision was taken by the Board reveals that there is no single, univocal vision of the role of targeted advertising in the context of social media. Indeed, the case landed on the desk of the Board because the Irish Data Protection Commission could not agree on the entirety of the fines with the other national data protection authorities involved: the former sided more with Meta, arguing that social media intrinsically offer a ‘personalised experience’ – read: users receive targeted advertising – while the latter contended that social media are spaces for social interaction and that targeted advertising represents an ancillary business activity. Beyond the pure data protection question of how to interpret the GDPR, this tension exposes a conflict between two visions. On the one hand, social media may be seen as private spaces, fiefs of powerful multinational companies, mere business activities. On the other hand, online platforms may be seen as public forums, spaces for social interaction, services essential to the exercise of constitutionally protected freedoms.

Why should we read a banner?

Last, but certainly not least, I am sure the reader is – sadly – wondering: what would change if Meta were required to ask for specific consent to process users’ data for targeted advertising, rather than simply stating this circumstance as a fact in its Terms of Service? Would yet another banner, which every one of us would distractedly click on just for the sake of seeing the page behind it, make any difference? Unfortunately, beyond the goodwill of the EU legislator, who in the GDPR sets specific requirements for users’ consent, the reality is that we rely too heavily on free online services and have little understanding of the purposes of data protection, which is often perceived as too bureaucratic and formalistic. A bitter but formative lesson for EU data protection law: the fines introduced by the GDPR are crucial to educate big corporations, but we should also find a way to make that banner simpler to understand and to explain to users that their data protection choices might not be so irrelevant.

* Dr Edoardo Celeste is an Assistant Professor of Law, Technology and Innovation and Programme Chair of the Erasmus Mundus Master in Law, Data and AI at the School of Law and Government of Dublin City University. In 2022 he was awarded the Prize as the Best Early Career Researcher of the Year by the Irish Research Council.

The views expressed in this blog reflect the position of the author and not necessarily that of the Brexit Institute Blog.
