The 3 biggest challenges in GDPR for online media & advertising

Dr Johnny Ryan

This note explains the three deepest challenges that the online advertising industry must overcome to survive the new European data rules. It also outlines our approach. 

The General Data Protection Regulation (GDPR) and the ePrivacy Regulation (ePR) pose particular challenges for publishers, brands, and adtech companies. These go beyond the normal gap analysis and security overhaul that other businesses must undertake to comply with the new rules. Online advertising and media businesses’ ability to function online depends on the outcome of three deep challenges.

Deep Challenge 1: Obtaining consent to process an internet user’s personal data.

Despite some lingering debate to the contrary, businesses will need consent from internet users to use their personal data for online behavioural advertising. This poses a UX challenge: businesses asking for consent must relay detailed information to users about how their data will be treated.

Businesses will have to provide the following information to internet users when seeking their consent.[1] 

  • Who is collecting the data, and how to contact them or their European representative.
  • What the personal data are being used for, and the legal basis of the data processing.
  • The “legitimate interests” pursued by the party using the data (a legal basis that direct marketing companies may rely on).
  • With whom the data will be shared.
  • Whether the controller intends to transfer data to a third country, and if so, whether the European Commission has deemed that country’s protections adequate, or what alternative safeguards or rules are in place.
  • The duration of storage, or the criteria used to determine duration.
  • That the user has the right to request rectification of mistakes in this personal information.
  • That the user has the right to withdraw consent.
  • How the user can lodge a complaint with the supervisory authority.
  • What the consequences of not giving consent might be.
  • In cases of automated decision-making, including profiling, what the logic of this process is, and what the significance of the outcomes may be.
Yet the consent request must also be concise[2] and “not unnecessarily disruptive” to the user experience.[3] In addition, user permissions must be carefully handled and audited, and the various data rights enshrined in the GDPR must be provided for.[4]
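As a rough sketch of what handling and auditing these permissions might involve, the disclosure items above could be captured in a structured consent record. The field names and schema below are purely illustrative assumptions, not drawn from any standard or from PageFair’s actual design:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

# Hypothetical consent record covering the GDPR disclosure items listed above.
# Field names are illustrative; no standard schema is implied.
@dataclass
class ConsentRecord:
    controller: str              # who is collecting the data
    controller_contact: str      # how to contact them or their EU representative
    purposes: list               # what the personal data are being used for
    legal_basis: str             # e.g. "consent" or "legitimate interests"
    recipients: list             # with whom the data will be shared
    third_country_transfer: bool # whether data leave the EU
    safeguards: str              # adequacy decision or alternative safeguards
    retention_criteria: str      # storage duration, or criteria used to set it
    granted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))
    withdrawn_at: Optional[datetime] = None  # withdrawal must be as easy as giving

    def is_active(self) -> bool:
        """Consent counts only while it has not been withdrawn."""
        return self.withdrawn_at is None

record = ConsentRecord(
    controller="Example Publisher Ltd",
    controller_contact="privacy@example.com",
    purposes=["behavioural advertising"],
    legal_basis="consent",
    recipients=["Example Exchange"],
    third_country_transfer=False,
    safeguards="n/a",
    retention_criteria="13 months, then deleted",
)
```

An auditable log of such records, with timestamps for both granting and withdrawal, is one way to demonstrate that the withdrawal right is honoured as easily as consent was given.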

Deep Challenge 2: Stopping personal data from leaking during online ad transactions.

Solving Deep Challenge 1 and obtaining the consent of a data subject is not sufficient, however, because there are two points of data leakage in the online ad system that break the rules. This poses enormous risks to all parties in the online behavioural advertising system.

Leakage point A: sharing personal data to solicit bids.

Every time an ad is shown, an advertising exchange sends personal data about the website visitor (the person who is about to see the ad) to hundreds of its partners. Most of these partners are prospective advertisers, and the data they receive in the milliseconds before an ad is shown inform their decision about whether the visitor is a suitable target for their advertising, and whether they should bid money so that their ad might be shown to the person visiting the web page.

The competition between these bids is supposed to select the highest price for the ad, and make advertising more discriminating. A recent refinement of this technology is “header bidding”, in which a website visitor’s personal data are shared not just among hundreds of partner companies by a single advertising exchange, but among thousands of companies by several exchanges. Personal data are broadcast so widely, among so many businesses, that it would be difficult to conclude the required contractual relationships among them, or to be assured that all recipients will treat the data properly.[5]
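To make concrete what “sharing personal data to solicit bids” looks like, here is a minimal sketch of the kind of request an exchange broadcasts to bidders. The field names loosely follow OpenRTB conventions, but the values, the exact fields shown, and the `broadcast` helper are illustrative assumptions, not any real exchange’s payload or API:

```python
import json

# Illustrative sketch of the personal data an exchange may broadcast to many
# bidders before an ad is shown. Field names loosely follow OpenRTB
# conventions; all values are invented for illustration.
bid_request = {
    "id": "auction-123",
    "site": {"page": "https://news.example/article"},   # what the user is reading
    "device": {
        "ua": "Mozilla/5.0 ...",                        # fingerprinting surface
        "ip": "192.0.2.1",                              # approximate location
        "geo": {"lat": 53.34, "lon": -6.26},
    },
    "user": {
        "id": "exchange-user-4f2a",        # pseudonymous but persistent ID
        "buyeruid": "bidder-cookie-9911",  # lets a bidder recognise the user
    },
}

def broadcast(request: dict, bidders: list) -> dict:
    """Simulate the leakage point: every listed bidder receives the same
    personal data, whether or not it ultimately bids for the impression."""
    return {bidder: json.dumps(request) for bidder in bidders}

responses = broadcast(bid_request, ["dsp-a", "dsp-b", "dsp-c"])
```

The point of the sketch is that the data leave the exchange the moment the auction starts: a bidder that never wins, and never shows an ad, has still received the visitor’s identifiers and location.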

Leakage point B: unauthorised collection via insecure advertising units.

Online ads can include executable code. This can be benign, perhaps rendering a simple HTML5 animation. Or it can be leaky, loading further code that lets unexpected companies track users.  Auditing ad code for data leakage is unlikely to be foolproof, as the code might be updated after auditing. Meanwhile, any malicious tracking code that has sneaked in can borrow from the malvertising playbook to lie dormant in testing environments, and only activate after it is launched into the wild.

There may be no way to guarantee that personal data are not leaked to unauthorised third parties without introducing PageFair-style server-side rendering.

Deep Challenge 3: Most users will be unknowable. 

Today, the majority of online advertising is targeted through the widespread sharing of personal data. But it is likely that this will soon be the exception: consent is required, few people will give it, and what consent is given will be piecemeal.

As we have written previously, people favour convenience over privacy. There are three reasons why this may change.

First, privacy may become a convenience in Europe. The draft ePrivacy Regulation requires that devices and browsers present users with a choice of what degree of tracking they will accept, and makes that choice enforceable.[6] Assuming this survives in the final text of the regulation, it is likely that most users in Europe will not choose tracking. Perhaps only a small percentage will opt back in to having their data widely used.

Second, users are going to learn how businesses in online advertising handle their personal information. As noted above, the GDPR requires that an exhaustive level of detail be provided to users on how their personal information is used by every party that wants to use it. It also envisages iconography to concisely communicate data use, risks, and rights in plain language.[7]

Third, the Regulation introduces a new focus on security that will contribute to user fears. All parties that handle data are now required to protect personal information from misuse and leakage.[8] In addition, data controllers have to tell users when their personal data have been stolen in a data breach.[9] The practice of covering up data breaches will end, and users will learn how often their data are exposed.

These developments mean that internet users will be confronted with the extent to which their behavior across the web is tracked, how these data are used, and how often they are stolen. The result will be a wave of paranoia about personal information. And at the same time that this occurs, users will have the new right under the ePR to opt out of tracking from the outset, and under the GDPR will have the ability to withdraw any consent they give for tracking at any time, as easily as they gave it.[10]  

Therefore, consenting, trackable users will be the exception. They will probably be accessible only through a subset of trusted publishers, and sold as a premium audience. This niche audience will take time to build, and great effort to retain. And the average internet user will be untracked and unknown.

Our conclusion is that the most important challenge of Europe’s new data rules is not how to get consent. Nor is it how to audit these permissions once given. The most important challenge is how to target advertising online without using personal data, because consent may be unobtainable more often than not. This is among PageFair’s core areas of focus today.

WHAT WE ARE PRIORITIZING AT PAGEFAIR 

  1. We are designing a mechanism to obtain consent, and to audit it.
  2. We are developing technology to neutralize personal data leakage from the online advertising system. This will protect all parties involved in an advertising campaign from legal risk.
  3. We are making a new technology to intelligently target advertising where personal data are not available.


 NOTES 

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L119/1, Recitals 39, 58, and 60-63, and Articles 13 and 14.

[2] ibid., Recitals 32 and 58, and Article 12, para. 1.

[3] This may be aided by new iconography envisaged by the European Commission, which is intended to concisely communicate data use, risks, and rights in plain language. ibid., Article 12, para. 7, and Article 70, para. 1, (r). Industry bodies should take the opportunity to contribute to this iconography.

[4] A data subject now has the right to access, correct, or delete their personal data, to object to automated decision making based on these data, and to move these data to another service. (The GDPR, Article 15 to Article 21.) In addition, the Regulation requires that it must be as easy for a data subject to withdraw consent as it was to give it at any time.

[5] The issue is not necessarily one of informing the user, since Article 13, paragraph 1, e, allows one to inform the user of the “categories of recipients of the personal data”. The issue is that the data controller must have contracts in place with data processors that guarantee that the processor handles the personal data only in the manner dictated by the controller. See ibid., Article 28, paras. 2, 3 and 4, and Article 29. While this is already required in the current Data Protection Directive (see (95/46/EC) 1995, Article 17, para. 3), the GDPR backs this up with new sanctions, and requires that these contracts define the nature and duration of processing (GDPR, Article 28, para. 3). Similar agreements must also be in place when one processor engages another (ibid., Article 28, para. 4), and a processor can only do so with express permission from the controller (ibid., Article 28, para. 2).

[6] Rapporteur’s draft report on the proposal for a regulation of the European Parliament and of the Council concerning the respect for private life and the protection of personal data in electronic communications and repealing Directive 2002/58/EC, June 2017, Recitals 22 and 23, Article 9, para. 2, and Article 10, paras. 1 and 2.

[7] ibid., Recital 32, and see also notes on iconography in Article 12, para. 7, and Article 70, para. 1, (r).

[8] GDPR, Article 32.

[9] ibid., Article 33 and Article 34.

[10] ibid., Article 7, para. 3, and Article 21, paras. 1 and 2.
