Although it is intimately connected to our everyday lives, advertising technology (adtech) goes unnoticed by most of us.
One of the most pervasive techniques used by websites and advertisers to put relevant ads in front of receptive eyes is the 'third-party cookie'.
What is a third-party cookie?
Cookies are tiny pieces of data that a website stores in your browser when you visit. A third-party cookie is set not by the site you are visiting but by another domain embedded in the page - typically an ad network. When you later browse to a different site that embeds the same third party, that third party can read its cookie back, recognise you, and use what it has learned to tailor the ads it shows or the deals it offers.
An example: suppose that you are planning a trip to Lisbon and doing online research about where to visit and what to eat. Some of these websites embed advertising networks that leave cookies in your browser, and those same networks can read the cookies back when you browse to other sites that embed them. The next time you visit Amazon you may find that - miraculously - there is a sale on guide books to Portugal.
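The mechanics described above can be sketched in a few lines of Python. This is a toy model of a browser's cookie jar, not real browser code, and all domain names are made up for illustration.

```python
# Toy model of third-party cookies. Not a real browser implementation;
# all domain names are hypothetical.

class Browser:
    def __init__(self):
        # Cookies are keyed by the domain that SET them,
        # not by the site the user is visiting.
        self.cookie_jar = {}

    def visit(self, site, embedded_third_parties):
        for tracker in embedded_third_parties:
            # The embedded third party reads back its own cookie
            # and appends the current site to the profile it keeps.
            profile = self.cookie_jar.setdefault(tracker, {"sites_seen": []})
            profile["sites_seen"].append(site)


browser = Browser()
# The same ad network is embedded on three unrelated sites.
browser.visit("lisbon-guide.example", ["ads.example"])
browser.visit("best-pasteis.example", ["ads.example"])
browser.visit("amazon.example", ["ads.example"])

# The tracker has assembled a cross-site browsing profile.
print(browser.cookie_jar["ads.example"]["sites_seen"])
# → ['lisbon-guide.example', 'best-pasteis.example', 'amazon.example']
```

The point of the sketch is that the profile lives under the tracker's key, so every site that embeds the tracker contributes to a single cross-site history.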
As simple and benign as third-party cookies sound, the mechanism is powerful. As internet users, we are spreading information about our browsing history far and wide. While more tailored advertising is in theory a benefit, the privacy implications are manifest.
How do we tackle the issues related to adtech?
The current status quo - the cookie consent banner that greets us on nearly every website - is transparency theatre. These warnings are meant to satisfy the GDPR principle of consent, but the majority of users who click through them do not take the time to give informed consent, and gain little assurance that their data is being used appropriately. Even if GDPR risk is mitigated in a technical sense, there is little practical benefit.
Adtech needs a major facelift!
One such direction, until Google abandoned it in January 2022, was FLoC, the Federated Learning of Cohorts. The idea behind FLoC was to map each person into a group of like-minded users called a cohort. A website could then request the cohort identity of a given user to generate targeted ads, without obtaining more specific or identifying data from the browser. Topics, Google's proposed replacement for FLoC, instead derives a set of user interests from browsing history and makes these interests available to advertisers in real time.
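To give a flavour of cohort assignment: FLoC's published design used a SimHash of the user's browsing history, so that similar histories map to the same small cohort id. The toy Python sketch below is loosely modelled on that idea and is not Google's actual algorithm; the domains and the 8-bit cohort size are illustrative.

```python
import hashlib

def cohort_id(domains, bits=8):
    # Toy SimHash: each visited domain "votes" +1/-1 on each bit position,
    # and the sign of the total decides that bit of the cohort id.
    # Similar browsing histories therefore tend to share an id.
    votes = [0] * bits
    for domain in domains:
        h = int(hashlib.sha256(domain.encode()).hexdigest(), 16)
        for i in range(bits):
            votes[i] += 1 if (h >> i) & 1 else -1
    return sum(1 << i for i, v in enumerate(votes) if v > 0)

# An advertiser would see only the small cohort id (0-255 here),
# never the underlying browsing history.
history = ["lisbon-guide.example", "best-pasteis.example", "fado-tickets.example"]
print(cohort_id(history))
```

The privacy argument is that many users share each cohort id, so the id alone cannot single anyone out; the criticism is that the id still leaks a summary of browsing history to every site that asks.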
The Google proposals have received mixed reviews, and may not be as private in practice as suggested. Then again, they may be more pragmatic than simply blocking third-party cookies: blocking looks more privacy-friendly at first glance, but it strongly incentivises far more privacy-destructive technologies such as device fingerprinting.
We see no solution currently on the table that gets adtech quite right.
The ingredients that are missing are meaningful transparency and minimality. A key aspect of adtech transparency should be to provide information about when a person’s browsing data is actually used. This avoids some of the problems of the never-ending cookie warnings, but also introduces ‘actionability’.
An example of actionable adtech transparency: you find that a particular company’s website is making heavy and seemingly unjustifiable use of your browser data. You then consider switching to an alternative business to get the same service.
This methodology, in which statistics about your adtech privacy are consumed after-the-fact, and then made actionable, eliminates the need for a constant stream of up-front warnings, without foregoing ultimate transparency or accountability.
As members of the PAD team, we cannot help but notice that this methodology is closely related to the trust-but-verify nature of decryptions under the PAD system. Once a user grants in-principle decryption approval to another party, that party does not need to request permission to decrypt. Instead, that party is subjected to unavoidable transparency about whether and when decryptions were made.
This counter-intuitive property allows many of the PAD use cases to fit together well. In the case of PAD Places, if you are in an emergency and a friend needs to identify your location, requiring that the friend seek permission may be unsuitable if you are unable to grant it.
Applying PAD-style transparency to adtech
Applying PAD-style transparency to adtech might be a modest part of solving the problems inherent to this domain. Technology that enables the average person to get a sense for which websites apply adtech proportionally - rather than provide a useless stream of blanket warnings - also could be used to empower the individual to be compensated for use of their browser data. Users who are highly motivated by this compensation could even optionally attach additional attributes about their identity or browsing history, and secure this information in encrypted form under PAD. If advertisers wish to use this data, they must transparently decrypt it and subject themselves to a micropayment to the user.
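As a sketch of how the pieces could fit together: decryption is never blocked up front, but every access is appended to a log the user can inspect and accrues a micropayment. All class and field names, and the payment amount, are hypothetical; this is an illustration of the idea, not the real PAD API.

```python
import hashlib
import json
import time

class TransparentDataStore:
    """Hypothetical sketch of PAD-style trust-but-verify data access."""

    def __init__(self, owner):
        self.owner = owner
        self.log = []        # append-only access log, visible to the owner
        self.balance = 0.0   # micropayments accrued to the owner

    def decrypt(self, advertiser, attribute, price=0.001):
        # No up-front permission request: the access succeeds, but it is
        # hash-chained into the log (so it cannot be quietly hidden)
        # and triggers a micropayment to the data owner.
        body = {"who": advertiser, "what": attribute, "when": time.time()}
        prev = self.log[-1]["hash"] if self.log else "genesis"
        digest = hashlib.sha256(
            (prev + json.dumps(body, sort_keys=True)).encode()
        ).hexdigest()
        self.log.append({**body, "hash": digest})
        self.balance += price
        return f"<decrypted {attribute} of {self.owner}>"


store = TransparentDataStore("alice")
store.decrypt("ads.example", "interest:travel")
store.decrypt("shop.example", "interest:portuguese-food")

# Alice can audit exactly who used which attribute, and is paid for it.
print([entry["who"] for entry in store.log])   # → ['ads.example', 'shop.example']
```

The hash chain is what makes the transparency "unavoidable" in spirit: an advertiser cannot remove or reorder a log entry without breaking every digest that follows it.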
This self-sovereign ownership of browser data may be a moonshot for now, but regardless, the notion of meaningful transparency already incentivises our second goal of minimality. Privacy-centric consumers will track and recommend businesses that make the most out of the least data and provide more relevant ads without making unjustifiable intrusions. Reputation-sensitive companies will respond and learn to live with less. Incentive compatibility is a hallmark of great technology.
We see meaningful transparency and incentivised minimality as the principles currently missing from the adtech debate surrounding third-party cookies and alternative solutions like Topics. Any technology that provides individuals with an easily interpretable layer to see how their data is actually used will play an outsize role in the adtech of Web 3.0.
If you would like to keep up to date with the PAD team, follow us on our social media channels:
LinkedIn: PAD Tech