A decade ago–long before the current controversies over what big companies are doing with our data–a lot of people were already irate about ad networks that followed their activity across sites to target marketing messages ever more precisely. A feature called Do Not Track arose as a simple, comprehensible way for browser users to take back their privacy. To opt out of being tracked, you’d check a box in your browser’s settings.
Notably, this didn’t opt out of advertising–just the technology used to target ads. With Do Not Track checked, no web server or embedded code would associate your behavior at a given site with actions elsewhere on the web. It was a great idea.
And now it’s dead.
Oh, for all practical purposes, DNT died years ago. But Apple’s removal of the Do Not Track preference from Safari for Macs and iOS in an update in early February officially signaled the end of what might have been a workable understanding between consumers and the advertisers that rely on ad-tech networks to target them.
Apple’s move follows the dissolution of a World Wide Web Consortium (W3C) project, the Tracking Protection Working Group, which shut down after eight years on January 17. The release notes for Safari 12.1 called Do Not Track an “expired” standard, which is sadly accurate.
In October 2018, on its public mailing list, the W3C group discussed how to describe Do Not Track’s failure in a preface to its final piece of work. After some back and forth, the group agreed on the language that appears in the final document:
…there has not been sufficient deployment of these extensions (as defined) to justify further advancement, nor have there been indications of planned support among user agents, third parties, and the ecosystem at large.
It’s an artful self-own by the group’s participants, which included representatives from ad industry trade groups, large advertisers, and ad delivery platforms, as well as ones from privacy groups, governments, and browser makers. After a flurry of work from 2011 to 2013, the group hadn’t met face to face since 2013, according to its notes.
The working group’s existence used to imply that the ad industry was actively moving towards a consensus on self-regulation when it came to online privacy. But DNT turned out to be a useful fig leaf, not a solution. “The best way to sabotage a process is by wholeheartedly participating in it,” says Alan Toner, a privacy and data protection special adviser at the Electronic Frontier Foundation (EFF), who represents his organization at the W3C.
In the ultimate irony, Apple told me via a spokesperson that it removed Do Not Track after the W3C group shuttered because, if enabled, it could help ad networks “fingerprint” a browser, a technique used by tracking systems to defeat ad blockers by identifying unique characteristics in a user’s browser configuration.
It could have all been so different.
Do Not Track bubbled up from the seeming success of the Federal Trade Commission’s National Do Not Call registry, which went into effect in 2003. It allowed consumers to register their phone numbers as being unwelcoming to commercial solicitations. Companies making calls to people other than customers have to purge these numbers from calling databases. (Do Not Call was ultimately a failure, because it only prevented scrupulous parties from calling, not those who blithely ignored the law or were engaged in outright scams.)
Initially, Do Not Track was going to be a similar kind of central registry. But in 2009, privacy advocates Chris Soghoian and Sid Stamm implemented the idea as a simple Firefox plug-in. The plug-in would add a Do Not Track header to the metadata a browser sends to a server on initiating a connection. If a user had enabled Do Not Track, the value of the header would be “1”; otherwise, “0.” It was that simple. It didn’t matter from a technical perspective that no server knew how to interpret that header at the time and therefore ignored it; the policy details could be worked out later.
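The plug-in’s mechanism really was that minimal. As a rough sketch of its effect at the HTTP level (the function name and structure here are illustrative, not taken from the actual plug-in’s code):

```python
# Sketch of the extra header a DNT-aware browser attaches to each request.
# Per the original plug-in's behavior: "1" if the user enabled Do Not Track,
# "0" otherwise. Servers at the time simply ignored it.

def request_headers(dnt_enabled: bool) -> dict:
    """Build the Do Not Track header a browser would send."""
    return {"DNT": "1" if dnt_enabled else "0"}

print(request_headers(True))   # {'DNT': '1'}
print(request_headers(False))  # {'DNT': '0'}
```

A single header field, one character of payload: the entire technical footprint of the feature. Everything contentious lived in the policy layer, not the protocol.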
This straightforward idea caught fire, and within a couple of years, all the major browsers had added an option to express a preference. Stamm, now an associate professor at the Rose-Hulman Institute of Technology, says the header was “a way to shout, ‘Hey, I don’t like this!'” He developed the plug-in with Soghoian because “people were really unaware how much data was collected about them.”
Stamm and Soghoian, who is now a privacy and cybersecurity adviser to U.S. Senator Ron Wyden (D-OR), were part of a group of privacy advocates and security engineers who advocated for DNT. By 2011, the FTC appeared poised to recommend that DNT evolve from a nascent browser feature into a regulatory requirement. The W3C opened a working group to study how to turn DNT into a fully recognized standard that would define how it could be implemented.
Arvind Narayanan, now an associate professor at Princeton and part of that early DNT-formulating group, said via an email statement that the prospect of federal legislation brought ad players to the table. But when that legislation didn’t materialize, “the prolonged negotiations in fact proved useful to the industry to create the illusion of a voluntary self-regulatory process, seemingly preempting the need for regulation.”
The moment passed. Those involved in the ad industry, whether social networks or ad-tech firms, had little interest in pursuing DNT if they could avoid it. Publishers didn’t demand the technology as a way to protect visitors to their sites; advertisers didn’t act as though it affected them directly.
One of the wrenches in the works was the issue of whether Do Not Track was really a binary decision. As a two-position switch, it was either off or on. But if a user hadn’t considered the matter–or didn’t even know DNT existed–a third state existed: not yet decided. If a user hadn’t chosen to turn on DNT, browsers either left it turned off–or didn’t send DNT info one way or the other to websites.
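Those three states can be sketched as the logic a server would need to interpret the header correctly (names here are illustrative, not drawn from any specification):

```python
from enum import Enum

class TrackingPreference(Enum):
    OPT_OUT = "user opted out"        # DNT: 1
    NO_OPT_OUT = "user did not opt out"  # DNT: 0
    UNSET = "user never decided"      # header absent

def read_dnt(headers: dict) -> TrackingPreference:
    """Interpret the DNT header, distinguishing an explicit choice
    from the absence of any choice at all."""
    value = headers.get("DNT")
    if value == "1":
        return TrackingPreference.OPT_OUT
    if value == "0":
        return TrackingPreference.NO_OPT_OUT
    return TrackingPreference.UNSET
```

The whole fight over browser defaults comes down to the third branch: a browser that silently presets the header collapses “never decided” into an explicit choice the user never made.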
Microsoft broke the model. In 2012, the company opted to preset Internet Explorer’s Do Not Track to the “on” state without requiring a user to pick or confirm that choice. Though the move defaulted to the most privacy-friendly option, it also put a crimp in Google’s ad hegemony, which Microsoft would not have seen as a bad thing.
Companies that were part of the ad economy already had reason to be wary of DNT; a DNT that stopped users from being tracked without them explicitly opting in looked like an existential threat. “Do Not Track started with one leg cut off the moment Microsoft used it as a marketing tool, by turning it on by default,” says Dan Jaye, a veteran of the online-ad industry, most recently the founder of aqfer.
Sam Tingleff, the chief technology officer of the Interactive Advertising Bureau’s Tech Lab, provides another reason why the ad-business players took issue with DNT: From their perspective, it was too simple. A user could only turn it on or off for the browser as a whole, without per-site whitelist or blacklist options, something that the W3C group was working to elaborate on.
The lack of legislative or regulatory action, Microsoft’s DNT misstep (which the company reversed–too late–in 2015), and the W3C’s stalled movement forward left the DNT checkbox in place but without any power. Narayanan says that it was clear to him Do Not Track had failed by 2013. The corpse only stopped kicking recently. Do Not Track died before consumers had a chance to gain a taste for not being tracked.
In the absence of consumers’ ability to express a preference and without U.S. regulation governing tracking, what did the ad industry expect would happen? It’s not clear.