When you go into the privacy settings on your browser, there’s a little option there to turn on the “Do Not Track” function, which will send an invisible request on your behalf to all the websites you visit telling them not to track you. A reasonable person might think that enabling it will stop a porn site from keeping track of what she watches, or keep Facebook from collecting the addresses of all the places she visits on the internet, or prevent third-party trackers she’s never heard of from following her from site to site. According to a recent survey by Forrester Research, a quarter of American adults use “Do Not Track” to protect their privacy. (Our own stats at Gizmodo Media Group show that 9% of visitors have it turned on.) We’ve got bad news for those millions of privacy-minded people, though: “Do Not Track” is like spray-on sunscreen, a product that makes you feel safe while doing little to actually protect you.
“Do Not Track,” as it was first imagined a decade ago by consumer advocates, was going to be a “Do Not Call” list for the internet, helping to free people from annoying targeted ads and creepy data collection. But only a handful of sites respect the request, the most prominent of which are Pinterest and Medium. (Pinterest won’t use offsite data to target ads to a visitor who’s elected not to be tracked, while Medium won’t send that visitor’s data to third parties.) The vast majority of sites, including this one, ignore it.
Yahoo and Twitter initially said they would respect it, only to later abandon it. The most popular sites on the internet, from Google and Facebook to Pornhub and xHamster, never honored it in the first place. Facebook says that while it doesn’t respect DNT, it does “provide multiple ways for people to control how we use their data for advertising.” (That is true only so far as it goes, as there’s some data about themselves that users can’t access.) From the department of irony, Google’s Chrome browser offers users the ability to send the Do Not Track request, but Google itself doesn’t honor it, a fact Google added to its support page sometime in the last year. A Google spokesperson says Chrome lets users “control their cookies” and that they can also “opt out of personalized ads via Ad Settings and the AdChoices industry program,” which results in a user not having “ads targeted based on inferred interests, and their user identifier will be redacted from the real-time bid request.”
There are other options for people bothered by invasive ads, such as an obscure opt-out offered by an alliance of online advertising companies. But that only stops advertising companies from targeting you based on what they know about you; it doesn’t stop them from collecting information about you as you browse the web. And if a person who opts out later clears their cookies—a good periodic privacy practice—the opt-outs get cleared too. That’s why technologists proposed the DNT signal as an easier, clearer way of stopping tracking online.
“It is, in many respects, a failed experiment,” said Jonathan Mayer, an assistant computer science professor at Princeton University. “There’s a question of whether it’s time to declare failure, move on, and withdraw the feature from web browsers.”
That’s a big deal coming from Mayer: He spent four years of his life helping to bring Do Not Track into existence in the first place.
Why do we have this meaningless option in browsers? The main reason why Do Not Track, or DNT, as insiders call it, became a useless tool is that the government refused to step in and give it any kind of legal authority. If a telemarketer violates the Do Not Call list, they can be fined up to $16,000 per violation. There is no penalty for ignoring Do Not Track.
In 2010, the Federal Trade Commission endorsed the idea of Do Not Track, but rather than mandating its creation, the Obama administration encouraged industry to figure out how it should work via a “multistakeholder process” that was overseen by W3C, an international non-governmental organization that develops technical standards for the web. It wound up being an absolutely terrible idea.
Technologists quickly came up with the code necessary to say “Don’t track me”: the browser sends a “DNT: 1” header along with the other metadata that accompanies every web request, such as the user-agent string identifying the browser and operating system. It was a tool similar to “robots.txt,” the plain-text file a site can place at its root to tell search engines not to index its pages so they won’t show up in search results. The “stakeholders” involved in the DNT standard-setting process—mainly privacy advocates, technologists, and online advertisers—couldn’t, though, come to an agreement about what a website should actually do in response to the request. (The W3C did come up with a recommendation about what websites and third parties should do when a browser sends the signal—namely, don’t collect visitors’ personal data, or de-identify it if you must—but the people who do the data collection never accepted it as a standard.)
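The mechanism itself really is that simple. Here's a minimal sketch, in Python, of what honoring the signal would look like on a website's end; the header name "DNT" and value "1" come from the never-finalized W3C draft standard, while the function name and the example headers are illustrative assumptions, not any real site's code.

```python
# Sketch of server-side logic that honors the DNT signal.
# "DNT: 1" is the header defined in the W3C's draft Tracking
# Preference Expression spec; the rest is illustrative.

def tracking_allowed(headers):
    """Return False when the browser asked not to be tracked.

    `headers` is a plain dict of HTTP request headers. Real web
    frameworks expose them in their own ways (e.g. HTTP_DNT in a
    WSGI environ), but the check is the same.
    """
    # Header names are case-insensitive in HTTP, so normalize first.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("dnt") != "1"

# A privacy-minded visitor who flipped the browser switch:
print(tracking_allowed({"User-Agent": "Mozilla/5.0", "DNT": "1"}))  # False
# A visitor who never touched the setting sends no DNT header at all:
print(tracking_allowed({"User-Agent": "Mozilla/5.0"}))              # True
```

The hard part was never detecting the header; it was getting anyone to agree on, and then actually do, something in the branch where it comes back `False`.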
“Do Not Track could have succeeded only if there had been some incentive for the ad tech industry to reach a consensus with privacy advocates and other stakeholders—some reason why a failure to reach a negotiated agreement would be a worse outcome for the industry,” said Arvind Narayanan, a professor at Princeton University who was one of the technologists at the table. “Around 2011, the threat of federal legislation brought them to the negotiating table. But gradually, that threat disappeared. The prolonged negotiations, in fact, proved useful to the industry to create the illusion of a voluntary self-regulatory process, seemingly preempting the need for regulation.”