Certain radical critiques of capitalism posit that it must eventually monetize everything. The basic argument is that capitalism inherently requires unending economic growth, yet it operates in a world of finite resources. As the traditional resources used to feed the furnace are exhausted, more aspects of life that were previously outside the money economy must be drawn into it – including intangibles like behavior, relationships and even thoughts.
The merits of such theories are debatable. What is beyond debate is that human thoughts and relationships are already in the advanced stages of monetization. Prof. Shoshana Zuboff, a leading scholar of information technology in business settings since the 1980s, coined the term “surveillance capitalism” (in the April 2015 issue of the Journal of Information Technology) to describe this phenomenon – the observation and recording of as much personal data as possible in order to create highly effective targeted advertisements.
Google is one of the oldest and clearest examples of surveillance capitalism in action. Its ostensibly “free” services, like Search and Gmail, have always been monetized through the data they collect from users. The same goes for Facebook. These systems are opaque at best for the end user: you can never be entirely sure what or how much data they are collecting, how detailed a personal profile they are building on you, what it is being used for, or whose hands it passes through. Hence the “surveillance” aspect – it is as if hidden cameras record you constantly as you move about virtual space.
The purpose of all this is nothing more sinister than advertising. The deeper a marketing company can get inside your head, the more effectively they can advertise to you. The lack of regard for personal privacy and fair disclosure in this process has always been troubling, but most of the tech companies that have surveillance capitalism as their central revenue model are simply concerned with making money by selling things in the most ruthlessly efficient way possible.
Unfortunately, that isn’t the only way in which this technology can be used.
While surveillance capitalism for marketing purposes is creepy, it becomes truly dangerous when these tools and database assets wind up in the hands of political actors with bad intentions.
One familiar example is the use of targeted advertising by foreign intelligence agencies to sow political and social unrest. The Internet Research Agency, a notorious Russia-based “troll farm”, has been linked to at least 270 fake Facebook accounts purporting to be tied to American social movements. These fake groups, with names like “Aztlan Warriors” and “Black Elevation”, not only fomented dissent by spreading misinformation online but also managed to remotely organize actual meetings and protests in American cities. The agency was found to have purchased at least 3,500 targeted Facebook ads to draw users into its groups.
Of course, these techniques have been employed in domestic politics as well. Cambridge Analytica’s illicit access to the data of 87 million Facebook users was put to use in targeted ad campaigns in the 2016 presidential election in the United States. In other countries, it has been put to use in propping up authoritarian regimes by profiling dissidents, magnifying cults of personality and organizing smear campaigns.
In her writing on the concept, Zuboff cites articles by Google’s chief economist Hal Varian as encapsulating the central practices of surveillance capitalism. These are:
- Data extraction and analysis
- New contractual forms due to better monitoring
- Personalization and customization
- Continuous experiments
These are all neutral concepts taken at face value. However, the way Google (and similarly structured companies) apply them is characterized by Zuboff as “Big Other.” The Big Other of surveillance capitalism represents a new expression of power by tech companies: an interest in extracting as much value as possible from society with very little regard for the damage caused, which Zuboff sees as extending to the undermining of democratic norms.
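The four practices above can be made concrete with a toy sketch. Everything here is hypothetical – invented users, invented event data, not any company's actual pipeline – but it shows how the pieces fit together: raw events are extracted and analyzed into per-user interest profiles, an ad is personalized from a profile, and a “continuous experiment” buckets users into A/B groups.

```python
from collections import Counter

# Hypothetical event log: (user_id, page_category) pairs a tracker might record.
events = [
    ("u1", "sports"), ("u1", "sports"), ("u1", "finance"),
    ("u2", "cooking"), ("u2", "cooking"), ("u2", "travel"),
]

def build_profiles(events):
    """Data extraction and analysis: aggregate raw events into interest profiles."""
    profiles = {}
    for user, category in events:
        profiles.setdefault(user, Counter())[category] += 1
    return profiles

def pick_ad(profile):
    """Personalization and customization: target the user's top interest."""
    return profile.most_common(1)[0][0]

def assign_bucket(user_id, salt="exp-001"):
    """Continuous experiments: deterministically (within one process) split
    users into A/B buckets so each keeps seeing the same variant."""
    return "A" if hash((salt, user_id)) % 2 == 0 else "B"

profiles = build_profiles(events)
print(pick_ad(profiles["u1"]))   # u1's profile is dominated by "sports"
print(assign_bucket("u1"))       # "A" or "B", stable for this user this run
```

A real system differs mainly in scale and opacity: the same aggregate-profile-target loop runs over billions of events, and the subject never sees the profile.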
As with any industry that decides it is acceptable to externalize damaging costs onto the public, regulation is the answer. This largely uncontested new expression of power has grown primarily because of the traditionally laissez-faire attitude toward regulating the tech industry. Tech needs the sort of regulation applied to industries with the potential to cause great public harm when mismanaged, such as energy, finance and pharmaceuticals. Tech has traditionally been thought of as an industry people participate in voluntarily, but the negative applications of the surveillance capitalist approach (particularly political unrest and support for authoritarianism) demonstrate that tech practices impact the fundamentals of human rights and government institutions.
Of course, regulation is always a contentious issue. The European Union’s General Data Protection Regulation (GDPR) is the first large-scale effort to address the societal problems stirred up by unchecked surveillance capitalism. Judging by their public statements and recent moves, tech and big data companies throughout the rest of the developed world appear to expect GDPR-style regulation to arrive at home at some point. Key features of such regulation include the right to see and correct the information collected on each individual, more robust opt-in systems and the “right to be forgotten” upon request. The GDPR model provides for substantial fines for companies that do not comply – up to 20 million euros or 4% of annual global turnover, whichever is higher.
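For concreteness, the GDPR’s fine structure (Article 83) is actually two-tiered: up to 10 million euros or 2% of worldwide annual turnover for lesser infringements, and up to 20 million euros or 4% for the most serious ones, in each case whichever amount is higher. That cap calculation is simple enough to sketch:

```python
def gdpr_fine_cap(annual_turnover_eur, severe=True):
    """Maximum administrative fine under GDPR Article 83.

    severe=True  -> the higher tier: EUR 20M or 4% of worldwide turnover
    severe=False -> the lower tier:  EUR 10M or 2% of worldwide turnover
    In both tiers the cap is whichever amount is higher.
    """
    flat, pct = (20_000_000, 0.04) if severe else (10_000_000, 0.02)
    return max(flat, pct * annual_turnover_eur)

# For a company with EUR 2 billion turnover, 4% (EUR 80M) exceeds the flat cap.
print(gdpr_fine_cap(2_000_000_000))   # 80000000.0
# For a EUR 100M-turnover company, the flat EUR 20M cap dominates.
print(gdpr_fine_cap(100_000_000))     # 20000000
```

The percentage basis is what gives the regulation teeth against the largest players: for a surveillance-capitalism giant, the turnover-based figure dwarfs the flat cap.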