Interview

16 September 2019

Shoshana Zuboff on the age of surveillance capitalism 

Your existence has been claimed as a raw material by a parasitic economic logic, with human rights undermined in the process. Oh, and marketers are funding it. Welcome to the age of surveillance capitalism

This was originally published in issue 60 of Contagious magazine.


Harvard Business School professor Shoshana Zuboff has devoted her career to studying the rise of digital technology and its implications for society. Her latest book, The Age of Surveillance Capitalism, explores how the data-gathering methods and mechanisms of companies such as Facebook and Google have normalised a state of mass behaviour tracking, which Zuboff declares is ‘as significant a threat to human nature in the 21st century as industrial capitalism was to the natural world in the 19th and 20th’.

Alex Jenkins caught up with Zuboff to find out why marketers are complicit in creating this threat and why all that consumer data you’ve been collecting may be about to become a toxic asset.

For those of us who are a little rusty on our economic and political theory, can you explain the concept of surveillance capitalism?

The way I like to explain surveillance capitalism is to put it in a historical context. Capitalism evolves by continuously claiming things that exist outside the market dynamic, bringing them into the market dynamic and turning them into commodities. The key flywheel in the evolution of industrial capitalism was the idea of claiming nature for the market dynamic. The meadows and the forests and the rivers were turned into commodities that could be sold and purchased – real estate and so forth.

On a small scale, we see this happening around us all the time. There are now apps to let you know where there’s a parking space, and you can actually hire people to go and claim the parking space for you. You’re taking a public good and bringing it into the market sphere. Surveillance capitalism is comparable to industrial capitalism’s annexation of nature, but the commodity that it’s creating is based on private human experience.

What puts the ‘surveillance’ in surveillance capitalism? Google understood that just grabbing your experience and turning it into data for their own systems of production and sales was not going to sit well with people. So, right from the start, they understood that these mechanisms had to be hidden. They had to observe through a one-way mirror. That’s what makes it surveillance.

In the book you take issue with the oft-used phrase: ‘If the service is free, then you are the product,’ retorting ‘You are not the product; you are the abandoned carcass.’ What’s the distinction?

Surveillance capitalism unilaterally claims private human experience as a source of free raw material that can be brought into the marketplace, used for production and ultimately used for sale. Private human experience becomes a commodity in this new economic model.

What happens with the taking of private human experience is that the claim that it is freely available justifies these companies taking it and turning it into data. That’s a very important step. A lot of the conversation when we talk about regulatory responses starts with the idea of data ownership. But when you understand the mechanisms, my argument is that the issue begins before there is even data to own. That first step is taking experience and turning it into data in the first place.

You are not the product; you are the abandoned carcass

Shoshana Zuboff

With that mindset, Google’s various products, such as Search, Maps, Photos, Gmail, etc, aren’t actually different products at all – just mechanisms for turning experience into data.

There is a complete misunderstanding of what all these things are. They are supply chain interfaces. The only thing that surveillance capitalists really have to worry about is supply chain. It’s about expanding new flows of behavioural surplus. Every interface for the internet becomes a supply chain interface.

It’s starting to become more evident in everyday life and people are now feeling the impingement of this in many bizarre ways. For example, Google has finally admitted that there are Google employees who listen to the audio [recorded by its automated assistant]. All of these devices that record conversations, those conversations don’t disappear into the ether. They become commodities. In the business they’re called dialogue chunks and there’s a whole ecosystem of people who listen to and analyse them. They test their human comprehension of a dialogue chunk against an AI voice recognition system and they use that to create and improve AI.

These training sets, the dialogue chunks, get sold to the CIA or anybody who’s interested in improving their voice recognition capability.

You’ve described surveillance capitalism as both ‘an expropriation of critical human rights’ and a ‘significant threat to human nature’. If that’s the case, there’s a clear culpability on the part of the technology companies that created this dynamic.

But, given that many of them are largely funded by ad dollars, do you think there is complicity on the part of marketers?

Without marketers, this would not have been. The advertisers are complicit. The advertisers became the market. They are the demand, they are the marketplace. They were the initiating contact for all of that. The fact is that they had sold their souls to the black box a couple of decades ago – of course, without understanding the full implications of what they were doing and what the consequences were going to be. They may begin to see that there’s such an opportunity for them to get out in front and lead right now.

The advertisers are complicit. They are the demand, they are the marketplace. The fact is that they had sold their souls to the black box a couple of decades ago

Shoshana Zuboff

While the public seems to be becoming more aware of surveillance through things such as your book or the Cambridge Analytica scandal, the flip side is an attitude of, ‘If you have got nothing to hide, why should you care?’

Well, ‘not caring’ has to be parsed into a lot of pieces.

First of all is ignorance. Right now, 98% of Facebook’s revenue and 87% of Google’s revenue come from targeted advertising. That’s a lot of money. The enormous market capitalisation of these companies has led to unbelievable wealth. They’re sitting on tonnes of capital and a lot of that capital is used to achieve one specific goal: to make sure that their systems are designed in ways that keep populations ignorant. People simply don’t know what’s going on. Most of the systematic evidence shows that when people do find out the depth and breadth of the backstage operation, they do care.

And when we do find out it becomes, ‘I don’t even know where to turn.’ A lot of the ‘not caring’ is a pernicious interpretation of the so-called privacy paradox. People say they care, then they continue to use the systems, so the companies use that as proof that the systems are okay. That’s what’s called the ‘naturalistic fallacy’ in logic. Just because it’s used doesn’t mean it’s good.

The real truth is that people don’t realise what’s going on because this stuff is carefully hidden and because the alternative has gradually been foreclosed to such an extent that, everywhere you turn, you’re marching through these supply chains. You try to organise dinner with your friends and family on Facebook? You’re marching through the supply chain. There’s almost nothing you can do in the course of just creating an effective daily life that doesn’t march you through the supply chain.

The systems are designed to be undetectable and indecipherable, but the campaign that went with that was a rhetorical campaign. This is where Orwell really comes in useful.

His overriding interest was in language, and he looked very carefully at how language is used in warfare and at the euphemisms that conceal the truth. That’s what the surveillance capitalists have been genius about. Right from the start there have been rhetorical campaigns of misdirection and obfuscation: ‘If you have nothing to hide, you have nothing to worry about’ or ‘If there’s something to hide then you shouldn’t be doing it’ – lines that ring through the decades. This is a rhetorical campaign of domination to get people off their guard, to lull them into a sense of ease. It’s the opposite of what people should be doing, which the surveillance capitalists know very, very well.

Are you concerned that you’re pitting a moral argument against a commercial reality? Just because it may be ethically wrong to erode privacy at a societal level, it’s still really good business for some companies that aren’t hugely concerned about ethical considerations.

It’s only in the realm of ethics until there is law. Take, for example, child labour. The Gilded Age industrial leaders, whom we later decided to call robber barons, weren’t called that in the late 19th century; they were just super-wealthy industrial capitalists. They were the millionaires. One way they got so wealthy was that they paid slave wages and forced people to work seven days a week, for long hours, in terrible working conditions. They employed children. It was society getting together and saying, ‘For children not only to be working in the dangerous conditions of a factory, but for children to be working at all, is a danger to society, because we need children to be healthy, to be educated, and they need to be able to integrate into our society as citizens so that we can have a democracy.’

The idea that children should be abused and wasting away in workplaces and working long hours is morally wrong. But it’s morally wrong because it means that we cannot have the kind of society we want to have. At first [in the US], you could employ children in some states and not in others. Then people realised, this has to be the whole country because this is about our society. That’s when we got comprehensive federal child labour legislation.

Will the current situation change dramatically once the law catches up?

Law trails behind the market because the market moves into lawless space. That’s the whole idea: we took nature because there were no laws to protect nature, because no one thought it could be taken. There were no laws to protect private human experience because no one thought it could be taken. It takes a while to figure out what’s going on because democracy works slower than the market – and that’s a good thing. But then democracy figures it out and then we have the period of law.

You sound quite optimistic about this. Do you think the days of surveillance capitalism are numbered?

Without a doubt.

Even though the law seems to have hugely lagged behind the pace of technological innovation and adoption?

It’s not like our lawmakers have been working for the last 20 years to fight this stuff. The real fact is that we haven’t gotten started. We’re only now understanding the scope and the consequences of these mechanisms and methods. We haven’t yet created the laws and the regulations to interrupt and outlaw them. Once we get started and have the right focus and the right target, of course laws can bring this to heel. Things like [former Google CEO and executive chairman] Eric Schmidt saying, ‘Government can never catch up to us’ and the whole specious argument that law is the enemy of innovation and so forth – all of these things are just silly. They’re incorrect.

These are the rhetorical devices that have been invented to scare off the lawmakers and make the public think you can’t really do anything about it. Finally we’re seeing around the world a vanguard of lawmakers who are starting to get it. The International Grand Committee [on Big Data, Privacy and Democracy] just met for the second time in Ottawa. It now represents 14 countries. By the time it meets in Dublin in the fall, I’m sure there will be more countries. This is beginning to represent a huge portion of the world’s population. I can tell you for certain that the lawmakers who are on that committee are grasping the dimensions of this situation.

How do you see the implications of that regulation playing out in the future?

My view is that all of the data that people celebrate as big data is threaded with stolen assets. As law comes on stream, these assets are going to be reinterpreted as toxic assets – just as the sub-prime mortgages threaded through the derivatives market meant that all those financial products were reinterpreted as toxic assets, and the market in them tanked. I believe that day is coming.

Once the world wakes up and we get the regulation, those are going to be toxic assets. I think that’s the trajectory that we’re on.

