You Don’t Want Your Privacy (or do you?)
In a data-driven world where everything we do can be quantified, analyzed, and monetized, privacy has become one of the defining issues of our time. As such, it’s one of the four main topics on the agenda at Now / Next / Why, Contagious' event in London (29 April) and New York (8 May) where the Contagious team and expert external speakers will explore the big social, technological, media and marketing shifts impacting brands.
To get into the privacy spirit, we thought we’d make public some of what we consider to be the most interesting thinking going on in this arena. To kick things off we caught up with John Foreman, Chief Data Scientist for MailChimp.com and author of Data Smart: Using Data Science to Transform Information into Insight, about his take on the data privacy trade-off.
TL;DR? Then just remember this: 'If as a company all you look at are laws to determine how you treat privacy, then you're going to end up looking like assholes. Legal, maybe, but assholes nonetheless.'
You’ve talked about Disney’s MagicBands being one of the first examples you’ve seen of ‘physical design for the sake of digital data purity’. How much do you think the impetus to collect data is going to affect physical design in the future? Are product designers increasingly going to be paired with data scientists?
I think data collection and analysis is going to affect product design in myriad ways. The Disney example you mention is one where the product itself has been designed to ensure data purity (prevent switching between users). Alternatively, we'll see a lot of products designed with a data-gathering component even when data gathering isn't necessary for their operation (much as with smartphone apps and games these days). Data has become indispensable for improving and expanding a product, so people are going to bake in data gathering even if they don't exactly know how to use it at the outset.
For example, in MailChimp, we had a step where the user could put in a plain text version of their email for those readers who can't load HTML email. But then we looked at the data and discovered that only 1% of users ever changed the auto-generated text email. So guess what? Our designers eliminated the step and simplified the application. Data is very, very important when deciding not just to add features and design elements but to eliminate features. People always say they need something, but when you look at the data, you may see that in fact no one wants it. It's hard to kill bloat without data.
You’ve talked about the fact that while we ‘give privacy lip service, we vote with our keypresses and our dollars’. But while we quite happily trade our privacy for convenience/utility/entertainment now, do you think people are slowly becoming more privacy-conscious? Will people start to want (and want to pay for) their privacy?
If I offer a game like Flappy Bird online for free but it tracks your GPS location and gives me your email address, phone number, and first name, and then I offer that same game for $.99 but it's tracking-free, I would bet a lot of money the vast majority of folks would opt for the creepy free version. No, people do not want their privacy and will not pay for it.
Their real-time location and identity are worth less than a dollar to them. I don't think that value judgement is going to change anytime soon unless we see gross privacy abuses perpetrated by corporate America that hit close enough to home to make people reconsider. Customers understand frightening stories more than abstract concepts around data collection, and I don't think the stories are frightening enough. Yet.
As privacy becomes more important to people, will brands be judged more on what they *don’t* do with people’s data rather than what they do do? Is privacy going to become a competitive advantage?
No, I don't think it will become a competitive advantage. Check out this great opinion piece from Bruce Schneier on the Stalker Economy.
Let's be generous and say that 5% of people truly care about privacy (value it at more than $.99). Then what's better for a company? On one hand, they could alienate 5% of their customer base but collect as much data as possible and get a data-driven revenue uplift out of it (revenue management modeling for example). On the other hand, they could commit to privacy, make 5% of their customers stay, make 100% say they're happy, but keep revenues flat from a data strategy perspective. Until the scales tip in favour of people caring about privacy, the revenue argument just isn't there for companies.
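Foreman's back-of-envelope argument can be sketched numerically. All the figures below (customer base size, revenue per customer, size of the data-driven uplift) are illustrative assumptions, not numbers from the interview; the point is only to show why, at a 5% privacy-minded share, the revenue argument favours data collection:

```python
# Hypothetical comparison of the two strategies in Foreman's argument.
# Every number here is an illustrative assumption, not a figure from the interview.

customers = 100_000           # total customer base (assumed)
revenue_per_customer = 100.0  # annual revenue per customer (assumed)
privacy_minded = 0.05         # share who value privacy at more than $.99
data_uplift = 0.10            # assumed uplift from data-driven revenue modelling

# Strategy A: collect data aggressively, alienate the privacy-minded 5%,
# but earn the uplift on everyone who stays.
revenue_data = customers * (1 - privacy_minded) * revenue_per_customer * (1 + data_uplift)

# Strategy B: commit to privacy, keep 100% of customers, no uplift.
revenue_privacy = customers * revenue_per_customer

print(f"Data strategy:    ${revenue_data:,.0f}")
print(f"Privacy strategy: ${revenue_privacy:,.0f}")
```

Under these assumed numbers the data strategy comes out ahead; it only loses once the privacy-minded share exceeds uplift / (1 + uplift), roughly 9% here, which is the tipping of the scales Foreman describes.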
How do you think the definition of privacy has changed? Is it an outdated concept in a data-driven world?
I think that we're in a very confusing place right now. People talk a lot about "my data" while simultaneously signing up for free services that make money only by collecting personal data and serving up ads. I think privacy is a valuable concept. And it doesn't have to be outdated. But the products and services we're being sold on the cheap that consume our data are so attractive that we're effectively treating privacy as outdated.
I see two possible ways that something good can happen for privacy in the short term:
- Online advertising ceases to make companies as much money, because people just don't engage with the ads. Companies then actually have to produce a product people want to buy, so they spend less time invading people's privacy to serve up better ads.
- Legal action. Given the effectiveness of the U.S. legislative and legal systems, I'm not counting on something happening this way.
As Chief Data Scientist at MailChimp, how do you balance giving people data-driven insights with the privacy issues?
If as a company all you look at are laws to determine how you treat privacy, then you're going to end up looking like assholes. Legal, maybe, but assholes nonetheless. At MailChimp we go beyond the law to look at our customers' expectations of privacy. We don't make money based on data collection or ad targeting. That means we can be very intentional in how we collect and use data. And we are.
We don't shy away from using data, but rather try to build tools that are a win for everybody without ever needlessly exposing data. Take for instance our anti-abuse machine learning models. They use a global set of data on two billion email addresses that we collect. But we don't share that data with anyone. No, we use it to make sure non-compliant users get ejected from our system, which helps maintain good inbox placement for our other senders. Everyone except the spammers wins in that arrangement. And that's a privacy trade-off that our users have told us they're willing to make.
Interested in hearing more about privacy? Book tickets to Now/Next/Why via Eventbrite or call Arianna on +44 203 206 2975. Contagious Magazine and Contagious I/O subscribers receive a 10% discount.