16 August 2019
On brands and gender bias with Caroline Criado-Perez /
Contagious speaks about the gender gap with the author of Invisible Women: Exposing Data Bias in a World Designed for Men
In the Wildfire section of the forthcoming issue of Contagious magazine, we explore the startups and industry stalwarts using technology to tackle gender bias.
Writer and campaigner Caroline Criado-Perez knows better than most the ways that this world has been built to the detriment of women. Her new book, Invisible Women: Exposing Data Bias in a World Designed for Men, brings together an impressive array of data to shine a light on how gender bias affects the design of everything, from new technologies to government policy.
We spoke to Criado-Perez about what brands and marketers can do to address the problem of the ‘default male’ and make the world more inclusive.
Why do you think it’s important for brands and companies to become more inclusive?
Even with the rise of feminism, we are used to thinking of men as the default humans. This mindset is pervasive and has never been properly addressed. If companies care about serving the needs of more than half the population, they need to make products that work well for everyone, not just men. There’s a social imperative, but there’s also a business imperative. You should be designing for this huge market. There are a lot of women in the world who are also making purchase decisions and it makes sense to account for them. It surprises me that more haven’t done it, because it’s such a gap in the market and an obvious win. I was surprised when I heard that Apple was discontinuing the SE model, which is the small model of the iPhone and the only one that fits in my hand, when they could be capitalising on it.
What is the most important thing that you have taken away from researching and writing ‘Invisible Women’?
Something that has continued to make me really angry is the excuse that would always come up, which boiled down to: ‘Women are too complicated to measure.’ This comes up in all sorts of fields, from the economy to travel infrastructure and health. It’s not just forgetting that women exist, it’s actively saying that women don’t matter, and that is incredibly sobering. You have to be doing such mental gymnastics to convince yourself it’s fair not to measure reality because it’s too complicated. If you’re going to design anything, from public policy to medication and technology, that is going to work beyond just the lab, then you’re going to have to engage with reality.
If you run a company or a brand, how do you detect if your product or service is biased?
You can collect data, that’s the way you discover it. You test your product on women as well as men and you disaggregate your data by sex. The solution to all of this is so incredibly simple.
What is your opinion on terming things gender neutral?
The way that we assign the term gender neutral to different things is wrong. Most of these references aren’t gender neutral at all. One of the best ways that we can prevent male default is this: when you mean male, say male. For example, when we say football, what we actually mean is men’s football. When we mean women’s football, that’s what we say. We have to stop allowing the male to occupy the default, to stop allowing it to occupy the gender-neutral space. It might not matter so much when it comes to football, but it matters a hell of a lot when you start thinking about the way that it impacts on things like product design and health research. These small changes are very easy to make and they are something that brands can do in their advertising. We don’t notice it the majority of the time, but when you use supposedly gender-neutral terms you’re actually still reinforcing the male default.
Your book addresses gender blindness in tech with issues such as voice-recognition software being more likely to understand men. What needs to change?
There need to be far more women hired as designers and coders because, with the best will in the world, men just won’t know all the things that are likely to affect women and the needs that women are likely to have. Something like Apple not including a period tracker is so clearly the result of someone simply having forgotten about women. It’s not malicious, it’s to do with perspective. Another example I cite in the book is a virtual-reality game called QuiVr, where there was an incident of a woman being sexually assaulted in the virtual-reality setting. The guys who developed the game dealt with it immediately, but said, ‘How could we not have thought of this really obvious issue?’ I think it’s very clear how they didn’t think of it; it’s not part of their daily experience.
How does race impact the data bias? For example, how does a white woman’s experience of data bias differ from that of a black woman?
There is a data gap when it comes to women, but data that is disaggregated by sex and ethnicity, or by sex and disability, just doesn’t exist. I would often try to find stats for black women or for disabled women and I just couldn’t. The specific issues that affect black women are lost in larger groups like ‘black people’. Part of the problem with the male default is that women are viewed as a minority; they think we are all the same and just throw us into one big category.
What would you say is the most pressing gender disparity that you came across in your research?
It’s really hard to choose a specific area because they’re all incredibly distressing. How could I possibly choose between health research, where women are dying because we aren’t recognising their symptoms, and car design, where women [wearing seatbelts] are more likely to be seriously injured in a car crash because we are designing cars based on a male body? [Which Volvo has since tried to address.] All of these problems are incredibly important. Yes, there are some issues that are less serious, like office air conditioning being set too cold for the average female body. But at the heart of these problems, from the most serious to the most irritating, is the same fundamental problem: not collecting data on women, and using data on men because we think of them as the default humans.