An elite woman cyclist’s injury, so severe it required two surgeries to repair, can be traced back to her bike saddle – designed for men.
Hannah Dines is a British Paralympian whose standard-issue saddle caused a painful lump of fat, known as a lipoma, to form to protect her sensitive tissue – an injury that required two reconstructive vulva surgeries to remove.
While male cyclists also suffer saddle sores, the ‘forward-leaning’ position of elite women cyclists on a bike saddle is the primary cause of damage. Unlike men, whose ‘valuable parts of the male genitalia can be moved out of the way’, women are literally bearing the brunt of gender bias in the design process.
“It would be remiss not to stress that saddles are designed with men in mind,” Dines writes, in an article published by The Guardian.
“Bike parts, technological developments and racing procedures are all geared towards men.”
It was only due to the persistence of several women cyclists that sponsor Specialized developed a bike saddle specifically to address their issues, with memory foam technology to alleviate pain and numbness.
It’s the only saddle Dines can use.
Based on research from the seminal book on the topic, Invisible Women, an article published by The Wall Street Journal looks at the history of gender bias inherent in product design.
Products are designed with ‘Reference Man’ in mind – a five-foot-nine, 70-kilogram white male, aged 25 to 30.
Consequently, gender bias in design has long been an issue. One of the main obstacles is the feedback loop created by the overconcentration of men in key industries, as articulated in the WSJ article:
‘Research is largely funded by men, who make up 93% of venture capitalists, who in turn hire male designers and execs to focus on products that suit male needs.’
Caroline Criado-Perez, author of Invisible Women, found in her research that the exclusion of women and their experiences from the data-gathering process produces skewed data. And, since most products are designed only with men in mind, they overwhelmingly fail to address the needs and wants of women.
Smartphone companies are routinely criticised for making their phones bigger with each new model. Many women want to upgrade to the newest technology, too – in fact, more women than men purchase iPhones – but bigger phones are increasingly difficult for them to use as advertised.
Then there are the issues associated with the artificial intelligence (AI) algorithms used by smartphones and just about every other tech system: AI draws on white, male-centric data. This lack of gender and racial diversity poses problems for real-world applications.
Technology increasingly relies on AI, loosely defined as ‘intelligent systems that have been taught or learned how to carry out specific tasks without being explicitly programmed how to do so’.
Computer scientist Joy Buolamwini documented her 2015 experience with facial analysis software that failed to detect her face because it wasn’t ‘white enough’ – she had to put on a white face mask just to be recognised.
In an article published by TIME, Buolamwini discussed the research she has conducted on gender and racial bias in AI facial recognition systems developed by IBM, Microsoft and Amazon.
“The companies I evaluated had error rates of no more than one percent for lighter-skinned men. For darker-skinned women, the errors soared to 35 percent.”
“AI systems from leading companies have failed to correctly classify the faces of Oprah Winfrey, Michelle Obama, and Serena Williams. When technology denigrates even these iconic women, it is time to re-examine how these systems are built and who they truly serve,” she writes.
In 2018, Amazon was forced to abandon its automated hiring tool experiment after it was revealed the AI system was favouring male candidates over female candidates.
Because the system was trained on 10 years’ worth of previous recruitment data, men’s historical ‘dominance’ in the tech industry meant the system built up an ingrained preference for men and downgraded equally qualified women candidates.
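To see how this happens mechanically, consider a minimal sketch in Python. The data, feature names and numbers below are entirely invented for illustration – this is not Amazon’s system, just a toy classifier trained on synthetic ‘historical’ hiring decisions that were biased against women:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "historical" data: 1,000 applicants, each with a
# qualification score (0-1) and a gender flag (1 = male, 0 = female).
n = 1000
qualification = rng.uniform(0, 1, n)
is_male = rng.integers(0, 2, n)

# Past hiring decisions favoured men: at equal qualification, a man
# was more likely to be hired. The bias lives in the labels.
p_hire = 0.8 * qualification + 0.15 * is_male
hired = rng.uniform(0, 1, n) < p_hire

# Train a model to "predict good candidates" from the biased record.
X = np.column_stack([qualification, is_male])
model = LogisticRegression().fit(X, hired)

# Score two equally qualified candidates who differ only in gender.
woman, man = [0.9, 0], [0.9, 1]
scores = model.predict_proba([woman, man])[:, 1]
print(f"woman: {scores[0]:.2f}, man: {scores[1]:.2f}")
# The model rates the man higher: it has learned the historical
# bias as though it were a genuine signal of candidate quality.
```

In practice there is rarely an explicit gender column to remove: Amazon’s tool reportedly learned to penalise proxies instead, such as the word ‘women’s’ appearing on a résumé, which is far harder to strip out.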
Most concerning of all is that male-centric designs can have lethal implications.
Given that the crash test dummies used in the car design process are traditionally modelled on a slight variation of Reference Man – male, 175 centimetres tall and 75.5 kilograms – cars are markedly less safe for women.
(Anyone whose body differs from Reference Man’s measurements and/or gender is, of course, more likely to suffer serious injuries in a car accident than those who match them.)
A 2011 University of Virginia study analysed US car crash data from 1998 to 2008 and found that when an ‘average’ man and woman were involved in comparable crashes while wearing seat belts, the women were 47 percent more likely to sustain serious injuries.
Another study published by the university in 2019 revealed that, on average, women are 73 percent more likely to experience serious injury or die in frontal car accidents than men.
This is despite female crash test dummies having been developed as early as 1966. The critical failure, however, was to create those dummies in the likeness of Reference Man – meaning they do not accurately reflect the anatomical makeup of the average woman’s body.
As a result, car designs do not account for women’s differing ‘muscle and ligament strength’, ‘spinal alignment’ or ‘mass distribution of different body parts’, which we now know makes all the difference for safety outcomes.
Similarly, women in professions requiring personal protective equipment (PPE) report that the equipment does not cater for their size or anatomy.
A 2017 research paper by the UK’s Trades Union Congress chronicled women’s experiences of gender-biased PPE across several industries, including the police, coastguard and rail service.
“We don’t need to be ‘Barbie-fied’, just have the same gear as the men, but with an adjustment to allow a proper fit for women,” explains a respondent from the NHS Estate Department.
A policewoman respondent reported that her ill-fitting stab vest hardly ensures her safety in the field:
“My stab vest usually chokes me when sitting in the police vehicle… As far as being something to protect me against a knife, there are plenty of areas accessible to anyone who wanted to do serious harm.”
“When we wear such heavy and needed PPE it should be properly fitted and suitable to individuals if not at least suitable for the female body shape,” she says.
Women in the US Army, now recruited for previously ‘male-only’ elite units, have had to reconfigure their body armour, including ‘removing protective side panels’ and putting pieces of foam under straps to ‘reposition gear and ensure their organs are protected’.
In an interview with Evoke, Criado-Perez explains how bias inherent in product design persists even in the age of innovation:
“Because the datasets on which we train algorithms are hopelessly male-biased, voice recognition software doesn’t recognize female voices, translation software translates female doctors into male doctors, and image-labelling software labels men as women if they are standing next to an oven. And these are the least harmful examples.”