Patricia Gestoso argues that gender discrimination in the interests of men is baked into artificial intelligence by design.
In discussions around gender bias in artificial intelligence (AI), there is little or no acknowledgment that it might be intentional.
We talk about discriminatory datasets and algorithms but avoid mentioning that humans – software developers – select those datasets or code the algorithms. Any attempts to demand accountability are crushed under exculpating narratives such as programmers’ “unconscious bias” or the “unavoidable” opacity of AI tools, often referred to as “black boxes”.
Moreover, the media has played a vital role in infantilising male tech leaders as a means of protecting them from harm. They are often portrayed as naughty young prodigies unaware of the unintended consequences of the tools they develop, rather than as the astute executives they are – executives with notorious run-ins with the justice system over data breaches, antitrust violations, and workplace discrimination. There is nothing unintentional or fortuitous in what the men in tech do.
Patriarchy is much older than capitalism; hence, it has shaped our beliefs about who has purchasing power and how they use it. Patriarchy wants us to believe that women don’t have money or power, and that if they do, they’ll spend it on make-up and babies and put up with services and products designed for men. Moreover, that women are expendable in the name of profits. All this while women controlled $20tr in annual consumer spending as far back as 2009 and, by 2023, owned 42% of all US businesses.
Tech, where alpha males run rampant, has completely bought into this mantra and is using artificial intelligence to implement it at scale and to help others do the same (see box: AI-powered misogyny beyond tech). That’s why it disregards women’s needs and experiences when developing AI solutions. And why it deflects accountability for automating and amplifying online harassment; purposely reinforces gender stereotypes; operationalises menstrual surveillance; and sabotages women’s businesses and activism.
For example, Iran has announced the use of facial recognition algorithms to identify women breaking hijab laws.

The baby-on-board market is a goldmine, and technology is instrumental in helping vendors exploit it. It has become habitual for retailers to use AI algorithms to identify and target pregnant girls and women.

Then there is sexual exploitation. According to the United Nations, for every ten victims of human trafficking detected globally, five are adult women and two are girls. Overall, 50% of victims are trafficked for sexual exploitation (72% in the case of girls). Traffickers use online advertisements, social media platforms, and dating apps – all powered by AI – to facilitate the recruitment and exploitation of victims, and the exertion of control and pressure over them.

And thanks to generative AI, it has never been easier for individuals to create misogynistic content, even accidentally. Examples include: ChatGPT replicating gender stereotypes when writing professional profiles, stressing communal skills for women while highlighting financial achievements for men; Playground AI making women’s headshots look “more professional” by altering facial features to conform to Caucasian ideals of beauty; and Stable Diffusion generating images of male chief executives and engineers 94% of the time.

Tech leaders’ answer to their responsibility for generative AI fostering biases has been to issue letters focusing on a dystopian future rather than addressing present harms. Even better, they have perfected the skill of putting the onus on governments to regulate AI while lobbying to shape those same regulations.
Tech thinks that there is no problem that digital technology cannot solve and, when you plan to save the world, AI is the ultimate godsend. It’s only through understanding the pervasiveness of patriarchy, meritocracy, and exceptionalism in tech that we can explain why the sector dares to brag about its limitless ability to tackle complex issues at a planetary scale with an extremely homogenous workforce, mainly comprising white, able, wealthy, heterosexual, cisgender men.
For instance, recruiting AI tools are portrayed as promising the end of biased human hiring. The results say otherwise. Notably, Amazon had to scrap its AI recruiting tool because it consistently ranked male candidates over women. The application had been trained on the company’s ten-year hiring history, which was a reflection of the male prevalence across the tech sector.
Another example is the assumption by manufacturers of smart, internet-connected devices that danger typically comes from outside the home – hence the use of cameras, VPNs, and passwords to preserve the household’s integrity. But if you’re a woman, the enemy may be indoors. One in four women experience domestic violence in their lifetime, yet tech companies are oblivious to it. One way perpetrators control, harass, and intimidate their victims is by taking advantage of AI to manipulate their victims’ wearable and smart home devices. Faced with this design flaw, women have no option but to become their own cybersecurity experts.
Tech is also a master at deflecting responsibility for how AI enables bullying and aggression towards women. For example, we’re told that we must worry about deepfakes threatening democracies around the world because of their capacity to reproduce the voices and images of politicians and world leaders. The reality is that women bear the brunt of this technology.
A 2019 study found that 96% of deepfakes are of a non-consensual sexual nature and that, of those, 99% depict women. This is content designed to silence, shame, and objectify women. And tech expects the victims to uncover and report the material: it’s left to women, for example, to proactively request the removal of harmful pages from Google Search.
And then we have the online harassment of female journalists, activists, and politicians, fostered by algorithms that promote misogynistic content to users prone to engage with it. It is important to note that black women are 84% more likely than white women to be targeted. Research by the Inter-Parliamentary Union on the online abuse of women parliamentarians worldwide found that 42% had experienced extremely humiliating or sexually charged images of themselves being spread through social media.
When men in tech are asked to take responsibility for online harassment, they hide behind freedom of speech or their powerlessness to police their own creations, while financially benefiting from the online abuse of women.
But how do machines know what a woman looks like? The Gender Shades study showed that facial recognition algorithms used to classify gender were biased against darker-skinned women, with error rates of up to 35%, compared with less than 1% for lighter-skinned men. While Microsoft and IBM acknowledged the problem and improved their algorithms, Amazon blamed the auditors’ methodology.
Tech has a long tradition of capitalising on women and gender stereotypes to anthropomorphise its chatbots. The first, created in 1966, played the role of a psychotherapist. It was named not after a famous psychotherapist such as Sigmund Freud or Carl Jung, but Eliza, after Eliza Doolittle in the play Pygmalion – a character who, by changing how she spoke, created the illusion that she was a duchess.
Following suit, tech companies have intentionally designed their virtual home assistants to perpetuate societal gender biases around feminine obedience and the “good housewife”. Their default female voices, feminine names – Alexa, Siri, and Cortana – and subservient manners are calculated to make users connect with these technologies by reproducing patriarchal stereotypes: a submissive attitude towards verbal sexual harassment, flirting with aggressors, and thanking offenders for their abusive comments.
Tech has also profited from helping to automate and expand society’s control over women’s reproductive decisions. There is nothing new in society depriving women of their bodily autonomy – there are myriad examples of government-sanctioned initiatives forcing women’s sterilisation or reproduction. What’s frightening is that the use of AI brings us closer to a dystopian future.
Microsoft has developed applications used across Argentina, Brazil, Colombia, and Chile with the promise to forecast the likelihood of teenage pregnancy based on data such as age, ethnicity, and disability.
AI is an ally of “pro-life” groups too. An analysis of the results shown to women searching for online guidance about abortions revealed that a substantial number of hits produced by the algorithm were actually adverts styled as advice services run by anti-abortion campaigners. Google’s defence? The adverts had an “ad” tag.
Last but not least, tech actively sabotages women in areas such as self-expression, healthcare, business, finances, and activism.
AI tools developed by Google, Amazon, and Microsoft rate images of women’s bodies as more sexually suggestive than those of men. Medical pictures of women, photos of pregnant bellies, and images depicting breastfeeding are all at a high risk of being classified as representing “explicit nudity” and removed from social media platforms.
It can escalate too. It’s not uncommon for women’s businesses that rely on portraying women’s bodies to report being shadow banned – their content hidden or made less prominent by social media platforms without their knowledge. This practice decimates female-led businesses and encourages self-censorship to avoid demotion on the platforms.
Algorithms also flag women as high-risk borrowers. In 2019, tech founders Steve Wozniak and David Heinemeier Hansson disclosed in a viral Twitter thread that the Apple Card had offered them credit limits ten and twenty times higher, respectively, than those offered to their wives, even though each couple shared their assets.
Tech doesn’t appear to think that female activism is good for business either. For years, digital campaigns have highlighted that Meta’s hate speech policies result in the removal of posts calling attention to gender-based violence and harassment. The company continues to treat those posts as violating its policies – in spite of its Oversight Board overturning its decisions – and to suspend the accounts of black women activists who have reported racial abuse.
In summary, tech is embracing the patriarchal playbook in its adoption and deployment of artificial intelligence tools. Hoping to reap massive financial returns, the sector is unapologetically fostering gender inequity and stereotypes. And that is before you consider the wider impacts of hardware (see box: The other women in tech).
While AI is naturally associated with the virtual world, it is rooted in material objects. Moreover, most tech software and platform giants – Apple, Google, Amazon, Microsoft, and Meta – are hardware providers as well. Datacentres, smartphones, laptops, and batteries rely heavily on special metals and women often play a key role in their extraction and recycling.
For example, the Democratic Republic of Congo supplies 60% of the world’s cobalt. The mineral is extracted via artisanal and industrial mines. Some sectors welcome the integration of women into the artisanal mines as a means of empowering them financially and as a substitute for child labour. However, the activities women perform in the mines are the most dangerous, involving direct contact with toxic minerals that leads to cancer, respiratory conditions, miscarriage, and menstrual disruption. Women working at some of these artisanal mining sites report daily violence and blackmail. What’s worse, women earn half of what men make (an average of $2.04 a day).
What has tech done about this? Software-only companies continue to look the other way, while hardware manufacturers have avoided their responsibility for as long as they could.
Most companies have taken moderate or minimal action and, in some cases, have denied knowledge of human rights abuses. Still, it’s clear that the bulk of the action is directed at eradicating child labour, while the particular challenges women miners face are left unaddressed.
There is also a gendered division of labour in electronic waste, a €55bn industry. Women frequently hold the lowest-tier jobs in the sector: they are exposed to harmful materials as they pick apart and separate electronic equipment into its components, which in turn negatively affects their morbidity, mortality, and fertility.
Again, efforts focus on reducing child labour, and women’s working conditions are lumped in with those of “adult” workers. An additional challenge, compared with mining, is that hardware manufacturers control the narrative, highlighting their commitment to recycling materials across their products for PR purposes.
What’s the fix? As the black feminist Audre Lorde wrote: “The master’s tools will never dismantle the master’s house.” While tech continues to be run by wealthy white men who see themselves as the next Messiah, misogyny and patriarchy will be a feature, not a bug, of AI applications.
We need diverse leadership in tech that sees women as an underserved market with growing purchasing and executive power. Tech also needs investors who understand that outdated patriarchal beliefs about women being a “niche” don’t serve them well. On the bright side, it’s encouraging to see categories such as femtech, which focuses on female healthcare innovation, reach $16bn in investment, with the market projected to be worth $1.2tr by 2027.
Finally, tech needs to assume responsibility for the tools it creates, and that goes beyond monitoring an app’s performance. It starts at the ideation stage, by asking uncomfortable ethical questions such as “Should we build this?”. The “move fast and break things” approach has been tested to destruction.