Hi everyone,
I don’t write often about AI. For someone who deals with the future of work for a living, that may seem surprising. In fact, it’s a conscious choice. I do think AI has played a significant role in changing the way we live and work (notice how I don’t use the future tense when mentioning “AI”). But too many people already focus essentially (if not exclusively) on AI when dealing with the future of work.
I’ll make an exception today to discuss AI voice assistants, because they are at the intersection of several themes that matter more to me, like culture, feminism and the sexual division of labour. By default, most of these voice assistants (Alexa, Siri, Google Assistant, navigation systems, etc.) are made female. How can you be a good, submissive, self-effacing assistant with no ego if you’re not female?
Much has been written about the “gender problem” of virtual assistants. There’s even a Wikipedia article about “female gendering of AI technologies”! And I’ll add a few more thoughts on the subject by discussing a recent piece of news: Apple announced in April 2021 that it was adding new male Siri voices and that soon Siri won’t default to a female voice anymore:
We’re excited to introduce two new Siri voices for English speakers and the option for Siri users to select the voice they want when they set up their device. This is a continuation of Apple’s longstanding commitment to diversity and inclusion, and products and services that are designed to better reflect the diversity of the world we live in.
You may think that this is a small detail that matters little in a world full of much bigger problems than the default option of voice assistants. But it’s actually much more significant than you may think because it exemplifies a lot of what’s wrong in today’s world of work: biases in AI, the gender gap in technology, gender inequalities at work, the devaluation of care work, sexual harassment, violence against women… So I’ll try and convince you it’s actually a big deal👇💡
Why I don’t usually write much about AI
The reason why I don’t write much about AI even though I focus on the future of work is not a lack of interest. In fact, I’ve read a lot on the subject. I also enjoy science fiction (last year I wrote this newsletter titled “science fiction to the rescue”). But I’m convinced that reducing the future of work to the influence of artificial intelligence is a mistake. In other words, thinking about the future of work shouldn’t just mean thinking about AI.
AI-obsessed future-of-work people often fall prey to a series of fallacies and misplaced obsessions:
The lump of labour fallacy: the idea that AI-fuelled automation must lead to the end of work because there’s a fixed amount of work (a “lump of labour”), so if machines take over some of it, there’s less work left for humans. I’m more focused on the quality of work than on its end. (Although sometimes the two are intertwined.)
The engineer’s fallacy: engineers want to come up with solutions to problems (we need people like that), but they tend to believe that great solutions will necessarily be adopted. They think great (tech) products will always find a market and that whatever is technologically possible will necessarily happen at a large scale.
The obsession with technological singularity: a hypothetical point in time after which AI will become uncontrollable, super powerful (and evil). It’s the stuff of sci-fi (2001: A Space Odyssey, Terminator, Matrix). Elon Musk is obsessed with it. It leads these people to ignore today’s essential subjects, like the biases in the data used to shape our algorithms.
A lack of interest in cultural subjects: as I wrote in a previous newsletter, “culture eats technology for breakfast”. When it comes to the future of work, cultural changes (like the impact of activist movements) often have a more profound impact than technological changes.
A lack of interest in what doesn’t change: I love this Taleb quote (Antifragile): “we notice what varies and changes more than what plays a larger role but doesn’t change. We rely more on water than on cell phones, but because water does not change and cell phones do, we are prone to thinking that cell phones play a larger role than they do.”
The blind spot that ignores care work: millions and millions of (essentially female) workers are busy caring for children and the elderly but their work is rarely of interest to AI-obsessed futurists. There’s this huge blind spot among future-of-work specialists: a majority ignore care work and/or do not see the sexual division of labour. It can largely be explained by the fact that there are too few women among them.
Why voice assistants matter
I don’t use voice assistants (for reasons that will be made obvious in this newsletter) so I have my own blind spot: I’ve probably often underestimated their influence (mea culpa). But the numbers are staggering: in 2020, roughly 4.2 billion voice assistants were in use in devices around the world. The number of digital voice assistants may double by 2024 and exceed the world’s population!
They’re everywhere. They’ve become a commonplace feature of many devices. There are over 110 million voice assistant users in the United States alone (about one third of the population). They use voice assistants primarily on their smartphones. But many more people than I thought also use smart speakers (70% of which are Amazon’s): as many as 90 million Americans have such speakers in their home (really??)
Apparently, Covid has accelerated their use in many contexts, as more companies seek to offer interfaces that enable touchless transactions (in response to demand for safety and hygiene). A 2020 Adobe Voice Survey reported a rapid increase in the usage of voice assistants:
The COVID-19 pandemic created a need for safe, sanitary alternatives for day-to-day tasks that involve elements of touch, especially in public spaces. One in three voice users (31 percent) count sanitation, like not needing to touch high-traffic surfaces, as a benefit of using voice technology.
Voice assisting software supported by all kinds of devices responds to commands, provides information, and assists in the control of other connected devices. The software may already have a huge impact on customer service. I wouldn’t make the mistake of saying voice assistants won’t matter in the future of work, because they already matter in the present of work. As everything becomes “smart”, more and more elements of our lives will be affected.
When it comes to voice assistants, I’m a (super) late adopter. I find them frustrating, inaccurate, and intrusive. I prefer the silence of written interactions. And I often prefer to read. But I understand I have a blind spot and many people do use them. So they matter. That’s why the type of voice chosen also matters. And the fact that nearly all voice assistants are made female by default is in no way trivial.
By default, a woman
What is an assistant? When you look it up in a dictionary you’ll find phrases like “a person who ranks below a senior person”, “a person who helps”, subordinate, auxiliary, number two… What are the skills required to be a great assistant? Active listening, attention to detail, adaptability, a flexible personality… and little or no ego. A good assistant is somebody who will not pay too much attention to their own needs and strive to meet the needs of others, even anticipate those needs.
It’s also somebody who doesn’t necessarily have ideas of their own but is really good at highlighting other people’s ideas. When they do have ideas of their own, someone else can take the credit with impunity. We’ve all met assistants who seemed to be doing all the work and wondered what the person they were assisting must be doing all day.
The subject of assistants was close to David Graeber’s heart. The late anthropologist wrote about a special category of bullshit jobs called the flunkies. “Flunkies serve to make their superiors feel important, e.g., receptionists, administrative assistants, door attendants.” This does not mean that flunkies don’t do any work. Quite the reverse. Often they end up doing the bosses’ jobs for them, as explained by Graeber:
The flunkies end up effectively doing the bosses’ jobs for them. This, of course, was the traditional role of female secretaries (now relabeled “administrative assistants”) working for male executives during most of the twentieth century: while in theory secretaries were there just to answer the phone, take dictation, and do some light filing, in fact, they often ended up doing 80 percent to 90 percent of their bosses’ jobs, and sometimes, 100 percent of its nonbullshit aspects. It would be fascinating—though probably impossible—to write a history of books, design, plans, and documents attributed to famous men that were actually written by their secretaries.
Unsurprisingly, this is a predominantly female job. In 2017 in the US, 95% of all secretaries and administrative assistants were female, so were 93% of medical assistants and 93% of dental assistants. The same is true about nurses (about 90% of whom are women in the US). I wrote a newsletter about nurses as the ultimate “assistants” and how the job was originally designed around an “artificial separation between caring and knowing” and the idea that male doctors needed to be assisted by female subordinates. (Today many more doctors are female but not many more nurses are male.)
Enough with female-voiced AI assistants
It shouldn’t come as a surprise that AI assistants are designed around the female archetype of the assistant. By default, that archetype is female. But given the massive influence of voice assistants (and the fact that the default option is just a design feature), we should care about challenging this stereotype in AI. Voice assistants perpetuate it. That’s why Apple’s decision to put an end to Siri’s female-by-default voice is a significant step.
There have been quite a few research papers published recently on the subject, a clear sign that a lot of people do take the subject seriously. A 2019 United Nations report insists that female-by-default voice assistants perpetuate the idea that “women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command.”
“Our technologies reflect our culture,” Dr. Miriam E. Sweeney told MarketWatch. “And the fact that we end up with female voices, or females portrayed in these various types of service roles, actually does reinforce the feminization of a certain labor force of servitude [like being a personal assistant or working a call center] that is often seen as less skilled, less valuable and that can be paid less.”
Many sociologists regard this as a major socialisation tool that teaches people about socially constructed gender roles and perpetuates them. As you can read in this PwC piece:
A sociology professor at the University of Southern California recently described it as “a powerful socialisation tool that teaches about the role of women, girls and people who are gendered female to respond on demand”. In comparison to the simple tasks these virtual assistants are asked to do, the question-answering computer IBM Watson is used to play the US quiz show Jeopardy and make complex business decisions. Funnily enough, he has a masculine voice.
Alan Winfield, co-founder of the Bristol Robotics Laboratory at the University of the West of England, Bristol, regards AI’s “gender problem” as “one of the top two ethical issues in robotics and AI” (the other being wealth inequality). Winfield was one of the authors of the principles of robotics, published by the Engineering and Physical Sciences Research Council in 2010. One of the five rules states that robots should not be designed to deceive.
Sexual violence and female rivalry
The nefarious influence of female-by-default voice assistants goes beyond the world of work. It also perpetuates sexual violence. I’ve read quite a few articles that deal with sexual harassment and AI voice assistants, like this one funnily titled “I tried to sexually harass Siri, but all she did was give me a polite brush-off”:
Virtual assistants, the Unesco report said, are “obliging, docile and eager-to-please helpers”, who respond to sexual harassment the way many of us were forced to all the way through high school: by brushing it off.
The report is called “I’d blush if I could”: one of Siri’s classic responses to sexual harassment. This issue was raised months before the beginning of the #MeToo movement, after Leah Fessler, a writer for Quartz, ran an experiment in which she sexually harassed Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana, and Google’s Google Assistant. She wrote: “By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated.”
And there’s this Quartz “experiment”, “we tested bots like Siri and Alexa to see who would stand up to sexual harassment”. It turns out many users have tried that too, without the pretext of conducting an experiment. Apparently, female AI assistants get sexually harassed all the time (and a report found that a quarter of users actually fantasize about having sex with their voice assistant!)
One may say it’s a harmless thing because with AI at least no human gets hurt. But the reality is that it perpetuates expectations and stereotypes that affect real women (assistants or otherwise) in real life. And it reinforces a rape culture based on the negation of consent. The beauty of female robots is that you don’t need to pretend to listen to them. That’s why there are so many “love” stories with female AIs in science fiction (Blade Runner 2049, Her). There’s even that fantasy of replacing women with life-like robots (used by Ira Levin in The Stepford Wives).
Voice assistants have been programmed to give passive, polite and even delightful responses to abusive remarks, which has consequences beyond AI:
For example, before the #MeToo movement, if you called Siri a “bitch,” she would respond with “I’d blush if I could,” or, “There’s no need for that.” (...) Siri has since been updated to say “Language!” in response. What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted “boys will be boys” attitude (I’d Blush If I Could).
The few times I have used voice assistants, I’ve witnessed myself become violent and abusive. What is it about AI female voices that brings out so much violence even in normally fairly peaceful people (like me)? In fact, female users may be even more frustrated and angry at voice assistants because voice recognition works much better with men than it does with women. “Voice Recognition Still Has Significant Race and Gender Biases”:
Speech recognition has significant race and gender biases. As with facial recognition, web searches, and even soap dispensers, speech recognition is another form of AI that performs worse for women and non-white people.
So voice assistants listen to men better than they listen to women, which fuels female rivalry: a woman using her power to keep another woman down or mistreat her, because there’s only room for one woman and the women present have to fight each other for that position. I have no use for female rivalry in my life.
🚀 Nouveau Départ, the media I started with my husband Nicolas Colin, celebrated its first anniversary last week 🎂 We’ve recorded many new podcasts since my last newsletter, among which: Faillites d’entreprises : une tempête à venir ?, La solitude : l’autre pandémie, Rendre le monde plus accessible, Suppression de l’ENA : une mesure populiste ? Subscribe!
👩💻 For Welcome to the Jungle, I wrote new pieces: Parentalité au travail : les 6 leçons de la Nouvelle-Zélande, Et si "viser petit" était la clé du changement ?, Recrutement : 5 bonnes raisons de bannir les tests de personnalité, Le monde du travail est-il une nouvelle religion ?
🎙️ There are many new Building Bridges podcasts! The Art of Productive Disagreements with Ian Leslie, Discussing Europe (and Rabbits) with Noah Smith, Brave New Home with Diana Lind 🎧 Subscribe to Building Bridges.
📺 My next “Café Freelance” event with Coworkees is about freelancing and personal branding 🇫🇷 Join us on April 30 at 9:30 CET!
Miscellaneous
⚔️ It’s Time to Break the Cycle of Female Rivalry, Mikaela Kiner, Harvard Business Review, April 2021: “a big driver of female rivalry is the concept of “one seat at the table” [it] comes from a belief that diversity is mandated, but not useful.”
🍼 Why universal childcare is the crucial infrastructure the economy needs, Kathleen Davis, Fast Company, April 2021: “The U.S. lags behind many other countries in both paid leave and accessible childcare, and it’s costing women and hurting the economy.”
🥂 The Swedish Idea of “Little Saturday” Will Get You Through the Workweek, Lauren Allain, Forge, April 2021: “The Scandinavian countries have a tradition that might just offer some relief: breaking up the workweek with a celebration on Wednesday nights.”
Until next time, try and be gentle with all your assistants 🥺