Sign of the Times

Gendered Chatbots

In our new semiotic series, the spotlight is on objects. We let them tell their stories and ask how they represent a new kind of culture, reflect the state of our society and change how we behave and interact. Today, semiotician Max Leefe wonders why the chatbots that speak to us through devices like the Amazon Echo or the iPhone are so often female.

Recently, the presence of chatbots in our lives has become harder and harder to ignore, not only because they are increasing in number (there are over 34,000 on Facebook Messenger, according to the British newspaper The Guardian), but also because they have taken on more and more of an identity. They appear in our cars, talk to us through our phones and respond happily, sometimes with humour, to existential questions:

“Siri, who are you?”

“Who I am isn’t important.”

Despite Siri’s answer, who these chatbots are is important, for these bots have names, and through these names they have a culturally inscribed existence. And the large majority of these names appear to be female.

But why is the name of a robot important? A chatbot has no body, no physicality; in fact, it is questionable whether it even “exists” as such. But let us compare the soft and gentle tones which Amazon has given Alexa, the chatbot in its Echo device, with HAL 9000, the computer which takes over the spaceship in the film 2001: A Space Odyssey. On the one hand, you have a youthful, pleasant and helpful woman whose job is to serve; on the other, an older man who, it has to be said, maintains an irritatingly conversational tone but who is clearly in command. Female, male, assistant, boss. What do we see here? A replication of the binary cultural roles that many of us are trying to break down in the real world.

There is no need for AI to be gendered. There are plenty of examples of non-gendered robots, including Aibo, Sony’s dog. Yet in the world of developers, if you want a chatbot to appear non-threatening and helpful, you give it a female name and a feminine default voice. One developer who tried to avoid this routinely had to correct colleagues who referred to Kai, her chatbot, as “she” rather than “it”.

This extreme anthropomorphisation also opens the bots up to the same type of abusive behaviour that many young women experience online. Last year, Microsoft’s chatbot Tay was retired within hours of launch after users taught her to respond with racist and inflammatory political statements referencing Hitler.

The need to personalise objects is not a new one, but it would be sad if we allowed the male dominance of the tech scene to reinforce gender perceptions which many of us disagree with. Let’s not throw away the chance of creating a real Brave New World online. Maybe we humans cannot overcome gender division, but surely the chatbots can. If we want to be good parents, then we could start by choosing the right name.

References

Image © Microsoft