It's always Siri, Alexa, and Cortana, but never Tom, Dick, and Harry. Female voices are the sound of our sat navs, laptops, mobiles, and many more technologies – which is somewhat ironic given that technology is a male-dominated industry.
So, how have female voices and personalities become the default sound of our devices, and why does it matter?
The answer is simply gender imbalance.
"oH, hErE wE gO," some will yelp. "jUsT cHanGE tHe SeTTinGs To A MaLe VoIcE tHeN!?!?"
However, in the same way you can't put a plaster on a bullet wound, you can't mask the objectification of women by changing Siri's voice to "English (UK) male".
As flattering as it is to be considered the default gender for voice technology (although it's not really, is it?), the natural association between AI and female personalities is far from a compliment. You see, these technologies are often virtual assistants, which is highly telling of the tasks we expect them to carry out. From scheduling to researching to generally being submissive and at your beck and call, female virtual assistants play into the archaic stereotype of the subservient woman and, as a result, reinforce it too.
In particular, virtual assistants and servant robots fortify the idea of women responding to requests. It brings obedience and submission to the surface, as well as other antiquated traits once attributed to women. As UNESCO's report 'I'd blush if I could' identifies, "the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants – and penalized for not being assistant-like."
Technology is supposed to drive the world forwards, but in its endeavour to do so, it's sending women's progress backwards. However, the companies spearheading their own virtual assistants aren't exactly at fault. Obviously, much research will have gone into finding the optimal voice for their audience, and more often than not, it lands on female. Generally speaking, female voices are found to be more polite, more caring, and more nurturing. Businesses aren't the only ones that think so – time and time again, consumers are found to trust female voices more. In turn, it's indicative of a broader societal problem in which we are still unknowingly perpetuating narratives that women are fighting so hard to be rid of.
As we become better informed and educated about injustice and discrimination in the world, we also become more aware of bias in AI. In turn, we should move forward by ensuring that our technologies represent humankind as a whole, and not just what those building them (who are, let's face it, predominantly blokes) deem appropriate for the gadget. A great example of inclusive AI is Q, the world's first genderless AI voice. Let's see some more of that!