Recently, I came across an article in Engineering and Technology magazine that made me realize the extent to which artificial intelligence (AI) mirrors our gender biases, conscious or unconscious. Think about the ubiquitous female voice in our home assistants: Google Home, Microsoft’s Cortana, Amazon’s Alexa, and Apple’s Siri all perpetuate the stereotype of female obedience.
What was even more disturbing was learning that this submissive attitude goes beyond the voice. In February 2017, Leah Fessler shared her experiments testing how the four home assistants responded to verbal sexual harassment. Google Home typically replied with “don’t understand,” whereas Alexa and Siri would flirt or even thank the user for the comment! Cortana mostly preferred to perform a search on Bing…
This year Fessler tested Alexa again, and she “kind of” stood up to the challenge. In spring 2017, “her” programmers gave her a “disengage mode,” and now Alexa responds to verbal harassment with “I’m not going to respond to that” or “I’m not sure what outcome you expected.” So, whilst there has been substantial progress since last year, it’s clear that the top priority is still that “Alexa doesn’t upset customers.”
We may think this is funny or inconsequential; after all, we are talking about bots. But those computers are in our houses, talking to our kids… If we are giving them a gender to mimic humans, shouldn’t they stand up for what we believe in? What’s more, why should they even have a gender?