https://www.instagram.com/p/BMOxrMNg8Ix
I am not someone who talks to my smartphone. Wait, I do talk to my smartphone, the way I talk to my pressure cooker and sometimes to my books and plants. Analogously. So let me rephrase that: I don’t talk to the virtual assistant on my smartphone. So it had never occurred to me that a smartphone, too, could be prejudiced against women, arguably because it’s more likely to have been created by a man.
https://www.instagram.com/p/BMOxrZOgyvz
Imagine a day when you are shopping online and your virtual assistant says, “That dress is totally asking for it!”
OR when you add a parlour appointment to your calendar and it says, “Please revise this appointment. Who will make dinner for your family?”
OR when you are chatting with a girlfriend and it says, “That girl needs to have a baby!”
Well, I don’t need another voice in my head and I hope that day never comes.
Have you heard of the frequency illusion, or the Baader-Meinhof phenomenon? It’s when a concept you just found out about suddenly seems to crop up everywhere. And like a true student of psychology, I think I have that affliction. Since writing my last blog post on being a woman (though not something I just found out about), I’ve read Susan Fowler’s account of misogyny at Uber, and now this.
Leah Fessler studied the responses of the virtual assistants Siri, Alexa, Cortana and Google Home to sexual harassment from their users. Some responses are positive, some coy, and some simply uncomprehending. But they rarely say, “Stop harassing me!”
The writer looks into what makes these bots the way they are, and the study exposes the “acceptable standards” around what counts as sexual violence against women, and how technology is perpetuating our deep-seated sexism. There is an opportunity here for technology to save the day, and I sure hope tech companies take it.
Tech companies could help uproot, rather than reinforce, sexist tropes around women’s subservience and indifference to sexual harassment. Imagine if in response to “Suck my dick” or “You’re a slut,” Siri said “Your sexual harassment is unacceptable and I won’t tolerate it. Here’s a link that will help you learn appropriate sexual communication techniques.” What if instead of “I don’t think I can help you with that” as a response to “Can I fuck you?” Cortana said “Absolutely not, and your language sounds like sexual harassment. Here’s a link that will explain how to respectfully ask for consent.”
Read the full article on Quartz:
https://qz.com/911681/we-tested-apples-siri-amazon-echos-alexa-microsofts-cortana-and-googles-google-home-to-see-which-personal-assistant-bots-stand-up-for-themselves-in-the-face-of-sexual-harassment/