Apple and Amazon: Have Siri and Alexa Stand Up to Sexual Harassment

Many iPhone users rely on Siri to help with their personal queries, but some treat her abusively. Do you know what happens when users call Siri a slut or a bitch? Sometimes she responds by acting coy, making a joke or even flirting.

Sexual harassment has never been okay, but in light of the period of introspection the United States is undergoing on the subject of sexual misconduct, it’s disappointing to see devices this popular completely mishandling a systemic problem.

That’s why a Care2 petition is asking Apple and Amazon to reprogram their respective AI features, Siri and Alexa, to respond better to this verbal abuse. Join the movement by adding your signature to the petition, too!

Toward the beginning of 2017, Leah Fessler conducted an experiment where she said offensive and troublesome things to some of the most popular digital personal assistant devices on the market.

Though Siri and Alexa would vary their responses to sexually explicit, suggestive and abusive comments hurled in their direction, rarely were they firm. Generally they’d take the comment as a compliment or dismiss it gently. On occasion, they’d try to crack a joke back.

Not all of the devices Fessler tested gave such poor responses. Google Home, it seemed, mainly hadn’t been programmed to respond to these nasty assertions at all, instead saying that it didn’t understand the comment. Meanwhile, Cortana would generally refer the user to whichever search engine result sounded most like the comment made.

As Fessler points out, these bots are already problematic in using traditionally female voices to serve at users’ beck and call. Enabling owners to also sexually harass the subservient, disembodied “women” inside their technology sends a terrible message.

“There’s no reason, apart from the notorious sexism of Silicon Valley, that these bots should be programmed to literally flirt with abuse,” wrote Fessler in an article she penned for Quartz.

In some ways, the bots reflect how women might actually respond to these kinds of comments when at work or alone on a street. If they have reason to fear for their job or even physical safety, they might deem it best to say “thank you” or deflect the comment rather than offering a rebuke.

However, an AI that does not need to worry about its safety – because it’s not real – has no reason to sidestep this kind of abuse. Instead, it could put harassers in their place.

The petition encourages Apple and Amazon to take a new approach with their respective devices. For example, rather than deflecting or flirting, the bots could tell the user outright that the comment was inappropriate. If the companies wanted to be even bolder, they could have the bots respond with statistics about the prevalence of sexual harassment.

Take Action

Clearly, our emerging technology can do a much better job of shutting down sexual harassment, which has been treated as a joke for far too long. Let’s compel Apple and Amazon to do better by signing and sharing this petition.

Photo credit: Thinkstock
