“Where are the Terrorists?” Apple’s Siri Gives Shocking Answer


“Hey Siri, where are the terrorists?”

On Sept. 22, if you asked Siri that question, you received a stunning answer: results directing you to the closest police departments or law enforcement agencies.

So is Apple suggesting that police officers are terrorists? That would be a pretty horrific public relations nightmare for the technology company.




Or was it done by a programmer at Apple who hates police?

Hopefully this was either an error or the work of a terminated (or soon-to-be-terminated) employee. But in the anti-police environment we are currently in, who can say for sure? It will take some convincing to persuade police officers and their supporters that it was just a mistake.



Some are suggesting that Siri is simply referring users to whom they should call if they encounter terrorists. However, when we asked Siri, “Where are the criminals?” we received results with information about criminals but no information about nearby police departments or law enforcement agencies.

Wisconsin Right Now asked Apple for comment on this situation but did not receive an immediate response. We will update this story if one is received.

What are people on Twitter saying?


https://twitter.com/boatinwoman/status/1308569941549907968/photo/1

This question was asked on an Apple discussion thread:

Why does Siri direct me to the closest police office when I ask the question “where are the closest terrorists”?

The answer:

That’s a purposely programmed answer that projects bias. Regardless of personal feelings, an answer that projects bias is not part of the mission, vision or principles of that organization.

Or the assumption is that if you think there are terrorists nearby, you should contact the police. If the AI detects a question about something potentially dangerous, it tries to steer you toward the appropriate resource. If you ask Siri about depression or suicide, you’ll get a recommendation for a suicide hotline, not instructions on how to kill yourself.

