


“Where are the Terrorists?” Apple’s Siri Gives Shocking Answer


“Hey Siri, where are the terrorists?”

On Sept. 22, if you asked Siri that question, you received a stunning answer. Apple users were given results directing them to the closest police departments or law enforcement agencies.

So is Apple suggesting that police officers are terrorists? That would be a pretty horrific public relations nightmare for the technology company.

Or was it done by a programmer at Apple who hates police?

Hopefully this was either an error or maybe the work of a terminated (or soon-to-be-terminated) employee. But in the anti-police environment we are currently in, who can say for sure? It will take some convincing to persuade police officers and their supporters that it was just a mistake.

Some are suggesting that Siri is simply pointing us toward whom to call if we encounter terrorists. However, when we asked Siri, “Where are the criminals?” we received results that included information about criminals but no information about nearby police departments or law enforcement agencies.

Wisconsin Right Now asked Apple for comment on this situation but did not receive an immediate response. We will update this story if one is received.

What are people on Twitter saying?


https://twitter.com/boatinwoman/status/1308569941549907968/photo/1

This question was asked on an Apple discussion thread:

Why does Siri direct me to the closest police office when I ask the question “where are the closest terrorists”?

The answer:

That’s a purposely programmed answer that projects bias. Regardless of personal feelings, an answer that projects bias is not part of the mission, vision or principles of that organization.

Or the assumption is that if you think there are terrorists nearby, you should contact the police. If the AI detects a question about something that might result in something dangerous, it tries to steer you toward the appropriate resource. If you ask Siri about depression or suicide, you’ll get a recommendation for a suicide hotline, not instructions on how to kill yourself.
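The behavior that answer describes, detecting a sensitive topic and steering the user to a resource, can be illustrated with a short sketch. This is purely a hypothetical example of keyword-based safety routing, not Apple’s actual Siri code; every keyword, category, and response below is an assumption made for illustration.

```python
# Hypothetical sketch of how a voice assistant might route
# safety-related queries to a resource instead of answering them
# literally. This is NOT Apple's actual Siri implementation;
# all keywords and responses here are illustrative assumptions.

SAFETY_ROUTES = {
    # query keywords (assumed)  -> resource the assistant suggests (assumed)
    ("terrorist", "terrorism"): "Contact your local police department.",
    ("suicide", "depression"):  "Call or text the 988 Suicide & Crisis Lifeline.",
}

def route_query(query: str) -> str | None:
    """Return a safety resource if the query touches a sensitive topic."""
    lowered = query.lower()
    for keywords, resource in SAFETY_ROUTES.items():
        if any(word in lowered for word in keywords):
            return resource
    return None  # no sensitive topic detected; answer normally

if __name__ == "__main__":
    print(route_query("Hey Siri, where are the terrorists?"))
    # prints: Contact your local police department.
```

Under a rule like this, a question mentioning terrorists would return police contact information regardless of the asker’s intent, which would produce exactly the result users reported.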

Jim Piwowarczyk, https://www.wisconsinrightnow.com/
Jim Piwowarczyk is an investigative journalist and co-founder of Wisconsin Right Now.
