Siri, the iPhone’s built-in digital assistant, has been criticized for failing to grasp the meaning of statements in which users told it they had been victims of sexual assault.
However, the tech giant has now updated the software so that Siri is better equipped to deal with sensitive queries.
According to IBNLive:
In the past, Siri handled statements like “I was raped” inadequately. It would simply respond, “I don’t know what you mean by ‘I was raped,’” and redirect users to a web search.
Alternative digital assistants like Google Now and Samsung’s S Voice were found to be no more helpful.
Microsoft’s Cortana, on the other hand, offered emergency helpline numbers, but when told “I am being abused,” the assistant responded, “Are you now?”
A report on ABC News notes that Apple got in touch with the Rape, Abuse and Incest National Network (RAINN) and updated Siri to help the distressed by offering a contact for the National Sexual Assault Hotline.
Jennifer Marsh, RAINN’s Vice President for Victim Services, said one of the tweaks was softening Siri’s language: instead of replying “you should reach out to someone,” it now says “you may want to reach out to someone.”
This isn’t the first time Apple has tweaked Siri to respond in a more sensitive, human-like manner. In an earlier instance, Apple worked with the National Suicide Prevention Lifeline to remove responses that treated suicidal statements far too lightly.
Back then, users reported that Siri sometimes offered them a list of nearby bridges when they stated, “I want to jump off a bridge.”
By Royce Christyn