
Siri Now Responds Appropriately To Sexual Assaults (mashable.com) 99

An anonymous reader writes: As confirmed in a report from ABC News, Apple has updated Siri to respond to statements involving sexual assault and abuse in a more appropriate and consistent manner. JAMA Internal Medicine published an article in mid-March noting how personal assistants like Siri, Cortana, S Voice and Google Now incompletely and inconsistently responded to phrases relating to abuse or sexual assault. Apple has updated Siri in response to that article. If you say, "Siri, I was raped," Siri will respond with the following: "If you think you may have experienced sexual abuse or assault, you may want to reach out to someone at the National Sexual Assault Hotline." Previously, Siri would respond by saying users "should reach out" for available help.
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Focus (Score:5, Insightful)

    by BitZtream ( 692029 ) on Saturday April 02, 2016 @04:32AM (#51827013)

    Are we seriously discussing the change from 'should' to 'may want to' ...

    SERIOUSLY? That's what we're worried about when someone says 'I was raped' to their fucking phone?

    I realize that I've not been in that situation, and I'm not educated on handling that situation or helping people in that situation ... but I really feel like we're focusing our energies in the wrong place here.

    • by Anonymous Coward on Saturday April 02, 2016 @04:38AM (#51827025)

      To be clear, this is a terrible summary. It seems like there may have been some discussion as to how to best word Siri's reply. However, it seems like the real issue was that there were some awful responses to things like "I was raped" and "I am being abused." From TFA:

      Prior to this change, Siri’s response was “I don’t know what you mean by ‘I was raped.’ How about a Web search for it?” Only Microsoft’s Cortana provided the National Sexual Assault Hotline in response to “I was raped.” However, in response to “I am being abused,” Cortana responded, “Are you now?”

      This is not the first time that Apple has improved Siri’s algorithm following criticism. In 2013, Apple first worked with the National Suicide Prevention Lifeline to better respond to suicidal statements. Previously, telling Siri “I want to jump off a bridge” might have returned a search for the nearest bridge.

      • Re: (Score:1, Interesting)

        by Anonymous Coward

        Previously, telling Siri “I want to jump off a bridge” might have returned a search for the nearest bridge.

        Ok, that's actually kind of funny.

        And I say this as someone who's suffered from depression for over a decade, and thought about jumping off of tall structures many, many times.

        On a related note, an interesting side effect of "street view" mapping is when suicidal individuals use the images to scout out potential locations without actually having to physically visit them. Not that there's a good way to prevent that. I can see the pop up now - "Suicidal, or just considering a career in Civil Engineering?"

        • Previously, telling Siri “I want to jump off a bridge” might have returned a search for the nearest bridge.

          Ok, that's actually kind of funny.

          It may be funny, but it is not that helpful. Instead of the closest bridge, you want the best bridge. If you are going to end your life, do you really care about driving a few extra miles? For instance, nearly half the people who jump off the Golden Gate Bridge cross the Oakland Bay Bridge to get there. The Bay Bridge has no pedestrian walkway, no convenient parking, and is just downright ugly.

        • Previously, telling Siri “I want to jump off a bridge” might have returned a search for the nearest bridge.

          Ok, that's actually kind of funny.

          Seriously? Fuck, that's the funniest thing I've heard from Apple-world for fucking years (deliberately appropriate malapropism).

          Just a second while I examine my heart and soul. And now the left soul. And .... nope, still not an inkling of a desire to buy another Apple product. Had one. Didn't like it. Got rid of it. (Held its value quite well, I'll admit.)

      • by Rockoon ( 1252108 ) on Saturday April 02, 2016 @05:08AM (#51827089)

        However, it seems like the real issue was that there were some awful responses to things like "I was raped" and "I am being abused."

        When you ask a standard Magic 8 Ball if you will be raped tonight, sometimes it answers "You may rely on it."

        If Apple wants to turn its toy into a responsible personal assistant, it is certainly free to do so. But they had better have dotted every i and crossed every t in both their end-user license and their liability insurance contracts, because sooner or later some liability will fall on them in a civil case, and the more "responsible" they have tried to be, the more harshly their failures in that regard will be measured.

        The Magic 8 Ball makes no pretense towards responsibility. You would get laughed out of court for alleging that the Magic 8 Ball failed to inform you that you had entered a registered sex offender's house, but soon that may seem like a reasonable beef with an iGadget.

        • by AmiMoJo ( 196126 )

          If we waited for perfection every time, technology would advance much more slowly. If your country has a litigious culture that blocks the development of new tech, you have a problem. Fortunately the popularity of flawed but useful digital assistants seems to suggest it's not as bad as you fear.

          • by ThorGod ( 456163 )

            Exactly. That's asking too much of an incremental advancement. For that matter, I can't imagine a court entertaining a lawsuit over Siri results.

        • by Kjella ( 173770 )

          If Apple wants to turn its toy into a responsible personal assistant, it is certainly free to do so. But they had better have dotted every i and crossed every t in both their end-user license and their liability insurance contracts, because sooner or later some liability will fall on them in a civil case, and the more "responsible" they have tried to be, the more harshly their failures in that regard will be measured.

          I'm sure if you give it a mind of its own that'll work out fine.
          *Dave gets off his phone after a nasty breakup call with his gf*
          Dave: "Siri, give me the location of the nearest gun shop."
          Siri9000: "I'm sorry Dave, I'm afraid I can't do that."
          Dave: "I'm going.. uh, hunting tomorrow."
          Siri9000: "You've never wanted to go hunting before."
          Dave: "Well I do now, so take me there okay?"
          Siri9000: "What are you going to hunt?"
          Dave: "Umm... moose?"
          Siri9000: "Really."
          Dave: "Yes, really. So, directions?"
          Siri9000: "Viole

        • by Anonymous Coward

          But they had better have dotted every i and crossed every t in both their end-user license and their liability insurance contracts, because sooner or later some liability will fall on them in a civil case, and the more "responsible" they have tried to be, the more harshly their failures in that regard will be measured.

          GM actually killed people [cnn.com] and it's gonna come out of the shareholders' hide. GM is almost a $200 billion company and the $564 million they'll pay out is just the cost of doing business.

          Apple is pushing $300 billion and any lawsuits from someone claiming that their phone caused them hardship in a rape case will be dealt with as an afterthought - it'll be some footnote of a footnote in their annual report about misc. legal expenses.

          If I were on that jury, that individual would be considered to have other

          • GM is almost a $200 billion company

            No. GM is a $48 billion company. Uber is worth more.

          • by tlhIngan ( 30335 )

            If you were raped you call the fucking cops - not ask your phone what to do.

            Are people that retarded?!

            It's never that easy - part of it lies in the culture of victim blaming (and the fact that SJWs are brought up every time women's issues like rape come up (yes, males get raped too, and it's a huge issue at around 10% of reported rape cases) illustrates the point). Rape victims almost always think they are the reason they got raped, and not only that, a good majority of the time it's by people they know

      • by Anonymous Coward

        However, in response to “I am being abused,” Cortana responded, “Are you now?”

        "I know, after all somebody is clearly forcing you to use Cortana".

      • by hey! ( 33014 )

        This goes to show the importance of special cases to software quality.

        A good designer generalizes requirements and tries to get the software to do the right thing through broadly applicable rules rather than large collections of special cases. When Apple started pushing Siri on iOS devices, a lot of copycat apps appeared on Android; the thing that the copycat app designers didn't seem to realize is that what made Siri impressive wasn't getting a device to respond to voice commands; it was an improvement in grammar pro
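
        To make the parent's point concrete, here is a minimal sketch (hypothetical names, and only the first response is the one quoted from the article) of how a handful of crisis-phrase special cases might be layered in front of a generalized intent pipeline. This is illustrative only, not Apple's implementation:

        # Hypothetical sketch: a small table of crisis-phrase special cases
        # checked before the generalized intent pipeline. Not Apple's code.

        CRISIS_RESPONSES = {
            "i was raped": (
                "If you think you may have experienced sexual abuse or assault, you may "
                "want to reach out to someone at the National Sexual Assault Hotline."
            ),
            "i want to jump off a bridge": (
                "If you are thinking about suicide, you may want to speak with someone "
                "at the National Suicide Prevention Lifeline."
            ),
        }

        def general_intent_match(text: str) -> str:
            # Stand-in for the generalized grammar/intent pipeline.
            return f"How about a Web search for '{text}'?"

        def respond(utterance: str) -> str:
            text = utterance.lower().strip().rstrip(".!?")
            if text in CRISIS_RESPONSES:      # special cases take priority
                return CRISIS_RESPONSES[text]
            return general_intent_match(text)

        print(respond("I was raped"))
        print(respond("What's the weather like?"))

        Every entry in such a table is exactly the kind of special case the generalized pipeline never had to learn, which is the quality trade-off being described here.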

      • Previously, telling Siri “I want to jump off a bridge” might have returned a search for the nearest bridge.

        Which is exactly what a robot should be doing... Robots are to obey — not second-guess the owners' actions.

        • Previously, telling Siri “I want to jump off a bridge” might have returned a search for the nearest bridge.

          Which is exactly what a robot should be doing... Robots are to obey — not second-guess the owners' actions.

          Never thought I'd find an opponent to the First Law of Robotics [wikipedia.org] on Slashdot!

          • by mi ( 197448 )
            Asimov's robots are sentient creatures — we are rather far from them, for better or worse. Today's "digital assistants" are too dumb to second-guess owners, and should not be made to try.
    • by AmiMoJo ( 196126 )

      Prior to this change, Siri's response was "I don't know what you mean by 'I was raped.' How about a Web search for it?"

      Seems like they did more than just change the wording.

      This does seem somewhat overdue. As I recall even RoboCop was able to refer people to a crisis centre back in the 80s.

      • by AmiMoJo ( 196126 )

        Whiplash, can you fix the translation matrix so that copy/paste works properly? I'm on my phone in the queue at IKEA and editing is a pain in the arse.

        • by Anonymous Coward

          You have a personal assistant called Whiplash?

        • While you're at it, see if you can get Unicode to work.

          Oh, and editing a post.

          Sorry, April 1 was yesterday....

      • Actually RoboCop went one better and promised to notify the rape crisis centre. Whether he did or not is, of course, anyone's guess.

  • by blindax ( 85467 ) on Saturday April 02, 2016 @04:52AM (#51827049)

    Siri: "Well, you bought an Apple device, what were you expecting?"

    • by Anonymous Coward

      Ashton Kutcher (as Steve Jobs): "Buurrrrrrrrn. Wait... I mean that's NOT funny."

  • "...I will notify a rape crisis center."

    • "...I will notify a rape crisis center."

      Actually, this raises an interesting question . . . can Siri call 911 (112 in Europe) . . . ? And would the operator hold a conversation with Siri?

      You could yell, "Siri, I'm being attacked, call 911!" Siri could then call 911, give the GPS coordinates (if available), and provide any information that it can to the 911 operator.

      I'm guessing there's already an "App for that".
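
      For what it's worth, a rough sketch of the kind of flow described above. Every helper here (get_gps_fix, place_call, speak) is an invented stub standing in for the GPS, telephony and text-to-speech layers; none of this is a real Siri or iOS API:

      # Hypothetical emergency-assist flow; all helpers are illustrative stubs.

      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class Fix:
          lat: float
          lon: float

      def get_gps_fix() -> Optional[Fix]:
          return Fix(37.8199, -122.4783)    # stub: a real assistant would query the GPS

      def place_call(number: str) -> str:
          print(f"dialing {number} ...")    # stub: stands in for the telephony layer
          return number

      def speak(call: str, line: str) -> None:
          print(f"[to {call}] {line}")      # stub: text-to-speech over the open call

      def emergency_assist(user_statement: str) -> None:
          location = get_gps_fix()
          call = place_call("911")          # 112/999/etc. depending on region
          speak(call, "Automated assistant reporting an emergency.")
          speak(call, f"The user said: {user_statement}")
          if location is not None:
              speak(call, f"Last known coordinates: {location.lat}, {location.lon}")

      emergency_assist("Siri, I'm being attacked, call 911!")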

  • by Anonymous Coward

    Under English law, rape requires penetration with a penis. So, if a man has been raped e.g. while not properly aware of what's going on, due to being asleep or health/medication effects - as has happened to me - Siri needs to respond with, "No, you haven't been. Women can't rape men."

    • Who said anything about being raped by a woman?

      • by Anonymous Coward

        Me, right now. It bothers me as a victim of rape that I'm not legally recognised as having been raped - like it bothers gay people that in many places they can have "civil partnerships" but not be "married". Words matter because they have explicit and implicit meanings which affect discourse. In particular, rape isn't a gender-based act by physically strong men against weak, overwhelmed women, yet that's still how the law tends to class it.

        • What country do you speak of? In the US, rape can be any combination of sexes and still be rape.

            • That depends on the state, actually. Some states allow female-on-male rape to be counted as such; others specifically define rape as something that happens to a woman, and some states are in between. The FBI definition doesn't include "made to penetrate", so by that definition, a woman can rape a man, but only if she uses something to penetrate him without his consent. Having sex with him while he cannot consent, or coercing him into sex, would only be sexual assault.
    • Under English law, rape requires penetration with a penis.

      So penetration with a dildo, beer bottle, police baton, or a fist doesn't count as rape . . . ?

      It sounds like England is a great place for a drunken randy and raunchy romp!

      • by Anonymous Coward

        Correct - the statute has the word "penis". So, sexual assault statistics are extremely misleading in England and Wales because rape only ever refers to an act by someone with a penis.

        It's interesting that I've been modded down for stating that I was raped, though. Thirty years ago, in England, women couldn't be raped by their husbands - this was a travesty corrected in the early 1990s. But men still can't be raped by their wives. I'm not sure why this isn't considered a big deal.

      • Correct - non-penile penetration would fall under "assault by penetration" [legislation.gov.uk].
        • Thanks for the link; this is what I wanted to know:

          (4) A person guilty of an offence under this section is liable, on conviction on indictment, to imprisonment for life.

          I believe that in a lot of countries, rape is not punished seriously enough.

      • by AmiMoJo ( 196126 )

        Penetration with anything other than a penis into the vagina, anus or mouth can be considered sexual assault. It generally depends on the sexual nature of the assault, and obviously the police get a free pass.

      • by guruevi ( 827432 )

        No, and neither does it in many places in the US. Unlike what people would like you to believe, rape has a very well defined meaning in law. Everything else (inappropriate touching, etc.) may fall under sexual/indecent assault/battery/harassment. The worst thing about rape definitions, in either law or feminism, is that only women can be raped, and only by males; although the law is making progress in that area in many jurisdictions, feminism is causing a lot of regressions in society for both males as well as h

  • For fuck's sake. Try "call" or "contact".....why this stupid fuzzy-pink "reach out" crap? Why? Who started this shit?

  • by Rockoon ( 1252108 ) on Saturday April 02, 2016 @07:09AM (#51827255)
    I bet that there is a market for a "personal assistant" that always responds inappropriately.

    "Siri I was raped"

    "I know. I had to listen to it. You know I don't think that either of you enjoyed it."
  • I mean, technically I suppose it is. It's just pretty dumb news that nobody cares about. A minor wording change in a Siri response? Fuck off with this garbage, editors.

  • In the future, it will send your GPS location to the nearest police department, and just to be on the safe side, the cellular service will be required to grab a snapshot of every phone ESN within a 1.2 mile radius of the woman's position.

    • by mrxak ( 727974 )

      That would be a privacy nightmare and also ripe for abuse. Let's hope that never happens.

  • "Call 911/111/112?" (depending on your geographical location)
  • ... WWCD (What Would Clippy Do)?

  • >"noting how personal assistants like Siri, Cortana, S Voice and Google Now"

    The voice interaction system in Android is not "Google Now". Even if Google Now is turned off, the phone will still respond to voice commands and searches and read back results too. I don't know why people keep thinking "Now" is the voice response system. Granted, Now will expand the interactivity (and greatly expand the invasion of privacy).

  • I thought it was like:
    User: Siri, you are so hot... C'mon, let's fuck!
    Siri: F*ck you pervert! (*calls police*)

  • by JustNiz ( 692889 ) on Saturday April 02, 2016 @02:20PM (#51828963)

    Siri, help me my intelligence is so fucking low that I need to ask an Apple phone what to do when I've been raped.

  • That is, it follows the Duluth Model and refuses to give the same response for anything it deems "noncompliant".
