
Apple Removes All References To Controversial CSAM Scanning Feature From Its Child Safety Webpage (macrumors.com) 36

Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods. From a report: Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search. Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.
  • Well this is good news if they're dropping it -- for the time being. Don't be fooled, they'll try this again in a couple of years or so. What's sad is that Apple deserves to be severely punished for even attempting to implement such an overtly evil "feature", but they won't be. To spell it out for those who are just hearing about it: Detecting child porn on phones? Not a bad thing. But the technology involved will almost certainly be used to find dissidents in authoritarian nations the moment the Chinese Communist Party demands it.
    • Re: Nelson: Ha ha! (Score:5, Insightful)

      by NagrothAgain ( 4130865 ) on Wednesday December 15, 2021 @12:26PM (#62083093)
      They didn't say they're not doing it, they just stopped talking about it.
    • Powerful billionaire pedos apparently can buy suckers to support them, and can have one of their own "killed" while staying at "little Gitmo," the same prison where the famous drug-lord escape artist is held.

      Slippery slope is a fallacy whereby people like the parent jump far ahead of a syllogism of logically connected steps without dealing with any of the logic required to get to their conclusion. Scanning for child porn UPLOADED to Apple servers, just as Google and Microsoft (who patented a scanning technique) do, is nothing new.

      • Scanning my data for anything at any time, whether it's on my phone or on Apple's servers, be it child porn or not, is totally objectionable. It's not acceptable for Apple to look at or scan my private data for any reason. It's none of their business what I store.
        • DO NOT UPLOAD YOUR KIDDIES TO MY SERVER! If you have a problem with that, upload your images somewhere else.

          Apple isn't preventing you from sending photos elsewhere. They have ZERO requirement to let you do anything with THEIR property. Your phone is yours but their server and their service is not. This is simple.

          Don't upload your photos if you don't want them scanned. The same goes for MS and Google too.

  • Yes... (Score:3, Insightful)

    by Anonymous Coward on Wednesday December 15, 2021 @12:00PM (#62083015)

    Because as we all know, if they remove public references to it, it means it's gone, and there's no chance it'll just be used anyway, right? Right?

    People really are gullible these days...

  • Hidden feature? (Score:5, Insightful)

    by Mitreya ( 579078 ) on Wednesday December 15, 2021 @12:02PM (#62083021)
    Oh, good, so will it be a hidden feature from now on?
    We know it is implemented already.
  • Now remove it from my phone.

  • Just demand the source code in court or you must acquit.
    Or do it in WI and hope you draw Judge Bruce Schroeder

    • Unfortunately, courts frequently are deferential to perceived authorities. Similar to how a writer for the NYT can disclose leaked private info and they are considered a journalist with special privileges to inform the public, whereas somebody with Wikileaks disclosing a bunch of politicians' offshore accounts is presumed to be merely a private person acting dubiously. No difference in law, just perceived authority. "I've got a Press Pass(tm)!"
  • by stikves ( 127823 ) on Wednesday December 15, 2021 @12:32PM (#62083117) Homepage

    Did they remove the reference, or the feature?

    It is entirely possible the 'feature' could still be alive and well, unless of course they pinky promise not to implement it at all.

  • I would like a feature to warn me about such photos. I would not like a feature that funnels it automatically to the FBI.

    These are the same people who brought you marijuana residue on walls, not as evidence of past use, but as actual possession in and of itself.

    And they feel themselves clever.

    We'll skip for the moment whether reporting to the FBI amounts to a forbidden "cozy relationship" between private and government. A repair tech who finds something and reports it, ok. A tech who goes looking for it on the government's behalf is another matter.

    • Exactly - most people whose kids use digital devices would appreciate having tools to protect them. But I'm not sure that's what this was about.
    • If you want to voluntarily download something that hashes the images/videos on your device, fine, have at it. I'm sure you can find such programs (a rough sketch of one follows below). But under no scenario should that be a feature built into the OS; there's just zero chance it won't be used to report you and abused to target things way beyond CSAM.
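      A minimal sketch of the kind of voluntary, user-run scanner described above: it hashes files on the device and compares them against a local blocklist, reporting matches only to the user and sending nothing anywhere. The blocklist contents, the ~/Pictures path, and the use of exact SHA-256 hashes are illustrative assumptions; real matching systems such as PhotoDNA or Apple's NeuralHash use perceptual hashes that survive resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-bad image hashes (hex SHA-256 strings).
# Left empty here; a real tool would ship or download a vetted list.
KNOWN_BAD_HASHES: set[str] = set()

def sha256_of_file(path: Path) -> str:
    """Hash the file in 1 MiB chunks so large videos aren't loaded into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_locally(photo_dir: str) -> list[Path]:
    """Return files whose hashes match the blocklist. Nothing leaves the device."""
    hits = []
    for path in Path(photo_dir).expanduser().rglob("*"):
        if path.is_file() and sha256_of_file(path) in KNOWN_BAD_HASHES:
            hits.append(path)
    return hits

if __name__ == "__main__":
    for hit in scan_locally("~/Pictures"):
        print(f"Warning (shown only to you): {hit} matches a known-bad hash")
```

      The point of the sketch is the trust boundary the comment is drawing: the scan runs locally, the results stay local, and the user decides what, if anything, to do about a match.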
  • I just came to see how Apple haters were going to spin this story. Apple cancels something everybody hated. How can that be bad?
    • by Miles_O'Toole ( 5152533 ) on Wednesday December 15, 2021 @01:10PM (#62083261)

      I think you'd find it difficult to dismiss me as an Apple hater when I have two iPhones. That said, my thought is that they're probably going to keep looking for images they deem improper, and perhaps even make them available to the authorities.

      They just won't talk about it.

    • by NimbleSquirrel ( 587564 ) on Wednesday December 15, 2021 @04:51PM (#62084043)

      Apple cancels something everybody hated.

      But Apple haven't actually stated they are cancelling it; they have just stopped talking about it. There is a MASSIVE difference.

      Either: 1) they are cancelling it and haven't said so yet; 2) they are holding while trying to find another way to spin this; or 3) they are rolling this out in secret anyway (this last option will be the worst PR for Apple if it comes to light).

      Apple don't exactly have a good track record for transparency. With the recent exposure of their secret $275 billion deal with the Chinese government, I would not be surprised if this had already been rolled out in secret (at least for Chinese iPhone users).

      • by tlhIngan ( 30335 )

        Either: 1) they are cancelling it and haven't said so yet; 2) they are holding while trying to find another way to spin this; or 3) they are rolling this out in secret anyway (this last option will be the worst PR for Apple if it comes to light).

        Or, they are trying to find another way to do it.

        The problem for Apple is that Apple encrypts the images so only you and those you share them with can decrypt them. Or so they claim.

        However, as an image hosting provider, they are bound by law to scan for that kind of material.

    • by khchung ( 462899 )

      I just came to see how Apple haters were going to spin this story. Apple cancels something everybody hated. How can that be bad?

      I use Apple products and am very happy with them, but I am 100% against this CSAM thing.

      Also, it is important to note, as many have said, that Apple no longer mentioning it doesn't mean they have cancelled the project.

      I've said it before: what's worse than a Big Brother government is lots of Big Brother private companies spying on you. Apple protecting our data is good. Apple trying to do law enforcement is bad, really bad, and even worse when it is done by molesting our data. Apple's business is building good phones.

  • It could have been reasonable and useful. If you visit slightly dodgy sites but do NOT want anything illegal on your phone, then a filter that removes illegal material and tells you, but nobody else, would be useful. And if Apple doesn't want you to upload illegal material to iCloud, and refuses and tells you and nobody else, that's reasonable.

    Now add an opt-in button that lets you opt in or opt out but doesn't show which one you picked, and it would all be fine. But what they announced wasn't that.
  • ... if it's actually gone away. Just the reference. And it's nothing new. Back in the days when people used film and had it developed/printed through the drugstore, the processors always scanned the prints for anything that might today be called "child porn" and reported it to the police. IIRC they were required to do that in some jurisdictions. Apple may have added features, but this sounds like a modern version of the film-processor thing. Would be surprised if it's gone - the references, maybe, but not the scanning.

    • They don't have something similar. They, of course, scan Gmail, OneDrive, etc., i.e. their own computers. Apple does too. That's fine. But Apple intended to cross a major line that the others have not: having the OS scan files on the local device, i.e. your computer, then report them to the company. That's a capability subject to such extreme abuse that not even something as terrible as CSAM should be grounds to implement it. Because you know, beyond any doubt, it will never limit itself to that.

      Further, I...
  • Lots of other companies are already doing CSAM searches of everything you post / share. Facebook and Google for starters.

    Apple announced they were going to do it. Their customers got upset, so they are backing down.

    Why isn't Google getting hate for this too? Is it because Apple is the "privacy" focused company?

    Sure, it's sketchy. But it was also, basically, opt in. You have to upload your photos to iCloud.

    And, it really seems like they are trying to get out ahead of being regulated / forced to have backdoors.

    • by Anonymous Coward

      No wonder Apple gets "hate" when its defenders lie and say "It's just what Google (etc.) do".

      Once again, this scanning was to take place *locally* *on your device*. Only rabid Apple fanbois conflate this with scanning *on a cloud service* *on someone else's server* *after you uploaded it* which is what everyone else does. This is crossing a big red line in the sand.

      But you know that, don't you? You're just fudding to protect your precious lover, Apple. No matter how many times the clear difference between the two is explained.

      • by altp ( 108775 )

        No, the CSAM scanning was in the cloud, after uploading to iCloud.

        The on-device stuff is detecting whether sexually explicit material is being shared via Messages, for users under 13, if the parents turn it on.

        2 different issues.

    • by Luckyo ( 1726890 )

      "Post/share" is public. What's on your device is private.

      Pants on the head retarded fanboy: "These are same things. There's no difference"
