Apple Delays Plans To Roll Out CSAM Detection in iOS 15 (techcrunch.com)

Apple has delayed plans to roll out its child sexual abuse material (CSAM) detection technology that it chaotically announced last month, citing feedback from customers and policy groups. From a report: That feedback, if you recall, has been largely negative. The Electronic Frontier Foundation said this week it had amassed more than 25,000 signatures from consumers. On top of that, close to 100 policy and rights groups, including the American Civil Liberties Union, also called on Apple to abandon plans to roll out the technology. In a statement on Friday morning, Apple told TechCrunch: "Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
  • Drop fake hints about upcoming evil plans, absorb inevitable public outcry, "cancel" said fake plans so you can show how seriously you take customer feedback and how much you value privacy. Why would anyone ever trust any of these huge tech companies at this point about anything?
  • Everyone hated it (Score:5, Interesting)

    by SuperKendall ( 25149 ) on Friday September 03, 2021 @09:17AM (#61759403)

    CSAM detection was the thing that finally united the most die-hard Apple hater with the most vigorous Apple evangelist - I did not see a single person outside Apple supporting it; in fact everyone I saw hated it, and hated that Apple was doing it.

    Glad to hear Apple is listening to reason - and customers.

    • I think there are valid arguments to be made for it—mostly in the interest of self-regulating with a "privacy preserving" feature so that they don't have more draconian measures imposed on them—but I didn't see a single person unequivocally supporting this. All of the support I saw was heavily qualified. Even my statement that I think there are valid arguments to be made is heavily qualified.

      • Similarities: entering my phone is a lot like entering my home. Have the enablers of CSAM scanning really thought this through?

    • by gweihir ( 88907 )

      I am pretty sure Apple did this under coercion. They cannot be that stupid. At the very least, if this had been their own idea, they would have tested it on a selected user group under NDA and would have seen this would cause a major uproar. I also have a deep suspicion that they sabotaged their own announcement as far as they dared without being obvious about it.

      Now it is important to keep the rejection of that idea going strong.

      Overall it is good to see that the "think of the children" fallacy is losing

    • by tlhIngan ( 30335 )

      One wonders if Apple did it on purpose.

      You have to remember, Apple is the only cloud provider not actually scanning cloud-stored photos for CSAM. Every other cloud provider is.

      The question then becomes: if the images are stored encrypted on Apple's servers and Apple can't view them, how do you scan for them? They don't exist in any viewable format anywhere other than on the devices the users own, so you pretty much have to push the image scanning to just prior to image upload.

      One could argue Apple simply wasn't pla

      • Apple can already read everything on iCloud unencrypted. It's only data on your phone that they can't get at, by design. It remains unclear why Apple wanted to scan on users' devices; they have not actually said. I believe the reason is that this would allow them to scan only US iPhones, instead of having to scan all their content servers for people all over the world.
  • by gacattac ( 7156519 ) on Friday September 03, 2021 @09:23AM (#61759413)

    Where is the opposition to the vast number of actors already using this?

    PhotoDNA - https://en.wikipedia.org/wiki/... [wikipedia.org] - PhotoDNA is an image-identification technology used for detecting child pornography and other illegal content which is reported to the National Center for Missing & Exploited Children (NCMEC) as required by law. From a database of known illegal images and video files, it creates unique hashes to represent each image, which can then be used to identify other instances of those images ... It is used on Microsoft's own services including Bing and OneDrive,[4] as well as by Google's Gmail, Twitter,[5] Facebook,[6] Adobe Systems,[7] Reddit,[8] Discord[9] and the NCMEC

    Where are attacks on Microsoft, Google, Twitter, Facebook, Adobe, Reddit and Discord?

    Where are the demands that Microsoft, Google, Twitter, Facebook, Adobe, Reddit and Discord stop doing this?

    Where are the declarations that - danger danger - governments could poison the database of hashes to track whistleblower material shared by Gmail?

    Just to preempt that criticism - for Apple, this would only have applied if photo syncing with iCloud was turned on, and it would have required a large number of matches before any flag was given; a rough sketch of that kind of hash-and-threshold matching follows.
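
    Purely for illustration: PhotoDNA and Apple's NeuralHash are proprietary, so the sketch below stands in with the open-source Python imagehash library (pHash). The hash value, distance cutoff, reporting threshold and file names are all made up; it only shows the general shape of "hash each photo, compare it against a database of known hashes, and flag only after many matches."

    # Hypothetical sketch of hash-database matching with a reporting threshold.
    # Uses the open-source imagehash library as a stand-in for PhotoDNA/NeuralHash.
    from PIL import Image
    import imagehash

    KNOWN_HASHES = {imagehash.hex_to_hash("d1d1b1a1c1e1f101")}  # placeholder database entry
    MATCH_DISTANCE = 5      # max Hamming distance to count as "the same image"
    REPORT_THRESHOLD = 30   # matches required before anything is flagged at all

    def count_matches(photo_paths):
        """Count how many photos perceptually match any known hash."""
        matches = 0
        for path in photo_paths:
            h = imagehash.phash(Image.open(path))
            if any(h - known <= MATCH_DISTANCE for known in KNOWN_HASHES):
                matches += 1
        return matches

    if count_matches(["IMG_0001.jpg", "IMG_0002.jpg"]) >= REPORT_THRESHOLD:
        print("Threshold exceeded; only now would human review happen.")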

    • by Anonymous Coward on Friday September 03, 2021 @09:27AM (#61759421)

      None of those other assholes even pretended to care about our privacy. Part of Apple's pitch was: we care about your privacy. That's why the Apple hate.

    • by havock ( 42287 ) on Friday September 03, 2021 @09:38AM (#61759439)

      None of those other services made your personal device scan for content locally and report back to the mothership

      • by gweihir ( 88907 )

        None of those other services made your personal device scan for content locally and report back to the mothership

        Indeed. And that is the real difference here. Of course anything loaded into some cloud gets scanned. That is why you should, for example, always encrypt backups going to hardware not under your control. But scanning on _your_ device is on a different level, and fortunately many people realized that.

      • Just to preempt that criticism - for Apple, this would only have applied if photo sharing with iCloud was turned on

    • by DarkOx ( 621550 ) on Friday September 03, 2021 @09:42AM (#61759451) Journal

      I think there is a difference in perception (if not a practical one, given how locked down iOS is): those are third-party services. People get that 'the cloud is someone else's computer' on a certain level. Whether they think it's right or not - or whether it's true or not - most of the public probably assumes Google can sift through the content of your Gdrive or mailbox.

      On the other hand, people still think the phone in their pocket that they paid nearly $1,000 for belongs to them! They think what is on it is, and should be, private, and they don't like the idea of any outsider, be it an algorithm or a human reviewer (remember, 99.9% of the public does not understand this technology), sifting through it.

      They also worry (rightly) that it could be abused. What will happen to them if the guy they just let go at work gets a burner phone and starts texting them pictures snapped in the locker room at the public pool? Will they suddenly find themselves being perp-walked before the news cameras?

      • by sabri ( 584428 )
        You are quite right about everything you say. However:

        the phone in their pocket that they paid nearly a $1000 belongs to them!

        This is still true. You have the choice not to upgrade to iOS 15, or to any other release that includes the surveillance software.

        • Upgrade anyway and refuse to use iCloud, while making a point to let Apple know you intend to migrate away entirely if they do not change their stance. The loss of iCloud revenue from people migrating to services like SpiderOak or rolling their own NextCloud should be enough to give them pause. After all, there are plenty of zero knowledge providers with end-to-end encryption which cannot and will not support this type of spying.
    • Fair enough, but which of those other companies have billboards touting the privacy of their products?
    • by carvalhao ( 774969 ) on Friday September 03, 2021 @11:58AM (#61759919) Journal

      Ok, I will spell it out for you.

      There are so many attack vectors that could potentially destroy the life of an iPhone owner that it's not even funny.

      I am not a Cybersecurity expert, but here you have a couple I have devised on my own:
      1. I know your phone syncs with iCloud. If I have temporary (about one minute should be enough) physical access to your LOCKED phone, I can take pictures of CSAM imagery, for instance, from the screen of another device. Next time you unlock, those get copied to your library, synced with iCloud, and your life is destroyed;
      2. I know your phone syncs with iCloud. I know you have auto-download enabled on IM (WhatsApp, for instance). I send you 30 images of CSAM in the middle of the night, so you don't actually see them, and then delete them from the WhatsApp chat. They get automatically downloaded to your gallery, synced to iCloud. You wake up the next morning with a SWAT wake-up call. Your life is destroyed.

      Two examples I just came up with. I can probably create a couple more, again, with no cybersecurity/hacking/whatever expertise.

      Scared already?

      • by gweihir ( 88907 )

        I am an IT Security expert, and yes, both scenarios are entirely plausible. You may get exonerated in both cases, but at least in the US that will be after you have lost your job, your family and your reputation. There are also realistic scenarios where it becomes highly unlikely that you will get exonerated.

      • by mce ( 509 )
        To be fully correct: The first scenario will not trigger the mechanism as previously described by Apple. Taking a picture of a known CSAM picture will yield a picture that displays the same material, but will not map to the same hash value that Apple uses to detect the original image.

        Having said that, it still is a major security hole and (if it works - I don't have an iPhone to test this with) it could obviously be used to cause major headaches to the iPhone's owner. Just not as "fully automatically"

        • I admit I am out of my depth here, so honest question: for hashing to be an effective strategy, wouldn't they have to use algorithms resistant to cropping and tampering, such as this: https://ieeexplore.ieee.org/do... [ieee.org] If so, could this potentially lead to a picture of a picture still triggering the algorithm? Will be thankful for any explanation!
          • by mce ( 509 )
            If you take a picture of a picture, then digitally you get a completely different picture - even if the content is "the same". Here are just a few reasons why:
            • The size (# of pixels) changes;
            • There may be cropping, or new borders;
            • Almost certainly there will be geometrical distortions, because the source picture is not 100% flat, or the image sensor is not positioned 100% parallel to the source, or not positioned *exactly* opposite the center of the source image, or the camera lens is not 100% pe
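
            A rough way to see this effect, again using the open-source imagehash library's pHash rather than Apple's actual NeuralHash (the file names here are hypothetical): a simple re-save usually stays within a few bits of the original hash, while a photo of a screen typically drifts much further for the reasons listed above.

            # Illustrative only: compare perceptual-hash distances with imagehash.
            from PIL import Image
            import imagehash

            original = imagehash.phash(Image.open("original.jpg"))
            resaved = imagehash.phash(Image.open("original_recompressed.jpg"))  # same pixels, new JPEG
            rephoto = imagehash.phash(Image.open("photo_of_screen.jpg"))        # camera pointed at a screen

            # Subtracting two hashes gives their Hamming distance in bits.
            print("re-saved distance:", original - resaved)          # typically small: would still match
            print("re-photographed distance:", original - rephoto)   # usually much larger: no match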
    • I personally believe the U.S. government is behind all this, since it's pretty trivial to end-to-end encrypt files stored in the cloud, yet none of the big American tech-giants do so. My take is that they're being pressured by the government NOT to encrypt their file storage systems, to enable the NSA and other agencies to gain intelligence from them.

      Only MEGA, a New Zealand company started by a German expat, uses end-to-end encryption effectively.
  • Everything has spin these days. We've got everything from someone calling this a liberal thing to someone else saying everyone was against it . . . but facts are nowhere to be found here.
    fact - it was a 'casual' announcement with few details

    I asked at the announcement, 'what does this mean'? If it's going to alert the user of problematic images, in the same way it alerts you of bad/compromised passwords . . . maybe it's not a bad idea. I know I'd prefer some kind of litmus test I could run to see results of a law being applied to
  • by JaredOfEuropa ( 526365 ) on Friday September 03, 2021 @10:02AM (#61759519) Journal
    “Critically important child safety features”, really? A booster seat is a critical safety feature. A child safety lock is an important safety feature. This? This doesn’t even rank up there with parental locks. It’s not even a safety feature; it prevents nothing but merely may help catch people who look at child porn (but not those who actually make it). The impact of CSAM detection will be negligible, especially after you’ve told everyone about it. But sure, I get it, it’s about our children, so I guess anything goes.
    • by gweihir ( 88907 )

      Pretty much. Calling this "critical" is an instance of the "Big Lie" technique. The only people this would ever have caught would have been people who did this by accident or who had CSAM put there by malicious actors to destroy their reputation. As such, its value for protecting children is somewhere between zero and somewhat negative. On the unstated part, however, namely making it acceptable to scan user devices without consent or a court order, it would have been disastrous.

  • Good. (Score:5, Insightful)

    by b0s0z0ku ( 752509 ) on Friday September 03, 2021 @10:07AM (#61759531)
    If it can be used against CSAM, it can be used against (say) Hong Kong freedom material. If governments know that Apple has the capability, they will pressure Apple to use it for political purposes.
    • Re:Good. (Score:5, Insightful)

      by jwymanm ( 627857 ) on Friday September 03, 2021 @10:54AM (#61759683) Homepage
      If all of you don't understand that this is almost certainly being added because of government pressure, then nothing will save us. They always start with the children. Save the children. Then they win over customer acceptance that way and push government agendas later, silently. Apple sells fucking phones; it's not a child-saving organization. If anything, they've abused (or at least used) children. The irony is horrendous.
    • There’s nothing insightful about this.

      Apple’s system is far less intrusive, and far harder to abuse, than what every other cloud storage vendor does. If you consider CSAM to be a problem, then Apple is not the bad guy here. Or rather, they are dramatically less of a bad guy than every other cloud storage vendor.

    • If it can be used against CSAM, it can be used against (say) Hong Kong freedom material. If governments know that Apple has the capability, they will pressure Apple to use it for political purposes.

      Shh! Don't let the bad governments know they can ask Apple to do things they won't want to do! The world would be a terrible place if they found out, thank god they don't know! We're so much safer if they don't know Apple could detect things in images. Boy we really dodged a bullet on that one.

  • by Anonymous Coward

    What Apple isn't disclosing is their internal atmosphere about this, or the increase in customers who disabled iCloud, which will hurt their bottom line.

    Apple, please stick to innovation -- stop pandering to gov't agencies, it's old -- we're tired of it.

  • Every other cloud provider does vastly more intrusive scanning of content, with no safeguards at all. Apple has been punished for being open about what they’re doing, and trying to build in safeguards.

    • Not on the client side, they don't.

    • The problem is that with a cloud service, there are access logs. If somebody hacks your account and uploads illegal content, the cloud provider has a record of an IP that will not trace back to you. Here, there is nothing obvious that would exonerate you. Cell phones don't do much local logging.

      With this, anybody who hacks your consumer device can place illegal content on it. What do you do when you get an email that says, "We have hacked your device. Send us a bitcoin or else we'll transfer illegal

  • by Malifescent ( 7411208 ) on Friday September 03, 2021 @12:46PM (#61760037)
    Apple will still burn over this. They're merely delaying it, not axing it. Most likely they're gonna run full-page ads in newspapers and 30-second infomercials on TV. I'm betting they're already feeling the impact in lower sales.

    I already explained that Apple's management will not be able to turn their back on this, since it would make them vulnerable to accusations that they don't care about children's sexual safety.

    The only way for Apple to definitively solve this is to encrypt iCloud storage end-to-end, so they don't get difficult questions from governments and politicians about why there's CSAM on their servers and why they're not doing anything about it. If everything's encrypted, no one knows what's on the servers, so no hard questions (a rough sketch of client-side encryption appears at the end of this comment).

    No one's asking Mega whether there's CSAM on their servers (I'm pretty sure there is) because no one knows.
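
    A minimal sketch of what encrypting client-side before upload would look like, assuming a generic put-style storage call; the key handling, file name and upload_encrypted() helper are hypothetical and greatly simplified.

    # Hypothetical sketch: encrypt on the device, so the provider only ever stores ciphertext.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()  # in practice derived from the user's passphrase or keychain
    f = Fernet(key)

    def upload_encrypted(name, data):
        # Stand-in for whatever storage API the provider actually exposes.
        print(f"uploading {name}: {len(data)} encrypted bytes")

    with open("IMG_0001.jpg", "rb") as fh:
        ciphertext = f.encrypt(fh.read())

    # Without the key, the provider cannot scan, index, or hand over the photo's contents.
    upload_encrypted("IMG_0001.jpg.enc", ciphertext)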
  • by account_deleted ( 4530225 ) on Friday September 03, 2021 @01:28PM (#61760183)
    Comment removed based on user account deletion
    • by DarkOx ( 621550 )

      Cool story - now enlighten us: have you reduced your interaction with all aspects of the modern world to posting to Slashdot from a 14-year-old PC running Linux From Scratch in your mom's basement via your neighbor's unsecured wifi, or to which of the alternatives - nearly all of them worse for your privacy, even after Apple announced this BS - have you moved?

      I am not excusing Apple here in any way. It's just that sometimes I read stuff like this and I wonder about the people who claim to be voting with their feet/wa

  • Everything was go for deployment. And that's when the Facebook Fact Checkers hit us with the truth. We knew we had to stop the rollout.
  • It's too late now. The damage has been done. Apple has already shown that it can and will install intrusive software on iPhones at will. Software that is mysterious and opaque in its form and function, whose decisions cannot be scrutinised or questioned.

    It won't be long until a totalitarian government controlling a large share of Apple's market insists that Apple use the same software on its citizens' phones, and of course they will only be looking for bad people.

    COVID-19 has shown just how risk-averse a population of "good people" ca

"An idealist is one who, on noticing that a rose smells better than a cabbage, concludes that it will also make better soup." - H.L. Mencken

Working...