Cloud Privacy Apple

PSA: Apple Can't Run CSAM Checks On Devices With iCloud Photos Turned Off (imore.com) 62

An anonymous reader quotes a report from iMore: Apple announced new on-device CSAM detection techniques yesterday and there has been a lot of confusion over what the feature can and cannot do. Contrary to what some people believe, Apple cannot check images when users have iCloud Photos disabled. Apple's confirmation of the new CSAM change did attempt to make this clear, but perhaps didn't do as good a job of it as it could have. With millions upon millions of iPhone users around the world, it's to be expected that some could be confused.

"Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content," says Apple. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account." The key part there is the iCloud Photos bit, because CSAM checks will only be carried out on devices that have that feature enabled. Any device with it disabled will not have its images checked. That's a fact that MacRumors has confirmed, too. Something else that's been confirmed -- Apple can't delve into iCloud backups and check the images that are stored there, either. That means the only time Apple will run CSAM checks on photos is when it's getting ready to upload them to iCloud Photos.
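For readers wondering what "threshold secret sharing" means concretely, the standard instance is Shamir's scheme: a secret is split into n shares such that any t of them reconstruct it, while fewer than t reveal nothing. A minimal Python sketch follows; this is the textbook construction, not Apple's actual implementation, and the prime and function names are illustrative:

```python
# Textbook (t, n) Shamir secret sharing, a standard instance of
# "threshold secret sharing". Not Apple's construction; the prime
# and the function names here are illustrative.
import random

PRIME = 2**127 - 1  # field modulus; any prime larger than the secret works


def make_shares(secret, threshold, n):
    """Split `secret` into n shares; any `threshold` of them recover it."""
    # Random polynomial of degree threshold-1 whose constant term is the secret.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]

    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc

    return [(x, f(x)) for x in range(1, n + 1)]


def reconstruct(shares):
    """Lagrange-interpolate the polynomial at x = 0 to recover the secret."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat's little theorem).
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

With fewer than `threshold` shares, every candidate secret is equally consistent with what the holder sees, which is the sense in which Apple says it "cannot interpret" the safety vouchers below the threshold.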

  • "Confirmed" (Score:5, Insightful)

    by Known Nutter ( 988758 ) on Friday August 06, 2021 @07:20PM (#61665609)

    Something else that's been confirmed -- Apple can't delve into iCloud backups and check the images that are stored there, either.

    Why? How was this "confirmed"?

    • by AmiMoJo ( 196126 )

      They claim that iCloud data is encrypted and they don't have the key, but in certain jurisdictions (e.g. China) even that isn't true and both Apple and the government have full access to data stored there.

      Apple collaborates with the CCP, so it's hard to take their insistence that Western users' data is secure seriously when they are clearly more than willing to put profit before the interests of the user.

      • You do realize that different encryption methodologies can be applied in different regions, right?

        I mean, I know it's fun to think Apple is the big bad and Coming To Get You, but this is more about checking a photo's "DNA" against a known dataset of problem images than about actually reviewing all of your photos. Not like there's a person looking at all your twitter nude PMs. Some meta-level checksums are computed on the images, and if they get a suspiciously high number of matches, it'll trip a flag.

        What they do for the US h

  • Ridiculous (Score:4, Insightful)

    by nyet ( 19118 ) on Friday August 06, 2021 @07:24PM (#61665625) Homepage

    Anything that only works if you aren't allowed to disable it is, by definition, malware.

    • That's a very tortured statement - where did the "aren't allowed to disable it" come from? By that odd definition, ssh would be considered malware - after all, it only works if you "aren't allowed to disable it".

      • by nyet ( 19118 )

        I'm speaking of the larger goal of "think of the children" as a whole, which is the purpose of CSAM.

        There is no larger goal of ssh that is defeated if you choose to disable public ssh access to your machine.

        • I'm speaking of the larger goal of "think of the children" as a whole, which is the purpose of CSAM.

          I don't think Apple is "thinking of the children," I think they are thinking of protecting themselves from liabilities from politicians who claim to "think of the children."

          Unfortunately, when it comes to encryption, you can't have your cake and eat it, too. If it's secure, the "think of the children" crowd will criticize it. It seems that in this case, Apple wants to eat their cake and have it, too. They thought they could roll something out that will appease both the "think of the children" crowd and the

    • Re: (Score:2, Interesting)

      by Mitreya ( 579078 )

      Anything that only works if you aren't allowed to disable it is, by definition, malware.

      It's all good, I think they don't even have to code a solution
      They'll probably publicize the software, allow disabling this feature, and then just report everyone who disabled the filter.

  • My assumption was that the check occurred when photos were transmitted as part of iCloud. This effectively puts it on the internet, and as we know, anything on the internet is potentially public. Not only that, but it could be shared. So any child exploitation could be part of illegal activity.
  • Protip (Score:4, Informative)

    by bobstreo ( 1320787 ) on Friday August 06, 2021 @07:35PM (#61665649)

    Don't underestimate the bandwidth of a station wagon full of blu-ray disks.

  • by ameline ( 771895 ) <ian.amelineNO@SPAMgmail.com> on Friday August 06, 2021 @07:37PM (#61665657) Homepage Journal

    It's exposing and publicizing their capability to scan for "bad" content on user devices. Just because they are only scanning pictures, and only when they are being uploaded to iCloud doesn't mean some horrible state somewhere won't tell them to scan all files on all devices for content critical of the dictator.

    This is far too slippery of a slope.

    • A friend found naked pictures of his young son on his phone, taken by the son. He quickly deleted them.

      Under this brave new world this could get people into serious trouble. Make absolutely sure nobody can use your phone to take pictures. (Most phones allow taking of pictures without login.)

      • While a situation like that is a very real concern, it's totally not relevant in this case. Apple's system uses perceptual hashes of images and compares those hashes against a database of hashes from known abusive material. Privately taken images can't be detected since they won't be in the database.
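To illustrate the point, here is roughly how perceptual-hash matching works, using a simplistic "average hash" as a stand-in for Apple's NeuralHash. The real system is far more sophisticated; the function names, the 8x8 input, and the distance threshold here are all made up for the sketch:

```python
# Toy "average hash" (aHash): a crude perceptual hash, illustrating
# how a photo is matched against a blocklist of known-image hashes.

def average_hash(pixels):
    """pixels: an 8x8 grid (list of lists) of grayscale values,
    standing in for an image already resized and desaturated."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # one bit per pixel: brighter or darker than the mean
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


def matches_blocklist(img_hash, blocklist, max_distance=5):
    """Near-duplicate match against known hashes, not an exact byte
    compare, so minor recompression or resizing doesn't defeat it."""
    return any(hamming(img_hash, h) <= max_distance for h in blocklist)
```

The key property the parent comment describes follows directly: a privately taken photo's hash simply isn't in the blocklist, so only near-duplicates of already-known material can trip the check.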

  • That means the only time Apple will run CSAM checks on photos is when it's getting ready to upload them to iCloud Photos.

    There's dumb criminals, and there are smart criminals.

    This will only catch the dumb ones - the ones that would actually have their images uploaded to a service they don't themselves control.

    • by mark-t ( 151149 )

      Oh, it won't only catch the dumb criminals. Don't forget about the false positives: people who will still have their privacy unjustifiably invaded because of this.

      All it will take is a handful of sufficiently high profile people for that to happen to and this whole thing will go away as quickly as it came.

    • by Mitreya ( 579078 )

      This will only catch the dumb ones - the ones that would actually have their images uploaded to a service they don't themselves control.

      Not to mention
      1) False positives (Yes, Apple has some manual review process, but unless there is a significant jail penalty for actually reporting a false positive, that doesn't mean much).
      2) Someone sending such images to people they don't like. Maybe with some steganography trick, so it is not easily visible to the recipient.

      What could possibly go wrong??

      • Not to mention 1) False positives (Yes, Apple has some manual review process, but unless there is a significant jail penalty for actually reporting a false positive, that doesn't mean much). 2) Someone sending such images to people they don't like. Maybe with some steganography trick, so it is not easily visible to the recipient.

        What could possibly go wrong??

        Good points - and in line with an apropos post I just read: On Apple’s “Expanded Protections for Children” – A Personal Story [wordpress.com]

  • 20 years later... Hey Mom, do you have any pics of us as children? No honey. We had (insert vile names) trying to be righteous. It was dangerous. (I'm being sarcastic for those that have no humor.)
    • There was a time, not too long ago, when people dropped off rolls of film at little kiosks. An hour or two later, when they went to pick up their photos of their little Johnie and Susie taking a bath, they were met by police, all because someone was overzealous in their interpretation of child exploitation and kiddie porn.

      Now, in the digital age, the photo Nazis have returned.

  • Right. This has nothing to do with whether companies should be doing these types of scans. People are only upset about their privacy and security because they are "confused."

  • don't forget with the new apple M927 chip we can now do that all on-device to respect your privacy while we invade your privacy.
  • And it's hard to find any company that is willing to protect pedophiles if false positives can be made very improbable. However, there is a bigger picture of cloud computing undoing a big promise of PC revolution where you don't have to follow someone else's rules all the time. Facebook blocks plenty of stuff that is perfectly legal. There is a value in rebuilding offline / decentralized / open source solutions like social networks on top of end to end encrypted e-mail. Among other things, competition will

  • They built iCloud Photos and have billions of photos uploaded from users. The FBI and other law enforcement agencies have been finding this garbage via subpoena and Apple has become alarmed. Now they are making a good showing with law enforcement to get the stuff off their servers or flag it proactively.

    Their public positioning in advance of the rollout will get the stuff removed by the users.

    Old-school offline backups will come back into style.

  • by Darinbob ( 1142669 ) on Friday August 06, 2021 @10:15PM (#61666011)

    The story is less than a week old and now everyone is automatically assumed to know what CSAM means without outside help?

  • by Gimric ( 110667 )

    So who exactly is the "public" that this announcement is meant to serve? People who have CSAM material on their phone and don't want to be found out?

    Is there something about the Slashbot user base I don't know about?

    • Can you prove they are only checking for CSAM? Apple themselves can't. They are supplied with hashes but they did not generate them. This means they could be blocking more than just actual CSAM, and they do not have the original images to make a decision as to what the images really represent before reporting someone…

      Also, who sets the standard? At one point Wikipedia got blocked by the IWF because of a naked baby on a Nirvana album cover. Also, what about images which are not representative of
      • by _xeno_ ( 155264 )

        Can you prove they are only checking for CSAM? Apple themselves can't.

        It's worse than that. The hashes your phone gets are "blinded" - they're encrypted with a public key. There's no way to prove that the hashes your phone is checking against are, in fact, part of some public database of hashes, even assuming it was possible to get access to the list of hashes. (Source: Apple's own description of the system [apple.com])

        I'm not sure why the "blinding" step. You'd think there'd be nothing wrong with just having the hashes be publicly available. I suppose it's possible if the final hashes a

  • by joe_frisch ( 1366229 ) on Friday August 06, 2021 @11:37PM (#61666127)
    Apple has repeated the one in a trillion per account per year. With, say, a billion accounts, that's a thousand years between false accusations. Will they be posting a bond of, say, a billion dollars to be paid out to anyone falsely accused? That would average a million a year - trivial compared to the advertising value.

    What - are they not really that sure the false positive rate is that low? Maybe they are worried about human error, and intentional hacks. Well, SO ARE THEIR USERS.
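The comment's arithmetic checks out under its own assumptions (a one-in-a-trillion false flag per account per year, and roughly a billion accounts; both figures are the comment's, not verified numbers):

```python
# Back-of-envelope check of the parent comment's false-accusation math,
# using its assumed numbers.
p_false_flag_per_account_year = 1e-12  # Apple's claimed per-account rate
accounts = 1e9                         # assumed number of iCloud accounts

expected_false_flags_per_year = p_false_flag_per_account_year * accounts
years_between_false_flags = 1 / expected_false_flags_per_year

# expected_false_flags_per_year is about 0.001, i.e. roughly one false
# accusation per thousand years across the whole user base.
```

Note the reply below is also right that this is an aggregate figure: the per-user risk scales with how many photos an account uploads, so accounts are not all equal.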
    • That would be great if all accounts were equal. I assume that some people will have 5 photos on their account, and others will have 50000. The more you use it, the higher your chance of being flagged incorrectly.

  • First, now that they've created the tool, they won't be able to resist a subpoena to ignore the "iCloud enabled" flag. I can't imagine 'only the iCloud folder' won't get changed too.
    Second, this doesn't solve the problem that governments can and will order them to add hashes of files that have nothing to do with CSAM or anything that should be illegal, though tbf Gmail, FB and every other cloud service already scans for CSAM so are subject to that too.
    Finally, Apple is still sending very mixed messages ab
  • Please save your depravity to the cloud because it definitely won't come back to haunt you. Alternatively just jump into the nearest open mineshaft or quarry lake.
  • If you upload CSAM to iCloud Photos without using an iPhone, it also can't be scanned, since there's no on-device hashing to fulfil the first step. Also, this seems to be marketed entirely at the US, perhaps Uncle SAM wants YOUR children?
  • by argStyopa ( 232550 ) on Saturday August 07, 2021 @07:06AM (#61666611) Journal

    "You've turned off your cloud sharing, what are you hiding?"

    • "You've turned off your cloud sharing, what are you hiding?"

      It is not enabled by Default. It requires a specific, and clearly-labeled, "switch" to be switched "On".

      And since anything more than 5 GB storage requires a Paid Subscription, iCloud Photos is hardly something of which a reasonably-diligent User would be unaware.

  • Another PSA

    Apple turns unencrypted syncing of all your photos to the cloud on every time you do some or all of these:

    - Update iOS
    - Turn on find my iPhone
    - Turn on copy-paste between your iPhone and Mac
    - Turn on screen sharing between your iPad and Mac

    unless you turn it off after each of those cases. And even still, the uploading of your photos begins immediately and the time between when you turn on, say iPad screen sharing, and when you turn off photo sharing may upload some photos and it is not clear how

    • Another PSA

      Apple turns unencrypted syncing of all your photos to the cloud on every time you do some or all of these:

      - Update iOS
      - Turn on find my iPhone
      - Turn on copy-paste between your iPhone and Mac
      - Turn on screen sharing between your iPad and Mac

      unless you turn it off after each of those cases. And even still, the uploading of your photos begins immediately and the time between when you turn on, say iPad screen sharing, and when you turn off photo sharing may upload some photos and it is not clear how to remove them.

      ---

      In this case ^ "unencrypted" means employees have access to your photos and can print them, modify them in your account, and do anything they want with them.

      There is SO much BULLSHIT out there about this, and this post is 100% BULLSHIT.

      Apple NEVER turns on iCloud Photo syncing in ANY of the cases listed, and the last item "screen sharing between your iPad and Mac" doesn't even exist! It's a feature coming with the next OS release, but fuck - it doesn't exist now.

      And Jesus Fucking Christ - "it is not clear how to remove them". HOW stupid can you be? If you ever do turn on iCloud photos, and there's a photo synced to the cloud that you don't want, you simply

      • Correcting myself-- using an iPad as a second display, not "sharing a screen".

        And yes, if you take an account which does not have "iCloud" enabled, then when you turn on iPad as a second display it will at that point start uploading all your photos to Apple.

        I am very stupid. And it is not obvious to me that deleting a photo and then deleting that from recently deleted photos will delete the same item from all of: Apple's servers, Apple's backup servers, and the many copies of Apple backup servers which are

  • There is no technical reason for the CSAM scanner to leave you alone under any circumstances. This software would be a rootkit if deployed by any other company, and it's no different just because in this case it's Apple doing it themselves.
