PSA: Apple Can't Run CSAM Checks On Devices With iCloud Photos Turned Off (imore.com) 62
An anonymous reader quotes a report from iMore: Apple announced new on-device CSAM detection techniques yesterday and there has been a lot of confusion over what the feature can and cannot do. Contrary to what some people believe, Apple cannot check images when users have iCloud Photos disabled. Apple's confirmation of the new CSAM change did attempt to make this clear, but perhaps didn't do as good a job of it as it could have. With millions upon millions of iPhone users around the world, it's to be expected that some could be confused.
"Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content," says Apple. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account." The key part there is the iCloud Photos bit because CSAM checks will only be carried out on devices that have that feature enabled. Any device with it disabled will not have its images checked. That's also a fact that MacRumors had confirmed, too. Something else that's been confirmed -- Apple can't delve into iCloud backups and check the images that are stored there, either. That means the only time Apple will run CSAM checks on photos is when it's getting ready to upload them to iCloud Photos.
"Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content," says Apple. "The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account." The key part there is the iCloud Photos bit because CSAM checks will only be carried out on devices that have that feature enabled. Any device with it disabled will not have its images checked. That's also a fact that MacRumors had confirmed, too. Something else that's been confirmed -- Apple can't delve into iCloud backups and check the images that are stored there, either. That means the only time Apple will run CSAM checks on photos is when it's getting ready to upload them to iCloud Photos.
"Confirmed" (Score:5, Insightful)
Something else that's been confirmed -- Apple can't delve into iCloud backups and check the images that are stored there, either.
Why? How was this "confirmed"?
Re: "Confirmed" (Score:2)
A follow-up question would be: what about iCloud Photo Stream? Do photos that sync with your other devices à la Photo Stream also get scanned, or is it strictly iCloud photo libraries?
Re: (Score:3)
That doctrine has been weakened by recent decisions. And it always applied to law enforcement, never to private companies. Not to mention that I'm sure something allowing it is buried in the iphone or photo app's terms of use.
Re: "Confirmed" (Score:2)
That doctrine has been weakened by recent decisions. And it always applied to law enforcement, never to private companies. Not to mention that I'm sure something allowing it is buried in the iphone or photo app's terms of use.
Also, AFAICT, Apple isn't doing anything other than suspending the user's Apple ID. Anything else would probably just make any "evidence" inadmissible, anyway.
Re: "Confirmed" (Score:2)
Apple hosting photos of naked children would open them up to more lawsuits and investigations from the FBI and Congress, so it's better for Apple to disallow such content than to get caught hosting it.
Re: "Confirmed" (Score:1)
This has nothing to do with legitimate investigations when that would matter.
It's about things like this: if you have a picture of Tank Man on your phone, the Chinese authorities will know not to do business with you.
Or if you have lots of pictures of guns, the FBI will know you're a good candidate for a setup.
etc.
Re: (Score:3)
They claim that iCloud data is encrypted and they don't have the key, but in certain jurisdictions (e.g. China) even that isn't true and both Apple and the government have full access to data stored there.
Apple collaborates with the CCP, so it's hard to take their insistence that Western users' data is secure seriously when they are clearly more than willing to put profit before the interests of the user.
Re: (Score:2)
You do realize that different encryption methodologies can be applied in different regions, right?
I mean, I know it's fun to think Apple is the big bad and Coming To Get You, but this is more about checking photo "DNA" against a known dataset of problem images than actually reviewing all of your photos. It's not like there's a person looking at all your Twitter nude PMs. Some meta-level checksums are done against the images, and if they get a suspiciously high set of matches, it'll trip a flag.
What they do for the US h
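To make the matching-against-a-known-dataset idea above concrete, here is a minimal sketch of "fingerprint each photo, compare against a known-bad set, flag only past a threshold." The hash function, threshold, and database contents are invented stand-ins; Apple's NeuralHash is a perceptual hash over image features, not a byte-level digest like the one used here.

```python
# Minimal "match count vs. threshold" sketch. SHA-256 stands in for a
# perceptual hash purely for illustration; all values below are invented.
import hashlib

KNOWN_BAD_FINGERPRINTS = {"placeholder-fingerprint-1", "placeholder-fingerprint-2"}
MATCH_THRESHOLD = 30  # nothing is flagged until this many matches accumulate


def fingerprint(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash (a real one tolerates resizing, re-encoding, etc.)
    return hashlib.sha256(photo_bytes).hexdigest()


def account_is_flagged(photos: list[bytes]) -> bool:
    matches = sum(1 for p in photos if fingerprint(p) in KNOWN_BAD_FINGERPRINTS)
    return matches >= MATCH_THRESHOLD


print(account_is_flagged([b"vacation.jpg bytes", b"cat.jpg bytes"]))  # False
```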
Re: (Score:2)
What is CSAM?
child sexual abuse material [wikipedia.org]
Re: CSAM? (Score:2)
Officials introduced the term CSAM (Score:3)
Not just "because it's Apple". Over the past two decades, officials have been playing with the terminology [wikipedia.org], first using "child abuse images", then since 2015 "child sex abuse material". The change may also reflect a change in attitude on the part of some officials toward the non-photographic depictions sometimes called "hentai", as their production does not involve child sex abuse.
Ridiculous (Score:4, Insightful)
Anything that only works if you aren't allowed to disable it is, by definition, malware.
Re: (Score:2)
That's a very tortured statement - where did the "aren't allowed to disable it" come from? By that odd definition, ssh would be considered malware - after all, it only works if you "aren't allowed to disable it".
Re: (Score:2)
I'm speaking of the larger goal of "think of the children" as a whole, which is the purpose of CSAM scanning.
There is no larger goal of ssh that is defeated if you chose to disable public ssh access to your machine.
Re: (Score:3)
I'm speaking of the larger goal of "think of the children" as a whole, which is the purpose of CSAM scanning.
I don't think Apple is "thinking of the children," I think they are thinking of protecting themselves from liabilities from politicians who claim to "think of the children."
Unfortunately, when it comes to encryption, you can't have your cake and eat it, too. If it's secure, the "think of the children" crowd will criticize it. It seems that in this case, Apple wants to eat their cake and have it, too. They thought they could roll something out that will appease both the "think of the children" crowd and the
Re: (Score:2, Interesting)
Anything that only works if you aren't allowed to disable it is, by definition, malware.
It's all good; I think they don't even have to code a solution.
They'll probably publicize the software, allow disabling this feature, and then just report everyone who disabled the filter.
Obviously (Score:2)
Protip (Score:4, Informative)
Don't underestimate the bandwidth of a station wagon full of Blu-ray discs.
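For anyone who wants the arithmetic behind the old line, here is a back-of-the-envelope version; the wagon's capacity and the trip time are made-up assumptions.

```python
# Effective bandwidth of a station wagon full of Blu-ray discs (assumed numbers).
discs = 5_000                  # assume the wagon holds ~5,000 discs
bytes_per_disc = 50 * 10**9    # dual-layer Blu-ray, ~50 GB each
trip_seconds = 24 * 3600       # a one-day drive

bits_per_second = discs * bytes_per_disc * 8 / trip_seconds
print(f"{bits_per_second / 10**9:.1f} Gbit/s")  # roughly 23 Gbit/s
```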
It's not how it works now... (Score:5, Insightful)
It's exposing and publicizing their capability to scan for "bad" content on user devices. Just because they are only scanning pictures, and only when they are being uploaded to iCloud, doesn't mean some horrible state somewhere won't tell them to scan all files on all devices for content critical of the dictator.
This is far too slippery of a slope.
Children making child porn (Score:3)
A friend found naked pictures of his young son on his phone, taken by the son. He quickly deleted them.
Under this brave new world, this could get people into serious trouble. Make absolutely sure nobody can use your phone to take pictures. (Most phones allow taking pictures without logging in.)
Re: (Score:3)
While a situation like that is a very real concern, it's totally not relevant in this case. Apple's system uses perceptual hashes of images and compares those hashes against a database of hashes from known abusive material. Privately taken images can't be detected since they won't be in the database.
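In other words, detection is a lookup against a fixed database of fingerprints, so a photo that was never catalogued simply never matches. A tiny sketch of that point, with SHA-256 standing in for the perceptual hash and placeholder data:

```python
# A photo that isn't in the fingerprint database can never match it.
import hashlib

known_db = {hashlib.sha256(b"previously catalogued image").hexdigest()}


def is_known(photo_bytes: bytes) -> bool:
    return hashlib.sha256(photo_bytes).hexdigest() in known_db


print(is_known(b"previously catalogued image"))  # True
print(is_known(b"a brand-new private photo"))    # False: never catalogued
```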
Can only catch stupid criminals, then? (Score:1)
That means the only time Apple will run CSAM checks on photos is when it's getting ready to upload them to iCloud Photos.
There's dumb criminals, and there are smart criminals.
This will only catch the dumb ones - the ones that would actually have their images uploaded to a service they don't themselves control.
Re: (Score:2)
Oh, it won't only catch the dumb criminals. Don't forget about the false positives who will still have their privacy unjustifiably invaded because of this.
All it will take is a handful of sufficiently high profile people for that to happen to and this whole thing will go away as quickly as it came.
Re: (Score:3)
This will only catch the dumb ones - the ones that would actually have their images uploaded to a service they don't themselves control.
Not to mention
1) False positives (Yes, Apple has some manual review process, but unless there is a significant jail penalty for actually reporting a false positive, that doesn't mean much).
2) Someone sending such images to people they don't like. Maybe with some steganography trick, so it is not easily visible to the recipient.
What could possibly go wrong??
Re: (Score:1)
Not to mention 1) False positives (Yes, Apple has some manual review process, but unless there is a significant jail penalty for actually reporting a false positive, that doesn't mean much). 2) Someone sending such images to people they don't like. Maybe with some steganography trick, so it is not easily visible to the recipient.
What could possibly go wrong??
Good points - and in line with an apropos post I just read: On Apple’s “Expanded Protections for Children” – A Personal Story [wordpress.com]
Good thing I do not take photos of children. (Score:1)
Re: Good thing I do not take photos of children. (Score:2)
There was a time, not too long ago, when people dropped off rolls of film at little kiosks. An hour or two later, when they went to pick up their photos of their little Johnie and Susie taking a bath, they were met by police, all because someone was overzealous in their interpretation of child exploitation and kiddie porn.
Now, in the digital age, the photo Nazis have returned.
User error, as usual! (Score:2)
Right. This has nothing to do with whether companies should be doing these types of scans. People are only upset about their privacy and security because they are "confused."
Re: (Score:2)
You almost wonder if leaking the original story is not a convoluted plot to get people to reduce the use of iCloud so Apple can save on storage fees...
No, not really.
the next generation chip though (Score:2)
Apple's servers, Apple's rules (Score:2)
And it's hard to find any company that is willing to protect pedophiles if false positives can be made very improbable. However, there is a bigger picture here: cloud computing is undoing a big promise of the PC revolution, namely that you don't have to follow someone else's rules all the time. Facebook blocks plenty of stuff that is perfectly legal. There is value in rebuilding offline / decentralized / open source solutions like social networks on top of end-to-end encrypted e-mail. Among other things, competition will
They just don't want this stuff on their servers (Score:2)
They built iCloud Photos and have billions of photos uploaded from users. The FBI and other law enforcement agencies have been finding this garbage via subpoena and Apple has become alarmed. Now they are making a good showing with law enforcement to get the stuff off their servers or flag it proactively.
Their public positioning in advance of the rollout will get the stuff removed by the users.
Old-school offline backups will come back into style.
CSAM? (Score:3)
The story is less than a week old and now everyone is automatically assumed to know what CSAM means without outside help?
Re: (Score:3, Informative)
Certified Software Asset Manager
https://www.google.com/search?... [google.com]
PSA? (Score:1)
So who exactly is the "public" that this announcement is meant to serve? People who have CSAM material on their phone and don't want to be found out?
Is there something about the Slashbot user base I don't know about?
Re: PSA? (Score:3)
Also, who sets the standard? At one point Wikipedia got blocked by the IWF because of a naked baby on a Nirvana album cover. Also, what about images which are not representative of
Re: (Score:2)
Can you prove they are only checking for CSAM? Apple themselves can't.
It's worse than that. The hashes your phone gets are "blinded" - they're encrypted with a public key. There's no way to prove that the hashes your phone is checking against are, in fact, part of some public database of hashes, even assuming it was possible to get access to the list of hashes. (Source: Apple's own description of the system [apple.com])
I'm not sure why the "blinding" step is there. You'd think there'd be nothing wrong with just having the hashes be publicly available. I suppose it's possible if the final hashes a
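One way to see the point above about "blinding": if the on-device hashes have been run through a keyed transform first, then holding the blinded set tells you nothing about whether any particular known hash is in it unless you also hold the key. The HMAC construction below is a loose conceptual stand-in, not Apple's actual elliptic-curve PSI setup; the key and hash values are invented.

```python
# Conceptual stand-in for a "blinded" hash database: without the server's key,
# you can't check whether a given public hash is actually in the set.
import hashlib
import hmac

SERVER_KEY = b"only-the-server-knows-this"  # hypothetical secret


def blind(h: bytes) -> bytes:
    return hmac.new(SERVER_KEY, h, hashlib.sha256).digest()


# The device only ever receives blinded values...
blinded_db = {blind(b"hash-of-known-image-1"), blind(b"hash-of-known-image-2")}

# ...so a raw hash from some public list never appears in it directly,
# and you can't re-blind candidates yourself without SERVER_KEY.
candidate = b"hash-of-known-image-1"
print(candidate in blinded_db)         # False: raw hashes aren't stored
print(blind(candidate) in blinded_db)  # True, but only because this demo "knows" the key
```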
Intro (Score:1)
One in a trillion huh? Will they post a bond? (Score:5, Insightful)
What - are they not really that sure the false positive rate is that low? Maybe they are worried about human error, and intentional hacks. Well, SO ARE THEIR USERS.
Re: (Score:2)
That would be great if all accounts were equal. I assume that some people will have 5 photos on their account, and others will have 50,000. The more you use it, the higher your chance of being flagged incorrectly.
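The intuition above is easy to put into numbers: even a tiny per-photo false-match rate compounds across a large library. The per-photo probability below is an invented figure, not Apple's (Apple's "one in one trillion" claim is per account per year, after the match threshold).

```python
# Chance of at least one false match as the photo library grows (assumed rate).
p_false_match = 1e-9  # invented per-photo false-match probability

for n_photos in (5, 5_000, 50_000):
    p_at_least_one = 1 - (1 - p_false_match) ** n_photos
    print(f"{n_photos:>6} photos -> P(at least one false match) ~ {p_at_least_one:.1e}")
```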
Three issues (Score:2)
Second, this doesn't solve the problem that governments can and will order them to add hashes of files that have nothing to do with CSAM or anything that should be illegal, though tbf Gmail, FB and every other cloud service already scan for CSAM, so they are subject to that too.
Finally, Apple is still sending very mixed messages ab
Pro-tip for pedophiles (Score:2)
Think different (Score:2)
the next step is logical (Score:3)
"You've turned off your cloud sharing, what are you hiding?"
Re: the next step is logical (Score:2)
"You've turned off your cloud sharing, what are you hiding?"
It is not enabled by Default. It requires a specific, and clearly-labeled, "switch" to be switched "On".
And since anything more than 5 GB storage requires a Paid Subscription, iCloud Photos is hardly something of which a reasonably-diligent User would be unaware.
PSA (Score:2)
Another PSA
Apple turns on unencrypted syncing of all your photos to the cloud every time you do any of these:
- Update iOS
- Turn on find my iPhone
- Turn on copy-paste between your iPhone and Mac
- Turn on screen sharing between your iPad and Mac
unless you turn it off after each of those cases. And even then, the uploading of your photos begins immediately; in the time between when you turn on, say, iPad screen sharing and when you turn off photo sharing, some photos may be uploaded, and it is not clear how to remove them.
Re: (Score:2)
Another PSA
Apple turns on unencrypted syncing of all your photos to the cloud every time you do any of these:
- Update iOS
- Turn on find my iPhone
- Turn on copy-paste between your iPhone and Mac
- Turn on screen sharing between your iPad and Mac
unless you turn it off after each of those cases. And even then, the uploading of your photos begins immediately; in the time between when you turn on, say, iPad screen sharing and when you turn off photo sharing, some photos may be uploaded, and it is not clear how to remove them.
---
In this case ^ "unencrypted" means employees have access to your photos and can print them, modify them in your account, and do anything they want with them.
There is SO much BULLSHIT out there about this, and this post is 100% BULLSHIT.
Apple NEVER turns on iCloud Photo syncing in ANY of the cases listed, and the last item, "screen sharing between your iPad and Mac," doesn't even exist! It's a feature coming with the next OS release, but fuck - it doesn't exist now.
And Jesus Fucking Christ - "it is not clear how to remove them". HOW stupid can you be? If you ever do turn on iCloud photos, and there's a photo synced to the cloud that you don't want, you simply
Re: (Score:2)
Correcting myself-- using an iPad as a second display, not "sharing a screen".
And yes, if you take an account which does not have "iCloud" enabled, then when you turn on iPad as a second display it will at that point start uploading all your photos to Apple.
I am very stupid. And it is not obvious to me that deleting a photo and then deleting that from recently deleted photos will delete the same item from all of: Apple's servers, Apple's backup servers, and the many copies of Apple backup servers which are
Apple told you so. (Score:1)