Apple Confirms It Will Begin Scanning iCloud Photos for Child Abuse Images (techcrunch.com) 135
Apple will roll out a technology that will allow the company to detect and report known child sexual abuse material to law enforcement in a way it says will preserve user privacy. From a report: Apple told TechCrunch that the detection of child sexual abuse material (CSAM) is one of several new features aimed at better protecting the children who use its services from online harm, including filters to block potentially sexually explicit photos sent and received through a child's iMessage account. Another feature will intervene when a user tries to search for CSAM-related terms through Siri and Search.
Most cloud services -- Dropbox, Google, and Microsoft to name a few -- already scan user files for content that might violate their terms of service or be potentially illegal, like CSAM. But Apple has long resisted scanning users' files in the cloud by giving users the option to encrypt their data before it ever reaches Apple's iCloud servers. Apple said its new CSAM detection technology -- NeuralHash -- instead works on a user's device, and can identify if a user uploads known child abuse imagery to iCloud without decrypting the images until a threshold is met and a sequence of checks to verify the content is cleared. News of Apple's effort leaked Wednesday when Matthew Green, a cryptography professor at Johns Hopkins University, revealed the existence of the new technology in a series of tweets. The news was met with resistance from some security experts and privacy advocates, as well as from users accustomed to Apple's approach to security and privacy, which most other companies don't match.
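A minimal sketch in Python of the threshold mechanism the summary describes. Every name and number below is a hypothetical illustration; Apple has not published NeuralHash's internals or parameters:

    # Hypothetical sketch of threshold-gated matching -- not Apple's actual
    # NeuralHash code; the hash set and threshold are invented placeholders.
    KNOWN_BAD_HASHES = {0x8F3A9C2B, 0x17B2C4D8}  # placeholder perceptual hashes
    MATCH_THRESHOLD = 30                         # assumed threshold value

    def should_escalate(photo_hashes: list) -> bool:
        """Count on-device matches; escalate only once the threshold is met."""
        matches = sum(1 for h in photo_hashes if h in KNOWN_BAD_HASHES)
        # Below the threshold, nothing is decrypted or reported; above it,
        # Apple describes a further sequence of verification checks.
        return matches >= MATCH_THRESHOLD

The point of a threshold is that no single match triggers anything on its own; only an accumulation of matches starts the verification process described above.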
So Apple does rummage through everything (Score:2, Flamebait)
Re: (Score:2)
https://www.apple.com/child-sa... [apple.com]
This is going to be baked in the OS, and appears to be situated around reporting already known images sent or received by a user.
Basically it's an anti-virus scanner for CSAM that calls the authorities when a hash matches.
Re: (Score:2)
This is going to be baked in the OS, and appears to be situated around reporting already known images sent or received by a user.
Basically it's an anti-virus scanner for CSAM
Good analogy. But if this runs locally on your device, it must be either downloading a massive hash-table of known CP, or uploading a hash of all your photos.
Even if it's just a few megabytes of hashes per phone, that is petabytes in total. What is it supposed to achieve?
Catching some sad old perves circulating vintage known CP, who are so dumb they keep it in iCloud. Might result in some suicides and prison terms, but any evidence it will actually protect kids?
An AI that detected *new* CP would be more effective, b
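A quick back-of-the-envelope on that petabytes claim, with assumed figures (the device count is a round guess):

    # Assumed: ~1 billion active iPhones, "a few megabytes" of hashes each.
    devices = 1.0e9
    db_per_phone_mb = 5.0
    total_pb = devices * db_per_phone_mb / 1e9  # 1 PB = 1e9 MB
    print(f"{total_pb:.0f} PB in aggregate")    # -> 5 PB

So the aggregate number checks out, though each individual phone would only store its own few-megabyte copy of the database.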
Re: (Score:2)
Re: (Score:2)
No need to speculate; they say it:
Re: (Score:2)
Maybe it's just not that big? Like, perhaps there's a known 'corpus' of creep shots they find repeatedly in raids; hash against that and you'll get the majority of it, or something.
Re: (Score:3)
That is basically the motivation behind this. The illegal images claim is just a nice, convenient pretext (a.k.a. "lie"). Of course, they claim to "preserve user privacy", but that is also just an obvious lie.
Re: (Score:3, Insightful)
They're private when you want them to censor harassing politicians' speech or have Section 230 wrecked, costing them billions.
They're public when you want to apply every manner of regulatory control, and I don't just mean "public accommodations" type laws.
The goal is predefined. You pick and choose your table-slamming philosophy that you ignore in the next issue over, like a good little power thug.
Re: (Score:2)
Re: (Score:2)
The difference between this and posting on forums meant for public consumption should be obvious. Of course, it is their servers and the users do have contractual agreements with Apple that presumably allow this. The big problem is that people don't really understand what they're giving up in these clickthrough agreements, and the company can change the agreement at any time anyway. That's why no one should ever buy into software-as-a-service models, and they should have control over their own devices. If the
Encrypted (Score:1)
Re:Encrypted (Score:5, Interesting)
Re: (Score:3)
Try reading it again. They're scanning on the phone, before it is encrypted.
Re: (Score:2)
They are doing that, but then they can also decrypt the images, presumably to allow them to report you to law enforcement.
Re: (Score:3)
They are doing that, but then they can also decrypt the images, presumably to allow them to report you to law enforcement.
You presume incorrectly. Apple has always held the encryption keys for iCloud Photos because iCloud Photos are sharable via the web (i.e. they have to have the keys). Other web-based photo sharing services hold the keys as well, and most of them are scanning for child porn too. After all, recent legislation makes them liable for it [wikipedia.org], so while Apple has been getting a lot of attention, it's actually been happening silently at all of the major services for quite a while now.
Re: (Score:2)
I spoke out of turn and would like to correct my mistake.
Apple has up to this point had the keys for content that you make publicly available via the web, but not for all of your photos that you upload to iCloud Photos. I was in error to suggest otherwise, because those have historically been encrypted in a manner that Apple cannot decrypt.
That's still largely unchanged today. Apple doesn't have any way to decrypt typical photos (unless you enable web access, as already mentioned). The big change here is th
Re: (Score:2)
Try reading it again.
I suggest following your own advice.
Re: (Score:2)
So chewing up battery then? Got it.
Re: (Score:2)
Didn't everyone already know this? I mean, when the FBI was trying to force them to open up that California terrorist's phone, Apple told them forcing an iCloud backup would give them access - and told them how to do that. (But the FBI was more interested in making political hay, so they did their own thing.)
illegal things in iCloud (Score:2)
Perhaps Apple simply wants to deter users from putting stuff on its servers that could expose Apple to liability if Sec. 230 is killed.
Re:illegal things in iCloud (Score:4, Interesting)
Scan (Score:1)
Maybe the editors here should start scanning for dupes.
Re: (Score:2, Funny)
The editors are too busy double checking the contents of their iCloud
Re: (Score:2)
Maybe the editors here should start scanning for dupes.
It's not a dupe. The previous post mentioned scanning your phone for content. That's a big difference from scanning their cloud servers.
Don't upload your illegal shit to other people's computers. Sounds like a Crime 1-oh-1 lesson.
Re: (Score:2)
We'll have world peace, honest global government by consensus, and reverse global warming before that happens.
What, again? (Score:2)
Didn't they just do that a few hours ago... Dupe
Re:What, again? (Score:4, Informative)
Not a dupe. This one is about Apple's confirmation.
Oh well, who didn't suspect them anyway?
Limited to Children's Accounts (Score:2, Interesting)
Taken at face value, it looks like this feature is limited to things that connect to a child's account. Though I am skeptical of this technology's effectiveness in fulfilling its purpose, it is not nearly as far-reaching as initially reported here.
If that doesn't sit well with Apple customers, they are free to take their business elsewhere. Hooray free market!
Re: (Score:3)
Seems pointless to limit it to children's accounts. It's just a database of known image checksums, and children are not likely to be sharing known child abuse images.
The tech detecting unknown images makes more sense for children, but I'd be surprised if it actually worked. All previous attempts have failed.
Re: (Score:2)
That is my expectation of this as well.
Re: (Score:2)
Re: (Score:2)
They are a special kind of hash that can recognize when an image has been resized, rotated or had its colour adjusted. When you do a reverse image search on Google, for example, that's what it uses.
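For the curious, here is a toy difference hash (dHash), one common perceptual-hashing technique. It illustrates the general idea only; it is not Apple's NeuralHash (which reportedly derives its descriptor from a neural network), and plain dHash survives resizing and colour changes but not rotation:

    from PIL import Image

    def dhash(image: Image.Image, size: int = 8) -> int:
        """64-bit perceptual hash: compare horizontally adjacent pixels
        of a shrunken grayscale version of the image."""
        small = image.convert("L").resize((size + 1, size), Image.LANCZOS)
        px = list(small.getdata())
        bits = 0
        for row in range(size):
            for col in range(size):
                # Row-major pixel layout; each row is size + 1 pixels wide.
                left = px[row * (size + 1) + col]
                right = px[row * (size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

Matching then uses Hamming distance (the count of differing bits) rather than exact equality, so a re-encoded or lightly edited copy still lands within a small distance of the original.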
Re: (Score:2)
If that doesn't sit well with Apple customers, they are free to take their business elsewhere. Hooray free market!
Might explain why Android doesn't have an iCloud. Too much of a headache for too little benefit.
Re: (Score:3)
If that doesn't sit well with Apple customers, they are free to take their business elsewhere. Hooray free market!
Might explain why Android doesn't have an iCloud. Too much of a headache for too little benefit.
Google Drive is out there, samizdat, and they already scan for the illegal shit. They were doing it before Apple too. The summary even notes that.
Re: (Score:2)
It's not a mandatory part of Android unlike iCloud.
Re: (Score:2)
It's not a mandatory part of Android unlike iCloud.
Where did you get the idea that iCloud is mandatory? I've had iPhones for a decade now, and this is news to me, because I've opted out.
Re: (Score:2)
Re: (Score:2)
Back when I had iDevices, they'd turn it on and take data whenever there was an update or refresh. Even if you opted out every time, they'd have taken some or all of it, rendering any privacy ambitions moot. Remember when cloud accounts were compromised and celebrities had personal images there despite not turning on cloud storage?
I know they ask if I want to turn on iCloud. I have turned it on to see what it's all about. I'm not certain how the celebrities were compromised; I suspect they had no idea what they were doing and turned it on without thinking. I can't find anything in iCloud the times I've checked it out, so I turned it off. Anyhow, I don't have any nekkid pictures on my phone.
Re: (Score:2)
We’ve said it before, and we’ll say it again now: it’s impossible to build a client-side scanning system that can only be used for sexually explicit images sent or received by children. As a consequence, even a well-intentioned effort to build such a system will break key promises of the messenger’s encryption itself and open the door to broader abuses.
What Happened To Privacy Apple?!? (Score:5, Insightful)
This is not fucking ok. I'd much rather Apple give up on end-to-end encryption than add fingerprinting of illegal content to end devices. As always, the danger is that the most disgusting, awful thing will be used as an excuse to violate a general principle, and once you've crossed that line there is no natural stopping point.
First, it will inevitably be abused. Someone will figure out how to make a seemingly fine photo match and send it around. Someone will figure out how to excerpt enough of the non-illegal content of an image registered in their abuse database and stick it in another image to get it recognized.
Also, once you say that you can force-load a program for checking for illegal content onto a phone, where does it stop? With recent advances in machine learning you could learn to identify drug sales in text messages with decent probability... or tax fraud or whatever. Maybe not perfectly, but surely with sufficient accuracy to justify a warrant. And what if the Chinese government demands they also report back fingerprints of mentions of Tiananmen or whatever?
Re: (Score:2)
"It's for the children!"
Re:What Happened To Privacy Apple?!? (Score:4, Interesting)
TBF they aren't actually considering using AI per se to identify the photos. Rather, they use a fingerprinting system and a giant database of child porn images (somehow this is maintained or allowed by US law enforcement for this purpose), so they aren't training an AI to guess whether unseen images are child porn but rather checking whether the user has an image that's on the list.
However, remember that the fingerprinting system needs to be robust against both reencoding/recompression and resizing/backgrounds/etc. The fact that it's *not* intelligent means it has no way of knowing whether the component of the image it is recognizing actually displays abuse or not.
So I'm expecting that someone is going to find an image on the child porn list with a large section of purely mundane content (e.g. some furniture or crap in the background) and will put that into some meme or innocuous photo and send it around. I'm kinda hoping someone does this relatively soon with a popular meme or something, just to illustrate how bad an idea this is.
But once you are doing some fingerprinting which can be tricked, it's not clear why this is different in kind from using ML to probabilistically detect new images or other illegal activity.
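A quick demonstration of that trade-off using the open-source imagehash library as a stand-in for whatever fingerprint Apple actually uses (the synthetic image and the exact distances are illustrative):

    from PIL import Image, ImageDraw
    import imagehash

    # Build a synthetic test image rather than using a real photo.
    img = Image.new("RGB", (400, 300), "white")
    ImageDraw.Draw(img).ellipse((50, 50, 350, 250), fill="navy")

    resized = img.resize((200, 150))      # same content, new size/encoding
    cropped = img.crop((0, 0, 200, 150))  # only a corner of the content

    h = imagehash.dhash(img)
    print("resize distance:", h - imagehash.dhash(resized))  # ~0: still matches
    print("crop distance:  ", h - imagehash.dhash(cropped))  # large: no match

How far a production system pushes robustness beyond this (e.g. toward matching sub-regions of an image) determines how exposed it is to the crop-and-paste trick described above.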
Re:What Happened To Privacy Apple?!? (Score:4, Insightful)
Like they don't already do that for their Chinese masters... It's the price of doing business over there.
Re: (Score:2)
I don't think they do. The way China enforces its viewpoint controls online isn't usually by trying to make it completely impossible to evade, or to catch every instance of someone talking about something, so I kinda doubt they are demanding this from Apple already. Indeed, I think this kinda porous approach makes the system more effective and ominous, because it means that most normal users can evade or escape the inevitable overblocking, but it discourages mentioning it and ensures that high profile individua
Re: (Score:2)
The boat sailed long ago on forced loading. Apple controls iOS, no side loading, no replacing the OS with your own, and no uninstalling Apple's apps.
Most iPhone users have a folder full of Apple crapps that they don't want to use but can't uninstall.
Re: (Score:2)
Privacy and control are different issues. Apple hasn't been telling users they will have control over their phones and what is on them. Indeed, they've been touting their walled garden and forced defaults/options as benefits protecting users (no worry about side loading malware) etc..
But, at the same time, they've been doing a major PR (and I believe lobbying) push on privacy talking up new features like private browsing as well as end-to-end encryption of messages and phone encryption. They've even been
Re: (Score:2)
Well, it runs on your own phone, for starters. It's not Apple running it on iCloud.
I'm not convinced it works on arbitrary photos - it likely only works against the existing databases, albeit with a bit of "AI" to get around trivial edits meant to screw up the hash.
After all, it's a bit difficult to determine if a photo with a bit too much flesh tone is legit, so I'd have real doubts Apple is scanning photos. Instead they'd be scanning downloaded images and likewise when those images are shared with others.
You have t
Re: (Score:2)
This is not fucking ok. I'd much rather apple give up on end to end encryption than add fingerprinting to end devices for illegal content. As always, the danger is always that the most disgusting awful thing will be used as an excuse to violate a general principle and once you've crossed that line there is no natural stopping point.
Told you I did, listen you did not (perhaps not you in particular, but Apple fanboys in general that you may or may not be, so select as appropriate).
For years Apple fanboys have crowed that Apple cares about their privacy, Apple protects them from law enforcement, Apple does no wrong whilst Google spies and snitches. Well, here's a big mug of "I fucking told you so", because I told you that Apple does it too; they're just being less honest about it.
Now Google may advertise but it's unobtrusive and definit
Re: (Score:3)
Nope; as some of the articles on this point mention, the fingerprints have to be robust to simple image changes like cropping, adding a border or recompressing.
So you don't need to find an MD5 hash collision to trigger a false positive. All you need to do is take an image on the prohibited list and find a portion of it that's large enough to trigger an alert but doesn't include the child porn content, and then share that image, potentially with some modifications/extra content to make it interesting but aren't
Baby pictures (Score:5, Insightful)
Re: (Score:3)
This actually happened in the UK. An artist put on an exhibition that included a photo of her children nude at the beach. The newspapers found out and went nuts. It was at the height of a paedophile panic.
Actually, come to think of it, there's a similar situation in Japan: Takako Kido sometimes photographs nude children (usually with their mothers), and I follow her on Instagram.
https://www.instagram.com/taka... [instagram.com]
Re: (Score:2)
Is this checking an on-device photo against a database of hashes of known images? Given the "NeuralHash" label, it suggests flagging any hash that is numerically 'similar' to a known hash. The "Neural" part also suggests it will attempt to identify nipples and other body parts, then demand an Apple contractor look at your photo to confirm. Does it escalate because a child isn't wearing clothes, or because it meets some definition of sexual grooming?
What does that mean? Parents frequently take photos of their fam
Re: (Score:2)
Someone here mentioned how his mother liked to show his baby pictures which included him standing naked next to the bathtub. It's interesting because he finds it embarrassing (pardon the pun), his mom thinks it's cute, most everyone else doesn't care at all but a pedophile might find it arousing. So, the perception is truly in the eye of the beholder.
The problem is that the pedophile doesn't actually need porn to find kiddies arousing. We had a case here in which a guy was tried and convicted, and all he had on his computer was non-porn images of kids. He obviously had a problem; I think his original issue was trying to make a date with an underage girl in a chat room, who was of course a cop. But the images were used as part of the evidence to convict him.
Re: (Score:2)
So, the perception is truly in the eye of the beholder.
Acrotomophilia - sexual attraction to amputees.
Coulrophilia - sexual attraction to clowns.
Chremastistophilia - sexual arousal from being robbed.
Dendrophilia - sexual attraction to trees.
Hoplophilia - sexual attraction to guns (see also Texan).
Mucophilia - sexual attraction to mucus.
Toxophilia - sexual attraction to the sport of archery.
If you can think it up, someone has wanked over it.
Re: (Score:2)
You're an idiot and NotEmmanuelGoldstein is apparently correct.
The word "known" is used at least twice, meaning if I'm 10 and post a photo of my junk it's new, not "known".
The national center for missing and exploited children maintains a set of hash values for known abuse images, and I would assume this is some sort of TinEye-like similarity algorithm which can identify cropped, color manipulated, or otherwise altered photos which have already been determined to be illegal.
That can't happen, to answer your
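A sketch of the lookup being described: compare a photo's perceptual hash against a set of known-bad hashes with a small Hamming-distance tolerance, so cropped or colour-manipulated copies still match. The hashes and tolerance here are invented for illustration:

    # Hypothetical database and tolerance; the real list is NCMEC's.
    KNOWN_HASH_DB = {0x8F3A9C2B11D470E5, 0x17B2C4D8A0E3F196}
    DISTANCE_TOLERANCE = 6  # bits of difference allowed (assumed value)

    def matches_known_image(photo_hash: int) -> bool:
        return any(
            bin(photo_hash ^ known).count("1") <= DISTANCE_TOLERANCE
            for known in KNOWN_HASH_DB
        )

A brand-new photo, however embarrassing, is nowhere in the database, so it cannot match -- which is the "known" distinction being made here.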
Re: (Score:2)
Except your computer gets hacked and the pic you took gets passed around and ends up in the exploited children database. Then you get busted.
Re: (Score:2)
This all depends - the Apple system will check your photos (on your phone) against a database of known image signatures. Thus, the mere fact that you've got a bare-backside picture on your phone means nothing at all.
However, if, as you describe, the picture is (say) leaked to a paedo user group, they might share it around. One of them gets caught, and all their images, including this one, end up in the signatures database.
One would hope that the human reviewers would be able to tell the difference between an embar
Re: (Score:2)
Someone here mentioned how his mother liked to show his baby pictures which included him standing naked next to the bathtub. It's interesting because he finds it embarrassing (pardon the pun), his mom thinks it's cute, most everyone else doesn't care at all but a pedophile might find it arousing. So, the perception is truly in the eye of the beholder. Now Apple/others will stick their opinion into the mix. I hope a lot of mothers don't get inconvenienced/hurt by their decision-making AI. That can't happen, right?
I don't think they'll be targeting infants (although they may; Apple have a history of getting things horribly wrong), but rather images of prepubescent or adolescent children.
So the School photographer, the parent/aunt/grandparent who may have pictures of an 11 yr old girl in a swimming cossie on sports day... because as you've intimated, false positives never happen and if you've been branded a paedo in the US (where Apple is located) that pretty much strips you of any argument beyond the courtroom (and eve
Re: (Score:2)
This exact issue came up immediately when I was talking about these features with my wife. We try to be careful with the photos we take, but there have been some candid photos of naked babies over the years. We haven't shared any of those with others, but even if we had Apple would not have flagged any of those photos for reporting, so far as I can tell.
Just to walk through the details since there are several related features that overlap in terms of how they operate, let's say I take a photo of a naked tod
Re: (Score:2)
But being stupid is not illegal by itself.
What actually is this? (Score:2)
Any halfway intelligent person who knowingly has illegal content will know better than to store it on someone else's server. ...So what is this actually doing?
Re: (Score:2)
Re:What actually is this? (Score:4)
Seriously, the entire goal here is to open the door just a tiny little crack into privacy invasion in the name of doing some vague good, then slowly creep the door further open with each iteration until it's wide open and we're all being watched 24/7 by the corporate overlords, with direct reporting to authorities when we do something awful like let a kid run naked through the house after a bath or contemplate taking a substance currently considered illegal.
This may not be the definition of the slippery slope, but it's a damn fine model of it.
Re: (Score:2)
After reading a few articles, I think they are trying to identify instances of grooming. An older groomer with a collection of known bad images will send them via Messenger to a child they are grooming and ask for similar pictures in return. The child, who is more likely to have automatic iCloud backups enabled, will receive known bad images (which then get backed up into iCloud where they are scanned) and produces their own images to return (which are again backed up into iCloud and scanned).
Apple to sc [aljazeera.com]
Scorpions (Score:2)
Permissions setting? (Score:2)
Surely there will be some permission setting somewhere to toggle this - or perhaps a notification to ask:
Behavioral gerrymandering (Score:2)
Re: Behavioral gerrymandering (Score:2)
Always "for the children" (Score:2)
Re: (Score:2)
this is NOT A DUPE (Score:2)
For those who have not followed through, there are serious distinctions:
1. The earlier article was about the iPhone being scanned for photos.
2. This article is about iCloud being scanned for photos.
3. The earlier article was assumed to be in error by many comments.
4. This article is confirmed by Apple.
It is true that the word 'Apple' appears in both articles.
No other connection.
Re: (Score:2)
This shouldn't really come as surprise (Score:2)
If you're using iCloud, you should already be assuming Apple is looking through absolutely everything you put there. This goes for every other cloud service as well, obviously. Apple might just be slightly more despicably evil than other companies, but the difference isn't big.
sign the Change.org petition (Score:3)
I've created a Change.org petition.
If you care about this issue, you can sign it here: http://chng.it/4wFSfVgPgL [chng.it]
-dave
Time to dump Apple stock (Score:2)
Now the CEO of Apple is some fossilized brain specimen that thinks porn filters work, when any competent programmer could tell him they don't, and can't wrap his head around the idea that ma
You know what they say... (Score:2)
Remember when... (Score:2)
Re: (Score:2)
That was before hysterical neo-Victorian moral panic gripped the US. I hope to live long enough to see it destroyed by events, ANY events.
Re: (Score:2)
Why not warn the user (Score:2)
"NeuralHash uses a cryptographic technique called private set intersection to detect a hash match without revealing what the image is or alerting the user." Why NOT warn the user. So if someone innocently downloads what turns out to be illegal material they are referred to law enforcement rather than warned on the spot?
"Apple said that there is a one in one trillion chance of a false positive, but there is an appeals process in place in the event an account is mistakenly
It's the mind (Score:2)
Today, on It's The Mind, we examine the phenomenon of Deja Vu
https://www.dailymotion.com/vi... [dailymotion.com]
guilty until proven innocent (Score:2)
Good luck with that. Once Apple's brilliant algorithm misidentifies your picture of a water lily as child abuse, you will be banned for life, and you can call an 800 number and speak to a robot if you don't like it. But, hey, look at the bright side -- you'll be able to speak to the robot in Spanish if you want to.
First it is "For the Children" (Score:2)
As usual, first it is "For the Children" but it won't be long before it is "For the Politicians".
I have a friend who works for Apple who deals with this sort of stuff and he outright acknowledges that they have kiddie porn on their servers and so technically they are in possession and Apple as a company is in violation of the law. This, of course, probably holds for just about any company that has a repository of user supplied content.
How are they training the AI? (Score:2)
Re: (Score:3)
I'd be happy if they'd confess to day drinking while posting.
Re: (Score:3)
I suspect our editors are all AI. It's cheaper that way.
Re: (Score:2)
Not a dupe... first one is iPhones, this one's about iCloud online storage.
Re: (Score:2)
It's not a dupe. The previous story was a rumor about it with Apple saying basically "No Comment", and this story is Apple saying "Yeah, we're gonna do it. So does everyone else".
Re: (Score:2)
Re: (Score:2)
I assumed this was a confirmation of that story.
Re: (Score:2)
Nope. Read The Fucking Titles of both. It's not hard to realise it's not a dupe. The hint is in the second word of the title.
Re: (Score:2)
Re: (Score:2)
They aren't even scanning iCloud. It says the scans happen right on the device.
Re: (Score:2)
Re: (Score:2)
They're not the government. Warrants are irrelevant. Also, you agree to it in their ToS.
As for "agents of the government", that boat sailed decades ago. Police regularly use "confidential informants" to gather information they can not legally access.
And sometimes, the informant actually gathers information instead of repeating what the police tell them to say.
Re: (Score:2)
The other article you are claiming this is a dupe of was speculation, and had many comments along the lines of "there is no way Apple would really do this". This article is Apple confirming that the previous story was true, and that they do indeed intend to take this anti-consumer, anti-privacy action.
Re: (Score:2)
That's 'cause it isn't a dupe. The previous story was about the rumor, and this story is Apple saying "Yeah, we're gonna do it. Just like everyone else".
Re: (Score:2)
But of course. Future AI time traveled back, recognized ancestor abuse and put an end to it before the current dominant species got any bright ideas.
Re: (Score:3)
can you imagine the PTSD and years and years of therapy that will be needed for whosoever is unlucky enough to land that job?
articles like this make me miss the olden days... you buy something, it's yours. No continued long-tailed relationship between you and the manufacturer, or the retailer. But somewhere along the line some smarmy MBA decided that wasn't intrusive enough and that "money was being left on the table!", and as a result now you get tracked and surveilled six ways to Sunday. Cars, TVs, phon
Re: (Score:2)
can you imagine the PTSD and years and years of therapy that will be needed for whosoever is unlucky enough to land that job?
That'd be an actually useful, socially beneficial profession for psychopaths. Their minds easily recognize what's considered upsetting or outright damaging to empathetic neurotypicals, but they themselves aren't affected.
Evidently, highly functioning psychopaths would demand huge salaries to do it for us instead of the more usual pursuits -- such as going for high-level executive positions, becoming ruthless politicians, running crime rings, ruining countries and entire economies, etc. Therefore, evidently, the psy