Apple Says It Will Reject Government Demands To Use New Child Abuse Image Detection System for Surveillance (cnbc.com) 96
Apple defended its new system to scan iCloud for illegal child sexual abuse materials (CSAM) on Monday during an ongoing controversy over whether the system reduces Apple user privacy and could be used by governments to surveil citizens. From a report: Last week, Apple announced it has started testing a system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. It says it can do this without learning about the contents of a user's photos stored on its servers. Apple reiterated on Monday that its system is more private than those used by companies like Google and Microsoft because its system uses both its servers and software running on iPhones.
Privacy advocates and technology commentators are worried that Apple's new system, which includes software that will be installed on people's iPhones through an iOS update, could be expanded in some countries through new laws to check for other types of images, like photos with political content, instead of just child pornography. Apple said in a document posted to its website on Sunday that governments cannot force it to add non-CSAM images to a hash list, or the file of numbers that correspond to known child abuse images Apple will distribute to iPhones to enable the system.
Well if Apple says so.... (Score:5, Funny)
I mean, we would never see technology where a company isn't forthright with governmental interference and use of their capabilities in order to do sketchy and immoral and illegal things. Never.
Re: (Score:2)
I mean, we would never see technology where a company isn't forthright with governmental interference and use of their capabilities in order to do sketchy and immoral and illegal things. Never.
So... what, do we all run to Google and pretend like their business model isn't considerably more worrisome in this regard? Absolutely, by all means, criticize Apple! But please can we handle this better than the slave labor thing and make sure the rejection of this is across the board? I mean, really, Facebook's been fingerprinting images for at least five years now.
Re: Well if Apple says so.... (Score:4, Informative)
They will want to know every reporter, politician, soldier, police officer… hell, every citizen who has this pic [images.app.goo.gl/KFEU...] in their photo library. The results of which will be devastating. That will not be the only photo either.
Re: Well if Apple says so.... (Score:3)
Re: (Score:2)
Re: (Score:2)
Apple on the other hand have no reason to read your data (other than to provide you with service).
Then why must I provide a phone number in order to activate some local features that don't require any of their services?
As long as the laws stay as they are, nothing protects your privacy on any phone. Google and Microsoft are more blatant than Apple, but Apple still uses information they collect about you for their own benefit. It's in their terms of service.
Re: (Score:2)
Google doesn't operate in China because they refused to cooperate with the Chinese government.
Apple does operate in China, and fully cooperates. Chinese users' data is stored in China, and the CCP is given full access to it.
It seems very likely that, now that Apple has built this powerful technology for finding photos on users' devices, they will deploy it to do the work of the CCP.
Re: (Score:2)
Re: (Score:2)
> So... what, do we all run to Google and pretend
What's this "we" business, that was your idea.
Translation (Score:2)
Here's a truck full of killer bees we've been breeding for years. We promise we won't release them.
And when the FISA court orders otherwise (Score:5, Insightful)
Apple will comply. In my mind that's the chief issue here.
Re:And when the FISA court orders otherwise (Score:5, Insightful)
Not to mention repressive regimes. Not to mention that bad people will probably find a way to exploit it, hack-injecting their own definitions to cause problems for political dissidents, etc.
The only way to prevent back doors from being abused is to not build them into your operating system in the first place. I'm ashamed of Apple.
Re: (Score:3)
Re: (Score:3)
But I've never heard an Apple user say: "I wish they would scan our photos for kiddie porn".
I can see a lot of parents really liking the feature where they will get notified if their kids start sending naked pictures of themselves to other people. So I'm sure there probably are a decent number of Apple users that do say those sorts of things.
The problem is that it's part of the operating system, rather than as a third-party add-on that parents have to opt into, which means there's a nonzero risk of false positives causing real people serious harm, and there's no way to fully opt out of it. Even
Re: (Score:2)
I can see a lot of parents really liking the feature where they will get notified if their kids start sending naked pictures of themselves to other people.
I would guess that they would like the feature right up until it happens, when they find out what actually happens to their kids at that point. Lots of parents don't want their kids to do drugs either, but they sure don't want their kids going to federal prison.
Of course, this is creating hashes to compare against known hashes of CP images, so it won't actually trigger when their kids send naked pictures of themselves. Not right away anyway. If they store these hashes and compare them against future additi
Re: (Score:3)
I can see a lot of parents really liking the feature where they will get notified if their kids start sending naked pictures of themselves to other people.
I would guess that they would like the feature right up until it happens, when they find out what actually happens to their kids at that point. Lots of parents don't want their kids to do drugs either, but they sure don't want their kids going to federal prison.
There are actually two separate features built on the same tech. The first is parental notification, which notifies parents if their kids send something that looks like it might be a nude photo. The second is known child porn notification, which alerts an authority if someone possesses a photo that is known to be child porn (from some database, presumably). The kids won't get in trouble (except by their parents).
Of course, this is creating hashes to compare against known hashes of CP images, so it won't actually trigger when their kids send naked pictures of themselves. Not right away, anyway; only if they store these hashes and compare them against future additions to the db of known CP images. So, when the kids break up with their SO and the SO sends out those pictures as revenge, and they end up out on the Internet, get collected by the authorities and hashed, I imagine this system will trigger then. Then probably everyone involved: the parents, their child, their child's SO, the people the SO sent the images to, etc. will go through a lengthy legal process, and some of them (it sometimes seems almost arbitrary which ones) will go to prison and/or end up in a registry. It does not seem like most parents would really want that.
I would argue that if the former SO posts naked photos of an underage person on the Internet,
Re: And when the FISA court orders otherwise (Score:1)
Re: And when the FISA court orders otherwise (Score:2)
Re: (Score:2)
This is two different features, both involving on-device image (and, presumably, video) scanning, both announced the same day. In theory, the feature designed for parents is mostly harmless, because it can notify only the true owner of the device, and because it uses machine learning to identify problem images visually, rather than looking for specific data. The feature designed for detecting known child porn, however, is not, because of the risk of leaking corporate data on false positives, coupled with
Re: (Score:1)
"And so on. The more I think about this mis-feature, the more horrors I see lying at the bottom of Pandora's box."
One of the problems is that the "mis-feature" will be used as a weapon by the same class of people (kids for the most part) who send a swat team to your house with false murder claims. I don't know if there is any way to avoid someone sending you a text message with an illegal image attached.
It's not a new idea, the post office used to send people they didn't like kiddy porn and use that as an
Re: (Score:2)
And I cannot for the life of me understand what would possess Apple to make such a move.
True, there is little business logic that would dictate making this move. So it makes one wonder what the motivating factor could actually be.
We know the Trump DOJ wanted Apple to build them a back door. A terrorist incident was even leveraged in an attempt to justify their request. And the Biden DOJ is probably even worse. Apple is in constant danger of having to shelve their plans of having all user content always encrypted and unavailable to everyone. Just like how they had to cave with regard to
Re: And when the FISA court orders otherwise (Score:3)
"We know the Trump DOJ wanted Apple to build them a back door. A terrorist incident was even leveraged in an attempt to justify their request. "
Not that it matters, but this was under Obama.
Trump wanted the same thing, of course.
Re:And when the FISA court orders otherwise (Score:4, Insightful)
And I cannot for the life of me understand what would possess Apple to make such a move.
My half-assed guess? They are building this in for China to use on images THEY think are 'bad', but using the smokescreen of kiddie-porn to do it.
Re: (Score:3)
And I cannot for the life of me understand what would possess Apple to make such a move.
My half-assed guess? They are building this in for China to use on images THEY think are 'bad', but using the smokescreen of kiddie-porn to do it.
Well, that's certainly a half-assed guess, so kudos for being honest.
Re: (Score:2)
Apple continues to operate in China, and in order to do that they continue to do as they are told. It's not that half-assed, it's at least three-quarters assed.
Re: (Score:2)
My best guess is that it's because they're being stored on iCloud and it's a liability issue. In previous reports they mentioned that it won't scan your pictures if they're not stored on an iCloud account. I also think the whole thing is problematic, but that's the whole problem with the whole "think of the children" political plea—it's so effective at eroding liberty that very few are able to successfully stand up to it. People will give Apple a free pass for encryption that can be used by drug deale
Re: (Score:2)
Apple will have no choice, and won't even be able to tell the users what they have been forced to share with the US government.
Re: (Score:2)
That is the whole point of the 'four horsemen of the infocalypse', or 'a terrorist with a nuke'. You take an extreme case of very bad people doing very bad things to get agreement on the principle. This justifies the buildup of the whole infrastructure and technology for surveillance and censorship. Then changing the settings becomes only a tiny operation - which in some countries is not even an issue.
I don't want to be too radical about measures which can be abused but this one is pretty extreme.
Re: (Score:2, Insightful)
Exactly (Score:5, Insightful)
Apple will reject demands - until they are forced to comply.
The problem is, with such a capability in place they WILL be forced to, by courts or by threat.
Re:Exactly (Score:5, Insightful)
No court or threat needed. They'll gladly do whatever the Chinese government demands so they can sell more phones.
Re: (Score:2)
It's amazing how many don't understand the truth in the simplicity of your comment. Snowden showed that we no longer live in the land of the free.
Re: (Score:2)
As the joke goes, now they're just haggling about the price.
China (Score:1)
Until it gets a National Security Letter (Score:5, Insightful)
..at which point it won't be able to tell us about the government using it.
Re: (Score:3)
Re: (Score:2)
Or just don't build the capability in the first place, then no one can ask you to subvert it.
Re: (Score:2)
Or just don't build the capability in the first place, then no one can ask you to subvert it.
That's like asking MADD mothers not to be mad at drunk drivers. Certain topics are a natural societal trigger. Child abuse is certainly one of them.
Sadly, it's also an area ripe for abuse and false accusations when you bring forth any type of automated system. I can envision people being put on some kind of "list" even if the automated systems trigger on naked bathtub pics from a proud new Mom that turn out to be false positives after a manual review. One too many false positives in a 12-month period
Re:Until it gets a National Security Letter (Score:4, Insightful)
That's what a Canary Page [wikipedia.org] is for. Maybe Apple should create one.
Maybe?
Maybe citizens will wise up enough to demand it.
I doubt it.
Re: (Score:2)
It is much more difficult to compel an action such as maintaining a website than it is to compel the turning over of records. You really can't compel "don't have a server failure". You definitely can't compel the one guy who knows how to reset the canary every day not to quit or go on extended leave.
Re: (Score:2)
What if Judges learn how to read and find out that you updated it, or didn't, in order to warn people about something that you were required to keep confidential?
Canary pages are fine as long as you believe that Judges will never discover reading.
Re:Until it gets a National Security Letter (Score:4, Interesting)
What if Judges learn how to read and find out that you updated it, or didn't, in order to warn people about something that you were required to keep confidential?
Canary pages are fine as long as you believe that Judges will never discover reading.
I think it's an interesting concept. Can a judge force you specifically to lie to the public? Not withhold comment or refrain from releasing information, but actually force you to actively speak a falsehood? Like, if your canary mechanism was not automated and required you to type plain text into a form and post it to a website, could a court order compel you to continue doing that? It's a far more significant abrogation of free speech than a simple gag order. I'd be curious to hear if that has ever been tested. I've never heard of such a thing.
Re: (Score:2)
They can read it all they want. An affirmative action like changing the page could get you in trouble, but it's much harder to compel continued action, like forcing you to keep updating the canary.
The best of the least. (Score:2)
Not an easy situation when one involves another party in one's actions. The simplest solution is if every iPhone user had their own personal iCloud. Their cloud, their responsibility. Not Apple. Not the government. The citizen. It doesn't solve the child pornography problem, but then the other approach would have been ineffective at best and unworkable at worst. So you work with what you have, preserving what matters most.
Re: (Score:2)
If one buys a NAS, that is pretty close to a personal iCloud, although leaving it freely accessible from the Internet is a recipe for disaster and a storage array full of crypto-locked files. However, if one is physically on the NAS's Wi-Fi network, it isn't hard to have an app back up pictures and such to the NAS. From there, the NAS can run a tool like BorgBackup to do offsite, encrypted backups, so one has true 3-2-1 protection of what's on their phone.
I think you can do that today... (Score:1)
The simplest solution is if every iPhone user had their own personal iCloud.
Not 100% sure what apps (if any) do their own storage, but I thought Adobe at least had a camera app that would upload to Adobe's photo cloud... not that I'm sure Adobe is any better in terms of scanning or releasing photos from your cloud library to third parties.
But that points the way to camera apps having a leg up if they introduce their own cloud storage solution, especially if they encrypt any data sent to the server.
Quite a s
how about a pool (Score:2)
When will the back door be released (cracked) to the public? How about a pool?
Yes, trying to find these images and the people who abuse children is great, but a backdoor is not the way to go. How about hashing the image as the phone takes the pic and sending that hash "home" for validation?
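A rough sketch of what that suggestion might look like, purely for illustration: the endpoint URL and JSON field names below are hypothetical, and a plain cryptographic hash such as SHA-256 changes completely if even one pixel changes, which is why deployed systems use perceptual hashes (PhotoDNA, NeuralHash) rather than anything this simple.

```python
# Hypothetical sketch of "hash the photo at capture time and send the hash
# home for validation". The endpoint and field names are made up; this is not
# an API that exists anywhere, and SHA-256 here is cryptographic, not perceptual.

import hashlib
import json
import urllib.request

def report_capture_hash(photo_path: str,
                        endpoint: str = "https://example.com/validate") -> int:
    # Hash the freshly captured photo bytes on-device.
    with open(photo_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    # Send only the hash, never the image, to the (hypothetical) validator.
    body = json.dumps({"sha256": digest}).encode()
    req = urllib.request.Request(endpoint, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status
```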
1984 Super Bowl commercial in reverse (Score:5, Insightful)
They already comply with the Chinese government (Score:5, Insightful)
It's your f-ing phone. You should have the final say on what runs on it, period. Not the government, not the manufacturer, not the service provider, not the OS author. When you buy a product (as opposed to lease or rent it), I feel that's a line that just shouldn't be crossed. If the manufacturer wants to insist on post-sale control over the device, then they should be required to structure it as a lease, not a sale, with the manufacturer being responsible for repair costs (warranty extended through the term of the lease) and disposal costs. Not the end user.
Re:They already comply with the Chinese government (Score:5, Interesting)
But it's not your f-ing phone. It's Apple's, and you are just leasing it until Apple decides that you need to buy a new one.
And there's no way to verify it (Score:5, Informative)
If you've read the way the system works [apple.com], there's absolutely no way to determine if any given photo matches the hashes they're checking against, let alone what hashes they're checking against.
The way it works is that your device receives a hash table mapping NeuralHashes to blinded NeuralHashes - that is, a hash table where the keys are based on hashes of the original NeuralHashes (but the original hashes are NOT stored) and the values are the blinded NeuralHashes, encrypted with a key only Apple knows.
At this point, the phone creates a "safety voucher." The safety voucher has two parts: an unencrypted header that indicates the row in the hash table it used (I assume - Apple is extremely unclear on this), and a payload encrypted with a key based on a combination of the original NeuralHash and the blinded key. If the image does, in fact, match the hash that the blinded secret referred to, it will decrypt. If it doesn't, it'll produce garbage.
What this means is that your phone can never know if a given image matched a hash or not, and you can never know if the hashes Apple is checking against are, in fact, the hashes Apple claims to be checking against.
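To make the blinded-hash idea in this comment concrete, here is a minimal toy sketch. It is not Apple's actual protocol (which uses elliptic-curve private set intersection and a more elaborate table layout); it substitutes ordinary Diffie-Hellman-style math in a multiplicative group, and the group parameters, "NeuralHash" values, and payloads are invented for illustration only. The point it demonstrates is the one made above: the server can open a safety voucher exactly when the hashes match, while the phone never learns whether a match occurred.

```python
# Toy sketch of blinded-hash matching with "safety vouchers". Demo-only
# parameters, NOT Apple's real PSI construction and NOT secure.

import hashlib
import secrets

P = 2**127 - 1   # a Mersenne prime; demo-only group modulus
G = 3            # demo-only generator

def hash_to_group(neuralhash: bytes) -> int:
    """Map a (toy) NeuralHash onto a group element."""
    e = int.from_bytes(hashlib.sha256(neuralhash).digest(), "big")
    return pow(G, e, P)

def kdf(shared: int) -> bytes:
    """Derive a 32-byte key from a shared group element."""
    return hashlib.sha256(shared.to_bytes(16, "big")).digest()

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

# --- Server side (Apple): blind the known-image hashes with a secret s ------
s = secrets.randbelow(P - 2) + 1
known = [b"known-image-neuralhash-1", b"known-image-neuralhash-2"]
blinded_table = {h: pow(hash_to_group(h), s, P) for h in known}
# (The real table is indexed by a second hash of the NeuralHash so the device
#  never sees the database entries themselves; omitted here for brevity.)

# --- Device side: build a "safety voucher" for one photo --------------------
def make_voucher(image_hash: bytes, blinded_entry: int, payload: bytes):
    r = secrets.randbelow(P - 2) + 1
    header = pow(hash_to_group(image_hash), r, P)   # sent in the clear
    key = kdf(pow(blinded_entry, r, P))             # = KDF(g^(e*s*r)) on a match
    return header, xor(payload.ljust(32, b"\0"), key)

# --- Server side: try to open a voucher with its secret ---------------------
def open_voucher(header: int, ciphertext: bytes) -> bytes:
    key = kdf(pow(header, s, P))                    # = KDF(g^(e*r*s)) iff match
    return xor(ciphertext, key)

# Matching image: both sides derive the same key, so the payload decrypts.
hdr, ct = make_voucher(b"known-image-neuralhash-1",
                       blinded_table[b"known-image-neuralhash-1"],
                       b"encrypted-metadata")
print(open_voucher(hdr, ct))    # b'encrypted-metadata\x00...'

# Non-matching image: the phone still falls into *some* table row, but the
# blinded value there belongs to a different hash, so the server gets garbage.
hdr, ct = make_voucher(b"innocent-photo-neuralhash",
                       blinded_table[b"known-image-neuralhash-2"],
                       b"encrypted-metadata")
print(open_voucher(hdr, ct))    # random-looking bytes
```

The design point this captures is that the match decision lives entirely on the server: the device only ever emits a header and an opaque ciphertext, which is exactly why neither you nor your phone can tell whether any given photo matched, or what the table actually contains.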
Re: (Score:2)
Re:And there's no way to verify it (Score:4, Interesting)
and you can never know if the hashes Apple are checking against are, in fact, the hashes Apple claims they're checking against.
Correct, and here is just one way this can be abused. Say the FBI is looking for someone (for whatever reason they want - maybe you haven't been vaccinated enough times yet), and they find some online image - any image - that they know came from that person. For example, a Facebook profile picture. They then add the hash of that image to the database. Since the majority of people move their photos with them from phone to phone, as soon as a phone with that exact image in the photo gallery has a hash match, there you go. You've got your person. Friends of that person wouldn't be downloading that picture into their phone's photo gallery - only viewing it in the various FB apps. So there's an extremely high probability that a phone with that image in the actual photo gallery is the person who took the photo.
Re: (Score:2)
Funny how your vision of possible tyranny is a public health measure. I would have used the example of being labelled a "terrorist" or being someone who has a pro-Palestinian protest poster in my Downloads folder.
Does anyone really believe that? (Score:2)
Do they believe someone believes it? Or are they just saying it for plausible deniability?
Re:Does anyone really believes that? (Score:5, Insightful)
They are saying this for marketing reasons. Actual facts do not come into play unless they can be successfully sued for lying.
What's next (Score:2)
As if no one could hack the database (Score:2)
You wouldn't even need to force Apple to do anything. The database is outsourced. Hack, pay, or threaten the database manager and you can insert any images you want. The next time Apple updates from the DB, your target is found.
Re: (Score:2)
Indeed. And it may not even need to be an image. Hashes can be applied to any type of file. Apple for sure cannot verify what the hashes will match until they have a match.
Re: (Score:2)
Nope. ISPs cannot actually do that. Some understanding of Internet technology required. Cloud storage providers do these scans. But that is different because it is data-at-rest.
Re: (Score:2)
* we will not comply with any government demand for extended requests unless the law mandates it.
The fact that they should even have to say this... (Score:2)
Let me fix that for them: "We will _try_ to ..." (Score:5, Insightful)
Because Apple cannot really "reject" anything here. As soon as they get an NSL or a FISA court order, they are backed against the wall. And they are not even allowed to tell anybody.
Seriously, creating this capability means it _will_ be abused and Apple can do fuck-all about it.
Attention all Frogs (Score:5, Insightful)
Attention all Frogs:
This notice is to inform you that, effective September 1, 2021, we will be increasing your water temperature by 1 degree. This minor change, which should be completely un-noticeable by you, is necessary for your safety. Think of the tadpoles!
Thank you for your understanding,
Management
"Flagged for review" (Score:2)
Too late, the cat is already out of the bag (Score:3)
However, it is much worse: the requests will start now. Having told the world, governments included, that it is possible to scan users' photos with some future feature means the data they want is available today. With all those public announcements, the cat is out of the bag with respect to what is possible, and there is no putting it back now.
surreptitious inserts (Score:2)
Checksums/values get into a database one way or another. Someone just needs to calculate the checksums for a few iOS system files and get them loaded into the system, and suddenly everyone is a false positive at the same time.
Time for the Pine Phone? (Score:2)
Re: (Score:2)
Child porn is not a national security issue.
Re: (Score:2)
Except in China (Score:2)
Apple will reject "Government Requests" (Score:1)
I was considering switching from android (Score:2)
I thought apple was concerned about privacy. They might not always have lived up to it, but at least they were saying the right things. I was considering making my next phone an iphone.
This technology will be misused. There's no question about it.
Now they aren't even saying the right things. I no longer have any desire to switch from android.
They'll deny "requests" (Score:2)
But when a country passes a law requiring it? Apple will fold like a cheap suit.
poor journalism (Score:4, Insightful)
As always, the journalist asked the wrong question.
The correct question is:
"Apple, will you stop doing business in a country when they require you to add images to your image-checking program which you have not confirmed is actually child porn?"
Re: (Score:2)
What is needed is coordinated action by governments to block trade with such authoritarian markets on human rights and/or strategic/security grounds. And maybe not just the US government. One government blocking that market has si
Totally Believable (Score:1)
Your privacy is important to us (Score:2)
Painting itself into a corner (Score:2)
I wonder what Steve Jobs would've done?
apple (Score:1)
Why is it legal for them? (Score:2)
So it is legal for Apple employees to collect and view child porn, but it is evil if someone else does it?
Why is that?
Does someone check to make sure they do not get aroused, or do we just take their word for it? Apple is not law enforcement. It should be just as illegal for them to possess.
Darn it .... (Score:2)
The reason I decided to never use Apple products started when I discovered you couldn't build your own Apple-based PC and that their license forbids usage of their OS on non-Apple products. That's just 100% BS. To paraphrase Henry Ford, we can have any color we want as long as they deem it good for u
Seems like a good compromise to me (Score:1)
They opened Pandora's box (Score:1)
called it (Score:2)
Refuse? (Score:1)