In Internal Memo, Apple Addresses Concerns Around New Photo Scanning Features (9to5mac.com) 101
Sebastien Marineau-Mes, a software VP at Apple, talks about the company's upcoming controversial photo scanning features in an internal memo to employees: Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.
Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple's deep commitment to user privacy.
We've seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built. And while a lot of hard work lays ahead to deliver the features in the next few months. [...]
Permissions setting? (Score:5, Insightful)
Surely there will be some permission setting somewhere to toggle this - or perhaps a notification to ask:
This feels like the start of a very slippery slope.
"reserve" not "reverse" (Score:3)
I guess I'm in good company(?) [slashdot.org]
Re: "reserve" not "reverse" (Score:5, Insightful)
Re: (Score:1)
But but... Noooes! This is different: it's for the children!
Next project on the agenda: scan your device for terrorism material. Because terrorism is horrible and anything goes against terrorism. Just like anything goes for the children.
Re: "reserve" not "reverse" (Score:4, Insightful)
Subverting freedoms of others. Coercion or authoritarianism. Hitler, the worst person in human history, didn't rape children. He didn't personally kill. He stole freedom from others. The most free country at the time marshaled an unstoppable force to destroy his government. His government spied, threatened, robbed, and genocided. They robbed freedom from millions.
I am sorry, but someone has to say it. Raped children, dead innocents, sometimes the bad guy gets away… it is all worth it, because freedom is worth everything. Many people have given what Lincoln called the last full measure of devotion so that you might have freedom. Freedom is everything. Freedom to be secure in your person and effects is paramount. What Apple is doing is wrong. Even if it saves a child from rape or murder. It is not worth giving up freedom.
Re: "reserve" not "reverse" (Score:2)
Re: (Score:3)
You know what's worse than murder? Do you know what is worse than raping children? Do you know what is worse than terrorism?
Murdering child-raping terrorists? That sounds pretty bad.
But not to worry: at least we can take comfort in knowing that murdering child-raping terrorist iPhone users will be caught and dealt with swiftly by the Apple political correctness police.
Re: (Score:3)
You know what's worse than murder? Do you know what is worse than raping children? Do you know what is worse than terrorism?
Subverting freedoms of others. Coercion or authoritarianism.
All of those things are coercion or authoritarianism.
Re: "reserve" not "reverse" (Score:2)
Re: (Score:1)
Re: "reserve" not "reverse" (Score:3)
Re: (Score:2)
They'll use a dark pattern instead. (Score:1, Interesting)
Nah - they'll probably use a dark pattern for this, because "children", or "courage"; so something more like:
Allow - OK
Re: (Score:1)
Re: (Score:3)
this gives every thinking person (I know, small subset) the firm knowledge that apple does not care one whit about actual privacy; they want headlines and brownie points (uhm, should I have chosen a different term? hmmmm.)
invasive spying is invasive spying, period.
fuck apple. there has never been a valid reason to believe anything they say that you can't directly verify yourself. we don't have access to chips or code, so really we can't trust a damned thing they say.
like politicians, when someone says som
Totally agree (Score:5, Insightful)
this gives every thinking person (I know, small subset) the firm knowledge that apple does not care one whit about actual privacy
I totally agree.
In the past I've defended Apple's record on privacy because they have done a lot of things that truly help user privacy.
On this news of Apple scanning photos for illicit content, I was waiting for more evidence Apple itself was actually considering it - and here it is.
Without doubt Apple can no longer be considered to be truly supportive of user privacy.
What a shame, because where else is left to turn?
At the very least I will no longer use Apple's camera app if at all possible, or store photos where iOS can find them.
Not because I have anything to hide, but because they have no right to look.
Re: Totally agree (Score:2)
At the very least I will no longer use Apple's camera app if at all possible, or store photos where iOS can find them.
All you have to do to avoid scanning is to turn off iCloud Photos.
And it doesn't scan for photos that might look like child porn. It compares photos to a database of known child porn images.
These articles explain the system in more detail:
https://www.macrumors.com/2021... [macrumors.com]
https://www.macrumors.com/2021... [macrumors.com]
Do I like it? No. But it does look like Apple has still tried to strike a privacy balance in several ways…
Re: (Score:2)
Re: Totally agree (Score:2)
While I agree that seems to be the way the thing works -- scanning iCloud content against known images, who the hell takes existing CP from the net and routes it to iCloud? It doesn't make a lot of sense.
One of the "Sharing" options if you Click on an image is to "Save to Photos".
At that point, if iCloud Photos is Enabled, it is automagically copied up to the User's iCloud Storage. IIRC, At that point (prior to Encryption) is where this (on-device) "comparison" takes place.
Re: (Score:2)
Indeed. My take is this is probably a deal with some law&order, aehm, "people", to avoid something even more intrusive, but once the capability is there it is very easy to use for other stuff. That is essentially assured to happen. The "children" argument is bogus.
Re:Permissions setting? (Score:4, Funny)
Slippery slope? This is a flume ride with a frictional coefficient approaching zero and descent angle approaching 90 degrees.
Re: (Score:1)
Re: (Score:1)
Re: (Score:2)
Well, they did make all that fuss about not helping the FBI in their investigation of the San Bernardino shooter's iPhone... At the time, I thought it was total bull, because if it was the Chinese gov, they would have rolled over and asked for their belly to be rubbed as they begged to give away the encryption keys.
https://en.wikipedia.org/wiki/... [wikipedia.org]
How do you test this thing? (Score:2)
Seriously. Loading up a bunch of iPhones with a random sampling of images, a few of which are actual kiddie porn, and testing to see if this catches at least X% of the kiddie porn sounds like a dangerous idea.
Re:How do you test this thing? (Score:4, Informative)
Re: (Score:2)
Re: (Score:2)
No because they don't share the photos, just like your phone doesn't store an actual picture of your fingerprint.
Re:How do you test thing? (Score:4, Insightful)
Resize, crop (Score:5, Insightful)
Not to mention that such hashes could be easily defeated by resizing or changing the cropping of pictures, changing their hash. Depending on how the source material is hashed, just adding a border would probably prevent a match.
This will, of course, mean that older phones will be slower, and need to be upgraded.
Re: (Score:2)
Re:Resize, crop (Score:4, Funny)
They use a neural net to handle all sorts of obfuscations. So it's a neural hashing technique. It would possibly identify new CP.
Or, you know, be wrong.
Have fun explaining how a neural net works to your friends and neighbours after you're arrested for owning child porn that was actually a picture of the Pink Panther's car or something.
Re: (Score:1)
You joke, but my partner actually got suspended from Facebook for a day, for uploading a picture of a new (at the time) Samsung phone. These neural detection algorithms are not infallible, and you never know what will trigger a false positive.
Re: (Score:2)
You joke, but my partner actually got suspended from Facebook for a day, for uploading a picture of a new (at the time) Samsung phone. These neural detection algorithms are not infallible, and you never know what will trigger a false positive.
Well, I wasn't joking. Mistakes like this can happen even with humans monitoring or just looking at photos being printed in a shop. Once it's mentioned on the local news, your life is fucked no matter what happens at appeal.
And here come the false positive (Score:4, Insightful)
So it's a neural hashing technique. It would possibly identify new CP.
Then there are two risks:
- Generating bonkers false positives.
Think about Facebook, whose adult filter wrongly recognized an elbow as boobs [cnet.com].
It might accidentally recognize the wrong things as naked children (think of the Manneken Pis in Brussels, Belgium [wikipedia.org] - well, actually DO NOT think about him; depending on where you live you might be accused of wrongthink paedophilia [telegraph.co.uk]).
Which brings me to the next risk:
- Technically not entirely incorrect, but legally wrong.
Even if Apple's neural net is perfectly good at recognizing children's skin and only children, there's another matter:
the legality of those pictures varies by jurisdiction.
It might be that any picture of a naked kid, no matter the circumstances, is considered illegal in some more puritanical region of the globe (can someone from the US confirm?).
In a lot of European jurisdictions, nudity per se isn't illegal. Context and intent do matter. Spencer Elden's picture on Nirvana's album cover definitely shows him naked, penis and all, but it would never be considered obscene in Europe.
As another example, in Japan, only actual real human beings are considered. So if adult, consenting mangaka are drawing lolicon and consenting adult consumers are reading it, no actual real-world children were harmed, no matter what the imaginary stories in the pictures depict.
The situations above could lead to false positives, leading to false accusations, potentially leading to people getting kicked out of their backup storage, despite never having broken any actual CP law where they live and no children ever having been harmed.
Re: (Score:2)
I have to say that if no one told me that those were her elbows, I would have only seen boobs as well. It took me a bit to sort out the picture, and I'm not an AI.
Re: (Score:2)
They use a neural net to detect nudity being sent via Messages to children's devices and then blur it out behind a warning message, and even then, only if the parent of the child opted-in to the use of that feature. Apple doesn't get a report when that happens, nor does law enforcement. I'm not seeing any mention of them relying on neural networks for any of the flagging that leads to them getting a report. All of those systems seem to be based on matching against the hashes of known child porn.
Re: (Score:2)
No. It can only identify pictures it knows.
Re: (Score:2)
Well, apparently things are more fuzzy than that. Hash lists are pretty much restricted to known pictures, and some tolerate some modifications. It is unclear what this thing by Apple really is. Apparently Apple claims it can identify "nude pictures" it has not seen before. That would make it far, far more dangerous, because it may well misidentify things and it carries a lot more active functionality. For example, it is very easy to quietly slip other types of pictures in there to find and really hard to find
Re: (Score:2)
Not to mention that such hashes could be easily defeated by resizing or changing the cropping of pictures, changing their hash.
Not how it's hashed. It's hashed using what Apple's calling "NeuralHash" [apple.com] (which is a name several other projects already use, because of course it is, because Apple always manages to reuse names). Their example shows that changing an image from full color to black and white doesn't change the hash. Resizing also shouldn't change the hash for the most part, until you start making it small enough to remove details.
Cropping might change the hash if it's sufficiently cropped.
Depending on how the source material is hashed, just adding a border would probably prevent a match.
Probably depends on the size of the
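For anyone curious what "hashing on features" means in practice, here is a rough Python sketch of a classic perceptual hash (an "average hash"). This is emphatically not Apple's NeuralHash, which uses a neural network to extract its features; it only illustrates why this family of hashes survives resizing or grayscale conversion where a cryptographic hash would not. The file names and distance cutoff are made up for illustration.

from PIL import Image

def average_hash(path, hash_size=8):
    # Shrink to a tiny grayscale thumbnail so fine detail (and most resizes,
    # recompressions, and color changes) stops mattering.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: brighter than the average or not.
    bits = "".join("1" if p > avg else "0" for p in pixels)
    return int(bits, 2)

def hamming_distance(h1, h2):
    # Number of differing bits; a small distance means "probably the same picture".
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: a resized or grayscale copy of the same photo should land
# within a few bits of the original, while an unrelated photo usually won't.
# d = hamming_distance(average_hash("photo.jpg"), average_hash("photo_resized.jpg"))
# print("likely match" if d <= 5 else "different image")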
Re: Resize, crop (Score:2)
But basically it's looking for "features" in the photo and the hash is based on that.
Wrong!
It is looking for entire image matches to a high degree of certainty.
And it doesn't set a Flag until you cross a threshold of "hits", and then, it doesn't disable your AppleID until you cross a threshold of Flags.
So, it seems quite conservative in its design, actually.
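To make the "threshold of hits" idea concrete, here is a tiny Python sketch of that logic. The hash values and the threshold are invented placeholders; Apple's real system reportedly does the matching with private set intersection and encrypted "safety vouchers" rather than a plain lookup like this, so treat it only as a picture of the counting behaviour.

KNOWN_HASHES = {0xDEADBEEF, 0xCAFEBABE}  # placeholder for the on-device hash database
MATCH_THRESHOLD = 30                     # placeholder for the reporting threshold

def count_matches(photo_hashes):
    # How many of this account's photo hashes appear in the known-image list.
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def should_flag(photo_hashes):
    # Nothing is reported until the count of matches crosses the threshold,
    # which is what keeps a single stray collision from flagging an account.
    return count_matches(photo_hashes) >= MATCH_THRESHOLD

# Example: one accidental collision stays well below the threshold.
# print(should_flag([0xDEADBEEF, 0x12345678]))  # False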
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
RTFA. The phone downloads a database of hashes for known images and checks your messages, albums, etc. against them. If the database is never updated, it'll only detect consumers, not producers; if it is frequently updated, then it will only detect producers whose work is distributed widely enough to end up in the database. Testing its efficacy would be more about seeing how many filters it takes to defeat the hashing algorithm, since it has zero chance of catching new material.
So Apple says, but would you trust them not to lie now? At best they have arranged plausible deniability while facilitating government spying.
Re: (Score:2)
There's nothing to be gained from bothering to entertain such intermediate slivers, quanta, and shades of doubt and paranoia. Apple has famously been uncooperative with governments, even forcing the FBI to go to a third party to crack phones for them [adn.com]. Until you see an actual headline that confirms a material change in that stance, all you're doing is fearmongering.
But let me show you how to do it properly: the spooks don't need to see inside your phone to spy on you. They control the networks [wikipedia.org]. Building back
Re: (Score:2)
There's nothing to be gained from bothering to entertain such intermediate slivers, quanta, and shades of doubt and paranoia. Apple has famously been uncooperative with governments, even forcing the FBI to go to a third party to crack phones for them [adn.com].
And Google was uncooperative, until it was discovered that law enforcement had access to the unencrypted links between their data centers.
The government *needs* Apple to appear to be uncooperative so that people will trust them with their data, putting it right where the government can get to it. Another example of this is the Electronic Communications Privacy Act of 1986, which appears to make unencrypted email private so people will trust it, while in practice it is not. The same thing can be said fo
Re: (Score:2)
Re: (Score:2)
Are they encrypting their cloud? (Score:5, Interesting)
This only makes sense if they are adding end-to-end encryption to their cloud. Right now, they scan every file you upload to their cloud server side.
Moving that scanning to client side isn't going to be lucrative for them, especially if, as they have stated, they will ONLY use it to scan images that go up to their cloud (if you don't trust them on that, well, that's reasonable, but I'm going to pretend that's not a concern).
So what does Apple get out of this second scan? I think their plan is to add real encryption, with a key known only to the USER and not to Apple. This would make their servers impossible for them to scan, and they would have to rely on this client side scan to make that possible (the option with no scan is technically possible today and some services offer it, but Apple would probably get hammered by various governments if they switched that on).
So that's my guess- either they intended to add full encryption to the cloud (and this idea came from that), or they still intend to do so (and this idea precedes that).
It's still gonna have hash collisions and false positives, but since those costs are externalized to innocent customers, Apple is willing to accept them.
Re: (Score:1)
It's for the children (Score:2)
That justifies anything /s
Apple needs to read the comments to the previous article. There's no point in repeating all the reasons this is an abhorrent idea.
Re: It's for the children (Score:1)
Waiting for the news to drop... (Score:5, Interesting)
Ever since Apple announced that they feel a pressing need to scan iPhone users phones for what apparently is now called "CSAM," I've been waiting for the news to drop of some law enforcement agency connecting iPhone users to a large ring sharing those images.
That's the only reason I can think of why they'd be doing this. Because they know they're about to be hit with massive negative publicity for helping child abusers stay hidden from law enforcement, and are desperately trying to get out in front of it.
Re: (Score:3)
The depressing thing is that this new kiddie porn scanning technique is probably using a backdoor in iCloud's encryption that was actually developed for the Chinese government (or some other authoritarian regime) to track their citizens. I have a feeling that this is more of a cover story for why that backdoor exists to begin with.
Re: (Score:2)
Why would Apple need to use a backdoor? They own the software, and this scanning is done before it enters iCloud.
Re: (Score:2)
That's the only reason I can think of why they'd be doing this.
It might have something to do with a new law saying they're liable for it if it's happening on their platform: https://en.wikipedia.org/wiki/... [wikipedia.org]
Guilty until proven innocent? (Score:2)
This feels like they're treating everybody as criminals, and will only consider you "innocent" once they've scanned your device. Just like the Border Patrol does.
Re: (Score:2)
That's exactly what they're doing. Society no longer relies on parents to parent their kids and protect them from harm. We expect "big tech" to do it for us now, and with a sweeping, blanket method of forcing everybody using an iPhone to participate in constant verification that their photos comply with the law.
Apple OUs? (Score:2)
AI Training (Score:1)
Re: (Score:1)
Not as bad as I thought. (Score:2)
From the article, it seems like they've got a list of hashes of known child pornography. They're comparing that list with files stored on iCloud. If you're storing a lot of these images on iCloud, they notify the police. I'm OK with that. I am a little worried about the client-side aspect of this, but they already can identify pictures of dogs, boats, Christmas, etc. that I have on my phone, so I guess the image recognition stuff is already there. I'm a bit concerned about this, but if I were Apple I wouldn
Re: (Score:1)
As they say, "The road to hell is paved with good intentions."
Re: (Score:3, Insightful)
or bettet yet... (Score:2)
Re: (Score:1)
Re: (Score:2)
The downside for mislabeling a dog photo is small, the downside for mislabeling an innocent photo as CP is great. These models make "false positives" and "false negatives". There is a trade-off between the two, and how the threshold is set depends on Apple.
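A toy Python example of that trade-off: the scores and labels below are made up, but sweeping the decision threshold shows how false positives and false negatives move in opposite directions, and someone at Apple has to pick where that line sits.

scores = [0.1, 0.4, 0.55, 0.7, 0.95]  # classifier confidence per image (invented)
labels = [0,   0,   1,    0,   1]     # 1 = actually a match, 0 = innocent (invented)

for threshold in (0.3, 0.5, 0.8):
    # Count innocent images flagged (FP) and real matches missed (FN) at this cutoff.
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    print(f"threshold={threshold}: false positives={fp}, false negatives={fn}")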
Re: (Score:2)
From the article, it seems like they've got a list of hashes of known child pornography. They're comparing that list with files stored on iCloud. If you're storing a lot of these images on iCloud, they notify the police.
For now, yes, it only triggers if it finds a match with known imagery. It's not attempting to classify the images or recognize new ones, it's only looking for existing flagged images.
For now.
I am a little worried about the client-side aspect of this, but they already can identify pictures of dogs, boats, Christmas, etc. that I have on my phone, so I guess the image recognition stuff is already there.
And this is where the real threat is. If Apple is OK with scanning for matching images, just wait until the police demand they leverage their facial recognition technology for known criminals. Both sides of the aisle should be worried: why not use it to look for images that look like they were taken by 1/6 protestors? W
Re: (Score:1)
I am a little worried about the client-side aspect of this, but they already can identify pictures of dogs, boats, Christmas, etc. that I have on my phone, so I guess the image recognition stuff is already there.
And it doesn't work very well. If you've got a decent size library of photos, there's sure to be more than a few false positives. I just searched for "candle" and among the photos that are, in fact, of candles - there's also a picture I took of my turntable playing a record.
Apple just lost the Education market (Score:2)
Re: Apple just lost the Education market (Score:1)
Re: (Score:3)
Re: (Score:1)
Re: (Score:2)
Keeping children safe is such an important mission (Score:5, Insightful)
Keeping children safe is NOT an important mission for Apple. For a parent, yes. For law enforcement, yes. For Apple, no. It is, at most, a minor aspect of the features they provide.
Worse, when an executive starts out with such a bold lie to justify what the company is doing, you know the rest is bullshit.
Re: (Score:2)
If they wanted to keep my children safe they would ensure their privacy rather than trawling through their stuff and reporting it to some authority.
Surprised? (Score:5, Insightful)
Re: (Score:1)
Mark my words, even when their sales go down the basement because users start leaving in droves they'll keep at it because "We must... we must do it for THE CHILDREN!!"
Apple Management will be completely powerless to stop the demise and the company will disappear into oblivion, just like it was about to do in the late '90s before miracle worker Steve Jobs came along.
Re: (Score:1)
Re:For the children (Score:4, Informative)
“The state must declare the child to be the most precious treasure of the people. As long as the government is perceived as working for the benefit of the children, the people will happily endure almost any curtailment of liberty and almost any deprivation.”
- Adolf Hitler
Hitler never said that. It's taken from a rabbi's fantasy of what Hitler would say to his supporters in a letter from beyond the grave.
https://www.wnd.com/2004/01/22... [wnd.com]
Alternatives (Score:1)
A new form of SWATing? (Score:3)
How soon until bad guys start surreptitiously sending offending images to the iPhones of people they don't like? Apple will do the rest of the work for them.
Operating systems at least try to keep malware at bay, but images? Consumer devices are designed to suck those up from the web, social media, e-mail, texts, and what have you.
And some tech savvy cretin with too little regard for humanity is going to code up a way to disguise the images so that the recipient never sees them, but the bit stream will match hashes in the government database.
Good luck explaining all that to a jury.
Explaining is not enough (Score:2)
I was actually starting to buy into the whole "privacy" focused Apple PR, but this charade has made me change my mind. If Apple believes its buyers care more for children's virginity than their privacy they're sadly mistaken or just misinformed.
Re: (Score:2)
Re: (Score:2)
Apple will see a huge exodus of users who don't like the company scanning their stuff.
That is extremely unlikely.
It's not a magic device (Score:2)
Will it flag Jeff Bezos... (Score:3)
Re: (Score:1)
Creepy to the extreme (Score:2)
Re: Creepy to the extreme (Score:2)
What if your wife or husband is a little person. Will your pictures interacting with them get flagged and viewed by Apple certified peepers? Sorry Apple, you've jumped the shark this time. Your company is just as creepy as Zuckerberg's now.
It doesn't work that way, and you should know that by now.
Troll.
anyone else? (Score:3)
Keeping children safe is such an important mission
Do justifications like these for any action raise red flags for anyone else, or is it just me?
Back to the Future (Score:1)
Shades of the 80's and "Think of the children!"... yeah we heard that from the politicians so much, they thought of the children back then all right and that is why we're so economically screwed now.