Apple Warns Staff To Be Ready for Questions on Child-Porn Issue (bloomberg.com)
Apple has warned retail and online sales staff to be ready to field questions from consumers about the company's upcoming features for limiting the spread of child pornography. From a report: In a memo to employees this week, the company asked staff to review a frequently asked questions document about the new safeguards, which are meant to detect sexually explicit images of children. The tech giant also said it will address privacy concerns by having an independent auditor review the system.
Earlier this month, the company announced a trio of new features meant to fight child pornography: support in Siri for reporting child abuse and accessing resources related to fighting CSAM, or child sexual abuse material; a feature in Messages that will scan devices operated by children for incoming or outgoing explicit images; and a new feature for iCloud Photos that will analyze a user's library for explicit images of children. Further reading: Apple's child protection features spark concern within its own ranks.
This is going to be a very slippery slope. (Score:1)
Google\Android next?
Re:This is going to be a very slippery slope. (Score:4, Informative)
Google\Android next?
Google already does it. Google made over 540,000 reports of child porn to law enforcement in 2020 alone [missingkids.org]. Same for Facebook (over 20 million reports). And plenty of others too.
People seem to be unaware that Congress carved an exception out of Section 230 last year [wikipedia.org]. Whereas these companies are typically protected against liability for content their users share, the EARN IT Act puts companies on the hook for child porn if they fail to police their platforms for the stuff. Facebook, Google, and others saw the writing on the wall and added these sorts of protections to their cloud services a while ago.
Apple is late to the game, and it's sticking to on-device checks against hash fingerprints of previously seized child porn, so its approach is easily the least invasive, despite the reporting that's been going on this week (much of which was incorrect, though given Apple's abysmal handling of the subject I don't blame reporters for getting it wrong).
Re: (Score:2)
Hopefully the EARN IT Act never passes, because it would deal a devastating blow to internet privacy, which is already teetering on the edge of nonexistence. Some versions of it would effectively ban non-backdoored encryption.
Re: This is going to be a very slippery slope. (Score:2)
Yep. Everybody copies Apple.
Privacy? HA! Not in the near future. You have to be searched up and down because you may be a criminal, and you will continue to be nagged about website cookies and told that "your privacy means so fucking much to us!"
Orwellian doublethink and other such nightmares are coming to pass, and the entire world population is just something to be fucked around with, there to make rich, powerful, bad men even richer, more powerful, and worse.
Roll over, dog! Good boy!
Re: Why double down? (Score:2)
>Can I seriously get my neighbor arrested if I have access to his phone and a child?
It's better than getting him shot by SWAT
Re: Why double down? (Score:4, Insightful)
Can I seriously get my neighbor arrested if I have access to his phone and a child?
It's better than getting him shot by SWAT
Well, maybe.
The odds of you getting shot by a SWAT team are relatively low.
The odds of your life being destroyed by an accusation of having "CSAM" are 100%. There is no coming back, even if you are exonerated later.
Re: Why double down? (Score:1)
Not to mention, if you do get him shot you'll get jail time.
Wrecking his life with a CP bust isn't worth the investigation time, plus there'll probably be a CVE that lets you take out whole groups of people you don't like.
Re: Why double down? (Score:2)
Also why wouldn't sexually mature underage people be able to share sexually explicit content of themselves?
Is that somehow against the law in America?
Re: (Score:3)
Also why wouldn't sexually mature underage people be able to share sexually explicit content of themselves?
Is that somehow against the law in America?
Yes it is.
Re: (Score:2)
Re: (Score:3)
And who do they convict, in that case? The minor?
Yes [kansascity.com]. Also yes [independent.co.uk]. And yes [washingtonpost.com]. For more than a decade now. Also the recipient, of course.
The ACLU has provided legal defense for the minors in some cases. They keep losing.
Re: (Score:2)
Also the recipient, of course.
I think there may be exceptions for texts and the like if the recipient deletes the photo, since one can't control who sends you things -- unless you've blocked them in advance...
Re: (Score:2)
And who do they convict, in that case? The minor?
It's illegal to distribute erotic photos of a minor, even if you're that minor. Google it...
Re: (Score:2)
Re: (Score:2)
I believe you, but if the minor was simply storing it on the cloud, not distributing it to anyone else, what exactly would they do to the minor?
Don't know, unless having it stored in the cloud constitutes "distribution". (IANAL) :-)
But why take an erotic photo of yourself just for yourself? That's *really* narcissistic.
(Note that photos of simple nudity, even of a minor, may be a "grey" area legally. Probably depends a LOT on the content.)
Re: (Score:2)
I'm guessing you haven't been a parent of a teenage daughter.
My point, however, is that if it's stored on the cloud, it's not being actively shared with anyone there. That doesn't mean that the photo isn't being shared with someone else, and its storage on iCloud isn't going to be particularly enlightening as to who else that picture has been sent to, only that it probably has been sent to people.
Re: (Score:2)
Re: Why double down? (Score:1)
Because the law is written by puritan idiots who spend their time forcing their pathetic "moral values" on everyone else on the planet.
Pictures should not be illegal, no matter what they depict or what their intended purpose is, when they are taken without hurting anyone and are not stolen.
For instance, if I'm an adult (or a sexually mature minor) and have some naked pictures of myself taken when I was a child, why on earth should anyone else besides me get to decide if I can or cannot share them with someone else or
Re: (Score:2)
Can I seriously get my neighbor arrested if I have access to his phone and a child?
Sure. You can even do it with a dumb phone. Just take some obscene photos of a kid and send them to law enforcement. Easy.
Your comment is based on a fundamental (but entirely understandable!) misunderstanding of what this new set of features is actually doing. Apple badly botched their announcement and follow up communications, which hasn't helped matters any. They're now having to deal with a load of PR fires after this thing exploded in their face, which is somewhat ironic, given that everyone else has be
Re: Why double down? (Score:2)
Remember, everyone is a potential criminal, terrorist, anarchist, Covid superspreader, and likely rips the tags off of mattresses. Except for the rich, because they are Trusted(TM), and we all know rich, powerful men have never engaged in underage sex trafficking.
nonsense audit (Score:1)
day one, audit says everything is fine
day two, the government censors your Winnie the Pooh pics
Re: (Score:2)
[Grainy B&W TV ad of a runner zooming through a dystopia of marching worker drones sitting to watch Big Brother lord over them. She carries a sledgehammer, but gets caught and tackled to the ground.]
Voice-over: "And you will see why 2022 will be exactly like 1984."
Congratulations, Apple. You lived long enough to become the villain, as you defined it.
Not protecting children (Score:5, Interesting)
Apple's naughty-image detector is not about protecting children; it's about protecting Apple. They just don't want that shit on their cloud servers -- too much liability for them.
Re: (Score:2)
Re:Not protecting children (Score:4, Informative)
Exactly. Though it's actually the law in the US that hosting providers can't carry that stuff, and Apple is one of the last providers to actually do this - every other provider around already has scanners using the same databases, and Apple was dragging its feet for years.
The scanner runs locally on images that you upload to iCloud. If it detects a match, the details (but not the image) are sent to Apple, who can check the copy on iCloud themselves to see if it's a false positive; if it isn't, the image is removed from their server, but not from your device.
But it doesn't matter if you disable the upload to iCloud - every other photo-sharing site has to comply as well, so switching to, say, Dropbox or Google Photos or some other service will have the same result.
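For the curious, here is a minimal Python sketch of that "scan locally, only flag metadata on a match" flow. It is not Apple's implementation: a plain SHA-256 digest stands in for Apple's perceptual NeuralHash, the blinded-database and threshold cryptography are skipped entirely, and the names (known_fingerprints, scan_before_upload) are made up for illustration.

```python
import hashlib

# Hypothetical stand-in for the on-device database of known-bad fingerprints.
# In the real system these are blinded perceptual hashes, not raw SHA-256 digests.
known_fingerprints = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # placeholder entry
}

def fingerprint(image_bytes: bytes) -> str:
    """Toy fingerprint: a SHA-256 digest of the raw bytes.

    Apple's NeuralHash is a *perceptual* hash, so near-identical images map to
    the same value; a cryptographic hash like this one has no such property.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(image_bytes: bytes) -> dict:
    """Run the local check and return only match metadata, never the image."""
    fp = fingerprint(image_bytes)
    matched = fp in known_fingerprints
    # Only the fingerprint and a match flag leave the device in this sketch; the
    # real system wraps this in an encrypted "safety voucher" and requires a
    # threshold of matches before anything becomes readable server-side.
    return {"fingerprint": fp, "matched": matched}

if __name__ == "__main__":
    photo = b"example photo bytes"
    print(scan_before_upload(photo))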
Re: (Score:2)
But it doesn't matter if you disable the upload to iCloud - every other photo sharing site has to comply as well, so switching to say, DropBox or Google Photos or other thing will have the same result.
To your point, Dropbox made nearly 21,000 reports to law enforcement last year, and Google made over 540,000 [missingkids.org]. Of the 21.4 million reports made in total last year, Facebook had the lead with 20.3 million. In contrast, Apple only made 265, which is low enough that it makes me think they only reported cases that were brought to their attention by others (e.g. pedophiles whose photos were discovered after they turned on public sharing of photos via the web).
As you suggested, however, Apple is legally required t [wikipedia.org]
Re: (Score:2)
Re: (Score:1)
Soon, on the news ... (Score:4, Funny)
Millions of pedophiles arrested in Europe; all Christian Orthodox churches shut down by police after Apple notified authorities of the horrifying child pornography sessions called "baptism," where infants as young as a few months old were stripped naked in front of perverse adults taking pictures of their helpless bodies.
Re: (Score:3)
Will Apple also flag pictures of Jewish child torture sessions known as bris?
I'm guessing religious nutters of all denominations will not be using iPhones to keep photographic memories of their weird rituals.
Re: (Score:2)
Will Apple also flag pictures of Jewish child torture sessions known as bris?
If that happens, Apple's campus will probably get destroyed by that Jewish Space Laser I was hearing about.
Re: (Score:2)
If that happens, Apple's campus will probably get destroyed by that Jewish Space Laser I was hearing about.
Destroyed? Doubtful. Probably just take a bit off the top and call it a day.
Re: (Score:1)
You mean circumcision [mayoclinic.org]? Yeah it would be weird to photograph that. Also it's weird to take photos and videos during childbirth. So what are you going to use instead of your iPhone?
Re: (Score:1)
You're comparing barbaric circumcision with childbirth?! Sure, childbirth isn't pretty, but that's the point: it's a reminder, so you reconsider making another one. Without a recording, you only remember the feel-good part of it.
Re: (Score:1)
When I'm thinking about what to do, I'm less concerned about how difficult something is, and more about whether it's worth the effort and risks involved.
There are only three ways to solve our overpopulation problem on Earth: 1) decrease reproduction rate, 2) increase death rate, and 3) increase emigration rate.
Since we don't have any viable way to send people to live off planet, #3 is out for a while. Increasing death rate by any means is usually considered a bad thing by many people, whether it's criminal,
Re: (Score:3)
Re: (Score:2)
The success and indeed proliferation of religion literally depends on the child not thinking.
Re: (Score:1)
"Will Apple also flag pictures of Jewish child torture sessions known as bris?"
They actually take pictures of that?! That stuff is hard to look at. Not something a normal person would keep as a memory. Can only hope the poor kid doesn't find that content.
"I'm guessing religious nutters of all denominations will not be using iPhones to keep photographic memories of their weird rituals"
The iPhone is considered a status symbol, so I think many will risk it.
I can only assume they already have pictures uploaded.
Re: (Score:2)
No, because that's not what that part of the system does. It only works for detecting specific "bad images."
Part of the problem is that two features that are very similar but work very differently "leaked" at the same time. (Hadn't heard about the Siri feature before but that's not really a privacy issue - it's just a new set of keywords that triggers a new response.)
Basically, there are two different things Apple is doing that invade privacy in two very different ways:
The first is an ML model that can dete
Re: (Score:2)
Also, since the program runs locally and scans on-device before upload, Apple would have to comply with government requests to scan other folders besides iCloud uploads.
Re: (Score:2)
Re: (Score:2)
What's the system that they 'trained on 250,000 CSAM images from the NCMEC database'?
You're going to have to find the quote on that because I have no idea where you got that.
The NeuralHash thing they do based on the NCMEC database is trained, as far as I can tell from their tech documents [apple.com], on photos in general. It's basically designed to be able to tell if two pictures are the same picture and that's it.
They then run that database against the trained ML model to generate the hashes. As best I can tell from their summary, the hashes are literally hashes of the neural network after running an
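To illustrate what "tell if two pictures are the same picture" means at the hash level, here's a toy average-hash (aHash) in Python. This is not NeuralHash and not Apple's code; it only demonstrates the general property a perceptual hash has and a cryptographic hash lacks: a re-encoded or slightly brightened copy of an image lands on the same (or a nearby) hash value, while an unrelated image lands far away.

```python
# Toy "perceptual hash" demo: a classic average-hash (aHash) over a tiny
# grayscale grid. This is NOT NeuralHash -- just an illustration of why a
# perceptual hash can match two slightly different copies of the same image
# while a cryptographic hash cannot.

def average_hash(pixels: list[int]) -> int:
    """Return a bitmask with one bit per pixel: 1 if above the mean, else 0."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for value in pixels:
        bits = (bits << 1) | (1 if value > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A 4x4 "image", a re-encoded copy with slightly shifted brightness values,
# and an unrelated "image".
original  = [ 10,  12, 200, 210,   9,  11, 205, 199,  15, 240, 238,  14,  13, 250, 245,  12]
reencoded = [ 12,  14, 198, 212,  10,  13, 203, 201,  17, 238, 240,  16,  15, 248, 247,  14]
unrelated = [200,  10,  15, 220,  30, 190,   5, 210, 180,  25,  12, 230,  40, 200,   8, 215]

h1, h2, h3 = (average_hash(p) for p in (original, reencoded, unrelated))
print(hamming_distance(h1, h2))  # 0 for these values: same picture despite pixel changes
print(hamming_distance(h1, h3))  # large: different picture
```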
Re: (Score:3)
Oh, just wait until they start analyzing everyone's archived pictures from Burning Man. There is no shortage of boob, butt, vag, and penis there, even if only in the background of the subject. (I have two albums for every Burn I attended... a main album and a work-safe one. A cursory glance and guesstimate would indicate that between 1/5 and 1/3 are non-work-safe. And I *don't* go around looking for the naked people.) Naked people don't usually have any place to carry ID. Pretty much no one at Burni
Re: (Score:2)
Re: Soon, on the news ... (Score:2)
What are you gonna do about it? (Score:2)
The Applephiles will whine a while, then be first in line to buy the next iWhatever.
Count on it.
Re: (Score:2)
Make sure that you have no old digitized magazines (Score:2)
Re: (Score:2)
Never mind Coppertone ads. What about National Geographic?
And "user" includes ... (Score:2)
This includes Apple employees too -- right?
(I'd hate to be the Apple employee that gets caught with CP in their iCloud or on their iPhone. Talk about irony.)
I sense an unpleasant scenario unfolding (Score:4, Insightful)
Apple says hey, we already have the technology in place to spot child porn
We can easily modify it to locate any image that you (the Feds) want to find
Federal government pressures Amazon
Amazon says hey, we have a nationwide network of video spy devices already in place
Amazon wins NSA contract.
Federal government pressures Facebook
Facebook says hey, we can identify every anti-vaxer militant racist on the planet.
We also have algorithms and data sets that let us predict who might become an anti-vaxer militant racist in the future.
Federal government pressures
Re: I sense an unpleasant scenario unfolding (Score:1)
Is this iCloud only? (Score:2)
Re: (Score:2)
I have a question (Score:2)
Can you put the kiddie porn automatically on Amazon Photos instead?
Asking for a friend.
Do they sell a child-only phone? (Score:2)
a feature in Messages that will scan devices operated by children
Um, anybody know how they are going to tell an adult-operated device from a child-operated device?
How easy to destroy someone's life with t (Score:3)
Is Apple itself guilty of child porn possession? (Score:2)
Re: (Score:2)
Apple must have trained their models on a database of child porn, and their developers must have used child porn when constructing those models (I can't imagine a more repulsive job unless it's working on snuff videos), so is Apple in illegal possession of child porn?
Apple didn't create the database of hashes of child sex abuse images [theverge.com]. The National Center for Missing and Exploited Children did. Apple just compares hashes of images uploaded to their servers to hashes from known images of child sex abuse. So no - Apple is not and was never in illegal possession of illegal images, at least not for the purposes of creating this database.
One part I'm still wondering about... (Score:2)
They are essentially comparing the images against a list of SHA-256 hashes of known kiddie porn on-device, so that's just a few milliseconds of hash creation and a table lookup before sending the image to iCloud. When it is submitted to iCloud and flagged as kiddie porn, a human will verify the image, and then whatever takes place takes place. The part I'm wondering about is how the Apple human is looking at the flagged photos. I was under the impression that all of my iCloud files are encrypted and even Apple can't
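As a rough sanity check on the "few milliseconds of hash creation and a table lookup" estimate, here's a small Python timing sketch. It uses SHA-256 over a dummy photo-sized buffer and a plain set lookup; the real check is a NeuralHash (a small neural-network evaluation) against a blinded table, so the actual on-device cost will differ, but the lookup step really is trivial.

```python
import hashlib
import time

# Rough sanity check of the "hash plus table lookup" cost claim above.
# SHA-256 over a dummy photo-sized buffer stands in for the real NeuralHash,
# and known_hashes is a made-up table of 100,000 dummy entries.

photo = bytes(3 * 1024 * 1024)  # ~3 MB zero-filled "photo"
known_hashes = {hashlib.sha256(str(i).encode()).hexdigest() for i in range(100_000)}

start = time.perf_counter()
digest = hashlib.sha256(photo).hexdigest()
hash_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
flagged = digest in known_hashes  # average O(1) set membership test
lookup_ms = (time.perf_counter() - start) * 1000

print(f"hash: {hash_ms:.2f} ms, lookup: {lookup_ms:.4f} ms, flagged: {flagged}")
```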