Apple Removes All References To Controversial CSAM Scanning Feature From Its Child Safety Webpage (macrumors.com)
Apple has quietly nixed all mentions of CSAM from its Child Safety webpage, suggesting its controversial plan to detect child sexual abuse images on iPhones and iPads may hang in the balance following significant criticism of its methods. From a report: Apple in August announced a planned suite of new child safety features, including scanning users' iCloud Photos libraries for Child Sexual Abuse Material (CSAM), Communication Safety to warn children and their parents when receiving or sending sexually explicit photos, and expanded CSAM guidance in Siri and Search. Following their announcement, the features were criticized by a wide range of individuals and organizations, including security researchers, the privacy whistleblower Edward Snowden, the Electronic Frontier Foundation (EFF), Facebook's former security chief, politicians, policy groups, university researchers, and even some Apple employees.
Nelson: Ha ha! (Score:1)
Re: Nelson: Ha ha! (Score:5, Insightful)
Pedos ran to Apple now fuel backlash... (Score:1, Troll)
Powerful billionaire pedos can apparently buy suckers to support them, and even have one of their own "killed" while staying at "little Gitmo," the same prison where the famous drug-lord escape artist is held.
Slippery slope is a fallacy whereby people like the parent jump far ahead of a syllogism of logically connected steps without dealing with any of the logic required to reach their conclusion. Scanning for child porn UPLOADED to Apple servers, just as Google and Microsoft (who patented a scanning technique) already do, is not that kind of leap.
Re: Pedos ran to Apple now fuel backlash... (Score:1)
Re: (Score:1)
Siri, Alexa, and Google Assistant can all fuck off. If you're so weak-minded as to make it easy to allow things like that into your world, that's on you. The rest of us don't want Big Brother in our homes, whether in the guise of the government or of corporations.
Re: (Score:2)
DO NOT UPLOAD YOUR KIDDIES TO MY SERVER! If you have a problem with that, upload your images somewhere else.
Apple isn't preventing you from sending photos elsewhere. They have ZERO requirement to let you do anything with THEIR property. Your phone is yours but their server and their service is not. This is simple.
Don't upload your photos if you don't want them scanned. The same goes for MS and Google, too.
Yes... (Score:3, Insightful)
Because as we all know, if they remove public references to it, it means it's gone, and there's no chance it'll just be used anyway, right? Right?
People really are gullible these days...
Re: (Score:1, Insightful)
Re: (Score:1)
As if Android is any better... Disclaimer: I am an Android user and a recovering Apple user.
Hidden feature? (Score:5, Insightful)
We know it is implemented already.
Good. (Score:2)
Now remove it from my phone.
Just demand the source code in court or you must acquit (Score:1)
Just demand the source code in court or you must acquit.
Or do it in WI and hope you draw Judge Bruce Schroeder
Re: (Score:2)
Removed the documentation (Score:3)
Did they remove the reference, or the feature?
It is entirely possible the 'feature' could still be alive and well, unless, of course, they pinky-promise not to implement it at all.
Well (Score:2)
I would like a feature to warn me about such photos. I would not like a feature that funnels it automatically to the FBI.
These are the same people who brought you marijuana residue on walls, not as evidence of past use, but as actual possession in and of itself.
And they feel themselves clever.
We'll skip for the moment whether reporting to the FBI amounts to a forbidden "cozy relationship" between private parties and the government. A repair tech who stumbles across something and reports it, OK. A tech who goes actively looking for something to report is another matter.
Re: (Score:1)
Re: (Score:2)
Apple Haters - GO (Score:2, Troll)
Re:Apple Haters - GO (Score:4, Insightful)
I think you'd find it difficult to dismiss me as an Apple hater when I have two iPhones. That said, my thought is that they're probably going to keep looking for images they deem improper, and perhaps even make them available to the authorities.
They just won't talk about it.
Re: Apple Haters - GO (Score:2)
Re: (Score:2)
I guess you didn't read the "won't talk about it" part of my comment. Probably a bit too much for your attention span.
Re:Apple Haters - GO (Score:4, Insightful)
Apple cancels something everybody hated.
But Apple haven't actually stated that they are cancelling it; they have just stopped talking about it. There is a MASSIVE difference.
Either: 1) they are cancelling it and haven't said so yet; 2) they are holding off while trying to find another way to spin this; or 3) they are rolling it out in secret anyway (this last option would be the worst PR for Apple if it came to light).
Apple don't exactly have a good track record for transparency. With the recent exposure of their secret $275 billion deal with the Chinese government, I would not be surprised if this had already been rolled out in secret (at least for Chinese iPhone users).
Re: (Score:3)
Or they are trying to find another way to do it.
The problem for Apple is that Apple encrypts the images so that only you and those you share them with can decrypt them. Or so they claim.
However, as an image hosting provider, they are bound by law to scan for that kind of material.
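For illustration only, here is a minimal Swift sketch of where such a check could sit in an upload pipeline: a hash of the photo is compared against a local list of known-bad hashes before the image is encrypted and sent. This is an assumption-laden simplification, not Apple's actual design; SHA-256 is an exact hash, whereas real systems such as PhotoDNA or Apple's NeuralHash use perceptual hashes that tolerate resizing and re-encoding, and the function name and blocklist here are hypothetical.

import CryptoKit
import Foundation

// Hypothetical sketch: decide whether a photo should be flagged before the
// client-side-encrypted upload, by matching its hash against a local set of
// known-bad hashes. SHA-256 stands in for a perceptual hash purely to show
// where the check would live in the pipeline.
func shouldFlagBeforeUpload(imageData: Data, knownBadHashes: Set<Data>) -> Bool {
    let digest = SHA256.hash(data: imageData)     // exact hash; real systems use perceptual hashes
    return knownBadHashes.contains(Data(digest))  // match against the on-device blocklist
}

// Example usage (made-up names):
// let flagged = shouldFlagBeforeUpload(imageData: photoBytes, knownBadHashes: blocklist)
// if !flagged { /* encrypt and upload as usual */ }

The point of the sketch is only that the check happens on the device, before encryption, which is the tension the parent comment describes.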
Re: (Score:2)
I just came to see how Apple haters were going to spin this story. Apple cancels something everybody hated. How can that be bad?
I use Apple products and am very happy with them, but I am 100% against this CSAM thing.
Also, it is important to note, as many have said, that Apple no longer mentioning it doesn't mean they have cancelled the project.
I've said it before: what's worse than a Big Brother government is lots of Big Brother private companies spying on you. Apple protecting our data is good. Apple trying to do law enforcement is bad, really bad, and even worse when it is done by molesting our data. Apple's business is building good phones.
Could have been reasonable (Score:1)
Now add an opt-in button that lets you opt in or opt out but doesn't show which one you picked, and it would all be fine. But what they
I would be very surprised... (Score:2)
... if it's actually gone away. Just the reference has. And it's nothing new: back in the days when people used film and had it developed and printed through the drugstore, the processors always scanned the prints for anything that might today be called "child porn" and reported it to the police. IIRC they were required to do that in some jurisdictions. Apple may have added features, but this sounds like a modern version of the film-processor thing. I would be surprised if it's gone - the references, maybe, but not the scanning itself.
Re: (Score:2)
Further, I
Apple hate. (Score:2)
Lots of other companies are already doing CSAM searches of everything you post or share. Facebook and Google, for starters.
Apple announced they were going to do it. Their customers got upset, so they are backing down.
Why isn't Google getting hate for this too? Is it because Apple is the "privacy" focused company?
Sure, it's sketchy. But it was also, basically, opt-in: you have to upload your photos to iCloud.
And it really seems like they are trying to get out ahead of being regulated into, or forced to add, backdoors.
Re: (Score:1)
No wonder Apple gets "hate" when its defenders lie and say "It's just what Google (etc.) do".
Once again, this scanning was to take place *locally*, *on your device*. Only rabid Apple fanbois conflate this with scanning *on a cloud service*, *on someone else's server*, *after you uploaded it*, which is what everyone else does. This is crossing a big red line in the sand.
But you know that, don't you? You're just fudding to protect your precious lover, Apple. No matter how many times the clear difference between the two is pointed out.
Re: (Score:1)
No, the CSAM scanning was in the cloud, after uploading to iCloud.
The on-device stuff detects whether sexually explicit material is being shared via Messages, for users under 13, and only if the parents turn it on.
Two different issues.
Re: (Score:2)
"Post/share" is public. What's on your device is private.
Pants-on-the-head retarded fanboy: "These are the same things. There's no difference."