Edward Snowden and EFF Slam Apple's Plans To Scan Messages and iCloud Images (macrumors.com) 55
Apple's plans to scan users' iCloud Photos libraries against a database of child sexual abuse material (CSAM) to look for matches, and to scan children's messages for explicit content, have come under fire from privacy whistleblower Edward Snowden and the Electronic Frontier Foundation (EFF). MacRumors reports: In a series of tweets, the prominent privacy campaigner and whistleblower Edward Snowden highlighted concerns that Apple is rolling out a form of "mass surveillance to the entire world" and setting a precedent that could allow the company to scan for any other arbitrary content in the future. Snowden also noted that Apple has historically been an industry leader in terms of digital privacy, and even refused to unlock an iPhone owned by Syed Farook, one of the shooters in the December 2015 attacks in San Bernardino, California, despite being ordered to do so by the FBI and a federal judge. Apple opposed the order, noting that it would set a "dangerous precedent."
The EFF, an eminent international non-profit digital rights group, has issued an extensive condemnation of Apple's move to scan users' iCloud libraries and messages, saying that it is extremely "disappointed" that a "champion of end-to-end encryption" is undertaking a "shocking about-face for users who have relied on the company's leadership in privacy and security." The EFF highlighted how various governments around the world have passed laws that demand surveillance and censorship of content on various platforms, including messaging apps, and that Apple's move to scan messages and iCloud Photos could be legally required to encompass additional materials or easily be widened. "Make no mistake: this is a decrease in privacy for all 'iCloud Photos' users, not an improvement," the EFF cautioned.
please sign the petition (Score:1)
the more people who sign this the better:
https://www.change.org/p/apple... [change.org]
-dave
If only it were used for good purposes... (Score:5, Insightful)
To illustrate the issue I have, consider the following.
There are loaded assault rifles at every street corner. They are _intended_ only to be used in self defence, or to prevent criminal activity. There are clear signs above them informing people that they are not to be used for any other purpose. But there are no actual barriers preventing someone from picking one up and using it to rob the local Kwik-e-mart.
Do you think the trust placed in the public would ever be abused?
That is the problem here: once privacy is invaded, no matter how good the initial intentions, it will eventually be abused by those in a position to do so.
Re: (Score:2)
Re:If only it were used for good purposes... (Score:4, Insightful)
That is why this is important. Communicate it to Apple. http://chng.it/Jq9xLmvmsz [chng.it]
No one seemed to complain when it was regularly used on Google Drive. Must be different.
Re: (Score:2)
I think the key difference is that it is you uploading data to Google's servers. I think the owner of a server checking what is being uploaded (via hash) is a lot more reasonable than a company scanning users' devices. I think most people would be fine with this if it was only iCloud content. I am not taking a side here, I just think that is an important distinction.
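The server-side model described above (the provider hashing uploads and checking them against a known-bad list) can be sketched in a few lines. This is a deliberately minimal illustration, not Apple's NeuralHash or Microsoft's PhotoDNA pipeline; the function name `flag_upload` and the placeholder digest set are invented for this example, and real systems use perceptual rather than cryptographic hashes.

```python
import hashlib

# Hypothetical sketch of server-side hash matching: the provider
# keeps a set of digests of known-bad files and checks each
# upload's digest for membership. The entries below are
# placeholders standing in for a real database.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-bad-example").hexdigest(),
}

def flag_upload(file_bytes: bytes) -> bool:
    """Return True if the uploaded bytes match a known-bad digest."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES

print(flag_upload(b"known-bad-example"))  # exact copy matches
print(flag_upload(b"innocent-photo"))     # anything else does not
```

The point of the distinction in the comment above is *where* this code runs: on the provider's servers against data you chose to upload, versus on your own device.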
Re: (Score:2, Informative)
I think the key difference is that it is you uploading data to Google's servers. I think the owner of a server checking what is being uploaded (via hash) is a lot more reasonable than a company scanning users' devices. I think most people would be fine with this if it was only iCloud content. I am not taking a side here, I just think that is an important distinction.
Except that the pictures that get scanned are pictures that a person who uses iCloud Photos is planning to upload to iCloud.
https://apple.slashdot.org/sto... [slashdot.org]
Seriously, they don't scan your iCloud images if you don't have iCloud Photos turned on. https://www.imore.com/psa-appl... [imore.com]
I'm 100 percent certain that almost everyone here spreading the FUD is simply someone who hates Apple. So that's how they justify and approve of Google Drive doing the same thing. Yet they go Reee! if the dreaded criminals fro
Re: (Score:2)
Funny, the security analysis document I read from Apple focuses on images being sent by iMessage to minors in a family sharing plan. I take it iCloud is actually being used to store those images in the background. It seems to be aimed at protecting minors from being traumatized by predators sending them known CSAM pictures, but such predators could just take pictures of themselves. Unless the scanning is expanded to recognize genitals in general, it appears to be of limited use, but it is a simple configuration change away from scanning whatever, like political party logos. I think focusing on iCloud is not the point. It can be expanded through pressure from governments in the future; that is what people are worried about.
How big a leap is it between getting hash values and actually looking at the pictures?
Re: (Score:1)
If I pay someone for a product but didn't build it myself, I guess I don't own it.
If I pay someone for copper but didn't mine it myself, I guess I don't own the phone I build with it.
If I pay someone for a shovel but didn't craft it myself, I guess I don't own the copper I dig up with it.
I'm aware that the status quo is currently luxuriating in "corporation/apple still owns the shit you bought lolololol" but you're saying they're right to.
They're not. The idea is wrong and so is anyone protecting it. Smelting copper by hand
Re: (Score:2)
Exactly. Nice illustrative example!
Re: (Score:2, Interesting)
To illustrate the issue I have, consider the following.
There are loaded assault rifles at every street corner.
I'm pro-privacy, and pro gun control. But that is possibly the worst analogy ever on slashdot that did not include a motor car.
https://yourlogicalfallacyis.c... [yourlogicalfallacyis.com]
So I guess you want a complete ban on search warrants or police power of arrest?
Let's not make murder illegal, because sooner or later we will be executing people for jaywalking.
Re: (Score:3)
You having a phone with a data processing device and a photo camera is not probable cause.
The programme above is the equivalent of deciding to deal with the fact that cases like Natascha Kampusch's (which was horrible, don't get me wrong) happen by making sure cops can thoroughly search anyone's house and property whenever they want.
It's the on-device scanning that rubs me wrong (Score:1)
Apple's content detection doesn't work well as it is. If I search my photos for "candle", then among the pictures of things that are, in fact, candles, there's also a picture of my turntable playing a record. Neural networks just aren't reliable enough that false positives aren't going to happen.
I'd have no problem if this were implemented the same way other services do it, where they scan their own servers for potentially illegal material. But having a phone scan itself and narc on you? That's insanely draconian and
Re: (Score:2)
Apple's content detection doesn't work well as it is.
No no no, you don't get it. This will be "AI" powered, and AI is amazing and can AI the AI AI so we can all AI.
It's completely different from the rest of their systems. Trust us.
Re: (Score:2)
Ay ay ay AI AI AI...
The EFF reputation... (Score:1)
Putting Edward Snowden in the same sentence as an organization is probably not good for that org's reputation.
But let's try it out, "Edward Snowden and Microsoft both announce new major leak." Nope. That did nothing to improve the reputation of either one and probably harmed the latter far more than the former.
Re: (Score:2)
The EFF started out defending hackers in court. Being associated with people like Snowden is who they are. If you don't like that, you probably don't like the EFF.
Re: (Score:2)
Ahhh I see you're one of the people who judges words based on who says them rather than their content, applicability and whether or not they make sense.
Re: (Score:1)
When tRump made claims, his 30,000+ outright lies bore heavily on the reliability of the claims.
Snowden has no such burden
Re: (Score:2)
It only matters when the only source of a claim is the person themselves.
An appeal to authority is as much of a logical fallacy as an ad hominem. The two go hand in hand. No one here is suggesting people blindly trust, but rather examine the content and analyse the point being made.
Re: (Score:1)
Argumentum ad magisterium is only a fallacy when the authority is in some way compromised.
What you're thinking of is the famous argumentum ad verecundiam, argument from INCOMPETENT authority, but supposedly you know the difference.
VRWC, incompetent.
CNN, not so much, subject to failures but takes every reasonable precaution.
Newsmax? Incompetent to tell day from night
PBS? Not so much
Re: (Score:2)
Re: (Score:2, Offtopic)
I grew tired of his opinions years ago.
Gasp!! Snowden is a God here on Slashdot, how dare you? Read that in your best Greta Thunberg voice.
Re: (Score:2)
Re: (Score:2)
Snowden would be a free man in the US right now if he had turned himself in like Chelsea Manning did. But instead he ran away to Russia.
Could be. I hope he's enjoying the Russian winters.
Re: (Score:1)
Snowden exposed ALL U.S. ILLEGAL INTELLIGENCE. He would be in a parking lot support column if he had stayed
Re: (Score:2)
Re: (Score:1)
Like I said, Manning was a minor embarrassment.
Snowden exposed the entire network of illegal spying on Citizens communications
If he comes back, he's dead.
If he goes to trial, it will be in front of a Military Tribunal, probably at Camp X-ray.
They will kill him.
Re: (Score:1)
I grew tired of his opinions years ago.
Are you drunk?
The road to hell is paved with "for the Children!" (Score:3)
I mean really. If you can't convince people to do something to protect adults, you use the magic words "for the Children!" and suddenly perfectly sane people give up all their rights.
There are a lot of women who never agreed to porn, but have had naked pictures of themselves spread throughout the internet. If you are not willing to use this software for them, then you should not be willing to use it for the children.
The way to save children is not to invade the privacy of the billions of people who have phones. There are better ways to do it - and use those same ways to save the adult women who never agreed to porn too, while you are at it.
Re: The road to hell is paved with "for the Childr (Score:1)
Imagine if Apple deploys this. The doors are now ripped off the hinges, and all sorts of abuse will flood forth in an unstoppable torrent. Where Apple leads, others follow, and you won't be able to buy any new device where you are not invaded in this manner.
Governments are salivating over this, more so than the rabid Apple fanboys did right before the first iPhone came out.
The nightmare is about to begin and it's got the battering ram at the ready to break down the doors to allow the whole fucking a
Re: The road to hell is paved with "for the Child (Score:1)
Of course, Apple is standing at the gate making appeals to emotion, hoping somebody is dumb enough to open the door before it picks up that battering ram.
To make it clear "For the children" is just a pretext, a bait so all sorts of corporate and government abuses will be allowed through. Abuses that are justified with "but they are already scanning through our phones, so what difference do the (other abuses) make?".
As I said this is far less about "protecting children" and far more about introducing other,
Re: The road to hell is paved with "for the Chil (Score:2)
Hmm, it seems someone working for those who are trying to deploy this is modding down posts explaining the true motives behind what Apple is trying to do.
They don't have to worry as Apple (and every other corporation) will monster all of this through, and the public will just put up with the abuse as usual.
Fake News? (Score:1)
When this hit slashdot yesterday, the story was about how a single researcher at a lab in New York speculated that this was going to happen, without any substantive proof.
Today's news is that people are slamming Apple for something that some guy in a basement conjectured could happen?
Re: (Score:3, Informative)
You missed the story where it was confirmed by Apple: https://apple.slashdot.org/sto... [slashdot.org]
To what end? (Score:2)
So... what's the goal (of Apple) here? They detect "bad stuff" on a phone. Now what? Is a police department just supposed to take the word of Apple & arrest someone? Or maybe the PD will try to get a warrant. That will be an interesting affidavit to read "...because Apple said so..."
Assuming you could have this "evidence" admitted in court that'll be fun to watch. And, oh my, the appeals will almost write themselves.
It wasn't that long ago (Score:2)
That something like this would get a company run out on a rail. Sign a petition? How about giving Apple the big "go fuck yourself" sendoff, and holding events such as burning a pile of iShit à la the "bigger than Jesus" outrage?
Seriously, as it stands now Apple is just going to laugh and at most just introduce this garbage a little bit at a time. And when Apple does something, everybody wants to do it. Apple is still seen as the cool guy and everyone wants to be just like Apple.
We need to trounce Appl
I tried looking at the general idea. (Score:2)
The underlying issue is the question:
What, of what I know, may I keep for myself?
This, however, is impossible to answer.
Because of this, any issues as the one in this discussion, will never be solved: only a compromise may be reached, at the very best. This means that, no matter what, there will always be people dissatisfied, people who feel their rights are violated.
It starts with "Think of the children" (Score:1)
It's easy to be on board with protecting children, but it's unlikely to stop there. "Think of the children", likely will later include chosen blacklisted images from those pesky "Domestic Terrorists" and bigots. Better round up those who saved pepe memes and 4chan satire.
Re: (Score:2)
Better round up those who saved pepe memes and 4chan satire.
You assume the folks here have an issue with rounding them up.
now if they frame him with SOME CP will Russia (Score:2)
Now if they frame him with SOME planted CP, will Russia turn him over?
Mixed messages (Score:3)
And if it's found the government threatened Apple to force them into this, there's constitutional issues. Courts have ruled if the government coerces a private entity to perform a search for criminal evidence to prosecute someone, they're acting as an extension of the government and the 4th Amendment comes into play.
Re: (Score:2)
Apple is claiming a PhotoDNA-like hash only of known CSAM material. Other places, they're claiming to use their "neuralMatch AI", an ML system trained on 200k CSAM images. That's a *very* different thing.
Exactly.
So if they know the hash of an offending image, what do they need "AI" for? Also, if somebody does have such an image on iCloud, couldn't they totally trash the hash by changing one pixel?
The whole thing stinks of deception.
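The "change one pixel" question is exactly why these systems use perceptual rather than cryptographic hashes. A toy illustration (this is a simple "average hash" over a made-up pixel list, not Apple's NeuralHash or PhotoDNA): a one-pixel tweak produces a completely different SHA-256 digest, but leaves the perceptual hash unchanged.

```python
import hashlib

# Toy "average hash": each pixel becomes 1 bit depending on
# whether it is above or below the image's mean brightness.
# Real perceptual hashes work on downscaled images, but the
# robustness principle is the same.
def average_hash(pixels):
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

original = [10, 200, 30, 220, 15, 240, 25, 210]  # fake 8-pixel "image"
tweaked = list(original)
tweaked[0] += 1  # change "one pixel" slightly

sha_orig = hashlib.sha256(bytes(original)).hexdigest()
sha_tweak = hashlib.sha256(bytes(tweaked)).hexdigest()

print(sha_orig == sha_tweak)  # False: cryptographic digest is ruined
print(hamming(average_hash(original), average_hash(tweaked)))  # 0: still matches
```

So a pixel change defeats an exact-hash check but not a perceptual one; matching is typically done by thresholding the Hamming distance rather than demanding equality, which is also where the false-positive worry in this thread comes from.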
Re: (Score:1)
AI vs AI (Score:2)
CSAM censorship in the UK (Score:3)
A UK-based organisation called IWF (Internet Watch Foundation) has, for over 10 years, maintained a list of URLs which it alleges contain images of child sex abuse. The list is not public knowledge but we do know that a Wikipedia page containing an album cover as well as the entirety of the Wayback Machine have been blocked at various points.
Most of the consumer & mass-market ISPs in the UK subscribe to this list, but it is not mandatory. It was positioned as something which will help in the fight against child sex abuse and will help to prevent internet users from accessing such material.
In 2011, the MPAA said to BT (probably our biggest ISP): "nice URL blacklist you have there; please add these URLs to it". The matter went all the way to the High Court of Justice which ruled in favour of adding URLs which did not contain images of child sex abuse to the blacklist. There is now an established procedure in which other such sites can be added.
It is very easy to imagine how this ends up several years down the line. Here are a few possibilities which could conceivably trigger an alert to a user's national government.
- Chinese users with photos of Tiananmen Square on their phone.
- Middle-eastern users with photos from a pride-type event.
- American users with images of drugs.
- British users with images which suggest attendance at unlawful protests.
- Users from a country with photos suggesting opposition to the ruling President or party.
As if Apple is the only one? (Score:2)
Apple's plan to scan images loaded to the cloud is revealed.
Media loses its mind.
Meanwhile google, FB, IG and others who have been doing it all along with data that sits on their servers continue scanning away and no one cares?