Policy Groups Ask Apple To Drop Plans To Inspect iMessages, Scan for Abuse Images (reuters.com) 89
More than 90 policy and rights groups around the world published an open letter on Thursday urging Apple to abandon plans for scanning children's messages for nudity and the phones of adults for images of child sex abuse. From a report: "Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children," the groups wrote in the letter, which was first reported by Reuters. The largest campaign to date over an encryption issue at a single company was organized by the U.S.-based nonprofit Center for Democracy & Technology (CDT). Some overseas signatories in particular are worried about the impact of the changes in nations with different legal systems, including some already hosting heated fights over encryption and privacy.
Re: Bad Move (Score:5, Insightful)
Re: (Score:2)
In China the data is already available to the government. They don't need this system to inspect citizens' data; Apple is already forced to store cloud data in such a way that the government can already get at it.
Any country that can compel Apple to add data to the CSAM database or whatever already has the power to compel user data out of Apple today.
China is not the issue here; that bird has flown the coop. The issue is basically only Western governments abusing their power. That's a legit concern. China a
Re: (Score:3)
In other words, "Won't someone think of the children?" The tired trope that is always brought out when someone wants to threaten privacy or personal freedoms.
Re: (Score:2)
Yeah. I remember when they thought of the children back in the '80s. Look at the world now.
Re: (Score:2)
In other words, "Won't someone think of the children?" The tired trope that is always brought out when someone wants to threaten privacy or personal freedoms.
What's astonishing is that there are 90 activist groups willing to publicly buck the "think of the children" rhetoric. "Think of the children" used to be bulletproof. Daring to contradict it meant being ostracized and attacked. For policy and rights groups to defy it was unthinkable. Historically it meant your lobbyist would never be able to get a meeting with a congressional staffer again (or parliamentary staffer, as the case may be).
If Apple doesn't back down, those 90 groups may find themselves in t
Re: (Score:2)
Pretty sure the Android ecosystem is no better. It would seem to me the horse has already left the barn.
You want privacy? Stop owning a smartphone, or perhaps get one of those Linux phones, but even that will let the carrier track you (how else is the phone supposed to work, after all?).
Maybe a prepaid phone that you turn on for emergencies only. I like that idea, but it's not really practical and honestly, no one cares where I'm going (but that doesn't make it okay for them to collect and store that data).
Re: (Score:1)
Android is Linux, and as such, if you're not locked out of it by the phone manufacturer, you can check on these things yourself; but based on your flippant 'pretty sure' I'm guessing you cannot/will not.
Yes, everyone is still loosely tracked by the carriers. That's not what anyone is talking about. They've already obviously accepted that. What people have a problem with is that the data, your pers
Re: (Score:1)
Thinking about this from the perspective of "what do you have to hide?" is wrong-headed.
Power like this must be looked at with a healthy dose of history and what-if scenarios.
Have governments in the past violated people's rights based on speech or behavior that while not immoral/harmful to others was contrary to what the government wanted people to say/think/do? Yes.
Have governments in the past violated people's rights based on looks or ethnicity? Yes.
Would those past governments have subverted this tool an
Re: (Score:3)
Exactly. People are ASS-U-MEing that the United States will remain a democracy forevermore. Except governments don't live forever, and it will really suck if, shortly after this is deployed, we find ourselves under a dictator we must always refer to as "glorious" while gigantic portraits of him adorn the buildings of every US city.
The US Capitol siege that took place in January should've served as a reminder that this possibility always exists, and to not give any potential future dictato
Re: (Score:1)
US stopped being a democracy a LONG time ago, in a very non-partisan way. It has been a plutocracy, I would argue, since roughly 1963.
Re: Bad Move (Score:2)
As bad as things are now in the US, history has proven over and over again that they can get a whole lot worse for a country.
"It will never happen here"- seldom have more foolish words been spoken.
Re: (Score:1)
Yep, shit's gonna get a whole lot worse before anything really changes. See: England in 1831. The US is headed on the same course.
Re: Bad Move (Score:2)
Re: (Score:2)
Then you have no problem with the police doing random searches of your home. You have nothing to hide, right?
Re:Bad Move (Score:4, Insightful)
Can I go to your house right the fuck now and peruse your magazines, books, records, movies, and every square inch of your house for something someone decided is bad? What are you hiding in your house that you don't want me in it? Huh? HUH?
STFU. You sound like the kind of politician that should die the fuck off right this second.
In other words, for people like you: Today, kiddy porn (supposedly.) Tomorrow, things the Government doesn't like, things the Woke doesn't like, things your school doesn't like, things that someone, somewhere, arbitrarily and without much thought decided is bad.
Like questioning the government, or criticizing it, or asking that it be accountable for the gob-smacking stupid shit they're pulling lately.
Good enough for you, or are you gonna cling to your 'ThInK oF tEh KiDz0Rz' life ring? You know, if you find yourself thinking of the kiddies all the time... you know what? Gimmie your phone, I wanna take a look at it now.
Re: (Score:2)
Pretty sure that if you live in an apartment, you signed an agreement to allow your landlord to do just that.
Same deal with the iPhone: you agreed to terms that allow Apple to do whatever they want with their OS.
Re: (Score:2)
Pretty sure that if you live in an apartment, you signed an agreement to allow your landlord to do just that.
Not sure what fucked-up world you live in, but in my country a landlord has zero right to enter a tenant's apartment uninvited.
Re: (Score:1)
Re: (Score:2)
And I'm willing to bet you that in no jurisdiction do they have the right to rummage through your things. In other news, Apple can force a security update download onto your iPhone. That's not the same as going through your dick pics.
Re: (Score:2)
They have to give notice. They can't just go in without saying anything.
Re: (Score:2)
Today, kiddy porn (supposedly.)
What do you mean, kiddy porn? The government has already prosecuted people for watching porn that didn't include kids simply because the actress *looked* young and had a flat chest.
Re: (Score:2)
This is still not how the system works. You have to be moving a photo through iCloud Photo Library, whether downloading it or uploading it.
This is important, because it means you can turn it off and not have your photos scanned.
Moreover, as far as I know, every major photo service is doing scans like this. Facebook scans all the photos that are uploaded to it. I'm pretty sure Google does as well. What seems to be driving people insane is that the hashing is done on-device, even though that's honestly like t
Re: (Score:2)
I hate to say something in any corporation's defense, but to be fair: if you ran a service that stored online content, wouldn't you want to make sure nothing illegal was being stored on your property? I kind of think you would. I know I would; otherwise I could possibly be charged as an accessory to their crime. No bueno.
Re: (Score:2)
Why risk the personal safety of children if already implemented?
In other words, What do you have to hide?
If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him.
- Cardinal Richelieu
Re: (Score:2)
And
Re: Bad Move (Score:2)
Irrelevant now (Score:4, Insightful)
It doesn't matter if Apple backs away from this, at this point.
They have publicly announced that they can do this. Therefore, governments that want to use it to control politically inconvenient subjects (looking at you, China) will mandate that they do so, and probably mandate that they do so secretly.
So, in the end, as one would expect from the Law of Unintended Consequences, the protests will result in all of the bad things that can be done with this being done, and secretly, and none of the good that is intended. Caused by the privacy advocates.
Re: (Score:2)
We know that the basic tools for doing the hash are in iOS starting at least as far back as 14.3, per other stories. Is that code accessible to non-Apple apps?
If so, the problem already exists in the wild, and may exist already in the App Store.
Re: (Score:2)
none of the good that is intended
There never was any "good" intended. This was a (possibly misconceived) PR move by Apple to shed the image of its encryption as a safe haven for terrorists and pedophiles. Problem is, if your trust chain is broken so you can catch the bad guys, it's also broken for everybody else.
Re: (Score:2)
China didn't need a public announcement to know Apple can do this. They didn't need protests to ask them to do it secretly. Why all the mental gymnastics to shift the blame to those darned privacy advocates? I suppose it's someone else's fault Apple decided to manufacture their stuff there, too.
Re: (Score:2)
They have publicly announced that they can do this.
Meh.
It was always obvious that they could do this, and a lot more. They can give Xi Jinping an account which allows him to look at all photos in all iCloud accounts if they want to. It wouldn't shock me if there's some bit of their privacy policy statement that can even be interpreted to allow that... creative interpretation of language can take you a long way.
The fact that they've announced a willingness to do this more privacy-preserving sort of scanning (even if it's maybe not as privacy-preserving a
Re: (Score:1)
"as one would expect from the Law of Unintended Consequences, the protests will result in all of the bad things that can be done with this being done,"
I agree and one of the bad things that will be done is for this to be weaponized.
Whatever threshold Apple sets, 30, 40, whatever, someone will figure out how to upload that many illegal pictures to your phone. Apple will find them and report them. I don't see any way Apple can avoid this short of a major policy change, such as warning the phone owner that t
Re: (Score:2)
They don't trust Apple because it removed Parler, and they believe the elites are untouchable, so I doubt they would support this.
Who cares (Score:2)
Just don't use iMessage. There are dozens of better alternatives.
Re:We should care (Score:3)
Re: (Score:2)
If mail is encrypted with the key on the user's phone or computer, then Apple shouldn't be able to do this. End-to-end encryption, both in transit and at rest, for email, chat, video, etc. should be the default. The fact that they don't do that tells me not to use their service.
I don't think this works the way you think it works. Apple isn't sending your mail and attachments in plaintext to be scanned on their servers... The scanning happens on your device, comparing a hash of an image on your device to a library of hashes of child porn. Non-encrypted data never leaves your phone, and there's end-to-end encryption for just about everything [apple.com].
There may be reasons to criticize this technology and policy, but "Apple's sending my mail in plaintext!" isn't one of them.
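For anyone unclear on the mechanics, here is a minimal sketch of the upload-gated, on-device matching described above. Everything in it is illustrative: the function and variable names are invented, and SHA-256 stands in for Apple's NeuralHash, which is a perceptual hash rather than a cryptographic one.

```python
# Illustrative sketch only: all names are invented, and SHA-256 stands in
# for a perceptual hash such as Apple's NeuralHash. Not Apple's code.
import hashlib

KNOWN_BAD_HASHES = {"placeholder_digest_1", "placeholder_digest_2"}

def perceptual_hash(image_bytes: bytes) -> str:
    # A real perceptual hash also matches re-encoded or resized copies;
    # a cryptographic hash like this one only matches identical bytes.
    return hashlib.sha256(image_bytes).hexdigest()

def flags_on_upload(image_bytes: bytes, icloud_photos_enabled: bool) -> bool:
    # The check runs only on the iCloud Photos upload path; with the
    # feature off, nothing is hashed at all.
    if not icloud_photos_enabled:
        return False
    return perceptual_hash(image_bytes) in KNOWN_BAD_HASHES

print(flags_on_upload(b"photo bytes", icloud_photos_enabled=False))  # False
```

The property the parent comment is pointing at is the gate: if iCloud Photos is off, nothing gets hashed.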
Re: (Score:3)
The way they announced it, they were not giving you any choice in the matter.
This has nothing to do with iMessage.
Re: (Score:2)
Of course the best advice is simply not to use Apple products, but you are not forced to upload your pictures to iCloud, are you?
Re: (Score:2)
Oh? And whose product *would* you use?
Notice how all the pushback and criticism of Apple here is from privacy advocacy and civil rights organizations. Not one government that I've heard of, at any level, has stood up for its citizens and told Apple: "No. You will NOT violate our citizens' privacy by spying on their cameras."
Moreover; if Google or Facebook or Microsoft or Amazon or whoever could honestly say... or were confident enough in their implementation that they could lie and be confident about not ge
Re: (Score:2)
it will be scanned by Apple’s servers as it gets transferred in to your iCloud account
No. It will be scanned by the user’s device which will then generate a key that will either be able to be used with other keys to decrypt the content if there are enough matches, or that does nothing because it isn’t a match. That way they can still say that your pictures are encrypted before being uploaded to Apple’s servers and advertise how (supposedly) great their privacy is compared to the competition.
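The threshold behaviour being described can be illustrated with plain Shamir secret sharing, sketched below. This is an analogy, not Apple's actual construction (Apple describes a threshold private set intersection scheme); PRIME, THRESHOLD, and the function names are all invented for the example.

```python
# Analogy via Shamir secret sharing: each matching image contributes one
# share; the secret (say, a decryption key) is only recoverable once
# THRESHOLD shares exist. Invented names; not Apple's construction.
import random

PRIME = 2**127 - 1   # prime field for the polynomial arithmetic
THRESHOLD = 30       # shares needed to reconstruct the secret

def make_shares(secret: int, n: int):
    # Random polynomial of degree THRESHOLD-1 with the secret at x=0.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(THRESHOLD - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    # Lagrange interpolation at x=0 recovers the secret.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=123456789, n=100)
assert reconstruct(shares[:THRESHOLD]) == 123456789  # enough matches: key out
```

With fewer than THRESHOLD shares the interpolation yields garbage, which is how "nothing decryptable below the threshold" can be a mathematical property rather than just a policy promise.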
Re: (Score:2)
Apple has been HORRIBLE at explaining this new 'feature' (actually two features). Your description of this merges the two functions into one not-quite-correct function (don't worry, a lot of people have done this).
Feature #1: Scan iCloud for hashes that match known bad images (what is a 'bad' image? And who decides?). This one is easy to circumvent by not using iCloud (which I never have and never will). THIS function has nothing to do with iMessage.
Feature #2: When an image is sent via iMessage, yo
Re: Who cares (Score:2)
Re: (Score:2)
new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos.
So, they are comparing image hashes of images stored in iCloud, not your phone, right? Except:
Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.
So, they are scanning on your phone, and NOT in iCloud? Have I mentioned that Apple is awful at describing how they are going to subvert our privacy?
Re: (Score:1)
Just don't use iMessage. There are dozens of better alternatives.
Problem is, all messaging platforms besides SMS require that both communicating parties be on the same platform.
Re: (Score:2)
SMS has the same requirement.
You need a phone number from a mobile phone carrier, and it doesn't work on PCs and tablets without cellular connectivity.
Although there are some workarounds such as Google Voice, it's still far from ideal. You can lose your number when you change country (or even worse, carrier) and messages sent to another country are not always free. The simple fact that the carrier has the ability to bill per message means SMS is crap.
Re: (Score:2)
SMS works fine with my VOIP providers. No cellular connectivity needed at all to work on PC and tablets to receive/send from/to anybody.
I swear... (Score:2)
...they haven't got a clue what actually is going to happen. If you don't want your images scanned (not your phone!), quit storing your porn in iCloud. Only an idiot would do that anyway.
Re: (Score:1)
Most people are idiots. Or at least, in cases like this, they behave like idiots. If you need a citation, just read some more comments here. Or log into whatever social media and read the comments. It shouldn't take long to convince you how high the percentage of idiots vs. non-idiots is.
Re: I swear... (Score:2)
Re: (Score:1)
You need to do some research. Apple is NOT scanning your phone. It will only scan images uploaded to iCloud. Read much? Sheesh. No wonder the Trump cult is so popular.
Re: (Score:2)
Maybe I can help your stupid ass learn to RTFA. You literally need an update on your phone to receive this "feature", which completely eviscerates your bullshit:
https://www.google.com/amp/s/a... [google.com]
"Cloud-based photo storage systems and social networking sites already scan for child abuse imagery, but that process becomes more complex when trying to access data stored on a personal device.
Apple's system is less invasive in that the screening is done on the phone, and "only if there is a match is
Re: (Score:2)
No, I was saying most people are idiotic enough to store their porn in iCloud. It was a direct response to the statement made above.
I'm on the side of thinking Apple is making a really atrociously dumb move here. This will be abused, 100%. And if people keep themselves informed, Apple will lose customers over it.
Re: I swear... (Score:2)
Apple's motivation (Score:5, Interesting)
I have a hunch about Apple's motivation, but Apple would never disclose this publicly. I think that government / law enforcement keeps pressuring Apple on moral grounds that their messaging and devices are too secure. Apple won't give the FBI access for this and that, and that could be used against Apple in some moral sense, in that they are protecting the worst of the worst (child pornographers). Yes, privacy is also a moral high ground, but the flip side of that is the ability to investigate and prosecute criminals to at least some extent.
Thus, by coming up with some (theoretically) privacy-safe method of patrolling for child pornography, Apple could get at least one government vector of leverage off of them. All of these big companies (Facebook, Google, Apple, Microsoft) would much rather self-police than have government intrusion or laws that stifle them in some way. So if they see that government action is on the horizon, they'll try to stay a step ahead and use technology to get that focus off of them.
Re: (Score:2)
This is exactly it. I was incorrectly linking to the EARN IT Act of 2020 [wikipedia.org] last week as "proof" that Apple and others are legally required to do this sort of work, but that act hasn't yet been passed (of which I had failed to take note). Even so, the industry surely sees the legislation working its way through the system, so they know which way the winds are blowing. Putting practices like these in place is a sure way to take the wind out of Washington's sails and ensure that Congress is less likely to bring
Re: (Score:1)
National Security Letters are secret. We have no idea of the pressure being applied.
Also, caution: whatever Apple can read, they can also write. It is not an uncommon thing to plant evidence [reason.com] when they can't find any
Re: (Score:2)
Don't share any of your hunches. First, this has nothing to do with "moral grounds", either government "moral grounds" or Apple "moral grounds" as if either of those exist. Second, this has nothing to do with Apple devices being "too secure". Instead, it has to do with Apple dealing with government pressure to intrude on people's privacy. Citizens have protections from intrusion from the government so government has private business do it for them. If Apple doesn't play ball, government makes doing bus
Re: (Score:2)
Don't share any of your hunches. First, ...
On the contrary, Dan East, please do continue to share your hunches. I thought your hunch was interesting and I learned from it (even if I don't agree with it). While dfghjk made valid points and valid nits, they didn't touch the central plank of your hunch which is that this is a way for Apple to reduce government pressure - through the gamble that they can offer up this right now, and presumably get reduced (presumably US) govt pressure for a while until the US govt turns more authoritarian. The reason I
Re: (Score:2)
Don't share any of your hunches.
Yet you go right ahead and feel free to provide your opinion regarding my opinion.
I don't appreciate or respect your "worst of the worst" characterization
And I don't care whatsoever about your feelings regarding my post, even if you sobbed a little because of it.
Re: (Score:2)
It is pretty clear Apple is bowing to pressure. That is the only plausible reason why they are not backing down after this massive backlash. Hence they have already been compromised and extending the search parameters is just a question of time. This also means their assurances they will keep this limited are completely worthless.
stop buying Apple products. (Score:2)
Re: (Score:1)
It's not all snob appeal, but it might be worthy of consideration. The problem with that "solution" is that there is no reason to believe that other options do not do the same sorts of things. Regarding cell phones, Google has been doing this kind of crap already.
We once had some reason to believe that Apple championed its customers' privacy. That is no longer the case, but it doesn't mean anyone else does.
Re:(don’t) stop buying Apple products. (Score:1)
They’re still better about it than most. They don’t sell their customers, unlike Google, FB, Amazon, etc but your data is still available by warrant. The hash/match scheme they’ve introduced has (minor, probably ineffective) safeguards, and that concerns me. But then again, if you think that Amazon photos aren’t analyzed for ad targeting and worse I’ve got a bridge to sell you (see section 1.1 of the Amazon Photos TOS).
Apple is taking a lot of heat for this move, but people don
Add safeguards to prevent abuse (Score:2)
Re: (Score:2)
How do you, as a member of the public, know what the original image is for a particular hash to know whether said hash is worthy of being added or removed?
What's to stop some nation state to compel Apple to add new hashes under a gag order under threat of being expelled from said nation state? Even if the hash list were public, what's to prevent Apple f
Re: (Score:1)
Apple doesn’t control the known CSAM hashes; it’s all done by 3rd parties, which itself introduces vulnerabilities.
It’s unclear how many matches trigger account review. Apple says that it’s a “threshold,” but that can mean anything. At least the nature of the hashing algorithm makes accidental collisions extremely rare (adversarial images are another issue, but those are more of a curiosity than a threat). Reporting is only done after human review, although what the review co
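As a back-of-envelope check on that "extremely rare" claim, assume some per-image, per-database-entry false-positive rate. All three numbers below are assumptions for illustration, not Apple's published figures.

```python
# Expected accidental matches in one photo library, under assumed numbers.
p_per_pair = 1e-12      # assumed chance one photo collides with one DB hash
library_size = 20_000   # photos in a fairly large personal library
db_size = 1_000_000     # assumed hash database size

expected_matches = p_per_pair * library_size * db_size
print(f"expected accidental matches: {expected_matches:.3f}")  # 0.020
```

Even with generous assumptions, the expected count sits far below any plausible multi-image threshold; deliberately crafted adversarial collisions are, as noted above, a separate problem.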
Re: (Score:2)
They cannot. Obviously they are implementing this thing only because they could not withstand government pressure. Why, then, would they be able to withstand pressure to drop whatever safeguards against abuse they put in? They are compromised. They just try to keep a semblance of their honor intact for marketing reasons.
Let's get real... (Score:2)
Re: (Score:1)
Bingo. At least Apple has a published process. Amazon, Google, FB, etc? Not so much. Their TOS do not limit what they can do with your data at all, and they get away with it because they’re opaque.
Single biggest mistake... (Score:1)
This is the single biggest mistake Apple has made in the history of the company. Yes, even bigger than hiring John Sculley.
Apple has spent years cultivating a privacy first image, and they threw it all away in one fell swoop.
I'm having a hard time understanding the rationale behind this decision. It seems to go directly against everything they have been working towards.
Scanning for nudity? (Score:1)
Oh dear, does the writer think they're looking for flesh tones? As opposed to looking for files that have the same hash as known bad content?
Re: (Score:2)
A bit more sophisticated than looking for flesh tones, but very different from looking for files which have the same hash as known bad content.
The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.
When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.
Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.
Source: Apple [apple.com].
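Mechanically, the flow Apple describes amounts to something like the sketch below. The classifier is an unpublished on-device ML model, so looks_sexually_explicit and the other names here are invented stand-ins.

```python
# Sketch of the client-side Messages flow quoted above. Invented names;
# the real classifier is an unpublished on-device model.
from dataclasses import dataclass

@dataclass
class IncomingPhoto:
    image_bytes: bytes
    recipient_is_child: bool
    parental_alerts_enabled: bool

def looks_sexually_explicit(image_bytes: bytes) -> bool:
    # Stand-in for on-device machine learning; nothing is sent to Apple.
    return False  # placeholder verdict

def handle_incoming(photo: IncomingPhoto) -> str:
    if photo.recipient_is_child and looks_sexually_explicit(photo.image_bytes):
        # Blur the preview and warn the child; a parent notification (if
        # enabled) is only sent if the child chooses to view it anyway.
        return "blurred_with_warning"
    return "shown_normally"

print(handle_incoming(IncomingPhoto(b"...", True, True)))  # shown_normally
```

Per Apple's description above, the decision is made entirely on-device and Apple does not get access to the messages.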
Expanding the scope of technology (Score:3)
Like many people, I am very much in the dislike camp. In my case, I have recently acquired a Samsung device (you can take me at my word, or don't..) and am in the process of learning how to use it and will in due course start backing up folders and files directly to my NAS.
With that said, I am curious as to why Apple is so insistent that this won't and can't be expanded to other use cases, as well as why those in favour of the concept aren't in favour of expansion.
Let's start by saying that child sex abuse is a heinous crime. I am in favour of criminalising those who abuse children and subjecting them to the full force of the law. However, child sex abuse is not the only crime that causes harm, and detecting images of such acts after they have taken place only serves to criminalise the viewers; if the images exist then the child has already been abused.
I'd like to explore the ways in which the two technologies which Apple is deploying could be used to great benefit. I may well stray into the use of technologies which aren't in scope of this initiative, but which do or will exist in the near future.
Revenge Porn
The act of sharing nudes or explicit videos after the end of a relationship affects both children and adults. There are also mobile apps such as Snapchat where people share suggestive or nude pictures with the expectation that these will 'self destruct'.
With the neural hashing technology, we could see the process become more secure and eliminate cases of revenge porn. When a relationship ends or when a self-destructing message is shared, a hash of the 'cancelled' images could be added to iOS devices worldwide, preventing the sharing or viewing of these private images.
The same principle could be used for images which present a national security concern. The exact definition would vary for each country, but expected results could well include photos of prisoners in orange jump suits, videos of bombs hitting hospitals or even photographs of tanks parading in city squares.
Missing Children
Child abduction is not a new thing. We have seen photos on milk cartons, AMBER alerts and other such initiatives which have varying rates of success.
Apple says that there are one billion iPhones in use worldwide, so let's do a modern take on what the search for missing children could look like.
We know that the technology to scan for particular faces exists in iOS because the photo gallery helpfully categorises all faces of a person together. We also know that iMessages will acquire this new feature which can scan images for particular content types.
So let's marry the two together: in the minutes after a child abduction, the parent can upload as many photos of the victim as they have. Apple will then push an alert to all the iOS devices around the world. Hopefully someone, somewhere, has taken or will take a photo where the missing child happens to be in the background, and boom: we get a time and location for the police to act.
Fugitives
The same principle as outlined for missing children could apply here, except this time with images of criminals uploaded by law enforcement.
Naturally, we would need to trust law enforcement to use this feature correctly, but if we could quickly identify their appearance in the background of images, we could get a location of any and all fugitives including those suspected or convicted of violent crimes or even those who committed a minor crime such as shoplifting or drug use.
Unlawful Protests
The concept of an unlawful protest has started to make its way into the western world. Even the UK, which purports to be a western democracy, has introduced laws curbing protests, and there is even the concept of needing to apply to the police if you wish to march.
The concept of face identification, which does exist today, could be used alongside the technology deployed to the one billion iOS devices out there in the world. Those who dar
Re: (Score:1)
Your ideas are essentially mass surveillance using all private property as government sensors. Someone taking a photo at a park will not know which faces the government is looking for, or why.
Their device will eventually just report on everything it captures, because someone will make the case that if we only start looking for criminals after their crime, we're losing valuable intelligence. Better if we have a complete profile on the whereabouts and activities of everyone all the time so when they do someth
Re: (Score:2)
This is precisely part of the point I was trying to get across... I did mention at the top of my post that I am against any of this and have actually moved away from Apple, and I do talk about the slippery slope argument towards the end. I apologise if I wasn't clear enough but the purpose was to try and demonstrate the ease of moving from acceptance of (a), to accepting (b) and eventually realising that we have reached (z).
I'm no marketing expert, but even with the way I described my ideas, I suspect a lar
Re: (Score:1)
I read your post again. Here are the two parts that I think caused me to ignore the disclaimer and think you were taking the opposite position:
And: