'Apple's Device Surveillance Plan Is a Threat To User Privacy -- And Press Freedom' (freedom.press)
The Freedom of the Press Foundation is calling Apple's plan to scan photos on user devices to detect known child sexual abuse material (CSAM) a "dangerous precedent" that "could be misused when Apple and its partners come under outside pressure from governments or other powerful actors." They join the EFF, whistleblower Edward Snowden, and many other privacy and human rights advocates in condemning the move. Advocacy Director Parker Higgins writes: Very broadly speaking, the privacy invasions come from situations where "false positives" are generated -- that is to say, an image or a device or a user is flagged even though there are no sexual abuse images present. These kinds of false positives could happen if the matching database has been tampered with or expanded to include images that do not depict child abuse, or if an adversary could trick Apple's algorithm into erroneously matching an existing image. (Apple, for its part, has said that an accidental false positive -- where an innocent image is flagged as child abuse material for no reason -- is extremely unlikely, which is probably true.)

The false positive problem most directly touches on press freedom issues when considering that first category, with adversaries that can change the contents of the database that Apple devices are checking files against. An organization that could add leaked copies of its internal records, for example, could find devices that held that data -- including, potentially, whistleblowers and journalists who worked on a given story. This could also reveal the extent of a leak if it is not yet known. Governments that could include images critical of their policies or officials could find dissidents who are exchanging those files.
[...]
Journalists, in particular, have increasingly relied on the strong privacy protections that Apple has provided even when other large tech companies have not. Apple famously refused to redesign its software to open the phone of an alleged terrorist -- not because they wanted to shield the content on a criminal's phone, but because they worried about the precedent it would set for other people who rely on Apple's technology for protection. How is this situation any different? No backdoor for law enforcement will be safe enough to keep bad actors from continuing to push it open just a little bit further. The privacy risks from this system are too extreme to tolerate. Apple may have had noble intentions with this announced system, but good intentions are not enough to save a plan that is rotten at its core.
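The failure mode Higgins describes above -- a tampered matching database flagging whatever images its controller chooses -- can be sketched with a toy perceptual hash. This is purely illustrative: the hash function, threshold, and pixel values below are invented for the example and bear no relation to Apple's actual NeuralHash system.

```python
# Toy illustration of perceptual-hash matching (NOT Apple's NeuralHash).
# ahash: threshold each pixel of a tiny grayscale image against its mean
# brightness and pack the bits into an integer. Visually similar images
# produce similar hashes, so matching tolerates small edits.

def ahash(pixels):
    """pixels: flat list of grayscale values (e.g. an 8x8 thumbnail)."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def flagged(image_pixels, database_hashes, max_distance=4):
    """An image is 'matched' if its hash is near ANY database hash --
    so whoever controls the database controls what gets flagged."""
    h = ahash(image_pixels)
    return any(hamming(h, d) <= max_distance for d in database_hashes)

leaked_memo = [200] * 32 + [30] * 32      # stand-in for any image an adversary targets
slightly_edited = [198] * 32 + [33] * 32  # a re-encoded copy: same hash neighborhood

db = [ahash(leaked_memo)]                 # database quietly seeded with a non-CSAM image
print(flagged(slightly_edited, db))       # True: the edited copy is still flagged
```

The point of the sketch is the `flagged` function: nothing in it knows or cares what the database images depict, which is exactly the property the article is worried about.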
Too late (Score:5, Insightful)
The software exists, Apple can do it, the cat's out of the bag, so forget it. It WILL be used. Whether it's Apple or a political regime, it will surface and get used, if not in use already. The genie's out of the bottle and all that.
Re: (Score:3, Interesting)
Re: (Score:2)
I wouldn't want foreign companies flouting laws we hav
Re: (Score:2, Interesting)
You mean like when Canadians smuggled black slaves out of America?
Re: (Score:3)
Remind me again, for the sake of relevance...
Was this last week or the week before?
Re:Too late (Score:5, Insightful)
You mean like when Canadians smuggled black slaves out of America?
One of the rules of Wokeness is that one be more concerned about slavery in the US 150 years ago than slavery practiced in some cultures right now.
Re: (Score:3)
Nah, you Americans just moved it from cotton fields to prisons.
Re: (Score:2)
Apple should not have built the technology. Apple should not do business in China.
Now do Android! Now do Android!
Re: (Score:3)
Well Google doesn't do business in China, they took the decision not to. Android is open source so Chinese companies do use it, but of course that's the nature of open source software. No Google apps on Chinese phones, no Google Search. Maps is quite spotty for China too.
Re: (Score:3)
Apple should not be the same greedy, dishonest scum as every other large enterprise. But look, they are.
Re: (Score:3, Insightful)
CCP China is nationalist ("One China" policy).
CCP China is socialist (abandoning central planning for enterprise in service of the state).
CCP China is totalitarian (only one party is allowed).
CCP China is a dictatorship (term limits have been removed for Xi, essentially now President for life).
So, CCP China is a nationalist socialist totalitarian dictatorship. Perhaps that's a reason not to economically and technologically bolster the regime?
New to the ecosystem- (Score:2)
Re:New to the ecosystem- (Score:5, Insightful)
Re: (Score:3, Funny)
Yes Google is a MUCH better way to protect your privacy. As the progenitor of surveillance capitalism I am sure you have nothing to worry about.
Re: (Score:2)
I was a die-hard Apple fan until August 5th. Yesterday I transitioned from my iPhone 12 to a Samsung Galaxy S21 Ultra 5,
This is great! You have left Apple for a platform that tracks your every move and scans your email and Google Drive.
Seriously though - you jumped from the frying pan into the blowtorch of personal surveillance.
Re: (Score:2)
The one true way to leave Apple and not be subject to surveillance would be to walk around with a black Bell landline phone dangling from your belt. Now all you would need is a place to plug it in and the appropriate plethora of local landline plans.
Re: (Score:2)
If you think Android is a step up in privacy I have a bridge to sell you.
Re: (Score:2)
Re: (Score:2)
I didn't say that. I am saying if you think Android is better then...
Re: (Score:2)
Do you like OS updates? Well, bad news on that front.
Re: (Score:3)
Re: (Score:2)
Which Android phone should I get?
The one that doesn't need you to run an antivirus every time you make a phone call. Oh, wait!...
Re: (Score:2)
Re: (Score:2)
They make fairly small phones as well as the larger ones. If you want small... Galaxy Flip maybe? or Xiaomi make some decent smaller models I think.
Re: (Score:3)
Pixel, are you joking? That's out of the frying pan and into the live volcano. Google tries to privacy-rape you in multiple ways and punishes you with captchas and other security crap if you try to protect your privacy. Google makes Android so that they can have access to your address book and searches etc. IDK about OnePlus, are they Chinese?
Re: New to the ecosystem- (Score:2)
Depends on who you're trying to protect yourself from. Let's compare:
- All iOS devices are easy to break into by hostile regimes. Basically if you can crack one iPhone, you can crack all of them, and that's exactly what Cellebrite et al. do.
- Android devices are a mixed bag, depending on how old it is, though the more recent ones are pretty resilient. In fact, for the pixel 2 and up, cellebrite's capabilities are quite limited, if not useless. Ditto for the most recent 3 or so generations of Samsung phones
Re: (Score:3)
I wouldn't say that's paranoid.
Google already scans your photos, actively collects your information to sell, and has fairly bad privacy practices in general. Apple is just starting down the road of bad privacy practices.
It's throwing the baby out with the bathwater if you switch over this issue. There are plenty of good reasons to use an android phone, but this issue is not one of them. (Unless you are using some 3rd party android OS that is privacy focused, but that turns into a PITA).
Re: (Score:2)
Do you have any evidence that Google sells your information? I want it for the most Earth-shattering GDPR legal complaint in history.
Re: (Score:2)
They are first and foremost an advertising company. I consider delivering targeted advertisements to be selling my information to the highest bidder. Especially when those ads can contain trackers and cookies of their own. Do they directly sell databases of people to others? No.
Re: (Score:2)
If you don't use Google Search or any of their other services then how do they get your data and how do they sell ads that you see?
I'm assuming you have basic security protections enabled, like no 3rd party cookies etc.
Re: (Score:2)
Because it's an android phone. See https://policies.google.com/pr... [google.com]
I do not use a single google product, this is not a concern of mine directly, more a commentary on google's privacy stance which is firmly in the "Privacy is not to be expected" arena.
Re: (Score:2)
The link you provided explicitly says they don't do what you are claiming.
Re: (Score:2)
If you don't use Google Search or any of their other services then how do they get your data and how do they sell ads that you see?
I'm assuming you have basic security protections enabled, like no 3rd party cookies etc.
Install noscript, then come back and make your claim. Keeping Google from invading your privacy is a much bigger fight than cookies.
As well, aren't you conceding that Google is following you around if you have to do all these things? Because for the majority of people - those that do not block cookies and scripts, oh yeah, they are definitely following us for damn near everything.
You're smart - I'm pretty certain you know all this already, and just allowing your hatred of all things Apple to cloud your
Re: (Score:3)
Sure, while they're at it, don't go to any web sites that use Google Analytics or Google Ads. :-)
And noscript is how we find out who's using what. They are everywhere. Some pages break. That's okay. Counterproductive on the page owner's part.
Re: (Score:2)
I wouldn't say that's paranoid.
Google already scans your photos, actively collects your information to sell, and has fairly bad privacy practices in general
You forgot they scan your email for illegal pR0n and monetizing purposes.
Look folks - let's think of this another way. Would we want illegal pR0n stored on our servers? Google and Apple don't want to be a repository and distribution hub for that shit.
I would suggest to our privacy people to set up a server farm, and promise users that their files are 100 percent private, and you won't ever run hashes on them.
See what happens.
Re: (Score:2)
Like protondrive? https://protonmail.com/ [protonmail.com]
Re: (Score:2)
It's not paranoia when it's true, Google is primarily an advertising company and their goal has always been to hold as much info as they can. Every free product Google makes is there to keep you in their product sphere so they can data-mine you to increase the value of adverts.
Suggesting someone move from Apple to Google phone because they value privacy doesn't make any sense. I try to avoid putting any money in Google's hands, Google make Pixel.
If I'm too racist? I'll just ignore what the Chinese are doing
Re: (Score:2, Troll)
So you believe that Google reads your address book (without permission, a serious crime in the EU) and monitors your searches (even if you don't use Google search) on Android? Is that correct?
As for "the Chinese", it says a lot that you consider the actions of their government reason enough to punish all of them. Perhaps we should have done more to boycott the US and effect regime change a few years ago. Since you actually voted for your leader that makes you far more responsible than the average Chinese pe
Re: (Score:2)
Without permission? What happens when you start up a new android phone? EULA.
"and monitors your searches "
Sticking words in my mouth there.
Don't assume, it's rude.
Fuck off with calling me a racist because I don't like shitty dictatorships FFS, sort it out. I am very anti-racist. I'm not against anyone boycotting USA and would not call them a racist or make other pointless assumptions if they did so. I am strongly in favour of boycotting countries that don't treat their citizens well.
My family was strongly a
Re: (Score:2)
Well, the SA government was elected, so a boycott was quite effective. Holding 1.4 billion Chinese people culpable for the actions of the dictatorship that victimizes them is a bit odd though. To go back to your analogy, how would hurting black people in SA have helped?
Re: (Score:2, Insightful)
So, you are saying the SA sanctions shouldn't have happened because they hurt black people?
Explain to me how any sanctions don't hurt the general population. And if sanctions always hurt the general population, then why do you think that people who actually care about the general population would want sanctions? Do you realise that often that same population would also support the sanctions against their very own country.
Do you think the people of Hong Kong would like China to be sanctioned?
Do you think
Re: (Score:3)
Sanctions aren't some generic thing that hurts everyone in a country equally. For example, they are often targeted at individuals, usually members of the government. That's fine, but "I'm boycotting the entire country" seems a bit over-the-top and very likely to hurt innocent people.
SA had democracy for whites so they were responsible for the government and any bad referenda.
Re:New to the ecosystem- (Score:4)
If I'm too racist? I'll just ignore what the Chinese are doing to the Uyghurs, shall I? Or the fact that China has an abysmal human rights record. Have you noticed what just happened in Hong Kong, or am I racist for mentioning that? Excuse me if I don't want to make a powerful dictatorship more powerful.
This is a little OT, but bears being posted.
I think you might be arguing with a person who believes that only those people considered "white" can be racist.
Which is in itself 100 percent racist.
Anyhow, yes, China is a racist country. While these people always point to the US as the real racists, in fact:
https://worldpopulationreview.... [worldpopul...review.com]
A map: https://www.dailymail.co.uk/ne... [dailymail.co.uk] There are obviously people here who are racists. But it turns out we are a whole lot less racist than the memes would have everyone believe.
Re: (Score:2)
Chinese is no longer a race, it's an ideology.
Re: (Score:2)
Oh, Motorola are American, I guess you could buy one of their phones.
"Motorola Mobility LLC, marketed as Motorola, is an American consumer electronics and telecommunications company, and a subsidiary of Chinese multinational technology company Lenovo. " https://en.wikipedia.org/wiki/... [wikipedia.org]
Re: (Score:2)
Well they are all made in China anyway, if you are that worried or don't want anything to do with China for some reason... Maybe go talk to the Amish.
Re: (Score:2)
Hard to go wrong with a Pixel. OnePlus phones are good too.
Care to refute this?
https://www.theverge.com/2014/... [theverge.com]
'Will nobody think of the children' (Score:5, Insightful)
China is very likely to enforce this or something similar on software on phones sold there. Apple has a choice: abandon its major market, or find an excuse to surrender elegantly.
Don't be surprised.
China owns both ends of the pipe (Score:2)
They don't need access to a phone. They have everything they need to see your traffic and likely have or can get any keys stored on the device. When your ISP is the CCP, there is no incognito mode.
Re:'Will nobody think of the children' (Score:5, Insightful)
Well, it is clear they have surrendered. But there is nothing "elegant" about it. They just admitted that all their concerns about user privacy of the last few years were direct lies. Not that this is any real surprise.
Re: (Score:2)
Google chose not to do business in China. There is always a choice.
Re: (Score:2)
China already has all the unencrypted backups of the phones; China's laws already make it possible for them to grab this info off of any phone or backup at any time.
Think, people. Oppressive regimes have no need for this rigamarole. China is a country openly practising a cultural and possibly a literal genocide on Uighur people, and you think they're going to care about tinkering with a phone to make it look like you had pictures on it that you didn't? They'll just toss you in jail and say you did a bad thi
Burn (Score:3, Insightful)
The company will go down over this.
It's an absolutely monumental mistake that the company didn't see this blow-back coming. The whole world and its dog are against this plan, but they keep insisting they just need to "explain it better." That's usually a sign that they're stuck in the groove and refusing to crawl out of the hole they've dug for themselves.
Re:Burn (Score:4, Insightful)
They won't burn. Most people understand that this is more of a gray zone than people around here want to admit.
Apple is trying to solve a very real and very serious problem that most people understand is a huge issue in today's world. This is where the rubber hits the road. You would have Apple do nothing about serious abuses of their technology because of the (very real) potential abuses of their solution. Is your opinion no digital tools can ever be used to combat some truly evil things because they might be used in some future bad way? If you think this is wrong, is there a technology you can suggest that can help combat the very real problem of sexual exploitation?
Re: (Score:2)
Aside from that: it's not Apple's job to police everyone's smartphone. Whether CP is present on smartphones in large quantities is highly questionable anyway.
Other tech companies will take a wait and see attitude to see how this works out for Apple and I don't see Microsoft announcing a similar scheme on Windows anytime soon.
In addition, the threat of false positives is HUGE since they don't use regula
Re: (Score:2)
Microsoft and Google have been scanning emails for child porn for almost a decade now. Doesn't seem to have hurt them
Re: (Score:2)
Microsoft and Google have been scanning emails for child porn for almost a decade now. Doesn't seem to have hurt them
Not on your device. Ignorance and a blind devotion to apple are not your friends here.
What's the difference, your email isn't really yours, or private? I just want to hear you say it before you get back on that privacy high horse.
I know the reality, privacy expectations online are wishful thinking. There are almost no legal protections in the US for digital privacy. So hop down off that horse if you want to talk about privacy on "your device".
Re:Burn (Score:5, Insightful)
They won't burn.
agreed
Most people understand
If your sentence starts like this, you are already wrong.
that this is more of a gray zone than people around here want to admit.
Putting aside the fact that most people have their heads completely up their assholes, this is not a gray zone. This is a hard line which must not be crossed. Inspection of users' data only goes one way. Either store users' data for them without inspection, or don't store it at all. There's nothing good down the path of spying on your customers.
You would have Apple do nothing about serious abuses of their technology
That's right. I would.
because of the (very real) potential abuses of their solution
s/potential//
Is your opinion no digital tools can ever be used to combat some truly evil things because they might be used in some future bad way?
They developed these tools so that they could use them in a bad way in the present, specifically to quash dissent in China. They are now making excuses for it on the usual "won't someone think of the children" basis.
If you think this is wrong, is there a technology you can suggest that can help combat the very real problem of sexual exploitation?
Mental health care.
Re: (Score:2)
As expected, you have no real solutions or alternatives.
I have the only real solution and alternative. Sadly, people like you oppose it on specious bases.
nowhere in the Constitution is the word privacy mentioned
"The Supreme Court in Griswold v. Connecticut, 381 U.S. 479 (1965) found that the Constitution guarantees a right to privacy against governmental intrusion [wikipedia.org]". But no, it doesn't guarantee your right to privacy against private companies. However, since I didn't use the word right in that way in my comment, you are responding to something I didn't say. This is a fuckoff waste of time, like responding to your co
Re: (Score:2)
Your only proposed solution is what, therapy? For whom? The victims? Also, a Supreme Court ruling does not mean something is in the Constitution. It means that is how they interpret it.
The reason you want to stop is that you like to spout platitudes and absolutes without considering other perspectives or defending the obvious flaws in your logic.
Re: (Score:2)
Most people will not even realize this "feature" will exist or what the implications of it are. It's not like Apple is going to include this on the list of new features for iOS 15 page.
Re: (Score:2)
It's an absolutely monumental mistake that the company didn't see this blow-back coming. The whole world and its dog are against this plan, but they keep insisting they just need to "explain it better." That's usually a sign that they're stuck in the groove and refusing to crawl out of the hole they've dug for themselves.
The thing is, Apple has (generally) succeeded with just doubling down on "you're doing it wrong, you need to think more courageously like we do." Based on that history, it's easy to see why they would keep sticking to this line. Look how long it took them to admit that the butterfly keyboard design was unfixably flaky (not that they ever admitted this in so many words; they just courageously and semi-silently omitted it from future MacBook designs). It's understandable that they might see the best strat
Re: (Score:2)
If they double down on this (like I've predicted) they're in for a hard ride.
BTW: I hate me having to add my own HTML markup to my Slashdot comments. Why are the
Re: (Score:3)
It's not that I am not concerned about the slippery slope argument and where this could go, but frankly if Apple was willing to secretly divulge documents outside the annou
Re: (Score:2)
Google, Microsoft, Dropbox, etc already do this. Apple is just the first to make a big deal out of it and the first to do it locally on the device rather than the cloud storage (and it only does it locally on the device if you use iCloud so if you care, turn off iCloud).
This is a thing that will blow over very soon and everyone will forget about it. I'm not a fan of reducing my privacy, but short of using some 3rd party Android OS or other half-baked phone OS I can't think of any other devices that would i
Re: (Score:2)
You can be sure that copyright holders will start pressuring Apple and the government to start policing for copyright infringement on their devices in the near future. Apple claims it will resist such requests, but what if the government starts mandating it by law?
Re: (Score:3)
(and it only does it locally on the device if you use iCloud so if you care, turn off iCloud).
As I learned this weekend, it's not that simple.
A software update (or one of the 57 click-throughs that occurred after in order to be able to use the device again after it booted) happily turned it back on for me.
Well, to clarify, it turned on iCloud Photo for me again.
I suppose if I didn't have an account associated with iCloud at all, then it probably would have had a harder time... but then again, the device is essentially useless without that.
Re: (Score:2)
What blowback? People squawk about literally everything on the Internet. If your prediction that Apple will "go down" over this is correct - if it really hurts their quarterly results - then that will be blowback.
Here are my predictions though:
1) The vast majority will have zero personal impact and won't care.
2) Of those who care on the basis of what they read on the internet, the highest-impact stories will be
Re: (Score:2)
5) instead of going down, Apple will keep making a buttload of money
Re: (Score:2)
No, this is a thing that tech people like us are worried about. Nobody else gives two shits.
But as tech people, we have to understand the nuance here. Apple isn't wholesale scanning your phone. They're scanning photos that are being moved to and from their iCloud Photo service. The processing happens to be on-device, rather than in their cloud. Facebook already scans all the pictures that are uploaded to that service and they rake in about 20 million CSA images a year.
Is there a worry that bad stuff will be
THIS is what the distortion field breaks down on? (Score:2)
It's such a nothingburger. Oh no if you choose to store your shit on icloud there's a tiny fucking chance a stranger from Apple will check a couple of your images because they have "similarity" (from the pov of artificial stupidity) to kiddy porn and if not just move on to the next stranger. The humanity!!!
Anyone else thinking the same thing (Score:2)
Its a plot! (Score:2)
Re: (Score:2)
I've been called worse for less for my whole life.
reminds me of the Intel FP bug, only much worse (Score:2)
"only a few of the calculations" would trigger the bug, but the user had no way of knowing which (the IRS even issued a "we don't care; you signed it, not the chip" statement). For practical purposes it meant that ALL FP calculations were wrong.
No matter how minuscule the chances of a false positive, and the disastrous consequences for the file holder (some places still have the death penalty for child porn), since you have no way of knowing if an image on your phone/tablet(/Mac?) will generate a matching hash, you
Re: reminds me of the Intel FP bug, only much wors (Score:4, Insightful)
What disastrous consequence? Even if they triggered on a single match what would happen? Some stranger from Apple in a max security setting will get to see 1 of your images (the chances of multiple accidental collisions is too small to be relevant). Then he sees it's only a dick pic and not cp and moves on.
By requiring multiple matches there is no chance of accidental matching without actual cp or planted colliding images. Now the stranger at Apple sees some bullshit colliding images and moves on.
Disastrous.
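The "require multiple matches" point in the parent can be put in rough numbers with a binomial tail. The per-photo false-positive rate, library size, and threshold below are invented for illustration; Apple has not published a per-photo rate, only a claimed one-in-a-trillion chance of wrongly flagging a given account per year.

```python
# Back-of-envelope for the multiple-matches threshold. If each photo
# independently false-matches with probability p, the chance that a
# library of n photos accumulates at least t matches is a binomial tail.
# Computed in log space to avoid overflow from huge binomial coefficients.

from math import lgamma, log, exp

def log_binom_pmf(n, k, p):
    """log P(exactly k successes in n Bernoulli(p) trials)."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def tail_prob(n, p, t, terms=100):
    """P(at least t false matches among n photos); the pmf decays so
    fast past t that summing ~100 terms is plenty."""
    return sum(exp(log_binom_pmf(n, k, p))
               for k in range(t, min(n, t + terms) + 1))

n, p = 10_000, 1e-6          # assumed library size and per-photo FP rate
print(tail_prob(n, p, 1))    # roughly 1e-2: one accidental match is plausible
print(tail_prob(n, p, 30))   # effectively zero with a threshold of 30
```

The asymmetry is the whole design argument: a single-match trigger produces a steady trickle of accidental reviews, while a modest threshold pushes accidental account flags below any practical probability, leaving deliberately planted collisions as the remaining concern.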
Re: (Score:2)
You have obviously read and understood what Apple is proposing and have a realistic approach to the issue. Why are you posting here?
Re: (Score:3)
What disastrous consequence? Even if they triggered on a single match what would happen? Some stranger from Apple in a max security setting will get to see 1 of your images (the chances of multiple accidental collisions is too small to be relevant). Then he sees it's only a dick pic and not cp and moves on.
The real disastrous consequence is that Apple is turning dissidents in to the Chinese government, and they are doing this child porn shit in order to get the usual cadre of useful idiots to cheerlead for them as a result. Whether you know it or not, you're literally celebrating oppression.
That you don't seem to know it is typically chilling.
Re: (Score:3)
Re: reminds me of the Intel FP bug, only much wors (Score:5, Insightful)
Then he sees it's only a dick pic and not cp and moves on.
And what if it's a picture of your daughter naked? Not CP just naked. In a country like America where people have been prosecuted for watching legal porn (but the actress looked young, omg, arrest the pedo fucker!), or in a country where people have been dragged through the police system for having pictures of their own children playing (sick pedo fucks, what would Jesus do!), or lynched in a park for taking pictures of their own children (he's a lying pedo, get him, we have morals on our side),
yes. YES it very much could have disastrous consequences.
If we were talking about a black and white concept here I'd agree with you, but there's nothing black and white about child pornography.
Fuck man some western countries have proposed banning all porn with actresses who have A-cups, because OMFG THE PEDOS ARE EVERYWHERE.
Fall in fucking line citizen.
Re: (Score:3)
"Even if they triggered on a single match what would happen? Some stranger from Apple in a max security setting will get to see 1 of your images"
If someone at Apple can see 1 image, then a govt can compel Apple to allow someone NOT at Apple to see more than 1 image. A slight modification to wildcard ALL matches for images/files on a phone could easily allow this type of interrogation.
Collision is a DOS attack on verification (Score:2)
Collisions are not some cyberswatting weapon unless you assume Apple doesn't verify matches, which they do. So they are only a DOS attack on the verification; let Apple worry about that.
https://appleinsider.com/artic... [appleinsider.com]
Re: (Score:3)
How do they verify though?
From: https://www.hackerfactor.com/b... [hackerfactor.com]
"The laws related to CSAM are very explicit. 18 U.S. Code 2252 states that knowingly transferring CSAM material is a felony. (The only exception, in 2258A, is when it is reported to NCMEC.) In this case, Apple has a very strong reason to believe they are transferring CSAM material, and they are sending it to Apple -- not NCMEC.
It does not matter that Apple will then check it and forward it to NCMEC. 18 U.S.C. 2258A is specific: the data can
The trouble with this system (Score:5, Insightful)
My concerns:
First, no one is allowed to see the images that were used to create the database, since they are presumed illegal to possess or distribute (regardless of whether they actually are illegal to possess or distribute). While the term CSAM is being used, what assurance do we have that this only contains "abuse" imagery? For example, 16-year-old couples can engage in various perfectly legal sexual activities, but if those activities are recorded, the recordings are "child porn" under U.S. law. Does the CSAM database include such images? Many people would be happy if the answer is "yes", but I think that vigorous censorship of legal activities actually encourages child abuse. You see, because of the crackdown, the most practical way to acquire images of young people doing the nasty is to buy pictures from a black market, and the people most likely to take the risk of selling such imagery are illegal sex traffickers, not individual 16-year-old couples raising money to pay for college. (And because recordings of legal activities and illegal activities are both criminal to possess, a pedophile who might have been satisfied with a recording of legal activity might choose to buy abuse imagery instead if it is easier to find, which also incentivizes child abuse. Plus, a pedophile might look at the vigorous enforcement of laws against image possession, and conclude that downloading images is too dangerous and that he is actually less likely to get caught by actually abusing a child. Also, naughty pencil drawings are illegal under U.S. law [wikipedia.org], and some would argue that it would be safer for children if pedophiles were to look at fictional images rather than to purchase real ones.)
And as many have pointed out, if there were completely non-sexual images in the DB, no one would know.
The system also has an oldness bias: that is, old images are more likely to be in the database than new ones, and so a person is less likely to be caught by storing brand-new "0 day" images that they just purchased from a sex trafficker, and much more likely to be caught based on something found in an old CD from 15 years ago. Paradoxically, then, someone familiar with the system might be more inclined to pay for new images than old ones, which incentivizes child abuse (though only mildly, since there are safer ways to avoid the system, like storing nasty images on a PC/Android)
Re: The trouble with this system (Score:2)
For example, 16-year-old couples can engage in various perfectly legal sexual activities, but if those activities are recorded, the recordings are "child porn" under U.S. law. Does the CSAM database include such images?
How would Apple have the image hashes if they weren't uploaded to the god damned Internet and circulating as child pornography? So THAT happens, and you're asking what about the people in the video when they have the video in their phones? Seriously?
"I'm the one that took the video, your honor."
Hmm, does it pass the test?
Re: (Score:2)
You could read their Threat Model [apple.com] paper, in which they discuss the answer to this question. In fact, they even state that Source image correctness (bolding in original) is a design goal, which they formalize as
Re: (Score:3)
It's clear that their goal is legitimately to catch what they say they're trying to catch.
The security mechanisms in place are obviously bona fide.
But good faith does not make an action immune from bad consequences.
There are many reasons to believe that a person's right to security in their persons, houses, papers, and effects, against unreasonable searches and seizures, shouldn't be violated by a private corporation not so bound by said constitutional protection.
Apple is free to do this, u
Re: (Score:3)
Essentially it's telling pedophiles (stupid enough to store their shit on their phone...really?) that they just have to be more creative about their framing, posing, and subjects.
Setting aside the efficacy of AI, I'm still wondering what PRECISELY we're even CALLING child porn since we can't seem to decide on it either.
In Japan, what I would ABSOLUTELY call uncomfortably-close quasi porn is dismissed because "well...it's a 400 year old demon in a prepubescent child's body, so not pedophilia".
If I draw a big
this will be a DOS battle (Score:2)
I've already read that the algorithm was reverse engineered and people have created pictures that collide. People will protest this by mass altering everyday pictures on the web to match pictures already in the database and doing everything they can to get them onto everyone's devices.
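This checks out: shortly after the NeuralHash model was extracted from iOS builds in 2021, researchers published colliding image pairs. The fragility is inherent to perceptual hashing, which deliberately maps many similar-looking inputs to one hash. As a rough illustration (a toy "average hash" of my own construction, not Apple's NeuralHash), here is how two images with entirely different pixel values can share a hash:

```python
# Toy "average hash": one bit per pixel, recording whether that pixel is
# above the image's mean brightness. Real perceptual hashes are far more
# elaborate, but share the core weakness demonstrated here: any image that
# preserves the above/below-mean pattern collides, regardless of its
# actual pixel values.
def average_hash(img):
    flat = [p for row in img for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def checkerboard(lo, hi, n=8):
    # n x n checkerboard alternating between two pixel intensities
    return [[lo if (r + c) % 2 == 0 else hi for c in range(n)]
            for r in range(n)]

img_a = checkerboard(10, 200)   # dark/bright checkerboard
img_b = checkerboard(50, 240)   # different pixel values everywhere...
assert img_a != img_b
assert average_hash(img_a) == average_hash(img_b)  # ...but the same hash
```

A deliberate attacker has even more room than this: they can tune pixels right up to each threshold, which is essentially what the published NeuralHash collision tools do by gradient descent against the extracted model.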
What they are doing is a much more complex version of what I did with a software distribution years ago. Instead of keeping a file of CRC32s to verify the distributions files, I wrote an algorithm to calculate and add four byt
Re: (Score:2)
Even if you manage to get something with the same checksum (keeping in mind CRC32 is only 32 bits, about 4.3 billion possible values), would it have the same filesize and dimensions?
Re: (Score:3)
Instead of keeping a file of CRC32s to verify the distributions files, I wrote an algorithm to calculate and add four bytes to every file that caused the CRC32 to be zero.
Appending the CRC32 of the original data to the end of the file will cause the CRC32 of the combination to be zero. This is a well-known property of CRCs. There are some caveats (e.g. some CRC code inverts the result and you'd need to compensate for that) but the principle is straightforward.
A less-known property is that CRC(X ^ Y ^ Z) = CRC(X) ^ CRC(Y) ^ CRC(Z) for equal-length inputs. It's relatively simple to leverage this property to determine four bytes you can add to any given file to produce an arbitrary user-selected CRC32.
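Both properties are easy to verify with Python's zlib (a sketch; note that zlib's crc32 applies the initial and final 0xFFFFFFFF inversions, so appending the CRC yields the fixed residue 0x2144DF1C rather than zero, exactly the compensation caveat mentioned above):

```python
import struct
import zlib

# Property 1: appending a message's CRC32 (little-endian) to the message
# produces a constant CRC, regardless of the message. With zlib's
# inversions the constant is the well-known residue 0x2144DF1C; a CRC
# variant without the inversions would give zero.
def crc_of_message_plus_crc(data: bytes) -> int:
    crc = zlib.crc32(data)
    return zlib.crc32(data + struct.pack("<I", crc))

assert crc_of_message_plus_crc(b"hello") == 0x2144DF1C
assert crc_of_message_plus_crc(b"a totally different message") == 0x2144DF1C

# Property 2: CRC32 is affine over XOR, so for equal-length messages
# crc(x ^ y ^ z) == crc(x) ^ crc(y) ^ crc(z); the length-dependent
# constant introduced by the inversions cancels for any odd number
# of terms.
x, y, z = b"12345678", b"abcdefgh", b"ABCDEFGH"
xyz = bytes(a ^ b ^ c for a, b, c in zip(x, y, z))
assert zlib.crc32(xyz) == zlib.crc32(x) ^ zlib.crc32(y) ^ zlib.crc32(z)
```

The same linearity is what makes forcing an arbitrary CRC32 with four extra bytes tractable: the effect of each appended bit on the final CRC can be computed independently and XORed together.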
Rube Goldberg (Score:2)
The only images scanned by the system are images that are milliseconds away from being uploaded and stored in iCloud, unencrypted. The attacks contemplated here are ridiculously more complex than necessary, since the attacker can just directly scan your unencrypted photos in the cloud.
Would hate to be a not so well endowed man (Score:2)
Re: (Score:3)
I used to frequent a forum aimed at people with various chromosomal or intersex disorders, and one section was for people who suffered from issues relating to delayed or impossible puberty in general and this issue came up quite often, most notably when Australia decided to demand a minimum breast size in porn because pEdOpHiLiA. What you're saying is a very real fear of people like myself, even if we can prove that we're adults, what if we're asked to prove when the photo was taken? What do you want me to
Champions Privacy, Except From Themselves (Score:2)
But Apple still has your device -- your contacts, email, notes, photos, and all manner of private data -- at their fingertips to scan and otherwise do what they please.
You think you're buying Apple's product when you buy their iPhone. No, you're buying your own membership card to become their product.
"could be misused" = "will be misused" (Score:2)
Seriously. Stop living in a fantasy world. This is every authoritarian's wet dream.
This also gives us the only credible reason Apple had for this really bad idea and is now insisting on doing this: they were pressured into it and were unable to defend themselves effectively. Sets a nice precedent, and comes with assured intent to misuse it a little down the road.
Apple's Real Reason (Score:3)
Obviously I don't sit in a C-level position at Apple, so this is opinion, but it looks like Apple is doing the least intrusive, but still effective, means of preventing child pornography from residing in iCloud. Due to changes in ISP safe-harbor laws, they could be held liable for child pornography stored on their servers. They aren't scanning your hard drives; it's just a scan before upload to iCloud. Flag and set aside. Any other images are still encrypted in your iCloud account and not accessible. Does it mean there will never be any child pornography in iCloud? Of course not, but they've made it enough of an inconvenience that it will not be a frequented repository of that stuff. Scanning on device before upload is the least intrusive, yet still effective, means of preventing child pornography on their servers, and they will be able to say that anything that got through was because the Feds didn't flag that image. If you really must have kiddie porn on your iDevice, turn off iCloud. Apple really isn't trying to be the police; they're trying not to be sued / fined. But, yes, it's putting the piping in place to scan for other files.
Even for its purpose (Score:2)
It will be a bad thing even when used for its purpose. Just wait until your picture of some flower is deemed by Apple's infallible AI to be child abuse. Everything Apple you own suddenly stops working. You get a visit at 5 AM from the vice squad who take you away in handcuffs. Right, good luck with that.
Apple's idea is that it is going to subject millions and millions of people to potential life-altering errors because there is a 0.0000000001% chance that they will discover some crime. But this will pl
Re: (Score:2)
As a child abuse victim I support Apple's approach and I am disgusted you have used that issue as talking point.
Re: (Score:2)
And then there are other such victims that manage to grow beyond their personal situation and realize they actually have a responsibility to society as a whole here. I guess you do not qualify.