We Built a CSAM System Like Apple's - the Tech Is Dangerous (washingtonpost.com) 186
An anonymous reader writes: Earlier this month, Apple unveiled a system that would scan iPhone and iPad photos for child sexual abuse material (CSAM). The announcement sparked a civil liberties firestorm, and Apple's own employees have been expressing alarm. The company insists reservations about the system are rooted in "misunderstandings." We disagree.
We wrote the only peer-reviewed publication on how to build a system like Apple's -- and we concluded the technology was dangerous. We're not concerned because we misunderstand how Apple's system works. The problem is, we understand exactly how it works.
Our research project began two years ago, as an experimental system to identify CSAM in end-to-end-encrypted online services. As security researchers, we know the value of end-to-end encryption, which protects data from third-party access. But we're also horrified that CSAM is proliferating on encrypted platforms. And we worry online services are reluctant to use encryption without additional tools to combat CSAM.
We sought to explore a possible middle ground, where online services could identify harmful content while otherwise preserving end-to-end encryption. The concept was straightforward: If someone shared material that matched a database of known harmful content, the service would be alerted. If a person shared innocent content, the service would learn nothing. People couldn't read the database or learn whether content matched, since that information could reveal law enforcement methods and help criminals evade detection.
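To make the concept concrete, here is a deliberately naive sketch of that matching idea (all names hypothetical, and with the entire point of the research -- the cryptography that hides non-matches from the service and hides the database from users -- omitted):

    import hashlib

    # Hypothetical database of digests of known harmful content.
    # Real systems use perceptual hashes, not exact cryptographic ones.
    KNOWN_HARMFUL = {hashlib.sha256(b"known-bad-example").hexdigest()}

    def service_checks_upload(content: bytes) -> bool:
        # The service is alerted only on a match; in the real protocol,
        # non-matching content reveals nothing, cryptographically.
        return hashlib.sha256(content).hexdigest() in KNOWN_HARMFUL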
But we encountered a glaring problem.
Our system could be easily repurposed for surveillance and censorship. The design wasn't restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.

About the authors of this report: Jonathan Mayer is an assistant professor of computer science and public affairs at Princeton University. He previously served as technology counsel to then-Sen. Kamala D. Harris and as chief technologist of the Federal Communications Commission Enforcement Bureau. Anunay Kulshrestha is a graduate researcher at the Princeton University Center for Information Technology Policy and a PhD candidate in the department of computer science.
Repurposed for surveillance and censorship? (Score:5, Insightful)
Re: (Score:2, Insightful)
even if for now it is only censoring illegal content and surveilling criminals
I don't think they've been limiting themselves at all. Seems to me big tech has been overtly censoring perfectly legal content and normal, law-abiding citizens, just because they have a different political view.
I encourage everyone who has been subjected to censorship at the hands of big tech to join President Trump's lawsuit [takeonbigtech.com]. The tyranny must be stopped!
Re:Repurposed for surveillance and censorship? (Score:5, Informative)
Surely EX-president Trump?
It's traditional in the US to refer to all past presidents with the title of President, even after they leave office. The tradition is slowly fading but it's not gone.
Re: (Score:2)
he never should've become president in the first place, and doesn't deserve that title at all
So, we should only refer to former Presidents as President IF Sebby approves of the election of said president?
Re: (Score:2)
So, we should only refer to former Presidents as President IF Sebby approves of the election of said president?
Nah, people get titles through their actions. He may have been elected by a minority but National Embarrassment Trump never once did anything presidential.
Re: (Score:2)
only if they are actually competent enough to hold that position
So once President Biden is out of office you will no longer refer to him as President?
Re: (Score:2)
So once President Biden is out of office you will no longer refer to him as President?
I never said anything about what happens "once [someone] is out of office".
Re: (Score:2)
Hmmm. I'm quite sure those 3 dolls surrounding you in your bedroom of your parents' basement are in full agreement with you.
Re: (Score:3)
"they never said anything about that"
Yes, 'they' did:
It's traditional in the US to refer to all past presidents with the title of President, even after they leave office.
"only that the orange blob wasn't competent enough to be president, or to have actually been president"
Nope, he CLEARLY stated that he shouldn't be called President:
he never should've become president in the first place, and doesn't deserve that title at all.
And you can call me a Trumptard/MAGAtard all you like (which is what people with little debating skills do), but I'm not. Trump is an idiot, and him being elected president was one of the dumbest things this country has ever done. And, no, Trump won (in 2016, not 2020) and was elected president
Re: (Score:2)
You mean like former Vice President Biden?
Yeah, just like Former Vice (non-)President Pence.
Re:Repurposed for surveillance and censorship? (Score:5, Interesting)
armed insurrection in Washington DC on 2021/01/06
Perhaps you missed today's news FBI finds scant evidence U.S. Capitol attack was coordinated [reuters.com], which destroys the idea of an "insurrection".
Re: (Score:2)
Not sure why you think those things are incompatible. Nothing in the definition says an insurrection has to be pre-meditated or coordinated. A spur-of-the-moment mob trying to violently stop the counting of votes is just as much an insurrection as an organized group trying to violently stop the counting of votes.
Re: (Score:2)
So calling for a protest including folks who came with zip ties & combat gear, demanding that the protestors go to capitol hill to fight like hell, refusing to augment the capitol police security until it was clear that the certification process would not stop and people were dying. Nope, no coordination in advance here.
These aren't the droids you're looking for, move along.
Re: (Score:2)
I saw the report, and I say it's garbage. There is clear evidence of organizers on public internet groups. There was definite direction given over the public airwaves on the day. Perhaps they meant "It doesn't meet the definition of criminal conspiracy", but there have been conspiracy convictions on a lot weaker evidence. More likely they decided that prosecution would be politically inconvenient.
As for the report...do you really believe everything the government tells you? When did this start?
Re: (Score:2)
armed insurrection in Washington DC on 2021/01/06
Perhaps you missed today's news FBI finds scant evidence U.S. Capitol attack was coordinated [reuters.com], which destroys the idea of an "insurrection".
Okay, not one big insurrection, but a thousand little/individual ones. Happy now? :-)
Re: (Score:2)
FBI finds scant evidence U.S. Capitol attack was coordinated [reuters.com], which destroys the idea of an "insurrection".
Here's the dictionary definition of insurrection: https://www.merriam-webster.co... [merriam-webster.com]
Here's the legal definition of insurrection: https://legal-dictionary.thefr... [thefreedictionary.com]
Do you want to guess which word does not form part of the definition? Hint: There's no mention of the word "coordinated". That's a criterion you made up.
surveillance and censorship (Score:5, Insightful)
Re: (Score:2)
And since even people who are otherwise impervious to law enforcement get caught, it seems there isn't really any reason for these suspicious tools for that purpose.
So what could it be?
Re: (Score:2)
Surveillance and censorship was always the purpose, anything else is the excuse. Child exploitation, a heinous crime, is successfully detected and prosecuted without this technology.
With a lot of damage done between beginning and end. It's not just about catching, but catching early.
Re:surveillance and censorship (Score:5, Insightful)
With a lot of damage done between beginning and end. It's not just about catching, but catching early.
Definitely. The next step is weekly warrantless home inspections. After all, what's probable cause for the curtailing of civil liberties if not the potential for children to be hurt? And finally, when the technology is available- catch it before they even commit it- then we'll have saved them all!
Re: (Score:2)
Who will you target first? Those with children (since parents and very close relatives are overwhelmingly the perpetrators of CSA), or those without children?
Or, and this is the one that the average politician will protest against most strongly, those who have children who they haven't publicly acknowledged?
Re: (Score:2)
Who will you target first? Those with children (since parents and very close relatives are overwhelmingly the perpetrators of CSA), or those without children?
Whoever pisses me off the most politically, of course.
Re: (Score:2)
Just need to get the hashes for images you have just taken and enter them into the database just before checking the database for those hashes!
Re: (Score:2)
Is it though? After all, how long does it take for a known image to be entered in the database? Is this a database of 40 year old images or of data intended to catch crimes "early"? Is this system even capable of "catching early"?
Re: (Score:2)
With a lot of damage done between beginning and end. It's not just about catching, but catching early.
So, now that the entire world knows Apple is scanning for CSAM images, who exactly do you think this is going to catch? Probably a couple of idiots who haven't gotten the iMessage. For that, we intrusively scan a few hundred million phones of otherwise innocent people, and, as a bonus, create a system that can literally scan for any other content as well?
Anyone who owns an iDevice now knows that there is a thin technological + human line between them and a call to the FBI/country-specific authority. I me
Re: (Score:2)
To be fair, detecting and prosecuting exploitation isn't either "we can't catch them" or "we catch 100% of them", so there is a question of "what percentage of incidents are caught?" We caught and prosecuted murderers in the 1800s, and yet somehow forensic advances were still warranted. So there is presumably some area for improving our success on this front.
Another is the extent to which evolving norms may impact detection and prosecution. If unencrypted photo storage is a frequent element, and that get
Two things can be true at once (Score:2)
There absolutely are people who want to snoop, for various reasons. That's absolutely true.
It would be an error to say:
People want to snoop, therefore the sun doesn't really rise in the morning.
Two unrelated facts can be true. The fact that people want to snoop doesn't make every other statement false.
Another true statement is that sexual abuse of children is a huge problem. The numbers are staggering. The stories are heart wrenching. It's very much NOT a solved problem. In fact, 1 in 5 girls and 1 in 20 boys reported they were victims of childhood sexual abuse. Of course we don't know the exact numbers, but we know it's rampant. We also know it's rampant online.
Re: (Score:2)
Diffie and Hellman proved the cryptography community wrong in 1976. A challenge for today's smartest people is to find ingenious ways to fight the victimization of children, without a police officer in every house or every computer.
So instead of utilizing 2 other blatantly unconstitutional methods, a third is OK?
Granted, it's not unconstitutional for Apple to do this- the 4th amendment doesn't bind them
But that's the fucking point, and that's why it's so insidious.
It's a circumvention of constitutional protections via the private sector.
I'm sympathetic to the goal, and I believe Apple's intentions to be bona fide.
But good faith doesn't make it right.
Re: (Score:2)
Yes, of course! Of course if somebody says "this shouldn't happen", that means EVERYTHING ELSE *should* happen.
Just like if "the sun rises in the morning" is true, that makes every other statement false.
You're brighter than that.
Re: (Score:2)
Yes, of course! Of course if somebody says "this shouldn't happen", that means EVERYTHING ELSE *should* happen.
We're not talking about EVERYTHING ELSE. We're talking about the status quo.
Just like if "the sun rises in the morning" is true, that makes every other statement false.
Nope. No one claimed that.
You're brighter than that.
Sure am. I'll re-quote.
Diffie and Hellman proved the cryptography community wrong in 1976. A challenge for today's smartest people is to find ingenious ways to fight the victimization of children, without a police officer in every house or every computer.
If I'm to interpret this without the context of the discussion at large (Whether it's OK for Apple to play CP police on hardware that I own) then this entire post is a meaningless bit of you waxing philosophical.
Ergo, it's reasonable to come to the conclusion that you feel that the smartest people today have risen to the challenge, and this was their solution, and that it's OK. Becaus
Re: (Score:2)
Oh, I see.
When I said "...a challenge for today's smartest people is to find..." and "we shouldn't", you read "...back in 2009, Hany Farid found..." and "we should".
When I said we shouldn't, I meant we shouldn't.
When I said today's challenge is to figure out, I didn't mean it was all figured out 15 years ago. By "shouldn't" I meant "shouldn't", and by "today's challenge is to figure out" I meant that today we have the challenge of trying to figure something out. I guess for whatever reason you thought I meant the exact opposite of
Re: (Score:2)
You said:
A challenge for today's smartest people is to find ingenious ways to fight the victimization of children, without a police officer in every house or every computer.
So, in the context of everything else you wrote, and the fact that Apple is not the police, I took what you said to be tacit approval of the private sector playing the police on hardware we own.
I.e., the 2 conditions you gave are technically false. There's no police officer in every house, or every computer, even with what Apple is doing.
You meant to imply that Apple == the Police. Which is fine, since the distinction is only borne out in legalese, not in rational
Re: (Score:2)
Another true statement is that sexual abuse of children is a huge problem. The numbers are staggering. The stories are heart wrenching. It's very much NOT a solved problem. In fact, 1 in 5 girls and 1 in 20 boys reported they were victims of childhood sexual abuse. Of course we don't know the exact numbers, but we know it's rampant. We also know it's rampant online.
The problem with this system is that it's not going to stop new sexual abuse, it's only matching known pictures, so it will catch the traders, but not the original abusers (who aren't going to upload their pictures to iCloud). But it's a very small fall down a slippery slope to have Apple scan *all* photos to look for sexual abuse (and why stop at children, adults are abused and photographed too) and send them for verification when they reach some threshold.
So if you really care about stopping CSAM, then yo
Re: (Score:2)
I think you've identified (though exaggerated) a couple problems; now I wonder what idea you have - even fantastical, ridiculous ideas - for what would solve the problems.
Public key encryption was "impossible", Elon Musk likes doing "impossible" things, and I love doing "impossible" things, so "impossible" ideas just might end up being great ideas.
Re: (Score:2)
I think you've identified (though exaggerated) a couple problems; now I wonder what idea you have - even fantastical, ridiculous ideas - for what would solve the problems.
Public key encryption was "impossible", Elon Musk likes doing "impossible" things, and I love doing "impossible" things, so "impossible" ideas just might end up being great ideas.
Not all problems have reasonable solutions, even horrific problems. After all, we let millions of children die of starvation each year even though there is plenty of food produced in the world each year. Even in the USA, we let kids suffer through hunger and poor nutrition, and those problems are arguably much easier to solve than ending CSAM.
Re: (Score:2)
> Not all problems have reasonable solutions, even horrific problems. After all, we let millions of children die of starvation each year
Let me see if I understand the parallel you're drawing here. I think I'm missing something. Perhaps you can help me understand the difference between the comparison you're making and this wording:
I don't see an easy solution to provide food for starving children, so we shouldn't try. Similarly, stopping the rape of kids isn't easy, so we probably shouldn't bother to do anything
Re: (Score:2)
Consider these questions:
1. How effective would a well-publicized method of scanning for child pornography be at catching or preventing ongoing child abuse?
2. How much net harm would this cause if every single child in the US grows up with fewer freedoms?
Re: (Score:2)
I believe I understand what you're saying there. I agree with you.
You think scanning people's phones, at least in the ways we can currently conceive of, isn't worth the loss of privacy. We agree on that.
What I'm not clear on is what parallel you're drawing to starving children.
Re: (Score:2)
Surveillance and censorship was always the purpose, anything else is the excuse. Child exploitation, a heinous crime, is successfully detected and prosecuted without this technology. This is why you hear in the news about pedophile rings getting busted. Even politically-connected billionaires get eventually busted, this is how seriously society takes this.
Looks like Prince Andrew might get away with allegedly raping children. Apparently, he's currently hiding in his mother's castle up in Scotland.
Self incrimination. (Score:5, Interesting)
It tests the fifth amendment.
It tests warrantless search.
So many problems with what Apple is doing it isn't funny - it's serious.
Re: (Score:2)
It tests the fifth amendment.
It tests warrantless search.
No, no it does not. Apple is not a government actor nor are they acting at the behest of government actors.
Now, if a government actor were to go to Apple and say "We believe Fly Swatter has child porn on their phone and we want you to scan it" without a warrant, that *might* be a problem depending on jurisdiction, level of involvement etc. As of now there is no indication that Apple is acting at the behest of any other entity.
Re: (Score:2)
No, no it does not. Apple is not a government actor nor are they acting at the behest of government actors.
Whether or not it's at their behest is irrelevant.
It's arguably Fruit of the Poisonous Tree. Except it won't be in this case, for the kids.
Apple can only be acting as a government agent in this regard. They have no law enforcement power. They're circumventing the fact that the government is barred from being Big Brother, and acting in that capacity on their behalf.
Re: (Score:2)
No, no it does not. Apple is not a government actor nor are they acting at the behest of government actors.
I could have sworn I saw one of their statements somewhere saying that they built this system in response to requests from law enforcement.
Re: (Score:2)
nor are they acting at the behest of government actors.
Do we know that for sure? Because the government has on numerous occasions accused Apple publicly of supporting terrorists and pedophiles. I'd be very interested to know how much more pressure they exerted behind the scenes to get a company that's spent a great deal of effort marketing itself as protective of user privacy to completely torch that reputation.
Re: (Score:2)
Nothing you've said here makes any sense.
A government actor is someone who acts on behalf of a government. It's right there in the name. A government actor could be an elected official, an employee, or a contractor.
An individual person or organization, by virtue of their actions alone, cannot simply become a government actor. After all, there are quite a few insane conspiracy nuts turned domestic terrorists who believe that they are acting on behalf of the government when they clearly are not.
If Apple is using a government database of hashes, how are they not a government actor?
Using
Re: (Score:2)
So... No evidence then? Just empty speculation based on your silly beliefs about Apple, of which you also have no evidence?
I'm shocked.
I get it. Apple is your religion. Just don't expect the rest of us to drink the kool-aid, alright?
Try not to blow anything up.
Re: (Score:2)
I hate to break it to you, but Al Gore hasn't been involved in government for nearly 20 years.
Just how much influence do you think a former VP and failed presidential candidate has after two decades?
Get real. You conspiracy nuts are too much.
Re: (Score:2)
The problem is that by using this system, Apple is making iCloud demonstrably LESS SECURE. From the summary:
...a service could simply swap in any content-matching database, and the person using that service would be none the wiser.
Would you know if "Child Porn" was swapped for "Thing Someone in Power Doesn't Like"?
Could you PROVE it?
Re: (Score:2)
With the scanning software running on your own property, I argue that it is the same as testifying against yourself.
You can't be compelled to testify against yourself. You can however do so voluntarily, which you do according to the ToS you voluntarily agreed to. This will be tested on many levels, but it won't be a fifth amendment issue.
Intentionally created slippery slopes... (Score:5, Interesting)
>Our system could be easily repurposed for surveillance and censorship.
This is not a flaw - this is the intended goal. Slippery slopes really do exist - often intentionally.
Re: (Score:2)
>Our system could be easily repurposed for surveillance and censorship.
This is not a flaw - this is the intended goal. Slippery slopes really do exist - often intentionally.
Over-the-top cynicism is a good way to get modded up on slashdot. That doesn't make it accurate. Apply Hanlon's Razor liberally.
No worse in theory than most 'cloud' photo storage (Score:2)
Most images are stored in the clear with the cloud provider and you have nothing but their word that they aren't messing with the content. I suppose the biggest question is whether matches drive things to the cloud storage that didn't otherwise go there, but broadly speaking handsets pretty much give up all their data to some cloud provider or another. Saying that they could surreptitiously apply a different database doesn't mean much when they could surreptitiously do whatever snooping they want since they
Re: (Score:2)
Most images are stored in the clear with the cloud provider and you have nothing but their word that they aren't messing with the content.
Actually, you have their word that they are messing with the content. OneDrive, for example, scans uploaded photos and can suspend or terminate your Microsoft account if you store photos containing, I quote: "nudity, bestiality, pornography, offensive language, graphic violence, or criminal activity". Google Photos has similar terms. And so with the others.
Re: (Score:2)
Actually, that doesn't say that they scan your images, merely that if you store such photos you violate the terms of service. And since that list includes "offensive language" they can be pretty sure that you *are* violating their terms of service...they just have to pick the right sensitive flower to be offended. (I know someone who finds technical language to be offensive. A friend's wife finds precise qualification of meaning to be offensive. Etc.)
Re: (Score:2)
I know someone who finds technical language to be offensive.
Whitelist, blacklist, master, slave...
very clear, the crux is in the database (Score:2)
Re: (Score:2)
Good thing Google would never do this, right?
https://www.theverge.com/2013/... [theverge.com]
8 years ago
Re: (Score:2)
Plus, those scans were on Google's servers, not the owner's devices.
This is its real goal (Score:2)
Our system could be easily repurposed for surveillance and censorship
This possibility has been brought up repeatedly since day one, so I'm not sure how this is news. Of course all you have to do is swap database 'kiddieporn' for database 'government doesn't like X'. I have a sneaking suspicion that Apple built this for the Chinese to use, and just hid it behind the 'think of the children!' mantra.
Problem isn't having it used for surveillance (Score:2)
It's the same reason why drugs should be legal
Something odd is going on here... (Score:3)
Given the following:
* Apple is a control freak, but not stupid
* Apple cares enough about user privacy to buck the FBI
* Apple follows the law, here and in places like China
I have to conclude that Apple rolled this stuff out to try and forestall something worse being required by law.
Yes, the system can be co-opted and used by bad actors like China for arbitrary data suppression.
At least with this system, it will be obvious when they do that - there will be extra hashes outside the ones Apple gets from the US CSAM authorities.
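A hedged sketch of the kind of check that would make such a swap visible, assuming Apple (or anyone) actually publishes a reference copy or digest of the CSAM database -- which is an assumption here, not a documented feature:

    # Hypothetical audit: diff the hash database shipped on-device
    # against a published reference set from the CSAM authorities.
    def extra_entries(device_db: set, published_db: set) -> set:
        # Entries present on the device but absent from the published
        # set would be evidence that another database was swapped in.
        return device_db - published_db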
But... (Score:2)
Re: (Score:2)
Isn't hash matching pretty easy to work around? I.e. if you change a single bit in the image, it changes the hash, and it no longer matches?
Sure, that's why CSAM detection doesn't use hashes, at least not hashes of the sort we typically use in computer science and computer security. The "hashes" are instead simplified, (probably) non-reversible descriptions of image contents, making them scale-independent, often rotation-independent (at least to a degree), and able to survive small changes in the source image. Matching a pair of such "hashes" is done with a distance metric, measuring how close the two values are, rather than exact matching.
Yes, this m
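A minimal sketch of that kind of perceptual matching, using a simple "difference hash" (dHash) rather than Apple's actual NeuralHash (a learned neural-network embedding); assumes the Pillow imaging library:

    from PIL import Image

    def dhash(path, hash_size=8):
        # Shrink to (hash_size+1) x hash_size grayscale, then record
        # whether each pixel is brighter than its right-hand neighbor.
        # Downscaling first is what makes the hash survive resizing
        # and small edits to the source image.
        img = Image.open(path).convert("L").resize(
            (hash_size + 1, hash_size), Image.LANCZOS)
        px = list(img.getdata())
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = px[row * (hash_size + 1) + col]
                right = px[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | (left > right)
        return bits

    def matches(a, b, max_distance=5):
        # Compare with a Hamming-distance threshold, not equality,
        # so flipping a single pixel does not defeat the match.
        return bin(a ^ b).count("1") <= max_distance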
no shit, sherlock (Score:2)
Our system could be easily repurposed for surveillance and censorship.
no shit, sherlock. it is, after all, NOTHING BUT CENSORSHIP. that it is "good censorship" doesn't make it any less censorship and any less obvious, and obviously prone to abuse like any other form of censorship. that this wasn't bloody obvious from the get-go to someone who apparently gets the point and implications of e2e encryption and privacy rights is just amazing. so much common sense and naivety at the same time ...
Re: (Score:2)
It's not the picture of child abuse that harms children. IT'S THE CHILD ABUSE THAT HARMS CHILDREN.
Somebody mod parent up please.
All this busy spying and checking whether people have "child porn" photos is just telling child abusers not to take photos of their victims. And the child abuse will keep going on, just with one less piece of evidence of the actual crime.
Why is Apple doing this feature? (Score:2)
Apple isn't going to be spending the money to implement this feature without a good reason. They clearly aren't doing it because their customers want a child porn scanner. To the best of my knowledge they aren't being forced to implement a child porn scanner because of government regulations.
So why are they doing this? Is there some PR benefit from this? Is there some anti-child-porn group pushing them to do it? Someone (e.g. spy or secret police agency in some country) pushing Apple to do this using the "t
No need to build a system (Score:2)
Something like this is blindingly obvious as something just begging to be abused.
I would've said "to the average citizen as well", but "Think of the children" is the power off command for your average citizen's brain.
could? (Score:2)
Our system could be easily repurposed for surveillance and censorship. The design wasn't restricted to a specific category of content; a service could simply swap in any content-matching database, and the person using that service would be none the wiser.
The crux is in the word 'could'.
Is it really your property? (Score:2)
The problem everyone has with it is the location of the scanning. It happens on your own property (phone) before it encrypts the data for the cloud. You are essentially self-reporting.
The device is yours. The software/service is rented to you. The search is done by their cloud service client. As a private business, why is Apple not free to set whatever terms of service they see fit? Why do they have an obligation to cater their cloud service to your ideals?
I don't have a strong opinion either way. I am not into kiddie porn, so I know I won't get flagged. Having a lifelong fixation on milfs has served me well in this regard. I am not really seeing your argument. This is no di
Re: (Score:2)
I am not into kiddie porn, so I know I won't get flagged.
False positives are a real concern.
Re: (Score:2)
It compares hashes, so false positives are inevitable.
Re: (Score:2)
Why do you think that you can still decide whether you want to do that in the future?
Re: (Score:2)
Don't worry, you've been in it for a long time.
It's not glaringly visible because there simply aren't enough hands on deck to interpret all that data. They're still working on the very top of the data mountain; it's going to be a while until Average Joe feels it.
Re: (Score:2)
The temperature just went up a little more, but nobody cares. Very soon, we will wake up and find ourselves in Communist Chinese America.
Stop blaming technology, and stop voting for authoritarians. It's like you're whining about the gas regulator on the burner instead of the person you elected to watch the temperature. He'll kill you with a log if you let him. Authoritarian rule is not a technology problem, it's a people problem. You don't get from here to there with technology, you get there with people. When the Trumpliban take over DC, you can take out that old flip phone you'd been hanging onto and crow about how much safer you are,
Re: (Score:2)
All politicians are self-serving meat puppets. If you think voting for sleepy Joe made the world a better place, you are hopelessly lost. You have to go after the tech companies and the politicians they own.
Random hashes are fine (Score:2)
Re: (Score:2)
Well, your conclusions are reasonable, but your first paragraph is hopelessly optimistic. Eventually that may happen, but quite possibly only after a decade or so of abuse.
And even your final paragraph is pretty unreal. I can't imagine that all those who store improper images will even think of removing them before Apple starts scanning. Most of them probably won't know that it's even planned. Remember there are robbers who post selfies of themselves committing the robbery on YouTube.