Apple Plans To Scan US iPhones for Child Abuse Imagery (ft.com)
Apple intends to install software on American iPhones to scan for child abuse imagery, the Financial Times reports, citing people briefed on the plans; the news is raising alarm among security researchers who warn that it could open the door to surveillance of millions of people's personal devices. From the report: Apple detailed its proposed system -- known as "neuralMatch" -- to some US academics earlier this week, according to two security researchers briefed on the virtual meeting. The plans could be publicised more widely as soon as this week, they said. The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified. The scheme will initially roll out only in the US.
The proposals are Apple's attempt to find a compromise between its own promise to protect customers' privacy and ongoing demands from governments, law enforcement agencies and child safety campaigners for more assistance in criminal investigations, including terrorism and child pornography. [...] "This will break the dam -- governments will demand it from everyone," said Matthew Green, a security professor at Johns Hopkins University, who is believed to be the first researcher to post a tweet about the issue. Alec Muffett, a security researcher and privacy campaigner who formerly worked at Facebook and Deliveroo, said Apple's move was "tectonic" and a "huge and regressive step for individual privacy. Apple are walking back privacy to enable 1984," he said.
Don't scan my phone. (Score:4, Insightful)
Enforce the 4th and 5th! Down with the surveillance state!
Re:Don't scan my phone. (Score:5, Insightful)
Sadly, this sort of shit is enabled by all the people who constantly blab about how we need to be protected from everything. There's always been a little of that in the world, but since 9/11 we've seen it ramp up into the stratosphere and those of us that prefer privacy and freedom are being blocked out by people begging to have the government and the businesses that sponsor the government put safety, security and control above everything else.
I believe we are beginning to exit the security theater stage of the slow creep towards complete oppression, and entering the actual oppression stage. The problem is, so many people are so worried about safety and security they'll never accept that there are legitimate reasons to be opposed to this type of thing.
Mark my words, it's child porn today, it'll be filtering through your personal notes for wrong-think tomorrow.
Re:Don't scan my phone. (Score:5, Insightful)
I believe we are beginning to exit the security theater stage of the slow creep towards complete oppression, and entering the actual oppression stage. The problem is, so many people are so worried about safety and security they'll never accept that there are legitimate reasons to be opposed to this type of thing.
It's not surprising. It's been 20 years next month since 9/11, so we've got an entire generation now coming into adulthood that grew up entirely in the surveillance age.
Re: (Score:3)
Yes, and another decade worth of people who were too young to notice any difference when it changed entering their thirties.
Re:Don't scan my phone. (Score:4, Interesting)
Just child porn? Or also terrorism? According to the Russians, running a newspaper that does not say what the government wants it to say is tantamount to terrorism. According to Israelis, not selling ice cream to people in illegal settlements is terrorism (I'm not kidding, they said that). According to the Chinese, simply being Uygur is pretty much enough to be treated like a terrorist.
So yes, I'm inclined to agree this is disturbingly surveillance-state. It's a little unusual that Apple went from shouting about being the defenders of privacy to saying they are now going to stop and search every single one of their customers' phones. Wow, complete opposites in such a short period of time.
Re: (Score:3)
Re: (Score:2)
I'm sure hoping most people are with me on this.
They're not.
Otherwise, we will see a massive drop in the number of iPhones in use in the US.
We won't. Apple knows this.
Apathy will welcome 1984 just as quickly as anything else.
I'm mostly with you (Score:2)
Re: (Score:3)
I hope Apple has researched what its buyers think of this policy because it's completely contrary to the "privacy conscious image" it's trying to create.
I don't care about little children getting raped if this means my phone is being scanned all the time. I reject and revolt against a totalitarian information dictatorship being introduced by any company or government.
Re:I'm mostly with you (Score:5, Insightful)
That distinction is important. You actually have a say in your government.
This is completely backward. I have no say on government surveillance policy because that is not on the ballot, and no political party is clearly "pro-surveillance", so who do I vote against?
But if the policy comes from Apple, I can opt-out by not buying their products.
Re:I'm mostly with you (Score:5, Insightful)
I can opt-out by not buying their products.
... until some idiots in the government force other vendors to do the same thing, after it's been normalized by Apple.
Re:I'm mostly with you (Score:5, Interesting)
When it gets to the point where there's an automatic funnel from the device direct to law enforcement, then they kind of do functionally become part of the government. Think of it this way. The police want to search your house, but they have no warrant and the 4th amendment doesn't allow them to just break in, should they be allowed to pay an "informant" who happens to be a burglar, to break into your house and search it for them? Then let the burglar off the hook for burglary and be protected by qualified immunity and still get to present the evidence obtained at trial? Honestly, I'm not really sure they couldn't get away with exactly that, but any sensible consideration of that arrangement should conclude that the burglar is acting as an agent of the police, so the 4th amendment should apply.
Same here. If Apple is cooperating with police to perform searches of people's files, then they are acting as agents of the police.
Re: (Score:3)
Anonymous tips are already used to get search warrants. Locally it turned out that one of the regular "tipsters" was a cop's girlfriend reading off a script he had prepared. They got caught because they were stupid enough to use her regular personal phone, not all of them are that dumb so I'm sure that it happens fairly frequently.
Re:I'm mostly with you (Score:4, Informative)
Oh sure, a lot of parallel construction and laundering of improperly obtained evidence, etc. surely goes on. I'm really thinking more of a situation where, for example, the Pinkerton private detective agency, under contract from the police, forcibly performs "inspections" of everyone's home. Since they're private, the 4th amendment doesn't apply; since the police won't arrest them for breaking and entering or assault, there's no real recourse against them (at least none that won't get you shot by the police); and the police have qualified immunity, so they can't be sued either. That's an extreme situation, but I would say it's a pretty good parallel to the police having an arrangement with a computer company to rifle through your personal files for them.
Re:I'm mostly with you (Score:4, Insightful)
should they be allowed to pay an "informant" who happens to be a burglar, to break into your house and search it for them?
You ask this as if it's not already common.
Though they usually just have the informant lie about you instead of bothering to break in.
Re:I'm mostly with you (Score:5, Insightful)
Apple is 100% Democrat.
What a load of nonsense. Like all corporations, Apple is whatever is convenient at the time.
Re:I'm mostly with you (Score:5, Informative)
Give me a break
Re: (Score:2)
I don't care what justification or how evil the thing you're scanning for is. It's my damn phone. Leave me f***ing alone. I'd rather criminals not get caught than be spied on 24/7. I'm sure hoping most people are with me on this. Get government out of my computers!
Enforce the 4th and 5th! Down with the surveillance state!
Apple isn't a/the government -- at least, not yet anyway. The 4/5th amendments don't apply to them.
If you don't like the policies of a *company* don't buy/use their products.
Re:Don't scan my phone. (Score:4, Insightful)
When has an iPhone ever been yours? They have always been Apple's property, they are the ones with the keys, the ability to choose what software it runs.
Re: (Score:3)
When has an iPhone ever been yours? They have always been Apple's property, they are the ones with the keys, the ability to choose what software it runs.
Exactly, it's why it's called "your Apple iPhone", to remind you that you never actually owned the phone, you're just using it with master Steve's permission (Steve bless his soul).
And that permission can be taken away, what the Steve giveth, the Steve taketh away.
So for all the years Apple Fanboys have crowed that Apple doesn't do this. Apple doesn't monitor them, Apple protects them, here's a giant mug of "I fucking told you so" because I said Apple were spying as much as Google, if not more, but wer
Re: (Score:2)
This illustrates two important things: First is the leftist line "Just do this one little thing. It's no big deal." This can be applied to every erosion of our rights. Before you know it, you don't have any rights left. Second, the government knows damn well it can't do this or suppress speech or affect gun control or any other infringement of our rights so it enlists loyal private industry to do it for them.
Re: (Score:3)
The mask mandate...
Lots of talks about a Vaccine mandate...
Are you telling me that you are anti-plague but pro-child pornography?
And yes, questioning this will be equated as being pro-child pornography just like questioning a single vaccine makes you an anti-vaxxer.
And using the term retard is very telling of you
Re:Don't scan my phone. (Score:5, Insightful)
You mean the same Republicans who will chant, "My body! My choice!" when it comes to covid, but will use the power of big government to deny women the same choice when it comes to their body? The same Republicans who say a pregnant girl (under the age of 18) can't make adult decisions so they'll force her to have a child even if she's been raped? The same Republicans who say a girl (under the age of 18) who has been raped, must allow visitation rights to the rapist [churchandstate.org.uk]?
Not sure when Republicans were so concerned about privacy when they have literally said the right to privacy does not exist in the Constitution [redstate.com].
That's not an effective argument (Score:3, Insightful)
A much more effective argument for keeping abortion legal is to discuss what would happen if we criminalized it. As Donald Trump said "There has to be some kind of punishment".
I don't bring that up to beat on Trump (not that I'm a fan) but it's the one thing in his political career he ever walked back. And he walked it back fast.
When abortion is a crime, women who miscarry will be subject to crim
Re: (Score:2)
Wait, that's not a very good corollary, is it?
I think that's because Robert Conquest's First Law of Politics is shit.
Re: (Score:2)
Knowing how left is advocating for child fucking under umbrella of sexual liberation
Do you have any sources for this outside of a basement of a pizza parlor?
Bs (Score:3, Insightful)
Re:Bs (Score:5, Insightful)
There is no way Apple or any company would do this stupid a move. This is just a losing proposition with tons of issues. I don't believe this rumor for a second.
Gosh, it's almost as if I've heard this before.
Right before they removed the headphone jack...and standardized ports...and removable memory...
Re:Bs (Score:5, Insightful)
Re: Bs (Score:3, Interesting)
I certainly believe the US government would want this. Agencies like the FBI are all about nailing the bad guys, collateral damage be damned. They're also not above creating bad guys, if they need to run up their numbers.
Re: (Score:2)
The problem with building devices that have a reputation for being the most secure in the industry is that every criminal uses them. And Apple has to pay for every legal fight where they are being asked to unlock a phone as part of an investigation. Maybe they've weighed those costs against the loss of customers who still give a shit about their privacy? I don't know. But I agree with you, this is likely full-on BS.
Re: (Score:2)
Re: (Score:3)
How are you going to determine a false positive?
If anyone saved pictures from dating apps, hookup apps, *-gone-wild subreddits, sexts, or anything else - and unlike Apple, we're not going to pretend that humans aren't sexual and don't save sexy pics, girls included; they are sexual, and they DO save sexy pics on their phones - then any of those pics could get flagged by some AI/ML algorithm.
It seems safe to presume that your privacy will then be violated first by those Apple is having check this stuff (and
Re: (Score:3)
I still remember the family who posted photos of their two-year-old playing naked in the lawn sprinkler on their MySpace page and were charged with child pornography.
Re: (Score:2)
I highly doubt Apple is seriously entertaining this idea.
Re: (Score:3)
They really needed Steve Jobs to tell them "No, you stupid fuckers, you're not doing this, and that's it".
So many bad choices since he died.
Re: (Score:3)
There is no way Apple or any company would do this stupid a move. This is just a losing proposition with tons of issues. I don’t believe this rumor for a second.
Oh. [apple.com]
Orwell (Score:5, Insightful)
This is, of course, only the ("noncontroversial") start. Once this is accepted to detect possible child abuse, the next phase will be something less horrific, and so on down the slope until a picture of you with the wrong person becomes a reportable offense (where you even get to incriminate yourself by taking said photo).
Once again, 1984 was supposed to be a warning and not an instruction manual.
There will be no curiosity, no enjoyment of the process of life. All competing pleasures will be destroyed. But always— do not forget this, Winston— always there will be the intoxication of power, constantly increasing and constantly growing subtler. Always, at every moment, there will be the thrill of victory, the sensation of trampling on an enemy who is helpless.
If you want a picture of the future, imagine a boot stamping on a human face— forever.”
Re: (Score:2)
Once again, 1984 was supposed to be a warning and not an instruction manual.
The irony is palpable.
https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
If you want a picture of the future, imagine a boot stamping on a human face— forever.”
Pretty sure Apple will report you to the authorities for having that picture on your iPhone.
Re: (Score:3, Informative)
Vaccination records have been required for attending public school and traveling to foreign countries for over a hundred years.
George Washington required his troops to be inoculated for smallpox during the revolution over two hundred years ago.
Requiring vaccinations has been saving lives for hundreds of years.
But here you are, ignorant of all of history, claiming that COVID vaccinations being required would be a bad thing.
The only standard playbook in use here is the Idiots' Guide to Believing Conspiracy Th
Gotta love companies using child abuse as excuse (Score:2)
Re: (Score:2)
Yeah, this. Google's AI alerts me that it sees people when a bug crosses across my Nest camera's lens. How can Apple's AI be nuanced enough to identify pornography from ordinary nudity?
Re: (Score:3)
Yeah, this. Google's AI alerts me that it sees people when a bug crosses across my Nest camera's lens. How can Apple's AI be nuanced enough to identify pornography from ordinary nudity?
Ring Ring... "Excuse me ma'am, I'm looking at naked photos of your daughter on your husband's phone, I just wanted to check and verify what age she is... oh you don't have a daughter, well does he have a girlfriend? How old is she?"
Seriously? (Score:5, Insightful)
How is this supposed to work? How does it identify, say, a nude child compared to a nude adult in an image? Breast size? Skin complexion? Body hair? Exactly what metrics can you know a minor from an adult in a nude photo? Surely AI is not competent enough to be able to tell the difference in a photo without any other context.
My mom likes to bring out photos (and slides - yeesh) of me and my brother when we were kids, and that of course includes the usual taking baths and stuff. "Oh, isn't his naked little butt cute standing beside the tub." Yeah, thanks for embarrassing me, mom. That isn't child porn. So are you telling me that a person at Apple may be looking at pictures of my 5-year-old child that I took while they were in the bath, to see if they are porn?
This reeks of privacy issues, technology issues, moral issues, on and on. Please tell me this isn't actually going to make it out of theory or research.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
It could be that it just compares a checksum of your photos against a list of known images. But yeah, if it finds something, it sounds like a human could look at it - which really sucks.
Re:Seriously? (Score:5, Informative)
It'll use Microsoft's PhotoDNA algorithm which creates a hash that isn't impacted by basic editing - such as scaling, cropping, flipping or minor colour alterations. The hash is then compared to a list provided by organisations such as the IWF, NCMEC and others.
If you've ever uploaded an image to a service owned by Facebook, Google, Twitter or Microsoft then it'll have been run through this check.
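For anyone curious how a hash can "survive" basic edits when a normal checksum can't: PhotoDNA itself is proprietary, but the general idea can be sketched with a toy "average hash". This is purely illustrative, not Microsoft's or Apple's actual algorithm; the image, sizes, and threshold logic below are all made up for the demo. The point is that matching is done by bit distance against a known-image list, not exact equality.

```python
# Toy perceptual hash: downscale to an 8x8 grid, then set one bit per
# cell (1 = brighter than the image's mean). Minor global edits leave
# the bright/dark structure intact, so the hash barely changes.
def average_hash(pixels, size=8):
    """pixels: 2D list of grayscale ints 0-255; returns a 64-bit int."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // size, w // size
    cells = []
    for r in range(size):
        for c in range(size):
            block = [pixels[r * bh + i][c * bw + j]
                     for i in range(bh) for j in range(bw)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    bits = 0
    for v in cells:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Fake 16x16 "image": a bright square on a dark background.
img = [[200 if 4 <= r < 12 and 4 <= c < 12 else 20
        for c in range(16)] for r in range(16)]
# Mild global brightening, as a re-encode or colour tweak might cause.
brighter = [[min(255, p + 10) for p in row] for row in img]

h1, h2 = average_hash(img), average_hash(brighter)
# A known-bad list would be matched by checking Hamming distance
# against a small threshold rather than requiring exact equality.
print(hamming(h1, h2))  # 0: the edit didn't change the bright/dark layout
```

A cryptographic checksum of the same two images would differ in roughly half its bits; that is the whole reason services use perceptual hashes for this kind of matching.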
Re:Seriously? (Score:4, Informative)
laws aren't always rational either.
In this area they frequently aren't. Three cases I remember reading about over the years show how much so:
a) Drunk guy pees on a street corner, completely wasted. A couple crossing the street tens of feet away take a photo. He's prosecuted and gets registered as a sex offender for life.
b) Police finds a woman's own nudes, from back when she was a teen, on her own phone. She's prosecuted for carrying child porn.
c) The man in a (former) teen relationship becomes 18yo while his gf is still going to remain a 17yo for a few months, both having been together since they were 15yo. They text nudes as usual. The guy is prosecuted for pedophilia.
And so on and so forth.
So what did they feed the AI? (Score:2)
I thought Apple touted themselves as caring about your privacy and security? What happened iCultists? Apple is looking at your nudes now? Not only that, but if it's using an AI to do it, wtf did they feed it to be able to recognize child abuse images and where did they get it?
What the actual f**k?
Re:So what did they feed the AI? (Score:5, Informative)
wtf did they feed it to be able to recognize child abuse images
There is a shared DB of known child abuse images that is maintained with Government support and used to train recognition engines (and also to recognize specific images when they crop up). The name of it escapes me for the moment, but for example it's what Facebook uses.
what? (Score:2)
well that's outright insane.
Good thing... (Score:2)
Right?
illegal imagery verified (Score:2)
"The automated system would proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified."
The problem here starts with the concept of "illegal imagery". There are illegal acts, and possibly imagery that documents those acts, but that does not mean that the imagery itself is illegal, nor can an AI or Apple staff make that determination.
The problem then proceeds to "verified". No Apple employee can verify whether ima
Re: (Score:3)
Lots of 18+ yro women look much younger. If they choose to take intimate pictures, that's entirely their business and those photos are not there for the prurient pleasure of "a team of human reviewers"
get a good lawyer; the chain of custody & discovery (Score:2)
Get a good lawyer; the chain of custody & discovery issues will kill any criminal court case.
Also, in a criminal court case, with discovery you can demand the source code and all logs.
place the human reviewers on the stand
place the human coders on the stand
Re: (Score:3)
Ever watched the process of a child porn case of someone who wasn't rich? Long before the court date the accused has lost their job, probably most of their family, and most likely their home since they're now unemployable, and any standing they might have had in their community. Getting off on a technicality like chain of custody is immaterial, by that point they've already lost everything. For that matter they'll probably have to rely on a public defender since they can no longer afford an experienced a
Similes. (Score:4, Funny)
Apple are walking back privacy to enable 1984, he said.
One of these days we'll find someone who's actually read the book.
Re: (Score:2)
Note what Winston was fighting against. [britannica.com]
Oh Great (Score:2)
The first mommy with baby pics will now be criminalized.
Re:Oh Great (Score:4, Interesting)
Good point. In a lot of countries, it would never occur to anyone that a naked baby picture might be considered kiddie-porn. When I was a kid (In the US), the high school year book would always have a few pictures of graduating seniors as little babies. Some of them were nude. It wasn't a big deal.
Now, having a picture of your own kid can, for all practical purposes, end your life.
China (Score:3)
Apple: "We follow the laws and regulations of the countries in which we operate."
I assume this will soon include scanning all devices in China for "anti-Xi Jinping" (or "fill in the blank") material and reporting violations to the local police. I'm going to guess China will be able to figure out a way to make this a law/regulation.
The good old days of getting people to rat out their neighbors might be gone. This seems far more efficient.
Line in sand is here .... (Score:2)
Cellphones are a massive invasion of personal privacy just by carrying them around, powered-on. You're effectively reporting your whereabouts at all times to the telco, who can collect that data and do what they wish with it.
But clearly, most of us feel like the trade-offs are worth still using one. (It's not like you can't just leave it someplace so your "trail" is lost, should you actually become that concerned about it. And most likely, our daily travels just aren't that exciting or anything we're TOO
No doubt Apple will allow people to opt out (Score:3)
But then everyone on the opt-out list will be put on a child porn watch list.
And no one will be able to remove the software in question from their iPhones because of the nature of Apple's cryptographically-enforced walled garden. The walled garden keeps you safe, remember?
We've been here before (Score:5, Insightful)
Gmail has been scanning user emails and reporting child porn to the authorities, per US law, for some time.
https://money.cnn.com/2014/08/... [cnn.com]
FT is a venerable institution, but this report is hard to believe. Scanning one's phone seems to go too far, and Apple has been the most resistant of all companies against such invasions. I hope they substantially deny this report soon.
So many problems with this (Score:2)
Slippery slope: Once we've accepted scanning to "protect the children", what about other crimes like terrorism, calling for insurrection, and copyright violations of Apple-owned content?
Then when you get down to it, how is a phone different from a personal computer - the same argument should let the go
Re: (Score:3)
Re: (Score:3)
Please sign the petition against this! (Score:2)
http://chng.it/4wFSfVgPgL [chng.it]
Re: Please sign the petition against this! (Score:2)
c'mon people, please sign the petition! let's make some noise! (change.org)
Placing them selves at risk. (Score:4, Insightful)
What is the solution? (Score:2)
OK, everyone here agrees that this is a bad approach to combatting child exploitation and porn. Is there any approach to combatting child exploitation through these digital devices that is acceptable?
Re: (Score:2)
Re: (Score:2)
No I wouldn't. And I wasn't saying that Apple's approach is ok. I am asking if anyone has an approach that is acceptable.
I don't own Apple products (Score:2)
Setting up the next 2014 (Score:2)
The automated system would proactively alert a team of human reviewers
The automated system would proactively alert a team of photo leakers.
Fixed that for you.
Wow (Score:2)
What a terrible idea (Score:2)
Seriously, this is a terrible idea. First, phones contain a lot of private, personal information. The very idea that a faceless company and its underpaid employees are actively scanning through your phone should be abhorrent.
Second, who defines "child porn"? A pic of the grandkids in the tub? What about little kids running around topless through a sprinkler in your back yard? How many people will be proving their innocence, and to whom?
Finally, they justify this with kiddie porn, just
Re: (Score:2)
Second, who defines "child porn"? A pic of the grandkids in the tub? What about little kids running around topless through a sprinkler in your back yard? How many people will be proving their innocence, and to whom?
There are laws that define if a picture of a child can be considered pornography or not. Your examples are not considered so unless there is emphasis on the chest or genital area (and the only exceptions to that are if the media is for educational, medical, or criminal investigation purposes), o
guess (Score:2)
Let me guess...
- Apple will roll this out
- They won't respond to any reporters' questions about this
- You can't turn it off
- They won't publish documentation or commitments about how it works (and what it doesn't do) other than a vague advertisement
- This will have full access to the personal lives of more humans than were alive on Earth in 1900
Comment removed (Score:4, Informative)
Re: (Score:3)
https://archive.is/0fEvH [archive.is]
Thanks - I thought I would find this there (Score:3)
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse.
Uploading hashes that will only match specific images is troubling, but at least the chance of a false positive should be very low.
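The lookup side of such a scheme is simple to sketch. Assuming, for simplicity, exact cryptographic hashes (reports suggest Apple's system uses a perceptual hash that tolerates edits, which this toy does not), the device-side check is just a digest lookup against a supplied database; the byte strings and function names below are invented for illustration.

```python
# Minimal sketch of hash-list matching with exact digests. A single
# changed byte yields a completely different SHA-256, which is why
# exact matching has a negligible false-positive rate (but also why
# real systems use edit-tolerant perceptual hashes instead).
import hashlib

def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of known-image digests (in practice supplied
# by a clearinghouse; the device never sees the images themselves).
known_hashes = {sha256_hex(b"known-bad-image-bytes")}

def flag_for_review(photo: bytes) -> bool:
    """True if the photo's digest appears in the known-hash database."""
    return sha256_hex(photo) in known_hashes

print(flag_for_review(b"known-bad-image-bytes"))   # True: exact match
print(flag_for_review(b"known-bad-image-byteS"))   # False: one byte differs
```

Note the trade-off this illustrates: exact hashing only ever matches bit-identical files, while the perceptual hashing described in the excerpt widens the net to edited copies at the cost of a nonzero false-positive rate.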
Apple are control freaks, b
Profit! (Score:2)
2. It gets flagged for a "human review."
3. Coworker reports reviewer for viewing and having possession of child porn.
4. Teenager sues Apple for unauthorized use of their image.
5. ????
6. PROFIT!
This is so ripe for abuse... (Score:2)
If this is actually true (paywalled article, so who knows what facts they are going on), yes, even though Apple is a private company where the Fourth Amendment doesn't apply, and "for the children!" is a root password, what keeps this from turning into something that checks for IP violations and turns someone in to the RIAA because they have an MP3 file that came from a torrent?
What happens if they are not on the "right" side of politics, and a stored mail trips a flag which gets sent and caused the person to
I honestly have no problem with this, in theory... (Score:2)
But A) software has bugs. False positives are inevitable. This reason alone is enough for me to be opposed to it, and B) while it is alleged to only be used to detect child abuse today, we do not know if that is all it is really detecting. Further, even if that were the case for today, the technology to phone home when ina
Wut? (Score:3, Insightful)
Re: (Score:3, Informative)
Animal rescue groups love to put language like this in their adoption contracts. You have to agree to give them access to your home at any time they demand it, so they can ensure it is still safe for your adopted pet, for the entire life of the animal rather than just as a pre-adoption inspection. If you refuse, they can reclaim the animal from you. I remember seeing language like that from one rescue that specialized in a species of bird that routinely lives 30+ years in captivity. Every single person who
Take a lil bit (Score:2)
Apple is hiring (Score:2)
great way to frame someone (Score:4, Interesting)
say you're a whistleblower and NSO group's customer doesn't like you.
uses their tool to plant child porn somewhere that you'll never look at yourself.
Apple happens to 'catch' you a short time later - getting around the whole ''needing a warrant" thing.
If they're up for chopping reporters into small enough pieces to fit in cake boxes, or kidnap them or poison them - you can bet they'll have no qualms about planting child porn on their phones either.
Re: (Score:2, Interesting)
Re:oh noes! (Score:5, Funny)
But... think of teh children!
No! Not in that way!
Re: (Score:2, Interesting)
it's time to replace my kid's iPhone with something non-Apple.
If Apple really does this, it's not only my kid's iPhone that will be replaced.
Re: (Score:3)
> If Apple really does this
They might not be doing this specifically yet, but the fact that they consider it at all means it is already time to replace.
Re: oh noes! (Score:3)
"This smells of someone high up at Apple who had a child that was a victim and they want to fight it. I can't see how this is even remotely legal"
More like Apple has been getting the stink eye from the feds lately and wants to do something to smooth out their relations with them.
Re: (Score:2)
Instead, they should be checking for evidence of anti-semitism, white supremacy, racism, homophobia, gun possession, toxic masculinity, that people have their masks on, are pro-vaxx, and that they are staunchly pro-Israel. That would be an ideal use of this technology and I think that's something that we can all get behind as liberals.
That's version 1.1
Re: (Score:3)
"The intent here is a noble one, but at the same time I can't help but wonder about the pandora's box we might be opening here."
Aww you still think this way. How cute. Let me make a few adjustments here.
"The intent here is to introduce a trojan horse cloaked in a noble intention. Once Apple gets that horse through the gate, all hell will break loose and you will always be under the microscope. "For the children" was just to get this past the defenses. The real intention is to build dossiers on ev