Apple Executive Defends Tools To Fight Child Porn, Acknowledges Privacy Backlash (wsj.com) 145
A senior Apple executive defended the company's new software to fight child pornography after the plans raised concerns about an erosion of privacy on the iPhone, revealing greater detail about safeguards to protect from abuse. From a report: Craig Federighi, Apple's senior vice president of software engineering, in an interview emphasized that the new system will be auditable. He conceded that the tech giant stumbled in last week's unveiling of two new tools. One is aimed at identifying known sexually explicit images of children stored in the company's cloud storage service and the second will allow parents to better monitor what images are being shared with and by their children through text messages. "It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood," Mr. Federighi said. "We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing."
The Cupertino, Calif., iPhone maker has built a reputation for defending user privacy and the company has framed the new tools as a way to continue that effort while also protecting children. Apple and other tech companies have faced pressure from governments around the world to provide better access to user data to root out illegal child pornography. While Apple's new efforts have drawn praise from some, the company has also received criticism. An executive at Facebook's WhatsApp messaging service and others, including Edward Snowden, have called Apple's approach bad for privacy. The overarching concern is whether Apple can use software that identifies illegal material without the system being taken advantage of by others, such as governments, pushing for more private information -- a suggestion Apple strongly denies and Mr. Federighi said will be protected against by "multiple levels of auditability." "We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world," Mr. Federighi said.
known sexually explicit images of children? (Score:2, Interesting)
One is aimed at identifying known sexually explicit images of children stored in the company's cloud storage service
So is Apple storing sexually explicit images of children to compare against?
Re: (Score:3, Informative)
No, that would be super illegal.
What they're doing is getting a hash of the images, encrypting those, and putting a database of those encrypted hashes on everybody's phone to use for comparison. The hashes are provided by an NGO that has permission from the US govt to possess the child abuse images for tracking and helping exploited children.
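That on-device comparison can be sketched in a few lines. To be clear about assumptions: this toy uses a plain SHA-256 digest set, whereas Apple's actual system uses a perceptual NeuralHash plus a private-set-intersection protocol, and `known_hashes` here is a made-up stand-in for the NGO-provided database.

```python
import hashlib

# Toy sketch of on-device matching: the phone holds only opaque digests,
# never the original images, and checks each photo's digest for membership.
known_hashes = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),  # stand-in entry
}

def matches_known(image_bytes: bytes) -> bool:
    """True if this image's digest appears in the on-device database."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known(b"known-image-bytes"))   # True: exact byte-for-byte match
print(matches_known(b"slightly-different"))  # False: any change alters the digest
```

Note the limitation this sketch makes obvious: a cryptographic digest only matches byte-identical files, which is why a plain hash lookup alone would be trivial to evade.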
Re: (Score:2, Informative)
Re: (Score:3, Interesting)
"Also, they have some magic algorithm that can recognize the same image, even if it has been cropped, resized, stretched etc."
Hashes don't work that way. At best, you could take known CP content, use common photo editing software to crop/resize/stretch those images, and take hashes of all those variations. The only issue is that you would then potentially be dealing with billions of hashes (taking many GB to store) to check against instead of thousands. That said, you could have phones just hash all
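For what it's worth, the "magic algorithm" in question is a perceptual hash rather than a cryptographic one, so no combinatorial explosion of edited variants is needed. A minimal average-hash ("aHash") sketch — a toy stand-in, not Apple's NeuralHash — shows how a hash that fingerprints coarse brightness structure instead of exact bytes can survive resizing. Images here are just grayscale 2D lists of ints.

```python
def ahash(img, size=8):
    """Downscale to size x size by block-averaging, then emit one bit
    per cell: brighter than the overall mean or not."""
    h, w = len(img), len(img[0])
    cells = []
    for r in range(size):
        for c in range(size):
            block = [img[y][x]
                     for y in range(r * h // size, (r + 1) * h // size)
                     for x in range(c * w // size, (c + 1) * w // size)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if v >= mean else 0 for v in cells]

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return sum(x != y for x, y in zip(a, b))

# A 16x16 brightness gradient and a 32x32 "resized" version of the same scene.
small = [[(x + y) * 8 for x in range(16)] for y in range(16)]
large = [[(x + y) * 4 for x in range(32)] for y in range(32)]

print(hamming(ahash(small), ahash(large)))  # 0: fingerprints agree despite resizing
```

The flip side, of course, is that any two images sharing the same coarse brightness structure collide, which is exactly where false positives come from.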
Re:known sexually explicit images of children? (Score:5, Insightful)
And since there's no way to know what the hashes are of, the government can provide hashes of any photo of interest, not limited to CP.
Re: (Score:2)
In that scenario, an Apple employee will review the image, see that it's not CSAM, and not report it to NCMEC.
(Also note that if you store your photos in iCloud, they are not encrypted, so it would make oodles more sense for a nefarious gov't to simply scan your iCloud.)
Re: (Score:2)
Why would Apple employees be allowed to possess/review CP imagery? My understanding is that if you find it by accident, you report it - but intentional possession or viewing is illegal - even if you are just doing your job with material that you don't want to see.
Re: (Score:2)
"Why would Apple employees be allowed to possess/review CP imagery?"
They possess the images because people effectively save it to their servers. They review it to weed out false positives.
Re: (Score:2)
Side note, they would still have to respond to a FISA warrant (or equivalent) about whether any such matches exist whether they are reported or not. So the human review only catches one side of it.
Re: (Score:2)
Side note, they would still have to respond to a FISA warrant (or equivalent) about whether any such matches exist whether they are reported or not. So the human review only catches one side of it.
One fun thing about this discussion is just how convoluted and hackneyed the conspiracy would have to be to shoehorn a CSAM check into a nefarious plot by the government. If we're worried about the gov't subverting the NCMEC CSAM database and then getting a FISA warrant to see if anyone got flagged, then surely they would take the much simpler path of subpoenaing (or snooping on) your iCloud photos directly, which are unencrypted anyway.
Re: (Score:2)
And since there's no way to know what the hashes are of, the government can provide hashes of any photo of interest, not limited to CP.
Indeed. And with a small bit of trickery any hash of any other file too.
Re:known sexually explicit images of children? (Score:5, Interesting)
And it's an NGO, so there's no open government process, auditing, or accountability; the secret organization can easily be used to selectively ruin the lives of anyone the powerful don't like, or to exclude anyone they do like from any action.
Re: (Score:2)
AI Logarithm 1. Find sexually explicit images. Percentage of Skin Tone on tagged Human bodies, position of said human bodies.
AI Logarithm 2. Find images of children. Height compared to other objects, General Body Shape and dimensions.
You then cross-reference both and you have the database of explicit images of children, which doesn't need a database of offending images to be stored.
Re: (Score:3)
Re: (Score:3)
Step 3: Arrest all parents taking photos of their toddlers :P
Re: (Score:2)
What nonsense, you need databases of offending images to do one and two.
As an aside, given the way our police have been known to work, perhaps some paid child rapists and molesters supply them too: agents on the inside getting deep into the world of kiddie porn and developing twisted predilections, and kid agents.
Re: (Score:2)
For some reason I want to add your two logarithms up and use the sum as an exponent, but I'm unable to comprehend what it would mean to multiply those sets of images. If I multiply a sexually explicit image by a child image, the result is always kiddie porn, right? So this would generate a vast array of kiddie porn, which seems the opposite of the problem I was trying to solve.
Guilty as charged
Re: (Score:2)
Sorry, never mind, now I see it. The products would be images squared, i.e. kiddie-porn tesseracts. Presumably the weird geometry makes it legal, since if a perv tries to fap to it, they'll fall down some kind of endless Escherian staircase.
And how does he defend the false positives? (Score:5, Insightful)
That's not even the worst of it.
If they do not take a stand against this issue now, then what comes next? Today it is child abuse; what if it is later used for something else, say some politically unpopular opinion?
Never say never. Historically, technology gets abused, regardless of the intentions of its creators. They need to nip this in the bud now and create a precedent for how to deal with similar ideas in the future.
In theory I have no problem with what they are wanting to use this technology for, but the realities of the world in which we live convince me that going down this road is a colossal mistake.
Re: (Score:2)
That's why they aren't debuting this feature in China, but how long will that take to occur? The most common way of sending child porn is actually Facebook Messenger. End-to-end encryption will prevent this from being caught, but the tradeoff is lack of privacy.
Re:And how does he defend the false positives? (Score:4, Insightful)
Re: And how does he defend the false positives? (Score:2)
"How can Apple sit there with the technology to stop the capitol insurrectionist terrorists / black lives matter rioter terrorists? They are complicit! Scan all files for politically hostile memes immediately"
t. this forum 2024, with something more topical to that year inserted instead of today's bogeymen
Re:And how does he defend the false positives? (Score:5, Insightful)
oh, THEY MADE IT CLEAR.
oh. that is more than enough to satisfy us.
afterall, without looking at source code, we tend to just trust the ultra mega huge corps that basically have government sweet-deals and can play with rules that the rest of us dont get to play by.
sure. trust is just given based on our say-so.
yup. life works exactly like that.
Re: (Score:2)
Well if you are just going to go off the deep end of hysteria with doom crying and it'll all be abused to the fullest extent as an absolute given, then sure you make perfect sense.
Within your own paranoid delusional world, that is.
Re:And how does he defend the false positives? (Score:5, Insightful)
It's hash matching known images.
allow government to expand the capability for other less-innocent purposes.
And they can do this at will, without warning, because to Apple or anyone else it's just an image hash. Only the government knows what the original image is.
Re: (Score:2)
Re: (Score:2)
Which is a privacy hole the size of a large trainyard when the first false positive flag comes rolling around.
All of the images that might possibly be flagged are stored unencrypted in iCloud. That privacy hole size = 1 trillion large trainyards.
Re: (Score:2)
Are you saying that Apple will legally have a large library of child porn and employees authorized to use it?
I doubt it.
I am not saying that.
Re: (Score:3)
They made it pretty clear that the images have to be spot on to trigger the flag. They also made it clear that a single image flagged isn't going to result in any action.
There have been many discussions about this on the web the last week or two, particularly over at MacRumors. This response is pretty popular and misses the big picture entirely. The GP points out why this is a bad idea -- it shows potential for misuse in the future and is a huge potential privacy violation if not today, then perhaps tomorrow. It normalizes spying on users under the "think of the children" banner. Many responses are like yours "but the chance of a false positive is so near zero..." -- which
Re: (Score:2)
You trivialize it in the name of paranoia.
They came right out and clarified that occasional false flags will be essentially ignored. They even gave some anticipated stats on those occurring and they were in the astronomical, approaching heat death of the universe, edge cases. The conditions for getting flagged for review were so high that only a SUBSTANTIAL REGULAR usage of --KNOWN, INDEXED, CLASSIFIED CHILD PORN-- would ever trigger it.
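For context, the safeguard being described is a threshold scheme: a single match (or a handful of false positives) is never escalated. A hedged sketch — the threshold constant and the distinct-photo counting rule here are illustrative assumptions, not published figures from Apple:

```python
MATCH_THRESHOLD = 30  # illustrative stand-in, not a figure from Apple

def should_escalate(matched_photo_ids):
    """Escalate to human review only once enough *distinct* photos match.

    One match, or occasional scattered false positives, stays below the
    threshold and is ignored; only substantial repeated matching crosses it.
    """
    return len(set(matched_photo_ids)) >= MATCH_THRESHOLD

print(should_escalate(["img001"]))                         # False: one match is ignored
print(should_escalate([f"img{i:03}" for i in range(30)]))  # True: threshold crossed
```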
But hey, it's cool to be alarmist and cry about things.
Re: (Score:2)
I really thought the world had moved past "think of the children" as a rallying cry for every stupid thing imaginable.
I was clearly wrong.
As claimed to be written and implemented, this service isn't a "bad thing" in and of itself. Except the extremely minor benefit (collectors of such offensive material will simply disable iPhoto) is massively outweighed by the complete invasion of privacy. Even if Apple doesn't get or see my images, they've set a precedent that they can - on my privately owned device - scan f
This makes it unclear (Score:5, Insightful)
They made it pretty clear that the images have to be spot on to trigger the flag.
Article is paywalled so I'm not sure how they made that "clear".
This article [rentafounder.com] makes it very unclear whether the accuracy will be as high as they say.
The very disturbing thing to me is that the kinds of images that would most likely hit a false positive would be extremely intimate photos from people, which Apple employees would be reviewing manually. You can imagine a picture of someone lying nude in a bed with a particular pose could easily have a tonality and shape match that would trigger a perceptual match against the database of hashes.
I normally side with Apple on a lot of things to help keep users safe. But in this instance they are exactly on the opposite side of being right, and what they are about to unleash on humanity is a travesty.
Re: (Score:2)
Yes. Because we all know that computers, code, and databases can be made perfect, with zero possibility of there ever being a false positive, just by an executive saying so to the press. And that same executive's statement to the press also uninvented the very concept of scope creep.
Seriously... are you new?
Re: (Score:3)
Re: (Score:2)
Yup.
They tried to implement this for something that most people can't or won't argue against "think of the childrenz"
Which sets an incredibly dangerous precedent: we monitor everything we want against whatever hashed data we decide by an algorithm we created - none of which you're able to know anything about. Next up - they're "hash-monitoring" your private messages for whatever political hot topic is in vogue. Hard. Pass.
Ferengi? (Score:2, Offtopic)
Craig Federighi, Apple's senior vice president of software engineering
Surely I'm not the only one that read his last name as "Ferengi"...
Re: (Score:2)
No, you're not.
I'm about to abandon the personal smartphone (Score:5, Insightful)
Not just over this.
Any of you with a recent iphone, say an 8 or later.
Go to Photos.
IN the search bar, type "car" . Or "House." Or whatever.
iphone has already catalogued your photos, in the device itself. That's how it makes "memories" and all this other bullshit.
Imagine when the Government wants Apple to tell them who has pictures of scary black rifles with detachable magazines and pistol grips. Or internal-combustion cars. Or anything else our Enlightened Inbred Leaders decide to be verboten in the coming years.
Fuck you, Apple. You burned the bridge to me with this one.
I think it'll be a tracfone for me from now on.
Re: (Score:2)
Re: (Score:2)
It's an on-device analysis. I just logged into iCloud and looked at my photo library and there isn't a place to search for those things.
Indeed, as I recall one of the complaints about how Apple does it is that every device you have reindexes every photo you have and the databases aren't shared, so if you have a slow old iPad that's connected to iPhoto, it'll take forever and burn a tonne of cycles trying to figure out which of your photos have cats on them.
This is in contrast, of course, to Google Photos wh
Re: (Score:2)
I am wearing an Open Smartwatch and it's interesting. It does most of what I need and I control the code. Open source hardware and software for personal devices might be the answer for people like us.
Re: (Score:2)
Personally, I think knowing who might own scary black rifles with detachable magazines and pistol grips isn't a bad thing. Particularly useful if there's a domestic abuse incident & the police are called.
Spoken like a true authoritarian. Want to know who owns what. How STASI-like of you.
Re: (Score:2)
Re: (Score:2)
I don't think it's STASI-like to want to know who has military weaponry that only has one purpose
Which is fine. But outside of the bizarro world in which we currently live, elected representatives would make law to accomplish that sort of thing, according to the will of the majority. Using some back room scheme cooked up by a monopolistic corporation and the government is not the answer to problems like this.
Re: (Score:2)
I don't think it's STASI-like to want to know who has military weaponry that only has one purpose
Which is fine. But outside of the bizarro world in which we currently live, elected representatives would make law to accomplish that sort of thing, according to the will of the majority. Using some back room scheme cooked up by a monopolistic corporation and the government is not the answer to problems like this.
I guess 'Murica loves its guns more than the right to life.
Re: (Score:3)
By the way, Mr. Nosey, that black rifle with the pistol grip and detachable magazine -- which you hysterically call "military weaponry" -- is a complete wimp, barely enough for coyote, let alone deer, elk, moose or soldier. .223 is a really wimpy round outside of its comfort zone -- out to about 300 yards at best.
That AR is not the same AR the military gets. Theirs has the giggle switch / 3-round burst (depending on vintage.) We don't get that. Stop buying into the pandering that the AR is a WeaPoN oF WaR.
Re: (Score:2)
Re: (Score:2, Informative)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Here's what you wrote:
I'm sure some have rendered their rifle inop for whatever reason, making them paperweights, sure. But the answer is lots of things! Ranchers use them to get rid of gophers, prairie dogs, and other varmints that wreak havoc with livestock (lots of rifles are sold in "varmint" configuration). I use mine in marksmanship competitions like 3-gun [nssf.org]. Lots of fun and very friendly people if you get a chance to go to one. Hope that helps!
Here's what people from outside the USA envisage: https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
Re: (Score:2)
I don't own a single black rifle with detachable magazine and pistol grip.
I much prefer wood and steel over aluminum and plastic. I much prefer op-rods over direct impingement. If you don't know what any of this means, then stay the hell out of the guns talk.
But you really rose to the challenge when I used that language, eh? I guess you're one of the ones terrified of the black rifle with detachable magazine and pistol grip.
I deliberately chose the language the Left uses to demonize the AR.
Re: (Score:2)
I don't own a single black rifle with detachable magazine and pistol grip.
I much prefer wood and steel over aluminum and plastic. I much prefer op-rods over direct impingement. If you don't know what any of this means, then stay the hell out of the guns talk.
But you really rose to the challenge when I used that language, eh? I guess you're one of the ones terrified of the black rifle with detachable magazine and pistol grip.
I deliberately chose the language the Left uses to demonize the AR.
OK, who reckons this guy could be the USA's next mass-shooter?
Re: (Score:2)
OK, who reckons this guy could be the USA's next mass-shooter?
Better send your resume to Apple, you seem to be into pre-crime. You'll fit right in.
Re: (Score:2)
God damn it, 20 years on /. and I still fail to use the preview button.
Let me re-quote your screed, so I may properly reply:
OK, who reckons this guy could be the USA's next mass-shooter?
Better send your resume to Apple, you seem to be into pre-crime.
You'll fit right in.
Re: (Score:2)
Re: I'm about to abandon the personal smartphone (Score:2)
It is not a question of if but when will it be abused.
While we still don't have documents regarding Saudi involvement in 9/11, just imagine, after another 9/11-type event, the pressure Apple would face from the govt to "find us the terrorists."
Most of us would not blame apple if they succumbed to that pressure but later we would whinge about precedents and loss of privacy
Re: (Score:2)
Sorry, I don't buy into the 'everything evil absolutely will happen no matter what" mindset. Convicting before anything even happens.
Re: (Score:2)
Not convicting, not even saying apple would succumb to the pressure but will bad shit happen and will the govt push!!! Yes it will and yes they will. And if the past is any indicator the business have no choice but to comply.
Look at the FISA court and the complicity of the providers and vendors post-9/11.
If you don’t know what i am talking about then please forgive me, i am just old and maybe misremembering things.
Re:I'm about to abandon the personal smartphone (Score:5, Insightful)
So this is the particular "but it might be abused one day" hill you choose to give up on and renounce it all?
You don't find the ability of a company to (despite all their assurances of propriety) dip into your own personal phone and report on its contents troubling? Yeah, they say iCloud only today, but as I already illustrated, the phone is cataloguing your shots internally, without iCloud, and knows which pictures feature common objects such as houses, bikes, cars
You don't see this as a possibility?:
"Yes, hello Police, this is Apple. There's a phone at such and such address you may want to look at, we see the owner has many pictures of rifles. Or old junky cars. Or bootleg DVDs. Yes, I think they're a danger. Go apprehend them now"
I see that as a likelihood, actually, within 5 years, if not sooner.
And likely we won't know, at first.
Yeah, this could be my hill to abandon Apple on. I can go back to lugging a camera with me, and a walkman. I don't need a smartphone; I don't have to browse right the fuck now.
You seem to think the coming days will be all sunny and roses. I think we're about to go full-out East Germany in the states now. Neighbor ratting on neighbor, son ratting on father, phone ratting on owner.
While they're screaming "think of the kids" I'm thinking "think of the malum prohibitum things that will be reported".
Like things the guv or their brawn doesn't like. "Oh, this one has a Betsy Ross hanging from the front porch. RaCisT! CaNcEl Him! BuRn tHe HoUse!"
Given Apple's socialist bent, I can see this. I can even see them giving such info to direct action squads.
You don't? Heh. OK.
Privacy theater (Score:2)
The whole current privacy theater being conducted by the worst offenders would be laughable if people didn't fall for this shit over and over again.
"We're TRUSTED(TM)" "We know better than you!" "How DARE you try to gain root access to the device you own!".
The public has been manipulated well, made to always and only fear the awkward loner phishing for credit card numbers or the scary Russian hacker "gang", their attention diverted away from the true threat to privacy, device freedom, and freedom from warra
Re: (Score:2)
made to always and only fear the awkward loner phishing for credit card numbers or the scary Russian hacker "gang"
Exactly. I'm amazed that people would choose to have their government intrude on and spy on all aspects of their lives just because they might otherwise have to cancel a credit card or might get locked out of laptop files that probably aren't all that important.
Re: (Score:2)
To answer your question, yes possibilities exist.
But possibilities exist to abuse any technology.
I'd rather hold a company accountable for what it actually does than what people imagine it might do in their worst nightmares.
Re: (Score:2)
One can be a socialist and make mountains of money.
Including the .. person... who founded BLM. She made 4.5-something million, all grifted from socially-conscious panderers.
You know, like "think of the children" Apple.
Let's see.. other rich Socialist..
Bernie. Stalin. Mao. Xi. Shall I continue?
I think I'm comfortable with calling Apple "Socialist." Their actions are neutral, at best, Socialist at worst.
They'll never pass for Standard Oil.
Re: (Score:2)
This is the moment when Apple, who until now was pretty privacy-friendly, has created and advertised a framework for scanning content on your personal device and flagging it to them when it matches certain characteristics a certain number of times.
That _is_ indeed a major event. It is not just "technology that might be abused one day". Apple has never built backdoors into iOS to unlock devices (or at the very least has not publicly advertised that they did), and as a result was able to argue before courts tha [wikipedia.org]
Re: (Score:2)
When did we turn into the "I'm scared of everything because of what MIGHT happen one day" crowd?
Everything was a scary, daunting thing to some people. Those people got swept aside and we all moved on.
Your comment is inherently unfalsifiable: you literally said "everything was scary," so your argument can just as easily be used to dismiss anything on the very same grounds.
Re: I'm about to abandon the personal smartphone (Score:2)
When more and more tech started being valid for use at a massive scale. That's when and why.
Re: (Score:2)
Except this "might" already happens. Apple, Google, FB, etc. etc. etc. are all about diving into the dregs of your data. They're all about false-premise social 'progress'.
This isn't a hypothetical 'someone somewhere might somehow something' - this is a 'they did, they do, they will do it more'.
Re: (Score:2)
Or in other words... (Score:5, Interesting)
In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.
We already know that several US agencies have warrantless, dragnet access to all Big IT's users' files & data. Big IT even makes profits from providing these agencies with specialised search tools (This hasn't changed since Edward Snowden told us about it). Why aren't these agencies doing anything about child porn already? Do these agencies not care about it? Will these agencies not think of the poor children?!!
Re:Or in other words... (Score:4, Interesting)
In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.
"we're pretty sure we are on high moral ground here; you have to trust us and let us do what we want. now, and again when we find another thing that we want to do and we'll use this justification again, then, too"
apple is beyond absurd. the fact that rational intelligent people here continue to use apple shit - fully knowing that the company long ago (if ever?) sold the userbase down the river - is baffling.
next to trump's big lie, apple's big lie of 'you can trust us' is the biggest one of the year. and the reality distortion field catches most of you in it, too. boggle that!
Re: (Score:2)
rational intelligent people
Who, Vulcans? AFAIK, people aren't rational. Daniel Kahneman was awarded a Nobel Prize for proving that point.
Re: (Score:2)
In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.
It's OK though. It's completely auditable at every level.
When they kick in your door, there will be an auditor there to check the little box that says they kicked in the door. When they pull you off the couch and falsely accuse you of having child porn because some dipshit somewhere mixed up a hash, there will be an auditor there to tick the little box that says they did it.
It's all on the up and up. Better privacy = auditable. Why didn't they just say that in the first place!
Re: (Score:2)
In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.
Okay, maybe you haven't read the fine print, but you have actually granted Apple permission to look at everything you give them. They don't need a warrant or anything.
The only thing their announcement was about was publicly acknowledging they were going to share findings w/ the cops. That is the only new thing there.
Re: (Score:2)
The only thing their announcement was about was publicly acknowledging they were going to share findings w/ the cops. That is the only new thing there.
Yes, that's the part that makes it warrantless search & seizure.
Re: (Score:2)
A man's rights are more important than justice.
Orwellian (Score:2)
"Surveillance is Privacy"
To be fair, if they're just using a lookup table of known CP image hashes provided by an NGO, and doing the comparison on your phone, that's at least an interesting concept. But I think it's still possible to be abused, no pun intended.
Re: (Score:2)
based on their say-so?
you are SO trusting.
remind me to never trust YOU.
Re: Orwellian (Score:2)
Looks like they are just checking a "for the children" govt form.
Makes everyone, especially politicians, feel like the security theater is in place.
How can you tell he's lying? (Score:3)
Not a tool to fight porn (Score:2)
multiple levels of auditability (Score:2)
in Geekspeak translates to “multiple levels of failure points.”
Having designed voting schemes in networked environments teaches that security is exposed at every level, with security dependent upon detrimental reliance at each point of failure. That “auditability” is Craig Federighi's BS marketing spin on what amounts to a breach in security protocol to account for the compromised implementation scheme Apple chose.
In truth, levels introduce an impossible architecture across which accountabi
Cognitive Dissonance (Score:2)
The cognitive dissonance involved in this is mind boggling. How do people not realize that spam and malware filters are just apps that "read your e-mail" and apply rules based on what it sees? Anti-malware "reads" attachments, including expanding archived files, to determine "correctness". The auto-tagging of photos has been done by every major photo-sharing/storage service for more than a decade.
Hell, big e-mail systems will automatically group messages based on contents using algorithms to determine impor
Re: (Score:2)
Scanning / spying on email is very different from scanning / spying on personal, private documents. Anti-malware is trusted not to flag and send samples of *non-executable* documents such as photos and videos. Auto photo tagging is supposed to not phone home.
When Apple sets up a mechanism that can officially report to the government based on the content of personal documents, dangerous abuse is around the corner. And we all know a true criminal can easily dodge this mechanism by storing their pictures somewhere els
Re: (Score:2)
Or, one can leave gmail for resumes, yahoo for a spam trap / misdirecting / poisoning mailing lists, and use Protonmail for real email.
I did just that, the year I found Yahoo was scanning emails to stick ads "relevant to your tastes"
Unsavory governments (Score:2)
Sure, Apple says they will deny "requests" from governments to expand this to include other images (Winnie the Pooh?).
But when those countries pass laws REQUIRING it, which of course will happen, it will be impossible to put the toothpaste back in the tube.
Sorry Apple. You are royally screwing up on this one.
Auditable by whom? (Score:2)
the new system will be auditable
By whom? I am pretty sure that Apple employees will not be allowed to look at flagged photos and compare them to the known child porn ones.
The auditing can only legally be done by authorities, which in some countries are more interested in other motives than child abuse.
One aimed at driving children off iMessage (Score:2)
I mean, they are not that stupid. They will just go to another chat application.
Child Endangerment (Score:2)
I recall that the original report wasn't just "child pornography", but included "child endangerment."
A lot of things endanger children. Reckless driving, parents participating in political unrest, swimming without flotation devices, playing unsupervised, walking to school in a bad neighborhood, and many other things.
Who selects the pictures to define violations?
If Apple can read the photos on my phone (Score:2)
What's to stop them from writing to my photo directory?
If I criticized Apple they could retaliate by placing a picture on my phone and getting me arrested.
not auditable (Score:2)
Think of the Children! (Score:2)
If they catch you "thinking of the children" in a bad way, will they arrest you?
PR Stunt went bad... (Score:2)
If all they wanted to do was protect children, they could have just implemented it and let it run. We've all agreed to the different TOU docs that would have allowed it anyway.
Announcing it was a PR stunt that obviously went the wrong direction on them. They didn't need to announce it, but some marketing schmuck thought "oh just think of the good press we'll get" (probably from the Q groups thinking now all of those pedophile democrat celebs would get outed), a person that doesn't give a damn about privacy
Apple does not respect its customers (Score:3)
There is nothing I like about Apple.
The monoculture, unjustifiable uncompetitive premiums, walled gardens, touting privacy and respect while delivering none. They present the face of being a bastion of leftist hipsters while their supply chain is rife with human rights abuses. The only thing Apple stands for is its shareholders.
Backtracking (Score:2)
This nonsensical comment of "having to explain better" proves me right. This is exactly what a company's Pavlov response would look like when they're in denial over their own mistakes.
Apple will die over this. By next year sales will have dropped 50%.
Odd focus (Score:2)
Note to Apple (Score:2)
How easy to destroy someone's life (Score:2)
It's good for you (Score:2)
...will allow parents to better monitor what images are being shared with and by their children through text messages.
Remember, folks, by analyzing the hell out of your stuff without any way to opt-out, they're just empowering you.