Apple

Apple Executive Defends Tools To Fight Child Porn, Acknowledges Privacy Backlash (wsj.com) 145

A senior Apple executive defended the company's new software to fight child pornography after the plans raised concerns about an erosion of privacy on the iPhone, revealing greater detail about safeguards intended to protect the system from abuse. From a report: Craig Federighi, Apple's senior vice president of software engineering, in an interview emphasized that the new system will be auditable. He conceded that the tech giant stumbled in last week's unveiling of two new tools. One is aimed at identifying known sexually explicit images of children stored in the company's cloud storage service, and the second will allow parents to better monitor what images are being shared with and by their children through text messages. "It's really clear a lot of messages got jumbled pretty badly in terms of how things were understood," Mr. Federighi said. "We wish that this would've come out a little more clearly for everyone because we feel very positive and strongly about what we're doing."

The Cupertino, Calif., iPhone maker has built a reputation for defending user privacy and the company has framed the new tools as a way to continue that effort while also protecting children. Apple and other tech companies have faced pressure from governments around the world to provide better access to user data to root out illegal child pornography. While Apple's new efforts have drawn praise from some, the company has also received criticism. An executive at Facebook's WhatsApp messaging service and others, including Edward Snowden, have called Apple's approach bad for privacy. The overarching concern is whether Apple can use software that identifies illegal material without the system being taken advantage of by others, such as governments, pushing for more private information -- a suggestion Apple strongly denies and Mr. Federighi said will be protected against by "multiple levels of auditability." "We, who consider ourselves absolutely leading on privacy, see what we are doing here as an advancement of the state of the art in privacy, as enabling a more private world," Mr. Federighi said.

Comments Filter:
  • One is aimed at identifying known sexually explicit images of children stored in the company's cloud storage service

    So is Apple storing sexually explicit images of children to compare against?

    • Re: (Score:3, Informative)

      by H3lldr0p ( 40304 )

      No, that would be super illegal.

      What they're doing is taking a hash of the images, encrypting those, and putting a database of those encrypted hashes on everybody's phone to use for comparison. The hashes are provided by an NGO that has permission from the US government to possess the child-abuse images for tracking and helping exploited children.
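
      For illustration, here's a minimal sketch of that kind of on-device lookup. Loud assumptions: a plain SHA-256 digest stands in for Apple's NeuralHash (which is perceptual and not fully public), and the database is shown as a bare set of digests rather than the blinded form Apple describes.

          # Toy sketch only. Assumption: SHA-256 stands in for Apple's
          # perceptual NeuralHash, and the hypothetical database below is
          # an unblinded set of hex digests, unlike the real thing.
          import hashlib

          KNOWN_BAD_DIGESTS = {
              "d0c4c9a6...",  # hypothetical entry from the NGO's list
          }

          def image_digest(path: str) -> str:
              """SHA-256 hex digest of the file's raw bytes."""
              with open(path, "rb") as f:
                  return hashlib.sha256(f.read()).hexdigest()

          def matches_database(path: str) -> bool:
              """True if this photo's digest is in the on-device set."""
              return image_digest(path) in KNOWN_BAD_DIGESTS

      In the design Apple actually published, a match doesn't report anything by itself: matches generate encrypted "safety vouchers" that Apple can only decrypt once an account crosses a threshold number of them.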

      • Re: (Score:2, Informative)

        by Anonymous Coward
        Also, they have some magic algorithm that can recognize the same image, even if it has been cropped, resized, stretched etc. And if, in the future, some authoritarian regime wants to ban any other type of content, all they have to do is switch the ML model. Probably gearing up for a big CCP contract.
        • Re: (Score:3, Interesting)

          by MooseTick ( 895855 )

          "Also, they have some magic algorithm that can recognize the same image, even if it has been cropped, resized, stretched etc."

          Hashes don't work that way. At best, you could take known CP content, use common photo editing software to crop/resize/stretch those images, and take hashes of all those variations. The only issue is that you'd then potentially be dealing with billions of hashes (taking many GB to store) to check against instead of thousands. That said, you could have phones just hash all
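
          To make the distinction concrete: a cryptographic hash changes completely under any edit, while a perceptual hash is designed to survive edits. Here's a toy sketch of the simplest perceptual scheme, an "average hash" (assuming the Pillow imaging library is available; Apple's NeuralHash is a neural-network descriptor, not this):

              # Cryptographic vs. toy perceptual hashing. Requires Pillow.
              import hashlib
              from PIL import Image

              def sha256_of(path: str) -> str:
                  # Any crop/resize changes this digest entirely.
                  with open(path, "rb") as f:
                      return hashlib.sha256(f.read()).hexdigest()

              def average_hash(path: str, size: int = 8) -> int:
                  # Downscale to 8x8 grayscale; one bit per pixel, set when
                  # the pixel is brighter than the mean. Survives resizing
                  # and mild edits, unlike the digest above.
                  img = Image.open(path).convert("L").resize((size, size))
                  pixels = list(img.getdata())
                  mean = sum(pixels) / len(pixels)
                  bits = 0
                  for p in pixels:
                      bits = (bits << 1) | (p > mean)
                  return bits

              def hamming(a: int, b: int) -> int:
                  # Small distance = probably the same picture.
                  return bin(a ^ b).count("1")

          A resized copy typically lands within a few bits of the original's average hash while sharing nothing with its SHA-256, which is why the "billions of hash variants" problem doesn't arise for perceptual schemes.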

      • by omnichad ( 1198475 ) on Friday August 13, 2021 @10:52AM (#61687987) Homepage

        And since there's no way to know what the hashes are of, the government can provide hashes of any photo of interest, not limited to CP.

        • In that scenario, an Apple employee will review the image, see that it's not CSAM, and not report it to NCMEC.

          (Also note that if you store your photos in iCloud, they are not encrypted, so it would make oodles more sense for a nefarious gov't to simply scan your iCloud.)

          • Why would Apple employees be allowed to possess/review CP imagery? My understanding is that if you find it by accident, you report it - but intentional possession or viewing is illegal - even if you are just doing your job with material that you don't want to see.

            • "Why would Apple employees be allowed to possess/review CP imagery?"

              They possess the images because people effectively save them to their servers. They review them to weed out false positives.

          • Side note, they would still have to respond to a FISA warrant (or equivalent) about whether any such matches exist whether they are reported or not. So the human review only catches one side of it.

            • Side note, they would still have to respond to a FISA warrant (or equivalent) about whether any such matches exist whether they are reported or not. So the human review only catches one side of it.

              One fun thing about this discussion is just how convoluted and hackneyed the conspiracy would have to be to shoehorn a CSAM check into a nefarious plot by the government. If we're worried about the gov't subverting the NCMEC CSAM database and then getting a FISA warrant to see if anyone got flagged, then surely they would take the much simpler path of subpoenaing (or snooping on) your iCloud photos directly, which are unencrypted anyway.

        • by gweihir ( 88907 )

          And since there's no way to know what the hashes are of, the government can provide hashes of any photo of interest, not limited to CP.

          Indeed. And with a small bit of trickery any hash of any other file too.

      • by Lexicon ( 21437 ) on Friday August 13, 2021 @11:35AM (#61688241) Homepage

        And it's an NGO, so there's no open government process, auditing, or accountability; the secret organization can easily be used to selectively ruin the lives of anyone the powerful don't like, or to exempt anyone they do like from any action.

    • AI Logarithm 1. Find sexually explicit images. Percentage of Skin Tone on tagged Human bodies, position of said human bodies.
      AI Logarithm 2. Find images of children. Height compared to other objects, General Body Shape and dimensions.

      You then cross-reference both, and you have a detector of explicit images of children which doesn't need a database of offending images to be stored.
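
      A naive sketch of what that proposal amounts to, with both classifiers left as hypothetical stubs, since building them reliably is exactly the hard part (see the xkcd reply below):

          # Hypothetical stubs: neither classifier is real here, and
          # training either to be reliable is the famously hard part.
          def looks_sexually_explicit(image_path: str) -> bool:
              raise NotImplementedError  # hypothetical model #1

          def looks_like_a_child(image_path: str) -> bool:
              raise NotImplementedError  # hypothetical model #2

          def flag(image_paths):
              # Intersect the two classifiers: no reference database
              # needed, but every false positive lands on an innocent
              # family photo (see the toddler reply below).
              return [p for p in image_paths
                      if looks_sexually_explicit(p) and looks_like_a_child(p)]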

      • Yeah, really simple: https://xkcd.com/1425/ [xkcd.com]
      • Step 3: Arrest all parents taking photos of their toddlers :P

      • What nonsense, you need databases of offending images to do one and two.

        As an aside, given the way our police have been known to work, perhaps they pay some child rapists and molesters to supply them too: agents on the inside getting deep into the world of kiddie porn and developing twisted predilections, and kid agents.

      • by Sloppy ( 14984 )

        For some reason I want to add your two logarithms up and use the sum as an exponent, but I'm unable to comprehend what it would mean to multiply those sets of images. If I multiply a sexually explicit image by a child image, the result is always kiddie porn, right? So this would generate a vast array of kiddie porn, which seems the opposite of the problem I was trying to solve.

        If something is so important that you feel the need to post it on the internet... It probably isn't that important.

        Guilty as charged

        • by Sloppy ( 14984 )

          So this would generate a vast array of kiddie porn

          Sorry, never mind, now I see it. The products would be images squared, i.e. kiddie-porn tesseracts. Presumably the weird geometry makes it legal, since if a perv tries to fap to it, they'll fall down some kind of endless Escherian staircase.

  • by mark-t ( 151149 ) <markt AT nerdflat DOT com> on Friday August 13, 2021 @10:08AM (#61687755) Journal

    That's not even the worst of it.

    If they do not take a stand against this now, then today it is child abuse; tomorrow, what stops it from being used for something else, say some politically unpopular opinion?

    Never say never. Historically, technology gets abused, regardless of the intentions of its creators. They need to nip this in the bud now and create a precedent for how to deal with similar ideas in the future.

    In theory I have no problem with what they want to use this technology for, but the realities of the world in which we live convince me that going down this road is a colossal mistake.

    • That's why they aren't debuting this feature in China, but how long will it take before that happens? The most common way of sending child porn is actually Facebook Messenger. End-to-end encryption will prevent it from being caught; the tradeoff the other way is a lack of privacy.

    • "How can Apple sit there with the technology to stop the capitol insurrectionist terrorists / black lives matter rioter terrorists? They are complicit! Scan all files for politically hostile memes immediately"

      t. this forum 2024, with something more topical to that year inserted instead of today's bogeymen

  • Ferengi? (Score:2, Offtopic)

    by tippen ( 704534 )

    Craig Federighi, Apple's senior vice president of software engineering

    Surely I'm not the only one that read his last name as "Ferengi"...

  • by TigerPlish ( 174064 ) on Friday August 13, 2021 @10:12AM (#61687771)

    Not just over this.

    Any of you with a recent iPhone, say an 8 or later:

    Go to Photos.

    In the search bar, type "car". Or "house". Or whatever.

    The iPhone has already catalogued your photos, on the device itself. That's how it makes "memories" and all this other bullshit.

    Imagine when the Government wants Apple to tell them who has pictures of scary black rifles with detachable magazines and pistol grips. Or internal-combustion cars. Or anything else our Enlightened Inbred Leaders decide to be verboten in the coming years.

    Fuck you, Apple. You burned the bridge to me with this one.

    I think it'll be a TracFone for me from now on.

    • Just FYI, Plex does this too. It's turned on by default. Pretty much anything that stores your images will do this; most just don't surface it to the user.
    • It's an on-device analysis. I just logged into iCloud and looked at my photo library and there isn't a place to search for those things.

      Indeed, as I recall one of the complaints about how Apple does it is that every device you have reindexes every photo you have and the databases aren't shared, so if you have a slow old iPad that's connected to iPhoto, it'll take forever and burn a tonne of cycles trying to figure out which of your photos have cats on them.

      This is in contrast, of course, to Google Photos wh

    • I've been eyeing the Pine Phone but being made in China gives me pause. I'm looking for open source hardware for a phone myself.

      I am wearing an Open Smartwatch and it's interesting. It does most of what I need and I control the code. Open-source hardware and software for personal devices might be the answer for people like us.
  • Or in other words... (Score:5, Interesting)

    by VeryFluffyBunny ( 5037285 ) on Friday August 13, 2021 @10:19AM (#61687799)

    In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.

    We already know that several US agencies have warrantless, dragnet access to all Big IT's users' files & data. Big IT even makes profits from providing these agencies with specialised search tools (This hasn't changed since Edward Snowden told us about it). Why aren't these agencies doing anything about child porn already? Do these agencies not care about it? Will these agencies not think of the poor children?!!

    • by TheGratefulNet ( 143330 ) on Friday August 13, 2021 @10:23AM (#61687827)

      In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.

      "we're pretty sure we are on high moral ground here; you have to trust us and let us do what we want. now, and again when we find another thing that we want to do and we'll use this justification again, then, too"

      apple is beyond absurd. so is the fact that rational intelligent people here continue to use apple shit - fully knowing that the company long ago (if not from the start?) sold the userbase down the river.

      next to trump's big lie, apple's big lie of 'you can trust us' is the biggest one of the year. and the reality distortion field catches most of you in it, too. boggle that!

      • rational intelligent people

        Who, Vulcans? AFAIK, people aren't rational. Daniel Kahneman was awarded a Nobel Prize for proving that point.

    • In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.

      It's OK though. It's completely auditable at every level.

      When they kick in your door, there will be an auditor there to tick the little box that says they kicked in the door. When they pull you off the couch and falsely accuse you of having child porn because some dipshit somewhere mixed up a hash, there will be an auditor there to tick the little box that says they did it.

      It's all on the up and up. Better privacy = auditable. Why didn't they just say that in the first place!

    • In other words, an IT company is defending its decision to implement warrantless search & seizure on its customers' personal files.

      Okay, maybe you haven't read the fine print, but you have actually granted Apple permission to look at everything you give them. They don't need a warrant or anything.

      The only thing their announcement was about was publicly acknowledging they were going to share findings w/ the cops. That is the only new thing there.

      • The only thing their announcement was about was publicly acknowledging they were going to share findings w/ the cops. That is the only new thing there.

        Yes, that's the part that makes it warrantless search & seizure.

    • A man's rights are more important than justice.

  • "Surveillance is Privacy"

    To be fair, if they're just using a lookup table of known CP image hashes provided by an NGO, and doing the comparison on your phone, that's at least an interesting concept. But I think it could still be abused, no pun intended.

  • by Fuzi719 ( 1107665 ) on Friday August 13, 2021 @10:36AM (#61687899)
    You can tell he's lying, or at least being disingenuous, by the simple fact that he's claiming all the detractors are just "overreacting". He forgets that we all know how Apple already bends over and provides the lube for China, Saudi Arabia, et al.
  • This is a pilot program for a new way to target ads to people. The "think of the children" rhetoric is just to get people to accept this change.
  • "Multiple levels of auditability" in Geekspeak translates to "multiple levels of failure points".

    Having designed voting schemes in networked environments teaches that security is exposed at every level, with the whole dependent on a detrimental reliance at each point of failure. That "auditability" is Craig Federighi's marketing-spin BS on what amounts to a breach in security protocol to paper over the compromised implementation scheme Apple chose.

    In truth, levels introduce an impossible architecture across which accountabi

  • The cognitive dissonance involved in this is mind-boggling. How do people not realize that spam and malware filters are just apps that "read your e-mail" and apply rules based on what they see? Anti-malware "reads" attachments, including expanding archived files, to determine "correctness". The auto-tagging of photos has been done by every major photo-sharing/storage service for more than a decade.

    Hell, big e-mail systems will automatically group messages based on contents using algorithms to determine impor
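
    For what it's worth, the "apply rules to what it sees" part really is that mundane. Here's a toy, hypothetical rule-based filter - none of these rules come from any real mail system, they just show the shape:

        # Toy content filter: "reads" the message body and applies rules,
        # the same shape as spam/malware scanning. All rules hypothetical.
        SPAM_RULES = [
            ("prize bait", lambda body: "you have won" in body.lower()),
            ("executable attachment", lambda body: ".exe" in body.lower()),
        ]

        def classify(body: str) -> list[str]:
            """Return the names of every rule the message trips."""
            return [name for name, rule in SPAM_RULES if rule(body)]

        print(classify("Congratulations, YOU HAVE WON! Open invoice.exe"))
        # -> ['prize bait', 'executable attachment']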

    • Scanning / spying on email is very different from scanning / spying on personal, private documents. Antivirus tools are trusted not to flag and send off samples of *non-executable* documents such as photos and videos. Auto photo tagging is supposed to not phone home.

      When Apple sets up a mechanism that can officially phone the government based on the content of personal documents, dangerous abuse is just around the corner. And we all know a true criminal can easily dodge this mechanism by storing their picture somewhere els

    • Or, one can leave gmail for resumes, yahoo for a spam trap / misdirecting / poisoning mailing lists, and use Protonmail for real email.

      I did just that, the year I found Yahoo was scanning emails to stick in ads "relevant to your tastes".

  • Sure, Apple says they will deny "requests" from governments to expand this to include other images (Winnie the Pooh?).

    But when those countries pass laws REQUIRING it, which of course will happen, it will be impossible to put the toothpaste back in the tube.

    Sorry Apple. You are royally screwing up on this one.

  • the new system will be auditable

    By whom? I am pretty sure that Apple employees will not be allowed to look at flagged photos and compare them to the known child porn ones.

    The auditing can only legally be done by the authorities, who in some countries are more interested in other motives than child abuse.

  • I mean, they are not that stupid. They will just go to another chat application.

  • I recall that the original report wasn't just "child pornography", but included "child endangerment."

    A lot of things endanger children. Reckless driving, parents participating in political unrest, swimming without flotation devices, playing unsupervised, walking to school in a bad neighborhood, and many other things.

    Who selects the pictures to define violations?

  • What's to stop them from writing to my photo directory?

    If I criticized Apple they could retaliate by placing a picture on my phone and getting me arrested.

  • Already the idea of having a hashed database of kiddy-porn pictures planted on my device is horrifying. This system cannot be auditable, because obviously nobody ever wants to see the real pictures. That Apple is deceiving us with such "think of the children" rhetoric is obvious also from a technical point of view. It would be trivial to have an effective server-side scan by submitting an additional hash of each picture, encrypted with a secure one-way function. Also the illegal database is hashe
  • If all they wanted to do was protect children, they could have just implemented it and let it run. We've all agreed to the different TOU docs that would have allowed it anyway.

    Announcing it was a PR stunt that obviously went the wrong direction on them. They didn't need to announce it, but some marketing schmuck thought "oh just think of the good press we'll get" (probably from the Q groups thinking now all of those pedophile democrat celebs would get outed), a person that doesn't give a damn about privacy

  • by WaffleMonster ( 969671 ) on Friday August 13, 2021 @12:05PM (#61688387)

    There is nothing I like about Apple.

    The monoculture, unjustifiable uncompetitive premiums, walled gardens, touting privacy and respect while delivering none. They present the face of being a bastion of leftist hipsters while their supply chain is rife with human rights abuses. The only thing Apple stands for is its shareholders.

  • I previously stated that Apple will be unable to backtrack on its decision, since doing so would imply their profits come before children's sexual safety. They'll keep at it even if their sales go south.

    This nonsensical comment about "having to explain better" proves me right. This is exactly what a company's Pavlovian response looks like when it's in denial over its own mistakes.
    Apple will die over this. By next year sales will have dropped 50%.
  • I hate to be even posting on this topic, but doesn't the focus of the technology seem odd? Certainly, nearly everyone can agree that the danger that needs to be addressed is from the cretins that are creating this child porn. But, this technology specifically doesn't work on that at all. The images have to match those in a database - so these reference images are old and rather unlikely to have been taken by the people that are being identified by this program.
  • You are not law enforcement.
  • I think there is an angle most comments are missing: how easy it becomes to destroy someone's life. Let us see two scenarios. Scenario 1: 1. You have access to someone's phone for some seconds; 2. Take pictures of child porn using that phone while locked (for instance, of pictures stored in your own phone); 3. Put the victim's phone back. Now when they unlock it, the pictures you took will be copied to the gallery and uploaded to iCloud without the victim's knowledge. Scenario 2: 1. You know
  • ...will allow parents to better monitor what images are being shared with and by their children through text messages.

    Remember, folks, by analyzing the hell out of your stuff without any way to opt out, they're just empowering you.
