
Apple Says It Will Reject Government Demands To Use New Child Abuse Image Detection System for Surveillance (cnbc.com)

Apple defended its new system to scan iCloud for illegal child sexual abuse material (CSAM) on Monday, amid an ongoing controversy over whether the system reduces Apple user privacy and could be used by governments to surveil citizens. From a report: Last week, Apple announced that it had started testing a system that uses sophisticated cryptography to identify when users upload collections of known child pornography to its cloud storage service. It says it can do this without learning about the contents of a user's photos stored on its servers. Apple reiterated on Monday that its system is more private than those used by companies like Google and Microsoft because its system uses both its servers and software running on iPhones.

Privacy advocates and technology commentators are worried Apple's new system, which includes software that will be installed on people's iPhones through an iOS update, could be expanded in some countries through new laws to check for other types of images, like photos with political content, instead of just child pornography. Apple said in a document posted to its website on Sunday that governments cannot force it to add non-CSAM images to a hash list, or the file of numbers that correspond to known child abuse images Apple will distribute to iPhones to enable the system.
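At its core the mechanism is list membership: the device derives a number from each photo and checks it against the distributed file. The sketch below is a loose illustration only, not Apple's implementation; Apple uses a perceptual NeuralHash rather than a plain cryptographic digest, the real list is blinded so the device cannot read it, and every name and value here is invented.

```python
import hashlib

# Hypothetical blocklist shipped to the device (invented placeholder entry).
KNOWN_BAD_DIGESTS = {
    "0" * 64,  # placeholder, not a real digest
}

def photo_digest(path: str) -> str:
    """Digest of the raw file bytes (a stand-in for a perceptual hash)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def flagged(photo_paths):
    """Return the photos whose digest appears on the distributed list."""
    return [p for p in photo_paths if photo_digest(p) in KNOWN_BAD_DIGESTS]
```

A cryptographic digest changes completely if a single pixel changes, which is why real deployments use perceptual hashes that survive resizing and recompression; the controversy below is about who controls the list, not the matching itself.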

  • by Tebriel ( 192168 ) on Monday August 09, 2021 @03:23PM (#61673407)

    I mean, we would never see a technology company be less than forthright about governmental interference and the use of its capabilities to do sketchy and immoral and illegal things. Never.

      I mean, we would never see a technology company be less than forthright about governmental interference and the use of its capabilities to do sketchy and immoral and illegal things. Never.

      So... what, do we all run to Google and pretend like their business model isn't considerably more worrisome in this regard? Absolutely, by all means, criticize Apple! But please can we handle this better than the slave labor thing and make sure the rejection of this is across the board? I mean, really, Facebook's been fingerprinting images for at least five years now.

      • by saloomy ( 2817221 ) on Monday August 09, 2021 @04:26PM (#61673717)
        No. Google and Facebook make their money monetizing your data. They read it. Analyze it. Tear it apart, and extract as much from you as they can with it. In return you get the utility of their services (whatever you find that to be). You know this when you give them your data. Apple, on the other hand, has no reason to read your data (other than to provide you with service). You use their services and expect privacy because that is what was promised. You paid for their devices partly because of said privacy.

        Oh, and Apple not doing anything nefarious with this tool? Bullshit. When they stood up to the FBI, their defense was that they did not write the software and could not be compelled to write it. This time, they did write the software. The photos have hashes, and the government will subpoena the hashes for the ones they want to know about. WHEN they do, it will be public in the US. It will be under gag order when China does it.

        They will want to know every reporter, politician, soldier, police officer... hell, every citizen who has this pic (https://images.app.goo.gl/KFEU...) in their photo library. The results of which will be devastating. That will not be the only photo either.
        • Comment removed based on user account deletion
        • by MeNeXT ( 200840 )

          Apple, on the other hand, has no reason to read your data (other than to provide you with service).

          Then why must I provide a phone number in order to activate some local features that don't require any of their services?

          As long as the laws stay as they are, nothing protects your privacy on any phone. Google and Microsoft are more blatant than Apple, but Apple still uses information it collects about you for its own benefit. It's in their terms of service.

        • by AmiMoJo ( 196126 )

          Google doesn't operate in China because they refused to cooperate with the Chinese government.

          Apple does operate in China, and fully cooperates. Chinese users' data is stored in China, and the CCP is given full access to it.

          It seems very likely that now that Apple has built this powerful technology for finding photos on users' devices, they will deploy it to do the work of the CCP.

      • No, you have to go back to a primitive dumbphone.
      • > So... what, do we all run to Google and pretend

        What's this "we" business? That was your idea.

  • Here's a truck full of killer bees we've been breeding for years. We promise we won't release them.

  • by 93 Escort Wagon ( 326346 ) on Monday August 09, 2021 @03:28PM (#61673447)

    Apple will comply. In my mind that's the chief issue here.

    • by dgatwood ( 11270 ) on Monday August 09, 2021 @03:31PM (#61673471) Homepage Journal

      Not to mention repressive regimes. Not to mention that bad people will probably find a way to exploit it, hack-injecting their own definitions to cause problems for political dissidents, etc.

      The only way to prevent back doors from being abused is to not build them into your operating system in the first place. I'm ashamed of Apple.

      • I'm disappointed. And I cannot for the life of me understand what would possess Apple to make such a move. If they were forced to, sure, or if there were a lot of pressure from the public and from other companies already doing this. But I've never heard an Apple user say: "I wish they would scan our photos for kiddie porn". For a company with a public image of privacy and security, it is beyond bizarre that they would voluntarily do this. I guess they were just too damn proud of their (undoubtedly patented)...
        • by dgatwood ( 11270 )

          But I've never heard an Apple user say: "I wish they would scan our photos for kiddie porn".

          I can see a lot of parents really liking the feature where they will get notified if their kids start sending naked pictures of themselves to other people. So I'm sure there probably are a decent number of Apple users that do say those sorts of things.

          The problem is that it's part of the operating system, rather than a third-party add-on that parents have to opt into, which means there's a nonzero risk of false positives causing real people serious harm, and there's no way to fully opt out of it. Even...

          • by tragedy ( 27079 )

            I can see a lot of parents really liking the feature where they will get notified if their kids start sending naked pictures of themselves to other people.

            I would guess that they would like the feature right up until it happens, when they find out what actually happens to their kids at that point. Lots of parents don't want their kids to do drugs either, but they sure don't want their kids going to federal prison.

            Of course, this is creating hashes to compare against known hashes of CP images, so it won't actually trigger when their kids send naked pictures of themselves. Not right away anyway. If they store these hashes and compare them against future additions to the db of known cp images. So, when the kids break up with their SO and the SO sends out those pictures as revenge and they end up out on the Internet, get collected by the authorities and hashed, I imagine this system will trigger then. Then probably everyone involved: the parents, their child, their child's SO, the people the SO sent the images to, etc. will go through a lengthy legal process and some of them (sometimes seems almost arbitrary which ones) will go to prison and/or end up in a registry. It does not seem like most parents would really want that.

            • by dgatwood ( 11270 )

              I can see a lot of parents really liking the feature where they will get notified if their kids start sending naked pictures of themselves to other people.

              I would guess that they would like the feature right up until it happens, when they find out what actually happens to their kids at that point. Lots of parents don't want their kids to do drugs either, but they sure don't want their kids going to federal prison.

              There are actually two separate features built on the same tech. The first is parental notification, which notifies parents if their kids send something that looks like it might be a nude photo. The second is known child porn notification, which alerts an authority if someone possesses a photo that is known to be child porn (from some database, presumably). The kids won't get in trouble (except with their parents).

              Of course, this is creating hashes to compare against known hashes of CP images, so it won't actually trigger when their kids send naked pictures of themselves. Not right away anyway. If they store these hashes and compare them against future additions to the db of known cp images. So, when the kids break up with their SO and the SO sends out those pictures as revenge and they end up out on the Internet, get collected by the authorities and hashed, I imagine this system will trigger then. Then probably everyone involved: the parents, their child, their child's SO, the people the SO sent the images to, etc. will go through a lengthy legal process and some of them (sometimes seems almost arbitrary which ones) will go to prison and/or end up in a registry. It does not seem like most parents would really want that.

              I would argue that if the former SO posts naked photos of an underage person on the Internet...

          • I don't think any parent will be OK with an Apple employee reviewing their child's photos. That's some peeping Tom stuff right there.
          • Except this isn't what it is. You think your own kids' pictures will be added to the hash list? No. It'd have to be passed around enough to be noticed and maybe even flagged by police.
            • by dgatwood ( 11270 )

              These are two different features, both involving on-device image (and, presumably, video) scanning, both announced the same day. In theory, the feature designed for parents is mostly harmless, because it can notify only the true owner of the device, and because it uses machine learning to identify problem images visually, rather than looking for specific data. The feature designed for detecting known child porn, however, is not, because of the risk of leaking corporate data on false positives, coupled with...

              • "And so on. The more I think about this mis-feature, the more horrors I see lying at the bottom of Pandora's box."

                One of the problems is that the "mis-feature" will be used as a weapon by the same class of people (kids for the most part) who send a SWAT team to your house with false murder claims. I don't know if there is any way to avoid someone sending you a text message with an illegal image attached.

                It's not a new idea; the post office used to send people they didn't like kiddy porn and use that as an...

        • And I cannot for the life of me understand what would possess Apple to make such a move.

          True, there is little business logic that would dictate making this move. So it makes one wonder what the motivating factor could actually be.

          We know the Trump DOJ wanted Apple to build them a back door. A terrorist incident was even leveraged in an attempt to justify their request. And the Biden DOJ is probably even worse. Apple is in constant danger of having to shelve their plans of having all user content always encrypted and unavailable to everyone. Just like how they had to cave with regard to...

        • by superdave80 ( 1226592 ) on Monday August 09, 2021 @06:18PM (#61674037)

          And I cannot for the life of me understand what would possess Apple to make such a move.

          My half-assed guess? They are building this in for China to use on images THEY think are 'bad', but using the smokescreen of kiddie-porn to do it.

           • And I cannot for the life of me understand what would possess Apple to make such a move.

            My half-assed guess? They are building this in for China to use on images THEY think are 'bad', but using the smokescreen of kiddie-porn to do it.

            Well, that's certainly a half-assed guess, so kudos for being honest.

            • Apple continues to operate in China, and in order to do that they continue to do as they are told. It's not that half-assed, it's at least three-quarters assed.

        • My best guess is that it's because they're being stored on iCloud and it's a liability issue. In previous reports they mentioned that it won't scan your pictures if they're not stored in an iCloud account. I also think the whole thing is problematic, but that's the problem with the whole "think of the children" political plea: it's so effective at eroding liberty that very few are able to successfully stand up to it. People will give Apple a free pass for encryption that can be used by drug dealers...

      • Secret courts, like the FISA court in the US, are what repressive regimes have.
        Apple will have no choice, and won't even be able to tell the users what they have been forced to share with the US government.
      • That is the whole point of the 'Four Horsemen of the Infocalypse', or 'a terrorist with a nuke'. You take an extreme case of very bad people doing very bad things to get agreement on the principle. This justifies the buildup of the whole infrastructure and technology for surveillance and censorship. Then changing the settings becomes only a tiny operation - which in some countries is not even an issue.

        I don't want to be too radical about measures which can be abused but this one is pretty extreme.

    • Re: (Score:2, Insightful)

      by Anonymous Coward
      When China demands pictures of tank man be detected and reported, Apple will gladly comply to continue selling iPhones and iPads in the biggest phone market in the world.
    • Exactly (Score:5, Insightful)

      by SuperKendall ( 25149 ) on Monday August 09, 2021 @03:42PM (#61673521)

      Apple will reject demands - until they are forced to comply.

      The problem is, with such a capability in place they WILL be forced to, by courts or by threat.

    • As the joke goes, now they're just haggling about the price.

  • Yeah, I'm sure Apple will hold out over every government in the world that isn't China.
  • by BardBollocks ( 1231500 ) on Monday August 09, 2021 @03:29PM (#61673457)

    ...at which point it won't be able to tell us about the government using it.

    • That's what a Canary Page [wikipedia.org] is for. Maybe Apple should create one.
      • by flink ( 18449 )

        Or just don't build the capability in the first place, then no one can ask you to subvert it.

        • Or just don't build the capability in the first place, then no one can ask you to subvert it.

          That's like asking MADD mothers not to be mad at drunk drivers. Certain topics are a natural societal trigger. Child abuse is certainly one of them.

          Sadly, it's also an area ripe for abuse and false accusations when you bring forth any type of automated system. I can envision people being put on some kind of "list" even if the automated systems trigger on naked bathtub pics from a proud new mom that turn out to be false positives after a manual review. One too many false positives in a 12-month period...

      • by geekmux ( 1040042 ) on Monday August 09, 2021 @03:42PM (#61673523)

        That's what a Canary Page [wikipedia.org] is for. Maybe Apple should create one.

        Maybe?

        Maybe citizens will wise up enough to demand it.

        I doubt it.

      • What if Judges learn how to read and find out that you updated it, or didn't, in order to warn people about something that you were required to keep confidential?

        Canary pages are fine as long as you believe that Judges will never discover reading.

        • by flink ( 18449 ) on Monday August 09, 2021 @04:09PM (#61673649)

          What if Judges learn how to read and find out that you updated it, or didn't, in order to warn people about something that you were required to keep confidential?

          Canary pages are fine as long as you believe that Judges will never discover reading.

          I think it's an interesting concept. Can a judge force you specifically to lie to the public? Not withhold comment or refrain from releasing information, but actually force you to actively speak falsehood? Like if your canary mechanism was not automated and required you to type plain text into a form and post it to a website, could a court order compel you to continue doing that? It's a far more significant abrogation of free speech than a simple gag order. I'd be curious to hear if that has ever been tested. I've never heard of such a thing.

        • by sjames ( 1099 )

          They can read it all they want. An affirmative action like changing the page could get you in trouble, but it's much harder for them to compel you to keep acting.

  • Not an easy situation when one involves another party in one's actions. The simplest solution is if every iPhone user had their own personal iCloud. Their cloud, their responsibility. Not Apple. Not the government. The citizen. It doesn't solve the child pornography problem, but then the other approach would have been ineffective at best and unworkable at worst. So you work with what you have, preserving the most important.

    • If one buys a NAS, that is pretty much a personal iCloud, although having it freely accessible from the Internet is a recipe for disaster and a storage array full of crypto-locked files. However, if one is physically on the NAS's Wi-Fi network, it isn't hard to have an app back up pictures and such to the NAS. From there, the NAS can run a tool like Borg to do offsite, encrypted backups, so one has true 3-2-1 protection of what's on their phone. A rough sketch follows.
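      A minimal sketch of that pipeline, assuming Borg is installed on the NAS and an offsite host is reachable. Every path, URL, and retention number here is invented for illustration, and the flags should be double-checked against Borg's documentation.

      ```python
      import os
      import subprocess

      # Hypothetical locations -- adjust to taste.
      NAS_PHOTOS = "/mnt/nas/phone-photos"  # photos synced from the phone
      OFFSITE_REPO = "ssh://user@backup.example.org/./photos-repo"

      # Borg reads its passphrase from the environment; never hard-code it.
      env = dict(os.environ)

      # One-time setup: create an encrypted repository (keys stay with the user).
      subprocess.run(["borg", "init", "--encryption=repokey", OFFSITE_REPO],
                     env=env, check=False)  # harmless if it already exists

      # Nightly: push an encrypted, deduplicated archive offsite.
      subprocess.run(["borg", "create", f"{OFFSITE_REPO}::photos-{{now}}",
                      NAS_PHOTOS], env=env, check=True)

      # Keep a bounded history so the repo doesn't grow without limit.
      subprocess.run(["borg", "prune", "--keep-daily", "7", "--keep-weekly", "4",
                      OFFSITE_REPO], env=env, check=True)
      ```

      With this shape, the photos leave the house only in encrypted form, which is the point of the thread: the user, not the cloud provider, holds the keys.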

    • The simplest solution is if every iPhone user had their own personal iCloud.

      Not 100% sure what apps (if any) do their own storage, but I thought Adobe at least had a camera app that would upload to Adobe's photo cloud... not that I'm sure Adobe is any better in terms of scanning or releasing photos from your cloud library to third parties.

      But that points the way to camera apps having a leg up if they introduce their own cloud storage solution, especially if they encrypt any data sent to the server.


  • When will the back door be released (cracked) to the public? How about a pool?

    Yes, trying to find these images and the people who abuse is great, but a backdoor is not the way to go. How about hashing the image as the iPhone takes the pic and sending that hash "home" for validation?
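    A sketch of that hash-at-capture idea. The endpoint and field names are invented for illustration; note also that an exact cryptographic hash like this is trivially defeated by re-encoding the image, which is why deployed systems use perceptual hashes instead.

    ```python
    import hashlib
    import json
    from urllib import request

    # Hypothetical validation endpoint -- invented for this sketch.
    VALIDATE_URL = "https://example.com/api/validate-hash"

    def check_photo(photo_bytes: bytes) -> bool:
        """Hash the photo on-device and ask the server whether the hash is
        known. Only the digest leaves the device, never the image itself."""
        digest = hashlib.sha256(photo_bytes).hexdigest()
        req = request.Request(
            VALIDATE_URL,
            data=json.dumps({"sha256": digest}).encode(),
            headers={"Content-Type": "application/json"},
        )
        with request.urlopen(req) as resp:
            return json.load(resp).get("match", False)
    ```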

  • by Buffalo Al ( 7659072 ) on Monday August 09, 2021 @03:34PM (#61673489)
    Wow, instead of the Apple woman throwing the sledgehammer at the screen, in this case the Apple handheld screen throws the sledgehammer at you. Problem is, Apple actually thinks they are Big Brother. Wrong. Big Government always wins.
  • by Solandri ( 704621 ) on Monday August 09, 2021 @03:38PM (#61673503)
    They already acquiesce to the Chinese government's demands, so that China can better find and track dissidents. If they're refusing the U.S. government's demands, it just means the U.S. government isn't pushing hard enough.

    It's your f-ing phone. You should have the final say on what runs on it, period. Not the government, not the manufacturer, not the service provider, not the OS author. When you buy a product (as opposed to leasing or renting it), I feel that's a line that just shouldn't be crossed. If the manufacturer wants to insist on post-sale control over the device, then they should be required to structure it as a lease, not a sale; with the manufacturer being responsible for repair costs (warranty extended through the term of the lease) and disposal costs. Not the end-user.
  • by _xeno_ ( 155264 ) on Monday August 09, 2021 @03:44PM (#61673533) Homepage Journal

    If you've read the way the system works [apple.com], there's absolutely no way to determine if any given photo matches the hashes they're checking against, let alone what hashes they're checking against.

    The way it works is that your device receives a hashtable mapping NeuralHashes to blinded NeuralHashes - that is, a hashtable where the keys are based on hashes of the original NeuralHashes (but the original hashes are NOT stored) and the values are the blinded NeuralHashes, encrypted with an encryption key only Apple knows.

    At this point, the phone creates a "safety voucher." The safety voucher has two parts: an unencrypted header that indicates the row in the hash table it used (I assume - Apple is extremely unclear on this) and an encrypted payload, whose key is based on a combination of the original NeuralHash and the blinded value. If the image does, in fact, match the hash that the blinded secret referred to, it will decrypt. If it doesn't, it'll produce garbage.

    What this means is that your phone can never know if a given image matched a hash or not, and you can never know if the hashes Apple is checking against are, in fact, the hashes Apple claims they're checking against.
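    Roughly, the flow described above can be sketched as follows. This is a toy model, not Apple's protocol: HMAC stands in for Apple's elliptic-curve blinding, a SHA-256 keystream stands in for the real cipher, and every name and value below is invented.

    ```python
    import hashlib, hmac, os, secrets

    def xor_stream(key: bytes, data: bytes) -> bytes:
        """Toy cipher: XOR with a SHA-256 keystream. The same call decrypts."""
        out, ctr = b"", 0
        while len(out) < len(data):
            out += hashlib.sha256(key + ctr.to_bytes(4, "big")).digest()
            ctr += 1
        return bytes(a ^ b for a, b in zip(data, out))

    # --- server side: build the blinded table from known-bad NeuralHashes ---
    SERVER_SECRET = secrets.token_bytes(32)        # known only to the server

    def blind(nh: bytes) -> bytes:
        """Stand-in for elliptic-curve blinding: HMAC under the server secret."""
        return hmac.new(SERVER_SECRET, nh, hashlib.sha256).digest()

    def row_id(nh: bytes) -> bytes:
        """Truncated table key derived from the hash; the raw hash never ships."""
        return hashlib.sha256(b"row|" + nh).digest()[:4]

    known_bad = [b"neuralhash-of-a-known-bad-image"]          # placeholder
    blinded_table = {row_id(h): blind(h) for h in known_bad}  # sent to devices

    # --- device side: emit a "safety voucher" for one photo ---
    def make_voucher(photo_nh: bytes, payload: bytes):
        row = row_id(photo_nh)
        blinded = blinded_table.get(row, os.urandom(32))  # no row: random junk
        key = hashlib.sha256(photo_nh + blinded).digest()
        return row, xor_stream(key, b"OK" + payload)

    # --- server side: a voucher opens only if the photo truly matched ---
    def open_voucher(row: bytes, ciphertext: bytes):
        for h in known_bad:
            if row_id(h) == row:
                key = hashlib.sha256(h + blind(h)).digest()
                plain = xor_stream(key, ciphertext)
                if plain.startswith(b"OK"):        # decryption sanity check
                    return plain[2:]
        return None  # non-matching vouchers remain opaque garbage

    # A match decrypts; anything else is noise to both sides.
    r, ct = make_voucher(b"neuralhash-of-a-known-bad-image", b"key share")
    assert open_voucher(r, ct) == b"key share"
    r, ct = make_voucher(b"neuralhash-of-a-vacation-photo", b"key share")
    assert open_voucher(r, ct) is None
    ```

    Note how the shape matches the reading above: the device cannot tell whether a voucher will ever open (it cannot verify the blinded values), and nothing in the shipped table reveals what its entries correspond to.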

    • It's a moot point, since there's no way they'll find anything useful to anyone among all those false positives anyway.
    • by Dan East ( 318230 ) on Monday August 09, 2021 @08:42PM (#61674435) Journal

      and you can never know if the hashes Apple is checking against are, in fact, the hashes Apple claims they're checking against.

      Correct, and here is just one way this can be abused. Say the FBI is looking for someone (for whatever reason they want - maybe you haven't been vaccinated enough times yet), and they find some online image - any image - that they know came from that person. For example, a Facebook profile picture. They then add the hash of that image to the database. Since the majority of people move their photos with them from phone to phone, as soon as a phone with that exact image in the photo gallery has a hash match, there you go. You've got your person. Friends of that person wouldn't be downloading that picture into their phone's photo gallery - only viewing it in the various FB apps. So there's an extremely high probability that a phone with that image in the actual photo gallery belongs to the person who took the photo.

      • Funny how your vision of possible tyranny is a public health measure. I would have used the example of being labelled a "terrorist" or being someone who has a pro-Palestinian protest poster in my Downloads folder.

  • Do they believe someone believes it? Or are they just saying it for plausible deniability?

  • But the fuckers are still going to do it
  • You wouldn't even need to force Apple to do anything. The database is outsourced. Hack, pay, or threaten the database manager and you can insert any images you want. Next time Apple updates from the DB, your target is found.

    • by gweihir ( 88907 )

      Indeed. And it may not even need to be an image. Hashes can be applied to any type of file. Apple for sure cannot verify what the hashes will match until they have a match.
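      A quick illustration of that point; the sample "files" below are invented byte strings, and SHA-256 stands in for whatever hash function the list actually uses.

      ```python
      import hashlib

      def digest(data: bytes) -> str:
          return hashlib.sha256(data).hexdigest()

      # Invented sample "files": nothing in the output marks one as an image.
      samples = {
          "photo":       b"\x89PNG fake image bytes",
          "pdf leaflet": b"%PDF-1.7 fake political pamphlet",
          "source code": b"int main(void) { return 0; }",
      }
      for label, data in samples.items():
          print(f"{label:12s} -> {digest(data)}")

      # Whoever supplies the list controls what it matches; the matcher cannot
      # tell an image hash from any other hash until something actually hits.
      ```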

  • ... is concerning enough that I feel it is likely that they'd be compelled to say the exact same thing even *IF* they were complying with such a request.
  • by gweihir ( 88907 ) on Monday August 09, 2021 @04:06PM (#61673641)

    Because Apple cannot really "reject" anything here. As soon as they get an NSL or a FISA court order, they are backed against the wall. And they are not even allowed to tell anybody.

    Seriously, creating this capability means it _will_ be abused and Apple can do fuck-all about it.

  • by CoolDiscoRex ( 5227177 ) on Monday August 09, 2021 @04:12PM (#61673665) Homepage

    Attention all Frogs:

    This notice is to inform you that, effective September 1, 2021, we will be increasing your water temperature by 1 degree. This minor change, which should be completely unnoticeable by you, is necessary for your safety. Think of the tadpoles!

    Thank you for your understanding,
    Management

  • Image recognition is not an exact science. The outcome of this is that your personal photos will be flagged for review by an "internal team". Think twice before taking selfies in the shower.
  • by ukoda ( 537183 ) on Monday August 09, 2021 @04:23PM (#61673697) Homepage
    Firstly, Apple will give governments full access upon request; it will be a legal requirement. I don't see any debate on that point here.

    However, it is much worse: the requests will start now. Having told the world that scanning users' photos is possible with some future feature, Apple has told governments that the data they want is available today. With all those public announcements, the cat is out of the bag with respect to what is possible, and there is no putting it back now.
  • Checksums/values get into a database one way or another. One need only calculate the checksums for a few iOS system files and get them loaded into the system - suddenly everyone is a false positive at the same time.

  • Comment removed based on user account deletion
  • Or anyplace required by law.
  • Shouldn't they be saying that they will reject ALL REQUESTS?
  • I thought Apple was concerned about privacy. They might not always have lived up to it, but at least they were saying the right things. I was considering making my next phone an iPhone.

    This technology will be misused. There's no question about it.

    Now they aren't even saying the right things. I no longer have any desire to switch from android.

  • But when a country passes a law requiring it? Apple will fold like a cheap suit.

  • poor journalism (Score:4, Insightful)

    by fulldecent ( 598482 ) on Monday August 09, 2021 @08:39PM (#61674427) Homepage

    As always, the journalist asked the wrong question.

    The correct question is:

    "Apple, will you stop doing business in a country when they require you to add images to your image-checking program which you have not confirmed is actually child porn?"

    • There's no way Apple will drop such a market of their own volition. If they did, it would only give other companies that market, harming Apple and not making the situation better for anyone. Do you think the companies that would take over that market share would be superior to Apple?

      What is needed is coordinated action by governments to block trade with such authoritarian markets on human rights and/or strategic/security grounds. And maybe not just the US government. One government blocking that market has...
  • I am VERY certain that a company with a comprehensive list of rich and powerful people who consume child pornography is not going to be receiving much in the way of governmental pressure. Over anything. Ever again.
  • Unless Apple is willing to have its executives arrested for failing to comply with legal court orders, this is a completely meaningless statement.
  • I've already stated that Apple has painted itself into a corner it cannot get out of. Dropping the whole scheme would imply that their profits trump the sexual safety of children and they'll therefore continue with it even when they're losing customers hand over fist.

    I wonder what Steve Jobs would've done?
  • Why didn't you reject Apple in the first place?
  • So it is legal for Apple employees to collect and view child porn, but it is evil if someone else does it?

    Why is that?

    Does someone check to make sure they do not get aroused, or do we just take their word for it? Apple is not law enforcement. It should be just as illegal for them to possess it.

  • I can't boycott Apple because I have never used and never will use their products. This is yet another example of why. They think "mother knows best" and wish to control every aspect of how buyers use their product.

    The reason I decided to never use Apple products started when I discovered you couldn't build your own Apple-based PC and that their license forbids usage of their OS on non-Apple products. That's just 100% BS. To paraphrase Henry Ford, we can have any color we want as long as they deem it good for us...
  • It looks like the model was trained to look for cryptographic signatures of known caches of child abuse content. The technology, as far as I can tell, won't care or know anything about your actual personal data. I have zero problem with them scanning images stored in the cloud. These are their servers and you are renting them. They have an obligation to their shareholders not to host this content, and scanning in this way seems like the best way I can think of to preserve privacy while doing it. On the...
  • Trust us, we definitely won't abuse it on behalf of a secret subpoena we can't tell you about. ~ Apple | China: Bend over, or GTFO.
  • On the last article about this I commented about how they will be doing it on-device, not just in iCloud. "That correspond to known child abuse images Apple will distribute to iPhones to enable the system" - seems like I called it.
  • How, when they get a FISA letter, etc.?
